If you want to make a website go faster, you've got a number of options. One of the best and easiest is to place a proxy caching server in front of the site to accelerate content delivery.
The open source Varnish Cache is one such technology and is deployed on big name websites, including Facebook and Twitter. Varnish Cache 3.0 was officially released today, expanding the technology with the promise of new modularity for the next generation of web acceleration needs.
Among the new features in Varnish 3.0 is module support, which provides a foundation for extensibility. The module concept in Varnish is similar to the one in the Apache HTTP Server, where add-on modules provide additional features.
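Varnish modules (VMODs) are loaded directly from the VCL configuration language. As a minimal sketch, the std module that ships with Varnish 3.0 can be imported and used like this:

```vcl
import std;

sub vcl_recv {
    # Use the bundled std VMOD to normalize the Host header,
    # so "Example.com" and "example.com" hit the same cache object
    set req.http.Host = std.tolower(req.http.Host);
}
```

Third-party and community VMODs follow the same pattern: an import statement makes the module's functions available inside VCL subroutines.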
"We'll have core modules as part of Varnish, similar to how the Apache Project has them," Per Buer, CEO of Varnish Software, told InternetNews.com. "There will also be other modules that Varnish Software will maintain, and those will be further from the core, and then there will be some modules that will be totally community supported."
Varnish Software is a startup that is providing commercial services and support for the open source Varnish Cache. Buer noted the goal of the module support is to lay the foundation for others to extend Varnish.
Varnish is often used in deployments alongside Apache, as well as the open source nginx web server. According to Buer, it really doesn't matter which web server is used with Varnish.
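In a typical deployment, Varnish listens on port 80 and forwards cache misses to the origin web server on another port. A minimal sketch (the host and port here are illustrative):

```vcl
backend default {
    # Apache or nginx listening on a local port behind Varnish
    .host = "127.0.0.1";
    .port = "8080";
}
```

Because Varnish only speaks HTTP to the backend, the same configuration works whether the origin is Apache, nginx, or anything else.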
"Some people look at Varnish as a way to outsource performance," Buer said. "For example, Ruby on Rails is not known for performance, so many of those developers outsource performance to Varnish."
Varnish is not the first open source proxy caching server in the market, though Buer notes it has advantages over other options, including the open source Squid proxy server.
"In addition to performance, Varnish lets you customize in a way that others don't," Buer said.
For example, Buer noted that if you want every user from a particular city to see a particular page, Varnish allows the admin to set a logic policy with the configuration.
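That kind of per-request policy is written in VCL. As a sketch only, assuming a hypothetical GeoIP VMOD with a city() lookup function (not part of core Varnish), city-specific routing might look like:

```vcl
import geoip;  # hypothetical GeoIP module, shown for illustration

sub vcl_recv {
    # Send visitors from a particular city to a city-specific page
    if (geoip.city(client.ip) == "Oslo") {
        set req.url = "/oslo" + req.url;
    }
}
```

The point is that the policy is ordinary code evaluated on every request, rather than a fixed set of configuration switches.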
"You can make really complex policies on how sites should behave and that's something nobody else does," Buer said.
Another advantage is Varnish's built-in load balancer. Buer noted that because Varnish is a cache, it can do more than a standalone load balancer. For example, if a load balancer sits in front of multiple web servers connected to a database server and the database fails, a typical load balancer can't help. Varnish, in contrast, can serve a cached version of the page without the database.
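Both pieces are expressed in VCL: a director spreads requests across backends, and grace mode lets Varnish keep serving stale cached objects when the backends stop responding. A minimal sketch (backend addresses are illustrative):

```vcl
backend web1 { .host = "192.0.2.10"; .port = "80"; }
backend web2 { .host = "192.0.2.11"; .port = "80"; }

# Round-robin load balancing across the two backends
director www round-robin {
    { .backend = web1; }
    { .backend = web2; }
}

sub vcl_recv {
    set req.backend = www;
    # Accept objects up to an hour past their TTL if backends are down
    set req.grace = 1h;
}

sub vcl_fetch {
    # Keep expired objects around for an hour for grace delivery
    set beresp.grace = 1h;
}
```

With grace enabled, a database outage behind the web servers degrades to stale pages rather than errors, which a plain load balancer cannot do.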
With Varnish 3.0, streaming support is also being added, which should give the system a further advantage.
"Until now Varnish has not been able to stream when it delivers objects; it uses a store-and-forward methodology," Buer said. "The problem is when big objects come in, especially video objects: nothing happens while Varnish is fetching the video from the back-end, and that's not good."
Buer added that Varnish 3.0 begins to fix that issue with preliminary support for full streaming. He noted that there are some technical limitations on streaming in Varnish 3.0 that he expects will be addressed in a follow-up release by the end of the year.
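The new streaming behavior is switched on per object from VCL. As a sketch, streaming could be enabled for video responses in vcl_fetch:

```vcl
sub vcl_fetch {
    if (beresp.http.Content-Type ~ "video") {
        # Pass bytes to the client as they arrive from the backend,
        # instead of waiting for the whole object to be fetched
        set beresp.do_stream = true;
    }
}
```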
Another item on tap for Varnish before the end of the year is the Varnish Administration Console. The console will be a commercial tool for real-time analysis and administration.