Azure Cache (Preview)

As a member of the Azure Cache team, I was thrilled when we released Azure Cache (Preview) as part of the June 2012 Azure SDK.

 

The preview offers a great caching solution for applications running on Azure. It provides many benefits over Azure Shared Cache in areas such as pricing, quotas/throttling, and flexible deployment options.

In addition to those improvements over Azure Shared Cache, the preview release also adds a number of other enhancements. I've included an overview of them below.

 

New APIs

      The preview release contains the following new APIs (a short usage sketch follows the list):

 

  1. DataCache() - You no longer need to go through DataCacheFactory to create a DataCache instance.
  2. Increment/Decrement APIs - Many applications use the cache to maintain numeric counters; these APIs make it easy to operate on numeric values stored in the cache.
  3. Append/Prepend APIs - Append or prepend to a string value stored in the cache.
  4. Cache-level BulkGet - This API supports bulk retrieval of items from the whole cache; the earlier version only allowed bulk gets of items within a named region.
  5. Clear() - Clears the entire cache.
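
To make the list concrete, here is a minimal sketch of what using these APIs might look like. The cache name ("default") and the exact signatures (particularly for Increment/Decrement, Append/Prepend and the cache-level BulkGet) are my reading of the preview bits, so treat this as illustrative rather than authoritative.

    // A rough sketch of the new APIs against a named cache called "default".
    using System.Collections.Generic;
    using Microsoft.ApplicationServer.Caching;

    class NewApiSample
    {
        static void Main()
        {
            // 1. Create a DataCache directly; no DataCacheFactory needed.
            DataCache cache = new DataCache("default");

            // 2. Increment/Decrement a numeric counter (delta, then initial value).
            long hits = cache.Increment("page-hits", 1, 0);
            long tokens = cache.Decrement("tokens-left", 1, 100);

            // 3. Append/Prepend to a string value stored in the cache.
            cache.Put("log", "start;");
            cache.Append("log", "step1;");
            cache.Prepend("log", "header;");

            // 4. Cache-level BulkGet: fetch several keys in one call, no region needed.
            IEnumerable<KeyValuePair<string, object>> items =
                cache.BulkGet(new List<string> { "page-hits", "log" });

            // 5. Clear the entire cache.
            cache.Clear();
        }
    }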

 

Memcache Protocol Support

      The preview release of Azure Cache supports the memcache protocol (both binary and text). Azure Cache has many advantages over memcached, and with this protocol support, existing applications that use memcache can migrate easily to Azure and take advantage of those benefits.

 

     MSDN has more information on how to enable memcache protocol support; I'll expand on the protocol differences between memcache and Azure Cache in a later post.
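
To give a feel for the migration story, here is a sketch of an existing .NET memcached client (Enyim.Caching in this example) talking to a cache role once the memcache protocol has been enabled per the MSDN article. The server address and port are hypothetical placeholders; use whatever endpoint your role configuration actually exposes.

    // Sketch only: a standard memcached client pointed at Azure Cache.
    using Enyim.Caching;
    using Enyim.Caching.Configuration;
    using Enyim.Caching.Memcached;

    class MemcacheClientSample
    {
        static void Main()
        {
            var config = new MemcachedClientConfiguration();
            config.AddServer("127.0.0.1:11211");   // placeholder endpoint

            using (var client = new MemcachedClient(config))
            {
                client.Store(StoreMode.Set, "greeting", "hello from a memcached client");
                string value = client.Get<string>("greeting");
            }
        }
    }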

 

AppFabric Cache Feature Parity

     All the familiar features of AppFabric Cache (high availability, notifications, regions, tags, etc.) can be used with the Azure Cache preview. The one exception is the Read-Through/Write-Behind capability, which is not yet supported in the preview.
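
As a quick reminder of what these features look like in code, here is a small sketch using regions and tags. This is standard AppFabric-style usage, shown only to illustrate that the same calls work against the preview; the cache, region, key and tag names are made up for the example.

    // Regions and tags, AppFabric style, used against the Azure Cache preview.
    using System.Collections.Generic;
    using Microsoft.ApplicationServer.Caching;

    class RegionsAndTagsSample
    {
        static void Main()
        {
            DataCache cache = new DataCache("default");

            // Create a named region and put a tagged item into it.
            cache.CreateRegion("catalog");
            var tags = new List<DataCacheTag> { new DataCacheTag("featured") };
            cache.Put("item-42", "some product", tags, "catalog");

            // Retrieve everything in the region that carries the tag.
            IEnumerable<KeyValuePair<string, object>> featured =
                cache.GetObjectsByTag(new DataCacheTag("featured"), "catalog");
        }
    }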

 

Improved Performance

     We made a number of under-the-hood changes to improve the performance (both latency and throughput) of Azure Cache.

 

Access to Azure Shared Cache goes through a load-balanced input endpoint, which has an impact on both latency and throughput (more on this in a later post). In the preview release there is no need to go through a load-balanced endpoint: because the cache role is in the same deployment as the application, the application can use internal endpoints to connect to the cache role instances.

 

Also, as part of this preview, we implemented a more efficient wire protocol between the cache client and servers, which has given us good results.

 

Together, these changes provide significant latency and throughput improvements compared to Azure Shared Cache.

 

Improved Management Experience

You can manage the Azure Cache preview roles just like you manage any other Azure application. You can use the Service Management APIs to increase or decrease the number of cache role instances, and you can monitor a rich set of performance counters to understand how the cache is performing for your application.

 

You can remote desktop into the cache role instances to look at the cache performance counters, or configure the list of performance counters to collect through code; a sketch of the code-based approach follows.
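
For the "through code" part, the usual Windows Azure Diagnostics pattern applies: add the counters you care about to the role's diagnostic configuration in OnStart. The counter specifier below is a made-up placeholder, not a real cache counter name; substitute the counters you actually want to collect.

    // Sketch: collecting a performance counter from a role via Windows Azure Diagnostics.
    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class CacheWorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            DiagnosticMonitorConfiguration config =
                DiagnosticMonitor.GetDefaultInitialConfiguration();

            // Placeholder counter specifier; replace with the cache counters you need.
            config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
            {
                CounterSpecifier = @"\MyCacheCategory\Total Requests",
                SampleRate = TimeSpan.FromSeconds(30)
            });
            config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

            DiagnosticMonitor.Start(
                "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
            return base.OnStart();
        }
    }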

 

Useful Links

 

Here are a few links that I've found useful in getting started with the Azure Cache preview.

 

  1. MSDN's Azure Caching section has a lot of good info about the preview release.
  2. You can watch Shyam Seshadri's presentation at TechEd about these enhancements.
  3. Prashant Kumar's blog also gives a good overview of what is new in Azure caching.

 

Over the next few weeks, I plan to do a deep dive into many of the above aspects of Azure Cache Preview.