Page 282 - DCAP312_WEB_TECHNOLOGIES_II
Web Technologies-II
Assuming that we do not want to iterate over the product list each time we need a specific
product, we can also store individual product requests in the cache:
public Product GetProductById(int id)
{
    Product output =
        HttpContext.Current.Cache["Product" + id] as Product;
    if (output == null)
    {
        output = this.GetAllProducts()
            .Where(p => p.ProductId == id)
            .SingleOrDefault();
        // Cache.Add throws if the value is null, so only cache a hit.
        if (output != null)
        {
            HttpContext.Current.Cache.Add(
                "Product" + id, output, null, Cache.NoAbsoluteExpiration,
                Cache.NoSlidingExpiration, CacheItemPriority.BelowNormal, null);
        }
    }
    return output;
}
Doing so can make data retrieval much faster, at the cost of some RAM. Most CLR objects are
small, so the memory overhead is usually negligible. If our objects are large and memory is
scarce, we might want to rethink this pattern. In most cases, though, keeping business objects
in RAM is preferable to recreating them every time they are needed.
And even in cases where caching a single entity that was pulled from a collection of cached
objects does not make sense, remember that the pattern of caching a single object by ID can
provide a huge boost in performance if that object is requested on a regular basis.
Do not create MemoryCache instances unless required. If you do create cache
instances in client and Web applications, the MemoryCache instances should
be created early in the application life cycle.
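As a rough sketch of that advice, the snippet below uses the shared `MemoryCache.Default` instance from `System.Runtime.Caching` instead of constructing a new cache per request; the helper class, method names, and the ten-minute sliding expiration are illustrative assumptions, not part of the original example.

```csharp
using System;
using System.Runtime.Caching;

public static class ProductCache
{
    // Hypothetical helper: look up a Product by id, loading and caching
    // it on a miss. Uses the single shared MemoryCache.Default instance.
    public static Product GetOrAdd(int id, Func<int, Product> loader)
    {
        ObjectCache cache = MemoryCache.Default; // one shared instance
        string key = "Product" + id;

        Product output = cache[key] as Product;
        if (output == null)
        {
            output = loader(id);
            if (output != null)
            {
                var policy = new CacheItemPolicy
                {
                    // Evict the entry if it goes unused for 10 minutes.
                    SlidingExpiration = TimeSpan.FromMinutes(10)
                };
                cache.Add(key, output, policy);
            }
        }
        return output;
    }
}
```

Because `MemoryCache.Default` is a single process-wide instance, every caller shares the same entries, which is exactly why it should be touched early in the application life cycle rather than created ad hoc.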
14.2.3 The Downside of Extensive Caching
Taking this approach also introduces two big problems. First, writing unit tests on top of a
repository that takes advantage of the cache can be a real beast, because we have thrown a set
of dependencies that cannot easily be tested into the mix.
Second, it is pretty easy to let cached items “trump” real data, because we now have a number
of cached objects that must be removed whenever we change any of our underlying data.
Common approaches to dealing with this include setting sliding or absolute expiration times
that are good enough, or using cache invalidation callbacks from SQL Server. Setting expiration
times can provide a great boost even in highly volatile environments: set cache timeouts of just
a few seconds and accept that data can, and will, be a few seconds stale in some cases.
Using cache invalidation callbacks is also a great way to keep volatile data up-to-date while
enjoying the benefits of the cache, but it requires a bit of extra work to set up.
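The short-timeout approach can be sketched with `Cache.Insert`, which (unlike `Cache.Add`) also accepts a removal callback. The five-second timeout, key name, and callback body below are illustrative assumptions layered on the document's earlier product-list example.

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class VolatileCache
{
    // Cache the product list with a short absolute expiration so that
    // stale data lives for at most a few seconds.
    public static void StoreProducts(List<Product> products)
    {
        HttpContext.Current.Cache.Insert(
            "Products",
            products,
            null,                              // no CacheDependency
            DateTime.UtcNow.AddSeconds(5),     // absolute: stale after 5 s
            Cache.NoSlidingExpiration,
            CacheItemPriority.Normal,
            OnProductsRemoved);
    }

    private static void OnProductsRemoved(
        string key, object value, CacheItemRemovedReason reason)
    {
        // Called when the entry expires or is evicted; a real application
        // might log the reason or trigger a background refresh here.
    }
}
```

For true invalidation callbacks from the database, the `SqlCacheDependency` class can be passed in place of the `null` dependency argument, at the cost of the extra SQL Server setup the text mentions.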
14.2.4 Standardizing Cache Access
The final benefit of this approach is that it lets us standardize the creation of cache keys.
In the examples above, we are creating keys for cached objects on the fly. Since these keys
are plain strings, IntelliSense cannot call out typos or other errors that would cause subtle
caching problems.
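One common way to standardize key creation, sketched below under assumed names, is to route every key through a single static helper so that a typo can only occur in one place and callers get IntelliSense support.

```csharp
// Hypothetical helper centralizing cache-key creation. Callers never
// build key strings by hand, so a misspelled key cannot slip in at a
// call site.
public static class CacheKeys
{
    public static string Product(int id)
    {
        return "Product:" + id;
    }

    public static string AllProducts()
    {
        return "Products:All";
    }
}

// Usage at a call site:
//   var p = HttpContext.Current.Cache[CacheKeys.Product(id)] as Product;
```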