Stuck on a Desert Island, Do You Take Your Caching Server or Your NetEqualizer?

March 15, 2014 — netequalizer

Caching is a great idea and works well, but I’ll take my NetEqualizer with me if forced to choose between the two on my remote island with a satellite link.
Yes, there are a few circumstances where a caching server can have a real impact. Our most successful deployments are in educational environments where the same video is watched repeatedly as an assignment; but for most wide-open installations, expectations of performance far outweigh reality. Let’s take a look at what works, and also drill down on expectations that rest on shaky assumptions.
From my personal archive of experience, here are some expectations attributed to caching that are perhaps a bit too optimistic.
“Most of my users go to their Yahoo or Facebook home page every day when they log in, and that is the bulk of what they do.”
– I doubt this customer’s user base is that conformist :), and they’ll find out once they install their caching solution. But even if it were true, only some of the content on Facebook and Yahoo is static. A good portion of these pages is dynamic by default and ever-changing, and the responses are marked as dynamic (in their URLs and cache-control headers), which means the bulk of the page must be reloaded each time. For caching to have an impact, the users in this scenario would have to stick to their home pages and never look at friends’ photos or other pages.
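To illustrate why dynamic pages defeat a cache, here is a minimal sketch of how a shared cache decides whether a response is even storable, based on standard HTTP Cache-Control directives. This is a simplification for illustration, not any particular proxy’s actual logic:

```python
# Minimal sketch of HTTP cacheability rules (greatly simplified from RFC 7234).
# A real caching proxy applies many more checks (Expires, Vary, validators, ...).

def is_cacheable(status: int, headers: dict) -> bool:
    """Return True if a shared cache may store this response."""
    cc = headers.get("Cache-Control", "").lower()
    directives = {d.strip() for d in cc.split(",") if d.strip()}
    if status != 200:
        return False                 # only consider plain successes here
    if "no-store" in directives or "private" in directives:
        return False                 # explicitly forbidden for shared caches
    # Dynamic pages typically send no-cache / max-age=0, forcing a round
    # trip to the origin on every request -- a guaranteed cache miss.
    if "no-cache" in directives or "max-age=0" in directives:
        return False
    return True

# A static image vs. a personalized home page:
print(is_cacheable(200, {"Cache-Control": "public, max-age=86400"}))  # True
print(is_cacheable(200, {"Cache-Control": "private, no-cache"}))      # False
```

The personalized home page is rebuilt per user on every load, so even a perfectly tuned cache never gets to serve it.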
“We expect to see a 30 percent hit rate when we deploy our cache.”
You won’t see a 30 percent hit rate unless somebody designs a robot army specifically to test your cache, hitting the same pages over and over again. Perhaps on iOS update day, with the bulk of your hits going to the same large file, you might see a significant performance boost for a day. But overall you will be doing well to get a 3 or 4 percent hit rate.
“I expect the cache hits to take pressure off my Internet link.”
Assuming you want your average user to experience a fast-loading Internet, this is where you really want your NetEqualizer (or a similar intelligent bandwidth controller) over your caching engine. A smart bandwidth controller can rearrange traffic on the fly, ensuring interactive hits get the best response. A caching engine does not have that intelligence.
Let’s suppose you have a 100-megabit link to the Internet, and you install a cache engine that achieves a 6 percent hit rate. That would be an exceptional hit rate.
So what is the end-user experience with a 6 percent hit rate, compared to pre-cache?
– First off, it is not the hit rate that matters when looking at total bandwidth. Many of those hits will likely be smallish image files from the Yahoo home page or other common sites, which account for less than 1 percent of your actual traffic. Most of your traffic is likely dominated by large file downloads, and only a portion of those may be coming from cache.
– A 6 percent hit rate means a 94 percent miss rate, and if your Internet was slow from congestion before the caching server, it will still be slow 94 percent of the time.
– Putting in a caching server is like upgrading your bandwidth from 100 megabits to 104 megabits to relieve congestion. The cache hits may add to the total throughput in your reports, but the 100-megabit bottleneck is still there, and at this point there is little or no difference in user perception. A portion of your Internet access is still marginal or unusable during peak times, and other than the occasional web page or video loading nice and snappy, users are getting duds most of the time.
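The arithmetic behind that 104-megabit figure is simple enough to write down. A sketch, under the simplifying assumption that cached bytes cost the WAN link nothing: if some fraction of the *bytes* is served locally, the link only has to carry the remainder, which is equivalent to a slightly larger link.

```python
def effective_capacity(link_mbps: float, byte_hit_rate: float) -> float:
    """Equivalent link size when byte_hit_rate of all bytes are served
    from cache instead of crossing the 100-megabit bottleneck."""
    return link_mbps / (1.0 - byte_hit_rate)

# A 6% *request* hit rate is mostly small files, so the *byte* hit rate
# is lower -- call it 4% at best, often closer to 1%:
print(round(effective_capacity(100, 0.04), 1))  # 104.2 -> the "104 megabit" link
print(round(effective_capacity(100, 0.01), 1))  # 101.0 -> more realistic
```

Either way, a link that was congested at 100 megabits is still congested at 101–104.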
Even the largest caching server is insignificant in how much data it can store.
– The Internet is vast and your cache is not. Think of a tiny ant standing on top of Mount Everest. YouTube takes in 100 hours of new content every minute of every day. A small commercial caching server can store about 1/1000 of what YouTube uploads in a day, not to mention yesterday, the day before, and last year. Whatever your users ask for next is just not going to be in your cache.
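The ant-on-Everest comparison holds up to back-of-the-envelope arithmetic. The per-hour file size and cache size below are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope: one day of YouTube uploads vs. a small cache.
HOURS_PER_MIN = 100                               # new content uploaded per minute
hours_per_day = HOURS_PER_MIN * 60 * 24           # 144,000 hours of new video/day

GB_PER_HOUR = 1.0        # assumed average encoded size per hour (illustrative)
CACHE_GB = 150           # assumed small commercial cache (illustrative)

uploaded_gb = hours_per_day * GB_PER_HOUR         # ~144,000 GB = 144 TB per day
print(f"Cache holds 1/{uploaded_gb / CACHE_GB:.0f} of one day's uploads")
# -> roughly the 1/1000 figure above, before counting any previous day
```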
So why is a NetEqualizer bandwidth controller so much better than a caching server at changing user perception of speed? Because the NetEqualizer is designed to keep Internet access from crashing, which it accomplishes by reducing the footprint of large file transfers and video downloads during peak times. Yes, those videos and downloads may be slow or sporadic, but they weren’t going to work well anyway, so why let them crush the interactive traffic? In the end, neither caching nor equalizing is perfect, but in real-world trials the equalizer changes the user experience from slow to fast for all interactive transactions, while caching is hit or miss (pun intended).
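To make the contrast concrete, here is a toy sketch of the equalizing idea — my own simplification for illustration, not NetEqualizer’s actual algorithm. When the link nears saturation, find the largest flow and slow only it, leaving small interactive flows untouched:

```python
def equalize(flows: dict, link_capacity_mbps: float,
             penalty: float = 0.5, threshold: float = 0.85) -> dict:
    """One equalizing pass. flows maps flow_id -> current rate in Mbps.
    Returns new per-flow rates; only acts when the link is congested."""
    total = sum(flows.values())
    if total < threshold * link_capacity_mbps:
        return dict(flows)               # link not congested: touch nothing
    # Congested: throttle the dominant flow (bulk download, video) first,
    # so interactive traffic keeps getting snappy responses.
    biggest = max(flows, key=flows.get)
    adjusted = dict(flows)
    adjusted[biggest] *= penalty         # halve the dominant flow's rate
    return adjusted

flows = {"video_download": 60.0, "web_browsing": 2.0, "ssh": 0.1}
print(equalize(flows, link_capacity_mbps=70))
# video_download drops to 30.0; web_browsing and ssh are untouched
```

The design point: a cache only helps when the next request happens to be a repeat, while this kind of shaping helps every time the link is congested, which is exactly when users notice.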