NetEqualizer Testing and Integration of Squid Caching Server


Editor’s Note: Due to the many variables involved with tuning and supporting Squid Caching Integration, this feature will require an additional upfront support charge. It will also require at minimum an NE3000 platform. Contact sales@netequalizer.com for specific details.

In our upcoming 5.0 release, the main enhancement will be the ability to implement YouTube caching from a NetEqualizer. Since a Squid caching server can be implemented separately by your IT department, a question that often comes up is: what is the difference between using the embedded NetEqualizer integration and running the caching server stand-alone on a network?

Here are a few of the key reasons why the NetEqualizer caching integration provides the most efficient and effective setup:

1. Communication – For proper performance, it’s important that the NetEqualizer know when a file is coming from cache and when it’s coming from the Internet. It would be counterproductive to have data from cache shaped in any way. To accomplish this, we wrote a new utility, aptly named “cache helper,” to advise the NetEqualizer of current connections originating from cache. This allows the NetEqualizer to permit cached traffic to pass without being shaped.
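
The internals of cache helper are specific to the NetEqualizer, but the general idea is easy to illustrate. Squid tags every request it logs with a hit-or-miss result code (TCP_HIT, TCP_MEM_HIT, TCP_MISS, and so on), so a helper can watch the access log and report cache-served connections to the shaper. The sketch below is a minimal illustration of that approach, not the actual utility; the log path and the exempt_from_shaping() callback are assumptions.

    import time

    SQUID_LOG = "/var/log/squid/access.log"  # typical default location (assumption)

    def follow(path):
        """Yield lines as they are appended to a log file, like `tail -f`."""
        with open(path) as log:
            log.seek(0, 2)  # jump to the end of the file
            while True:
                line = log.readline()
                if not line:
                    time.sleep(0.5)
                    continue
                yield line

    def exempt_from_shaping(client_ip):
        """Placeholder: tell the shaper to let this host's current
        connection pass untouched. The real cache helper's interface
        to the NetEqualizer is not public."""
        print(f"cache hit - exempting {client_ip}")

    # Squid's native log format: time elapsed client result/status bytes method URL ...
    for entry in follow(SQUID_LOG):
        fields = entry.split()
        if len(fields) > 3 and "HIT" in fields[3]:  # TCP_HIT, TCP_MEM_HIT, etc.
            exempt_from_shaping(client_ip=fields[2])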

2. Creative Routing – It’s also important that the NetEqualizer be able to see the public IP addresses of traffic originating on the Internet. A stand-alone caching server prevents this: if you plug a caching server into your network in front of a NetEqualizer (between the NetEqualizer and your users), all port 80 traffic, cached or not, appears to come from the proxy server’s IP address in a default setup. The NetEqualizer shaping rules would not be of much use in this mode, as they would see all Internet traffic originating from a single server. Without going into details, we have developed a set of special routing rules to overcome this limitation in our implementation.

3. Advanced Testing and Validation – Squid proxy servers by themselves are very finicky. Time and time again, we hear about implementations where a customer installed a proxy server only to have it cause more problems than it solved, ultimately slowing down the network. To ensure a simple yet tight implementation, we ran a series of scenarios under different conditions. This required us to develop a whole new methodology for testing network loads through the NetEqualizer. Our current class of load generators is very good at creating a heavy load and controlling it precisely, but in order to validate a caching system, we needed a different approach: a load simulator that could reproduce the variations of live Internet traffic. For example, to ensure a stable caching system, you must take the following into consideration:

  • A caching proxy must perform quite a large number of DNS look-ups
  • It must also check tags for changes in the content of cached Web pages
  • It must facilitate the delivery of cached data and know when to update the cache
  • The squid process requires a significant chunk of CPU and memory resources
  • For YouTube integration, the Squid caching server must also strip some URL tags on YouTube files on the fly

To answer this challenge, and to provide the most effective caching feature, we’ve spent the past few months developing a custom load generator. Our simulation lab has a full one-gigabit connection to the Internet, along with a set of servers that can simulate thousands of users surfing the Internet simultaneously. We can also queue up a set of YouTube users vying for live video from the cache and the Internet. Lastly, we put a point-to-point FTP and UDP load across the NetEqualizer using our traditional load generator.
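
Our load generator is an internal tool, but its central idea, many independent simulated surfers with randomized think time rather than one steady synthetic stream, can be sketched in a few lines. The URL list and user count below are placeholders.

    import random
    import threading
    import time
    import urllib.request

    URLS = ["http://example.com/news", "http://example.com/video.flv"]  # placeholders
    SIMULATED_USERS = 200

    def surfer():
        """One simulated user: fetch a page, 'think', repeat.
        Randomized pauses make the aggregate load vary like live traffic."""
        while True:
            try:
                with urllib.request.urlopen(random.choice(URLS), timeout=10) as resp:
                    resp.read()
            except OSError:
                pass  # a real harness would record the failure
            time.sleep(random.uniform(1, 30))  # think time between clicks

    for _ in range(SIMULATED_USERS):
        threading.Thread(target=surfer, daemon=True).start()
    time.sleep(3600)  # let the load run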

Once our custom load generator was in place, we were able to run various scenarios that our technology might encounter in a live network setting.  Our testing exposed some common, and not so common, issues with YouTube caching and we were able to correct them. This kind of analysis is not possible on a live commercial network, as experimenting and tuning requires deliberate outages. We also now have the ability to re-create a customer problem and develop actual Squid source code patches should the need arise.

NetEqualizer YouTube Caching FAQ


Editor’s Note: This week, we announced the availability of the NetEqualizer YouTube caching feature we first introduced in October. Over the past month, interest and inquiries have been high, so we’ve created the following Q&A to address many of the common questions we’ve received.

This may seem like a silly question, but why is caching advantageous?

The bottleneck most networks deal with is that they have a limited pipe leading out to the larger public Internet cloud. When a user visits a website or accesses content online, data must be transferred to and from the user through this limited pipe, which is usually meant for only average loads (increasing its size can be quite expensive). During busy times, when multiple users are accessing material from the Internet at once, the pipe can become clogged and service slowed. However, if an ISP can keep a cached copy of certain bandwidth-intensive content, such as a popular video, on a server in their local office, this bottleneck can be avoided. The pipe remains open and unclogged and customers are assured their video will always play faster and more smoothly than if they had to go out and re-fetch a copy from the YouTube server on the Internet.

What is the ROI benefit of caching YouTube? How much bandwidth can a provider conserve?

At the time of this writing, we are still in the early stages of our data collection on this subject. What we do know is that YouTube can account for up to 15 percent of Internet traffic. We expect to be able to cache at least the 300 most popular YouTube videos with this initial release, and perhaps more when we release the mass-storage version of our caching server in the future. Considering this, realistic estimates put the savings in bandwidth overhead somewhere between 5 and 15 percent. But these are only the immediate benefits in terms of bandwidth savings. The long-term customer-satisfaction benefit is that many more YouTube videos will play without interruption on a crowded network (busy hour) than before. Therefore, ROI shouldn’t be measured in bandwidth savings alone.
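
As a rough sanity check on those numbers, here is the arithmetic behind the 5-to-15-percent estimate. The trunk size and cache hit rate are our own illustrative assumptions:

    pipe_mbps = 100.0       # example trunk size (assumption)
    youtube_share = 0.15    # YouTube at up to 15 percent of Internet traffic
    cache_hit_rate = 0.50   # share of YouTube views served by the ~300 cached videos (assumption)

    saved_mbps = pipe_mbps * youtube_share * cache_hit_rate
    print(f"saved: {saved_mbps:.1f} Mbps, {saved_mbps / pipe_mbps:.0%} of the trunk")
    # Hit rates from roughly a third of YouTube views up to all of them
    # bracket the savings between about 5 and 15 percent of the trunk.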

Why is it just the YouTube caching feature? Why not cache everything?

There are a couple of good reasons not to cache everything.

First, there are quite a few Web pages that are dynamically generated or change quite often, and a caching mechanism relies on content being relatively static. This allows it to grab content from the Internet and store it locally for future use without the content changing. As mentioned, when users/clients visit the specific Web pages that have been stored, they are directed to the locally saved content rather than being sent over the Internet to the original website. Caching therefore isn’t possible for pages that are constantly changing. Caching dynamic content can cause all kinds of issues, especially with merchant and secure sites where each page is custom-generated for the client.

Second, a caching server can realistically only store a subset of data that it accesses. Yes, data storage is getting less expensive every year, but a local store is finite in size and will eventually fill up. So, when making a decision on what to cache and what not to cache, YouTube, being both popular and bandwidth intensive, was the logical choice.

Will the NetEqualizer ever cache content beyond YouTube, such as other videos?

At this time, the NetEqualizer caches files that traverse port 80 and correspond to video files running from 30 seconds to 10 minutes in length. It is possible that some other port 80 files will fall into this category, but the bulk of it will be YouTube.
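
The selection logic lives inside the NetEqualizer, but a simplified predicate capturing that description might look like the following; the function and its inputs are purely illustrative.

    def should_cache(port: int, is_video: bool, duration_seconds: float) -> bool:
        """Illustrative only: cache port 80 video files running
        from 30 seconds to 10 minutes, per the description above."""
        return port == 80 and is_video and 30 <= duration_seconds <= 600

    print(should_cache(80, True, 240))    # typical YouTube clip -> True
    print(should_cache(443, True, 240))   # not port 80 -> False
    print(should_cache(80, True, 3600))   # hour-long video -> False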

Is there anything else about YouTube that makes it a good candidate to cache?

Yes, YouTube content meets the level of stability discussed above that’s needed for effective caching. Once posted, most YouTube videos are not edited or changed. Hence, the copy in the local cache will stay current and be good indefinitely.

When I download large distributions, the download utility often gives me a choice of mirrored sites around the world. Is this the same as caching?

By definition this is also caching, but the difference is that there is a manual step to choosing one of these distribution sites. Some of the large-content open source distributions have been delivered this way for many years. The caching feature on the NetEqualizer is what is called “transparent,” meaning users do not have to do anything to get a cached copy.

If users are getting a file from cache without their knowledge, could this be construed as a violation of net neutrality?

We addressed the tenets of net neutrality in another article and to our knowledge caching has not been controversial in any way.

What about copyright violations? Is it legal to store someone’s content on an intermediate server?

This is a very complex question and anything is possible, but with respect to intent and the NetEqualizer caching mechanism, the Internet provider is only caching what is already freely available. There is no masking or redirection of the actual YouTube administrative wrappings that a user sees (this is where advertising and promotions appear). Hence, there is no loss of potential revenue for YouTube. In fact, it would be considered more of a benefit for them, as it helps more people use their service where connections might otherwise be too slow.

Final Editor’s Note: While we’re confident this Q&A will answer many of the questions that arise about the NetEqualizer YouTube caching feature, please don’t hesitate to contact us with further inquiries. We can be reached at 1-888-287-2492 or sales@apconnections.net.

NetEqualizer YouTube Caching a Win for Net Neutrality


Over the past few years, much of the controversy over net neutrality has ultimately stemmed from the longstanding rift between carriers and content providers. Commercial content providers such as Netflix have entire business models that rely on relatively unrestricted bandwidth access for their customers, which has led to an enormous increase in the amount of bandwidth being used. In response to these extreme bandwidth loads and associated costs, ISPs have tried all types of schemes to limit and restrict total usage, including layer-7 shaping and deep packet inspection (DPI), preferential treatment based on content, and preferential treatment based on fees.

While in many cases effective, most of these efforts have been mired in controversy with respect to net neutrality. However, caching is the one exception.

Up to this point, caching has proven to be the magic bullet that can benefit both ISPs and consumers (faster access to videos, etc.) while respecting net neutrality. To illustrate this, we’ll run caching through the gauntlet of questions that have been raised about these other solutions in regard to a violation of net neutrality. In the end, it comes up clean.

1. Does caching involve deep introspection of user traffic without their knowledge (like layer-7 shaping and DPI)?

No.

2. Does caching perform any form of preferential treatment based on content?

No.

3. Does caching perform any form of preferential treatment based on fees?

No.

Yet, despite avoiding these pitfalls, caching has still proven to be extremely effective, allowing Internet providers to manage increasing customer demands without infringing upon customers’ rights or quality of service. It was these factors that led APconnections to develop our most recent NetEqualizer feature, YouTube caching.

For more on this feature, or caching in general, check out our new NetEqualizer YouTube Caching FAQ post.

Enhance Your Internet Service With YouTube Caching


Have you ever wondered why certain videos on YouTube seem to run more smoothly than others? Over the years, I’ve consistently noticed that some videos on my home connection will run without interruption while others are as slow as molasses. Upon further consideration, I determined a simple common denominator for the videos that play without interruption — they’re popular. In other words, they’re trending. And, the opposite is usually true for the slower videos.

To ensure better performance, my Internet provider keeps a local copy of popular YouTube content (caching), and when I watch a trending video, they send me the stream from their local cache. However, if I request a video that’s not contained in their current cache, I’m sent over the broader Internet to the actual YouTube content servers. When this occurs, my video streams from outside the provider’s local network, where my pipe can be restricted. The most likely cause of the slower video streams, therefore, is traffic congestion at peak hours.

Considering this, caching video is usually a win-win for the ISP and Internet consumer. Here’s why…

Benefits of Caching Video for the ISP

Last-mile connections from the point of presence to the customer are usually not overloaded, especially on a wired or fiber network such as a cable operator’s. Caching video allows a provider to keep traffic on the last mile, and hence avoids clogging the provider’s exchange point with the broader Internet. Adding bandwidth to the exchange point is expensive, but caching video will allow you to provide a higher class of service without the large recurring costs.

Benefits of ISP-Level Caching for the Internet Consumer

Put simply, the benefit is an overall better video-viewing experience. Most consumers couldn’t care less about the technical details behind the quality of their Internet service. What matters is the quality itself. In this competitive market, with rising expectations for video service, the ISP needs every advantage it can get.

Why Target YouTube for Caching?

YouTube video is very bandwidth intensive and relatively stable content. By stable, we mean once posted, the video content does not get changed or edited. This makes it a prime candidate for effective caching.

Should an ISP Cache All Of The Data It Can?

While this is the default setting for most Squid caching servers, we recommend only caching the popular free video sites such as YouTube. This involves some selective filtering, but caching everything in a generic mode can cause problems, with some secure sites not functioning correctly.

Note: With Squid Proxy, you’ll need a third-party module to cache YouTube.

How Will Caching Work with My NetEqualizer or Other Bandwidth Control Device?

You’ll need to put your caching server in transparent mode and run it on the private side of your NetEqualizer.

[Diagram: NetEqualizer placement with caching server]

Related Article: Fourteen Tips to Make Your WISP More Profitable

The Promise of Streaming Video: An Unfunded Mandate


By Art Reisman, CTO, www.netequalizer.com

Art Reisman is a partner and co-founder of APconnections, a company that provides bandwidth control solutions (NetEqualizer) to ISPs, universities, libraries, mining camps, and any organization where groups of users must share their Internet resources equitably. What follows is an objective educational journey on how consumers and ISPs can live in harmony with the explosion of YouTube video.

The following is written primarily for the benefit of mid-to-small-sized Internet service providers (ISPs). However, home consumers may also find the details interesting. Please follow along as I break down the business cost model required to keep up with growing video demand.

In the past few weeks, two factors have come up in conversations with our customers, which have encouraged me to investigate this subject further and outline the challenges here:

1) Many of our ISP customers are struggling to offer video at competitive levels during the day, and yet are being squeezed by high bandwidth costs. Many look to the NetEqualizer to alleviate video congestion problems. As you know, there are always trade-offs to be made in handling any congestion issue, which I will discuss at the end of this article. But back to the subject at hand. What I am seeing from customers is an underlying fear that they (IT administrators) are behind the curve. As I have an opinion on this, I decided to lay out what is “normal” in terms of contention ratios for video, as well as what is “practical” for video in today’s world.

2) My Internet service provider, a major player that heavily advertises how fast its speed to the home is, periodically slows down standard YouTube videos. I should be fair with my accusation: with the Internet, you can never be quite certain who is at fault. But whether I am being throttled or not, the point is that there is an ever-growing number of video content providers pushing ahead with plans that do not take into account, nor care about, a last-mile provider’s ability to handle the increased load. A good analogy would be a travel agency booking tourists onto a cruise ship without keeping a tally of tickets sold, nor caring, for that matter. When all those tourists show up to board the ship, some form of chaos will ensue (and some will not be able to get on the ship at all).

Some ISPs are also adding to this issue by building out infrastructure without regard to content demand and hoping for the best. They are in a tight spot, caught in a challenging balancing act between customers, profit, and their ability to actually deliver video at peak times.

The Business Cost Model of an ISP trying to accommodate video demands

Almost all ISPs rely on the fact that not all customers will pull their full allotment of bandwidth all the time. Hence, they can map out an appropriate subscriber ratio for their network and advertise bandwidth rates that are sufficient to handle video. There are four main governing factors on how fast an actual consumer circuit will be:

1) The physical speed of the medium to the customer’s front door (this is often the speed cited by the ISP)
2) The combined load of all customers sharing the local circuit, and that circuit’s capacity (the subscriber ratio factors in here)
3) How much bandwidth the ISP contracts out to the Internet (from the ISP’s provider)
4) The speed at which the source of the content (YouTube’s servers) can serve it; we’ll assume this is not a source of contention in the examples below, but it should remain a suspect in any finger-pointing over a slow circuit

The actual limit to the amount of bandwidth a customer gets at one time, which dictates whether they can run live streaming video, usually depends on how oversold their ISP is (based on the “subscriber ratio” mentioned in point 2 above). If your ISP can predict the peak loads of their entire circuit correctly, and purchase enough bulk bandwidth to meet that demand (point 3 above), then customers should be able to run live streaming video without interruption.

The problem arises when providers put together a static set of assumptions that break down as consumer appetite for video grows faster than expected.  The numbers below typify the trade-offs a mid-sized provider is playing with in order to make a profit, while still providing enough bandwidth to meet customer expectations.

1) In major metropolitan areas, as of 2010, bandwidth can be purchased in bulk for about $3,000 per month per 50 megabits; some localities are less, some more.

2) ISPs must cover an amortized fixed cost per customer: billing, sales staff, support staff, customer premises equipment, interest on investment, and licensing, which comes out to about $35 per month per customer.

3) We assume market competition fixes the price for residential Internet service at about $45 per month per customer.

4) This leaves $10 per month for profit margin and bandwidth fees.  We assume an even split: $5 a month per customer for profit, and $5 per month per customer to cover bandwidth fees.

With 50 megabits at $3,000 per month and each customer contributing $5 per month, you must share the 50-megabit pipe amongst 600 customers to be viable as a business. This is the governing factor on how much bandwidth is available to all customers for all uses, including video.
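
The same model, worked through in code (all figures taken from the list above):

    bulk_cost_per_50_megabits = 3000.0  # dollars per month
    price_per_customer = 45.0           # monthly residential price
    fixed_cost_per_customer = 35.0      # billing, staff, equipment, licensing
    profit_per_customer = 5.0           # assumed even split of the remaining $10

    bandwidth_budget = price_per_customer - fixed_cost_per_customer - profit_per_customer
    customers_per_pipe = bulk_cost_per_50_megabits / bandwidth_budget
    print(f"${bandwidth_budget:.0f}/month buys bandwidth -> "
          f"{customers_per_pipe:.0f} customers share the 50-megabit pipe")
    # -> 600 customers per 50 megabits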

So how many simultaneous YouTube videos can be supported given the scenario above?

Live streaming YouTube video needs on average about 750kbps, or about 3/4 of a megabit, in order to run without breaking up.

On a 50-megabit shared link provided by an ISP, in theory you could support about 70 simultaneous YouTube sessions, assuming nothing else is running on the network. In the real world, there will always be background traffic other than YouTube.

In reality, you are always going to have a minimum fixed load of Internet usage from 600 customers of approximately 10-to-20 megabits. That load covers everything else: web surfing, downloads, Skype calls, etc. So realistically you can support about 40 YouTube sessions at one time. This implies that if 10 percent of your customers (60 customers) start to watch YouTube at the same time, you will need more bandwidth, or you are going to get some complaints. ISPs that desperately want to support video must count on no more than about 40 simultaneous videos running at one time, a little less than 10 percent of their customers.
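
And the stream math, using the figures above (taking the high end of the background-load estimate):

    pipe_mbps = 50.0
    stream_mbps = 0.75       # ~750 kbps per live YouTube stream
    background_mbps = 20.0   # upper end of the 10-to-20 megabit fixed load

    theoretical = pipe_mbps / stream_mbps
    realistic = (pipe_mbps - background_mbps) / stream_mbps
    print(f"empty network: {theoretical:.0f} streams")       # ~67, 'about 70'
    print(f"with background load: {realistic:.0f} streams")  # 40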

Based on the scenario above, if 40 customers simultaneously run YouTube, the link will be exhausted and all 600 customers will be wishing they had their dial-up back. At last check, YouTube traffic accounted for 10 percent of all Internet traffic. If left completely unregulated, a typical rural ISP could find itself on the brink of saturation from normal YouTube usage already. With tier-1 providers in major metro areas, there is usually more bandwidth, but with that comes higher expectations of service, and hence some saturation is inevitable.

This is why we believe that video is currently an “unfunded mandate.” Based on a reasonable business cost model, as we have put forth above, an ISP cannot afford to size its network so that even 10 percent of its customers can run real-time streaming video at the same time. Obviously, as bandwidth costs decrease, this will help the economic model somewhat.

However, if you still want to tune for video on your network, consider the options below…

NetEqualizer and Trade-offs to allow video

If you are not a current NetEqualizer user, please feel free to call our engineering team for more background.  Here is my short answer on “how to allow video on your network” for current NetEqualizer users:

1) You can determine the IP address ranges for popular sites and give them priority by setting up a “priority host.”
This is not recommended for customers with 50 megabits or less, as it may push you over into a gridlock situation.

2) You can raise your HOGMIN to 50,000 bytes per second.
This will generally let in the lower-resolution video sites. However, they may still incur penalties should they start buffering at a rate higher than 50,000 bytes per second. Again, we would not recommend this change for customers with pipes of 50 megabits or less.

With either of the above changes, you run the risk of crowding out web surfing and other interactive uses, as described above. You can only balance so much video before you run out of room. Please remember that the default settings on the NetEqualizer are designed to slow video before the entire network comes to a halt.

For more information, you can refer to another of Art’s articles on the subject of video and the Internet: How Much YouTube Can the Internet Handle?

Other blog posts about ISPs blocking YouTube

The pros and cons of Disk (Web) Caching


Eli Riles, an independent consultant and former VP of sales for NetEqualizer, has extensively investigated the subject of caching with many ISPs from around the globe. What follows are some useful observations on disk/web caching.

Effective use of Disk Caching

Suppose you are the administrator for a network with a group of 1,000 users who wake up promptly at 7:00 am each morning and immediately go to MSNBC.com to retrieve the latest news from Wall Street. This synchronized behavior would create 1,000 simultaneous requests for the same remote page on the Internet.

Or, in the corporate world, suppose the CEO of a multinational, 10,000-employee business puts out an all-points, 20-page PDF file on the corporate site right before the holidays, describing the new bonus plan. As you can imagine, all the remote WAN links might get bogged down for hours while each and every employee tried to download this file.

Well, it does not take a rocket scientist to figure out that if the MSNBC home page could somehow be stored locally on an internal server, that would alleviate quite a bit of pressure on your WAN or Internet link.

And in the case of the CEO memo, if a single copy of the PDF file were placed locally at each remote office, it would alleviate the rush of data.

Local Disk Caching does just that.

Offered by various vendors, caching can be very effective in many situations, and vendors can legitimately claim tremendous WAN speed improvements in some scenarios. Caching servers have built-in intelligence to store the most recently and most frequently requested information, thus preventing future requests from traversing the WAN link unnecessarily.

You may know that most desktop browsers do their own form of caching already. Many web servers keep a time stamp of their last update to data, and browsers such as the popular Internet Explorer will use a cached copy of a remote page after checking the time stamp.
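
That time-stamp handshake is HTTP’s conditional request mechanism: the browser sends an If-Modified-Since header, and the server answers 304 Not Modified if the cached copy is still good. A minimal sketch (assuming the server at the placeholder URL returns a Last-Modified header):

    import urllib.error
    import urllib.request

    url = "http://example.com/"  # placeholder page

    # First fetch: note the server's time stamp and keep a cached copy.
    resp = urllib.request.urlopen(url)
    last_modified = resp.headers["Last-Modified"]
    cached_body = resp.read()

    # Revalidation: ask for the page only if it changed since that time stamp.
    req = urllib.request.Request(url, headers={"If-Modified-Since": last_modified})
    try:
        fresh = urllib.request.urlopen(req).read()
        print("content changed - cache updated")
    except urllib.error.HTTPError as err:
        if err.code == 304:
            print("304 Not Modified - serving the cached copy")
        else:
            raise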

So what is the downside of caching?

There are two main issues that can arise with caching:

1) Keeping the cache current. If you access a cached page that is not current, you are at risk of getting old and incorrect information. Some things you may never want cached, for example the results of a transactional database query. It’s not that these problems are insurmountable, but there is always the risk that the data in cache will not be synchronized with changes at the source.

2) Volume. There are some 100 million websites on the Internet, each containing upwards of several megabytes of public information. The amount of data is staggering, and even the smartest caching scheme cannot account for the variation in usage patterns among users and the likelihood they will hit an uncached page. If you have a diverse set of users, it is unlikely the cache will have much effect on a given day.
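
The standard answer to the volume problem is a replacement policy that evicts cold entries when the store fills (Squid’s default, for example, is LRU, least recently used). A toy version shows both the mechanics and why a diverse user base defeats it:

    from collections import OrderedDict

    class LRUCache:
        """Toy LRU store: when full, evict the least recently used page."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.store = OrderedDict()

        def get(self, url):
            if url in self.store:
                self.store.move_to_end(url)   # mark as recently used
                return self.store[url]
            return None                       # miss: fetch over the WAN

        def put(self, url, page):
            self.store[url] = page
            self.store.move_to_end(url)
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # drop the coldest entry

    cache = LRUCache(capacity=2)
    cache.put("msnbc.com", "<news page>")
    cache.put("bonus-plan.pdf", "<pdf>")
    cache.get("msnbc.com")                 # hit: WAN link spared
    cache.put("rarely-read-blog", "<x>")   # diverse traffic evicts bonus-plan.pdf
    print(cache.get("bonus-plan.pdf"))     # None - a miss, back over the WAN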

Formal definition of Caching

Can your ISP support Video for all?


By Art Reisman, CTO, http://www.netequalizer.com


As the Internet continues to grow, with higher home user speeds available from Tier 1 providers, video sites such as YouTube, Netflix, and others are taking advantage of these fatter pipes. However, unlike the peer-to-peer traffic of several years ago (which seems to be abating), these videos don’t face the veil of copyright scrutiny cast upon p2p, which caused most p2p users to back off. They are here to stay, and any ISP currently offering high-speed Internet will need to accommodate the subsequent rising demand.

How should a Tier 2 or Tier 3 provider size their overall trunk to ensure smooth video at all times for all users?

From measurements done in our NetEqualizer laboratories, a normal-quality video stream requires around 350kbps of sustained bandwidth over its life span to ensure there are no breaks or interruptions. Newer high-definition videos may run at even higher speeds.


A typical rural wireless WISP will have contention ratios of about 300 users per 10-megabit link. This seems to be the ratio point where a small business can turn a profit. Given this contention ratio, if 30 customers simultaneously watch YouTube, the link will be exhausted and all 300 customers will experience protracted periods of poor service.

Even though it is theoretically possible to support 30 simultaneous streams on a 10-megabit link, it would only be possible if the remaining 270 subscribers were idle. In reality, the trunk will become saturated with perhaps 10 to 15 active video streams, as obviously the remaining users are not idle. Given this realistic scenario, is it reasonable for an ISP with 10 megabits and 300 subscribers to tout that they support video?

As of late 2007, about 10 percent of Internet traffic was attributed to video. It is safe to assume that number is higher now (January 2009). Using the 2007 number, 10 percent of 300 subscribers would yield on average 30 video streams, but that is not a fair number, because the 10 percent of people using video only applies to the subscribers who are actively online, not all 300. To be fair, we’ll assume 150 of 300 subscribers are online during peak times. The calculation then yields an estimated 15 users watching video at one time, which is right at our upper limit of smooth service for a 10-megabit link; any more and something has to give.
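
The estimate, step by step (the peak-time online fraction is the assumption stated above):

    subscribers = 300
    link_mbps = 10.0
    stream_mbps = 0.350      # ~350 kbps per normal-quality stream
    online_at_peak = 0.5     # assume 150 of 300 subscribers online
    video_share = 0.10       # ~10 percent of traffic was video in late 2007

    streams = subscribers * online_at_peak * video_share
    print(f"{streams:.0f} simultaneous streams, "
          f"{streams * stream_mbps:.2f} of {link_mbps} megabits")
    # -> 15 streams at ~5.25 megabits: the upper limit once the rest
    #    of the online users' traffic shares the same 10-megabit link.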

The moral of this story so far is that you should be cautious before promoting unlimited video support with contention ratios of 30 subscribers to 1 megabit. The good news is that most rural providers are not competing in metro areas, so customers will have to make do with what they have. In areas with more intense competition for customers, where video support might make a difference, our recommendation is a ratio closer to 20 subscribers to 1 megabit, and you still may have peak outages.

One trick you can use to support video with limited Internet resources

We have previously been on record as not being supporters of caching to increase Internet speed; well, it is time to backtrack on that. We are now seeing results showing that caching can be a big boost in speeding up popular YouTube videos. Caching and video tend to work well together, as consumers tend to flock to a small subset of popular videos. The downside is that your local caching server will only be able to archive a subset of the content on the master YouTube servers, but this should be enough to give the appearance of pretty good video service.

In the end, there is no substitute for having a big fat pipe with enough room to run video; we’ll just have to wait and see if the market can support this expense.

How Much YouTube Can the Internet Handle?


By Art Reisman, CTO, http://www.netequalizer.com 


As the Internet continues to grow and true speeds become higher, video sites like YouTube are taking advantage of these fatter pipes. However, unlike the peer-to-peer traffic of several years ago (which seems to be abating), YouTube videos don’t face the veil of copyright scrutiny cast upon p2p, which caused most users to back off.

In our experience, there are trade-offs associated with the advancements in technology that have come with YouTube. From measurements done in our NetEqualizer laboratories, a typical normal-quality YouTube video needs about 240kbps sustained over its 10-minute run time. The newer high-definition videos run at a rate at least twice that.

Many of the rural ISPs that we at NetEqualizer support with our bandwidth shaping and control equipment have contention ratios of about 300 users per 10-megabit link. This seems to be the ratio point where these small businesses can turn a profit. Given this contention ratio, if 40 customers simultaneously run YouTube, the link will be exhausted and all 300 customers will be wishing they had their dial-up back. At last check, YouTube traffic accounted for 10 percent of all Internet traffic. If left completely unregulated, a typical rural ISP could find itself on the brink of saturation from normal YouTube usage already. With tier-1 providers in major metro areas, there is usually more bandwidth, but with that comes higher expectations of service, and hence some saturation is inevitable.

If you believe there is a conspiracy, or that ISPs are not supposed to profit as they take risks and operate in a market economy, you are entitled to your opinion, but we are dealing with reality. And there will always be tension between users and their providers, much the same as there is with government funds and highway congestion.

The fact is, all ISPs have a fixed amount of bandwidth they can deliver, and when data flows exceed their current capacity, they are forced to implement some form of passive constraint. Without one, many networks would lock up completely. This is no different than a city restricting water usage when reservoirs are low. Water restrictions are well understood by the populace, and yet somehow bandwidth allocations and restrictions are perceived as evil. I believe this misconception is simply due to the fact that bandwidth is so dynamic; if there were a giant reservoir of bandwidth pooled up in the mountains where you could see this resource slowly become depleted, the problem could be more easily visualized.

The best compromise offered, and the only compromise that is not intrusive, is bandwidth rationing at peak hours when needed. Without rationing, a network will fall into gridlock, in which case not only do YouTube videos come to a halt, but so do e-mail, chat, VoIP, and other less intensive applications.

There is some good news: there are alternative ways to watch YouTube videos.

We noticed during our testing that YouTube attempts to play back video as a real-time feed, like watching live TV. When you go directly to YouTube to watch a video, the site and your PC immediately start the video, and the quality becomes dependent on having that 240kbps. If your provider’s speed dips below this level, your video will begin to stall, which is very annoying. However, if you are willing to wait a few seconds, there are tools out there that will play back YouTube videos for you in non-real time.

Buffering Tool 

They accomplish this by pre-buffering before the video starts playing. We have not reviewed any of these tools, so do your research. We suggest you Google “YouTube buffering tools” to see what is out there. Not only do these tools smooth out YouTube playback during peak times or on slower connections, but they also help balance the load on the network.

Bio: Art Reisman is a partner and co-founder of APconnections, a company that provides bandwidth control solutions (NetEqualizer) to ISPs, universities, libraries, mining camps, and any organization where groups of users must share their Internet resources equitably.

YouTube: The Unfunded Mandate


As some of you may know, I have chimed in several times on the debate over Internet access and the games ISPs play to block certain types of traffic (BitTorrent). I have leaned toward the side of Internet providers and defended some of their restrictive practices. I took quite a bit of heat for some of my previous positions. For example, this excerpt was posted in a discussion forum as a reply to an opinion piece I wrote recently for ExtremeTech magazine:

“So I was wondering why Extremetech would allow such blatant misinformation and FUD on their site…”

First off, please understand my point of reference before assuming I am an industry shill. I am an unbiased observer sitting on the sideline.

Secondly, you can villainize providers all you want, but they exist to make a profit. It is, after all, a business. And now they are facing a new threat with the explosion of YouTube and other video content. Here are some trends that we have seen.

Back in 2006, on a typical footprint of usage patterns on an ISP network, streams exceeding 200kbps (that is, 200 kilobits per second) averaged around 2 percent of the users at any one time. Almost all other streams were well under 50kbps. The 2006 ratio of big users to small users allowed a typical Internet provider to serve approximately 500 people on a 10-megabit circuit without any serious issues. Today we are seeing 10 to 15 percent of active streams exceeding 200kbps. That is roughly a 700 percent increase in the last two years, mostly attributed to increased online video, with YouTube leading the way.

The ramifications of YouTube and its impact on bandwidth demands are putting the squeeze on providers; like it or not, they have no choice but to implement some sort of quota system on bandwidth. Providers invested in networks of a certain size and capacity based on the older usage model and smaller increases over time, not 700 percent in two years. Some providers did build out higher capacities with the hope of reaping returns by supplying their own video content, but running other people’s video content without sharing the revenue was not planned for.

Was this lack of capacity a mistake, an evil, greed-driven conspiracy? No, it was just all providers could afford at the time. Video has always been out there, but several years ago it did not exist in any form of original content that made it compelling to watch from a public content site. I am not predicting Armageddon caused by overburdened Internet access; however, in the next few years you will see things get ugly, with finger-pointing and most likely Congress getting involved, obviously to saber-rattle and score brownie points with their constituents.

With all that said, we will do our best to stay net neutral and help everybody sort it out without playing sides.

See our recent article on net neutrality for more details.
