By Art Reisman, CTO, www.netequalizer.com

Art Reisman is a partner and co-founder of APconnections, a company that provides bandwidth control solutions (NetEqualizer) to ISPs, Universities, Libraries, Mining Camps, and any organization where groups of users must share their Internet resources equitably. What follows is an objective educational journey on how consumers and ISPs can live in harmony with the explosion of YouTube video.
The following is written primarily for the benefit of mid-to-small sized Internet service providers (ISPs). However, home consumers may also find the details interesting. Please follow along as I break down the business cost model of keeping up with growing video demand.
In the past few weeks, two factors have come up in conversations with our customers, which have encouraged me to investigate this subject further and outline the challenges here:
1) Many of our ISP customers are struggling to offer video at competitive levels during the day, and yet are being squeezed by high bandwidth costs. Many look to the NetEqualizer to alleviate video congestion problems. As you know, there are always trade-offs to be made in handling any congestion issue, which I will discuss at the end of this article. But back to the subject at hand. What I am seeing from customers is an underlying fear that they (IT administrators) are behind the curve. As I have an opinion on this, I decided I need to lay out what is “normal” in terms of contention ratios for video, as well as what is “practical” for video in today’s world.
2) My Internet service provider, a major player that heavily advertises how fast their speed is to the home, periodically slows down standard YouTube videos. I should be fair with my accusation: with the Internet, you can never be quite certain who is at fault. Whether I am being throttled or not, the point is that there is an ever-growing number of video content providers who are pushing ahead with plans that do not take into account, nor care about, a last-mile provider’s ability to handle the increased load. A good analogy would be a travel agency that books tourists onto a cruise ship without keeping a tally of tickets sold, nor caring, for that matter. When all those tourists show up to board the ship, some form of chaos will ensue (and some will not be able to get on the ship at all).
Some ISPs are also adding to this issue by building out infrastructure without regard to content demand and hoping for the best. They are in a tight spot, caught in a challenging balancing act between customers, profit, and their ability to actually deliver video at peak times.
The Business Cost Model of an ISP trying to accommodate video demands
Almost all ISPs rely on the fact that not all customers will pull their full allotment of bandwidth all the time. Hence, they can map out an appropriate subscriber ratio for their network, and also advertise bandwidth rates sufficient to handle video. There are four main governing factors on how fast an actual consumer circuit will be:
1) The physical speed of the medium to the customer’s front door (this is often the speed cited by the ISP)
2) The combined load of all customers sharing their local circuit and the local circuit’s capacity (subscriber ratio factors in here)
3) How much bandwidth the ISP contracts out to the Internet (from the ISP’s provider)
4) The speed at which the source of the content (YouTube’s servers) can serve it. We’ll assume this is not a source of contention for the examples below, but it certainly should remain a suspect in any finger-pointing over a slow circuit.
The actual limit to the amount of bandwidth a customer gets at one time, which dictates whether they can run a live streaming video, usually depends on how oversold their ISP is (based on the “subscriber ratio” mentioned in points 1 and 2 above). If your ISP can predict the peak loads on their entire circuit correctly, and purchase enough bulk bandwidth to meet that demand (point 3 above), then customers should be able to run live streaming video without interruption.
The problem arises when providers put together a static set of assumptions that break down as consumer appetite for video grows faster than expected. The numbers below typify the trade-offs a mid-sized provider is playing with in order to make a profit, while still providing enough bandwidth to meet customer expectations.
1) In major metropolitan areas, as of 2010, bandwidth can be purchased in bulk for about $3000 per 50 megabits. Some localities pay less, some more.
2) ISPs must cover amortized fixed costs per customer: billing, sales staff, support staff, customer premises equipment, interest on investment, and licensing, which come to about $35 per month per customer.
3) We assume market competition fixes the price at about $45 per month for a residential Internet customer.
4) This leaves $10 per month for profit margin and bandwidth fees. We assume an even split: $5 a month per customer for profit, and $5 per month per customer to cover bandwidth fees.
With 50 megabits at $3000, and each customer contributing $5 per month, you must share the 50 megabit pipe amongst 600 customers to be viable as a business. This is the governing factor on how much bandwidth is available to all customers for all uses, including video.
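For readers who like to see the arithmetic spelled out, here is a minimal sketch of the calculation above. The figures are the article’s assumptions, not measured data:

# Back-of-the-envelope sketch of the subscriber-ratio math above.
# All figures are the assumptions from this article, not quotes or measurements.

bulk_cost_per_month = 3000.0          # USD for a 50 megabit bulk circuit
bulk_capacity_mbps = 50.0             # megabits per second purchased
bandwidth_budget_per_customer = 5.0   # USD per customer per month left for bandwidth

# How many customers must share the pipe for bandwidth fees to cover its cost?
customers_per_pipe = bulk_cost_per_month / bandwidth_budget_per_customer
print(customers_per_pipe)             # 600.0 customers on one 50 Mbps circuit

# Average bandwidth per customer if everyone pulled data at once
avg_mbps_per_customer = bulk_capacity_mbps / customers_per_pipe
print(round(avg_mbps_per_customer, 3))  # ~0.083 Mbps, hence the reliance on oversubscription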
So how many simultaneous YouTube Videos can be supported given the scenario above?
Live streaming YouTube video needs on average about 750 kbps, or about 3/4 of a megabit, in order to run without breaking up.
On a 50 megabit shared link provided by an ISP, in theory you could support about 70 simultaneous YouTube sessions, assuming nothing else is running on the network. In the real world there would always be background traffic other than YouTube.
In reality, you are always going to have a minimum fixed load of Internet usage from 600 customers of approximately 10-to-20 megabits. That 10-to-20 megabit load supports everything else: web surfing, downloads, Skype calls, etc. So realistically you can support about 40 YouTube sessions at one time. What this implies is that if 10 percent of your customers (60 customers) start to watch YouTube at the same time, you will need more bandwidth, or you are going to get some complaints. ISPs that desperately want to support video must count on no more than about 40 simultaneous videos running at one time, or a little less than 10 percent of their customers.
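A quick sketch of that capacity math, again using only the figures assumed above (50 megabit link, 750 kbps per stream, 10-to-20 megabits of baseline load):

# How many 750 kbps streams fit on the shared link, per the assumptions above?
link_mbps = 50.0
stream_mbps = 0.75                      # ~750 kbps per live YouTube stream
low_background, high_background = 10.0, 20.0   # baseline non-video load in Mbps

theoretical_streams = link_mbps / stream_mbps   # ~66, "about 70" with rounding
realistic_range = ((link_mbps - high_background) / stream_mbps,
                   (link_mbps - low_background) / stream_mbps)

print(int(theoretical_streams))   # 66
print(realistic_range)            # (40.0, ~53.3) -> plan around the conservative end, ~40 streams
# Roughly 40 simultaneous videos out of 600 customers is a little under 10 percent.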
Based on the scenario above, if 40 customers simultaneously run YouTube, the link will be exhausted and all 600 customers will be wishing they had their dial-up back. At last check, YouTube traffic accounted for 10 percent of all Internet traffic. If left completely unregulated, a typical rural ISP could find itself on the brink of saturation from normal YouTube usage already. Tier-1 providers in major metro areas usually have more bandwidth, but with that comes higher expectations of service, and hence some saturation is inevitable.
This is why we believe that Video is currently an “unfunded mandate”. Based on a reasonable business cost model, as we have put forth above, an ISP cannot afford to size their network to have even 10% of their customers running real-time streaming video at the same time. Obviously, as bandwidth costs decrease, this will help the economic model somewhat.
However, if you still want to tune for video on your network, consider the options below…
NetEqualizer and Trade-offs to allow video
If you are not a current NetEqualizer user, please feel free to call our engineering team for more background. Here is my short answer on “how to allow video on your network” for current NetEqualizer users:
1) You can determine the IP address ranges for popular sites and give them priority via setting up a “priority host”.
This is not recommended for customers with 50 megs or less, as generally this may push you over into a gridlock situation.
2) You can raise your HOGMIN to 50,000 bytes per second.
This will generally let in the lower-resolution video sites. However, they may still incur penalties should they start buffering at a rate higher than 50,000 bytes per second. Again, we would not recommend this change for customers with pipes of 50 megabits or less.
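To put that 50,000 bytes-per-second threshold in context against the streaming figures discussed earlier, here is a small unit-conversion sketch. This is my own arithmetic, not a setting recommendation:

# Express the HOGMIN threshold (bytes/second) in kilobits/second and compare it
# with the ~750 kbps figure used earlier for a standard live YouTube stream.
hogmin_bytes_per_sec = 50_000
hogmin_kbps = hogmin_bytes_per_sec * 8 / 1000   # 400 kbps

youtube_stream_kbps = 750
print(hogmin_kbps, youtube_stream_kbps)
# A 400 kbps threshold admits lower-resolution streams, but it is still below the
# ~750 kbps a standard stream needs, so higher-rate buffering can still draw penalties.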
With either of the above changes you run the risk of crowding out web surfing and other interactive uses, as we have described above. You can only balance so much video before you run out of room. Please remember that the default settings on the NetEqualizer are designed to slow video before the entire network comes to a halt.
For more information, you can refer to another of Art’s articles on the subject of Video and the Internet: How much YouTube can the Internet Handle?
Cloud Computing – Do You Have Enough Bandwidth? And a Few Other Things to Consider
December 10, 2011 — netequalizer
The following is a list of things to consider when using a cloud-computing model.
Bandwidth: Is your link fast enough to support cloud computing?
We get asked this question all the time: What is the best-practice standard for bandwidth allocation?
Well, the answer depends on what you are computing.
– First, there is the application itself. Is your application dynamically loading up modules every time you click on a new screen? If the application is designed correctly, it will be lightweight and come up quickly in your browser. Flash video screens certainly spruce up the experience, but I hate waiting for them. Make sure when you go to a cloud model that your application is adapted for limited bandwidth.
– Second, what type of transactions are you running? Are you running videos and large graphics or just data? Are you doing photo processing from Kodak? If so, you are not typical, and moving images up and down your link will be your constraining factor.
– Third, are you sharing general Internet access with your cloud link? In other words, is that guy on his lunch break watching a replay of royal wedding bloopers on YouTube interfering with your salesforce.com access?
The good news is (assuming you will be running a transactional cloud computing environment – e.g. accounting, sales database, basic email, attendance, medical records – without video clips or large data files), you most likely will not need additional Internet bandwidth. Obviously, we assume your business has reasonable Internet response times prior to transitioning to a cloud application.
Factoid: Typically, for a business in an urban area, we would expect about 10 megabits of bandwidth for every 100 employees. If you fall below this ratio, 10/100, you can still take advantage of cloud computing but you may need some form of QoS device to prevent the recreational or non-essential Internet access from interfering with your cloud applications. See our article on contention ratio for more information.
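The 10/100 rule of thumb is easy to turn into a quick check. Here is a minimal sketch; the office sizes and link speeds in the example are illustrative assumptions, only the 10 Mbps per 100 employees ratio comes from the paragraph above:

# Does a link meet the 10 Mbps per 100 employees rule of thumb discussed above?
def meets_rule_of_thumb(link_mbps: float, employees: int) -> bool:
    """Return True if the link provides at least 10 Mbps per 100 employees."""
    required_mbps = employees * 10.0 / 100.0
    return link_mbps >= required_mbps

# Example: a 250-person office on a 20 Mbps link falls short (it needs 25 Mbps),
# so a QoS device may be needed to keep recreational traffic off the cloud apps.
print(meets_rule_of_thumb(20.0, 250))   # False
print(meets_rule_of_thumb(30.0, 250))   # True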
Security: Can you trust your data in the cloud?
For the most part, chances are your cloud partner will have much better resources to deal with security than your enterprise, as this should be a primary function of their business. They should have an economy of scale – whereas most companies view security as a cost and are always juggling those costs against profits, cloud-computing providers will view security as an asset and invest more heavily.
We addressed security in detail in our article, “How Secure is the Cloud,” but here are some of the main points to consider:
1) Transit security: moving data to and from your cloud provider. How are you going to make sure this is secure?
2) Storage: handling of your data at your cloud provider. Is it secure from an outside hacker once it gets there?
3) Inside job: this is often overlooked, but can be a huge security risk. Who has access to your data within the provider network?
Evaluating security when choosing your provider.
You would assume the cloud company, whether it be Apple or Google (Gmail, Google Calendar), uses best practices to ensure security. My fear is that ultimately some major cloud provider will fail miserably, just like banks and brokerage firms. Over time, one or more of them will become complacent. Here is my checklist of what I would want in a trusted cloud computing partner:
1) Do they have redundancy in their facilities and their access?
2) Do they screen their employees for criminal records and drug usage?
3) Are they willing to let you, or a truly independent auditor, into their facility?
4) How often do they back-up data and how do they test recovery?
Big Brother is watching.
This is not so much a traditional security threat, but if you are using a free service you are likely going to agree, somewhere in their fine print, to expose some of your information for marketing purposes. Ever wonder how those targeted ads appear that are relevant to the content of the mail you are reading?
Link reliability.
What happens if your link goes down, or your provider's link goes down? How dependent are you? Make sure your business or application can handle unexpected downtime.
Editor's note: unless otherwise stated, these tips assume you are using a third-party provider for resources and applications, and are not a large enterprise with a centralized service on your Internet. For example, using QuickBooks over the Internet would be considered a cloud application (and one that I use extensively in our business); however, centralizing Microsoft Excel on a corporate server with thin terminal clients would not be cloud computing.