Canadians request comments on traffic shaping practices

Art Reisman CTO

I am not sure if this is open to Canadians only, but the CRTC (the Canadian equivalent of the FCC) has set up a site for comments regarding its policies on Internet traffic shaping. The site is open from now until April 30th and can be found at

So if you get the chance chime in and give them your thoughts.

For the fun of it (see below), I grabbed a few of the existing comments truly at random. After reading them, it is funny how the consumer sentiments so far are in total agreement with what we at NetEqualizer have been proselytizing, which is: “Traffic management is fine as long as there is full disclosure of policies.” Nobody wants to pump gas without knowing the grade and the price, and the same goes for their Internet service.


“Any traffic management practices deviating from complete network neutrality, that is to say, any practices that single out one protocol over another, should certainly be disclosed to the user in the service agreement. To disclose anything less would be consumer fraud.”

“Traffic management has a real impact on the product that a consumer is paying for. All ISPs are not created equal and consumers aren’t in a position to analyze the complexities of network management and the possible impacts on their usage.”

“All traffic shaping practices should be disclosed, in plain English, online and as a part of the terms of service.”

“I agree with the other posters thus far — if ISPs are allowed to get away with uncompetitive throttling of Internet traffic, those techniques and the effect on the customer should be fully disclosed in plain versions of both official languages.”

“Any new communication technologies can be thwarted if ISPs deem them to be competitive with any of their services, stifling innovation. Even the CBC has used BitTorrent to distribute programming, and..”

When is Deep Packet Inspection a Good Thing?


Update September 2011

Seems some shareholders of a company that over-promised Layer 7 technology are not happy.

By Eli Riles

As many of our customers are aware, we publicly stated back in October 2008 that we had officially switched all of our bandwidth control solutions over to behavior-based shaping. Consequently, we also completely disavowed Deep Packet Inspection, in a move that Ars Technica described as “vendor throws deep packet inspection under the bus.”

In the last few weeks, there has been a barrage of attacks on Deep Packet Inspection, and then a volley of PR supporting it from those implementing the practice.

I had been sitting on an action item to write something in defense of DPI, and then this morning I came across a pro-DPI blog post in the New York Times. The following excerpt is in reference to using DPI to give priority to certain types of traffic such as gaming:

“Some customers will value what they see as low priority as high priority,” he said. I asked Mr. Scott what he thought about the approach of Plusnet, which lets consumers pay more if they want higher priority given to their game traffic and downloads. Surprisingly, he had no complaints.

“If you said to me, the consumer, ‘You can choose what applications to prioritize and which to deprioritize, and, oh, by the way, prices will change as a result of how you do this,’ I don’t have a problem with that,” he said.

The key to this excerpt is the phrase, “IF YOU ASK THE CONSUMER WHAT THEY WANT.” This implies permission. If you use DPI as an opt-in, above-board technology, then obviously there is nothing wrong with it. The threat to privacy is only an issue if you use DPI without consumer knowledge. It should not be up to the provider to decide the appropriate use of DPI, regardless of good intent.

The quickest way to deflate the objections of the DPI opposition is to allow consumers to choose. If you subscribe to a provider that gives you higher priority for certain applications, and it is in their literature, then by proxy you have granted permission to monitor your traffic. I can still see Net Neutrality purists being unhappy with any differential service, but realistically I think there is a middle ground.

I read an article the other day where a defender of DPI practices (sorry no reference) pointed out how spam filtering is widely accepted and must use DPI techniques to be effective. The part the defender again failed to highlight was that most spam filtering is done as an opt-in with permission. For example, the last time I checked my Gmail account, it gave the option to turn the spam filter off.

In sum, we are fully in support of DPI technology when the customer is made aware of its use and has a choice to opt out. However, any use of DPI done unknowingly and behind the scenes is bound to create controversy and may even be illegal. The exception would be a court order for a legal wiretap. Therefore, the Deep Packet Inspection debate isn’t necessarily a black and white case of two mutually exclusive extremes of right and wrong. If done candidly, DPI can be beneficial to both the Internet user and provider.

See also: what is deep packet inspection.

Eli Riles, a consultant for APconnections (Netequalizer), is a retired insurance agent from New York. He is a self-taught expert in network infrastructure. He spends half the year traveling and visiting remote corners of the earth. The other half of the year you’ll find him in his computer labs testing and tinkering with the latest network technology.

For questions or comments, please contact him at

Is Your ISP Throttling Your Bandwidth?

Editor’s Note: With all the recent media coverage about ISPs giving preferential treatment to VoIP, and the controversy over Net Neutrality, we thought it might be interesting to revisit this original article Art published in PC Magazine back in 2007.

Update August 2010: The FCC is not being fooled anymore.

Analysis: The White Lies ISPs Tell About Broadband Speeds

By Art Reisman, CTO, APconnections

In a recent PC Magazine article, writer Jeremy Kaplan did a fantastic job of exposing the true Internet access speeds of the large consumer providers.

He did this by creating a speed test that measured the throughput of continuous access to popular Web sites like Google, Expedia, and many others. Until this report was published, the common metric for comparing ISPs was through the use of the numerous Internet speed test sites available online.

The problem with this validation method was that it could not simulate real speeds encountered when doing typical Web surfing and downloading operations. Plus, ISPs can tamper with the results of speed tests — more on this later.

When I saw the results of PC Magazine’s testing, I was a bit relieved to see that the actual speeds of large providers were somewhere between 150 Kbit/s and 200 Kbit/s. This is a far cry from the two-, three- or even four-megabit download speeds frequently hyped in ISP marketing literature.

These slower results were more in line with what I have experienced from my home connection, even though online Internet speed tests always show results close to, if not right on, the advertised three megabits per second. There are many factors that dictate your actual Internet speed, and there are also quite a few tricks that can be used to create the illusion of a faster connection.

Before I continue, I should confess that I make my living by helping ISPs stretch their bandwidth among their users. In doing this, I always encourage all parties to be honest with their customers, and in most cases providers are. If you read the fine print in your service contract, you will see disclaimers stating that “actual Internet speeds may vary”, or something to that effect. Such disclaimers are not an attempt to deceive, but rather a simple reflection of reality.

Guaranteeing a fixed-rate speed to any location on the Internet is not possible, nor was the Internet ever meant to be such a conduit. It has always been a best-effort mechanism. I must also confess that I generally only work with smaller ISPs. The larger companies have their own internal network staff, and hence I have no specific knowledge of how they deal with oversold conditions, if they deliberately oversell, and, if so, by how much. Common business sense leads me to believe they must oversell to some extent in order to be profitable. But, again, this isn’t something I can prove.

Editor’s update, September 2009: Since this article was written, many larger providers have come clean.

A Matter of Expectations

How would you feel if you pumped a gallon of gas only to find out that the service station’s meter was off by 10 percent in its favor? Obviously you would want the owners exposed immediately and demand a refund, and possibly even lodge a criminal complaint against the station. So, why does the consumer tolerate such shenanigans with their ISP?

Put simply, it’s a matter of expectations.

ISPs know that new and existing customers are largely comparing their Internet-speed experiences to dial-up connections, which often barely sustain 28 Kbit/s. So, even at 150 Kbit/s, customers are getting a more than five-fold increase in speed, which is like the difference between flying in a jet and driving your car. With the baseline established by dial-up being so slow, most ISPs really don’t need to deliver a true sustained three megabits to be successful.

As a consumer, reliable information is the key to making good decisions in the marketplace. Below are some important questions you may want to ask your provider about their connection speeds. It is unlikely the sales rep will know the answers, or even have access to them, but perhaps over time, with some insistence, details will be made available.

Five Questions to Ask Your ISP

1.) What is the contention ratio in my neighborhood?

At the core of all Internet service is a balancing act between the number of people who are sharing a resource and how much of that resource is available.

For example, a typical provider starts out with a big pipe of Internet access that is shared via exchange points with other large providers. They then subdivide this access out to their customers in ever smaller chunks — perhaps starting with a gigabit exchange point and then narrowing down to a 10 megabit local pipe that is shared with customers across a subdivision or area of town.

The speed you, the customer, can attain is limited to how many people might be sharing that 10 megabit local pipe at any one time. If you are promised one megabit service, it is likely that your provider would have you share your trunk with more than 10 subscribers and take advantage of the natural usage behavior, which assumes that not all users are active at one time.

The exact contention ratio will vary widely from area to area, but from experience, your provider will want to maximize the number of subscribers who can share the pipe, while minimizing service complaints due to a slow network. In some cases, I have seen as many as 1,000 subscribers sharing 10 megabits. This is a bit extreme, but even with a ratio as high as this, subscribers will average much faster speeds when compared to dial-up.
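To put some rough numbers on this balancing act, here is a small Python sketch of the contention-ratio arithmetic. The figures are illustrative only; real traffic is bursty and no provider divides a pipe this evenly:

```python
def effective_speed_kbps(pipe_kbps, subscribers, active_fraction):
    """Rough per-user speed on a shared pipe during a busy period.

    Assumes the pipe is split evenly among whichever subscribers are
    active at the same moment -- a simplification, since real usage
    is bursty and schedulers are rarely perfectly fair.
    """
    active_users = max(1, round(subscribers * active_fraction))
    return pipe_kbps / active_users

# A 10 megabit local pipe shared by 1,000 subscribers, with 5% of
# them active at once, still beats a 28 Kbit/s dial-up line:
print(effective_speed_kbps(10_000, 1000, 0.05))  # 200.0 Kbit/s
```

Even the extreme 1,000-subscriber case works out to dial-up-beating speeds, as long as only a small fraction of users are active at once. That assumption is exactly what the contention ratio bets on.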

2.) Does your ISP’s exchange point with other providers get saturated?

Even if your neighborhood link remains clear, your provider’s connection can become saturated at its exchange point. The Internet is made up of different provider networks and backbones. If you send an e-mail to a friend who receives service from a company other than your provider, then your ISP must send that data on to another network at an exchange point. The speed of an exchange point is not infinite, but is dictated by the type of switching equipment. If the exchange point traffic exceeds the capacity of the switch or receiving carrier, then traffic will slow.

3.) Does your provider give preferential treatment to speed test sites?

As we alluded to earlier, it is possible for an ISP to give preferential treatment to individual speed test sites. Providers have all sorts of tools at their disposal to allow and disallow certain kinds of traffic. It seems rather odd to me that in the previously cited PC Magazine test, which used highly recognized Web sites, the speed results were consistently well under advertised connection speeds. One explanation for this is that providers give full speed only when going to common speed test Web sites.

4.) Are file-sharing queries confined to your provider network?

Another common tactic to save resources at the exchange points of a provider is to re-route file-sharing requests to stay within their network. For example, if you were using a common file-sharing application such as BitTorrent, and you were looking for some non-copyrighted material, it would be in your best interest to contact resources all over the world to ensure the fastest download.

However, if your provider can keep you on their network, they can avoid clogging their exchange points. Since companies keep tabs on how much traffic they exchange in a balance sheet, making up for surpluses with cash, it is in their interest to keep traffic confined to their network, if possible.

5.) Does your provider perform any usage-based throttling?

The ability to increase bandwidth for a short period of time and then slow you down if you persist at downloading is another trick ISPs can use. Sometimes they call this burst speed, which can mean speeds being increased up to five megabits, and they make this sort of behavior look like a consumer benefit. Perhaps Internet usage will seem a bit faster, but it is really a marketing tool that allows ISPs to advertise higher connection speeds – even though these speeds can be sporadic and short-lived.

For example, you may only be able to attain five megabits at 12:00 a.m. on Tuesdays, or some other random unknown times. Your provider is likely just letting users have access to higher speeds at times of low usage. On the other hand, during busier times of day, it is rare that these higher speeds will be available.
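To make the mechanics concrete, here is a toy model of usage-based throttling in Python. It is a guess at the general shape of such schemes (a sliding-window quota that switches between a burst rate and a sustained rate); the parameter names and numbers are invented for illustration, not taken from any ISP:

```python
from collections import deque

class BurstThrottle:
    """Toy model of usage-based throttling. A user enjoys the burst
    rate until their usage over a sliding window exceeds a quota;
    then they are knocked down to the sustained rate until the
    window rolls past the heavy usage."""

    def __init__(self, burst_kbps, sustained_kbps, quota_kb, window_s):
        self.burst_kbps = burst_kbps
        self.sustained_kbps = sustained_kbps
        self.quota_kb = quota_kb
        self.window_s = window_s
        self.samples = deque()  # (timestamp, kilobytes transferred)

    def record(self, now, kb):
        """Log a transfer and drop samples older than the window."""
        self.samples.append((now, kb))
        while self.samples and self.samples[0][0] <= now - self.window_s:
            self.samples.popleft()

    def allowed_rate(self, now):
        """Burst speed while under quota; sustained speed once over."""
        recent_kb = sum(kb for t, kb in self.samples if t > now - self.window_s)
        return self.sustained_kbps if recent_kb > self.quota_kb else self.burst_kbps

# Light usage bursts at full speed; a persistent download gets knocked down.
t = BurstThrottle(burst_kbps=5000, sustained_kbps=1500, quota_kb=10_000, window_s=60)
t.record(0, 500)
print(t.allowed_rate(1))   # 5000 (under quota: burst speed)
t.record(2, 20_000)
print(t.allowed_rate(3))   # 1500 (quota exceeded: throttled)
```

The point of the sketch: whether you ever see the advertised burst rate depends entirely on the quota and window your provider picked, which is exactly the fine print worth asking about.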

In writing this article, my intention was not to create a conspiracy theory about unscrupulous providers. Any market with two or more choices ensures that the consumer will benefit. Before you ask for a Congressional investigation, keep in mind that ISPs’ marketing tactics are no different from those of other industries, meaning they will generally cite best-case scenarios when promoting their products. Federal regulation would only thwart the very spirit of the Internet, which, as said before, has always been a best-effort infrastructure.

But, with the information above, it is your job as a consumer to comparison shop and seek answers. Your choices are what drive the market and asking questions such as these are what will point ISPs in the right direction.

Since we first published this article, Google and others have been trying to educate consumers on Net Neutrality. There is now a consortium called M-Lab which has put together a sophisticated speed test site designed to give specific details on what your ISP is doing to your connection. See the article below for more information.

Related article: Ten things your internet provider does not want you to know.

Created by APconnections, the NetEqualizer is a plug-and-play bandwidth control and WAN/Internet optimization appliance that is flexible and scalable. When the network is congested, NetEqualizer’s unique “behavior shaping” technology dynamically and automatically gives priority to latency sensitive applications, such as VoIP and email. Click here for a full price list.

New Speed Test Tools from M-Lab Expose ISP Bandwidth Throttling Practices

In a recent article, we wrote about “The White Lies ISPs Tell About Their Bandwidth Speeds.” We even hinted at how they (your ISP) might be inclined to give preferential treatment to normal speed test sites. Well, now there is a speed test site from M-Lab that goes beyond simple speed tests. M-Lab gives the consumer sophisticated results and exposes any tricks your ISP might be up to.

Features provided include:

  • Network Diagnostic Tool – Test your connection speed and receive sophisticated diagnosis of problems limiting speed.
  • Glasnost – Test whether BitTorrent is being blocked or throttled.
  • Network Path and Application Diagnosis – Diagnose common problems that impact last-mile broadband networks.
  • DiffProbe (coming soon) – Determine whether an ISP is giving some traffic a lower priority than other traffic.
  • NANO (coming soon) – Determine whether an ISP is degrading the performance of a certain subset of users, applications, or destinations.

Click here to learn more about M-Lab.

Related article on how to determine your true video speed over the Internet.

Net Neutrality Defined: Barack Obama Is on the Bandwagon

By Art Reisman, CTO, APconnections

There continues to be a flurry of Net Neutrality articles published, and according to one, Barack Obama is a big supporter of Net Neutrality. Of course, that was a fleeting campaign soundbite that the media picked up without much context.

I was relieved to see that finally a political entity has put a definition on Net Neutrality.

From the government of Norway we get:

“The new rules lay out three guidelines. First, Internet users must be given complete and accurate information about the service they are buying, including capacity and quality. Second, users are allowed to send and receive content of their choice, use services and applications of their choice, and connect any hardware and software that doesn’t harm the network. Finally, the connection cannot be discriminated against based on application, service, content, sender, or receiver.”

Full Article: Norway gets net neutrality—voluntary, but broadly supported

I could not agree more. Note that this definition does not rule out some form of fair bandwidth shaping, and that is an important distinction, because the Internet will be reduced to gridlock without some traffic control.

The funniest piece of irony in this whole debate is that the larger service providers are warning of Armageddon without some form of fairness rules (and I happen to agree), while at the same time their marketing arms are creating an image of infinite, unfettered access for $29 a month. (I omitted a reference link because they change daily.)

More Resistance for Deep Packet Inspection

Editors note:

We come across stories from irate user groups every day. It seems the more the public knows about deep packet inspection practices, the less likely they are to be tolerated. In Canada, it looks like the resistance is getting some heavy hitters.

Google, Amazon, others want CRTC to ban internet interference

Last Updated: Tuesday, February 24, 2009 | 4:53 PM ET

A coalition of more than 70 technology companies, including internet search leader Google, online retailer Amazon and voice over internet provider Skype, is calling on the CRTC to ban internet service providers from “traffic shaping,” or using technology that favours some applications over others.

In a submission filed Monday to the Canada Radio-television and Telecommunications Commission (CRTC) in advance of a July probe into the issue of internet traffic management, the Open Internet Coalition said traffic shaping network management “discourages investment in broadband networks, diminishes consumer choice, interferes with users’ freedom of expression, and inhibits innovation.”

Full Article

More on Deep Packet Inspection and the NebuAd case

By Art Reisman

CTO of APconnections, makers of the plug-and-play bandwidth control and traffic shaping appliance NetEqualizer


Editors note:

This latest article, published in DSL Reports, reminds me of the story where a bunch of friends (not me) are smoking a joint in a car when the police pull them over, and the guy holding the joint takes the fall for everybody. I don’t want to see any of these ISPs get hammered, as I am sure they are good companies.

It seems like this case should be easily settled. Even if privacy laws were violated, the damage was perhaps a few unwanted ads that popped up in a browser, not some form of extortion of private records. In any case, the message should be clear to any ISP: to be safe, don’t implement DPI of any kind. And yet, for every NebuAd privacy lawsuit article I come across, I see at least two or three press releases from vendors announcing major deals for DPI equipment.

Full original article link from DSL Reports

ISPs Play Dumb In NebuAD Lawsuit
Claim they were ‘passive participants’ in user data sales…
08:54AM Thursday Feb 05 2009 by Karl Bode

The broadband providers argue that they can’t be sued for violating federal or state privacy laws if they didn’t intercept any subscriber traffic. In court papers filed late last week, they argue that NebuAd alone allegedly intercepted traffic, while they were merely passive participants in the plan.

By “passive participants,” they mean they took (or planned to take) money from NebuAd in exchange for allowing NebuAd to place deep packet inspection hardware on their networks. That hardware collected all browsing activity for all users, including what pages were visited and how long each user stayed there. It’s true many of the carriers were rather passive in failing to inform customers these trials were occurring; several simply tried to slip this through fine print in their terms of service or acceptable use policies.

Four Reasons Why Peer-to-Peer File Sharing Is Declining in 2009

By Art Reisman

CTO of APconnections, makers of the plug-and-play bandwidth control and traffic shaping appliance NetEqualizer


I recently returned from a regional NetEqualizer tech seminar with attendees from Western Michigan University, Eastern Michigan University and a few regional ISPs. While having a live look at Eastern Michigan’s p2p footprint, I remarked that it was way down from what we had been seeing in 2007 and 2008. The consensus from everybody in the room was that p2p usage is waning. Obviously this is not a broad data set to draw a conclusion from, but we have seen the same trend at many of our customer installs (3 or 4 a week), so I don’t think it is a fluke. It is kind of ironic, with all the controversy around Net Neutrality and BitTorrent blocking, that the problem seems to be taking care of itself.

So, what are the reasons behind the decline? In our opinion, there are several reasons:

1) Legal iTunes and other MP3 downloads are the norm now. They are reasonably priced and well marketed. These downloads still take up bandwidth on the network, but do not clog access points with connections like torrents do.

2) Most music aficionados are well stocked with the classics (bootleg or not) by now and are only grabbing new tracks legally as they come out. The days of downloading an entire collection of music at once seem to be over. Fans have their foundation of digital music and are simply adding to it rather than building it up from nothing as they were several years ago.

3) The RIAA enforcement got its message out there. This, coupled with reason #1 above, pushed users to go legal.

4) Legal, free and unlimited. YouTube videos are more fun than slow music downloads and they’re free and legal. Plus, with the popularity of YouTube, more and more television networks have caught on and are putting their programs online.

Despite the decrease in p2p file sharing, ISPs are still experiencing more pressure on their networks than ever from Internet congestion. YouTube and Netflix are more than capable of filling the void left by waning torrents. So, don’t expect the controversy over traffic shaping and the use of bandwidth controllers to go away just yet.

Cox Shaping Policies Similar to NetEqualizer

Editor’s Note: Cox today announced a bandwidth management policy similar to NetEqualizer, but with a twist. It seems they are only delaying p2p during times of congestion (similar to NetEqualizer), but in order to specifically determine that traffic is p2p, they are possibly employing some form of Deep Packet Inspection (not similar to NetEqualizer, which is traffic-type agnostic). If anybody has inside knowledge, we would appreciate comments here, and we will make corrections to our assertion if needed.

As this all plays out, it will be interesting to see how they differentiate p2p from video and whether they are actually doing Deep Packet Inspection. Also, if DPI is part of the Cox strategy, how will this sit with the FCC, which clearly strong-armed Comcast into stopping its use of DPI?

Cox Will Shape Its Broadband Traffic; Delay P2P & FTP Transfers

Om Malik | Tuesday, January 27, 2009

Cox Communications, the third largest cable company and broadband service provider, is joining Comcast in traffic shaping, delaying traffic it thinks is not time sensitive. They call it congestion management, making it seem like an innocuous practice, though in reality it is anything but innocuous. Chalk this up as yet another incumbent behaving badly!

To be fair, Cox had made it pretty clear in the past that it was going to play god with traffic flowing through its pipes. Next month, it will start testing a new method of managing traffic on its network in Kansas and Arkansas. Cox, outlining the congestion management policy on its website, notes:

“…automatically ensures that all time-sensitive Internet traffic — such as web pages, voice calls, streaming videos and gaming — moves without delay. Less time-sensitive traffic, such as file uploads, peer-to-peer and Usenet newsgroups, may be delayed momentarily — but only when the local network is congested.”

Full article

Is Barack Obama going to turn the tide toward Net Neutrality?

Network World Canada discusses some interesting scenarios about possible policy changes with the new administration.

In the article, the author (Howard Solomon) specifically cites Obama’s leaning:

Meanwhile, the new President favours net neutrality, the principle that Internet service providers (ISPs) shouldn’t interfere with content traveling online, which could hurt Sandvine, a builder of deep packet inspection appliances for ISPs. At least one Senator is expected to introduce limiting legislation this month.

Will this help NetEqualizer sales and our support for behavior-based Net Neutral policy shaping?

According to Eli Riles, vice president of sales at APconnections: “I don’t think it will change things much. We are already seeing steady growth, and I don’t expect a rush to purchase our equipment due to a government policy change. We sell mostly to Tier 2 and Tier 3 providers, who have already generally stopped purchasing Layer 7 solutions, mostly due to the higher cost and less so due to moral high ground or government mandate.”

related article

Stay tuned…

Comcast fairness techniques comparison with NetEqualizer

Comcast is now rolling out the details of its new policy on traffic-shaping fairness as it moves away from its former Deep Packet Inspection approach.

For the complete Comcast article click here

Below we compare techniques with the NetEqualizer

Note: Feel free to comment if you feel we need to make any corrections in our comparison; our goal is to be as accurate as possible.

1) Both techniques slow users down if they exceed a bandwidth limit over a time period.

2) The Comcast bandwidth limit kicks in after 15 minutes and is based only on a customer’s usage over that time period; it is not based on the congestion on the overall network.

3) NetEqualizer bandwidth limits are based on the last 8 seconds of customer usage, but only kick in when the overall network is full (i.e., the aggregate bandwidth utilization of all users on the line has reached a critical level).

4) Comcast punishes offenders by cutting them back 50 percent for a minimum of 15 minutes.

5) NetEqualizer punishes offenders for just a few seconds and then lets them back to full strength. It hits the offending connection with a decrease ranging from 50 to 80 percent.

6) Comcast restricts all traffic to the user during the 15-minute penalty period.

7) NetEqualizer only punishes offending connections. For example, if you were running an FTP download and a streaming audio session, only the FTP download would be affected by the restriction.

In our opinion both methods are effective and fair.
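To illustrate the flavor of congestion-triggered, per-connection shaping, here is a minimal Python sketch. The trigger ratio and penalty factor are illustrative assumptions, not NetEqualizer’s or Comcast’s actual parameters:

```python
def shape(connections, link_capacity_kbps, trigger=0.85, penalty=0.5):
    """Sketch of congestion-triggered fairness: do nothing while the
    link has headroom; once aggregate utilization crosses the trigger
    ratio, cut back only the heaviest individual connection (not all
    of the user's traffic) by the penalty factor.

    connections maps a connection name to its current rate in Kbit/s.
    """
    total = sum(connections.values())
    if total <= trigger * link_capacity_kbps:
        return dict(connections)  # no congestion: leave everyone alone
    shaped = dict(connections)
    heaviest = max(shaped, key=shaped.get)
    shaped[heaviest] = shaped[heaviest] * (1 - penalty)
    return shaped

conns = {"ftp_download": 6000, "voip_call": 80, "web": 400}
# The link is congested (6480 > 0.85 * 7000), so only the FTP stream
# is cut back; the latency-sensitive VoIP call is untouched.
print(shape(conns, link_capacity_kbps=7000))
```

The design point this captures is the one that matters in the comparison above: the penalty is scoped to the offending connection and lifted as soon as congestion clears, rather than applied to all of a user’s traffic for a fixed window.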

FYI, NetEqualizer also has a quota system, which is used by a very small percentage of our customers. It is very similar to the Comcast 15-minute system, except that the time interval is measured in days.

Details on the NetEqualizer quota-based system can be found in the user guide, page 11.


Five Questions You Should Ask about Internet Speed and Bursting


By Art Reisman, CTO, APconnections

Editor’s Note: With consumers up in arms about net neutrality, they should also be asking their ISPs for some truth in advertising when it comes to their Internet speed and the specifics concerning how and when bursting occurs.

With all the talk of net neutrality and deep packet inspection, we thought it was time to revisit the illusion created by providers offering “burstable” Internet speeds.

What is a burstable Internet speed? Well, it’s a common trick used by providers that lets you temporarily enjoy their highest speed, but then, after a certain time period or after a bandwidth quota is reached, you automatically get knocked down to a slower speed.

Generally, your provider leaves the specifics of when this bursting takes place out of their standard literature. Instead, they will likely cite a best-case number when marketing their service. When bursting is mentioned, if ever, it is likely done in the fine print.

But, this doesn’t mean that there aren’t ways to hold your ISP accountable. Below are some questions that you should ask your Internet service provider to find out exactly what you are paying for.

  1. Is the speed advertised in their marketing literature available all the time, or is that a best-case speed (or burst) that you may or may not achieve on a regular basis?
  2. Do you get charged, penalized, or black-listed for using this higher speed?
  3. How long can you burst for? For example, is a burst one second, 10 seconds, or 10 hours at a time?
  4. Can you get exactly how this bursting feature works in writing?
  5. Can you trade in the bursting feature for a guaranteed sustained top speed that is always on and not considered bursting?

While we can’t promise that these questions will always elicit an upfront, honest and informed response, they’re a step in the right direction. For a more in-depth article on the subject and the business behind “bursting,” you should also check out Bursting Is for the Birds.

Canadians Mull over Privacy and Deep Packet Inspection

Editor’s note: Seems the Canadians are also finally forced to face the issue of deep packet inspection. I guess the cat is out of the bag in Canada? One troubling note in the article below is the author’s insinuation that the only way to control Internet bandwidth is through DPI.

Privacy Commissioner of Canada -

CRTC begins dialogue on traffic shaping

Posted on November 21st, 2008 by Daphne Guerrero

Yesterday, the CRTC rendered its decision on ISPs’ traffic-shaping practices. It announced that it was denying the Canadian Association of Internet Providers’ (CAIP) request that Bell Canada, which provides wholesale ADSL services to smaller ISPs across the country, cease the traffic-shaping practices it has adopted for its wholesale customers.

“Based on the evidence before us, we found that the measures employed by Bell Canada to manage its network were not discriminatory. Bell Canada applied the same traffic-shaping practices to wholesale customers as it did to its own retail customers,” said Konrad von Finckenstein, Q.C., Chairman of the CRTC.

Moreover, the CRTC recognized that traffic-shaping “raises a number of questions” for both end-users and ISPs and has decided to hold a public hearing next July to consider them.

Read the full article

Delusions of Net Neutrality

I saw this post this morning, and I thought it was fantastically well written and informative.

Delusions of Net Neutrality

A mathematics professor at the University of Minnesota, Andrew Odlyzko, has a pretty blistering critique of Internet Service Providers’ (ISPs) arguments against net neutrality and of their love of streaming over download. It’s worth a read of the abstract if nothing more – his paper, The delusions of net neutrality (caution, links to a PDF), destroys many a myth of the Internet and video. Having been to many a conference lately where the best minds in the room can only imagine the Internet making a better TV, I appreciate some astute analysis of the reality.

Odlyzko shows that ISPs and others are pushing for a world where the goals of the Internet are reduced to streaming movies in relatively walled environments, and that the costs of building a network capable of this demand that net neutrality be curtailed.

Full Article

YouTube: The Unfunded Mandate

As some of you may know, I have chimed in several times on the debate over Internet access and the games ISPs play to block certain types of traffic (such as BitTorrent).  I have leaned toward the side of Internet providers and defended some of their restrictive practices. I took quite a bit of heat for some of my previous positions. For example, this excerpt was posted in a discussion forum as a reply to an opinion piece I wrote recently for ExtremeTech magazine:

“So I was wondering why Extremetech would allow such blatant misinformation and FUD on their site…”

First off, please understand my point of reference before assuming I am an industry shill. I am an unbiased observer sitting on the sideline.

Secondly, you can villainize providers all you want, but they exist to make a profit. It is, after all, a business. And now they are facing a new threat with the explosion of YouTube and other video content. Here are some trends that we have seen.

Back in 2006, on a typical footprint of usage patterns on an ISP network, streams exceeding 200 kbps (that is, 200 kilobits of data per second) averaged around 2 percent of the users at any one time. Almost all other streams were well under 50 kbps. The 2006 ratio of big users to small users allowed a typical Internet provider to serve approximately 500 people on a 10-megabit circuit without any serious issues. Today we are seeing 10 to 15 percent of the active streams exceeding 200 kbps. That is about a 700 percent increase in the last two years, mostly attributable to increased online video, with YouTube leading the way.
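The figures above amount to a back-of-envelope capacity calculation, which can be sketched as follows. The 500-user circuit, the 200 kbps heavy-stream threshold, and the 2 percent versus 15 percent shares come straight from the paragraph above; the function and variable names are purely illustrative.

```python
# Rough sketch of the load that "heavy" (video-class) streams place on a
# shared circuit, using the article's 2006 vs. present-day estimates.

CIRCUIT_KBPS = 10_000      # a 10-megabit shared circuit
USERS = 500                # subscribers sharing that circuit
HEAVY_STREAM_KBPS = 200    # bandwidth of a heavy (video-class) stream

def heavy_load_kbps(users, heavy_share):
    """Bandwidth consumed by heavy streams alone, in kbps."""
    return users * heavy_share * HEAVY_STREAM_KBPS

load_2006 = heavy_load_kbps(USERS, 0.02)   # ~2% of users in 2006
load_now = heavy_load_kbps(USERS, 0.15)    # 10-15% today (upper end)

print(f"2006 heavy-stream load: {load_2006:.0f} kbps "
      f"({100 * load_2006 / CIRCUIT_KBPS:.0f}% of circuit)")
print(f"Current heavy-stream load: {load_now:.0f} kbps "
      f"({100 * load_now / CIRCUIT_KBPS:.0f}% of circuit)")
```

Heavy streams alone consumed about 2 Mbps (a fifth of the circuit) under the 2006 profile; at a 15 percent share they demand 15 Mbps, more than the entire circuit, before counting any light users at all. That is the squeeze providers are facing.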

The ramifications of YouTube and its impact on bandwidth demand are putting the squeeze on providers: like it or not, they have no choice but to implement some sort of quota system on bandwidth. Providers invested in networks of a certain size and capacity based on the older usage model and smaller increases over time, not a 700 percent jump in two years.  Some providers did build out higher capacities in the hope of reaping returns by supplying their own video content, but, as the title of this piece suggests, carrying other people’s video content without sharing the revenue was never part of the plan.

Was this lack of capacity a mistake, or an evil, greed-driven conspiracy? No, it was simply all the providers could afford at the time. Video has always been out there, but several years ago there just wasn’t original content compelling enough to make it worth watching from a public content site. I am not predicting an Armageddon caused by overburdened Internet access; however, in the next few years you will see things get ugly, with finger-pointing and most likely Congress getting involved, obviously to saber-rattle and score brownie points with their constituents.

With all that said, we will do our best to stay net neutral and help everybody sort it out without playing sides.

See our recent article on net neutrality for more details.
