Internet Regulation, what is the world coming to?


A friend of mine just forwarded an article titled “How Net Neutrality Rules Could Undermine the Open Internet.”

Basically, Net Neutrality advocates are now worried that bringing in the FCC to help enforce neutrality will set a legal precedent allowing wide-reaching control over other aspects of the Internet, such as content controls extending into gray areas.

Let’s look at the history of the FCC for precedents.

The FCC came into existence to manage and enforce the wireless spectrum, essentially so you did not get 1,000 radio/TV stations blasting signals over each other in every city. A very necessary and valid government service. Without it, there would be utter anarchy in the airwaves. Imagine roads without traffic signals, or airports without control towers.

At some point, their control over frequencies expanded into content and accessibility mandates. How did this come about? Simply put, it is the normal progression of government asserting control over a resource. It is what it is, neither good nor bad, just a reflection of a society that looks to government to make things “right”. And like an escaped non-native species in the Hawaiian Islands, it tends to take as much real estate as the ecosystem will allow.

What I do know with certainty is that the FCC, once in the door regulating anything on the Internet, will continue to grow in order to make things “right” and “fair” in our browsing experience.

At best, we can hope the inevitable progression of FCC control gets thwarted at every turn, allowing us a few more years of the good old Internet as we know it. I’ll take the current Internet, flaws and all, while I can.

For more information on non-native species invading Hawaii’s ecosystem, check out this blog from the Kohala Watershed Partnership.

For an overview of Net Neutrality, check out this Net Neutrality for Dummies article explaining the act’s possible effects on the everyday Internet user.

For a discussion on the possible lawlessness of the FCC’s control over the internet, read this blog entitled “Is the FCC Lawless?”.

A Brief History of Peer to Peer File Sharing and the Attempts to Block It


By Art Reisman

The following history is based on my notes and observations, both as a user of peer-to-peer and as a network engineer tasked with cleaning it up.

Round One, Napster, Centralized Server, Circa 2002

Napster was a centralized service; unlike the peer-to-peer behemoths of today, there was never any question of where the copyrighted material was being stored and pirated from. Even though Napster did not condone pirated music and movies on its site, the courts decided that by allowing copyrighted material to exist on its servers, it was in violation of copyright law. Napster’s days of free love were soon over.

From a historical perspective, the importance of the decision to force the shutdown of Napster was that it gave rise to a whole new breed of p2p applications. We detailed this phenomenon in our 2008 article.

Round Two, Mega-Upload Shutdown, Centralized Server, 2012

We again saw a doubling down on p2p client sites (they expanded) when Mega-Upload, a centralized sharing site, was shut down in January 2012.

“On the legal side, the recent widely publicized MegaUpload takedown refocused attention on less centralized forms of file sharing (i.e. P2P). Similarly, improvements in P2P technology coupled with a growth in file sharing file size from content like Blue-Ray video also lead many users to revisit P2P.”

Read the full article from deepfield.net

The shutdown of Mega-Upload had a personal effect on me, as I had used it to distribute a 30-minute account from a 92-year-old WWII vet in which he recalled, in oral detail, his experience of surviving a German prison camp.

Blocking by Signature, a.k.a. Layer 7 Shaping, a.k.a. Deep Packet Inspection. Late 1990s to Present

Initially the shining star in the fight against illegal content on your network, this technology is expensive and fails miserably in the face of newer encrypted p2p applications. Keeping up with the ever-changing application signatures adds to the cost, and yet it is still often the first line of defense attempted by ISPs.

We covered this topic in detail in our recent article, Layer 7 Shaping Dying With SSL.

Blocking by Website

Blocking the source sites where users download their p2p clients is still possible. We see this method applied mostly at private secondary schools, where content blocking is an accepted practice. It does not help with computers and devices that already have p2p clients installed: once loaded, p2p files can come from anywhere, and there is no centralized site to block.

Blocking Uninitiated Requests. Circa Mid-2000s

The idea behind this method is to prevent your network from serving up any content whatsoever! Sounds a bit harsh, but the average Internet consumer rarely, if ever, hosts anything intended for public consumption. Yes, at one time, during the early stages of the Internet, my geek friends would set up home pages similar to what everybody exposes on Facebook today. Now, with the advent of hosting sites, there is just no reason for a user to host content locally, and thus no need to allow access from the outside. Most firewalls have a setting to disallow uninitiated requests into your network (obviously with an exemption for your publicly facing servers).

We actually have an advanced version of this feature in our NetGladiator security device. We watch each IP address on your internal network and take note of outgoing requests; nobody comes in unless they were invited. For example, if we see a user on the network make a request to a Yahoo server, we expect a response to come back from a Yahoo server. However, if we see a Yahoo server contact a user on your network without a pending request, we block that incoming request. In the world of p2p, this should prevent an outside client from requesting and receiving a copyrighted file hosted on your network; after all, no p2p client is going to randomly send out invites to outside servers. Or would it?
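
Conceptually, this is simple connection tracking. Here is a minimal Python sketch of the idea, with illustrative names and addresses; it is not actual NetGladiator code:

    class ConnectionTracker:
        def __init__(self):
            # (internal_ip, external_ip) pairs for which we saw an outbound request
            self.invited = set()

        def outbound(self, internal_ip, external_ip):
            # An internal host initiated contact; remember the pairing.
            self.invited.add((internal_ip, external_ip))

        def inbound_allowed(self, external_ip, internal_ip):
            # Let a packet in only if the internal host invited that external host.
            return (internal_ip, external_ip) in self.invited

    fw = ConnectionTracker()
    fw.outbound("10.0.0.5", "98.137.11.163")                # user contacts a Yahoo server
    print(fw.inbound_allowed("98.137.11.163", "10.0.0.5"))  # True: the reply was invited
    print(fw.inbound_allowed("203.0.113.9", "10.0.0.5"))    # False: uninvited, so blocked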

I spent a few hours researching this subject, and here is what I found (this may need further citations). It turns out that p2p distribution is a bit more sophisticated than that and has ways to get around the block-uninitiated-requests firewall technique.

P2P networks such as Pirate Bay use a directory service of super nodes to keep track of what content peers have and where to find it. When you load up your p2p client for the first time, it just needs to find one super node to get connected; from there it can start searching for available files.

Note: You would think that if these super nodes were aiding and abetting illegal content, the RIAA could just shut them down like they did Napster. There are two issues with this assumption:

1) The super nodes do not necessarily host content, hence they are not violating any copyright laws. They simply coordinate the network, in much the same way the DNS service keeps track of domain names and where to find servers.
2) The super nodes are not hosted by Pirate Bay; they are basically commandeered from the network of users, who unwittingly agree to perform this directory service when clicking the license agreement that nobody ever reads.

In my research, I have talked to network administrators who claim that despite blocking uninitiated outside requests on their firewalls, they still get RIAA notices. How can this be?

There are only two ways this can happen.

1) The RIAA is taking the liberty of simply accusing a network of hosting illegal content based on the directory listings of a super node. In other words, if they find a directory on a super node pointing to copyrighted files on your network, that might be information enough to accuse you.

2) More likely, and much more complex, is that the super nodes are brokering the transaction as a condition of being connected. Basically, this means that when a p2p client within your network contacts a super node for information, the super node directs the client to send data to a third-party client on another network. The transfer of information from inside your network thus looks to the firewall as if it was initiated from within. You may have to think about this, but it makes sense; the sketch below walks through it.
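
Restating that hand-off as a commented sketch makes it easier to follow. The exchange below is hypothetical; real p2p protocols differ in detail:

    invited = set()  # (internal, external) pairs the firewall will let back in

    # Step 1: The inside client contacts a supernode. This is outbound, so it
    # is allowed, and the supernode's reply is now "invited".
    invited.add(("10.0.0.5", "supernode.example.net"))

    # Step 2: The supernode's (allowed) reply instructs the client, as a
    # condition of staying connected, to PUSH a file to a third-party peer.

    # Step 3: The client opens a NEW outbound connection to that peer and sends.
    invited.add(("10.0.0.5", "peer.other-network.example"))

    # Every packet of the transfer rides a connection the inside host itself
    # initiated, so the "block uninitiated requests" filter never fires.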

Behavior-based thwarting of p2p. Circa 2004 – NetEqualizer

Behavior-based shaping relies on spotting the unique footprint of a client sending and receiving p2p traffic. From our experience, these clients just do not know how to lay low and stay under the radar. It’s like a criminal smuggling drugs while doing 100 MPH on the highway; they just can’t help themselves. Part of the p2p methodology is to find as many sources of a file as possible, and then download from all sources simultaneously. Combine this behavior with the fact that most p2p consumers are trying to build up a library of content, and thus initiating many file requests, and you get a behavior footprint that can easily be spotted. By spotting this behavior and making life miserable for these users, you can achieve self-compliance on your network.
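
As a rough illustration, this footprint can be spotted with nothing more than a count of distinct peers per host. A minimal sketch with an invented threshold follows; it captures the general idea only, not the NetEqualizer algorithm:

    from collections import defaultdict

    CONCURRENT_PEER_LIMIT = 40  # invented threshold; ordinary browsing rarely needs this

    def suspect_p2p(open_flows):
        """open_flows: iterable of (internal_host, remote_ip) pairs for live flows."""
        peers = defaultdict(set)
        for host, remote in open_flows:
            peers[host].add(remote)
        # Hosts talking to an unusually large number of distinct remotes at once
        # match the "download from all sources simultaneously" footprint.
        return [h for h, remotes in peers.items() if len(remotes) > CONCURRENT_PEER_LIMIT]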

Read a smarter way to block p2p traffic.

Blocking the RIAA probing servers

If you know where the RIAA is probing from, you can deny all traffic to and from their probes, preventing them from scanning the files on your network and sending the ensuing nasty letters to desist.
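
In firewall terms this is just a blocklist. A minimal sketch with placeholder addresses (we make no claim about where any real probes originate):

    PROBE_BLOCKLIST = {"198.51.100.7", "203.0.113.44"}  # placeholder addresses

    def drop_packet(src_ip, dst_ip):
        # Drop anything to or from a suspected probe host, so the probe never
        # completes and never "sees" the files on your network.
        return src_ip in PROBE_BLOCKLIST or dst_ip in PROBE_BLOCKLIST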

What Does Net Privacy Have to Do with Bandwidth Shaping?


I definitely understand the need for privacy. Obviously, if I were doing something nefarious, I wouldn’t want it known, but that’s not my reason. Day in and day out, measures are taken to maintain my privacy in more ways than I probably even realize. You’re likely the same way.

For example, to avoid unwanted telephone and mail solicitations, you don’t advertise your phone numbers or give out your address. When you buy something with your credit card, you usually don’t think twice about your card number being blocked out on the receipt. If you go to the pharmacist, you take it for granted that the next person in line has to be a certain distance behind so they can’t hear what prescription you’re picking up. The list goes on and on. For me personally, I’m sure there are dozens, if not hundreds, of good examples why I appreciate privacy in my life. This is true in my daily routines as well as in my experiences online.

The debate over Internet privacy has been raging for years. However, the Internet still remains a hotbed for criminal activity and misuse of personal information. Email addresses are valued commodities sold to spammers. Search companies have dedicated countless bytes of storage to recording every search term and the IP address it came from. Websites place tracking cookies on your system so they can learn more about your Web travels, habits, likes, dislikes, etc. Forensically, you can tell a lot about a person from their online activities. To be honest, it’s a little creepy.

Maybe you think this is much ado about nothing. Why should you care? Well, you may recall that less than four years ago, AOL accidentally released around 20 million search keywords from over 650,000 users. Now those 650,000 users and their searches will exist forever in cyberspace. Could it happen again? Of course. Why wouldn’t it, when all it takes is one loaded laptop walking out the door?

Internet privacy is an important topic, and as a result, technology is becoming more and more available to help people protect information they want to keep confidential. And that’s a good thing. But what does this have to do with bandwidth management? In short, a lot (no pun intended)!

Many bandwidth management products (from companies like Blue Coat, Allot, and Exinda, for example) intentionally work at the application level. These products are commonly referred to as Layer 7 or Deep Packet Inspection tools. Their mission is to allocate bandwidth specifically by what you’re doing on the Internet. They want to determine how much bandwidth you’re allowed for YouTube, Netflix, Internet games, Facebook, eBay, Amazon, etc. They need to know what you’re doing so they can do their job.
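
To see why these tools must read your traffic, consider a stripped-down sketch of signature-based classification. The signatures and rates below are invented for illustration; commercial products maintain large, constantly updated signature databases:

    SIGNATURES = {
        b"BitTorrent protocol": "bittorrent",  # handshake string used by the protocol
        b"GET /videoplayback":  "youtube",     # a video-stream request pattern
    }
    RATE_LIMITS_KBPS = {"bittorrent": 100, "youtube": 2000, "default": 5000}

    def classify(payload):
        for signature, app in SIGNATURES.items():
            if signature in payload:
                return app
        return "default"

    def allowed_rate_kbps(payload):
        return RATE_LIMITS_KBPS[classify(payload)]

    # The catch this article keeps returning to: if the payload is encrypted,
    # classify() sees only ciphertext, and everything lands in "default".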

In terms of this article, whether you’re philosophically adamant about net privacy (like one of the inventors of the Internet) or couldn’t care less is really not important. The question is: what happens to an application-managed approach when people take additional steps to protect their own privacy?

For legitimate reasons, more and more people will be hiding their IPs, encrypting, tunneling, or otherwise disguising their activities and taking privacy into their own hands. As privacy technology becomes more affordable and simple, it will become more prevalent. This is a mega-trend, and it will create problems for those management tools that use this kind of information to enforce policies.

However, alternatives to these application-level products do exist, such as “fairness-based” bandwidth management. Fairness-based bandwidth management, like the NetEqualizer, is the only 100% neutral solution, and it ultimately provides a more privacy-friendly approach for Internet users and a more effective solution for administrators when personal privacy protection technology is in place. Fairness is the idea of managing bandwidth by how much you can use, not by what you’re doing. When you manage bandwidth by fairness instead of activity, not only are you supporting a neutral, private Internet, but you’re also able to address the critical tasks of bandwidth allocation, control, and quality of service.
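
For the technically curious, here is a toy sketch of the fairness idea: the shaper sees only a byte-rate counter per user, never a payload, and it acts only when the link is congested. The numbers are invented, and this is not the actual NetEqualizer code:

    LINK_CAPACITY_KBPS = 10000
    CONGESTION_THRESHOLD = 0.85 * LINK_CAPACITY_KBPS

    def fairness_pass(usage_kbps):
        """usage_kbps: current rate per user. Returns a cap per user (None = uncapped)."""
        if sum(usage_kbps.values()) < CONGESTION_THRESHOLD:
            return {user: None for user in usage_kbps}  # no congestion: leave everyone alone
        fair_share = LINK_CAPACITY_KBPS / len(usage_kbps)
        # Only users above their fair share get throttled back toward it; small
        # consumers are untouched, and nothing depends on WHAT anyone is sending.
        return {user: (fair_share if rate > fair_share else None)
                for user, rate in usage_kbps.items()}

    print(fairness_pass({"alice": 300, "bob": 8500, "carol": 900}))
    # -> {'alice': None, 'bob': 3333.33..., 'carol': None}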

The Dark Side of Net Neutrality


Net neutrality, however idyllic in principle, comes with a price. The following article was written to shed some light on the big money behind the propaganda of net neutrality. It may change your views, but at the very least it will peel back one more layer of the onion that is the issue of net neutrality.

First, an analogy to set the stage:

I live in a neighborhood that equally shares a local community water system among 60 residential members. Nobody is metered. Through a mostly verbal agreement, we all try to keep our usage to a minimum. This requires us to be very water-conscious, especially in the summer months when the main storage tanks need time to recharge overnight.

Several years ago, one property changed hands, and the new owner started raising organic vegetables using a drip irrigation system. The neighborhood precedent had always been that using water for a small lawn and garden area was an accepted practice. However, the new neighbor expanded his garden to three acres and now sells his produce at the local farmers market. Even with drip irrigation, his water consumption is likely well beyond that of the rest of the neighborhood combined.

You can see where I am going with this. Based on this scenario, an objective observer would conclude that this neighbor should pay an additional premium — especially when you consider he is exploiting the community water for commercial gain.

The Internet, much like our neighborhood example, was originally a group of cooperating parties (educational and government institutions) that connected their networks in an effort to easily share information. There was never any intention of charging for access amongst members. As the Internet spread away from government institutions, last-mile carriers such as cable and phone companies invested heavily in infrastructure. Their business plans assumed that all parties would continue to use the Internet for lightweight content such as Web pages, e-mails, and the occasional larger document or picture.

In the latter part of 2007, a few companies with substantial data content models decided to take advantage of the low delivery fees by serving movies and music over the Internet. Prior to their newfound Internet delivery model, content providers had to cover the distribution costs for the physical delivery of records, video cassettes, and eventually discs.

As of 2010, Internet delivery costs associated with the distribution of media had plummeted to near zero. It seems that consumers have pre-paid their delivery cost when they paid their monthly Internet bill. Everybody should be happy, right?

The problem is, as per our analogy with the community water system, we have a few commercial operators jamming the pipes with content, and jammed pipes have a cost. Upgrading a full Internet pipe at any level requires a major investment, and providers to date are already leveraged and borrowed against their existing infrastructure. Thus, the Internet companies that carry the data need to pass this cost on to somebody else.

As a result of these conflicting interests, we now have a pissing match between carriers and content providers in which the latter are playing the “neutrality card” and the former are lobbying lawmakers to grant them special favors in order to govern ways to limit access.

Therefore, whether it be water, the Internet or grazing on public lands, absolute neutrality can be problematic — especially when money is involved. While the concept of neutrality certainly has the overwhelming support of consumer sentiment, be aware that there are, and always will be, entities exploiting the system.

Related Articles

For more on NetFlix, see Level 3-Netflix Expose their Hidden Agenda.

What Is Deep Packet Inspection and Why the Controversy?


By Art Reisman

Editor’s note: Art Reisman is the CTO of APconnections. APconnections designs and manufactures the popular NetEqualizer bandwidth shaper. APconnections removed all deep packet inspection technology from their NetEqualizer product over 2 years ago.

Article Updated March 2012

As the debate over Deep Packet Inspection continues, network administrators are often faced with a difficult decision: ensure network quality or protect user privacy. However, the legality of the practice is now being called into question, adding a new twist to the mix. Yet, for many Internet users, deep packet inspection continues to be an ambiguous term in need of explanation. In the discussion that follows, deep packet inspection will be explored in the context of the ongoing debate.

Exactly what is deep packet inspection?

All traffic on the Internet travels around in what is called an IP packet. An IP packet is a string of characters moving from computer A to computer B. On the outside of this packet is the address where it is being sent. On the inside of the packet is the data that is being transmitted.

The string of characters on the inside of the packet can be conceptually thought of as the “payload,” much like the freight inside of a railroad car. These two elements, the address and the payload, comprise the complete IP packet.

When you send an e-mail across the Internet, all your text is bundled into packets and sent on to its destination. A deep packet inspection device literally has the ability to look inside those packets and read your e-mail (or whatever the content might be).
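
A toy example shows how little separates routing from reading. Parse the IPv4 header, and everything after it is the payload; a router needs only the addresses, while a DPI device also reads the rest:

    import socket

    def split_ipv4_packet(packet):
        header_len = (packet[0] & 0x0F) * 4    # IHL field, counted in 32-bit words
        src = socket.inet_ntoa(packet[12:16])  # the "envelope": source address
        dst = socket.inet_ntoa(packet[16:20])  # ... and destination address
        payload = packet[header_len:]          # the "freight": your actual data
        return src, dst, payload

    # A router only ever needs src and dst to do its job; a DPI box is simply
    # a device that also reads payload.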

Products sold that use DPI are essentially specialized snooping devices that examine the content (the payload inside) of Internet packets. Other terms sometimes used to describe techniques that examine Internet data are packet shapers, layer-7 traffic shaping, etc.

How is deep packet inspection related to net neutrality?

Net neutrality is based on the belief that nobody has the right to filter content on the Internet. Deep packet inspection is a method used for filtering. Thus, there is a conflict between the two approaches. The net neutrality debate continues to rage in its own right.

Why do some Internet providers use deep packet inspection devices?

There are several reasons:

1) Targeted advertising — If a provider knows what you are reading, they can display targeted advertising on the pages they control, such as your login screen or e-mail account.

2) Reducing “unwanted” traffic — Many providers are getting overwhelmed by types of traffic that they deem less desirable, such as Bittorrent and other forms of peer-to-peer file sharing. Bittorrent traffic can overwhelm a network with volume. By detecting and redirecting Bittorrent traffic, or slowing it down, a provider can alleviate congestion.

3) Blocking offensive material — Many companies and institutions that perform content filtering look inside packets to find, and possibly block, offensive material or websites.

4) Government spying — In the case of Iran (and to some extent China), DPI is used to keep tabs on the local population.

When is it appropriate to use deep packet inspection?

1) Full disclosure — Private companies/institutions/ISPs that notify employees that their Internet use is not considered private have the right to snoop, although I would argue that creating an atmosphere of mistrust is not the mark of a healthy company.

2) Law enforcement — Law enforcement agencies with a warrant issued by a judge would be the other legitimate use.

3) Intrusion detection and prevention — It is one thing to be an ISP eavesdropping on a public conversation; it is entirely another paradigm if you are a private business examining the behavior of somebody coming in your front door. In a private home, it is within your rights to look through your peephole and refuse to let shady characters in. Likewise, in a private business it is a good idea to use deep packet inspection to block unwanted intruders from your network. Blocking the bad guys before they break in and damage your network is perfectly acceptable.

4) Spam filtering — Most consumers are very happy to have their ISP or email provider remove spam. I would categorize this type of DPI as implied disclosure. For example, in Gmail you do have the option to turn spam filtering off, and although most customers may not realize that Google is reading their mail (humans don’t read it, but computer scanners do), its motives are understood. What consumers may not realize is that their email provider is also reading everything they do in order to serve targeted advertising.

Does content filtering use deep packet inspection?

For the most part, no. Content filtering is generally done at the URL level. URLs are generally considered public information, as routers need to look them up anyway. We have only encountered content filters at private institutions, which are within their rights to use them.

What about spam filtering? Does that use deep packet inspection?

Yes, many spam filters will look at content, and most people could not live without their spam filter. However, with spam filtering most people have opted in at one point or another, hence it is generally done with permission.

What is all the fuss about?

It seems that consumers are finally becoming aware of what is going on behind the scenes as they surf the Internet, and they don’t like it. What follows are several quotes and excerpts from articles written on the topic of deep packet inspection. They provide an overview not only of how DPI is currently being used, but also the many issues that have been raised with the practice.

For example, consider this excerpt from a recent article on the topic:

Not that we condone other forms of online snooping, but deep packet inspection is the most egregious and aggressive invasion of privacy out there….It crosses the line in a way that is very frightening.

Paul Stephens, director of policy and advocacy for the Privacy Rights Clearinghouse, as quoted in the E-Commerce Times on November 14, 2008. Read the full article here.

Recently, Comcast had their hand slapped for redirecting Bittorrent traffic:

Speaking at the Stanford Law School Center for Internet and Society, FCC Chairman Kevin Martin said he’s considering taking action against the cable operator for violating the agency’s network-neutrality principles. Seems Martin was troubled by Comcast’s dissembling around the BitTorrent issue, not to mention its efforts to pack an FCC hearing on Net neutrality with its own employees.

— Digital Daily, March 10, 2008. Read the full article here.

Later in 2008, the FCC came down hard on Comcast.

In a landmark ruling, the Federal Communications Commission has ordered Comcast to stop its controversial practice of throttling file sharing traffic.

By a 3-2 vote, the commission on Friday concluded that Comcast monitored the content of its customers’ internet connections and selectively blocked peer-to-peer connections.

Wired.com, August 1, 2008. Read the full article here.

To top everything off, some legal experts are warning companies practicing deep packet inspection that they may be committing a felony.

University of Colorado law professor Paul Ohm, a former federal computer crimes prosecutor, argues that ISPs such as Comcast, AT&T and Charter Communications that are or are contemplating ways to throttle bandwidth, police for copyright violations and serve targeted ads by examining their customers’ internet packets are putting themselves in criminal and civil jeopardy.

Wired.com, May 22, 2008. Read the full article here.

However, it looks like things are going the other way in the U.K. as Britain’s Virgin Media has announced they are dumping net neutrality in favor of targeting bittorrent.

The UK’s second largest ISP, Virgin Media, will next year introduce network monitoring technology to specifically target and restrict BitTorrent traffic, its boss has told The Register.

The Register, December 16, 2008. Read the full article here.

Canadian ISPs confess en masse to deep packet inspection in January 2009.

With the amount of attention being paid to Comcast recently, a lot of people around the world have begun to look at their ISPs and wonder exactly what happens to their traffic once it leaves. This is certainly true for Canada, where several Canadian ISPs have come under the scrutiny of the CRTC, the regulatory agency responsible for Canada. After investigation, it was determined that all large ISPs in Canada filter P2P traffic in some fashion.

Tech Spot, January 21, 2009. Read the full article here.

In April 2009, U.S. lawmakers announced plans to introduce legislation that would limit how ISPs could track users. Online privacy advocates spoke out in support of such legislation.

In our view, deep packet inspection is really no different than postal employees opening envelopes and reading letters inside. … Consumers simply do not expect to be snooped on by their ISPs or other intermediaries in the middle of the network, so DPI really defies legitimate expectations of privacy that consumers have.

Leslie Harris, president and CEO of the Center for Democracy and Technology, as quoted on PCWorld.com on April 23, 2009. Read the full article here.

The controversy continues in the U.S. as AT&T is accused of traffic shaping, lying and blocking sections of the Internet.

7/26/2009 could mark a turning point in the life of AT&T, when the future looks back on history, as the day that the shady practices of an ethically challenged company finally caught up with them: traffic filtering, site banning, and lying about service packages can only continue for so long before the FCC, along with the bill-paying public, takes a stand.

Kyle Brady, July 27, 2009. Read the full article here.

[February 2011 Update] The Egyptian government uses DPI to filter elements of its Internet traffic, and this act in itself has become the news story. In this news piece, Al Jazeera takes the opportunity to put out an unflattering piece on Narus, the company that makes the DPI technology and sold it to the Egyptians.

While the debate over deep packet inspection will likely rage on for years to come, APconnections made the decision to fully abandon the practice over two years ago, having since proved the viability of alternative approaches to network optimization. Network quality and user privacy are no longer mutually exclusive goals.

Created by APconnections, the NetEqualizer is a plug-and-play bandwidth control and WAN/Internet optimization appliance that is flexible and scalable. When the network is congested, NetEqualizer’s unique “behavior shaping” technology dynamically and automatically gives priority to latency sensitive applications, such as VoIP and email. Click here for a full price list.

NetEqualizer Brand Becoming an Eponym for Fairness and Net Neutrality Techniques


An eponym is a general term used to describe from what or whom something derived its name. Therefore, a proprietary eponym could be considered a brand name, product or service mark which has fallen into general use.

Examples of common brand eponyms include Xerox, Google, and Band-Aid. All of these brands have become synonymous with the general class of product, regardless of the actual brand.

Over the past 7 years we have spent much of our time explaining the NetEqualizer methods to network administrators around the country, and now there is mounting evidence that the NetEqualizer brand is taking on a broader societal connotation. NetEqualizer is in the early stages of becoming an eponym for the class of bandwidth shapers that balance network loads and ensure fairness and neutrality. As evidence, we cite the following excerpts taken from various blogs and publications around the world.

From Dennis OReilly <Dennis.OReilly@ubc.ca> posted on ResNet Forums

These days the only way to classify encrypted streams is through behavioral analysis.  ….  Thus, approaches like the NetEqualizer or script-based ‘penalty box’ approaches are better.

WISP tutorial by Butch Evans

About 2 months ago, I began experimenting with an approach to QOS that mimics much of the functionality of the NetEqualizer (http://www.netequalizer.com) product line.

TMC net

Comcast Announces Traffic Shaping Techniques like APconnections’ NetEqualizer…

From Technewsworld

It actually sounds a lot what NetEqualizer (www.netequalizer.com) does and most people are OK with it…..

From Network World

NetEqualizer looks at every connection on the network and compare it to the overall trunk size to determine how to eliminate congestion on the links

Star Os Forum

If you’d really like to have your own netequalizer-like system then my advice…..

Voip-News

Has anyone else tried Netequalizer or something like it to help with VoIP QoS? It’s worked well so far for us and seems to be an effective alternative for networks with several users…..

NetEqualizer YouTube Caching a Win for Net Neutrality


Over the past few years, much of the controversy over net neutrality has ultimately stemmed from the longstanding rift between carriers and content providers. Commercial content providers such as NetFlix have entire business models that rely on relatively unrestricted bandwidth access for their customers, which has led to an enormous increase in the amount of bandwidth being used. In response to these extreme bandwidth loads and associated costs, ISPs have tried all types of schemes to limit and restrict total usage, from deep packet inspection to tiered pricing.

While in many cases effective, most of these efforts have been mired in controversy with respect to net neutrality. However, caching is the one exception.

Up to this point, caching has proven to be the magic bullet that can benefit both ISPs and consumers (faster access to videos, etc.) while respecting net neutrality. To illustrate this, we’ll run caching through the gauntlet of questions that have been raised about these other solutions in regard to a violation of net neutrality. In the end, it comes up clean.

1. Does caching involve deep inspection of user traffic without their knowledge (like layer-7 shaping and DPI)?

No.

2. Does caching perform any form of preferential treatment based on content?

No.

3. Does caching perform any form of preferential treatment based on fees?

No.

Yet, despite avoiding these pitfalls, caching has still proven to be extremely effective, allowing Internet providers to manage increasing customer demands without infringing upon customers’ rights or quality of service. It was these factors that led APconnections to develop our most recent NetEqualizer feature, YouTube caching.
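
As a toy illustration of why caching passes this gauntlet, consider a cache keyed only by the requested URL. This is hypothetical code, not the NetEqualizer implementation:

    cache = {}

    def fetch(url, origin_fetch):
        """origin_fetch: a function that retrieves url from the origin server."""
        if url in cache:
            return cache[url]     # served locally: faster for the user, and the
                                  # upstream Internet pipe is spared entirely
        body = origin_fetch(url)  # cache miss: go to the origin exactly once
        cache[url] = body
        return body

    # No payload inspection, no per-content priority, no per-fee priority: every
    # URL gets identical treatment; popular items simply hit the cache more often.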

For more on this feature, or caching in general, check out our new NetEqualizer YouTube Caching FAQ post.

A Tiered Internet – Penny Wise or Pound Foolish


With the debate over net neutrality raging in the background, Internet suppliers are preparing their strategies to bridge the divide between bandwidth consumption and costs. This topic is coming to a head now largely because of the astonishing growth rate of streaming video from the likes of YouTube, NetFlix, and others.

The issue recently took a new turn and emerged front and center during a webinar in which Allot Communications and Openet presented their new product features, including their approach of integrating policy control and charging for wireless access to certain websites.

On the surface, this may seem like a potential solution to the bandwidth problem. Basic economic theory will tell you that if you increase the cost of a product or service, the demand will eventually decrease. In this case, charging for bandwidth will not only increase revenues, but the demand will ultimately drop until a point of equilibrium is reached. Problem solved, right? Wrong!

While the short-term benefits are obviously appealing for some, this is a slippery slope that will lead to further inequality in Internet access. (You can easily find many articles and blogs regarding Net Neutrality, including those referencing Vinton Cerf and Tim Berners-Lee — two of the founding fathers of the Internet — clearly supporting a free and equal Internet.) Despite these arguments, we believe that Deep Packet Inspection (DPI) equipment makers such as Allot will continue to promote and support a charge system, since it is in their best business interests to do so. After all, a pay-for-access approach requires DPI as the basis for determining what content to charge for.

However, there are better and more cost-effective ways to control bandwidth consumption while protecting the interests of net neutrality. For example, fairness-based bandwidth control intrinsically provides equality and fairness to all users without targeting specific content or websites. With this approach, when the network is busy, small bandwidth consumers are guaranteed access to the Internet, while large bandwidth users are throttled back but not charged or blocked completely. Everyone lives within their means and gets an equal share. If large bandwidth consumers want access to more bandwidth, they can purchase a higher level of service from their provider. But let’s be clear: this is very different from charging for access to a particular website!

Although this content-neutral approach has repeatedly proved successful for NetEqualizer users, we’re now taking an additional step toward mitigating bandwidth congestion while respecting network neutrality: video caching, aimed at the largest growth segment of bandwidth consumption. So, keep an eye out for the YouTube caching feature, which will be available in our new NetEqualizer release early next year.

Google Verizon Net Neutrality Policy: Is It Sincere?


With all the rumors circulating about the larger wireless providers trying to wall off competition or generate extra revenue through preferential treatment of traffic, they had to do something; hence, Google and Verizon crafted a joint statement on Net Neutrality. Making a statement to deny a rumor on such a scale is somewhat akin to admitting the rumor was true. It reminds me of a politician claiming he has no plans to raise taxes.

Yes, I believe that most people who work for Google and Verizon, executives included, believe in an open, neutral Internet. And yet, from experience, when push comes to shove and profits are flat or dropping, the idea of leveraging your assets will be on the table. And what better way to leverage your assets than to restrict competition to your captive audience? Walling off a captive audience to selected content will always be enticing to any service provider looking for low-hanging fruit. Morals can easily be compromised or rationalized in the face of losing your house, and it only takes one overzealous leader to start a provider down the slope.

The checks and balances so far, in this case, are the consumers, who have voiced outright disgust with anybody who dares toy with the idea of preferential treatment of Internet traffic for economic benefit.

For now this concept will have to wait, but it will be revisited, and hopefully consumers will again rise up in disgust. It would be naive to think that today’s statement by Verizon and Google will be binding beyond the political moment.

Net Neutrality Enforcement and Debate: Will It Ever Be Settled?


By Art Reisman

Editor’s note: Art Reisman is the CTO of APconnections. APconnections designs and manufactures the popular NetEqualizer bandwidth shaper. APconnections removed all Deep Packet Inspection technology from their NetEqualizer product over 2 years ago.

As the debate over net neutrality continues, we often forget what an ISP actually is and why they exist.
ISPs in this country are for-profit private companies made up of stockholders and investors who took on risk (without government backing) to build networks in the hopes of making a profit. To make a profit, they must balance users’ expectations for performance against the costs of implementing a network.

The reason bandwidth control is used in the first place is the standard capacity problem: nobody can afford the infrastructure investment to build a network that meets peak demand at all times. Would you build a house with 10 bedrooms if you were only expecting one or two kids sometime in the future? ISPs build networks to handle an average load, and when peak loads come along, they must do some mitigation. You can argue they should have built their networks with more foresight until you are green in the face, but the fact is demand for bandwidth will always outstrip supply.

So, where did the net neutrality debate get its start?
Unfortunately, in many Internet providers’ first attempt to remedy the overload issue on their networks, the layer-7 techniques they used opened a Pandora’s box of controversy that may never be settled.

When the subject of net neutrality started heating up around 2007 and 2008, the complaints from consumers revolved around ISP practices of looking inside customers’ transmitted data and blocking or redirecting traffic based on content. There were all sorts of rationalizations for this practice, and I’ll be the first to admit that it was not done with intended malice. However, the methodology was abhorrent.

I likened this practice to the phone company listening in on your phone calls and deciding which calls to drop to keep their lines clear. Or, if you want to take it a step farther, the postal service deciding to toss your junk mail based on their own private criteria. Legally, I see no difference between looking inside mail and looking inside Internet traffic. It all seems to cross a line. When referring to net neutrality, the bloggers of this era were originally concerned with this sort of spying and playing God with what type of data could be transmitted.

To remedy this situation, Comcast and others adopted methods that regulated Internet usage based on patterns of usage and not content. At the time, we were happy to applaud them and claim that the problem of spying on data had been averted. I pretty much turned my attention away from the debate at that point, but I recently started looking back at it and, wow, what a difference a couple of years make.

So, where are we headed?
I am not sure what his sources are, but Rush Limbaugh claims that net neutrality is going to become a new Fairness Doctrine. To summarize, the FCC or some other government body would start using its authority to ensure equal access to content from search engine companies, for example making sure that minority points of view on subjects got top billing in search results. This is a bit scary, although perhaps a bit alarmist, but it would not surprise me since, once in government control, anything is possible. Yes, I realize conservative talk radio hosts like to elicit emotional reactions, but usually there is some truth to back up their claims.

Other intelligent points of view:

The CRTC (the Canadian FCC) seems to have a head on its shoulders: it has stated that ISPs must disclose their practices, but it is not attempting to regulate them through some form of overreaching doctrine. Although I am not in favor of government institutions, if they must exist, then the CRTC’s stance seems like a sane and appropriate approach to regulating ISPs.

Freedom to Tinker

What Is Deep Packet Inspection and Why All the Controversy?

Behind the Scenes on the latest Comcast Ruling on Net Neutrality


Yesterday the FCC ruled in favor of Comcast regarding its right to manipulate consumer traffic. As usual, the news coverage was a bit oversimplified and generic. Below we present a breakdown of the players involved and our educated opinion as to their motivations.

1) The Large Service Providers for Internet Service: Comcast, Time Warner, Qwest

From the perspective of the Large Service Providers, these companies all want to get a return on their investment, charging the most money the market will tolerate. They will also try to increase market share by consolidating provider choices in local markets. Since they are directly visible to the public, they will also try to serve the public’s interest; for without popular support, they will get regulated into oblivion. Case in point: the original Comcast problems stemmed from angry consumers learning their p2p downloads were being redirected and/or blocked.

Any and all government regulation will be opposed at every turn, as it is generally not good for private business. In the face of a strong headwind, don’t be surprised if the Large Service Providers try to reach a compromise quickly to alleviate any uncertainty. Uncertainty can be more costly than regulation.

To be fair, Large Service Providers are staffed top to bottom with honest, hard-working people, but their decision-making as an entity will ultimately be based on profit. To be the most profitable, they will want to prevent third-party Traditional Content Providers from flooding their networks with videos. That was the original reason why Comcast thwarted bittorrent traffic. All of the Large Service Providers are currently content providers, or are plotting to be, and hence they have two motives to restrict unwanted traffic. Motive one is to keep traffic loads within their network’s capacity. Motive two is to thwart competing content providers, thus making their own content more attractive. For example, whose movie service are you going to subscribe to? A generic cloud provider such as Netflix, whose movies run choppy, or your local provider, with better quality by design?

2) The Traditional Content Providers:  Google, YouTube, Netflix etc.

They have a vested interest in expanding their reach by providing expanded video content. Google, with nowhere to go for new revenue in the search engine and advertising business, will be attempting an end-run around the Large Service Providers to take market share. The only thing standing in their way is the shortcomings of the delivery mechanism. They have even gone so far as to build out an extensive, heavily subsidized fiber test network of their own. Much of the hubbub about Net Neutrality is based on a market play to force Large Service Providers to shoulder the Traditional Content Providers’ delivery costs. An analogy from the bird world would be the brown-headed cowbird: the mother lays her eggs in another bird’s nest and lets her chicks be raised by an unknowing other species. Without their own direct-to-consumer delivery mechanism, the Traditional Content Providers must keep pounding at the FCC for rulings in their favor. Part of the strategy is to rile consumers against the Large Service Providers with the Net Neutrality cry.

3) The FCC

The FCC is a government organization trying to take its existing powers, which were granted for the airwaves, and extend them to the Internet. As with any regulatory body, things start out well-intentioned (protection of consumers, etc.), but quickly the body becomes self-absorbed with its mission. The original reason for the FCC was that the public airwaves for television and radio have limited frequencies for broadcast. You can’t make a bigger pipe than what the frequencies allow, and hence it made sense to have a regulatory body oversee this vital resource. In the early stages of commercial radio, there was a real issue of competing entities broadcasting over each other in an arms race for the most powerful signal. From those beginnings, the FCC has forever expanded its mission; the government deciding what words can be uttered in primetime, for example, is an extension of this power.

Now, with the Internet, the FCC’s goal will be to regulate whatever it can, slowly creating rules for the “good of the people”. Will these rules be for the better? Most likely the net effect is no; left alone, the Internet was fine. But agencies will be agencies.

4) The Administration and current Congress

The current Administration has touted its support of Net Neutrality, but has perhaps been so overburdened with the battle over health care and other pressing matters that no regulation has been passed. In the aftermath of the FCC getting slapped down in court and having its current powers limited, I would not be surprised to see a round of legislation on this issue to regulate Large Service Providers in the near future. The Administration will paint it as consumer protection against big greedy companies that need to be reined in, as we have seen with banks, insurance companies, etc. I hope that we do not end up with an Internet Czar, but some regulation is inevitable, if nothing else as a revenue stream to tap into.

5) The Public

The Public will be the dupes in all of this: ignorant voting blocs lobbied with various scare tactics. The demographics in play will be much different from those of the health care lobby. People concerned for or against Internet regulation tend to be in income brackets with higher education and employment rates than the typical entitlement lobbies that support regulation. It is certainly not going to be the AARP or a union lobbyist leading the charge to regulate the Internet; hence legislation may be a bit delayed.

6) Al Gore

Not sure if he has a dog in this fight; we just threw him in here for fun.

7) NetEqualizer

Honestly, bandwidth control will always be needed, as long as there is more demand for bandwidth than there is bandwidth available.  We will not be lobbying for or against Net Neutrality.

8) The Courts

This is an area where I am a bit weak in understanding how a court will follow legal precedent. However, it seems to me that almost any court can rule from the bench, finding the precedent it wants and ignoring others if it so chooses. Ultimately, Congress can pass new laws to regulate just about anything with impunity; there is no constitutional protection regarding Internet access. Most likely the FCC will be the agency carrying out enforcement once the laws are in place.

NetEqualizer provides a Net Neutrality solution for bandwidth control


By Eli Riles, NetEqualizer VP of Sales

This morning I read an article on how some start-up companies are being hurt while awaiting the FCC’s decision on Net Neutrality.

Late in the day, a customer called and exclaimed, “Wow, now with the FCC coming down hard on technologies that jeopardize net neutrality, your business must be booming, since you offer an excellent, viable alternative.” And yet, in the face of this controversy, several of our competitors continue to sell deep packet inspection devices to customers.

Public operators and businesses that continue to purchase such technology are likely uninformed about the growing firestorm of opposition to Deep Packet Inspection techniques. The allure of being able to identify and control Internet traffic by type is very natural, and customers often demand it. Suppliers who sell DPI devices are just doing what their customers have asked. As with all technologies, once the train leaves the station it is hard to turn around. What is different in the case of DPI is that suppliers and ISPs had their way with an ignorant public starting in the late ’90s. Nobody really gave much thought as to how DPI might become the villain in the controversy over Net Neutrality. It was just assumed that nobody would notice their Internet traffic being watched and redirected by routing devices. With behemoths such as Google having a vested interest in keeping traffic flowing without interference on the Internet, commercial deep packet inspection solutions are slowly falling out of favor in the ISP sector. The bigger question for the players betting the house on DPI is: will it fall out of favor in other business verticals?

The NetEqualizer decision to do away with DPI two years ago is looking quite brilliant now, although at the time it was clearly a risk bucking market trends. Today, even in the face of a worldwide recession, our profit and unit sales are up for the first three quarters of 2009.

As we have claimed in previous articles, there is a time and place for deep packet inspection; however, any provider using DPI to manipulate data is looking for a potential dogfight with the FCC.

NetEqualizer has been providing alternative bandwidth control options for ISPs, businesses, and schools of all sizes for 7 years without violating any of the Net Neutrality sacred cows. If you have not heard about us, maybe now is a good time to pick up the phone. We have been on record touting our solution as fair and equitable for quite some time now.

Net Neutrality Bill Won’t End Conflicts Between Users and Providers


This week, Representatives Edward Markey, a Massachusetts Democrat, and Anna Eshoo, a California Democrat, introduced the Internet Freedom Preservation Act, aimed at protecting the rights of Internet users and ultimately net neutrality. Yet, before net neutrality advocates unequivocally praise the bill, it should be noted that it protects the rights of Internet service providers as well. For example, as long as ISPs are candid with their customers in regard to their network optimization practices, the bill does allow for “reasonable network management,” stating:

“Nothing in this section shall be construed to prohibit an Internet access provider from engaging in reasonable network management consistent with the policies and duties of nondiscrimination and openness set forth in this Act. For purposes of subsections (b)(1) and (b)(5), a network management practice is a reasonable practice only if it furthers a critically important interest, is narrowly tailored to further that interest, and is the means of furthering that interest that is the least restrictive, least discriminatory, and least constricting of consumer choice available. In determining whether a network management practice is reasonable, the Commission shall consider, among other factors, the particular network architecture or technology limitations of the provider.”

While this stipulation is extremely important in the protection it provides Internet service providers, it is likely to come into conflict with some Internet users’ ideas of net neutrality. For example, the bill also states that it is ISPs’ “duty to not block, interfere with, discriminate against, impair or degrade the ability of any person to use an Internet access service to access, use, send, post, receive or offer any lawful content, application or service through the Internet.” However, even users of the NetEqualizer, one of the more hands-off approaches to network management, have no choice but to target the behavior of certain heavy customers. One person’s penchant for downloading music — legally or not — can significantly impact the quality of service for everyone else. And increasing bandwidth just to meet the needs of a few users isn’t reasonable either.

It would seem that this would be a perfect case of the reasonable network management allowed under the proposed bill. Yet many net neutrality advocates tend to quickly dismiss any management as an infringement upon users’ rights. The protection of users’ rights will likely get the attention in discussions of these types of bills, but there should be just as much emphasis on the right of providers to reasonably manage their networks, and on what this may mean for the idea of unadulterated net neutrality.

The fact that this bill includes the right to reasonably manage one’s network indicates that some form of management is typically necessary for a network to run at its full potential. The key is finding some middle ground.

Related article, September 22, 2009:

FCC rules in favor of Net Neutrality; the commentary on this blog is great and worth the read.

Do We Need an Internet User Bill of Rights?


The Computers, Freedom and Privacy conference wraps up today in Washington, D.C., with conference participants having paid significant attention to the on-going debates concerning ISPs, Deep Packet Inspection and net neutrality.  Over the past several days, representatives from the various interested parties have made their cases for and against certain measures pertaining to user privacy. As was expected, demands for the protection of user privacy often came into conflict with ISPs’ advertising strategies and their defense of their overall network quality.

At the center of this debate is the issue of transparency and what ISPs are actually telling customers. In many cases, apparent intrusions into user privacy are qualified by what’s stated in the “fine print” of customer contracts. If these contracts notify customers that their Internet activity and personal information may be used for advertising or other purposes, then it can’t really be said that the customer’s privacy has been invaded. But the question is, how many users actually read their contracts, and furthermore, how many people actually understand the fine print? It would be interesting to see what percentage of Internet users could define deep packet inspection. Probably not very many.

This situation is reminiscent of many others involving service contracts, but one particularly timely example comes to mind — credit cards. Last month, the Senate passed a credit card “bill of rights,” through which consumers would be both better protected and better informed. Of the latter, President Obama stated, “you should not have to worry that when you sign up for a credit card, you’re signing away all your rights. You shouldn’t need a magnifying glass or a law degree to read the fine print that sometimes doesn’t even appear to be written in English.”

Ultimately, the same should be true for any service contracts, but especially if private information is at stake, as is the case with the Internet privacy debate. Therefore, while it’s a step in the right direction to include potential user privacy issues in service contracts, it should not be done only with the intention of preventing potential legal backlash, but rather with the customer’s true understanding of the agreement in mind.

Editor’s Note: APconnections and NetEqualizer have long been proponents of both transparency and the protection of user privacy, having devoted several years to developing technology that maintains network quality while respecting the privacy of Internet users.

Obama’s Revival of Net Neutrality Revisits An Issue Hardly Forgotten


Last Friday, President Obama reinvigorated (for many people, at least) the debate over net neutrality during a speech from the White House on cybersecurity. The president made it clear that users’ privacy and net neutrality would not be threatened under the guise of cybersecurity measures. President Obama stated:

“Let me also be clear about what we will not do. Our pursuit of cyber-security will not — I repeat, will not include — monitoring private sector networks or Internet traffic. We will preserve and protect the personal privacy and civil liberties that we cherish as Americans. Indeed, I remain firmly committed to net neutrality so we can keep the Internet as it should be — open and free.”

While this is certainly an important issue on the security front, for many ISPs and network administrators it didn’t take the president’s comments to put user privacy or net neutrality back in the spotlight. In many cases, ISPs and network administrators must constantly walk the fine line between net neutrality, user privacy, and ultimately the well-being of their own networks, something that can be compromised on a number of fronts (security, bandwidth, economics, etc.).

Therefore, despite the president’s ongoing commitment to net neutrality, the issue will continue to be debated and will remain at the forefront of the minds of ISPs, administrators, and many users. Over the past few years, we at NetEqualizer have been working to provide a compromise for these interested parties, ensuring network quality and neutrality while protecting the privacy of users. It will be interesting to see how this debate plays out, and what it will mean for policy, as the philosophy of network neutrality continues to be challenged — both by individuals and by network demands.
