A Brief History of Peer to Peer File Sharing and the Attempts to Block It


By Art Reisman

The following history is based on my notes and observations, both as a user of peer-to-peer and as a network engineer tasked with cleaning it up.

Round One, Napster, Centralized Server, Circa 2002

Napster was a centralized service; unlike the peer-to-peer behemoths of today, there was never any question of where the copyrighted material was being stored and pirated from. Even though Napster did not condone pirated music and movies on its site, the courts decided that by allowing copyrighted material to exist on its servers, it was in violation of copyright law. Napster’s days of free love were soon over.

From a historical perspective, the importance of the decision to force the shutdown of Napster was that it gave rise to a whole new breed of p2p applications. We detailed this phenomenon in our 2008 article.

Round Two, Mega-Upload Shutdown, Centralized Server, 2012

We again saw a doubling down on p2p clients (their use expanded) when Mega-Upload, a centralized sharing site, was shut down in January 2012.

“On the legal side, the recent widely publicized MegaUpload takedown refocused attention on less centralized forms of file sharing (i.e. P2P). Similarly, improvements in P2P technology coupled with a growth in file sharing file size from content like Blue-Ray video also lead many users to revisit P2P.”

Read the full article from deepfield.net

The shutdown of Mega-Upload had a personal effect on me, as I had used it to distribute a 30-minute account from a 92-year-old WWII vet recalling, in oral detail, his experience of surviving a German prison camp.

Blocking by Signature (a.k.a. Layer 7 Shaping, a.k.a. Deep Packet Inspection), Late 1990s to Present

Initially the shining star in the fight against illegal content on your network, this technology can be expensive and can fail miserably in the face of newer encrypted p2p applications. Keeping up with the ever-changing application signatures also gets quite expensive, and yet it is still often the first line of defense attempted by ISPs.

We covered this topic in detail in our recent article, Layer 7 Shaping Dying With SSL.

Blocking by Website

Blocking the source sites where users download their p2p clients is still possible. We see this method applied mostly at private secondary schools, where content blocking is an accepted practice. It does not help with computers and devices that already have p2p clients installed; once a client is loaded, p2p files can come from anywhere, and there is no centralized site to block.
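
As a rough illustration, this kind of filter boils down to a domain blocklist consulted on each web or DNS request. Here is a minimal sketch in Python; the domain names are placeholders, not a real blocklist:

```python
# Minimal sketch of source-site blocking: refuse requests to domains
# known to distribute p2p client software. Domains are placeholders.
BLOCKED_DOMAINS = {"example-torrent-client.com", "example-p2p-portal.net"}

def allow_request(hostname: str) -> bool:
    hostname = hostname.lower().rstrip(".")
    # Block the listed domains and any of their subdomains.
    return not any(hostname == d or hostname.endswith("." + d)
                   for d in BLOCKED_DOMAINS)

print(allow_request("downloads.example-torrent-client.com"))  # False
print(allow_request("example.edu"))                           # True
```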

Blocking Uninitiated Requests, Circa Mid-2000s

The idea behind this method is to prevent your network from serving up any content whatsoever! It sounds a bit harsh, but the average Internet consumer rarely, if ever, hosts anything intended for public consumption. Yes, at one time, during the early stages of the Internet, my geek friends would set up home pages similar to what everybody exposes on Facebook today. Now, with the advent of hosting sites, there is just no reason for a user to host content locally, and thus no need to allow access from the outside. Most firewalls have a setting to disallow uninitiated requests into your network (obviously with an exemption for your publicly facing servers).

We actually have an advanced version of this feature in our NetGladiator security device. We watch each IP address on your internal network and take note of outgoing requests; nobody comes in unless they were invited. For example, if we see a user on the network make a request to a Yahoo server, we expect a response to come back from a Yahoo server; however, if we see a Yahoo server contact a user on your network without a pending request, we block that incoming request. In the world of p2p, this should prevent an outside client from requesting and receiving a copyrighted file hosted on your network. After all, no p2p client is going to randomly send out invitations to outside servers. Or would it?
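
To make the idea concrete, here is a minimal sketch of such a stateful filter. This is hypothetical code, not the actual NetGladiator implementation: remember whom an inside host has contacted, and only admit inbound traffic that matches a recent outbound request.

```python
# Sketch of "nobody comes in unless invited": track outbound requests
# per internal host and admit inbound traffic only if it matches a
# conversation that the internal host started recently.
import time

class UninvitedGuestFilter:
    def __init__(self, ttl: float = 300.0):
        self.ttl = ttl       # seconds an outstanding request stays valid
        self.pending = {}    # (internal_ip, external_ip) -> time of last request

    def saw_outbound(self, internal_ip: str, external_ip: str) -> None:
        # An inside host reached out; remember the conversation.
        self.pending[(internal_ip, external_ip)] = time.time()

    def allow_inbound(self, external_ip: str, internal_ip: str) -> bool:
        # Admit the packet only if the inside host invited this peer.
        ts = self.pending.get((internal_ip, external_ip))
        return ts is not None and (time.time() - ts) < self.ttl

fw = UninvitedGuestFilter()
fw.saw_outbound("10.0.0.5", "98.136.1.1")          # user contacts Yahoo
print(fw.allow_inbound("98.136.1.1", "10.0.0.5"))  # True: invited
print(fw.allow_inbound("203.0.113.9", "10.0.0.5")) # False: uninvited
```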

I spent a few hours researching this subject, and here is what I found (this may need further citations). It turns out that p2p distribution is a bit more sophisticated than that and has ways to get around the block-uninitiated-requests firewall technique.

P2P networks such as Pirate Bay use a directory service of super nodes to keep track of what content peers have and where to find it. When you load up your p2p client for the first time, it just needs to find one super node to get connected; from there it can start searching for available files.
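
Schematically, the bootstrap works like a phone book lookup. The sketch below fakes the directory with a dictionary; real super nodes answer over the network, and the data and message shapes here are invented purely for illustration:

```python
# Schematic super-node directory lookup (data is illustrative):
# a new client knows one super node, asks who has a file, and gets
# back a list of peer addresses to download from.
SUPER_NODE_DIRECTORY = {
    # file hash -> peers known to hold the file
    "1f3a...": [("203.0.113.9", 6881), ("192.0.2.44", 6881)],
}

def query_super_node(file_hash):
    return SUPER_NODE_DIRECTORY.get(file_hash, [])

for ip, port in query_super_node("1f3a..."):
    print(f"candidate source: {ip}:{port}")
```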

Note: You would think that if these super nodes were aiding and abetting the distribution of illegal content, the RIAA could just shut them down as it did Napster. There are two problems with this assumption:

1) The super nodes do not necessarily host content, hence they are not violating any copyright laws. They simply coordinate the network, much as DNS servers keep track of domain names and where to find the servers behind them.
2) The super nodes are not hosted by Pirate Bay; they are basically commandeered from the network of users, who unwittingly agree to perform this directory service when clicking through the license agreement that nobody ever reads.

In my research I have talked to network administrators who claim that, despite blocking uninitiated outside requests on their firewalls, they still get RIAA notices. How can this be?

There are only two ways this can happen.

1) The RIAA is taking the liberty of accusing a network of hosting illegal content based on the directory listings of a super node. In other words, if they find a directory entry on a super node pointing to copyrighted files on your network, that might be evidence enough to accuse you.

2) More likely, and much more complex, the super nodes are brokering the transaction as a condition of being connected. Basically, this means that when a p2p client within your network contacts a super node for information, the super node directs that client to send data out to a third-party client on another network. The transfer therefore looks to the firewall as if it was initiated from within. You may have to think about this, but it makes sense; see the sketch below.
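
Here is a hedged sketch of that brokering in Python. The super-node address and the PUSH message format are invented for illustration; the point is that every connection the inside host makes, including the upload itself, is outbound, so a block-uninitiated-requests firewall waves it through.

```python
# Sketch of super-node "push" brokering: the inside host keeps an
# outbound control connection open to a super node; when a remote peer
# wants a file, the super node tells the inside host to dial OUT to
# that peer and upload. The firewall only ever sees outbound traffic.
import socket

SUPER_NODE = ("198.51.100.7", 4111)   # invented address and port

def shared_file_bytes(file_hash: str) -> bytes:
    # Placeholder: locate the locally shared file by its hash.
    return b"...file contents..."

def serve_via_push():
    ctrl = socket.create_connection(SUPER_NODE)  # outbound: allowed
    while True:
        msg = ctrl.recv(1024).decode()   # e.g. "PUSH 203.0.113.9:6881 1f3a..."
        if not msg:
            break
        if msg.startswith("PUSH"):
            _, target, file_hash = msg.split()
            ip, port = target.rsplit(":", 1)
            data = socket.create_connection((ip, int(port)))  # outbound again
            data.sendall(shared_file_bytes(file_hash))
            data.close()
```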

Behavior-Based Thwarting of P2P, Circa 2004 – NetEqualizer

Behavior-based shaping relies on spotting the unique footprint of a client sending and receiving p2p traffic. From our experience, these clients just do not know how to lay low and stay under the radar. Like a criminal smuggling drugs while doing 100 MPH on the highway, they just can’t help themselves. Part of the p2p methodology is to find as many sources of a file as possible and then download from all sources simultaneously. Combine this behavior with the fact that most p2p consumers are trying to build up a library of content, and thus initiating many file requests at once, and you get a behavior footprint that can easily be spotted. By spotting this behavior and making life miserable for these users, you can achieve self-compliance on your network.
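
A minimal sketch of the footprint test, with an invented threshold (production heuristics are more nuanced): take a snapshot of the connection table and flag hosts holding an outsized number of simultaneous connections.

```python
# Sketch of behavior-based spotting: hosts with an unusually large
# number of simultaneous connections fit the classic p2p footprint.
from collections import Counter

P2P_THRESHOLD = 100   # simultaneous connections; invented cutoff

def suspected_p2p_hosts(flows):
    """flows: iterable of (src_ip, dst_ip, dst_port) tuples from one
    snapshot of the connection table."""
    counts = Counter(src for src, _, _ in flows)
    return sorted(ip for ip, n in counts.items() if n >= P2P_THRESHOLD)

# Example with fabricated snapshot data:
snapshot = [("10.0.0.8", f"203.0.113.{i % 250}", 6881) for i in range(150)]
snapshot += [("10.0.0.3", "198.51.100.1", 443)]
print(suspected_p2p_hosts(snapshot))   # ['10.0.0.8']
```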

Read a smarter way to block p2p traffic.

Blocking the RIAA probing servers

If you know where the RIAA is probing from, you can deny all traffic to and from its probes, preventing the probing of files on your network and the ensuing nasty letters to desist.
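
In practice this is just a blocklist applied at the border. A minimal sketch follows; the address range is a placeholder, since real probe lists circulate informally and change often:

```python
# Sketch of blocking known probe sources: drop any packet to or from
# an address range believed to belong to the probing service.
import ipaddress

PROBE_NETS = [ipaddress.ip_network("192.0.2.0/24")]   # placeholder range

def drop_packet(src: str, dst: str) -> bool:
    return any(ipaddress.ip_address(src) in net or
               ipaddress.ip_address(dst) in net
               for net in PROBE_NETS)

print(drop_packet("192.0.2.55", "10.0.0.9"))   # True: probe blocked
print(drop_packet("8.8.8.8", "10.0.0.9"))      # False
```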

How Effective is P2P Blocking?


This past week, a discussion about peer-to-peer (P2P) blocking tools came up in a user group that I follow. In the course of the discussion, different IT administrators chimed in, citing their favorite tools for blocking P2P traffic.

At some point in the discussion, somebody posed the question, “How do you know your peer-to-peer tool is being effective?” For the next several hours the room went eerily silent.

The reason why this question was so intriguing to me is that for years I collaborated with various developers on creating an open-source P2P blocking tool using layer 7 technology (the Application Layer of the OSI Model). During this period, we released several iterations of our technology as freeware. Our testing and trials showed some successes, but we also learned how fragile the technology was, and we were reluctant to push it out commercially. I had always wondered whether other privately distributed layer 7 blocking tools had found some magic key to perfection.

Sometimes written words can be taken as fact even though the same spoken words might be dismissed as gossip, and so it was with our published open-source technology. We started getting indications that it was being picked up, integrated into other solutions, and touted as gospel.

Our experience with P2P blocking:

Our free P2P blocking tool worked most of the time – maybe eighty percent of it. Eighty percent accuracy is fine for an experimental open-source tool, but intuitively, a blocking tool is expected to be 99.9 percent effective. Even though most customers would likely never conclusively measure our accuracy, eighty percent was too low to ethically sell this technology without disclosures.

The online discussion ended fairly quickly when the question of accuracy was brought up, and I think it is safe to assume the silence indicates that nobody else was achieving better than eighty percent.

How do you validate the effectiveness of a P2P tool?

1) Brute force testing:

I am not aware of many IT administrators who have the time to load up six or seven different P2P clients on their laptops and download bootlegged Madonna videos all day.

In testing P2P clients, we managed to infect several computers with just about every virus in circulation. Over time, you get a rough idea of how deep you must go to expose weaknesses in your tool set. To be thorough, you can’t stop at the first P2P client. In the real world, users on your network will likely try multiple P2P clients, especially if the first one fails. Once they find a chink in the armor, they will yap to others, exposing your Achilles heel.

2) Reduction of RIAA requests:

Most small-to-medium ISPs don’t really think about P2P unless they get RIAA requests or their network is saturated.

RIAA requests seem to be a big motivator in purchasing technology to block P2P. If you are getting RIAA requests (these are letters from lawyers threatening to sue you for copyright infringement), you can install your P2P blocking tool, and if your notifications of copyright violations are way down the following week, you can assume that you have put a good dent in your P2P downloading issue.

3) Reduced congestion:

Plug your P2P tool in and see if your network utilization drops.

4) Lower connection rates through your router:

One of the signatures of P2P is that clients will open up hundreds of connections per minute to other peers in order to download content. There are ways to measure and quantify these connection rates empirically.
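
One hedged way to quantify this, assuming you can observe new connections (for example, from flow logs or a router’s connection table): count new outbound connections per host over a one-minute window. The threshold below is illustrative.

```python
# Sketch of empirical rate measurement: track timestamps of new
# connections per host and flag anyone exceeding a per-minute rate.
import time
from collections import defaultdict

WINDOW = 60.0      # seconds
RATE_LIMIT = 200   # new connections per minute; illustrative cutoff

connection_log = defaultdict(list)   # src_ip -> timestamps

def record_connection(src_ip: str, now: float = None) -> None:
    connection_log[src_ip].append(time.time() if now is None else now)

def hosts_over_limit(now: float = None):
    now = time.time() if now is None else now
    flagged = []
    for ip, stamps in connection_log.items():
        recent = [t for t in stamps if now - t <= WINDOW]
        connection_log[ip] = recent   # prune stale entries
        if len(recent) >= RATE_LIMIT:
            flagged.append((ip, len(recent)))
    return flagged
```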

Other observations:

Many times we’ll hear from an ISP or operator claiming they have P2P users running amok on their network; however, analysis often shows most of their traffic is video – Netflix, YouTube, Hulu, etc.

Total P2P traffic has dropped off quite a bit in the last three or four years. We attribute this decline to:

1) Legal iTunes. 99-cent songs have eliminated the need for pirated music.

2) RIAA enforcement and education of copyright laws.

3) The invention of the iPad and iPhone. These devices control which applications run on them (and they are not going to distribute P2P clients readily).

One method to handle P2P problems is to control all the computers in your environment, scan them before granting network access, and then block access to P2P sites (the sites where the client utilities are loaded from).

Note: once a P2P client is loaded on a computer, you cannot block any single remote site, as the essence of P2P is that the content is not centralized.

Summary:

Results of different P2P blocking techniques are often temporary, especially when you have an aggressive user base with motivation to download free content.

NetEqualizer P2P Locator Technology


Editor’s Note: The NetEqualizer has always been able to thwart P2P behavior on a network. However, our new utility can now pinpoint an individual P2P user or gamer without any controversial layer-7 packet inspection. This is an extremely important step from a privacy point of view, as we can actually spot P2P users without looking at any private data.

A couple of months ago, I was doing a basic health check on a customer’s heavily used residential network. In the process, I instructed the NetEqualizer to take a few live snapshots. I then used the network data to do some filtering with custom software scripts. Within just a few minutes, I was able to inform the administrator that eight users on his network were doing some heavy P2P, and one in particular looked to be hosting a gaming session. This was news to the customer, as his previous tools didn’t provide that kind of detail.

A few days later, I decided to formally write up my notes and techniques for monitoring a live system to share on the blog. But as I got started, another light bulb went on… In the end, many customers just want to know the basics – who is using P2P, who is hosting game servers, etc. They don’t always have the time to follow a manual diagnostic recipe.

So, with this in mind, instead of writing up the manual notes, I spent the next few weeks automating and testing an intelligent utility to provide this information. The utility is now available with NetEqualizer 5.0.

The utility provides: 

  • A list of users that are suspected of using P2P
  • A list of users that are likely hosting gaming servers
  • A confidence rating for each user (from high to low)
  • The option of tracking users by IP and MAC address

The key to determining a user’s behavior is analysis of the fluctuations in their connection counts and their total number of connections. We take snapshots over a few seconds and, like a good detective, we’ve learned how to differentiate P2P use from gaming, Web browsing and even video. We can do this without any deep packet inspection; it’s all based on human-factor heuristics and years of practice.
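
As a speculative sketch only (the actual NetEqualizer heuristics are proprietary, and every number below is invented), the intuition is that p2p hosts hold many connections that churn between snapshots, while a game host holds a modest, very stable set:

```python
# Speculative sketch of snapshot-based classification; thresholds are
# invented. P2P: many connections, high churn. Game host: modest,
# very stable connection count. Otherwise: normal traffic.
import statistics

def classify(counts):
    """counts: connection counts for one host across snapshots
    taken a few seconds apart. Returns (label, confidence)."""
    mean = statistics.mean(counts)
    churn = statistics.pstdev(counts)
    if mean > 80 and churn > 20:
        return "p2p", "high" if mean > 200 else "medium"
    if 10 < mean <= 80 and churn < 5:
        return "gaming-host", "medium"
    return "normal", "low"

print(classify([220, 180, 260, 240]))   # ('p2p', 'high')
print(classify([24, 25, 23, 24]))       # ('gaming-host', 'medium')
```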

Enclosed is a screen shot of the new P2P Locator, available under our Reports & Graphing menu.

Our new P2P Locator technology

Contact us to learn more about the NetEqualizer P2P Locator Technology or NetEqualizer 5.0. For more information about ongoing changes and challenges with BitTorrent and P2P, see Ars Technica’s “BitTorrent Has New Plan to Shape Up P2P Behavior.”

Comcast Suit: Was Blocking P2P Worth the Final Cost?


By Art Reisman
CTO of APconnections
Makers of the plug-and-play bandwidth control and traffic shaping appliance NetEqualizer


Comcast recently settled a class action suit in the state of Pennsylvania regarding its practice of selectively blocking P2P. So far, the first case was settled for 16 million dollars, with more cases on the docket yet to come. To recap: Comcast and other large ISPs invested in technology to thwart P2P, denied involvement when first accused, got spanked by the FCC, and now Comcast is looking to settle various class action suits.

When Comcast’s practices were put in place, P2P usage was skyrocketing with no end in sight, and blocking some of it was necessary to preserve reasonable speeds for all users. Given that there was no specific law or ruling on the books, it seemed like mucking with P2P to alleviate gridlock was a rational business decision. The decision made even more sense considering that DSL providers were stealing disgruntled customers. With that said, Comcast wasn’t alone in the practice – all of the larger providers were throttling P2P to some extent to ensure good response times for their customers.

Yet, with the lawsuits mounting, it appears on face value that things backfired a bit for Comcast. Or did they?

We can work out some very rough estimates of the final cost trade-off. Here goes:

I am going to guess that before this plays out completely, settlements will run close to $50 million or more. To put that in perspective, Comcast showed a 2008 profit of close to $3 billion, so $50 million is hardly a dent to its stockholders. But to play this out, we must ask what the ramifications would have been of not blocking P2P back when all of this began and P2P was a more serious bandwidth threat. (Today, while P2P has declined, YouTube and online video are the primary bandwidth hogs.)

We’ll start with the customer. The cost of acquiring a new customer is usually calculated at around six months of service, or approximately $300, so to make things simple we’ll assume the net cost of losing a customer is roughly $300. In addition, there are also support costs related to congested networks, which can easily run $300 per customer incident.

The other, more subtle cost of P2P is that the methods used to deter it were designed to keep traffic on the Comcast network. ISPs pay for exchanging data when they hand off to other networks, and by limiting the amount of data exchanged, they can save money. I did some cursory research on the costs involved with exchanging data and did not come up with anything concrete, so I’ll assume a heavy P2P customer costs about $5 per month.

So, let’s put the numbers together to get an idea of how much potential financial damage P2P was causing back in 2007 (again, these are estimates, not fact; comments and corrections are welcome):

  • Comcast had approximately 15 million broadband customers in 2008.
  • If 1 in 100 were heavy P2P users, at $5 per user that would be roughly $750,000 per month in exchange costs.
  • Net customers lost to a competitor might run 1 in 500 per month; at $300 each, that is $9 million a month.
  • Support calls due to preventable congestion might hit another 1 in 500 customers, or $9 million a month.

So, very conservatively, incremental costs related to unmitigated P2P could have easily run about $18.75 million a month, or roughly $450 million across 2007 and 2008, right off the bottom line.
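
For transparency, here is the back-of-envelope arithmetic as a runnable check; every input is one of the article’s assumptions, not measured data:

```python
# Back-of-envelope check of the estimates above (inputs are the
# article's assumptions, not measurements).
customers     = 15_000_000
p2p_share     = 1 / 100    # fraction of heavy p2p users
exchange_cost = 5          # $ per heavy user per month
churn_rate    = 1 / 500    # customers lost per month
support_rate  = 1 / 500    # support incidents per month
unit_cost     = 300        # $ per lost customer or support incident

monthly = (customers * p2p_share * exchange_cost
           + customers * churn_rate * unit_cost
           + customers * support_rate * unit_cost)
print(f"monthly:   ${monthly:,.0f}")        # monthly:   $18,750,000
print(f"2007-2008: ${monthly * 24:,.0f}")   # 2007-2008: $450,000,000
```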

Therefore, while these calculations are approximations, in retrospect it was likely financially well worth the risk for Comcast to mitigate the effects of unchecked P2P. Of course, the public relations costs are much harder to quantify.

Four Reasons Why Peer-to-Peer File Sharing Is Declining in 2009


By Art Reisman

CTO of APconnections, makers of the plug-and-play bandwidth control and traffic shaping appliance NetEqualizer


I recently returned from a regional NetEqualizer tech seminar with attendees from Western Michigan University, Eastern Michigan University and a few regional ISPs. While taking a live look at Eastern Michigan’s p2p footprint, I remarked that it was way down from what we had been seeing in 2007 and 2008. The consensus from everybody in the room was that p2p usage is waning. Obviously this is not a broad data set to draw conclusions from, but we have seen the same trend at many of our customer installs (3 or 4 a week), so I don’t think it is a fluke. It is kind of ironic, with all the controversy around Net Neutrality and BitTorrent blocking, that the problem seems to be taking care of itself.

So, what are the reasons behind the decline? In our opinion, there are several reasons:

1) Legal iTunes and other MP3 downloads are the norm now. They are reasonably priced and well marketed. These downloads still take up bandwidth on the network, but they do not clog access points with connections the way torrents do.

2) Most music aficionados are well stocked with the classics (bootleg or not) by now and are only grabbing new tracks legally as they come out. The days of downloading an entire collection of music at once seem to be over. Fans have their foundation of digital music and are simply adding to it rather than building it up from nothing as they were several years ago.

3) RIAA enforcement got its message out there. This, coupled with reason #1 above, pushed users to go legal.

4) Legal, free and unlimited. YouTube videos are more fun than slow music downloads and they’re free and legal. Plus, with the popularity of YouTube, more and more television networks have caught on and are putting their programs online.

Despite the decrease in p2p file sharing, ISPs are experiencing more pressure on their networks than ever from Internet congestion. YouTube and Netflix are more than capable of filling the void left by waning torrents. So don’t expect the controversy over traffic shaping and the use of bandwidth controllers to go away just yet.
