The NetEqualizer P2P Locator Is Now Available
This past week, we announced the release of our new peer-to-peer (P2P) locator technology. The release is the most recent addition to the growing capabilities of NetEqualizer 5.0 and marks another significant step in our commitment to ensuring network quality while maintaining user privacy.
Although the NetEqualizer has long been able to thwart P2P behavior without any controversial layer 7 packet inspection, this new utility can now automatically pinpoint and identify an individual P2P user or gamer without looking at any private data. The key to determining a user’s behavior is the analysis of fluctuations in their connection counts and in their total number of connections. By taking snapshots of network activity over a few seconds, the utility differentiates P2P use from gaming, Web browsing, and even video.
Using this information, the utility provides:
- A list of users that are suspected of using P2P
- A list of users that are likely hosting gaming servers
- A confidence rating for each user (from high to low)
- The option of tracking users by IP and MAC address
In the past, identifying this kind of activity required a time-consuming study of network behavior. The new utility now gives administrators these results without the additional legwork.
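To give a feel for the approach, here is a minimal Python sketch of a connection-count heuristic of the kind described above. It is not NetEqualizer’s actual implementation; the thresholds, function names, and sample data are hypothetical and chosen only for readability. The point is simply that snapshots of per-IP connection counts, with no payload inspection, can separate P2P-like churn from the stable sessions of a hosted game server or the light footprint of browsing and video.

```python
# Illustrative sketch only -- not NetEqualizer's code. Classifies hosts from
# connection-count snapshots alone, without inspecting packet payloads.
# All thresholds and sample data below are hypothetical.
from statistics import mean, pstdev

def classify_host(snapshots):
    """snapshots: connection counts for one IP, sampled a few seconds apart."""
    avg = mean(snapshots)
    churn = pstdev(snapshots)  # how much the count fluctuates between samples

    if avg > 100 and churn > 30:
        # Many connections that come and go quickly -> P2P-like swarm behavior
        return ("suspected P2P", "high" if avg > 200 else "low")
    if avg > 40 and churn < 5:
        # Many long-lived, stable connections -> looks like a hosted game server
        return ("possible gaming server", "high" if avg > 50 else "low")
    # Few, steady connections -> ordinary browsing or streaming video
    return ("normal traffic", "high")

# Hypothetical per-IP snapshots taken a few seconds apart
samples = {
    "10.0.0.12": [220, 310, 180, 270, 340],  # churning swarm of peers
    "10.0.0.47": [62, 60, 61, 63, 62],       # stable inbound sessions
    "10.0.0.99": [6, 4, 7, 5, 6],            # light web/video use
}
for ip, counts in samples.items():
    label, confidence = classify_host(counts)
    print(f"{ip}: {label} (confidence: {confidence})")
```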
NetEqualizer 5.0 and the P2P locator technology are available at no charge for customers with current NetEqualizer Software Subscriptions (NSS). Additional information about the NetEqualizer and user privacy can be found in the NetEqualizer News Blog’s “NetEqualizer Offers Net Neutrality, User Privacy Compromise.”
Another Successful Tech Seminar Is In the Books…Here’s A Rundown

This past March, we held our most recent complimentary NetEqualizer Technical Seminar in Southern California with host Biola University. As always, the Seminar was great, and we had a wonderful time meeting with several current and future NetEqualizer users.
In addition to Biola, the Seminar was attended by NetEqualizer users such as Chapman University, The Master’s College, Southern California Coastal Water Research Project, and JD Enterprises, who came all the way from Haiti.
After opening remarks from Biola University Director of IT Operations Scott Himes, APconnections co-founder and CTO Art Reisman took center stage to discuss several recent technical advances in the NetEqualizer, such as the release of the new NetEqualizer Caching Option and enhancements providing softer license-violation enforcement and improved handling of pools.
Keeping with our Seminar’s traditional hands-on approach, Biola’s network was also analyzed live on a large projection screen, with various network reports displayed and possible P2P/BitTorrent sessions identified and discussed. The hands-on demonstration was followed by a Q&A session, with topics ranging from fine-tuning and network policy enforcement to IPv6 adoption and how the NetEqualizer’s internals handle bandwidth accounting and connection persistence. (This included Art’s story about his shameless offer to give his ISP a NetEqualizer to improve his Internet connection so he could listen to a stream of the Broncos game uninterrupted from his rural Kansas farm.)
However, the Q&A was not just a one-way street, as we always enjoy hearing suggestions from participants and learning more about the individual issues facing network administrators. There was a general consensus that the biggest bandwidth management challenge right now is video (YouTube, Netflix, Apple TV, Hulu, etc.). This was substantiated by a recent Morgan Stanley Research report cited during the meeting, as well as by users’ own experiences and observations. A number of participants shared their own video policies and how those policies may change going forward. Even customers with surplus bandwidth have either considered or are currently contemplating bandwidth limits on users/IPs as a response.
Overall, the Biola University Seminar was another enjoyable and successful meet-up for both current and future NetEqualizer users as well as the APconnections staff. We hope to see you at our next Seminar (see our next article)!
Our Next NetEqualizer Tech Seminar Is Coming Soon!
Plans are now in the works for our next complimentary NetEqualizer Technical Seminar. We’re currently taking suggestions for potential hosts, so if you’re interested, be sure to let us know.
The upcoming Seminar will cover:
- The various tradeoffs regarding how to stem P2P and bandwidth abuse
- Recommendations for curbing RIAA requests
- Demo of the NetEqualizer network access control module
- Lots of customer Q&A and information sharing on how clients are using the NetEqualizer, including some hands-on probing of a live system
As if that weren’t enough, we’ll be giving away great door prizes to attendees, so be sure not to miss this Seminar! We’ll keep you posted as the details develop and the final location is determined. For more information, or to express interest in hosting, contact us via email to admin.
See You At edACCESS!

edACCESS is quickly approaching! On June 22nd, APconnections will be attending the edACCESS conference in Hightstown, New Jersey. We hope to see many of you there!
The edACCESS conference is geared toward small schools and colleges, and was recommended to us by long-time customer Tom Phelan of The Peddie School. The conference is unique in that it is limited to 100 attending schools, and the attendees define the topics to be discussed during the first sessions! If you’re a small school or college and would like to attend, please register at edaccess.
Also, we’re always open to new conferences. So, if you know of a conference that you think would be a good fit for us, please email us at sales with your recommendation. See you in New Jersey!
Best of The Blog
The True Price Of Bandwidth Monitoring
For most IT administrators, bandwidth monitoring of some sort is an essential part of keeping track of, as well as justifying, network expenses. Without visibility into network load, an administrator’s job would degrade into a quagmire of random guesswork. Or would it?
The traditional way of looking at the cost of monitoring your Internet has two parts: the fixed cost of the monitoring tool used to identify traffic, and the labor associated with devising a remedy. Ironically, we assert that both costs increase with the complexity of the monitoring tool. Obviously, the more detailed the reporting tool, the more expensive its initial price tag. The kicker comes with part two: the more expensive the tool, the more detail it provides, and the more time an administrator is likely to spend adjusting and mucking about, looking for optimal performance.
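To make that assertion concrete, here is a hypothetical back-of-the-envelope comparison. Every figure below is invented purely for illustration and is not drawn from any real pricing; the calculation simply shows how, if the labor assumption holds, the recurring tuning time can dwarf the tool’s sticker price over a year.

```python
# Hypothetical comparison (all figures invented): a simple "glance at the
# graph" monitor vs. a detailed deep-reporting tool that invites more tuning.
def yearly_cost(tool_price, admin_hours_per_week, hourly_rate=50, weeks=52):
    # Total cost of ownership = fixed tool cost + ongoing labor
    return tool_price + admin_hours_per_week * hourly_rate * weeks

basic    = yearly_cost(tool_price=1_000,  admin_hours_per_week=1)  # $3,600
detailed = yearly_cost(tool_price=10_000, admin_hours_per_week=5)  # $23,000

print(f"Basic monitor, light tuning:    ${basic:,.0f}/year")
print(f"Detailed monitor, heavy tuning: ${detailed:,.0f}/year")
```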
But is it fair to assume higher labor costs with more advanced monitoring and information?
Well, obviously it would not make sense to pay more for an advanced tool if there was no intention of doing anything with the detailed information it provides. Why have the reporting tool in the first place if all you did was stare at the reports and do nothing? Typically, the more information an admin has about a network, the more inclined he might be to spend time making adjustments.
On a similar note, a common oversight with labor costs is the belief that once the needed network adjustments have been made, they can remain statically in place. In reality, however, network traffic changes constantly, and the tuning so meticulously performed on Monday may be obsolete by Friday.
Does this mean that using a bandwidth monitoring tool is a net loss? Not at all. Bandwidth monitoring and network mucking can certainly result in a cost-effective solution. But where is the tipping point? When does a monitoring solution create more costs than it saves?
A review of recent history reveals that technologies on a path similar to bandwidth monitoring have become commodities and shed the overhead of most human intervention.
To keep reading, click here.