The Illusion of Separation: My Malaysia Trip Report


By Zack Sanders

VP of Security – APconnections

Traveling is an illuminating experience. Whether you are going halfway across the country or halfway around the world, the adventures that you have and the lessons that you learn are priceless and help shape your outlook on life, humanity, and the planet we live on. Even with the ubiquity of the Internet, we are still so often constrained by our limited and biased information sources that we develop a world view that is inaccurate and disconnected. This disconnection is the root of many of our problems – be they political, environmental, or social. There is control in fear, and the powerful keep their seats by reinforcing this separation to the masses. The realization that we are all together on this planet, and that we all largely want the same things, can only come from going out and seeing the world for yourself with as open a mind as possible.

One of the great things about NetEqualizer, and working for APconnections, is that, while we are a relatively small organization, we are truly international in our business. From the United States to the United Kingdom, and Argentina to Finland, NetEqualizers are helping nearly every vertical around the world optimize the bandwidth they have available. Because of this global reach, we sometimes get to travel to unique customer sites to conduct training or help install units. We recently acquired a new customer in Malaysia – a large university system called International Islamic University Malaysia, or IIUM. Their order included NetEqualizers for all of their campuses, plus two days of training – one day at each of two of their main locations (Kuala Lumpur and Kuantan). I jumped at the chance to travel to Asia (my first time on the continent) and promptly scheduled dates with our primary contact at the University.

I spent the weeks prior to my departure in Spain – a nicely timed, but unrelated, warm-up trip to shake off the rust that had accumulated since my last international travel experience five years ago. The part of the Malaysia trip I was dreading most was the hours I would log in seat 46E of the Boeing 777 I was to take to Kuala Lumpur on Singapore Airlines. Having the Spain trip first helped ease me into the longer flights.

F.C. Barcelona hosting Real Madrid at the Camp Nou.

My Malaysia itinerary looked like this:

Denver -> San Francisco (2.5 hours), Layover (overnight)

San Francisco -> Seoul (12 hours), Layover (1 hour)

Seoul -> Singapore (7 hours), Layover (6 hours)

Singapore -> Kuala Lumpur (1 hour)

I was only back in the United States from Spain for one week. It was a fast, but much needed, seven days of rest. The break went by quickly and I was back in the air again, this time heading west.

After 22 hours on the plane and 7 hours in various airports, I was ready to crash at my hotel in the City Centre when I touched down in KL. I don’t sleep too well on planes so I was pretty exhausted. The trouble was that it was 8am local time when I arrived and check-in wouldn’t be until 2:00pm. Fortunately, the fine folks at Mandarin Oriental accommodated me with a room and I slept the day away.

KL City Centre.

I padded my trip with the intention of having a few days before the training to get adjusted, but the adjustment didn’t take as long as I thought, and I was able to do some sightseeing in and outside the city before the training.

My first stop was Batu Caves – a Hindu shrine located near the last stop of the KTM Komuter line in the Gombak District – which I later learned was near the location of my first training seminar. The shrine is set atop 272 stairs in a 400-million-year-old limestone cave. After the trek up you are greeted by lightly dripping water and a horde of ambitious monkeys, in addition to the shrines within the cave walls.

Batu Caves entrance.

Batu Caves.

Petronas Towers.

This was the furthest I ventured from the city for sightseeing. The rest of the time I spent near the City Centre – combing through the markets of Chinatown and Little India, taking a tour of the Petronas Towers, and checking out the street food on Jalan Alor. Kuala Lumpur is a very Western city. The influence is everywhere despite the traditional Islamic culture. TGI Fridays, Chili’s, and Starbucks were the hotspots – at least in this touristy part of town. On my last night I found a unique spot at the top of the Traders Hotel called Skybar. It is a prime location because it looks directly at the Petronas Towers – which, at night especially, are gorgeous. The designers of the bar did a great job with sweeping windows and sunken sofas for enjoying the view. I stayed for a couple of hours and had a Singapore Sling – a drink I had heard of but never gotten to try.

Singapore Sling at the Skybar.

The city and sights were great; however, the primary purpose of the trip was not leisure – it was to share my knowledge of NetEqualizer with those who would be working with it at the University. To be honest, I wasn’t sure what to expect. This was definitely different from most locations I have been to in the past. A lot of thoughts went through my head about how I’d be received, whether the training would be valuable, and so on. It’s not that I was worried about anything in particular; I just didn’t know. My first stop was the main location in KL. It’s a beautifully manicured campus where the buildings all have aqua blue roofs. My cab driver did a great job helping me find the Information Technology Department building, and I quickly met up with my contact and got set up in the Learning Lab.

This session had nine participants – ranging from IT head honchos to network engineers. Specific experience with the NetEqualizer also ranged from well-versed to none at all. I tailored the training so that it would be useful to all participants – we went over the basics but also spent time on more advanced topics and configurations. All in all, the training lasted six hours or so, including an hour break for lunch that I took with some of the attendees. It was great talking with each of them – regardless of whether the subject was bandwidth congestion or the series finale of Breaking Bad. They were great hosts and I look forward to keeping in touch with them.

Training at IIUM.

I was pretty tired from the day by the time I arrived back at the hotel. I ate and got to bed early because I had to leave at 6:00am for my morning flight across the peninsula to Kuantan – a short, 35-minute jaunt eastward – to do it all over again at that campus. Kuantan is much smaller than KL, but it is still a large city. I didn’t get to see much of it, however, because I took a cab directly from the airport to the campus and got started. There were only four participants this time, but the training went just as well. I had similar experiences talking with this group, and they, too, were great hosts. I returned to the airport in the evening and took a flight back to KL. The flight is so short that it’s comical. It goes like this:

Taxi to the runway -> “Flight attendants prepare for takeoff” -> “You may now use your electronic devices” -> 5 minutes goes by -> “Flight attendants prepare for landing – please turn off your electronic devices” -> Land -> Taxi to terminal

The airport in Kuantan at sunset.

I had one more day to check out Kuala Lumpur and then it was back to the airport for another 22 hours of flying. At this point though, I felt like a flying professional. The time didn’t bother me and the frequent meals, Sons of Anarchy episodes, and extra leg room helped break it up nicely. I took a few days in San Francisco to recover and visit friends before ultimately heading back to Boulder.

It was a whirlwind of a month. I flew almost 33,000 miles in 33 days and touched down in eight countries on three continents. Looking back, it was a great experience – both personally and professionally. I think the time I spent in these places, and the things I did, will pay invaluable dividends going forward.

If your organization is interested in NetEqualizer training – regardless of whether you are a new or existing customer – let us know by sending an email to sales@apconnections.net!

View of KL Tower from the top of the Petronas Towers.

How much on-line advertising revenue is fraudulent?


Today the Wall Street Journal broke a story describing how professionals are scamming on-line advertising revenue. The scam is pretty simple.

  • First, create a web site of some kind.
  • Second, hijack personal computers all over the world, or contract with a third party that does this for you.
  • Third, have those computers visit your site en masse to drive up its popularity.
  • Fourth, sell advertisement space on your website based on the fake heavy traffic.

The big loser in this game is the advertising sponsor.

Our Experience

I have been scratching my head for years about the patterns and hit ratios of our own pay-per-click advertisements that we have placed through third parties such as Google. The Google advertising network for content ad placement is a black hole of blind faith. No matter how hard you examine your results, you cannot figure out who is clicking your advertisements and why. I do know that Google takes fraud seriously, but I also know that in the past we have been scammed.

Early on in our business, before we had any Web presence, we were putting a large portion of our very limited advertising budget into on-line advertising. Initially we did see a very strong correlation of clicks to inquiries. It was on the order of 100 to 1 – one hundred paid clicks per follow-through inquiry. And then one day we jumped to 1,500 clicks – a whopping 15-fold increase – but there was no increase in corresponding inquiries, not even a tiny blip. What are the chances of that? As you can imagine, we had very little recourse other than to pay our bill for the phony clicks. We then removed our content placement advertisements and switched over to search engine only. Search engine clicks are not likely scammed, as Google does not split this revenue with third parties.
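
A quick back-of-envelope check shows just how unlikely that silence was. This is my own rough math, not anything from the article, and it assumes each click independently converts at the historical 1-in-100 rate:

# Chance that 1,500 independent clicks at a 1% conversion rate
# produce zero inquiries (rough independence assumption).
p_inquiry = 1 / 100
clicks = 1500
p_zero = (1 - p_inquiry) ** clicks
print(p_zero)   # about 2.8e-07, i.e. roughly 1 chance in 3.5 million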

I honestly have no idea how big the scamming business on content advertisement is, but I do suspect it is enormous. In the Wall Street Journal article, the companies that have investigated and prosecuted scammers are large companies with the resources to detect and do something about the fraud; the average small business placing content advertisements is running blind.

How Much Will That Replacement 3D Printer Cartridge Cost?


By Art Reisman

CTO http://www.netequalizer.com

In case you haven’t heard, Jeff Bezos, of Amazon fame, is tossing around the idea of using 3D printers to provide unlimited inventory in all of their local warehouses. I suspect this is more a proclamation for publicity than a near-term reality. A 3D printer is just a catchy name for a new way to do injection molding. It involves a process that lays down material in slices, one on top of another, under the guidance of a computer, until the slices stack up into the finished part. I don’t doubt this process will make 100-year-old injection molding techniques obsolete, but beyond economizing the parts business, I just don’t see a mesh of generic warehouses able to deliver products on demand through 3D printing. For example, if I dropped all the parts needed to build a Harley Davidson at the local hardware store for final assembly, would you want to buy the finished product? Modern manufacturing takes advantage of extremely agile and automated assembly-line techniques to build anything. Sourcing parts on demand with local, one-off assembly would greatly inflate consumer costs for anything more complex than a paperweight.

As for my question about the replacement cost of the printer cartridge, I am more worried about the next-generation paper jam: hot liquid plastic spewing out all over my desk.

Who is Your Customer?


By Art Reisman

CTO – netequalizer.com

My morning ritual involves stopping in at my local grocery store for a cup of coffee at their branded coffee stand. Sometimes I also pick up a few grocery items before heading into the office. At this particular King Soopers, before 7:00 am, they don’t have any checkout lanes open. My only option is the automated line. The automated lanes are great when you have one or two standard coded items, but every once in a while I forget the rules and make the mistake of buying an un-coded bakery item, or some produce that the scanner does not know how to handle. Doing so can make you the laughing stock of the store. The employees will huddle in the back room giggling at you on the security camera as you paw through endless menu options for a muffin that does not exist in the system.

This morning my first clue that something was amiss was that there were two checkout lanes open with attendants and baggers. All this at 6:45 in the morning. I also noticed somebody under the fresh flowers scrubbing the crud off the wood floor where the moisture seeps, two people organizing carts, and some strange men in suits huddling around the demo food vendors. Wait a second – demo food vendors at 6:45 in the morning?

“So is the CEO coming into the store today?” I asked the attendant, who was dressed in a newly printed shirt scrawled with a Dilbert slogan about customer service. “No, not the CEO – he was here last week – today we have the Vice President of Sales visiting,” she replied.

I guess the VP of sales must bring them a ton of business because they were rolling out the red carpet like he was a Vegas high roller.

This reminded me of my early days as an engineer at Sperry Corporation in Eagan, Minnesota, when I had to break down all my experimental circuit boards in my lab and sit there, politely acting intelligent, for two full days because the VP of engineering was coming for a visit and might tour our lab at some point.

Sperry went under shortly after that incident. King Soopers’ parent company is healthy right now, and perhaps my experience is isolated and unfair.

I still ask the question, who is your customer?

Is a Balloon Based Internet Service a Threat to Traditional Cable and DSL?


Update:

 

Looks like this might be the real deal: a mystery barge in San Francisco Bay owned by Google.

 

I recently read an article regarding Google’s foray into balloon-based Internet services.

This intriguing idea sparked a discussion with some of the engineers at a major satellite Internet provider on the same subject. They, like me, were somewhat skeptical about the feasibility of this balloon idea. Could we be wrong? Obviously, there are some unconventional obstacles with bouncing Internet signals off balloons, but what if those obstacles could be economically overcome?

First, let’s look at the practicalities of using balloons to beam Internet signals from ground-based stations to consumers.

Advantages over satellite service

Latency

Satellite Internet, the kind used by WildBlue, usually comes with a minimum of a one-second delay, sometimes more. The bulk of this signal delay is due to the distance to a stationary satellite – about 22,000 miles.

A balloon would be located much closer to the earth, in the atmosphere at around 2 to 12 miles up. The propagation delay at this distance is just a few milliseconds.
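
For a rough sense of the difference, here is a back-of-envelope calculation using only the distances above and the speed of light. Real-world latency adds modulation, queuing, and routing overhead on top of these figures, which is why satellite users often see a full second:

# Propagation delay only: ground -> relay -> ground is twice the altitude.
SPEED_OF_LIGHT_MILES_PER_SEC = 186_282

def one_way_delay_ms(altitude_miles):
    return 2 * altitude_miles / SPEED_OF_LIGHT_MILES_PER_SEC * 1000

print(round(one_way_delay_ms(22_000), 1))  # stationary satellite: ~236 ms each direction
print(round(one_way_delay_ms(12), 3))      # balloon at 12 miles up: ~0.129 ms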

Cost

Getting a basic stationary satellite into space runs a minimum of 50 million dollars, perhaps a bit less for a low-orbiting, non-stationary satellite.

Balloons are relatively inexpensive compared to a satellite. Although I don’t have exact numbers on a balloon, the launch cost is practically zero – a balloon carries its payload without any additional energy or infrastructure – so the only real costs are the balloon, the payload, and the ground-based stations. For comparison purposes, let’s go with $50,000 per balloon.

Power

Both options can use solar power. Orienting a balloon’s solar collectors might require 360-degree coverage; however, as we will see, a balloon can be tethered and periodically raised and lowered, in which case power can come from ground-based recharging.

Logistics

This is the elephant in the room. The position of a satellite over time is extremely predictable. Even satellites that are not stationary can be relied on to be where they are supposed to be at any given time. This makes coverage planning deterministic. Balloons, on the other hand, unless tethered, will wander with very little predictability.

Coverage Range

A balloon at 10,000 feet can cover a radius on the ground of about 70 miles. A stationary satellite can cover an entire continent. So you would need a series of balloons to cover an area reliably.
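
As a rough geometric sanity check (my own numbers, not from any vendor), the line-of-sight horizon puts an upper bound on coverage; the usable radius, like the ~70-mile figure above, is smaller once antenna angles and signal strength are factored in:

import math

EARTH_RADIUS_MILES = 3959

def horizon_miles(altitude_feet):
    # Distance to the geometric horizon from a given altitude.
    return math.sqrt(2 * EARTH_RADIUS_MILES * altitude_feet / 5280)

print(round(horizon_miles(10_000)))  # ~122 miles line-of-sight from 10,000 feet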

Untethered

I have to throw out the idea of untethered high-altitude balloons. They would wander all over the world and crash back to earth in random places. Even if it was cost-effective to saturate the upper atmosphere with them and pick them out when in range for communications, I just don’t think NASA would be too excited to have thousands of these large balloons in unpredictable drift patterns.

Tethered

As crazy as it sounds, there is a precedent for tethering a communication balloon to a 10,000-foot cable. Evidently the US did something like this to broadcast TV signals into Cuba. I suppose for an isolated area where you can hang out offshore, well out of the way of any air traffic, this is possible.

High Density Area Competition

So far I have been running under the assumption that a balloon-based Internet service would be an alternative to satellite coverage, which finds its niche exclusively in rural areas of the world. When I think of the monopoly and cost advantage existing carriers have in urban areas, a wireless service with high speeds beamed from overhead might have some staying power. Certainly there could be some overlap with rural users, and thus the economics of deployment become more cost-effective – the more subscribers the better. But I do not see urban coverage as a driving business factor.

Would the consumer need a directional Antenna?

I have been assuming all along that these balloons would supply direct service to the consumer. I suspect that some sort of directional antenna pointing at your local offshore balloon would need to be attached to the side of your house. This is another reason why the balloons would need to be in a stationary position.

My conclusion is that somebody like Google could conceivably create a balloon zone off of any coastline with a series of balloons tethered to barges of some kind. The main problem, assuming cost was not an issue, would be the political ramifications of a plane hitting one of the tethers. With Internet demand on the rise, 4G’s limited range, and the high cost of laying wires to the rural home, I would not be surprised to see a test network someplace in the near future.

Tethered balloon (courtesy of an Ars Technica article).

Revenge of the Spammed


By Art Reisman

CTO http://www.apconnections.net

This morning, after spending the customary five to ten minutes filtering good e-mail from bad e-mail, I decided I was no longer going to be a victim of spam. I started thinking of ways to counter it – something beyond the traditional filtering mechanisms that are just a preventive defense. Although not in line with my core beliefs, I wanted to take revenge on spam, to make it suffer and feel pain.

The word “virus” in the computer world always carries negative connotations. Why is that? In the natural world, we now know there are good germs and bad germs, invasive species and beneficial species, and so on.

So why can’t we create a good germ for the cyberworld, a germ that would work for us to combat spam?

I think we can.

My first idea is to create a little automated bot that will respond to my incoming unsolicited spamming friends and engage them in a frivolous black hole of wasted time.

When I install my new tool into my e-mail reader, it will replace the “spam” button with a new button called “attack” or “revenge” – or maybe something a little less macho and aggressive. I’ll leave the final button name choice to the focus group. Once the button is pressed, the wheels will be set in motion as my bot comes alive.

For starters, it will respond with a reasonable, non-specific inquiry back to the spammer. (Yes, I’ll have to be smart and change up my responses from a vast database so they never know they are talking to a bot.)

For example, the spammer would start with a subject line something like:

“Your Credit Score May Have Been Updated.”

My Bot would respond with:

“Thank you for sending me this information about your services.

Can you kindly send me some more information on your pricing and terms of service? Also, I will gladly consider buying if you can send me three references from satisfied customers.”

So you see where this is going…
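
To make the idea concrete, here is a toy sketch of the bot’s reply logic. The function name and canned replies are made up for illustration, and nothing here connects to a real mail server:

import random

# Canned, non-specific stall replies, varied so the spammer can't easily
# tell they are corresponding with a bot.
STALL_REPLIES = [
    "Thank you for the information about your services. Could you send more "
    "detail on pricing and terms of service?",
    "This sounds interesting. Can you provide three references from satisfied "
    "customers before I commit?",
    "I would gladly consider buying. What payment plans do you offer?",
]

def reply_to_spam(subject):
    # Build a plausible inquiry that a human on the other end has to answer.
    return "Regarding your message '%s':\n\n%s" % (subject, random.choice(STALL_REPLIES))

print(reply_to_spam("Your Credit Score May Have Been Updated"))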

The ultimate goal would be to replicate my tool to the entire world of law-abiding citizens. I alone could not deter spammers with my lone bot, but if this idea caught on, spammers would get inundated with time-wasting responses, leading them down a path of slow, painful spammer starvation. They would relentlessly squander all of their energy searching in vain for an actual human response.

I suppose there are all sorts of flaws in my plan, so don’t ask me for the final commercial version just yet – at least not until I have more time to vet the idea and the prototype.

It should be known that there is a precedent in this area. Back in a time before the Internet, unsolicited phone calls were reaching a peak in the early 1990s. I was actually involved as an arms supplier in that previous war: I was the system architect for AT&T’s business-class inbound call answering servers. In the wrong hands, these servers could also be used for robotic outbound dialing.

We had labs set up where we could direct our calling servers to make thousands of calls an hour using 128 phone lines. Normally a test setup involved having two servers call each other to create a load. One machine would call the other, and once the call was established, talking bots on each machine would engage each other ad nauseam in a dance of prompts and automated touch-tone responses.

In one case, I fat-fingered an internal number in the outbound calling program. The end result was that I caused a test machine to call out to a real phone – a colleague of mine’s – rendering his phone completely useless for half a day. I became a sort of folk hero within the group when I boldly admitted my mistake and took credit for it.

The urban legend of the time – and people swore this story was true – was that some tech working at one of the AT&T resellers took one of our boxes and turned it on any unsolicited business caller, with a little message script that would start by saying,

“Hello, I am an automated customer,” and then would ask them a bunch of questions about their business and hang up.

He would direct 64 calls at a time into the business’s call center, tying up all their agents.

Five Things to Consider When Building a Commercial Wireless Network


By Art Reisman, CTO, APconnections,  www.netequalizer.com

with help from Sam Beskur, CTO Global Gossip North America, http://hsia.globalgossip.com/

Over the past several years we have provided our bandwidth controllers as a key component in many wireless networks. Along the way we have seen many successes, and some not-so-successful deployments. What follows are some key lessons from our experiences with wireless deployments.

1) Commercial Grade Access Points versus Consumer Grade

Commercial grade access points use intelligent collision avoidance in densely packed areas. Basically, what this means is that they make sure a user with access to multiple access points is only being serviced by one AP at a time. Without this intelligence, you get signal interference and confusion. An analogy would be asking a sales rep for help in a store and having two sales reps answer you at the same time; it would be confusing to know which one to listen to. Commercial grade access points follow a courtesy protocol, so you do not get two responses, or possibly even three, in a densely packed network.

Consumer grade access points are meant to service a single household. If there are two in close proximity to each other, they do not communicate. The end result is interference during busy times, as they will both respond at the same time to the same user without any awareness of each other. Due to this, users will have trouble staying connected. Sometimes the performance problems show up long after the installation. When pricing out a solution for a building or hotel, be sure to ask the contractor whether they are bidding commercial grade (intelligent) access points.

2) Antenna Quality

There are a limited number of frequencies (channels) open to public WiFi. If you can make sure the transmission is broadcast in a limited direction, this allows for more simultaneous conversations, and thus better quality. Higher quality access points can actually figure out the direction of the users connected to them, such that, when they broadcast, they cancel out the signal going out in directions not intended for the end user. In tight spaces with multiple access points, signal-canceling antennas will greatly improve service for all users.

3) Installation Sophistication and Site Surveys

When installing a wireless network, there are many things a good installer must account for – for example, the attenuation between access points. In a perfect world you want your access points to be far enough apart that they are not getting blasted by their neighbor’s signal. It is okay to hear your neighbor in the background a little bit – you must have some overlap, otherwise you would have gaps in coverage – but you do not want them competing with high-energy signals close together. If you were installing your network in a giant farm field with no objects in between access points, you could just set them up in a grid with the prescribed distance between nodes. In the real world you have walls, trees, windows, and all sorts of objects in and around buildings. A good installer will actually go out and measure the signal loss from these objects in order to place the correct number of access points. This is not a trivial task, but without an extensive site survey the resulting network will have quality problems.
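
As a rough starting point (my own illustration, not part of any survey tool), measured loss is often compared against the free-space baseline; walls, trees, and windows add attenuation on top of this figure, which is exactly what a site survey measures. The 30-meter spacing and 2.4 GHz channel below are just example numbers:

import math

def free_space_path_loss_db(distance_km, frequency_mhz):
    # Standard free-space path loss formula, result in dB.
    return 20 * math.log10(distance_km) + 20 * math.log10(frequency_mhz) + 32.44

# Two access points 30 meters apart on a 2.4 GHz channel:
print(round(free_space_path_loss_db(0.030, 2400), 1))  # ~69.6 dB before obstructions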

4) Know What is Possible

Despite all the advances in wireless networks, they still have density limitations. I am not quite sure how to quantify this statement other than to say that wireless does not do well in an extremely crowded space (stadium, concert venue, etc.) with many devices all trying to get access at the same time. It is a big jump from designing coverage for a hotel with 1,000 guests spread out over the hotel grounds to a packed stadium of people sitting shoulder to shoulder. The other compounding issue with density is that it is almost impossible to simulate before building out the network and going live. I did find a reference to a company that claims to have done a successful build-out in Gillette Stadium, home of the New England Patriots. It might be worth looking into this further for other large venues.

5) Old Devices

Old 802.11b devices on your network will actually cause your access points to back off to slower speeds. Most exclusively-b devices were discontinued in the mid-2000s, but they are still around. The best practice here is to just block these devices, as they are rare and not worth bringing down the speed of your overall network.

We hope these five (5) practical tips help you to build out a solid commercial wireless network. If you have questions, feel free to contact APconnections or Global Gossip to discuss.

Related Article: Wireless Site Survey With Free Tools

The Rebirth of Wired Bandwidth


By Art Reisman, CTO, http://www.netequalizer.com

As usual, marketing expectations for Internet speed have outrun reality; only this time, reality is having a hard time catching up.

I am starting to get spotty yet reliable reports from sources at some of the larger wireless carriers – the technicians in the trenches charged with supporting wireless technology – that they are about ready to throw in the towel.

No, I am not predicting the demise of wireless bandwidth and devices, but I am claiming we are at their critical saturation point. In the near future we will likely see only small incremental improvements in wireless data speeds.

The common myth with technology, especially in the first few decades, is that improvements are endless and infinite.  Yes, the theory is validated with technologies that are relatively new and moving fast, but the physical world eventually puts the brakes on.

For example, air travel saw huge jumps in comfort and speed over a 20-year span from the 1930s to the 1950s, culminating in jet travel across oceans. While trans-ocean jet travel became a reality about 50 years ago, since that time there have been no improvements in speed. The Concorde was just not practical; as a result we have seen no net improvement in jet travel speed in 50 years.

Well, the same goes for wireless technology in 2013. The airwaves are saturated; the frequencies can only carry so much bandwidth. Perhaps there will be one last gasp of innovation, similar to WDM on wired networks, but the future of high-speed computing will require point-to-point wires. For this reason, I am still holding onto my prediction that we will see wired plug-ins for your devices start to pop up again as an alternative and convenience to wireless in the near future.

Related posts:

The truth about the wireless bandwidth crisis – this article assumes the problem lies in paying for the cost of the technology.

ISP speed claim dilemma.

Internet Regulation, what is the world coming to?


A friend of mine just forwarded an article titled “How Net Neutrality Rules Could Undermine the Open Internet”

Basically, Net Neutrality advocates are now worried that bringing the FCC in to help enforce neutrality will set a legal precedent allowing wide-reaching control over other aspects of the Internet – for example, some form of content control extending into gray areas.

Let’s look at the history of the FCC for precedents.

The FCC came into existence to manage and enforce the wireless spectrum, essentially so you did not get 1,000 radio/TV stations blasting signals over each other in every city. A very necessary and valid government service. Without it, there would be utter anarchy in the airwaves. Imagine roads without traffic signals, or airports without control towers.

At some point in time, their control over frequencies got into content and accessibility mandates.  How did this come about? Simply put, it is the normal progression of government asserting control over a resource. It is what it is, neither good nor bad, just a reflection of a society that looks to government to make things “right”. And like an escaped non-native species in the Hawaiian Islands, it tends to take as much real estate as the ecosystem will allow.

What I do know as a certainty is that the FCC, once in the door regulating anything on the Internet, will continue to grow its control in order to make things “right” and “fair” during our browsing experience.

At best we can hope the inevitable progression of FCC control gets thwarted at every turn, allowing us a few more good years of the good old Internet as we know it. I’ll take the current Internet, flaws and all, for a few more years while I can.

For more information on non-native species invading Hawaii’s ecosystem, check out this blog, from the Kohala Watershed Partnership.

For an overview of Net Neutrality – check out this Net Neutrality for Dummies Article explaining the act’s possible effects on the everyday internet user.

For a discussion on the possible lawlessness of the FCC’s control over the internet, read this blog entitled “Is the FCC Lawless?”.

Does your ISP restrict you from the public Internet?


By Art Reisman

The term “walled garden” refers to the practice of a service provider locking you into their local content. A classic example of the walled garden was the early years of AOL. Originally, when using their dial-up service, AOL provided all the content you could want. Access to the actual Internet was granted by AOL only after other dial-up Internet providers started to compete with their closed offerings. Today, using much more subtle techniques, Internet providers still try to keep you on their networks. The reason is simple: it costs them money to transfer you across a boundary to another network, and thus it is in their economic interest to keep you within their network.

So how do Internet service providers keep you on their network?

1) Sometimes with monetary incentives. For example, with large commercial accounts they just tell you it is going to cost more. My experience with this practice is first hand. I have heard testimonials from many of our customers running ISPs, mostly outside the US, who are sold a chunk of bulk bandwidth with conditions. The terms are often something on the order of:

  • You have a 1-gigabit connection.
  • If you access data outside the country, you can only use 300 megabits.
  • If you go over 300 megabits outside the country, there will be hefty additional fees.

Obviously there is going to be a trickle-down effect, where the regional ISP will try to discourage usage outside of the local country under such terms.

2) Then there are more passive techniques, such as blatantly looking at your private traffic and just not letting it off their network. This technique was used in the US, implemented by large service providers back in the mid-2000s. Basically, they targeted peer-to-peer requests and made sure you did not leave their network. Essentially, you would only find content from other users within your provider’s network, even though it would appear as though you were searching the entire Internet. Special equipment was used to intercept your requests and only allow you to probe other users within your provider’s network, thus saving them money by avoiding Internet exchange fees.

3) Another way your provider will try to keep you on their network is to offer locally mirrored content. Basically, they keep a copy of common files at a central location. In most cases this actually causes the user no harm, as they still get the same content. But it can cause problems if not done correctly: they risk sending out old data or obsolete news stories that have since been updated.

4) Lastly, some governments just outright block content, but this is mostly for political reasons.

Editor’s Note: There are also political reasons to control where you go on the Internet, as practiced in China and Iran.

Related Article: AOL folds original content operations

Related Article: Why Caching alone won’t speed up your Internet

Imagine Unlimited Bandwidth


By Art Reisman – CTO – www.netequalizer.com


I was feeling a bit idealistic today about the future of bandwidth, so I jotted these words down. I hope they brighten your day.

Imagine there’s no congestion
 It’s easy if you try
No hidden fees surprise us
Above us high speed guy
Imagine all providers, giving bandwidth away

Imagine there’s no quotas
It isn’t hard to use
 No killer apps that die for
A lack of bandwidth too
Imagine all the gamers living layer 7 free

You may say, I’m a streamer
But I’m just gonna download one
I hope some day you’ll join us
And your speed concerns will be done

What is a transparent bridge, and why can’t we use one in a wireless network to reduce congestion?


Back in the early days of the telephone, customers had what was called a party line. In this setup, the phone company strung one common phone line into a neighborhood, and when a phone call was intended for your house, the operator would ring the line with your designated number of rings. You were on the honor system to pick up and listen only when the ringing was intended for your house. It takes little imagination to understand that only one conversation could take place at a time with this shared configuration.

Antique phone and generator oak box, 1920s–1930s.

Flash forward to 2013 and a modern computer network. Believe it or not, the local (Ethernet) network works much the same as a party line. All computers on the network listen and are only supposed to answer when being talked to. The idea of the Ethernet bridge came along when somebody figured out you could have a device on the wire that would prevent unwanted Ethernet packets (analogous to rings) from traversing a segment of the wire they are not intended for. The benefit of the bridging device is that it segments the transmissions on the wire and reduces a good bit of the overhead from data not intended for your network segment.

Wireless networks based on 802.11 technology could also benefit from a transparent bridge. They share the property that all devices must listen for their address and only answer when spoken to. Unfortunately, there is no good place to insert a bridge device in a wireless network – there is no wire containing the transmissions. For the most part, once broadcast, transmissions spread out in all directions, and thus nothing can stop a wireless transmission from reaching unintended devices. The only thing a network operator can do to relieve congestion is to divide the network up into geographic segments and limit the power at each tower so it does not encroach on neighboring segments.

Related Article: More ideas on how to improve wireless network quality.

Is the Reseller Channel for Network Equipment Declining?


Back in 2008, TMCnet posed an interesting question about traditional PBX vendors: has VoIP outgrown traditional business service channels? That got me wondering: what is going on in the traditional network equipment channel? Is it starting to erode in favor of direct sales?

We are seeing a split in buying patterns.

1) Companies that do not have an in-house staff generally make their equipment purchases based on the advice of their network consultants, VARs, or local resellers.

The line between network consultants and VARs has always been a bit muddy. Most network consultants tend to dabble in reselling. Hence this relationship behaves like the traditional channel, where consultants and VARs represent specific manufacturers and mark up equipment to make margins. Customers benefit because the true cost of the consulting to design and deploy their networks is subsidized by the margins the VARs make on their equipment sales.

2) On the other hand, companies and institutions with in-house IT staffs are starting to get away from the traditional equipment reseller. They are more likely to do their research online, and are more than willing to buy outside of a traditional channel. This creates a strange double-edged sword for OEMs, as they are heavily dependent on the relationships of their channel partners to move equipment. For the same reason that factory outlet stores are located outside of town, OEMs do not want to shoot themselves in the foot by selling direct and competing with their resellers.

Even though there is some degradation in the traditional channel, I don’t think we will see its demise any time soon for a couple of reasons.

1) Network solutions remain labor-intensive, and expertise will always be in short supply. Even with cloud-based computing there is still a good bit of infrastructure required at the enterprise, and this bodes well for the VARs and resellers who offer their expertise while acting as the conduit to move marked-up equipment from the OEMs.

2) Network equipment itself resists becoming a commodity. Yes, home routers and such have gone that route, but with advanced features such as bandwidth optimization and security driving the market, network equipment remains complex enough to justify the value-added channel.

What are you seeing?

Related Article: US channel sales flat for third straight year.

A Brief History of Peer to Peer File Sharing and the Attempts to Block It


By Art Reisman

The following history is based on my notes and observations as both a user of peer-to-peer and as a network engineer tasked with cleaning it up.

Round One, Napster, Centralized Server, Circa 2002

Napster was a centralized service; unlike the peer-to-peer behemoths of today, there was never any question of where the copyrighted material was being stored and pirated from. Even though Napster did not condone pirated music and movies on their site, the courts decided that by allowing copyrighted material to exist on their servers, they were in violation of copyright law. Napster’s days of free love were soon over.

From a historical perspective, the importance of the decision to force the shutdown of Napster was that it gave rise to a whole new breed of p2p applications. We detailed this phenomenon in our 2008 article.

Round Two, Mega-Upload  Shutdown, Centralized Server, 2012

We again saw a doubling down on p2p client sites (they expanded) when Mega-Upload, a centralized sharing site, was shut down back in January 2012.

“On the legal side, the recent widely publicized MegaUpload takedown refocused attention on less centralized forms of file sharing (i.e. P2P). Similarly, improvements in P2P technology coupled with a growth in file sharing file size from content like Blue-Ray video also lead many users to revisit P2P.”

Read the full article from deepfield.net

The shutdown of Mega-Upload had a personal effect on me, as I had used it to distribute a 30-minute account from a 92-year-old WWII vet in which he recalled, in oral detail, his experience of surviving a German prison camp.

Blocking by Signature, Alias Layer 7 Shaping, Alias Deep Packet Inspection, Late 1990s Till Present

Initially the shining-star savior at the forefront of spotting illegal content on your network, this technology can be expensive and can fail miserably in the face of newer encrypted p2p applications. It can also get quite expensive to keep up with the ever-changing application signatures, and yet it is still often the first line of defense attempted by ISPs.

We covered this topic in detail, in our recent article,  Layer 7 Shaping Dying With SSL.

Blocking by Website

Blocking the source sites where users download their p2p clients is still possible. We see this method applied at mostly private secondary schools, where content blocking is an accepted practice. This method does not work for computers and devices that already have p2p clients. Once loaded, p2p files can come from anywhere and there is no centralized site to block.

Blocking Uninitiated Requests, Circa Mid-2000s

The idea behind this method is to prevent your network from serving up any content whatsoever! Sounds a bit harsh, but the average Internet consumer rarely, if ever, hosts anything intended for public consumption. Yes, at one time, during the early stages of the Internet, my geek friends would set up home pages similar to what everybody exposes on Facebook today. Now, with the advent of hosting sites, there is just no reason for a user to host content locally, and thus no need to allow access from the outside. Most firewalls have a setting to disallow uninitiated requests into your network (obviously with an exemption for your publicly facing servers).

We actually have an advanced version of this feature in our NetGladiator security device. We watch each IP address on your internal network and take note of outgoing requests; nobody comes in unless they were invited. For example, if we see a user on the network make a request to a Yahoo server, we expect a response to come back from a Yahoo server; however, if we see a Yahoo server contact a user on your network without a pending request, we block that incoming request. In the world of p2p, this should prevent an outside client from requesting and receiving a copyrighted file hosted on your network – after all, no p2p client is going to randomly send out invites to outside servers. Or would they?
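
Here is a toy sketch of the bookkeeping behind that idea. It is not the actual NetGladiator implementation, and the IP addresses are placeholder examples:

from collections import defaultdict

class UninitiatedRequestFilter:
    def __init__(self):
        # internal IP -> set of outside hosts it has reached out to
        self.invited = defaultdict(set)

    def record_outbound(self, internal_ip, external_ip):
        # An internal host initiated contact, so replies are now welcome.
        self.invited[internal_ip].add(external_ip)

    def allow_inbound(self, external_ip, internal_ip):
        # Permit inbound traffic only if the internal host invited it.
        return external_ip in self.invited[internal_ip]

fw = UninitiatedRequestFilter()
fw.record_outbound("10.0.0.5", "198.51.100.20")       # user browses out to a web server
print(fw.allow_inbound("198.51.100.20", "10.0.0.5"))  # True: the reply was invited
print(fw.allow_inbound("203.0.113.9", "10.0.0.5"))    # False: an uninvited probe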

I spent a few hours researching this subject, and here is what I found (this may need further citations). It turns out that p2p distribution may be a bit more sophisticated than that, and has ways to get around the block-uninitiated-requests firewall technique.

P2P networks such as Pirate Bay use a directory service of super nodes to keep track of what content peers have and where to find it. When you load up your p2p client for the first time, it just needs to find one super node to get connected; from there it can start searching for available files.

Note: You would think that if these super nodes were aiding and abetting in illegal content that the RIAA could just shut them down like they did Napster. There are two issues with this assumption:

1) The super nodes do not necessarily host content, hence they are not violating any copyright laws. They simply coordinate the network in the same way a DNS service keeps track of URL names and where to find servers.
2) The super nodes are not hosted by Pirate Bay; they are basically commandeered from their network of users, who unwittingly agree to perform this directory service when clicking the license agreement that nobody ever reads.

In my research I have talked to network administrators who claim that, despite blocking uninitiated outside requests on their firewalls, they still get RIAA notices. How can this be?

There are only two ways this can happen.

1) The RIAA is taking the liberty of simply accusing a network of hosting illegal content based on the directory listings of a super node. In other words, if they find a directory on a super node pointing to copyrighted files on your network, that might be information enough to accuse you.

2) More likely, and much more complex, is that the super nodes are brokering the transaction as a condition of being connected. Basically, this means that when a p2p client within your network contacts a super node for information, the super node directs the client to send data to a third-party client on another network. Thus the transfer of information from the inside of your network looks to the firewall as if it was initiated from within. You may have to think about this, but it makes sense.

Behavior-Based Thwarting of p2p, Circa 2004 – NetEqualizer

Behavior-based shaping relies on spotting the unique footprint of a client sending and receiving p2p traffic. From our experience, these clients just do not know how to lay low and stay under the radar. It’s like a criminal smuggling drugs while doing 100 MPH on the highway; they just can’t help themselves. Part of the p2p methodology is to find as many sources of files as possible and then download from all sources simultaneously. Combine this behavior with the fact that most p2p consumers are trying to build up a library of content, and thus initiating many file requests, and you get a behavior footprint that can easily be spotted. By spotting this behavior and making life miserable for these users, you can achieve self-compliance on your network.
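
As an illustration only (not NetEqualizer’s actual algorithm), the footprint can be approximated by counting how many distinct remote peers each internal host is talking to at once; the threshold below is a made-up example:

from collections import defaultdict

FLOW_THRESHOLD = 100  # hypothetical cutoff; real values depend on the network

def flag_p2p_suspects(active_flows):
    # active_flows: iterable of (internal_ip, remote_ip) pairs for current connections.
    peers = defaultdict(set)
    for internal_ip, remote_ip in active_flows:
        peers[internal_ip].add(remote_ip)
    # Hosts with an unusually large number of simultaneous peers fit the p2p footprint.
    return [ip for ip, remotes in peers.items() if len(remotes) > FLOW_THRESHOLD]

# Example: one host holding connections to 150 distinct peers gets flagged.
flows = [("10.0.0.7", "192.0.2.%d" % i) for i in range(150)]
print(flag_p2p_suspects(flows))  # ['10.0.0.7']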

Read a smarter way to block p2p traffic.

Blocking the RIAA probing servers

If you know where the RIAA is probing from, you can deny all traffic from their probes, thus preventing them from probing files on your network and generating the ensuing nasty letters to desist.

Can Rural Internet Services be Subsidized with Advertising?


By Art Reisman

I just read a Wall Street Journal article this morning regarding the lack of home Internet service in poor rural areas. In the story, the children of Citronelle, Alabama are forced to do their homework at the local McDonald’s because the local library closes at 6, and they must use the Internet to complete their school assignments. Internet at home is either not available or too expensive.

This got me thinking of an idea that has been bandied around for quite some time with some of our rural WISP NetEqualizer customers. It has been a while, but we actually helped a few operators set up systems with some form of on-line advertising (prior to the Great Recession). For example, the base minimum subscription price required for a rural WISP to turn a profit starts at around $40 to $50 a month. So what if a WISP sold a lower-grade service for $10 a month, and then required that each time a home user logged on to the service, they were presented with a 20-second promo trailer from a local merchant? The merchant would then subsidize the WISP per showing. Would this be a viable alternative to stimulate rural Internet services?

I am sure many a WISP has tried this, and I suspect the barriers are:

1) The mechanics of redirection and authentication – in other words, this requires a much more complex authentication infrastructure than a small WISP would typically start with.

2) Selling advertisement space – this would be a full-time hustle to keep slots filled and paying.

3) Justifying the return on investment to the advertiser.

Comments and/or ideas are welcome!

admin@netequalizer.com