Five Things to Know About Wireless Networks


By Art Reisman
CTO, APconnections


Over the last year or so, when the work day is done, I often find myself talking shop with several peers of mine who run wireless networking companies. These are the guys in the trenches. They spend their days installing wireless infrastructure in apartment buildings, hotels, and professional sports arenas, to name just a few. Below I share a few tidbits intended to provide a high-level picture for anybody thinking about building their own wireless network.

There are no experts.

Why? Competition between wireless manufacturers is intense. Yes, the competition is great for innovation, and certainly wireless technology has come a long way in the last 10 years; however, these fast-paced improvements come with a cost. New learning curves for IT partners, numerous patches, and differing approaches make it hard for any one person to become an expert. Anybody who works in this industry usually settles in with one manufacturer, perhaps two; the field is simply moving too fast.

The higher (faster) the frequency, the higher the cost of the network.

Why? As the industry moves to standards that transmit data at higher data rates, it must use higher frequencies to achieve the faster speeds. It just so happens that these higher frequencies tend to be less effective at penetrating buildings, walls, and windows. The increase in cost comes from the need to place more and more access points in a building to achieve coverage.

Putting more access points in your building does not always mean better service.

Why? Computers have a bad habit of connecting to one access point and then not letting go, even when the signal gets weak. For example, when you connect to a wireless network with your laptop in the lobby of a hotel and then move across the room, you can end up in a bad spot with respect to your original access point connection. In theory, the right thing to do would be to release your current connection and connect to a different access point. The problem is that most of the installed base of wireless networks has no intelligence built in to route you to the best access point; hence even a building with plenty of coverage can have maddening service.

Electromagnetic Radiation Cannot Be Seen

So What? The issue here is that there are all kinds of scenarios where the wireless signals bouncing around the environment can destroy service. Think of a highway full of invisible cars traveling in any direction they want. When a wireless network is installed, the contractor in charge does what is called a site survey. This involves special equipment that measures the electromagnetic waves in an area and helps them plan how many wireless access points to install and where; but once the network is installed, anything can happen. Private personal hotspots, devices with electric motors, or a change in metal furniture configuration can all destabilize an area, and thus service can degrade for reasons that nobody can detect.

The More People Connected, the Slower Their Speed

Why? Wireless access points use a technique called TDM (Time Division Multiplexing). Basically, available bandwidth is carved up into little time slots. When there is only one user connected to an access point, that user gets all the bandwidth; when there are two users connected, they each get half the time slots. So an access point that advertises 100-megabit speeds can deliver at best 10 megabits per user when 10 people are connected to it.
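To make the slot arithmetic concrete, here is a minimal sketch; it assumes an idealized access point that divides its advertised rate evenly, while real access points also lose capacity to protocol overhead:

```python
# Idealized TDM share: each connected user gets an equal slice of the
# advertised rate. Real-world throughput will be lower due to overhead.
def per_user_mbps(advertised_mbps: float, connected_users: int) -> float:
    return advertised_mbps / max(connected_users, 1)

for users in (1, 2, 10):
    print(f"{users} user(s): {per_user_mbps(100, users):g} Mbps each")
# 1 user(s): 100 Mbps each
# 2 user(s): 50 Mbps each
# 10 user(s): 10 Mbps each
```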

Related Articles

Wireless is nice but wired networks are here to stay

Seven Tips To Improve Performance of your Wireless Lan

Will Fixed Wireless Ever Stand up To Cable Internet?



Will Fixed Wireless Ever Stand up To Cable Internet?

By Art Reisman
CTO http://www.netequalizer.com


Last night I had a dream. A dream where I was free from relying on my cable operator for my Internet service. After all, the latest wireless technology can be used to beam an Internet signal into your house at speeds approaching 600 megabits, right?

My sources tell me some wireless operators are planning to compete head to head with entrenched cable operators. This new tactic is a bold experiment, considering most legacy WISP operators normally offer service on the outskirts of town, in areas where traditional cable and DSL service is spotty or non-existent. Going for the throat of the entrenched cable operators in the urban corridor, beaming Internet into homes at prices and speeds that compete directly, is quite an undertaking. Is it possible? Let's look at some of the obstacles and some of the advantages.

In the wireless model, a provider lights up a fixed tower with Internet service and beams a signal from the tower into each home it services.

  • Unlike cable, where there is a fixed physical wire to each home, the wireless operator relies on a line-of-sight signal from tower to home. The tower can have as many as four transmitters, each capable of 600 megabits. The kicker is, to turn a profit, you have to share the 600 megabits from each transmitter among as many users as possible, so each user only gets a fraction of the bandwidth. For example, to make the business case work you will need perhaps 100 users (homes) on one transmitter, which breaks down to 6 megabits per customer (a quick sketch of this arithmetic follows the list).
  • Each tower will need a physical connection back to a tier one provider such as Level 3. This cost is duplicated at each tower. A cable operator has a more concentrated NOC and requires far fewer links to their tier one connection.
  • Radio interference is a problem, so the tower may not be able to perform consistently at 600 megabits; when there is interference, speeds are backed down.
  • Cable operators can put 100 megabits or more down each wire direct to the customer's home, so if you get into a bandwidth speed war on the last-mile connection, wireless is still not competitive.
  • Towers in this speed range must be line of sight to the home, so the towers must be high enough to clear all trees and buildings. This creates logistical problems in putting in one tower for every 200 homes.
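Here is a quick sketch of the business-case arithmetic from the first bullet above; the four transmitters and 100 homes per transmitter are the article's working assumptions, not hard limits:

```python
# Fixed-wireless tower oversubscription math from the bullets above.
transmitters_per_tower = 4      # assumption: four transmitters per tower
mbps_per_transmitter = 600      # peak rate per transmitter
homes_per_transmitter = 100     # homes needed to make the business case work

tower_capacity_mbps = transmitters_per_tower * mbps_per_transmitter
homes_served = transmitters_per_tower * homes_per_transmitter
per_home_mbps = mbps_per_transmitter / homes_per_transmitter

print(f"{tower_capacity_mbps} Mbps tower capacity, {homes_served} homes served")
print(f"{per_home_mbps:g} Mbps per home")  # 6 Mbps
```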

On the flip side, I would gladly welcome a solid 6-megabit feed from a local wireless provider.

Speed is not everything, as long as it is adequate for basic services: Facebook, e-mail, etc. Where a wireless operator can excel and win over customers is in the following areas.

  • Good, clean, honest service
  • No back-door price hikes
  • Local support, not an impersonal offshore call center
  • Customers tend to appreciate locally owned companies

 

Five Bars Does Not Always Mean Good Data. Why?


I have a remote getaway cabin in the middle of the Kansas prairie where I sometimes escape to work for a couple of days. I use my Verizon 4G data service as my Internet connection, as this is my best option. Even though I usually have 3 or 4 bars of solid signal, my data service comes and goes. Sometimes it is unbelievably fast, and other times I can't load a simple web page before timing out. What gives?
The reason for this variability is that wireless providers actually run two different networks: one for their traditional phone service, and one for the Internet. Basically, what this means is that the tower you are getting your cell signal from actually has two circuits coming in. One is for the traditional cell service, which is almost always available as long as you have a strong signal (5 bars) on your phone. The other carries the Internet data connection. Each one takes a different path out from the cell tower.

Limited data lines to towers. The data service to each tower is subject to local or regional congestion, depending on where and how your provider connects you to the Internet. In rural Kansas, during the broadband initiative, the cellular companies had no Internet presence in the area, so they contracted with local Internet companies to backhaul Internet links to their cell towers. Some of these backhaul links have very limited data capacity, and hence they can get congested when multiple data users compete for this limited resource.

A second reason for slow data service is the limited amount of wireless spectrum between your phone and the tower. Even though you may have 4 bars and a good phone connection, it is likely that your wireless provider limits data usage during peak times so it is not forced to drop calls. Think of it like two lanes on a highway: one is the priority lane for phone service, and then there is the data lane, which can get jammed with data.
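To illustrate the two-lane idea, here is a toy strict-priority sketch; it is a conceptual model only, not how any carrier's scheduler is actually implemented:

```python
# Toy strict-priority air link: voice frames always transmit before data.
import heapq

VOICE, DATA = 0, 1  # lower value = higher priority

class AirLink:
    def __init__(self):
        self._queue = []  # entries: (priority, arrival_order, payload)
        self._order = 0

    def enqueue(self, priority: int, payload: str) -> None:
        heapq.heappush(self._queue, (priority, self._order, payload))
        self._order += 1

    def transmit_next(self):
        # Data only moves when no voice frame is waiting.
        return heapq.heappop(self._queue)[2] if self._queue else None

link = AirLink()
link.enqueue(DATA, "web page chunk")
link.enqueue(VOICE, "call frame 1")
link.enqueue(VOICE, "call frame 2")
print([link.transmit_next() for _ in range(3)])
# ['call frame 1', 'call frame 2', 'web page chunk']
```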

So the next time you can’t find directions to your favorite restaurant, or Siri is having a fit, just remember not all is fair on the data circuit to your tower and beyond.

Is a Balloon-Based Internet Service a Threat to Traditional Cable and DSL?


Update:

 

Looks like this might be the real deal: a mystery barge in San Francisco Bay owned by Google.

 

I recently read an article regarding Google's foray into balloon-based Internet services.

This intriguing idea sparked a discussion on the same subject with some of the engineers at a major satellite Internet provider. They, as well as I, were somewhat skeptical of the feasibility of this balloon idea. Could we be wrong? Obviously, there are some unconventional obstacles to bouncing Internet signals off balloons, but what if those obstacles could be economically overcome?

First, let's look at the practicalities of using balloons to beam Internet signals from ground-based stations to consumers.

Advantages over satellite service

Latency

Satellite Internet, the kind used by Wild Blue, usually comes with a minimum of a one-second delay, sometimes more. The bulk of this signal delay is due to the distance required for a stationary satellite: 22,000 miles.

A balloon would be located much closer to the earth, at around 2 to 12 miles up. The latency at this distance is just a few milliseconds.
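For a back-of-envelope check of those latency figures, here is a minimal sketch assuming signals travel at roughly the speed of light; processing and queueing delays, which it ignores, account for the rest of the one-second satellite figure:

```python
# Round-trip propagation delay through a relay: up and down in each
# direction, i.e. four legs, at roughly the speed of light.
SPEED_OF_LIGHT_MILES_PER_S = 186_282

def round_trip_ms(relay_altitude_miles: float) -> float:
    return 4 * relay_altitude_miles / SPEED_OF_LIGHT_MILES_PER_S * 1000

print(f"Geostationary satellite: {round_trip_ms(22_000):.0f} ms")  # ~472 ms
print(f"Balloon at 12 miles up:  {round_trip_ms(12):.2f} ms")      # ~0.26 ms
```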

Cost

Getting a basic stationary satellite into space runs a minimum of 50 million dollars, perhaps a bit less for a low-orbiting, non-stationary satellite.

Balloons are relatively inexpensive compared to a satellite. Although I don't have exact numbers, the launch cost is practically zero; a balloon carries its payload without any additional energy or infrastructure, so the only real costs are the balloon, the payload, and the ground-based stations. For comparison purposes, let's go with $50,000 per balloon.

Power

Both options can use solar power. Orienting a balloon's solar collectors might require 360-degree coverage; however, as we will see, a balloon can be tethered and periodically raised and lowered, in which case power can be ground-based and rechargeable.

Logistics

This is the elephant in the room. The position of a satellite at any point in time is extremely predictable. Even satellites that are not stationary can be relied on to be where they are supposed to be at any given time. This makes coverage planning deterministic. Balloons, on the other hand, unless tethered, will wander with very little predictability.

Coverage Range

A balloon at 10,000 feet can cover a radius on the ground of about 70 miles. A stationary satellite can cover an entire continent. So you would need a series of balloons to cover an area reliably.
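As a sanity check on coverage range, here is a sketch using the standard geometric-horizon approximation d ≈ √(2Rh); note that it gives the line-of-sight maximum (about 120 miles from 10,000 feet), while the 70-mile figure above presumably reflects usable signal strength and antenna patterns rather than pure geometry:

```python
# Geometric horizon distance: how far a balloon at a given altitude can
# "see" before the curve of the earth blocks line of sight.
import math

EARTH_RADIUS_MILES = 3959
FEET_PER_MILE = 5280

def horizon_miles(altitude_feet: float) -> float:
    altitude_miles = altitude_feet / FEET_PER_MILE
    return math.sqrt(2 * EARTH_RADIUS_MILES * altitude_miles)

print(f"{horizon_miles(10_000):.0f} miles")  # ~122 miles, the geometric limit
```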

Untethered

I have to throw out the idea of untethered high-altitude balloons. They would wander all over the world and crash back to earth in random places. Even if it were cost-effective to saturate the upper atmosphere with them, and pick them out when in range for communications, I just don't think NASA would be too excited to have thousands of these large balloons in unpredictable drift patterns.

Tethered

As crazy as it sounds, there is a precedent for tethering a communication balloon to a 10,000-foot cable. Evidently the US did something like this to broadcast TV signals into Cuba. I suppose for an isolated area where you can hang out offshore, well out of the way of any air traffic, this is possible.

High Density Area Competition

So far I have been running under the assumption that balloon-based Internet service is an alternative to satellite coverage, which finds its niche exclusively in rural areas of the world. When I think of the monopoly and cost advantage existing carriers have in urban areas, a wireless service beaming high speeds from overhead might have some staying power there too. Certainly there could be some overlap with rural users, and thus the economics of deployment become more cost-effective; the more subscribers the better. But I do not see urban coverage as a driving business factor.

Would the consumer need a directional Antenna?

I have been assuming all along that these balloons would supply direct service to the consumer. I would suspect that some sort of directional antenna, pointing at your local offshore balloon, would need to be attached to the side of your house. This is another reason why the balloons would need to hold a stationary position.

My conclusion is that somebody like Google could conceivably create a balloon zone off any coastline, with a series of balloons tethered to barges of some kind. The main problem, assuming cost was not an issue, would be the political ramifications of a plane hitting one of the tethers. With Internet demand on the rise, 4G's limited range, and the high cost of laying wires to the rural home, I would not be surprised to see a test network someplace in the near future.

Tethered balloon (courtesy of an Ars Technica article)

Five Things to Consider When Building a Commercial Wireless Network


By Art Reisman, CTO, APconnections, www.netequalizer.com

with help from Sam Beskur, CTO Global Gossip North America, http://hsia.globalgossip.com/

Over the past several years we have provided our bandwidth controllers as a key component in many wireless networks. Along the way we have seen many successes, and some not-so-successful deployments. What follows are some key lessons from our experiences with wireless deployments.

1) Commercial Grade Access Points versus Consumer Grade

Commercial grade access points use intelligent collision avoidance in densely packed areas. Basically, what this means is that they make sure a user within range of multiple access points is only being serviced by one AP at a time. Without this intelligence, you get signal interference and confusion. An analogy would be asking a sales rep for help in a store and having two sales reps talk back to you at the same time; it would be confusing to know which one to listen to. Commercial grade access points follow a courtesy protocol, so you do not get two, or possibly even three, responses in a densely packed network.

Consumer grade access points are meant to service a single household. If there are two in close proximity to each other, they do not communicate with each other. The end result is interference during busy times, as both will respond at the same time to the same user without any awareness of each other. Because of this, users will have trouble staying connected. Sometimes the performance problems show up long after the installation. When pricing out a solution for a building or hotel, be sure to ask the contractor if they are bidding in commercial grade (intelligent) access points.

2) Antenna Quality

There are a limited number of frequencies (channels) open to public WiFi. If you can ensure the transmission is broadcast in a limited direction, this allows for more simultaneous conversations, and thus better quality. Higher quality access points can actually figure out the direction of the users connected to them, such that when they broadcast, they cancel out the signal going in directions not intended for the end user. In tight spaces with multiple access points, signal-canceling antennas will greatly improve service for all users.

3) Installation Sophistication and Site Surveys

When installing a wireless network, there are many things a good installer must account for, for example the attenuation between access points. In a perfect world you want your access points far enough apart that they are not getting blasted by their neighbor's signal. It is okay to hear your neighbor in the background a little bit (you must have some overlap, otherwise you would have gaps in coverage), but you do not want high-energy signals competing close together. If you were installing your network in a giant farm field with no objects between access points, you could just set them up in a grid with the prescribed distance between nodes. In the real world you have walls, trees, windows, and all sorts of objects in and around buildings. A good installer will actually go out and measure the signal loss from these objects in order to place the correct number of access points. This is not a trivial task, but without an extensive site survey the resulting network will have quality problems.
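As a first-pass planning aid before the on-site measurements described above, one common starting point is the standard free-space path loss formula; this sketch is idealized, and real walls, windows, and trees add loss on top of it, which is exactly why the site survey matters:

```python
# Standard free-space path loss (FSPL) in dB for distance in km and
# frequency in MHz: 20*log10(d) + 20*log10(f) + 32.44.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Example: 30 meters of open air at 2.4 GHz versus 5 GHz.
print(f"2.4 GHz: {fspl_db(0.03, 2400):.1f} dB")  # ~69.6 dB
print(f"5 GHz:   {fspl_db(0.03, 5000):.1f} dB")  # ~76.0 dB; the higher band loses more
```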

4) Know What is Possible

Despite all the advances in wireless networks, they still have density limitations. I am not quite sure how to quantify this statement, other than to say that wireless does not do well in an extremely crowded space (stadium, concert venue, etc.) with many devices all trying to get access at the same time. It is a big jump from designing coverage for a hotel with 1,000 guests spread out over the hotel grounds to a packed stadium of people sitting shoulder to shoulder. The other compounding issue with density is that it is almost impossible to simulate before building out the network and going live. I did find a reference to a company that claims to have done a successful build-out in Gillette Stadium, home of the New England Patriots. It might be worth looking into this further for other large venues.

5) Old Devices

Old 802.11b devices on your network will actually cause your access points to back off to slower speeds. Most exclusively-b devices were discontinued in the mid-2000s, but they are still around. The best practice here is to just block these devices; they are rare, and not worth bringing down the speed of your overall network for.

We hope these five practical tips help you to build out a solid commercial wireless network. If you have questions, feel free to contact APconnections or Global Gossip to discuss.

Related Article:  Wireless Site Survey With Free tools

How Many Users Can Your High Density Wireless Network Support? Find Out Before You Deploy.


By Art Reisman
CTO http://www.netequalizer.com

Recently I wrote an article on how tough it has become to deploy wireless technology in high-density areas. It is difficult to predict final densities until a network is fully deployed, and this often leads to missed performance expectations.

In a strange coincidence, while checking in with my friends over at Candela Technologies last Friday, I was not surprised to learn that their latest offering, the Wiser-50 Mobile Wireless Network Emulator, is taking the industry by storm.

So how does their wireless emulator work, and why would you need one?

The Wiser-50 allows you to take your chosen access points, load them up with realistic signals from a densely packed area of users, and play out different load scenarios without actually building out the network. The ability to do this type of emulation allows you to make adjustments to your design on paper, without the costly trial and error of field trials. You will be able to see how your access points behave under load before you deploy them. You can then make some reasonable assumptions about how densely to place your access points, and, more importantly, get an idea of the upper bounds of your final network.

With IT deployments scaling up into new territories of density, an investment in a wireless emulation tool will pay for itself many times over, especially when bidding on a project. The ability to justify how you have sized a quality solution, versus an ad hoc random solution, will allow your customer to make informed decisions on the trade-offs in wireless investment.

The technical capabilities of the Wiser-50 are listed below. If you are not familiar with all the terms involved in wireless testing, I would suggest a call to Candelatech's network engineers; they have years of experience helping all levels of customers and are extremely patient and easy to work with.

Scenario Definition Tool/Visualization

  • Complete Scenario Definition to add nodes, create mobility vectors and traffic profiles for run-time executable emulation.
  • Runtime GUI visualization with mobility and different link and traffic conditions.
  • Automatic Traffic generation & execution through the GUI.
  • Drag-and-drop capability for re-positioning of nodes.
  • Scenario consistency checks (against node capabilities and physical limitations such as speed of vehicle).
  • Mock-up run of the defined scenario (i.e., a run that does not involve the emulator core, to look at the scenario).
  • Manipulation of groups of nodes (positioning, movement as a group).
  • Capture and replay log files via GUI.
  • Support for 5/6 pre-defined scenarios.

RF Module

  • Support for TIREM, exponent-based, shadowing, fading, and rain models (not included in base package).
  • Support for adaptive modulation/coding for BER targets for ground-ground links.
  • Support for ground-to-ground & satellite waveforms.
  • Support for MA TDMA (variants for ground-ground, ground-air & satellite links).
  • Support for minimal CSMA/CA functionality.
  • Support to add effects of selective ARQ & re-transmissions for the TDMA MAC.


Related Articles

The Wireless Density Problem

Wireless Network Capacity Never Ending Quest Cisco Blog

Wireless is Nice, but Wired Networks are Here to Stay


By Art Reisman, CTO, www.netequalizer.com


The trend to go all wireless in high-density housing seemed a slam dunk just a few years ago. The driving force behind the exclusive deployment of wireless over wired access was twofold:

  • Wireless cost savings. It is much less expensive to blanket a building with a mesh network than to pay a contractor to run RJ45 cable throughout the building.
  • People expect wireless. Nobody plugs a computer into the wall anymore – or do they?

Something happened on the way to wireless Shangri-La. The physical limitations of wireless, combined with the appetite for ever-increasing video, have caused some high-density housing operators to rethink their positions.

In a recent discussion with several IT administrators representing large residential housing units, the topic turned to whether or not the wave of the future would continue to include wired Internet connections. I was surprised to learn that the consensus was that wired connections were not going away anytime soon.

To quote one attendee…

“Our parent company tried cutting costs by going all wireless in one of our new builds. The wireless access in buildings just can’t come close to achieving the speeds we can get in the wired buildings. When push comes to shove, our tenants still need to plug into the RJ45 connector in the wall socket. We have plenty of bandwidth at the core, but the wireless just can’t compete with the expectations we have attained with our wired connections.”

I found this statement on a ResNet mailing list from Brown University.

“Greetings,

     I just wanted to weigh-in on this idea. I know that a lot of folks seem to be of the impression that ‘wireless is all we need’, but I regularly have to connect physically to get reasonable latency and throughput. From a bandwidth perspective, switching to wireless-only is basically the same as replacing switches with half-duplex hubs.
     Sure, wireless is convenient, and it’s great for casual email/browsing/remote access users (including, unfortunately, the managers who tend to make these decisions). Those of us who need to move chunks of data around or who rely on low-latency responsiveness find themselves marginalized in wireless-only settings. For instance: RDP, SSH, and X11 over even moderately busy wireless connections are often barely usable, and waiting an hour for a 600MB Debian ISO seems very… 1997.”

Despite the tremendous economic pressure to build ever faster wireless networks, the physics of transmitting signals through the air will ultimately limit the speed of wireless connections to far below what can be attained by wired connections. I always knew this, but was not sure how long it would take reality to catch up with the hype.

Why is wireless inferior to wired connections when it comes to throughput?

In the real world of wireless, the factors that limit speed include:

  1. The maximum amount of data that can be transmitted on a wireless channel is less than on a wire. A rule of thumb for transmitting digital data over the airwaves is that you can only send bits of data at 1/2 the frequency. For example, 800 megahertz (a common wireless carrier frequency) has 800 million cycles per second, and 1/2 of that is 400 million cycles per second. This translates to a theoretical maximum data rate of 400 megabits. Realistically though, with imperfect signals (noise) and other environmental factors, 1/10 of the original frequency is more likely the upper limit. This gives us a maximum carrying capacity per channel of 80 megabits on our 800 megahertz channel. For contrast, the upper limit of a single fiber cable is around 10 gigabits, and higher speeds are attained by laying cables in parallel, bonding multiple wires together in one cable, and, on major backbones, transmitting multiple frequencies of light down the same fiber, achieving speeds of 100 gigabits on a single fiber! In fairness, wireless signals can also use multiple frequencies for multiple carrier signals, but the difference is you cannot have them in close proximity to each other.
  2. The number of users sharing the channel is another limiting factor. Unlike a single wired connection, wireless users in densely populated areas must share a frequency; you cannot pick out a user in the crowd and dedicate the channel to a single person. This means, unlike the dedicated wire going straight from your Internet provider to your home or office, you must wait your turn to talk on the frequency when there are other users in your vicinity. So if we take our 80 megabits of effective channel bandwidth on our 800 megahertz frequency, and add in 20 users, we are now down to 4 megabits per user.
  3. The efficiency of the channel. When multiple people share a channel, the efficiency of how they use it drops. Think of traffic at a 4-way stop: there is quite a bit of wasted time while drivers figure out whose turn it is to go, not to mention the time they take to clear the intersection. The same goes for wireless sharing techniques; there is always overhead in context switching between users. Thus we can take our 20-user scenario down to an effective data rate of 2 megabits.
  4. Noise. There is noise and then there is NOISE. Although we accounted for average noise in our original assumptions, in reality there will always be segments of the network that experience higher noise levels than average. When NOISE spikes there is further degradation of the network, and sometimes a user cannot communicate at all with an AP. NOISE is a maddening and unquantifiable variable. Our assumptions above were based on degradation from average noise levels; it is not unheard of for an AP to drop its effective transmit rate by 4 or 5 times to account for noise, at which point the effective data rate for all users on that segment from our original example drops to 500 kbps, just barely enough bandwidth to watch a bad video. (The sketch after this list walks through this arithmetic.)
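Expressed as a short sketch, the degradation chain above looks like this; it uses the article's rules of thumb, not a rigorous capacity model:

```python
# Back-of-envelope sketch of the wireless degradation chain described above,
# using the article's rules of thumb (not a rigorous capacity model).
carrier_hz = 800e6                  # 800 MHz carrier frequency
theoretical_bps = carrier_hz / 2    # rule of thumb: bits at 1/2 the frequency
realistic_bps = carrier_hz / 10     # noise/environment: ~1/10 of the frequency

users = 20
per_user_bps = realistic_bps / users   # channel shared among 20 users
effective_bps = per_user_bps / 2       # contention and context-switch overhead
noisy_bps = effective_bps / 4          # a noisy segment backing off 4x

print(f"theoretical: {theoretical_bps / 1e6:.0f} Mbps")  # 400 Mbps
print(f"realistic:   {realistic_bps / 1e6:.0f} Mbps")    # 80 Mbps
print(f"per user:    {per_user_bps / 1e6:.1f} Mbps")     # 4.0 Mbps
print(f"effective:   {effective_bps / 1e6:.1f} Mbps")    # 2.0 Mbps
print(f"with noise:  {noisy_bps / 1e3:.0f} kbps")        # 500 kbps
```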

Long live wired connections!
