Five Things to Know About Wireless Networks


By Art Reisman
CTO, APconnections

Over the last year or so, when the work day is done, I often find myself talking shop with several peers of mine who run wireless networking companies. These are the guys in the trenches. They spend their days installing wireless infrastructure in apartment buildings, hotels, and professional sports arenas, to name just a few. Below I share a few tidbits intended to provide a high-level picture for anybody thinking about building their own wireless network.

There are no experts.

Why? The companies that make wireless equipment are sending out patches almost hourly. Because they have no idea what works in the real world, every new release is an experiment. Anybody who works in this industry is chasing the technology; it is not stable enough for a person to become an expert. Anybody who claims to be an expert is living an illusion at best; perhaps "wireless historian" would be a better term in this fast-moving field. What you know today will likely be obsolete in 6 months.

The higher (faster) the frequency, the higher the cost of the network.

Why? As the industry moves to standards that transmit data at higher rates, it must use higher frequencies to achieve the faster speeds. It just so happens that these higher frequencies tend to be less effective at penetrating buildings, walls, and windows. The increased cost comes from the need to place more and more access points in a building to achieve coverage.

Putting more access points in your building does not always mean better service.

Why? Computers have a bad habit of connecting to one access point and then not letting go, even when the signal gets weak. For example, when you connect to a wireless network with your laptop in the lobby of a hotel and then move across the room, you can end up in a bad spot with respect to your original access point. In theory, the right thing to do would be to release the current connection and connect to a different access point. The problem is that most of the installed base of wireless networks has no intelligence built in to route you to the best access point, hence even a building with plenty of coverage can have maddening service.
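To make this concrete, here is a minimal sketch of the roaming decision a smarter client or controller could make. The threshold and hysteresis values are illustrative assumptions, not numbers from any standard:

```python
# Hypothetical roaming check; thresholds are illustrative only.
ROAM_THRESHOLD_DBM = -70  # below this, the current signal is considered weak
HYSTERESIS_DB = 8         # a candidate AP must beat the current one by this margin

def should_roam(current_rssi_dbm: int, candidate_rssi_dbm: int) -> bool:
    """Return True if the client would be better served by switching APs."""
    if current_rssi_dbm >= ROAM_THRESHOLD_DBM:
        return False  # still strong enough; stay put to avoid needless flapping
    return candidate_rssi_dbm >= current_rssi_dbm + HYSTERESIS_DB

# The hotel-lobby scenario: the original AP has faded to -80 dBm,
# while an AP across the room offers -62 dBm.
print(should_roam(-80, -62))  # True: switching would improve service
print(should_roam(-80, -76))  # False: not enough improvement to justify a switch
```

Most deployed clients simply never run a check like this; they cling to the first AP they joined until the connection drops entirely.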

Electromagnetic Radiation Cannot Be Seen

So what? The issue here is that there are all kinds of scenarios where wireless signals bouncing around the environment can destroy service. Think of a highway full of invisible cars traveling in any direction they want. When a wireless network is installed, the contractor in charge does what is called a site survey. This involves special equipment that measures the electromagnetic waves in an area and helps them plan how many wireless access points to install, and where; but once the network is installed, anything can happen. Personal hotspots, devices with electric motors, or a change in metal furniture configuration can all destabilize an area, and thus service can degrade for reasons that nobody can detect.

The More People Connected, the Slower Their Speed

Why? Wireless access points use a technique called TDM (Time Division Multiplexing). Basically, available bandwidth is carved up into little time slots. When only one user is connected to an access point, that user gets all the bandwidth; when two users are connected, they each get half the time slots. So an access point that advertises 100-megabit speeds can deliver at best 10 megabits per user when 10 people are connected to it.
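The arithmetic is simple enough to spell out. The sketch below just divides advertised capacity by the number of active users, which is the best case before any protocol overhead:

```python
def per_user_throughput_mbps(advertised_mbps: float, active_users: int) -> float:
    """Best-case per-user rate when an AP splits its airtime evenly (idealized TDM)."""
    return advertised_mbps / max(active_users, 1)

# The example from the text: a "100 megabit" access point with 10 connected users.
print(per_user_throughput_mbps(100, 10))  # 10.0 Mbps each, at best
```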

Related Article

Wireless is nice but wired networks are here to stay

Seven Tips To Improve Performance of your Wireless Lan

Is a Balloon Based Internet Service a Threat to Traditional Cable and DSL?


Update:

Looks like this might be the real deal: a mystery barge in San Francisco Bay owned by Google.

I recently read an article regarding Google’s foray into balloon-based Internet services.

This intriguing idea sparked a discussion on the subject with some of the engineers at a major satellite Internet provider. They, like me, were somewhat skeptical about the feasibility of the balloon idea. Could we be wrong? Obviously, there are some unconventional obstacles to bouncing Internet signals off balloons, but what if those obstacles could be economically overcome?

First, let's look at the practicalities of using balloons to beam Internet signals from ground-based stations to consumers.

Advantages over satellite service

Latency

Satellite Internet, the kind used by Wild Blue, usually comes with a minimum 1-second delay, sometimes more. The bulk of this delay is due to the distance to a stationary satellite: 22,000 miles.

A balloon would be located much closer to the earth, in the atmosphere at around 2 to 12 miles up. At this distance, the latency is just a few milliseconds.
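A back-of-the-envelope calculation, assuming pure speed-of-light propagation and ignoring all processing delays, shows where the difference comes from:

```python
SPEED_OF_LIGHT_MILES_PER_SEC = 186_282

def request_response_delay_ms(relay_altitude_miles: float) -> float:
    """Propagation delay for a request and its response: four trips
    between the ground and the relay (up/down for each direction)."""
    return 4 * relay_altitude_miles / SPEED_OF_LIGHT_MILES_PER_SEC * 1000

print(round(request_response_delay_ms(22_000)))  # ~472 ms for a stationary satellite
print(round(request_response_delay_ms(12), 2))   # ~0.26 ms for a balloon 12 miles up
```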

Cost

Getting a basic stationary satellite into space costs a minimum of 50 million dollars, perhaps a bit less for a low-orbiting, non-stationary satellite.

Balloons are relatively inexpensive compared to a satellite. Although I don't have exact numbers, the launch cost is practically zero: a balloon carries its payload without any additional energy or infrastructure, so the only real costs are the balloon, the payload, and the ground-based stations. For comparison purposes, let's go with $50,000 per balloon.

Power

Both options can use solar power. Orienting a balloon's solar collectors might require 360-degree coverage; however, as we will see, a balloon can be tethered and periodically raised and lowered, in which case power can be ground-based and rechargeable.

Logistics

This is the elephant in the room. The position of a satellite over time is extremely predictable. Even satellites that are not stationary can be relied on to be where they are supposed to be at any given time. This makes coverage planning deterministic. Balloons, on the other hand, unless tethered, will wander with very little predictability.

Coverage Range

A balloon at 10,000 feet can cover a radius on the ground of about 70 miles. A stationary satellite can cover an entire continent. So you would need a series of balloons to cover an area reliably.
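As a sanity check on that 70-mile figure, the geometric horizon puts a hard upper bound on line-of-sight coverage. The rule of thumb below is a standard approximation; the gap between the geometric limit and the quoted 70 miles would be eaten up by antenna patterns and losses at low elevation angles:

```python
import math

def horizon_miles(height_feet: float) -> float:
    """Approximate distance to the geometric horizon in statute miles,
    using the common rule of thumb d = 1.22 * sqrt(height in feet)."""
    return 1.22 * math.sqrt(height_feet)

# Hard line-of-sight ceiling for a balloon at 10,000 feet:
print(round(horizon_miles(10_000)))  # ~122 miles, comfortably above the usable ~70
```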

Untethered

I have to throw out the idea of untethered high-altitude balloons. They would wander all over the world and crash back to earth in random places. Even if it were cost-effective to saturate the upper atmosphere with them and pick them out when in range for communications, I just don't think NASA would be too excited to have thousands of these large balloons in unpredictable drift patterns.

Tethered

As crazy as it sounds, there is a precedent for tethering a communication balloon to a 10,000-foot cable. Evidently the US did something like this to broadcast TV signals into Cuba. I suppose for an isolated area where you can hang out offshore, well out of the way of any air traffic, this is possible.

High Density Area Competition

So far I have been running under the assumption that a balloon-based Internet service would be an alternative to satellite coverage, which finds its niche almost exclusively in rural areas of the world. Given the monopoly and cost advantage existing carriers have in urban areas, a wireless service beaming high speeds from overhead might have some staying power there too. Certainly there could be some overlap with rural users, and thus the economics of deployment become more cost-effective; the more subscribers the better. But I do not see urban coverage as a driving business factor.

Would the Consumer Need a Directional Antenna?

I have been assuming all along that these balloons would supply direct service to the consumer. I suspect some sort of directional antenna pointing at your local offshore balloon would need to be attached to the side of your house. This is another reason why the balloons would need to hold a stationary position.

My conclusion is that somebody like Google could conceivably create a balloon zone off any coastline, with a series of balloons tethered to barges of some kind. The main problem, assuming cost were not an issue, would be the political ramifications of a plane hitting one of the tethers. With Internet demand on the rise, 4G's limited range, and the high cost of laying wire to rural homes, I would not be surprised to see a test network someplace in the near future.

Tethered balloon (courtesy of an Ars Technica article)

Five Things to Consider When Building a Commercial Wireless Network


By Art Reisman, CTO, APconnections, www.netequalizer.com

with help from Sam Beskur, CTO Global Gossip North America, http://hsia.globalgossip.com/

Over the past several years, we have provided our Bandwidth Controllers as a key component in many wireless networks. Along the way we have seen many successes, and some not-so-successful deployments. What follows are some key lessons from our experiences with wireless deployments.

1) Commercial Grade Access Points versus Consumer Grade

Commercial-grade access points use intelligent collision avoidance in densely packed areas. Basically, this means they make sure that a user within range of multiple access points is serviced by only one AP at a time. Without this intelligence, you get signal interference and confusion. An analogy: if you asked a sales rep for help in a store and two sales reps started talking back to you at the same time, it would be confusing to know which one to listen to. Commercial-grade access points follow a courtesy protocol, so you do not get two, or possibly even three, responses in a densely packed network.

Consumer-grade access points are meant to service a single household. If two are in close proximity to each other, they do not communicate. The end result is interference during busy times, as both will respond at the same time to the same user without any awareness of each other. Because of this, users will have trouble staying connected, and sometimes the performance problems show up long after the installation. When pricing out a solution for a building or hotel, be sure to ask the contractor whether they are bidding commercial-grade (intelligent) access points.

2) Antenna Quality

There are a limited number of frequencies (channels) open to public WiFi. If you can make sure a transmission is broadcast in a limited direction, this allows for more simultaneous conversations, and thus better quality. Higher-quality access points can actually figure out the direction of the users connected to them, such that when they broadcast, they cancel out the signal going in directions not intended for the end user. In tight spaces with multiple access points, signal-canceling antennas will greatly improve service for all users.

3) Installation Sophistication and Site Surveys

When installing a wireless network, there are many things a good installer must account for, for example, the attenuation between access points. In a perfect world you want your access points far enough apart that they are not getting blasted by their neighbors' signals. It is okay to hear your neighbor in the background a little bit; you must have some overlap, otherwise you would have gaps in coverage, but you do not want access points with high-energy signals competing close together. If you were installing your network in a giant farm field with no objects between access points, you could just set them up in a grid with the prescribed distance between nodes. In the real world you have walls, trees, windows, and all sorts of objects in and around buildings. A good installer will actually go out and measure the signal loss from these objects in order to place the correct number of access points. This is not a trivial task, and without an extensive site survey the resulting network will have quality problems.

4) Know What is Possible

Despite all the advances in wireless networks, they still have density limitations. I am not quite sure how to quantify this statement other than to say that wireless does not do well in an extremely crowded space (stadium, concert venue, etc.) with many devices all trying to get access at the same time. It is a big jump from designing coverage for a hotel with 1,000 guests spread out over the hotel grounds to a packed stadium of people sitting shoulder to shoulder. The other compounding issue with density is that it is almost impossible to simulate before building out the network and going live. I did find a reference to a company that claims to have done a successful build-out in Gillette Stadium, home of the New England Patriots. It might be worth looking into this further for other large venues.

5) Old Devices

Old 802.11b devices on your network will actually cause your access points to back off to slower speeds. Most b-only devices were discontinued in the mid-2000s, but they are still around. The best practice here is to simply block these devices; they are rare, and not worth dragging down the speed of your overall network.
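As an illustration of what blocking can key on: b-only clients identify themselves by supporting nothing beyond the 802.11b data rates of 1, 2, 5.5, and 11 Mbps. The helper below is a hypothetical sketch of that policy check, not a feature of any particular access point:

```python
# 802.11b supports only these data rates (in Mbps).
B_ONLY_RATES = {1.0, 2.0, 5.5, 11.0}

def is_b_only_client(supported_rates_mbps) -> bool:
    """True if a client advertises no rates beyond the 802.11b set."""
    return set(supported_rates_mbps).issubset(B_ONLY_RATES)

print(is_b_only_client([1.0, 2.0, 5.5, 11.0]))        # True: candidate for blocking
print(is_b_only_client([1.0, 2.0, 5.5, 11.0, 54.0]))  # False: an 802.11g-capable device
```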

We hope these five (5) practical tips help you to build out a solid commercial wireless network. If you have questions, feel free to contact APconnections or Global Gossip to discuss.

Related Article: Wireless Site Survey With Free Tools

How Many Users Can Your High Density Wireless Network Support? Find Out Before You Deploy.


By Art Reisman, CTO, http://www.netequalizer.com

Recently I wrote an article on how tough it has become to deploy wireless technology in high-density areas. It is difficult to predict final densities until a network is fully deployed, and often this leads to missed performance expectations.

In a strange coincidence, while checking in with my friends over at Candela Technologies last Friday, I was not surprised to learn that their latest offering, the Wiser-50 Mobile Wireless Network Emulator, is taking the industry by storm.

So how does their wireless emulator work, and why would you need one?

The Wiser-50 allows you to take your chosen access points, load them up with realistic signals from a densely packed area of users, and play out different load scenarios without actually building out the network. The ability to do this type of emulation allows you to make adjustments to your design on paper, without the costly trial and error of field trials. You will be able to see how your access points behave under load before you deploy them. You can then make reasonable assumptions about how densely to place your access points, and more importantly, get an idea of the upper bound of your final network.

With IT deployments scaling up into new territories of density, an investment in a wireless emulation tool will pay for itself many times over, especially when bidding on a project. The ability to justify how you have sized a quality solution, versus an ad hoc one, will allow your customer to make informed decisions on the trade-offs in wireless investment.

The technical capabilities of the Wiser-50 are listed below. If you are not familiar with all the terms involved in wireless testing, I would suggest a call to Candelatech's network engineers; they have years of experience helping customers of all levels, and are extremely patient and easy to work with.

Scenario Definition Tool/Visualization

  • Complete Scenario Definition to add nodes, create mobility vectors and traffic profiles for run-time executable emulation.
  • Runtime GUI visualization with mobility and different link and traffic conditions.
  • Automatic Traffic generation & execution through the GUI.
  • Drag-and-drop capability for re-positioning of nodes.
  • Scenario consistency checks (against node capabilities and physical limitations, such as vehicle speed).
  • Mock-up run of the defined scenario (i.e., a run that does not involve the emulator core).
  • Manipulation of groups of nodes (positioning, movement as a group).
  • Capture and replay log files via GUI.
  • Support for 5/6 pre-defined scenarios.

RF Module

  • Support for TIREM, exponent-based, shadowing, fading, and rain models (not included in the base package).
  • Support for adaptive modulation/coding for BER targets on ground-to-ground links.
  • Support for ground-to-ground and satellite waveforms.
  • Support for MA TDMA (variants for ground-to-ground, ground-to-air, and satellite links).
  • Support for minimal CSMA/CA functionality.
  • Support for adding the effects of selective ARQ and retransmissions to the TDMA MAC.


Related Articles

The Wireless Density Problem

Wireless Network Capacity: The Never-Ending Quest (Cisco Blog)

Wireless is Nice, but Wired Networks are Here to Stay


By Art Reisman, CTO, www.netequalizer.com


The trend to go all-wireless in high-density housing seemed a slam dunk just a few years ago. The driving forces behind deploying wireless exclusively over wired access were twofold:

  • Wireless cost savings. It is much less expensive to blanket a building with a mesh network than to pay a contractor to run RJ45 cable throughout the building.
  • People expect wireless. Nobody plugs a computer into the wall anymore – or do they?

Something happened on the way to wireless Shangri-La. The physical limitations of wireless, combined with the appetite for ever-increasing video, have caused some high-density housing operators to rethink their positions.

In a recent discussion with several IT administrators representing large residential housing units, the topic turned to whether or not the wave of the future would continue to include wired Internet connections. I was surprised to learn that the consensus was that wired connections were not going away anytime soon.

To quote one attendee…

“Our parent company tried cutting costs by going all wireless in one of our new builds. The wireless access in buildings just can’t come close to achieving the speeds we can get in the wired buildings. When push comes to shove, our tenants still need to plug into the RJ45 connector in the wall socket. We have plenty of bandwidth at the core, but the wireless just can’t compete with the expectations we have set with our wired connections.”

I found this statement on a ResNet mailing list from Brown University.

“Greetings,

     I just wanted to weigh-in on this idea. I know that a lot of folks seem to be of the impression that ‘wireless is all we need’, but I regularly have to connect physically to get reasonable latency and throughput. From a bandwidth perspective, switching to wireless-only is basically the same as replacing switches with half-duplex hubs.
     Sure, wireless is convenient, and it’s great for casual email/browsing/remote access users (including, unfortunately, the managers who tend to make these decisions). Those of us who need to move chunks of data around or who rely on low-latency responsiveness find themselves marginalized in wireless-only settings. For instance: RDP, SSH, and X11 over even moderately busy wireless connections are often barely usable, and waiting an hour for a 600MB Debian ISO seems very… 1997.”

Despite the tremendous economic pressure to build ever-faster wireless networks, the physics of transmitting signals through the air will ultimately limit the speed of wireless connections to far below what can be attained with wired connections. I always knew this, but was not sure how long it would take reality to catch up with the hype.

Why is wireless inferior to wired connections when it comes to throughput?

In the real world of wireless, the factors that limit speed include the following (a short worked calculation follows the list):

  1. The maximum amount of data that can be transmitted on a wireless channel is less than on a wire. A rule of thumb for transmitting digital data over the airwaves is that you can only send bits of data at 1/2 the frequency. For example, 800 megahertz (a common wireless carrier frequency) is 800 million cycles per second, and 1/2 of that is 400 million cycles per second. This translates to a theoretical maximum data rate of 400 megabits. Realistically, though, with imperfect signals (noise) and other environmental factors, 1/10 of the original frequency is more likely the upper limit. This gives us a maximum carrying capacity per channel of 80 megabits on our 800-megahertz channel. For contrast, the upper limit of a single fiber cable is around 10 gigabits, and higher speeds are attained by laying cables in parallel, bonding multiple wires together in one cable, and, on major backbones, transmitting multiple frequencies of light down the same fiber for speeds of 100 gigabits on a single strand! In fairness, wireless signals can also use multiple carrier frequencies, but the difference is you cannot have them in close proximity to each other.
  2. The number of users sharing the channel is another limiting factor. Unlike a single wired connection, wireless users in densely populated areas must share a frequency; you cannot pick out a user in the crowd and dedicate the channel to a single person. This means that, unlike the dedicated wire going straight from your Internet provider to your home or office, you must wait your turn to talk on the frequency when there are other users in your vicinity. So if we take our 80 megabits of effective channel bandwidth on our 800-megahertz frequency and add 20 users, we are now down to 4 megabits per user.
  3. The efficiency of the channel. When multiple people share a channel, the efficiency of how they use it drops. Think of traffic at a four-way stop: there is quite a bit of wasted time while drivers figure out whose turn it is to go, not to mention the time they take to clear the intersection. The same goes for wireless sharing techniques; there is always overhead in context switching between users. Thus we can take our 20-user scenario down to an effective data rate of 2 megabits per user.
  4. Noise. There is noise, and then there is NOISE. Although we accounted for average noise in our original assumptions, in reality there will always be segments of the network that experience higher noise levels than average. When NOISE spikes, there is further degradation of the network, and sometimes a user cannot communicate at all with an AP. NOISE is a maddening and unquantifiable variable. Our assumptions above were based on degradation from average noise levels; it is not unheard of for an AP to drop its effective transmit rate by 4 or 5 times to account for noise, taking the effective data rate for all users on that segment in our example down to 500 kbps, just barely enough bandwidth to watch a bad video.
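Here is the same chain of reductions as a small script, using the illustrative numbers from the list above:

```python
carrier_mhz = 800  # a common wireless carrier frequency

theoretical_mbps = carrier_mhz / 2       # rule of thumb: bits at half the frequency -> 400
realistic_mbps = carrier_mhz / 10        # noise and environment -> 80
per_user_mbps = realistic_mbps / 20      # 20 users sharing the channel -> 4.0
after_overhead_mbps = per_user_mbps / 2  # context-switching overhead -> 2.0
noisy_segment_kbps = after_overhead_mbps / 4 * 1000  # a 4x noise backoff -> 500

print(theoretical_mbps, realistic_mbps, per_user_mbps,
      after_overhead_mbps, noisy_segment_kbps)  # 400.0 80.0 4.0 2.0 500.0
```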

Long live wired connections!

Will Bandwidth Shaping Ever Be Obsolete?


By Art Reisman

CTO – www.netequalizer.com

I find public university IT forums an excellent source of information on bandwidth-shaping policies. Unlike commercial providers, these user groups have found that technical collaboration is in their best interest, and they openly discuss current trends in bandwidth control.

A recent university IT user group discussion thread kicked off with the following comment:

“We are in the process of trying to decide whether to upgrade or altogether remove our packet shaper from our residence hall network. My network engineers are confident we can accomplish rate limiting/shaping through use of our core equipment, but I am not convinced removing the appliance will turn out well.”

Notice that he is not talking about removing rate limits completely, just backing off from an expensive extra piece of packet-shaping equipment and using the simpler rate limits available on his router. The point of my referencing this discussion is not so much to weigh the different approaches to rate limiting, but to emphasize that, at this point in time, running wide open without some sort of restriction is not even being considered.

Despite an 80 to 90 percent reduction in bulk bandwidth prices over the past few years, bandwidth is not yet cheap enough for an ISP to run wide open. Will it ever be possible for an ISP to run wide open, without deliberately restricting its users?

The answer is not likely.

First of all, there seems to be no limit to the ways consumer devices and content providers will conspire to gobble bandwidth. The common assumption is that no matter what an ISP does to deliver higher speeds, consumer appetite will outstrip it.

Yes, an ISP can temporarily leap ahead of demand.

We do have a precedent from several years ago. In 2006, the University of Brighton in the UK was able to unplug our bandwidth shaper without issue. When I followed up with their IT director, he mentioned that their students' total consumption was capped by the far-end services of the Internet, and thus they did not hit their heads on the ceiling of the local pipes. Running without restriction, 10,000 students were not able to eat up their 1-gigabit pipe! I must caveat this experiment by saying that the UK university system had invested heavily in subsidized bandwidth and was far ahead of the average ISP curve for the time, and content services for video were just not that widely used by students then. Such an experiment today would bring a pipe with a similar contention ratio to its knees in a few seconds. I suspect one would now need on the order of 15 to 25 gigabits to run wide open without contention-related problems.
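Some rough contention math makes the point. The simultaneous-usage figure below is my own illustrative assumption, not data from Brighton:

```python
students = 10_000
pipe_mbps = 1_000  # a 1-gigabit pipe

print(pipe_mbps / students)  # 0.1 Mbps each if everyone transmitted at once

# Assume only ~5% of students are pulling data at any instant (illustrative guess):
active = students * 0.05
print(pipe_mbps / active)    # 2.0 Mbps per active user -- workable in 2006,
                             # hopeless once everyone streams video
```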

It also seems that we are coming to the end of the line for bandwidth in the wireless world much more quickly than in the wired world.

It is unlikely consumers are going to carry cables around with their iPads and iPhones to plug into wall jacks any time soon. With the diminishing returns on investment in higher speeds for the wireless networks of the world, bandwidth control is the only way to keep some kind of order.

Lastly, I do not expect bulk bandwidth prices to continue to fall at their present rate.

The last few years of falling prices are the result of a perfect storm of factors not likely to be repeated.

For these reasons, it is not likely that bandwidth control will be obsolete for at least another decade. I am sure we will be revisiting this issue in the next few years for an update.

More Ideas on How to Improve Wireless Network Quality


By Art Reisman

CTO – http://www.netequalizer.com

I just came back from one of our user group seminars, held at a very prestigious university. Their core networks are all running smoothly, but they still have some hard-to-find, sporadic dead spots on their wireless network. It seems that no matter how many site surveys they do, and how many times they try to optimize the placement of their access points, they always end up with sporadic, transient dead spots.

Why does this happen?

The issue with 802.11-class wireless service is that most access points lack intelligence.

With low traffic volumes, wireless networks can work flawlessly, but add a few extra users and you can get a perfect storm. Combine some noise with a loud talker close to the access point (the hidden node problem), and the weaker-signaled users will simply get crowded out until the loud talker with the stronger signal is done. These outages are generally regional, localized to a single AP, and may have nothing to do with overall usage on the network. Often, troubleshooting is almost impossible: by the time the investigation starts, the crowd has dispersed, and all an admin has to go on is complaints that cannot be reproduced.

Access points also have a mind of their own. They will often back down from their best-case throughput speed to a slower speed in a noisy environment. I don't mean audible noise, but crowded airwaves: lots of talkers and possible interference from other electronic devices.

For a quick stopgap solution, you can take a bandwidth controller and…

Put tight rate caps on all wireless users; we suggest 500 kbps or slower. Although this might seem counter-intuitive and wasteful, it will prevent loud talkers with strong signals from dominating an entire access point. Many operators cringe at this sort of idea, and we admit it might seem a bit crude. However, in the face of random users getting locked out completely, and the high cost of retrofitting your network with a smarter mesh, it can be very effective.

Along the same lines as fixed rate caps, a somewhat more elegant solution is to measure the peak draw on your mesh and implement equalizing on the largest streams at peak times. Even with a smart mesh network of integrated APs (described in the next section), you can get a great deal of relief by dynamically throttling the largest streams on your network during peak times. This method still allows users to pull bigger streams during off-peak hours, as sketched below.
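A conceptual sketch of that equalizing idea follows. This is an illustration of the principle, with made-up flow names and thresholds, not NetEqualizer's actual algorithm:

```python
def flows_to_throttle(flows_kbps: dict, link_capacity_kbps: float,
                      trigger_ratio: float = 0.85) -> list:
    """During peak load, return the flows consuming more than a fair share.
    Off-peak, return nothing so big transfers run unimpeded."""
    total = sum(flows_kbps.values())
    if total < trigger_ratio * link_capacity_kbps:
        return []  # link not under pressure; leave everyone alone
    fair_share = link_capacity_kbps / len(flows_kbps)
    return [flow for flow, rate in flows_kbps.items() if rate > fair_share]

# An 8 Mbps link nearing saturation: only the two video streams get throttled.
flows = {"video-1": 4000, "web-1": 150, "ssh-1": 20, "video-2": 3500}
print(flows_to_throttle(flows, 8000))  # ['video-1', 'video-2']
```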

Another solution would be to deploy smarter mesh access points…

I have to backtrack a bit on my comment above about access points lacking intelligence. The modern mesh offerings from companies such as:

Aruba Networks (www.arubanetworks.com)

Meru (www.merunetworks.com)

Meraki (www.meraki.com)

All have intelligence designed to reduce the hidden node problem and other congestion problems, using techniques such as:

  • Switching off users with weaker signals so they are forced to a nearby access point. The AP does this by ignoring the weaker users' signals altogether, so those users must seek a connection with another AP in the mesh, and thus get better service.
  • Preventing low-quality users from connecting at slow speeds, so the access point does not have to back off for all users.
  • Smarter logging, so an admin can go in after the fact and at least get a history of what the AP was doing at the time.

Related article explaining how to optimize wireless transmission.
