Do you want failover on your NetEqualizer, or have you wondered why it's not available? Let me share a story that shaped our philosophy on failover.
A long time ago, back in 1993 or so, I was the Unix and operating system point person for the popular AT&T (i.e. Lucent and Avaya) voice messaging product called Audix. It was my job to make sure that the Unix operating system was bug-free and to troubleshoot any issues.
At the time, Audix sales accounted for about $300 million in business and included many Fortune 500 companies around the world. One of the features which I investigated, tested, and certified was our RAID technology. The data on our systems consisted of the archives of all those saved messages that were so important, even more so before e-mail became the standard.
I had a lab set up with all sorts of disk arrays and would routinely yank one from the rack while an Audix system was running. The RAID software we'd integrated worked flawlessly in every test. We were one of the largest companies in the world, we spared no expense to ensure quality in our equipment, and we charged a premium for everything we sold. If the RAID feature was included as a line item on an Audix system, it could run as high as $100,000.
Fast forward a few years. We got a call that a customer had lost all of their data: a RAID system had failed. It was a well-known insurance company in the Northeast. Needless to say, they were not pleased that their $100K insurance policy against disk failure did not pan out.
I had certified this mechanism and stood behind it. So, I called together the RAID manufacturer and several Unix kernel experts to do a postmortem. After several days locked in a room, we found that the real-world failure did not match our lab testing, where we had pulled live disk drives from running systems. In fact, the drive failed in such a way as to slowly corrupt the customer's data on all of the disk drives, rendering it useless.
Over the years, I did some follow-up research on failover strategies and discovered that many people implement them for political reasons, to cover their asses. I do not mean to demean people covering their asses, as it is an important part of business, but the problem is that the real cost of testing and validating failover is not practical for most manufacturers.
Many customers ask, "If a NetEqualizer fails, will the LAN cards still pass data?" The answer is, we could certainly engineer our product this way, but there is no such thing as a guaranteed fail-safe system.
Here are the pros and cons of such a technology:
1) Just like my disk drive failure experience, a system can fail in many different ways, and the failover mechanism is likely not foolproof. So, I don't want to recreate history for something that neither we nor anybody else can reliably test under real-world conditions.
2) NetEqualizer’s failure rate is about two percent over two years, which is mostly attributed to harsh operating conditions. That means you have a 1 in 50 chance of having a failure over a two-year period. Put simply, the odds are against this happening.
3) If a NetEqualizer fails, recovery is usually a matter of moving a cable, which is an easy fix. So, if you, or anyone with access to the NetEqualizer, is within an hour of your facility, you have a 1 in 50 chance of your network being down for one hour over any two-year period because of a NetEqualizer.
4) Customers that really need a fully redundant failover for their operation duplicate their entire infrastructure and purchase two NetEqualizers. These customers are typically brokerage houses where large revenue could be lost. Since they already have a fully tested strategy at the macro level, a failover card on the NetEqualizer is not needed.
5) Customers that are just starting to dabble in redundancy have gone with Cisco's Spanning Tree Protocol. Cisco has many years and billions of dollars invested in its switching technology, and it is rock solid.
6) Putting LAN failover cards in our product would likely raise our base price by about $1000. That would be a significant price increase for most customers, and one that would most likely not be worth paying for.
7) Most equipment failures are software or system related. We take pride in the fact that our boxes run forever and don’t lock up or need rebooting. A failover LAN card does not typically protect against system-type failures.
So, yes, we could sell our system as fail-safe with a failover LAN card, but we would rather educate than exploit fears and misunderstandings. Hopefully we've accomplished that here.
How Much YouTube Can the Internet Handle?
November 21, 2008 — By Art Reisman, CTO, http://www.netequalizer.com
As the Internet continues to grow and real-world speeds climb, video sites like YouTube are taking advantage of these fatter pipes. However, unlike the peer-to-peer traffic of several years ago (which seems to be abating), YouTube videos don't face the veil of copyright scrutiny that was cast upon p2p and caused most users to back off.
In our experience, there are trade-offs associated with the advancements in technology that have come with YouTube. From measurements done in our NetEqualizer laboratories, the typical normal-quality YouTube video needs about 240 kbps sustained over the 10-minute run time of the video. The newer high-definition videos run at least twice that rate.
Many of the rural ISPs that we at NetEqualizer support with our bandwidth shaping and control equipment have contention ratios of about 300 users per 10-megabit link. This seems to be the ratio at which these small businesses can turn a profit. Given this contention ratio, if 40 customers simultaneously run YouTube, the link will be exhausted and all 300 customers will be wishing they had their dial-up back. At last check, YouTube traffic accounted for 10 percent of all Internet traffic. If left completely unregulated, a typical rural ISP could already find itself on the brink of saturation from normal YouTube usage. Tier-1 providers in major metro areas usually have more bandwidth, but with that comes higher expectations of service, and hence some saturation is inevitable.
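To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python using the figures above (240 kbps per stream, a 10-megabit link, roughly 300 subscribers); the constant names are ours, not anything from the NetEqualizer product:

```python
# Back-of-the-envelope saturation math using the figures quoted above:
# 240 kbps per normal-quality stream on a 10-megabit link shared by
# roughly 300 subscribers.

LINK_CAPACITY_KBPS = 10_000   # 10-megabit shared link
STREAM_RATE_KBPS = 240        # sustained rate of one YouTube video

max_streams = LINK_CAPACITY_KBPS // STREAM_RATE_KBPS
print(f"Concurrent streams before the link is exhausted: {max_streams}")  # 41

share = max_streams / 300
print(f"That is only {share:.0%} of 300 subscribers streaming at once")   # 14%
```

In other words, if even one in seven subscribers presses play at the same time, everyone on the link feels it.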
If you believe there is a conspiracy, or that ISPs are not supposed to profit as they take risk and operate in a market economy, you are entitled to your opinion, but we are dealing with reality. There will always be tension between users and their providers, much the same as there is between limited highway funds and traffic congestion.
The fact is, all ISPs have a fixed amount of bandwidth they can deliver, and when data flows exceed their current capacity, they are forced to implement some form of passive constraint. Without constraints, many networks would lock up completely. This is no different than a city restricting water usage when reservoirs are low. Water restrictions are well understood by the populace, and yet somehow bandwidth allocations and restrictions are perceived as evil. I believe this misconception is simply due to the fact that bandwidth is so dynamic. If there were a giant reservoir of bandwidth pooled up in the mountains, where you could watch this resource slowly become depleted, the problem would be more easily visualized.
The best compromise offered, and the only compromise that is not intrusive, is bandwidth rationing at peak hours when needed. Without rationing, a network will fall into gridlock, in which case not only do the YouTube videos come to a halt, but so do e-mail, chat, VoIP, and other less intensive applications.
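We won't prescribe a specific rationing mechanism here, but a classic building block for capping a flow's share of a link is a token bucket. The following is a minimal, hypothetical sketch in Python, not the NetEqualizer algorithm: a flow may only send when it has accumulated enough allowance.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: a flow may send a packet only
    when it has accumulated enough byte allowance ("tokens")."""

    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps / 8.0        # refill rate in bytes per second
        self.capacity = burst_bytes       # maximum saved-up allowance
        self.tokens = burst_bytes         # start with a full bucket
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill for the elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True                   # within the ration: pass the packet
        return False                      # over the ration: queue or drop it

# Cap one video flow at 240 kbps with a 30 KB burst allowance.
bucket = TokenBucket(rate_bps=240_000, burst_bytes=30_000)
print(bucket.allow(1500))                 # a typical packet passes at first
```

The appeal of this approach is that it only bites during sustained heavy use; bursty traffic like e-mail and chat sails through untouched.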
There is some good news: there are alternative ways to watch YouTube videos.
We noticed during our testing that YouTube attempts to play back video as a real-time feed, like watching live TV. When you go directly to YouTube to watch a video, the site and your PC immediately start the video, and the quality becomes dependent on having that 240 kbps. If your provider's speed dips below this level, your video will begin to stall, which is very annoying. However, if you are willing to wait a few seconds, there are tools out there that will play back YouTube videos for you in non-real time.
They accomplish this by pre-buffering the video before it starts playing. We have not reviewed any of these tools, so do your research; we suggest you google "YouTube buffering tools" to see what is out there. Not only do these tools smooth out YouTube playback during peak times or on slower connections, but they also help balance the load on the network.
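To see why a head start helps, here is a small illustrative calculation in Python. The function name and example numbers are our own assumptions, not measurements of any particular tool:

```python
def prebuffer_seconds(video_s: float, video_kbps: float, link_kbps: float) -> float:
    """Minimum head start (in seconds) so playback never stalls.
    While playing, the buffer drains at video_kbps and refills at
    link_kbps; any shortfall must be downloaded before pressing play."""
    if link_kbps >= video_kbps:
        return 0.0                                  # the link keeps up
    shortfall_kbit = (video_kbps - link_kbps) * video_s
    return shortfall_kbit / link_kbps               # time to pre-load it

# A 10-minute, 240 kbps video over a 180 kbps connection:
print(f"{prebuffer_seconds(600, 240, 180):.0f} second head start needed")  # 200
```

A few minutes of patience up front buys stall-free playback on a connection that could never keep up in real time.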
Bio: Art Reisman is a partner and co-founder of APconnections, a company that provides bandwidth control solutions (NetEqualizer) to ISPs, universities, libraries, mining camps, and any organization where groups of users must share their Internet resources equitably.