Adopting Best Practices for Wi-Fi: A Service Provider's Onus
Improve customer attraction rates and reduce churn by delivering a satisfying customer experience over Wi-Fi
By: Joe Zeto
Jan. 17, 2012 05:45 AM
As smartphone users become more sophisticated, they actively seek out the service provider they believe offers the best overall network for their smartphone. Providers are learning that users are quick to switch if they are unhappy with their existing service. Customers today expect their smartphone to deliver high-bandwidth applications along with high-quality voice services. Service providers must look to alternatives for "offloading" these bandwidth-intensive applications if they are to keep up with these high bandwidth demands. After years of serving as a nice-to-have hospitality solution, IEEE 802.11 is being thrust into the forefront as a solution.
The risk providers face when using Wi-Fi for cellular offload is that unsatisfactory Wi-Fi experiences now result in the loss of high-margin smartphone users. The reality is that, in most instances, consumers will not realize that their new smartphone has moved off a 3G or 4G service onto a Wi-Fi network. At that point, end users will associate a poor Wi-Fi connection with a poor cellular network connection.
A single smartphone subscriber accounts for thousands of dollars in revenue and hundreds of dollars in profit per contract. Service providers must make every reasonable effort to acquire and retain these premium customers in order to sustain and grow their business.
Previous Wi-Fi Practices and Problems
Traditional deployment methodologies assumed that the vast majority of problems would be caused by 802.11 RF issues. In other words, the assumption was that if a device could detect the radio signal from the wireless access point (AP), and if there was relatively little interference, then the network would work fine.
The dual goals of minimizing cost and maximizing coverage resulted in service providers and enterprise IT managers deploying the minimum number of APs required to cover the target service area with each AP's transmit power turned up as high as possible. For example, instead of using 10 APs with a low transmit power, the traditional design might use five APs configured for maximum transmit power.
This deployment strategy focused on maximizing RF power and minimizing installation costs, and it produced a well-known experience: the laptop or smartphone is turned on; a suitable public-access Wi-Fi network is identified, showing three of four bars to indicate a strong signal; the user then tries to connect the device to the network but cannot. The network appears to be available, yet it clearly cannot be used.
Another common situation is when users manage to connect and then find the network's performance unacceptable. Even more frustrating, a nearby user with 3G data access may be getting better performance, when Wi-Fi should be the fastest wireless technology available to a laptop or smartphone.
These scenarios are not surprising when one considers that signal strength is usually the only criterion used to approve a network. There are numerous reasons why signal strength is a poor predictor of performance. One of the most common is that the frames that produce the signal strength indication are sent at the most robust, lowest-rate encoding the AP supports, while users' application traffic must use higher-rate encodings to attain the performance necessary for a positive user experience.
In an access point with a marginal radio, the management frames will commonly work fine while the higher-rate data encodings will be marginal or fail completely. Thus the user can see the AP but cannot use it for any practical purpose. APs also continue to advertise their presence even if they cannot accept more users or effectively route user traffic to the Internet because of limited backhaul bandwidth, a broken backhaul connection, or misconfigured equipment.
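The gap between seeing an AP and using it can be sketched as follows. The SNR thresholds below are illustrative assumptions, not values from the 802.11 standard; the point is only that beacons decode at a much weaker signal than the data rates applications need.

```python
# Hypothetical sketch: why "full bars" can still mean an unusable link.
# Management frames (beacons) use the lowest, most robust encoding, while
# data traffic needs higher-rate encodings with stricter SNR requirements.
# The dB thresholds here are illustrative assumptions only.

MIN_SNR_DB = {          # approximate SNR needed to decode each rate
    "beacon_1mbps": 4,  # robust low-rate encoding used for beacons
    "data_6mbps": 10,
    "data_24mbps": 17,
    "data_54mbps": 25,  # high-rate encoding needed for a good experience
}

def usable_rates(snr_db):
    """Return the encodings a client could decode at this SNR."""
    return [rate for rate, need in MIN_SNR_DB.items() if snr_db >= need]

# A marginal link: strong enough to see beacons (so the network "looks"
# fine and shows bars) but too weak for any data rate applications need.
print(usable_rates(8))   # only the beacon rate decodes; data rates fail
```

At an assumed 8 dB SNR the client lists the network and shows signal bars, yet every data encoding fails, which is exactly the "visible but unusable" scenario described above.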
Clearly, the ability to detect a signal from the AP is necessary in order for users to be able to access the Wi-Fi network. However, the mere fact that an AP is present and advertising itself is not sufficient to ensure that a user will achieve reasonable performance and have a satisfactory experience. A much better criterion is user satisfaction with application performance, even when the network is fully populated.
Cellular Offload - Optimizing for Coverage, Not Capacity
In large public venues such as stadiums, concert halls, conference centers, and fairgrounds, the number of smartphones packed into a given area creates some of the most demanding scenarios for 802.11 deployments. With nearly 50 percent of the population in modern societies carrying and using smartphones, and that share growing, these networks need a great deal of capacity. The challenge with a network designed primarily for coverage is that it uses relatively few APs to serve large areas, so a large number of clients end up sharing each AP. In these environments, the network quickly becomes congested, and users frequently complain that they can see the network fine but it simply doesn't work.
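The coverage-versus-capacity trade-off reduces to simple fair-share arithmetic. The AP capacity and client counts below are assumed numbers for illustration, echoing the five-AP versus ten-AP example earlier in the article:

```python
# Illustrative arithmetic (assumed numbers): per-client throughput when a
# venue network is designed for coverage (few high-power APs) versus
# capacity (more low-power APs sharing the same client population).

def per_client_mbps(ap_count, clients, ap_capacity_mbps=50.0):
    """Fair-share throughput per client, assuming clients split evenly
    across APs and each AP's airtime is shared equally."""
    clients_per_ap = clients / ap_count
    return ap_capacity_mbps / clients_per_ap

clients = 500  # smartphones in one venue section (assumed)
print(f"5 APs:  {per_client_mbps(5, clients):.1f} Mb/s per client")
print(f"10 APs: {per_client_mbps(10, clients):.1f} Mb/s per client")
```

Under these assumptions, doubling the AP count doubles each client's share, which is why a coverage-optimized design that looks fine on a site survey collapses when the venue fills up.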
Another significant concern in large public venue deployments is the capability of the wired infrastructure. With thousands of local users, multiple controllers, switches, and servers must all be configured and working properly to deliver a quality experience to every customer. Occasionally downloading a web page from a single client will often work when the network is lightly loaded. When the network is fully loaded and people are simultaneously uploading pictures, downloading web pages, streaming video, or using Skype with friends, the aggregate customer experience under scale is often very different.
Why Is Deployment Testing Critical?
Most testing today assumes a direct relationship between the maximum RF signal strength detected by a test laptop and the performance of a high-value client such as a smartphone. This correlation is wishful thinking at best. The reason is that smartphones are optimized for power, space, and performance; their radio designs and software differ from those found in a test laptop, and their performance differs accordingly.
One of the most common occurrences is that a smartphone will not connect to the AP one would expect. Smartphones, and in fact all 802.11 devices, make their own decisions about which specific access point to use when connecting to a network. They also decide on their own when to roam, that is, when to stop talking to one AP in the network and start talking to another, presumably because better performance will result. Roaming algorithms are completely unspecified, so every client design roams at different times and in different patterns.
Some devices try to minimize the number of roams because there is generally a slight service interruption during a roam that they want to avoid as much as possible. These devices often remain connected to an AP over a severely degraded link even when a substantially better option is available. The device remains connected and avoids the momentary impacts of frequent roaming, but the performance degrades severely as the client device gets further from its associated AP. Even worse, the effect of this one underperforming client device is to reduce the overall capacity of the network because all other client devices on the same channel must wait for the now lengthier conversation to complete before they can transfer their data. To a user, of course, this simply looks like the network is doing a terrible job.
At the other end of the spectrum are devices that instantly roam every time they think there is a slightly better option. These devices attempt to achieve improved connection quality by accepting more frequent interruptions due to roaming. They tend to work well for data services, but the interruptions do have an impact on the quality of real-time services such as voice or video. Once again, since device operation is essentially invisible to users, users naturally assume that any performance degradation that they perceive must be the fault of the network.
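The sticky-versus-aggressive trade-off can be modeled as a single hysteresis margin. Since 802.11 leaves roaming unspecified, each vendor picks its own thresholds; the RSSI values and margins below are illustrative assumptions only:

```python
# Sketch of the roaming trade-off described above. Roaming behavior is
# unspecified by 802.11, so every client vendor chooses its own rules;
# the thresholds here are illustrative assumptions.

def should_roam(current_rssi_dbm, candidate_rssi_dbm, hysteresis_db):
    """Roam only if the candidate AP is better by at least the hysteresis
    margin. A large margin models a 'sticky' client that clings to a
    degraded link; a small margin models an aggressive roamer."""
    return candidate_rssi_dbm >= current_rssi_dbm + hysteresis_db

# Identical radio conditions, two client designs:
current, candidate = -78, -70   # current link degraded, better AP nearby
print(should_roam(current, candidate, hysteresis_db=12))  # sticky: stays
print(should_roam(current, candidate, hysteresis_db=3))   # aggressive: roams
```

The sticky client here stays on a link 8 dB worse than its alternative, dragging down airtime for every other device on the channel, while the aggressive client roams and accepts the brief interruption.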
Understanding how flagship products will behave in a Wi-Fi deployment is critical to tuning the network so it can maximize the high value users' quality of experience. Reading the RF signal strength on a laptop or just looking at a phone's signal level periodically is simply insufficient to facilitate this tuning.
Core Technical Principles of Next Generation Practices
In essence, next-generation practice involves moving beyond the site survey to the site assessment. A site assessment can be run in a similar amount of time as a site survey, but it provides a much more comprehensive view of a network's ability to deliver customer satisfaction, along with a more powerful set of metrics to serve as the deployment sign-off criteria.
The major principle of site assessment is to use application traffic and measure the customer experience directly, rather than inferring it from signal strength as a site survey does. This approach detects a much broader range of deployment issues, including misconfigured network elements, improperly installed APs with marginal performance, network problems caused by client behaviors, and noise. In short, any issue that degrades customer experience can be detected by measuring the actual customer experience.
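A minimal sketch of that principle: grade each test location on the application-level metrics users actually experience rather than on signal bars. The metric names and pass/fail thresholds below are assumptions chosen for illustration, not part of any standard assessment methodology:

```python
# Hedged sketch of the site-assessment idea: pass or fail a location on
# measured application performance, not signal strength. The thresholds
# below are illustrative assumptions only.

def assess_location(throughput_mbps, latency_ms, loss_pct):
    """Return (status, issues) based on user-visible metrics."""
    issues = []
    if throughput_mbps < 2.0:
        issues.append("throughput below video-streaming floor")
    if latency_ms > 150:
        issues.append("latency too high for voice")
    if loss_pct > 1.0:
        issues.append("packet loss degrades real-time media")
    return ("PASS" if not issues else "FAIL", issues)

# A location with strong signal bars can still fail once the network is
# loaded with real application traffic:
print(assess_location(throughput_mbps=0.8, latency_ms=220, loss_pct=2.5))
```

A site survey would approve this location on signal strength alone; the assessment fails it on every metric a customer would actually notice.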
Equally important to the measurement technique is the ability to isolate the source of identified issues to the client or to the network. Service providers can easily address network-related issues once they are identified, using existing practices. Somewhat surprisingly, some significant client issues can also be addressed through alternative network configurations. This ability to take action is why it is important to know which client devices matter most to the business and which specific client behaviors are degrading the user experience. Once these are known, alternative configurations can be tested to deliver optimized performance to the most valuable users.
New Cellular Offload Practices = Increased Revenues