In our March 2006 feature, "Broadband to Go," we performed a series of tests on Verizon's BroadbandAccess EvDO service in order to gauge the advantages of this cellular data network over other forms of wired and wireless broadband Internet service. This article documents the types of tests we performed and explains the rationale for each.
In most tests that involve radio transmissions, we work from a location as free from interference as possible. But a pristine testing environment would not have reflected the conditions we wanted for our examination of the EvDO service, so in this case we asked our testers to evaluate the service rigorously without seeking out locations entirely free of radio frequency interference (RFI). Because we tested the service in an uncontrolled environment--out in the real world--the results cannot be considered scientifically repeatable.
As this story went to press, Cingular contacted us to let us know that its HSDPA service was just coming online. We obtained a single Cingular HSDPA card, made by Sierra Wireless; but because of time constraints, we could test it only in San Francisco. For this reason, we did not report the details of our Cingular HSDPA tests in the story. Our overall impression of Cingular's competing service was generally favorable, and results for it were roughly consistent with those for the Verizon EvDO service in San Francisco.
Where We Tested
Our testers included the story's author, Christopher Null; three PC World editors (two in San Francisco and one in the Boston area); and four freelance testers (two living in rural areas and two residing in large urban centers). They repeatedly performed a series of seven tests devised in consultation with analysts from the PC World Test Center. We chose testers in part based on their geographic location because, in our experience, the quality of wireless service can vary greatly from city to city. We wanted to represent a broad cross-section of the continental United States in the tests. Testers were located in San Francisco; Washington, D.C.; Decatur, Georgia; Wentzville, Missouri (outside St. Louis); Portland and Eugene, Oregon; and West Roxbury, Massachusetts (a suburb of Boston).
The tests were designed to evaluate various aspects of a wireless Internet connection, and the testers performed the series of tests in several physical locations, including:
- At home (indoors)
- Inside a nearby hotel's conference center, preferably in a location without windows
- At a cafe that offers its customers Wi-Fi access
- Outside in a park or similar open space
- On moving transit, such as a car, bus, train, or ferry
We asked testers to perform the tests in these locations to obtain a fuller picture of the quality of service under various normal conditions in which people might use the EvDO service. When possible, testers repeated tests in the same locations at different times of the day, to determine whether peak usage affected the quality of their connection. Our final scorecard reflects the average results for each test, based on all the results submitted by the testers.
The following sections describe in detail each test covered on the final scorecard. All tests were performed in each location listed above, in each tester's geographic area.
Ping test: A ping test is a standard network diagnostic that determines the time, in milliseconds, that elapses between when a single packet of data leaves the originating PC and when the originating PC receives an echo (response) back. We used a Windows batch file that was designed to ping several large Web sites and a DSL-connected test PC located at PC World's offices in San Francisco, recording the results to a text file that the testers submitted with their other results.
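The batch file simply logged raw ping output; the numbers still had to be pulled out of each tester's text file afterward. As a rough illustration only (the log excerpt and function name here are hypothetical, not part of our actual batch file), a few lines of Python could average the round-trip times in a Windows-style ping log:

```python
import re

def average_ping_ms(log_text: str) -> float:
    """Average the round-trip times in Windows-style ping output.

    Windows ping prints lines like
    'Reply from 203.0.113.5: bytes=32 time=120ms TTL=52';
    we collect each 'time=NNms' value and average it, ignoring timeouts.
    """
    times = [int(m) for m in re.findall(r"time[=<](\d+)\s*ms", log_text)]
    if not times:
        raise ValueError("no ping replies found in log")
    return sum(times) / len(times)

# Hypothetical excerpt from a tester's submitted log file:
sample_log = """\
Reply from 203.0.113.5: bytes=32 time=120ms TTL=52
Reply from 203.0.113.5: bytes=32 time=95ms TTL=52
Reply from 203.0.113.5: bytes=32 time=115ms TTL=52
Request timed out.
"""

print(average_ping_ms(sample_log))  # averages the three replies: 110.0
```

The "Request timed out." line is deliberately skipped: only lines carrying a `time=` (or sub-millisecond `time<`) value count toward the average, which matches how we read the logs ourselves.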
File uploads/downloads test: Testers visited the Speakeasy Speed Test Web page and ran the speed test on two different servers--one located in the city closest to their geographical location, and one located in Seattle.
Shoutcast audio test: Testers used Winamp media player software to connect to at least two Internet radio stations that use Shoutcast streaming technology and noted any audio quality problems, such as static, skips, lost packets, or repeated or extended buffering.
Streaming video test: After listening to Internet radio, testers launched Windows Media Player and watched a live stream from NASA TV for up to a minute. They were instructed to record details about any audio or video quality problems, such as static, skips, lost packets, lost video frames, a jagged image, or repeated or extended buffering.
Web browsing test: In what turned out to be the test that produced the most widely varying results, testers visited the home pages of four major Web sites and, using a stopwatch, timed how many seconds each page took to load. Testers also noted whether graphics were slow to appear or pages rendered incorrectly.
Instant messaging test: Testers used their preferred instant messaging client application to connect to their service and then send and receive a few instant messages.
VoIP test: Testers made a Skype PC-to-PC (Voice-over-IP) phone call from each location to one of our editors and had a brief voice chat. Besides making basic voice-quality evaluations, testers reported on how long it took to connect and whether the call disconnected during the conversation.
In addition to these tests, we evaluated dialing (not included in our final scorecard), as follows: The EvDO service requires users to install "dialing" software on their computers, which they must run in order to connect to the service. EvDO thus works much like a wired DSL connection that requires the user to run PPPoE dialer software. Testers noted how quickly the service "dialed up" and took detailed notes relating to (1) occasions when the service disconnected itself during the tests, and (2) whether dialing took an inordinately long time in certain locations or under certain conditions.