AT&T Roars Back in PCWorld’s Second 3G Wireless Performance Test

How We Do the Testing

For our tests, we chose cities that broadly represent the population density, socioeconomic statuses, physical terrain, foliage, and building construction found in medium to large U.S. cities. Our testing cities include Baltimore, Boston, Chicago, Denver, New Orleans, New York City, Orlando, Phoenix, Portland, San Diego, San Francisco, San Jose, and Seattle.

We measured the 3G networks in each of our 13 cities from 20 testing locations arranged in a grid over the center of the city. These locations are roughly 2 miles apart, allowing us to measure service levels among and between numerous cell towers. Overall, we performed more than 51,000 tests in December 2009 and January 2010.

At each testing location, we subjected the networks to industry-standard network stress testing using laptops and to Internet-based testing using smartphones.

Our laptop-based tests use a direct TCP connection to the network to test the network's capacity--that is, the speed and performance that the network is capable of delivering to subscribers. Using the Ixia Chariot 4.2 testing tool running on a laptop PC, we tested both the speed and the reliability of the network.

To measure download speed, Chariot requests a number of large, uncompressible files from a dedicated server in the network; it then measures the speed of each transfer during a 1-minute period. To measure upload speed, Chariot sends a number of files from the Chariot client on the laptop to the server on the network, again timing each transfer during a 1-minute period. We report the average of all of these transfers at each location as the location average. Then we average all tested locations to obtain an average city performance.
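To illustrate that two-step averaging, here is a minimal sketch in Python. The sample throughput figures and variable names are ours, purely for illustration; they are not PCWorld's tooling or actual test data.

```python
from statistics import mean

# Each inner list holds the measured throughputs (in kbps) of the
# individual transfers recorded during the 1-minute test at one location.
transfers_by_location = [
    [1320, 1480, 1405],   # location 1
    [980, 1010, 1100],    # location 2
    [1550, 1600, 1525],   # location 3
]

# Step 1: average all transfers at each location -> the "location average".
location_averages = [mean(transfers) for transfers in transfers_by_location]

# Step 2: average the location averages -> the city performance figure.
city_average = mean(location_averages)

print(f"Location averages (kbps): {[round(x) for x in location_averages]}")
print(f"City average (kbps): {round(city_average)}")
```

In practice, each of the 20 locations in a city contributes one location average, and those 20 figures are averaged into the city number we report.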

We also assign a reliability score to each test. If during a test our client device cannot connect to the network, or if the network drops the connection, or if the throughput speed is unacceptably slow (less than 75 kbps), we label that testing location "low quality." We then report the percentage of testing locations in a given city that are of good quality. Thus, if we successfully establish an uninterrupted connection of reasonable speed at 19 of our 20 testing locations for a given network, we award that network a reliability score of 95 percent for that city.
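The scoring itself is simple arithmetic. The sketch below, with made-up per-location results, shows the idea; the 75-kbps floor comes from our methodology, but the sample data and helper names are illustrative only.

```python
MIN_ACCEPTABLE_KBPS = 75  # below this, a location is labeled "low quality"

# One entry per testing location:
# (connected successfully, connection held, measured throughput in kbps)
locations = [
    (True, True, 1410),
    (True, True, 890),
    (True, False, 1200),   # dropped connection -> low quality
    (True, True, 60),      # too slow -> low quality
    (False, False, 0),     # never connected -> low quality
]

def is_good_quality(connected: bool, held: bool, kbps: float) -> bool:
    """A location counts as good quality only if the client connected,
    the connection never dropped, and throughput was at least 75 kbps."""
    return connected and held and kbps >= MIN_ACCEPTABLE_KBPS

good = sum(is_good_quality(*loc) for loc in locations)
reliability = 100 * good / len(locations)
print(f"Reliability score: {reliability:.0f} percent")  # 2 of 5 -> 40 percent
```

With our real grid of 20 locations per city, 19 good-quality locations works out to the 95 percent score described above.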

Our smartphone-based tests approximate the real-world connection between specific smartphones and specific networks. We perform the smartphone tests from the same locations that we use for the laptop tests, applying an Internet-based performance test designed by Xtreme Labs. The test sends a large test file back and forth between the smartphone and a network server, and then measures the speeds at which the data is transferred. We perform three upload tests and three download tests at each testing location.

We tested all 13 cities during December 2009 and January 2010, using the same locations, methodology, and personnel we used to test those cities in our April 2009 tests. Maintaining a consistent methodology allowed us to compare the performance of the networks across an interval of eight months and look for possible evolutionary changes.

We did not exhaustively survey every city. We tested from stationary locations only (no drive tests); we did not survey indoor performance; and we did not measure voice service.

Consider the Network

U.S. consumers pay a lot for the convenience of mobile communications and computing. In 2009, Americans spent about $4.8 billion on wireless devices and service, and analysts project that they will spend an even bigger chunk of their paychecks on wireless during 2010.

Regardless of the type of connected device you use, you'll eventually pay more for the wireless service that connects the device than you will for the device itself. So choosing a wireless provider is a big decision--and an unwise choice can be a costly mistake.

We hope that our latest study arms you with some real-world information to help you pick the wireless carrier that's best for you.

CORRECTION: The original version of this story contained two mathematical errors. AT&T's 13-city average download speed increased from 812 kbps in our tests last spring to 1410 kbps in our recent tests; that's a 74 percent increase, not an 84 percent increase, as we wrote. Also, AT&T's 13-city average upload speed rose from 549 kbps in our tests last spring to 773 kbps in our recent tests; that's an increase of 41 percent, not 58 percent, as we wrote.
