How We Test
In each city, we tested from 20 locations situated in a grid over the center of the city. These locations are roughly 2 miles apart, allowing us to measure service levels across numerous cell towers. At each testing location, we subjected the networks to industry-standard stress-testing using laptops, and we put the networks through Internet-based testing using smartphones.
Our laptop-modem tests use a direct TCP connection to the network to test the network's capacity--that is, the speed and performance that the network is capable of delivering to subscribers. To connect the laptop to the various networks, we used the fastest USB modem available, as suggested by the carriers themselves. We used the LG VL600 4G USB modem to test Verizon, the ZTE WebConnect Rocket 2.0 USB modem to test T-Mobile, the Sierra Wireless 250U AirCard to test Sprint, and the Sierra Wireless USBConnect Shockwave to test AT&T. Using the Ixia Chariot 4.2 testing tool on our laptop PC, we tested both the speed and the latency of the network.
To measure download speed, Chariot requests a number of large, incompressible files from a server in the San Francisco Bay Area, then from another server in Northern Virginia. For each server, the software measures the speed of each transfer during a 1-minute period, and then averages the results.
To measure upload speed, Chariot sends a number of files from the Chariot client on the laptop to the local and distant network servers, again timing each transfer during a 1-minute period. We report the average of all of these transfers, from both the local and the distant server, as the result for that location.
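The per-location averaging described above can be sketched in a few lines of Python. This is an illustration of the arithmetic only, not Chariot's actual code; the server names and speed values are hypothetical.

```python
def location_average(transfer_speeds_mbps):
    """Average all timed transfers (local and distant) for one test location."""
    return sum(transfer_speeds_mbps) / len(transfer_speeds_mbps)

# Hypothetical per-transfer speeds in Mbps; downloads and uploads are
# averaged the same way, with every transfer from both servers
# contributing equally to the location's result.
local_server = [4.1, 3.8, 4.4]    # e.g., San Francisco Bay Area server
distant_server = [3.2, 3.5, 3.0]  # e.g., Northern Virginia server

print(round(location_average(local_server + distant_server), 2))  # prints 3.67
```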
During the speed tests, the Ixia testing software also measures latency, or the time it takes for a packet to move from the client laptop to the network servers and back again. This metric, expressed in milliseconds, can reveal delays or bottlenecks in the flow of data through the network, and can indicate how well real-time applications such as voice calling and video chatting--which require nearly instantaneous packet transfer to work smoothly--will perform on the service being tested.
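The round-trip idea behind the latency metric can be approximated with a simple probe: time how long it takes to complete a TCP handshake with a server and express the result in milliseconds. This is a minimal stand-in sketch, not Chariot's measurement method; the host and port in the usage comment are placeholders.

```python
import socket
import time

def tcp_round_trip_ms(host, port, timeout=5.0):
    """Rough round-trip latency: time a full TCP connection handshake.

    The handshake requires a packet to reach the server and a reply to
    come back, so the elapsed time approximates one round trip.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed * 1000.0  # convert seconds to milliseconds

# Example (placeholder server):
# print(tcp_round_trip_ms("example.com", 80))
```

A single handshake is a noisy sample, which is why a tool like Chariot samples latency continuously during the speed tests rather than relying on one probe.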
Our smartphone tests, which we run from the same locations as our laptop-modem tests, approximate the real-world connection between specific smartphones and specific networks. For the tests, we used AT&T's Apple iPhone 4, Sprint's HTC EVO 4G, T-Mobile's HTC G2, and Verizon's Motorola Droid 2.
On each phone we run the FCC-approved mobile-broadband performance test from Ookla. The test sends a large file back and forth between the smartphone and a network server, measuring the speed of each transfer. We perform three upload tests and three download tests at each testing location.
We tested all 13 cities during January 2011 and February 2011, using the same locations, methodology, and personnel we used to test those cities in our January 2010 tests. Maintaining a consistent methodology allows us to compare the performance of the networks over time and to look for evolutionary changes.
Our research was not comprehensive: we did not exhaustively survey each city, we tested from stationary locations only, we did not test indoor performance, and we did not measure voice service.
For quick tips on how you can try evaluating your phone's network, see "Test Your Smartphone Data Speed."