Take a drive on Highway 101 between Silicon Valley and San Francisco these days and you might see one of Google’s driverless cars in the lane next to you. The vehicles are one of the most visible signs of the increasing amount of research going on in the area related to automated driving technology.
To people like Sergey Brin, co-founder of Google and head of the Google X division that is researching the cars, the technology holds the potential to transform urban centers, reduce the amount of land given over to parking lots and cut down on accidents.
Cars will be able to drop people off at work in pleasant green spaces and drive themselves away to distant parking lots, where they’ll park efficiently in compact spaces, Brin said last September. He predicted widespread use of autonomous vehicles could come as soon as 2018.
Cars from Nissan, Ford and BMW already have parking assist functions that require just a little help from drivers to perfectly slide into a space. And last year, Nissan showed a prototype car that can find itself a space and park—without a driver even being inside.
The eventual goal is fully automated freeways. Cars would travel in long lines, much closer together than they are now, because they would be controlled by computers. That would reduce congestion and burn less fuel, because there would be less of the constant speeding up and slowing down. Drivers would have time to read a newspaper or catch up on email as their cars shuttle them to work.
“On a normally operating highway, cars take up a tiny fraction of the space,” said Brin. “Mostly, it’s all air between you and the car in front of you, to the sides of you. Self-driving cars can chain together and use the highways far more efficiently.”
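Brin's point about wasted space can be made concrete with some rough arithmetic. The sketch below uses illustrative numbers (a 2-second human following distance versus a much tighter computer-controlled gap); none of the figures come from Google.

```python
# Rough lane-capacity comparison (illustrative numbers, not from the article).
def vehicles_per_hour(speed_kmh, gap_m, car_length_m=4.5):
    """One lane's hourly throughput if every car keeps the same gap."""
    speed_m_per_hour = speed_kmh * 1000.0
    return speed_m_per_hour / (gap_m + car_length_m)

# Human drivers at 100 km/h keeping roughly a 2-second gap (~55 m).
human = vehicles_per_hour(100, gap_m=55)

# Hypothetical chained self-driving cars holding a 10 m gap.
platoon = vehicles_per_hour(100, gap_m=10)
```

Under these assumptions, the platooning lane carries about four times as many cars, which is the kind of efficiency gain Brin is describing.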
While the dream is appealing, it may also be farther down the road than Brin suggests. Autonomous cars today are limited to roads that have been mapped in advance for each vehicle, and the required sensor technology can cost more than the vehicle itself.
The drive to autonomy
The roots of today’s autonomous driving research in the U.S. can be traced to 2004 and the DARPA Grand Challenge. Set up by the Defense Advanced Research Projects Agency (DARPA), a research arm of the U.S. Department of Defense, the event challenged teams to design and race driverless cars over a 240-kilometer (149-mile) course in the Mojave Desert.
As a race, it was a failure. The farthest any car got was just under 12 kilometers. But as a kick-starter for development and innovation, it was a huge success.
Among the teams that built on their race experience was one led by brothers David and Bruce Hall. In 2004, the stereoscopic camera system they used for navigation allowed their converted Toyota pickup truck to travel 10 kilometers and take third place, though they later scrapped the camera system in favor of a prototype laser imaging system.
Using a bank of lasers on a rotating drum on the roof of the car, the system was able to bounce light off most objects in the vicinity. By measuring the strength and delay of the reflected beams, much as radar does with radio waves, a computer could build up an accurate 3D map of the surroundings.
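The underlying principle is simple time-of-flight geometry. The minimal sketch below shows the idea for a single beam; a real sensor fires dozens of lasers while spinning, and the function names here are hypothetical, not Velodyne's API.

```python
# Illustrative time-of-flight math for one LIDAR beam (hypothetical names).
import math

C = 299_792_458.0  # speed of light in meters per second

def tof_distance(round_trip_seconds):
    """Distance to the target: the pulse travels out and back, so halve it."""
    return C * round_trip_seconds / 2.0

def to_cartesian(distance_m, azimuth_deg, elevation_deg):
    """Turn one range reading plus the beam's angles into a 3D point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# A pulse returning after about 200 nanoseconds hit something ~30 m away.
d = tof_distance(200e-9)
```

Accumulating millions of such points per second, each tagged with the laser's rotation angle, is what lets the computer assemble its 3D map.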
The LIDAR (light detection and ranging) sensor took the car 40 kilometers before a steering control board failure ended its race. The team came in 11th place out of 23 finalists, and the sensor drew a lot of attention.
“By the third challenge, everyone wanted it,” said David Hall in an interview at the headquarters of his company, Velodyne, in Morgan Hill, south of Silicon Valley.
A year later, when Velodyne offered a more compact version of the 64-laser LIDAR unit, it quickly started receiving orders from other DARPA Grand Challenge teams. In 2007, the next year the event was held, five of the six finishing teams were using Velodyne LIDAR, including the first- and second-place cars.
One of those early LIDAR prototypes is today in the Smithsonian’s National Museum of American History, and Velodyne has gone on to produce hundreds of LIDAR units for commercial use.
Perhaps the most visible use is atop Google’s driverless cars. At any time there are about a dozen of the vehicles on the roads of Northern California. They are mostly modified Lexus RX450h cars, with a few Toyota Prius vehicles, each with one of Velodyne’s $80,000 LIDAR sensors.
Google says its main goal is to make driving safer, more enjoyable and more efficient.
“Over 1.2 million people are killed in traffic accidents worldwide every year, and we think self-driving technology can help significantly reduce that number,” the company said via email.
But while driverless cars are slowly becoming more common on California roads, they’re still at an early stage of development. Nothing demonstrates this better than the amount of preparation required before a self-driving car can hit the streets.
The LIDAR sensor on the roof pulls in thousands of points of data every second to produce an accurate 3D model of the car’s surroundings, but that isn’t enough for the car to reliably drive itself. Before that can happen, a Google car with a human being at the wheel must first drive the streets, mapping the surroundings.
“By mapping things like lane markers and traffic signs, the software in the car becomes familiar with the environment and its characteristics in advance,” Google said. “When we later drive a route without driver assistance, these same cameras, laser sensors and radars help determine where other cars are and how fast they are moving. The software controls acceleration and deceleration, and mounted cameras read and interpret traffic lights and other signs.”
To facilitate development, California recently became one of a handful of U.S. states to legally recognize driverless cars, joining Nevada and Florida.
“Today, we are looking at science fiction becoming tomorrow’s reality,” California Governor Jerry Brown said when he signed a new law recognizing driverless vehicles. Before, they weren’t mentioned in the law, leaving them in a legal gray area, neither banned nor officially regulated.
The law was signed during a ceremony at Google’s headquarters in Mountain View. The company lobbied hard for the law, so its passage was something of a victory for Google.
The state’s Department of Motor Vehicles has now been charged with developing regulations for the licensing and testing of driverless cars on state roads. It’s expected to also address the question of liability: Who’s responsible if a driverless car is involved in an accident? The car maker, the software developer or the human behind the wheel (who perhaps wasn’t controlling the vehicle)?
The road ahead
Perhaps surprisingly, Velodyne’s Hall is cool on the prospects for fully driverless cars.
“Getting [the technology] finished is not trivial,” he said. “You have to program for every scenario.”
He has a point. While an engineer can program a car to avoid pedestrians, decelerate safely if a tire blows or recover from a skid on black ice, what about the countless unpredictable situations drivers face every day? In an emergency, he argues, there’s probably not enough time for a human to take control of the car, assess the situation and react safely.
“The idea that a human can get back into the loop is unrealistic,” he said.
Instead, Hall sees technologies developed from LIDAR test cars being applied to production vehicles in stages.
That’s already happening. Auto makers have been focusing on high-tech safety systems for the past few years. Japan’s Nissan, for example, introduced a lane-keeping system a few years ago, and some of its cars will now automatically brake if they sense an obstacle in the road ahead—but both features still require a human to be in control of the car.
Nissan recently opened a research center in Silicon Valley in the hopes of spurring collaboration with local high-tech companies. It’s one of several car makers that have decided to set up shop among the startups and high-tech leaders of the region, rather than try to attract engineers and talent to Detroit, the traditional home of the U.S. auto industry. The others include Audi, General Motors and Ford.
Not far from all these R&D centers, on the leafy and spacious grounds of Stanford University, there is some even more advanced research going on.
Stanford’s work in the field, like Velodyne’s, dates back to early success in the DARPA Grand Challenge. Today, at the Volkswagen-sponsored Automotive Innovation Lab, students are working on cars that can handle emergencies entirely on their own, removing the need for a human behind the wheel as a backup in case something goes wrong.
The most famous of these is “Shelley,” an Audi TTS that’s been retrofitted for automatic control.
Its front grille hides a LIDAR sensor, and the roof bristles with antennas. The car uses GPS to determine within a few centimeters exactly where it is on its test track, the Thunderhill Raceway north of Sacramento, and it tears around the raceway with nothing more than a computer in control.
“When the car is coming to a track, the algorithms that I create optimize the racing line to go as fast as possible around the track,” said Paul Theodosis, a member of the team that works on Shelley. The software shares similarities with its Google cousins, calculating paths around obstacles and fine-tuning the trajectory for best performance in the conditions.
“For our research, we mainly consider safety as our primary goal, and we research that on the racetrack,” he said. “If we can teach a car to race at the limits continually, that technology could someday be used in production systems on the road for when a car runs into an accident situation.”
The car can already make it around the track in about two-and-a-half minutes—a respectable time, but no record.
“We want to push the car until one day we beat a professional driver on the track,” he said.
Martyn Williams produces technology news and product reviews in text and video for PC World, Macworld, and TechHive from his home outside Washington, D.C. He previously worked for IDG News Service as a correspondent in San Francisco and Tokyo and has reported on technology news from across Asia and Europe.