Prepare for the Self-Driving Car
Here in the US, the National Highway Traffic Safety Administration (NHTSA) has created five categories of autonomous cars. The most basic of these is level zero, which might include your vehicle if it doesn’t have a system like electronic stability control. Fully autonomous cars, which can complete their journeys with no human control beyond choosing the destination, are categorized as level four. While level four is still some way off, level three autonomous cars, which will be able to self-drive under certain conditions (say, an HOV lane during rush hour), are much closer than one might think.
A couple of weeks ago, Tesla wooed its fan base with the news that soon, its cars will be able to drive themselves. But the autonomous car may be one of the company's least innovative moves yet. Those who’ve been watching the industry closely will know that Mercedes, Volvo, Audi, and others have similar products waiting in the wings, ready to hit the streets as soon as the rules and regulations fall into place.
First steps
It all used to be so simple. A car was just a car: a mechanical contraption with an engine and wheels, controlled by a human being with a combination of pedals, levers, and a steering wheel. Vehicle-to-vehicle (V2V) communication meant using turn signals or perhaps gesticulating rudely out the window to indicate displeasure at being cut off in traffic. However, as semiconductors became cheaper, faster, and more rugged, they attracted the attention of the auto industry. Electronics began to infiltrate our cars, with fuel injection replacing carburetors in the name of performance and efficiency, for example, and anti-lock brakes (ABS) being added for safety.
By 1995, electronic stability control (ESC) systems had started to appear, with Mercedes-Benz leading the way on its flagship S-Class. Cars equipped with ESC constantly monitor the driver’s steering inputs and compare them to the direction the vehicle is actually heading. If those two start to diverge beyond certain limits (because the car is either under- or oversteering), ESC applies the brakes to individual wheels to bring things back under control. Stability control systems proved so effective at reducing both crashes and injuries that they became mandatory for any car sold in the US or EU by the end of 2011.
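To make that compare-and-correct logic concrete, here is a minimal sketch in Python. It illustrates the idea only; the function name, the threshold, and the choice of which wheel to brake are assumptions for the example, not any manufacturer’s actual algorithm.

def esc_correction(commanded_yaw, measured_yaw, threshold=0.1):
    """Compare where the driver is steering (commanded yaw rate, rad/s) with
    how the car is actually rotating, and pick a corrective brake action."""
    error = commanded_yaw - measured_yaw
    if abs(error) < threshold:
        return None  # driver and car agree; do nothing
    if abs(commanded_yaw) > abs(measured_yaw):
        # Understeer: the car is turning less than asked.
        # Braking the inside rear wheel helps rotate it into the turn.
        return "brake inside rear wheel"
    # Oversteer: the car is rotating more than asked.
    # Braking the outside front wheel damps the excess rotation.
    return "brake outside front wheel"

In a real car this decision runs many times per second, alongside throttle intervention, but the core loop is exactly this comparison of intended versus actual direction.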
The mandate in effect made ABS and traction control standard features, too. So any car one might buy today is not only constantly monitoring both where the driver is steering and where the car is actually heading, but also whether an individual wheel is spinning too much (because of a loss of grip) or not at all (locked by a brake). On their own, though, these safety aids aren’t sufficient for a self-driving car; they only take control during emergencies to slow the vehicle. But with the advent of drive-by-wire throttles and steering—something we explored recently—all that remains is for the vehicle to be able to ‘see’ the environment around it and to have a ‘brain’ fast enough to make sense of that data and control where it goes. No biggie.
Eyes and ears
As it turns out, most of the technology needed for a car to sense the world around it already exists. Adaptive cruise control—as fitted to the Audi A8, for example—uses a mix of optical, radar, and ultrasonic sensors to keep a car from veering out of its lane and, by constantly checking the range to other vehicles, from hitting any of them. Image recognition software will even detect speed limits on road signs and alert the driver. All of this would have seemed like science fiction just a decade ago, but it really is only the beginning. Quite soon, those sensors will do more than just tell your car what’s around it, thanks to what’s known as V2V.
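Boiled down, adaptive cruise control is a feedback loop on the gap to the car ahead. The toy Python controller below shows the shape of that loop; the two-second time gap, the gain, and the input names are assumptions for illustration, not values from Audi’s system.

def acc_speed_command(own_speed, lead_range, lead_speed, time_gap=2.0, gain=0.5):
    """Nudge the target speed (m/s) to hold roughly `time_gap` seconds behind
    the vehicle ahead, given a radar range (m) and the lead car's speed (m/s)."""
    desired_gap = own_speed * time_gap        # the gap we'd like, in meters
    gap_error = lead_range - desired_gap      # positive means we're too far back
    closing_speed = own_speed - lead_speed    # positive means we're catching up
    return own_speed + gain * (gap_error / time_gap - closing_speed)

print(acc_speed_command(own_speed=30.0, lead_range=45.0, lead_speed=28.0))
# -> 25.25: we're too close and still catching up, so the target speed drops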
As Ars' Sean Gallagher found out earlier this year, V2V-enabled cars can communicate with each other, warning of upcoming road hazards. V2V is being built atop 802.11p, a Wi-Fi standard that uses 75 MHz of spectrum centered on 5.9 GHz. 802.11p allows near-instant link setup and can broadcast messages without establishing a network connection first, both of which are extremely desirable when thinking about the safety aspects of V2V. After all, it’s no good telling another car about a road hazard if you need to spend precious seconds handshaking. V2V-enabled cars will be able to quite literally see around corners, since the technology doesn’t require line of sight.
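The property worth dwelling on is that last one: a safety message goes out without any association or handshake first. The Python sketch below mimics that fire-and-forget behavior with an ordinary UDP broadcast on regular hardware; it is an analogy for the connectionless messaging 802.11p provides, not the actual DSRC/WAVE protocol stack, and the message fields and port number are made up for the example.

import json
import socket
import time

def broadcast_hazard(hazard_type, lat, lon, port=50000):
    """Send a one-shot hazard warning to anyone listening on the local network."""
    message = json.dumps({
        "type": hazard_type,          # e.g. "ice" or "stopped_vehicle"
        "lat": lat,
        "lon": lon,
        "timestamp": time.time(),
    }).encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(message, ("255.255.255.255", port))   # no handshake, no connection
    sock.close()

broadcast_hazard("ice", 52.3745, 9.7386)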
The cloud
But wait, there’s more, and it’s coming from the cloud. More and more cars are coming equipped with LTE data connections, mainly in response to consumer demand for streaming media services. Passenger entertainment may seem trivial to some, but persistent data connections also enable in-car navigation systems to get a lot smarter. I’m probably not alone, for example, in ditching either a standalone or built-in GPS unit in favor of a smartphone app like Google Maps or Waze. And if you’re like me, you probably did it for the same reason: the smartphone apps are able to provide layers of real-time data (like traffic) on top of the cartography. Data-enabled cars mean we can ditch the smartphone holders and go back to using that onboard navigation system. That navigation data will also allow the car to know where it is in the world and, to a certain extent, what it’s likely to encounter.
That kind of map data is sufficiently informative for human drivers to navigate by, but even combined with GPS it’s not going to be accurate enough for a self-driving car (civilian GPS accuracy only has a 95-percent confidence interval of 7.8 meters). No, that’s going to require an extremely high-resolution map, and that map will need to stay accurate, which means constant updating. Writing for Slate, Lee Gomes identified this as a problem for Google, but other companies, particularly Nokia, think they might have this one licked.
Nokia’s HERE platform begins by mapping streets in the conventional 21st-century way, with a small fleet of sensor- and GPS-equipped mapping vehicles, which it uses to create an HD map that’s machine (but not human) readable. But in addition to providing location data to HERE-enabled cars, Nokia will leverage those cars to continually update the map in near-real time. They will send back sensor data about the road—things like the position of lane markers, accurate to a few centimeters—resulting in a map that’s always up to date.
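One way to picture the crowdsourcing step is as simple aggregation: publish a correction to the map only once enough independent cars have reported the same thing. The Python sketch below uses a median over reported lane-marker offsets to shrug off the occasional bad GPS fix; the data model and thresholds are assumptions for illustration, not details of Nokia’s actual pipeline.

from collections import defaultdict
from statistics import median

# segment_id -> lateral offsets (meters) of a lane marker, as reported by passing cars
observations = defaultdict(list)

def report(segment_id, lateral_offset_m):
    observations[segment_id].append(lateral_offset_m)

def updated_marker_position(segment_id, min_reports=5):
    """Publish an update only when enough cars agree; the median discards outliers."""
    samples = observations[segment_id]
    if len(samples) < min_reports:
        return None
    return median(samples)

for offset in (1.74, 1.76, 1.75, 2.90, 1.73, 1.76):   # one outlier in the batch
    report("segment-0042", offset)
print(updated_marker_position("segment-0042"))         # -> 1.755 meters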
Nokia also has other plans for using crowdsourced data to improve the self-driving car. We recently spoke with HERE's head of Automotive Cloud Services, Vladimir Boroditsky, who told Ars the company plans to use crowdsourced data from connected cars to create data sets of driving behavior that the company can use to train car software how to drive without terrifying or aggravating humans along for the ride. Compared to the alternative, it certainly sounds like an efficient solution.
How far off are we talking?
As one might expect, car makers have been working with tech industry stalwarts like Qualcomm and Nvidia to build the kinds of integrated systems that allow a car to make sense of its environment and then act on it. Kanwalinder Singh, a senior vice president at Qualcomm, told Ars that’s an area where his company and its Snapdragon processor excel. “As more sensors get added, you need massive sensor fusion. It’s a highly intensive problem as the data needs to be crunched very rapidly.” Meanwhile, Nvidia’s Tegra K1 is the brains behind both Audi’s and Tesla’s self-driving vehicles.
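Stripped of the hardware, “sensor fusion” means combining noisy readings from different sensors into one estimate that’s better than either alone. The minimal Python example below fuses a radar range and a camera range by inverse-variance weighting; real systems fuse far more signals with far more sophisticated filters, and the numbers here are invented purely to show the idea.

def fuse(radar_range, radar_var, camera_range, camera_var):
    """Weight each measurement by its confidence (inverse variance) and combine."""
    w_radar = 1.0 / radar_var
    w_camera = 1.0 / camera_var
    fused = (w_radar * radar_range + w_camera * camera_range) / (w_radar + w_camera)
    fused_var = 1.0 / (w_radar + w_camera)   # the combined estimate is more certain than either input
    return fused, fused_var

print(fuse(radar_range=42.0, radar_var=0.25, camera_range=43.0, camera_var=1.0))
# -> (42.2, 0.2): the answer leans toward the more precise radar reading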
If all of this is starting to sound like vaporware, think again. Mercedes-Benz has been testing Bertha, a self-driving S-Class, on the roads of California for some months now. Meanwhile over in Sweden, Volvo has been demoing a self-driving S60 sedan. Then there’s Tesla, which showed off its self-driving autopilot feature earlier this month, along with the information that every Tesla Model S on the road already has the necessary hardware on board.
Perhaps predictably, our favorite self-driving car demonstration thus far involved a race track. Less than two weeks ago, an Audi RS7 entertained the crowds at the final round of the DTM (think German NASCAR) with hot laps of the Hockenheim track. The car lapped the track in just over two minutes, hitting a top speed of 149 mph without a human in control.
Finally, Google has also shown the world its idea of a self-driving car, though its design is radically different, lacking any driver controls like a steering wheel or pedals. It’s interesting to note that Google’s car still requires a roof-mounted sensor pod. By comparison, those Teslas, Audis, Mercedes, and Volvos look almost indistinguishable from their less-intelligent siblings.
All of the cars described above are capable of driving to NHTSA’s level three. The agency defines level three autonomous cars as vehicles that “enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control.” In contrast to a car fitted with adaptive cruise control (level two), the driver won’t need to constantly monitor road conditions. However, when Forbes went for a ride in a self-driving Audi, it reported that the car monitored the driver’s eyes, sounding alerts and then coming to a halt if they were closed for too long. Don’t expect to sleep through your commute just yet.
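That eyes-closed behavior amounts to a simple escalation policy: tolerate a blink, warn after a short lapse, stop the car after a long one. The small Python sketch below shows the shape of such a policy; the two thresholds are invented for illustration, since the article doesn’t report Audi’s actual values.

def monitor_driver(eyes_closed_seconds):
    """Escalate from doing nothing, to an alert, to a controlled stop."""
    if eyes_closed_seconds < 2.0:
        return "ok"                        # a normal blink or glance
    if eyes_closed_seconds < 5.0:
        return "sound alert"               # wake the driver up
    return "bring car to controlled stop"  # driver unresponsive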
According to Anders Eugensson, Volvo’s director of government affairs, the technology is fairly mature, and the Swedish company plans to have 100 test cars on the road in 2017. “It is more a matter of how to apply the technologies and properly link them up with the infrastructure. What is important is also to understand how this is working together with non self-driving vehicles and the acceptances of other road users. They also have to match the expectations of the end customers.” Level three self-driving Volvos should be on sale early in the next decade, he said.
Audi is even more optimistic, telling Ars it expects to have level 3 autonomous cars on sale in the US by 2017. Brad Stertz, an Audi spokesman, said the company’s confidence was down to computing power. “We announced our centralized driver assistance processor or zFAS would employ the NVIDIA K1 supercomputer on a chip announced at CES 2014. Our piloted driving pre-development work is being done in parallel with the development of the 192-core K1 chip to bring this technology out sooner.”
Is that legal?
Both of those predictions came with a big regulatory caveat. Cars won’t be driving themselves anywhere until it’s legal for them to do so. This, rather than the technology, will really determine precisely when you can go out and buy a car that drives itself. Eugensson told Ars that the liability issues have to be acceptable to customers, and that lawmakers will have to cooperate to avoid a regulatory patchwork. Stertz also pointed out the need for regulatory consistency, but he raised another issue. “One problem with regulations is that they are often considered and drafted from the perspective of fully autonomous driving capabilities, and every innovator is still a long way from reaching that level of capability. The concern then is that laws are written in a highly restrictive way that addresses a far into the future state, while slowing progress on driver assistance technologies that are the foundation for fully automated driving.”
In the absence of either US-wide federal or Europe-wide EU regulations, individual states (in the US) and member nations (in the EU) have started the ball rolling. California started issuing licenses for driverless test cars earlier this year, and the UK intends to follow suit in January 2015. Interestingly, neither the EU nor California seems set to allow Google’s steering-wheel-free car onto the road any time soon. But even once the legal issues are worked out, it will still be quite a few years before completely autonomous door-to-door journeys become possible. Qualcomm’s Singh told Ars that we should expect dedicated highway lanes first. “[Self driving] is complex enough on a highway, but it increases in difficulty as the setting becomes more urban and congested,” he said. No one we talked with thought that self-driving cars would be ready to tackle a dense urban environment (say, an intersection in downtown Mumbai) for at least a decade. Rest assured, it’s a topic that automakers (and Cars Technica) will be revisiting frequently between now and then. But we’re well on our way down the road toward robot cars.