These days, talk about self-driving cars seems constant. Unlike many of the talkers, who are motivated mainly by product hype, Duke engineering professor Missy Cummings has an inside look at the technology. She spent over a decade as a fighter pilot and is now leading a study on pedestrian interactions with autonomous vehicles.
In the technology's current state, a human generally remains behind the wheel, or at least in the vehicle, during testing "just in case". In Cummings' view, for the cars to be truly ready for full autonomy, it cannot be assumed that a human will always be there to take over when needed.
Cummings believes the vehicle technology needs much improvement, but even more so the training of the humans "not" driving the vehicles. In the unlikely scenario, once the technology is ready, in which every single car on the road was autonomous at the same time, the odds might be different.
At a recent Senate hearing on autonomous vehicles, Cummings voiced her skepticism. She said:
“While I enthusiastically support the research, development, and testing of self-driving cars, as human limitations and the propensity for distraction are real threats on the road, I am decidedly less optimistic about what I perceive to be a rush to field systems that are absolutely not ready for widespread deployment, and certainly not ready for humans to be completely taken out of the driver’s seat.”
“The development of self-driving car technologies has led to important advances in automotive safety including lane departure prevention and crash avoidance systems. While such advances are necessary stepping stones towards fully capable self-driving cars, going from automated lane changing or automated parking to a car that can autonomously execute safe control under all possible driving conditions is a huge leap that companies are not ready to make.”
In an interview with Automotive News, Cummings explained further:
Context is important. If a traffic policeman is gesturing and a car can’t interpret the gesture, it could slow down and vibrate the seat and ask a human to take over. It’s not critical that a human take over in that case. If they don’t, the car can stop.
But if a car is going 65 mph and a car is having trouble deciding whether to get off the interstate, it can’t just say: “Three, two, one, now take over.” A car would need to say: “Click this button if you’re ready to take over,” and if [the driver doesn’t], the car will need to be able to come to a safe stop in some way.
Cummings added that there needs to be an understanding between car and human—as cringeworthy as it is to say that technology "understands"—of what each is capable of doing. As for steps as drastic as removing steering wheels from cars, Cummings thinks it's just "not going to happen as quickly as Google might want."
With all of the recent hype around autonomous vehicles, it's refreshing to hear from a credible source who has done the research and can shed some light on the reality of the technology.