Self-driving cars are being developed more rapidly than most experts thought possible. But even if they’re technically possible, can we be assured of their safety? Trust is a significant hurdle to adoption of what could be a major step forward in transportation, as our experts discuss.
- Dr. Nidhi Kalra, Senior Information Scientist, RAND Corp.
- Brandon Schoettle, Project Manager for Sustainable Worldwide Transportation, University of Michigan Transportation Research Institute
16-46 Self-Driving Cars
Reed Pence: Americans drive about three trillion miles per year, and like anything we spend a lot of time with, the relationship we have with our car can be complicated. Some people love to hit the road… but for many of us, driving is a daily chore—a source of stress and frustration that we’d rather do without. We may love the convenience of being able to go where we want, when we want… but cars that drive themselves sure sound good.
Kalra: I think people should anticipate that they’re coming. A few years ago this was the stuff of science fiction, and I think very very soon it’s going to become reality.
Pence: That’s Dr. Nidhi Kalra, a senior information scientist for the Rand Corporation. She says self-driving cars are coming faster than many experts anticipated. Their cameras and radar are rapidly getting better at sensing and recognizing everything around them—obviously a good thing if someone’s putting their life in the car’s hands, so to speak. Public perception is a major hurdle. Most people think they’re pretty good drivers themselves… and doubt that a computer can do it more safely than they can.
Kalra: The question of confidence is a complex one. What you’re really asking is: how safe are autonomous vehicles? Is that safe enough? And the trouble is, we have almost no way right now of really knowing how safe they are in a plausible time frame. The number of miles that you have to drive to establish with some statistical confidence how safe they are is astronomical. We would have to drive them the equivalent of driving to Neptune and back. So it’s billions upon billions of miles. What that means is that it involves some trust and evidence that developers of this technology have done due diligence, but we may not know for sure. So I think the technology needs to be viewed with some healthy skepticism.
Schoettle: The technology works, but it’s not a complete version of you sitting there reading a book and working on your laptop or whatever while your car drives you where you need to go. Not yet, anyway.
Pence: Tesla’s autopilot system comes close on the open road… but Brandon Schoettle says the company doesn’t really want drivers to take their hands off the wheel and quit paying attention. Schoettle is project manager for sustainable worldwide transportation at the University of Michigan Transportation Research Institute. He says inattention is what led to the one known crash death in an autonomous Tesla. But he believes that eventually, people will see that computers can be better drivers than we are.
Schoettle: I think for most people, it will take seeing these become commonplace, getting experience with them, and hopefully positive experiences where people see that they’re saving lives and reducing crashes. And this is what’s discussed within the industry, too. If you can take the 30,000 to 40,000 deaths a year that occur on U.S. roads and reduce them down to three or four hundred, are people still going to be happy about that massive reduction, or are they going to be overly concerned about the 300 to 400 deaths that occur because computers killed people driving these cars around? Hopefully people will understand the value in getting down to something like 300 or 400 fatalities a year and won’t focus on the fact that a computer was responsible for them. Because after all, the computers that are responsible are doing what they’re told to do by their human programmers.
Pence: But sometimes, computers get confused. Right now, there are still some things that human drivers do better.
Schoettle: Pattern recognition involves making sense of sometimes visually complicated scenes. An example most people can relate to is the thing known as CAPTCHAs you see on the internet. To keep computers from automatically signing up for e-mail lists or registering accounts, sites ask you to type a word with a squiggly line through it, or decipher a fuzzy picture of an address plate on a house. You as a human generally have a relatively easy time figuring out what that says (of course, they sometimes give you bad ones), whereas a computer vision system, which would be the type of thing analyzing that image, still has a lot of difficulty with it. You encounter those types of things on the road, too. We show a few in our report, illustrated well with photographs. One’s a downed power line.
Pence: Humans can discern the difference between a downed power line and an expansion joint in the road. Sometimes computers can’t. Self-driving cars also have problems with rain and reflections from water on the road. They may be unable to tell the difference between a puddle and a flood. In fact, bad weather in general is one of the biggest problems for autonomous cars.
Schoettle: Yeah, and this has to do with some basic technical issues. Cameras and other sensing hardware have a hard time seeing through snow, of course, as do people. This somewhat comes back to the visual pattern recognition issue. People are very good at guessing: you often have a pretty good idea where the road should be, and so you drive there. And sure enough, it’s still there like it was this morning, even though it snowed. But this might not be quite as simple for a vehicle to do. It doesn’t just figure things out; it’s following rules and instructions, and if those rules and instructions don’t have enough information because it’s not able to see or sense the road, the vehicle may get stuck, whereas a human is able to figure it out.
Pence: Despite those shortcomings, Google likes to tout that its self-driving cars have a nearly spotless track record. They simply don’t cause crashes. But Schoettle’s analysis has found that autonomous cars are about five times as likely as a normal car to be involved in a crash… even if it’s not the car’s fault.
Schoettle: It was very much the case that this was not often, or at all, the fault of the self-driving vehicle. These were almost always situations where the human driver of a conventional vehicle was rear-ending these cars, running into the back of them under relatively normal circumstances at traffic lights and things like that. So it’s a bit of a mystery as to why that was happening. But crash involvement is a number that doesn’t pay any attention to whose fault it was.
Pence: One suggestion why self-driving cars get rear-ended? Maybe because they’re going so slowly. Schoettle says Google’s cars don’t go any faster than 25 miles per hour. That may keep them legally faultless, but a top speed of 25 isn’t going to convince people that self-driving cars will be worth it.
Schoettle: That’s part of the situation we face these days: they’ve been used in very limited circumstances and are treated as low-speed vehicles, so they’re limited to roads where the speed limit is no greater than 35 mph. This lack of experience, and the need for more open testing, is part of what we think still needs to be addressed before these are really going to be deemed ready for mass consumption by the public. High-speed driving, including limited-access highways where you’re driving 60 to 70 miles an hour, also things like nighttime driving and unusual environments like mountainous areas where lots of people in this country live; these are all areas where these vehicles have not been fully tested yet. So this is part of what we think needs to be done to demonstrate that they’re really capable of driving in all the different places they’ll need to go.
Pence: Another criticism of many self-driving cars is that right now, they can be extremely timid.
Schoettle: This is certainly something to be concerned about if the vehicles have a combination of potentially following the rules of the road too closely and being too cautious. I guess you might call them tolerance levels, what the vehicle is looking for, that they’re still trying to work out. For example, I know early generations of these types of vehicles (and I don’t know if this is still the case), at four-way stops, were looking for the vehicles at the other parts of the intersection to come to a complete stop, as legally required. Now, most people don’t do that. They might go extremely slowly, but they don’t actually stop. These vehicles wouldn’t proceed until they sensed that the other vehicle had stopped, because their calculations showed it was still on a path to continue into the intersection and cause a crash. And we all know from these scenarios that that person is slowing way down to let the person with the right of way continue through, but they don’t want to actually bring their vehicle to a stop.
Pence: Schoettle says such an overly strict adherence to the rules brings up the possibility that human drivers may be able to bully autonomous cars by driving aggressively. So he says automakers are trying to instill some confidence and even assertiveness in self-driving cars… so they’ll proceed even if everything’s not perfect. And Kalra says sometimes they may even have to break the speed limit.
Kalra: Sometimes the safest way of driving is not what is required by law. Driving 55 miles an hour when everyone is doing 85 is extremely dangerous. So a balance has to be struck between following the letter of the law and following the spirit of the law, which is: drive safely, and that means being safe in the flow of traffic.
Schoettle: All of this will be a big issue that needs to be addressed in this long period of time when these types of vehicles will be sharing the road and interacting with conventional human drivers in normal vehicles. Things can work perfectly as everyone envisions when everybody has one of these vehicles, but trying to merge onto the highway at or below the speed limit is often a very dangerous thing to do. And you’re certainly going to cause some traffic backup if you enter the highway going 50 miles an hour when the speed limit’s 70 and the average traffic is going 75. But then the question becomes: what rules do you allow the vehicles to break, and if you allow them to break any, by how much? These are seemingly simple questions that are really difficult to answer.
Pence: There are ethical, legal, and insurance issues to settle as well. If a self-driving car has the choice of hitting a pedestrian or crashing head-on into another car, what should it do? If a self-driving car causes a crash, who’s responsible?
Schoettle: There have already been some people who say the driver should be responsible even if they are not driving the car. And I believe Volvo has come out with some of their vehicles and said, we as a company will take responsibility because we are, in fact, the driver; the computer is doing the driving, so we should be the ones responsible. So already, before anything’s happened, we have these conflicting approaches to liability and responsibility for what happens. There are a lot of those questions that need to get answered before this becomes something you see in most people’s driveways.
Pence: Liability could be why some manufacturers are shying away from cars that call for the driver to help out under some conditions. Those carmakers would prefer to keep control themselves and let the car do everything. But right now, we’re all trusting manufacturers that these cars work the way they say they do. There are no governmental safety standards yet. But Schoettle says there should be. Self-driving cars should have to pass a road test before they get their license. Maybe, like a teenager, they’d have to get a graduated license first.
Schoettle: To have a fully self-driving vehicle that can do all of the driving for you, one that may not even have controls for you to take over and can drive around empty, these need to be able to do everything all of the time. So there would be a test that’s devised, and they would need to pass this test completely to get a non-restricted license. Versions of self-driving vehicles short of that, which do most of the driving, or all of the driving some of the time, and occasionally pass control back to you or even give you the option to take control when you want, may be able to operate under a restricted license where they pass certain tests. But then there are other scenarios, either ones they voluntarily restrict or ones that result from failing the test, where, for example, they say this vehicle cannot drive in snow and it cannot drive at night, just to pick two scenarios. And the vehicle will know when those conditions exist and will force the human to do the driving under those conditions.
Pence: However, when self-driving cars start passing all those tests with flying colors is when you’ll start to see huge changes and the concept will begin to pay off in vastly increased efficiency. It’s not just that cars will be different… the infrastructure will be, too.
Schoettle: Roads and intersections and other things can be designed quite differently than they are now, because most of them were set up to control and communicate with human drivers who are driving around. That can work so much differently, and more efficiently, when it’s done by computers: vehicles talking to each other and to the infrastructure, to the traffic signals and other things, to know exactly what’s going on around them. Those things, I think, are still definitely decades down the road.
Pence: However, Schoettle admits it could all come sooner than he imagines, because the technology is already progressing at a breakneck pace. And while the big payoff will come when just about all the cars on the roads drive themselves… he says we should still see an increase in efficiency and lives saved with every car added in the meantime.
You can find out more about all our guests on our website… radiohealthjournal.net. You can find archives of our programs there as well as on iTunes and Stitcher.
I’m Reed Pence.