A fatal accident involving a self-driving car, though tragic in its outcome, will help regulators and manufacturers answer important questions as autonomous vehicles are more fully developed, says Toronto civil and commercial litigator Jonathan Miller.
“What I took away from this incident is that it will provide a test case and give some advance warning of issues that developers and users of this technology need to consider and question before these vehicles are put on the road,” says Miller, an associate with the Toronto office of Shibley Righton LLP.
The accident, which occurred when a pedestrian was struck while walking her bike across a poorly lit road in Arizona, is believed to be the first time an autonomous vehicle has been linked to a pedestrian death, The New York Times reports.
“I’m not aware of a lawsuit being launched yet, but I think that whatever comes out of this case, it will get a significant amount of attention because there are a number of issues that could come up,” Miller tells AdvocateDaily.com.
“I’ve read comments about some of those issues, including who’s responsible: Is it the ride-sharing company for using an autonomous vehicle that isn’t necessarily as safe as we want it to be? Does the vehicle manufacturer bear responsibility for using or selling that technology when it’s not as safe as it could be? And does the safety driver who was behind the wheel bear any responsibility?”
The popular ride-sharing company was testing autonomous vehicles on roadways at the time of the accident, but it has since suspended the program in several cities, including Toronto, Phoenix, Pittsburgh and San Francisco, the CBC reports. In Ontario, autonomous vehicles have been tested under a pilot project since January 2016.
Video of the safety driver in Arizona suggests that he wasn’t paying attention, Miller says, adding that Ontario’s pilot project “requires that there still be a human ‘driver’ behind the wheel in case the vehicle isn’t reacting and the human needs to respond. Had the driver in this particular accident been paying attention, maybe he could have done something to avoid the accident.”
But, Miller notes, it’s difficult to know if the driver was at fault because the pedestrian was reportedly crossing on a dark street.
“It may not have been an avoidable accident, so we’ll see how the authorities deal with it,” he says.
“One of the difficulties they face in programming these vehicles is the potentially irrational or unexpected responses of the people around them. There’s all this talk about how autonomous vehicles will be able to communicate with each other, so if the car ahead is braking, the car behind will brake automatically,” Miller says.
“But there hasn’t been as much discussion about the interaction between the computer-driven vehicle and the humans around it — whether that be someone who’s not crossing at a crosswalk, or a cyclist or a pedestrian on their phone who suddenly walks into the street. It would be hard to program for these things, but that is a reality that developers of this technology will face.”
That leads to difficult moral issues, he says.
“I think a question that these developers have to struggle with — and where regulators and lawmakers can assist — is what happens when an autonomous vehicle has to assess whether it’s better to hit a pedestrian or make a potentially very dangerous move that could endanger its passengers or an oncoming vehicle.
“How do they go about calculating that risk and programming it into the system? How does the computer decide between two equally terrible results?”
Miller points to a report prepared by KPMG International, the “Autonomous Vehicles Readiness Index,” which assesses the “openness and preparedness” for autonomous vehicles in 20 countries.
The study ranked countries in four categories — policy and legislation, technology and innovation, infrastructure, and consumer acceptance — and Canada placed seventh overall. The Netherlands, Singapore and the United States were in the top three spots.
“I think that means we have some work to do in terms of understanding the technology and being prepared for the implications of computers having to decide between hitting one pedestrian or potentially rolling the vehicle with four passengers inside,” he says.