July 4, 2016
By Reilly Brennan

A compendium of Tesla "Autopilot" perspectives

There were hundreds of reactions to the news of Joshua Brown’s death, which appears to have occurred while his Tesla was operating in its ‘Autopilot’ mode. I wasn’t able to find many with a good balance of perspectives, so I picked out various quotes and points of view below. Taken together, it may be the most useful set of reactions to the news I have found:

As we try to sort out what exactly happened in the latest Tesla crash, we realize that there isn’t a whole lot of information publicly available. Everything Google, Tesla and others do today remains private; we have no standards to compare them against. The autonomous car industry is still living in the dark age of siloed community.
- Junko Yoshida, in a comment on her piece in EETimes, ‘Tesla Crashes BMW-Mobileye-Intel Event’

It may be tempting to describe this as a driverless car crash, but don’t give in. There’s a big difference between assisted driving technologies and full automation, and what we have here is the former.
- Brian Fung in The Washington Post, ‘The technology behind the Tesla crash, explained’

Here there is a clear situation where LIDAR would have detected the truck. A white truck against the sky would be no issue at all for a self-driving capable LIDAR, it would see it very well. In fact, a big white target like that would be detected beyond the normal range of a typical LIDAR. That range is an issue here — most LIDARs would only detect other cars about 100m out, but a big white truck would be detected a fair bit further, perhaps even 200m. 100m is not quite far enough to stop in time for an obstacle like this at highway speeds, however, such a car would brake to make the impact vastly less, and a clever car might even have had time to swerve or aim for the wheels of the truck rather than slide underneath the body.
- Brad Templeton on his blog, ‘Man dies while driven by Tesla Autopilot’
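A rough back-of-envelope check of Templeton’s range point, assuming a ~70 mph cruise speed, ~0.7 g of automated braking and a half-second of sensing and actuation delay. All three figures are illustrative assumptions, not data from the crash:

```python
# Back-of-envelope: stopping distance vs. a ~100 m LIDAR detection range.
# Assumed values (illustrative only): 70 mph cruise speed, 0.7 g braking,
# ~0.5 s of detection/actuation latency before full braking begins.
speed_mps = 70 * 0.44704            # 70 mph in meters per second (~31.3 m/s)
decel_mps2 = 0.7 * 9.81             # ~0.7 g of braking deceleration
latency_s = 0.5                     # sensing + actuation delay

reaction_distance = speed_mps * latency_s             # distance covered before braking starts
braking_distance = speed_mps ** 2 / (2 * decel_mps2)  # v^2 / (2a)
total = reaction_distance + braking_distance

print(f"Reaction: {reaction_distance:.0f} m, braking: {braking_distance:.0f} m, total: {total:.0f} m")
# Roughly 16 m + 71 m ≈ 87 m, uncomfortably close to the ~100 m detection range
# Templeton cites, which is why extra range (or early, partial braking) matters
# so much at highway speeds.
```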

Beta-testing on public roads, then, looks like human-subjects research, and this is usually governed by an ethics board in research labs and universities. Industry is slowly realizing this — such as Facebook, after its emotional manipulation experiments — as their products become more impactful on people, psychologically and physically. As far as the public knows, Tesla doesn’t have an ethics board to ensure it was doing right by its customers and public as, say, Google-DeepMind has for its artificial intelligence research.
- Patrick Lin in Forbes, ‘Is Tesla Responsible for the Deadly Crash On Auto-Pilot? Maybe.'

The sad reality of the situation, aside from the loss of life, is that this latest development is likely to cause some safety advocates in the industry to hit their own personal emergency brake. Rather than seeking a deeper understanding of what went wrong — sensor failure, software failure, sun glare, driver distraction — they will insist that Tesla was in the wrong and that the time has come to shut down all of this self-driving nonsense.
- Roger C. Lanctot on LinkedIn, ‘No Turning Back on Autonomous Driving’

Say it can do 99% but not 100% — then people are not ready for the 1%. We see problems of under-stimulation.
- Bryant Walker Smith, quoted in a piece by Sam Levin, Julia Carrie Wong and Nicky Woolf in the Guardian, ‘Elon Musk’s self-driving evangelism masks risk of Tesla autopilot, experts say’

It would be unfortunate if public sentiment swung so far against driverless cars that people would never benefit from their lifesaving potential. On the day the Tesla driver died, he said, approximately 100 other people died on U.S. roads.
- Bryant Walker Smith, quoted in a piece by Dee-Ann Durbin in AP / Sacbee, ‘Tesla crash could hurt sentiment on driverless cars’

The crash reminded me of “Normal Accidents,” a 1984 book by Yale sociologist Charles Perrow. The book grew out of Perrow’s work on the President’s Commission on the Accident at Three Mile Island. In that case, disaster happened both despite and because of a complex chain of safety systems…Normal accidents are a part of our relationship with technology. They are going to be a part of our relationship with driverless cars. That doesn’t mean driverless cars are bad. Again, so far statistics show they’re safer than humans. But complex systems will never be safe. You can’t engineer away the risk. And that fact needs to be part of the conversation.
- Maggie Koerth-Baker in FiveThirtyEight, ‘No Technology — Not Even Tesla’s Autopilot — Can Be Completely Safe’

One person dies & tech is not safe? Versus 30,000 dead in human-driven cars.
- Stewart Alsop in a tweet

There’s a longstanding dynamic where new transport technologies tend to produce spikes in fatalities before engineers learn the lessons of real-world accidents and make systems safer than the status quo. Driverless cars are unlikely to be an exception.
- David Fickling in Bloomberg, ‘All Driving Is Dangerous, Tesla’

This, the headlines roared, is the first known example of a fatal road accident involving a self-driving car. Except it is not. The Tesla’s ‘Autopilot’ feature was turned on. But the model was not designed to be and should not have been considered to be fully self-driving.
- Washington Post Editorial Board in ‘The Tesla didn’t really crash itself’
