Authorities have identified the driver of the Model S as Joshua Brown. Tesla has confirmed that Mr. Brown was using its Autopilot system at the time of the crash.
In a blog post, Tesla explains what went wrong:
[T]he vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.
A sad inevitability
The story of Mr. Brown’s death is making headlines — and rightly so. Tesla is the first automaker to offer something akin to self-driving software, and this is the first fatality associated with it. Not surprisingly, Tesla opens its post with a plea for calm:
This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.
And as grim as it might seem, we all knew this was coming. Even as Tesla has touted the gee-whiz factor of its Autopilot software, it’s always been obligated to include an important footnote — one that the company reiterates in its blog post: “Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled” (emphasis ours).
This tragic story is the result of flawed software, but there will be future tragedies, too — ones that might be blamed on software that works all too well.
As we discussed last fall (and as many news organizations have recently reported), autonomous driving systems will eventually have to include some kind of moral code. That embedded sense of ethics will force autonomous cars to make tough decisions: Kill the driver, or kill the pedestrian? Kill the driver, or kill the group of cyclists?
In short, autonomous driving software will only make the world better, not perfect. Mr. Brown’s death is a testament to that.
Story and photo: The Cheat Sheet