Forty-year-old Joshua Brown was killed in May 2016 when his Tesla Model S failed to detect a white 18-wheel tractor-trailer crossing in front of him. He is the first person known to have been killed in a crash involving a self-driving vehicle.
The collision occurred on a non-controlled-access highway in Williston, Florida, when a truck driver, Frank Baressi, reportedly made a left turn in front of Brown’s Tesla at an intersection. The Autopilot program failed to detect the white truck against the “brightly lit sky.”
What Went Wrong with the Self-Driving Vehicle?
Tesla has suggested the car’s camera likely tuned out the image of the truck, mistaking it for an overhead sign. Because the trailer sat much higher off the ground than the Tesla and crossed perpendicular to its path of travel, the car registered the truck ahead as a harmless sign. The program has built-in commands to ignore road signs to avoid false braking while traveling at high speeds.
Instead of applying the brakes, the car drove forward as if no obstacle were in its way, passing under the trailer, which sheared off most of the top of the Tesla, before veering off the road and hitting a telephone pole.
It is worth noting that the driver did not manually apply the brakes before the impact. There has been no confirmation of what he was doing at the time of the collision, but Baressi alleged that he heard a Harry Potter movie playing when he approached the vehicle after the crash, and that the Tesla was moving so quickly he did not see it pass under his trailer.
The National Highway Traffic Safety Administration Office of Defects Investigation has opened a preliminary investigation into the first known death caused by an autonomous car. These are typically done to determine whether a recall of a defective product is required.
Who is Responsible for the Fatal Crash?
While many self-driving vehicle manufacturers have assumed broad liability for any collisions caused while their vehicles are in autopilot mode, Tesla has yet to do so. The company published a blog post about the incident on June 30, quoting its warnings, which state the following:
- Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
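The supervision behavior Tesla describes can be pictured as a simple loop: check for hands on the wheel, alert if they are absent, then gradually slow the car until they return. The sketch below is purely illustrative; the class name, thresholds, and timing are my assumptions, not Tesla's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class AutopilotSupervisor:
    """Illustrative model of the hands-on supervision loop described above.

    All names and values are hypothetical, not Tesla's real system.
    """
    speed_mph: float
    alert_active: bool = False

    def check(self, hands_on_wheel: bool) -> str:
        """Run one supervision cycle and return the action taken."""
        if hands_on_wheel:
            self.alert_active = False
            return "continue"      # normal operation resumes
        if not self.alert_active:
            self.alert_active = True
            return "alert"         # first: visual and audible warnings
        # Hands still not detected after alerting: slow down gradually.
        self.speed_mph = max(0.0, self.speed_mph - 5.0)
        return "slow"


sup = AutopilotSupervisor(speed_mph=65.0)
print(sup.check(hands_on_wheel=True))   # continue
print(sup.check(hands_on_wheel=False))  # alert
print(sup.check(hands_on_wheel=False))  # slow (speed drops to 60.0)
```

The key design point the warnings imply is that the system degrades gracefully (alert, then decelerate) rather than disengaging abruptly, which is exactly why the driver is still expected to remain attentive throughout.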
The main point of Tesla’s post was to argue that despite the vehicle’s malfunction, the driver remained responsible for the collision. Many commentators disagree. A Volvo engineer criticized the Tesla Autopilot system, saying it “gives you the impression that it’s doing more than it is.” The idea that the Autopilot feature gives people a false sense of security in the computers isn’t far-fetched: videos have surfaced of people reading books and sleeping through rush-hour traffic while their Tesla controls the drive.
Tesla reminds drivers that the Autopilot program is still in its beta phase and may require human supervision and intervention at any time while the feature is turned on. But should drivers be held responsible for a collision they had only a second or two to avoid, when the Autopilot cameras had detected the hazard tens of seconds sooner?
The company can publish all the warnings it likes about staying alert and being ready to take over at any moment, but knowing that people ignore those warnings, shouldn’t it do more to prevent potentially fatal errors?
Additionally, was the truck driver partially responsible for an improper turn?
There are many questions left unanswered regarding the collision which took Mr. Brown’s life. This is the kind of unfortunate circumstance that may have been needed to pave the way for future autonomous vehicle laws.
Because the accident happened in a pure comparative fault state, Tesla, Brown, and Baressi could each be found proportionately responsible for the collision and held liable for their respective percentages of fault.
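The arithmetic of pure comparative fault is straightforward: damages are split in proportion to each party's assigned share of fault, and a plaintiff's own share reduces recovery rather than barring it. The numbers below are invented purely for illustration; they are not findings, allegations, or estimates in the actual case.

```python
def apportion_damages(total_damages: float, fault_shares: dict) -> dict:
    """Split a damages award by each party's fault percentage.

    fault_shares maps party name -> fraction of fault; fractions must sum to 1.
    """
    assert abs(sum(fault_shares.values()) - 1.0) < 1e-9, "shares must total 100%"
    return {party: round(total_damages * share, 2)
            for party, share in fault_shares.items()}


# Hypothetical example: $1,000,000 in damages, fault split 50/30/20.
shares = {"Tesla": 0.50, "Baressi": 0.30, "Brown": 0.20}
print(apportion_damages(1_000_000, shares))
```

Under this rule, even a party found 20% at fault for their own injury could still recover the remaining 80% from the others, which is what distinguishes pure comparative fault from contributory-negligence regimes that bar recovery entirely.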
The Future of Autonomous Vehicles
There has been a surprising lack of regulation of these vehicles. Only eight states have rules on the books as of this posting! We can expect Congress to tackle this issue in the coming year as more and more autonomous vehicles take to the roads. A uniform federal law governing the issue would be ideal, allowing the many self-driving vehicles to cross state lines without drivers having to reference individual state laws.
As of April 2016, Florida law permits anyone with a valid driver’s license to operate an autonomous vehicle on public roads. This used to be limited to testing purposes only and required a human operator in the vehicle at all times, but those restrictions have been lifted. Eleven auto manufacturers have plans to unveil self-driving vehicles by 2020, putting at least 10 million on American roads. This means more and more people will be letting their vehicles take the wheel in Florida. But does it mean the roads will be safer?
It is widely thought that self-driving and fully autonomous vehicles would reduce accident rates dramatically. Multiple statisticians and engineers project a decrease in accidents of 95% or more if all vehicles employ these technologies. However, I recently read an article that questioned whether people would be willing to accept 100% autonomous vehicles that were not yet 100% safe. It is a curious thought. Are you willing to give total control of your life to a computer, knowing it is nearly, but not quite, 100% safe? In the wake of the autonomous Google vehicle that sideswiped a bus in February and the Tesla Model S involved in this accident, BMW announced it will release its own version of an autonomous vehicle in 2021. Despite the possibility of errors, it seems we have already accepted the idea of a nearly perfect system.
For more considerations on the implications of autonomous vehicles, visit our blog.