Questions for Tesla Motors about Autopilot
A case study examining the real-life death of Joshua Brown, who was killed when his Tesla Model S ran into a tractor-trailer while he was using its “Autopilot” feature.
The Death of Joshua Brown
In October 2014, Tesla began selling sedans with a $4,250 technology package containing a dozen ultrasonic sensors, a camera, a front radar, and digitally controlled brakes. The package allowed the car to brake automatically to avoid a collision. A year later, Tesla released a software update named Tesla Version 7.0 to the 60,000 cars it had sold with the technology package. The new software enabled the car to control its speed and steer. Tesla gave the software update the nickname Autopilot [1].
Here is what Tesla wrote on its Web page: “While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car” [2].
That made Tesla Motors the first automaker to release a product exhibiting Level 3 automation, as defined by SAE International [3]:
- SAE Level 0 – No Automation: “the full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems”
- SAE Level 1 – Driver Assistance: “the driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task”
- SAE Level 2 – Partial Automation: “the driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task”
- SAE Level 3 – Conditional Automation: “the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task with the expectation that the human driver will respond appropriately to a request to intervene”
- SAE Level 4 – High Automation: “the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene”
- SAE Level 5 – Full Automation: “the full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver”
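The six levels above form an ordered scale, and the key distinctions are which party monitors the driving environment and which party serves as the fallback when something goes wrong. The sketch below is a hypothetical illustration of that structure (the enum and function names are my own, not SAE's or Tesla's):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (0 = none, 5 = full)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def human_must_monitor(level: SAELevel) -> bool:
    """At Levels 0-2 the human driver must monitor the environment;
    from Level 3 up, the automated system monitors it."""
    return level <= SAELevel.PARTIAL_AUTOMATION

def human_is_fallback(level: SAELevel) -> bool:
    """Through Level 3 the human must respond appropriately to a
    request to intervene; at Levels 4-5 the system itself copes."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION
```

Note that Level 3 is the awkward middle case: the human need not monitor the road, yet remains the fallback when the system requests intervention. That tension is exactly the hand-off problem discussed later in this case study.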
Joshua Brown was a Tesla fanatic. He nicknamed his Model S sedan Tessy, and he averaged more than 5,000 miles per month on the road [4]. Mr. Brown posted YouTube videos showing himself “driving” hands-free and testing the limits of the system [5, 6, 7].
On May 7, 2016, Mr. Brown was killed when the Tesla Model S he was “driving” crashed into a tractor trailer on a Florida highway [8]. Tesla’s first public response to the accident, which I encourage you to read in full, came nearly two months later, on June 30 [9].
Question 1: How much moral responsibility does Tesla Motors carry for the death of Joshua Brown?
Details of the Accident
The accident occurred as Joshua Brown’s Model S was traveling east on US-27A, a divided highway in northern Florida. A tractor trailer, traveling in the opposite direction on the highway, turned left in front of the Tesla. The Tesla was in Autopilot mode. According to Tesla Motors, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied” [9]. The trailer was high enough off the ground that the car continued under the trailer, shearing off its roof. The car drove off the road and struck two fences and a power pole before coming to a stop [8].
The accident that killed Joshua Brown occurred at an at-grade intersection on a divided highway. On a per-mile basis, divided highways are more dangerous than freeways: the probability of getting into an accident is higher. In fact, the Interstate System of freeways is the safest system of roads in the country [10].
According to the National Transportation Safety Board, Joshua Brown’s Tesla Model S was traveling 74 miles per hour with Autopilot engaged at the time of the crash, 9 miles per hour above the posted speed limit of 65 miles per hour [11]. According to the website Quartz, Autopilot could remain engaged at speeds up to 89 miles per hour [12].
Question 2: Should Tesla Motors have added restrictions to the beta version of Autopilot so that it could only be activated while driving on freeways?
Question 3: Should Autopilot allow the driver to set a cruising speed above the speed limit, and if so, by how much?
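The kinds of restrictions raised by Questions 2 and 3 amount to a simple activation guard. The sketch below is a hypothetical illustration only; the road-type classification, the 5 mph allowance, and all names are assumptions of mine, not Tesla's actual logic:

```python
# Hypothetical Autopilot activation guard. The road-type check and the
# speed allowance are illustrative assumptions, not Tesla's rules.

FREEWAY_ONLY = True          # restrict the beta to limited-access freeways?
MAX_OVER_LIMIT_MPH = 5       # how far above the posted limit to allow

def may_engage_autopilot(road_type: str, set_speed: float,
                         posted_limit: float) -> bool:
    """Return True if the driver-selected cruising speed and road type
    satisfy the (hypothetical) activation policy."""
    if FREEWAY_ONLY and road_type != "freeway":
        # Divided highways with at-grade intersections are excluded.
        return False
    return set_speed <= posted_limit + MAX_OVER_LIMIT_MPH

# Under this policy, the crash scenario would have been rejected twice:
# US-27A is a divided highway, and 74 mph exceeds 65 + 5.
print(may_engage_autopilot("divided_highway", 74, 65))  # False
print(may_engage_autopilot("freeway", 70, 65))          # True
```

The point of the sketch is that both restrictions are technically trivial to enforce; the questions above are about whether Tesla was morally obligated to enforce them.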
The Hand-off Problem
In 2015, before the Tesla accident, Ford Motor Company announced plans to introduce a self-driving car by 2021, adding that it was skipping Level 3 because of an inherent difficulty: how can the computer ensure the driver is paying enough attention that control can be handed over in an emergency? Ford said its tests indicated it took drivers an average of 3 to 7 seconds, and sometimes as long as 10 seconds, to take control of the vehicle. This is called the hand-off problem [13].
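The arithmetic behind Ford's concern can be made concrete with a small simulation. The 3-to-10-second reaction range comes from Ford's reported tests above; everything else here, including the 4-second emergency budget and the uniform distribution, is a made-up assumption for illustration:

```python
import random

# Hand-off problem simulation. Reaction times of 3-10 s are from the
# Ford tests described in the text; the 4-second emergency budget and
# the uniform-distribution model are hypothetical assumptions.

EMERGENCY_BUDGET_S = 4.0   # assumed time before the hazard is reached

def simulate_handoff(trials: int = 10_000, seed: int = 42) -> float:
    """Fraction of emergencies in which a driver, taking a uniformly
    random 3-10 s to respond, regains control within the budget."""
    rng = random.Random(seed)
    successes = sum(rng.uniform(3.0, 10.0) <= EMERGENCY_BUDGET_S
                    for _ in range(trials))
    return successes / trials

print(f"{simulate_handoff():.0%} of hand-offs succeed in time")
```

Under these assumptions only about one hand-off in seven succeeds before the hazard arrives, which suggests why Ford judged Level 3 unworkable.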
Handing over control is even more difficult if the driver is distracted. The Model S sedan Joshua Brown was driving had no mechanism to ensure that the driver kept his attention on the road while Autopilot was engaged. People have observed Tesla sedans traveling while the “driver” sleeps [14]. The Florida Highway Patrol found a portable DVD player in Joshua Brown’s Tesla Model S. Some witnesses said they heard a Harry Potter movie playing when they approached the car after the accident, although others at the scene said no movie was playing [15].
Ford has publicly announced that it will not sell an automobile with Level 3 automation. It does plan to begin selling an automobile with Level 5 automation in 2021, a fully self-driving car with no steering wheel, gas pedal, or brake pedal. Control will never be handed off from the computer to the driver [13].
Question 4: Should Tesla Motors have released Autopilot to the public when the hand-off problem has not been solved?
References
1. Ryan Bradley. “Tesla Autopilot: The electric-vehicle maker sent its cars a software update that suddenly made autonomous driving a reality.” MIT Technology Review.
2. Tesla Motors. “Your Autopilot has arrived.” October 14, 2015.
3. SAE International. “Automated Driving: Levels of Driving Automation Are Defined in New SAE International Standard J3016.” 2014.
4. Rachel Abrams and Annalyn Kurtz. “Joshua Brown, Who Died in Self-Driving Accident, Tested Limits of His Tesla.” The New York Times. July 1, 2016.
5. Joshua Brown. “Tesla Autopilot v7.0 Intro Video.” YouTube. October 15, 2015.
6. Joshua Brown. “Tesla v7.0 Autopilot: Showing When It Cant Handle It.” YouTube. October 18, 2015.
7. Joshua Brown. “Tesla v7.0 Autopilot: Very Difficult RR Track Turn.” YouTube. October 18, 2015.
8. Anjali Singhvi and Karl Russell. “Inside the Self-Driving Tesla Fatal Accident.” The New York Times. July 12, 2016.
9. The Tesla Team. “A Tragic Loss.” Tesla Motors. June 30, 2016.
10. Federal Highway Administration. “Interstate Frequently Asked Questions.” US Department of Transportation.
11. National Transportation Safety Board. “Preliminary Report, Highway HWY16FH018, Executive Summary.” July 26, 2016.
12. Alice Truong. “Elon Musk Is Going to Pull Back on Autopilot Mode to Keep Tesla Drivers from ‘Doing Crazy Things.’” Quartz. November 5, 2015.
13. Alex Davies. “Ford’s Skipping the Trickiest Thing about Self-Driving Cars.” Wired. November 10, 2015.
14. Electrek.co. “Tesla Model S Driver Caught Sleeping at the Wheel While on Autopilot.” YouTube. May 23, 2016.
15. Barbara Liston and Bernie Woodall. “DVD Player Found in Tesla Car in Fatal May Crash.” Reuters. July 1, 2016.