Three Scenarios with Self-Driving Vehicles

Description

A set of three scenarios that highlight ethical and technical issues related to self-driving vehicles.

Body

Suggested journal paper for faculty to read while preparing to teach using these three scenarios:
Borenstein, Jason, Joseph R. Herkert, and Keith W. Miller. "Self-driving cars and engineering ethics: The need for a system level analysis." Science and Engineering Ethics (2017): 1-16.

Suggested website that lists SAE's 6 levels of automation and ways that vehicles can communicate with the outside world: autocaat.org/Technologies/Automated_and_Connected_Vehicles/
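For readers who want the taxonomy in a concrete form, the short Python sketch below encodes the six SAE levels and the supervision rule that the scenarios rely on. The enum member names and the helper function are illustrative choices, not terms taken from the SAE standard.

    from enum import IntEnum

    class SAELevel(IntEnum):
        # SAE J3016 defines six levels of driving automation, 0 through 5.
        NO_AUTOMATION = 0           # human performs the entire driving task
        DRIVER_ASSISTANCE = 1       # one assistance feature, e.g., adaptive cruise
        PARTIAL_AUTOMATION = 2      # steering and speed automated; human monitors
        CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
        HIGH_AUTOMATION = 4         # no human supervision needed within its domain
        FULL_AUTOMATION = 5         # no human supervision needed anywhere

    def requires_human_fallback(level: SAELevel) -> bool:
        # Through level 3, a human must supervise or stand ready to intervene;
        # at levels 4 and 5 the system is designed to manage without one.
        return level <= SAELevel.CONDITIONAL_AUTOMATION

The distinction between level 3 (human fallback required) and level 4 (no direct supervision needed) is exactly the difference between Car A and Car B in Scenario 1.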

Suggested pedagogical approach:
Students can be engaged by presenting them with complex problems and inviting them to suggest solutions. They can be challenged to consider the sociotechnical systems intertwined with self-driving vehicles, and to focus on the consequences for humans of technical decisions about these systems. A useful progression moves from concrete technical questions to broader, more abstract issues, including those with ethical significance.

The two questions that follow each scenario invite readers to examine moral responsibility for an accident involving one or more autonomous vehicles and to imagine ways in which the accident could have been avoided. The first question is meant to encourage analysis of the different decision-makers in each scenario. The second question is open-ended, inviting creative solutions that improve the situation for all stakeholders.

Scenario 1: Two Models of Autonomous Vehicles[1]

A car with level 3 driving automation ("Car A") is on a highway and is seeking to move onto an off-ramp. Being level 3 entails that an automated system can control the car's operation, but the human driver is supposed to supervise the car at all times.

A second car ("Car B") is traveling on the on-ramp and is seeking to merge onto the highway at the same location where Car A is exiting. Car B has level 4 automation, which entails that its automated system is designed to operate, at least at times, without direct human supervision.

Assume that the two cars are almost exactly parallel.

Version A: The cars cannot communicate with one another, and they collide.

Version B: The cars are capable of communicating with one another through vehicle-to-vehicle (V2V) communication, but they still collide.
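Both versions can be made concrete for students with a small simulation. The Python sketch below is a minimal, hypothetical model of a V2V negotiation, not any real protocol: each car broadcasts its position and intent, messages can be lost, and a simple priority rule decides who yields. All names (Intent, broadcast, decide) are invented for illustration.

    import random
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Intent:
        car_id: str
        goal: str        # "exit" for Car A, "merge" for Car B
        position: float  # progress along the merge zone, in meters

    def broadcast(msg: Intent, loss_rate: float = 0.1) -> Optional[Intent]:
        # V2V messages can be lost or garbled; model that as random drop-out.
        return None if random.random() < loss_rate else msg

    def decide(own: Intent, heard: Optional[Intent]) -> str:
        if heard is None:
            # Version A: no message arrives, so each car falls back on its
            # own sensors, which in this scenario fail to resolve the conflict.
            return "proceed"
        # Illustrative tie-breaking rule: the car that is farther back yields.
        return "yield" if own.position < heard.position else "proceed"

    car_a = Intent("A", "exit", position=12.0)
    car_b = Intent("B", "merge", position=12.0)  # almost exactly parallel

    actions = (decide(car_a, broadcast(car_b)), decide(car_b, broadcast(car_a)))
    if actions == ("proceed", "proceed"):
        print("conflict unresolved -> collision risk")

Because the cars are almost exactly parallel, the farther-back rule deadlocks and both cars proceed, so a collision can occur even when every message is delivered. Students might be asked what deterministic tie-breaker (for example, the lower vehicle ID yields) would resolve this.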

Questions for consideration:

  1. To what extent does each of the following bear moral responsibility for the accident?
    • The designer of Car A
    • The driver of Car A
    • The designer of Car B
    • The designer of the V2V system
    • Regulatory agencies or officials
    • Other
  2. What could have been different to reduce the chances of this kind of accident happening?

Scenario 2: An Autonomous Vehicle and a Right-Hand Turn

A level 4 self-driving car is attempting to turn right at an intersection, and it has a green arrow from the traffic light. However, a pedestrian is trying to cross through the intersection where the car is turning, even though the pedestrian does not have a walk signal. Because the car's sensors detect the pedestrian, it hesitates to turn right, and the pedestrian decides to cross. While halted, the self-driving car is rear-ended by a standard (non-autonomous) car.
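The car's hesitation can be framed as an explicit yield policy. The sketch below is a deliberately simplified, hypothetical decision rule that assumes perfect perception; the Perception fields and the function name are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Perception:
        pedestrian_in_path: bool     # sensors report a pedestrian in the turn path
        pedestrian_has_signal: bool  # whether the pedestrian has a walk signal
        vehicle_close_behind: bool   # a car is following closely

    def right_turn_action(p: Perception) -> str:
        # Yield whenever a pedestrian is detected in the path, even though
        # the car holds a green arrow: right of way does not license a collision.
        if p.pedestrian_in_path:
            return "halt"
        return "turn"

    # The scenario's configuration: a pedestrian without a walk signal,
    # and a standard car close behind.
    scene = Perception(pedestrian_in_path=True,
                       pedestrian_has_signal=False,
                       vehicle_close_behind=True)
    print(right_turn_action(scene))  # -> "halt"

Notice that the policy never consults vehicle_close_behind: the halt that protects the pedestrian is exactly what exposes the car to the rear-end collision. Students can debate whether and how a designer should weigh that trailing-vehicle risk.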

Questions for consideration:

  1. To what extent does each of the following bear moral responsibility for the accident?
    • The pedestrian
    • The self-driving car
    • The designer of the self-driving car
    • The driver of the standard car
    • Regulatory agencies or officials
    • Other
  2. What could have been different to reduce the chances of this kind of accident happening?

Scenario 3: An Autonomous Vehicle and Centralized Intersection Management

Several vehicles of different types are approaching an automated intersection. A centralized management system dispatches vehicles through the intersection. The approaching vehicles include:

  • A level 3 self-driving car
  • A motorcycle

Being level 3 entails that an automated system can control the car's operation, but the human driver is supposed to supervise the car at all times.

The weather is inclement, which inhibits visibility, and the automated intersection controls are starting to fail. The driver of the level 3 car is alerted to take over control of the car's operation, but the driver is distracted by texting. The motorcycle has features that allow it to communicate with the intersection controls, but the motorcyclist chooses to ignore the automated system's instruction to slow down and instead speeds up while approaching the intersection. The level 3 car collides with the motorcycle, and the motorcyclist is seriously injured.
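The chain of failures here (degraded controls, a missed takeover alert, and an ignored instruction) can be shown compactly in code. The sketch below is a hypothetical model of the intersection's failover logic, assuming a simple vehicle-to-infrastructure (V2I) message channel; the Vehicle fields and the dispatch function are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Vehicle:
        vid: str
        level: int               # SAE automation level (0 = fully manual)
        driver_attentive: bool   # is the human fallback actually watching?
        complies_with_v2i: bool  # does the vehicle or rider obey intersection messages?

    def dispatch(v: Vehicle, controls_degraded: bool) -> str:
        if controls_degraded and v.level == 3:
            # A level 3 vehicle's fallback is its human driver, so the
            # takeover alert only helps if that driver is paying attention.
            return "takeover_heeded" if v.driver_attentive else "takeover_missed"
        if not v.complies_with_v2i:
            # Messages such as "slow down" are advisory; the intersection
            # cannot force a non-compliant rider to obey them.
            return "instruction_ignored"
        return "instruction_followed"

    car = Vehicle("level3_car", level=3, driver_attentive=False, complies_with_v2i=True)
    bike = Vehicle("motorcycle", level=0, driver_attentive=True, complies_with_v2i=False)
    for v in (car, bike):
        print(v.vid, "->", dispatch(v, controls_degraded=True))

Both fallback paths fail at once: the distracted driver misses the alert, and the motorcyclist ignores the instruction. The collision occurs only because every layer of the sociotechnical system (automated controls, human fallback, and voluntary compliance) gives way at the same time.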

Questions for consideration:

  1. To what extent does each of the following bear moral responsibility for the accident?
    • The driver of the level 3 self-driving car
    • The designer of the level 3 self-driving car
    • The motorcyclist
    • The designer of the intersection’s centralized management system
    • Regulatory agencies or officials
    • Other
  2. What could have been different to reduce the chances of this kind of accident happening?

[1] Based on a scenario described in Jason Borenstein, Joseph R. Herkert, and Keith W. Miller. "Self-driving cars and engineering ethics: The need for a system level analysis." Science and Engineering Ethics (2017): 1-16.