Learning and Moving Forward from the Boeing 737 Max Crisis


This scenario asks students to play roles within the Aviation Transportation Investigative Committee (ATIC) and collaboratively discuss how the Boeing 737 Max disaster happened. They are also asked to discuss how the crisis could have been prevented and how future safety and transparency can be encouraged or enforced.



The objective of this case study discussion is to give students an understanding of the complexity of technology development and of the different considerations stakeholders make in the design and implementation of a technology. The case provides students the opportunity to deliberate on and understand how concerns of safety, risk, and ethics can be overlooked or neglected when technology is developed under time pressure and when developers lack resources – human and material. The case discussion is set up as a role-play in which students are assigned different roles as members of a committee investigating two crashes. Students discuss the issue from the perspective of the stakeholder they are playing. When implemented within a course, this case study will be most useful toward the start of the semester, as it aims to seed the idea of “socio-technical” systems among students.

Role-Play Scenario Narrative and Description of Roles 

Following two crashes involving Boeing 737 Max aircraft – Lion Air Flight 610 on October 29, 2018, and Ethiopian Airlines Flight 302 on March 10, 2019 – that killed a total of 346 people, the U.S. Congress formed an Aviation Transportation Investigative Committee (ATIC). The charge given to the committee was to better understand the lessons that can be learned from the Boeing 737 Max crisis, ensure future safety, and prevent future air transportation disasters. The committee was instructed to examine the incidents from all perspectives and report back to Congress with recommendations.

Brad Jorgensen and Kathy Schmidt, both of whom had worked in the aerospace industry at some point in their careers and later worked for members of Congress, were designated co-chairs of the committee. For assistance, they assigned members of the committee, including external experts, different issues to research and present findings on.

When Brad and Kathy took on this responsibility, they thought it would be an easy task, as the initial investigation had shown that newly installed software was at fault, but they soon recognized that the problem might be more complex. They were surprised to learn that competitive pressure from Airbus – the threat of losing market share – was one of the major reasons for the introduction of a new model of aircraft, and that this had necessitated the use of the problematic software. They were amazed at how quickly decisions were made in order to compete, including changing the design of the aircraft to accommodate a new engine. As they dug deeper into existing news coverage, reports, and testimonies, they began to feel the weight of the problem even more.

A few months after they started their work, they flew to Seattle to speak with Boeing representatives. While boarding the flight at Dulles, they noticed that some passengers asked the desk personnel whether the aircraft they were about to board was a Boeing 737 Max, the model that had crashed. The desk personnel assured them that all Boeing 737 Max aircraft were grounded. Trust was a key issue.

The initial deadline to report back to Congress was set at six months after the second crash, but Brad and Kathy quickly realized that they would need more time to understand the complex structure that makes up the overall ecosystem of the Boeing 737 Max aircraft. Realizing that their report would need input from additional external experts, they reached out to people from different areas of interest and added them to the committee. They were hoping to look toward the future and come up with guidelines that would not only be enforced through the regulatory agencies but also be made public, so that there would be more trust in the aircraft and the industry.

Today is the final meeting of the committee, and five of the external members are scheduled to briefly present their findings and then discuss them to arrive at a recommendation for the co-chairs to take to Congress.

The questions posed to the committee are: 
1)    Why did the crashes happen, and how could they have been prevented? 
2)    How can we ensure future safety and transparency and rebuild trust? 

Committee Members – External Experts (roles for the scenario)

Norman Devlin, aviation consultant, is an aviation expert who trained as an engineer and then worked as a pilot for over two decades. He has served as an expert witness on similar committees before. Given his prior experience and his sympathies, he is always keen on expressing the viewpoint of pilots. In particular, he is concerned that authority for decisions during flights has shifted from pilots to technology and that decisions about pilot training have been driven by business interests rather than pilots’ needs. He often comments that if Boeing had remained an engineering company instead of hiring MBAs who let its engineering capabilities slip, it would not have faced this crisis.

Meera Patil, professor of aerospace engineering, is an expert on aeroelasticity who specifically studies the nonlinear aeroelastic flight dynamics of highly flexible wings. She has been invited as an expert to discuss Boeing’s decision to change the engine placement on the wing to accommodate the new, bigger engine on the Max. She has looked into the changes made by both Boeing and Airbus, and she is still not sure what to make of them, as she believes she doesn’t have all the information she needs to understand Boeing’s decision.

Andrew Gelman, software engineer, is an expert on human-automation interaction. He has designed and written software for a large number of organizations, and he even founded a small automation company that was acquired by a large defense contractor in aviation. He has seen the complexity of interdependent systems firsthand, having designed software both to manage them and to simulate outcomes. Although he was shocked and distressed by the Boeing 737 Max disasters and wished the tragedy could have been averted, he wasn’t too surprised; he had always thought aircraft design had become too complex for its own good.

John O’Leary, FAA officer (retired), is on the committee for his expertise on regulations and on the role of the Federal Aviation Administration (FAA) in the disaster. He is concerned with salvaging the reputation of the FAA, as it came to light after the disasters that the agency had been too close to Boeing, even colluding with the company by letting it regulate itself in order to make quick decisions and compete with Airbus. Funding for the FAA had gradually been reduced, and it didn’t have the staff necessary to provide regulatory oversight.

Mary Bradley, Boeing representative, has worked at Boeing for over 30 years and wants to convey how Boeing works, what the company has tried to do for safety, and what she believes the reason for the disaster is. She is also concerned about the hundreds of parked aircraft and what they cost the company. Although Boeing almost never pays for any trouble, as it is always bailed out, she knows this is a low point for the company. She joined Boeing as a trainee straight out of high school and took advantage of the excellent training incentives the company provided to complete her undergraduate degree in communications and, much later, an MBA.

Sarah Bennett, family lawyer, represents the passengers and their families. She wants to make sure that their voices don’t get lost among all these “experts” and that companies actually work toward the safety of people rather than simply setting money aside to pay people off when accidents occur.


Cases in this collection are also available on the Mason Tech Ethics website.

Aditya Johri. Learning and Moving forward from the Boeing Max 737 Crisis. Online Ethics Center. https://onlineethics.org/cases/george-mason-tech-ethics/learning-and-moving-forward-boeing-max-737-crisis.

Role-play Instructions

1. Each student is assigned a role a week before the discussion. 

2. Students assigned to the roles of Brad Jorgensen and/or Kathy Schmidt serve as moderators and lead the conversation based on the script below.

3. The script provided below is there to guide the discussion, but you should leave room for the conversation to flow naturally and allow everyone to contribute.

Role-play Script (for Brad/Kathy)

1.    What role are you playing in the role-play group discussion? Please state the name and title, and describe the role in your own words (a couple of sentences). 
[to be answered by each group member individually and in sequence]

2.    From the perspective of your role, how would you respond to Brad and Kathy’s question about why the disaster happened and how it could have been prevented?
[to be answered by each group member individually and in sequence]

3.    From the perspective of your role, what is your response to Brad and Kathy’s question about how we can ensure future safety and transparency and rebuild trust? Why do you think the approach you suggest is the best one? What do you think are the main barriers to this approach?
[to be answered by each group member individually and in sequence]

4.    What is your overall group recommendation to Brad/Kathy?
[open discussion, anyone can chime in]

One way to ensure students are prepared for the discussion is to assign a few questions from the script as a pre-discussion assignment (short answers). Similarly, to ensure students reflect on the discussion, they can be assigned the last question from the script as a post-discussion exercise. They can also be asked specifically about ethical concepts or concerns related to safety and transparency. 

Ethical Codes and Guidelines

Several different ethical codes or guidelines can be provided to students to prepare for the discussion, or to reflect upon during it, depending on the students’ disciplinary composition. For instance, for implementation in a computing or technology-related course, the ACM and IEEE guidelines may be more informative, and the discussion can center largely on the MCAS (Maneuvering Characteristics Augmentation System) software: how did the algorithm work, why was it implemented, who designed it, why were the pilots not informed about it, etc. 

American Institute of Aeronautics and Astronautics code of ethics:

Air Line Pilots Association (ALPA) Code of Ethics: 

FAA Ethics of Maintenance:

ACM Code of Ethics:

IEEE Code of Ethics:

National Society of Professional Engineers code of ethics:

Background Readings and Resources

One of the goals of this exercise is to motivate students to undertake their own research on the topic to prepare for the role they are playing, but it is important to provide them with preliminary material as a starting point. 


Wall Street Journal report “How Boeing Rocked the Aviation Industry”:

Vox’s “The real reason Boeing’s new plane crashed twice”: 

Bloomberg’s “How Boeing Lost Its Way”:


Johnston, P., & Harris, R. (2019). The Boeing 737 MAX saga: lessons for software organizations. Software Quality Professional, 21(3), 4-12.

Herkert, J., Borenstein, J., & Miller, K. (2020). The Boeing 737 MAX: Lessons for engineering ethics. Science and Engineering Ethics, 26, 2957-2974.

Travis, G. (2019). How the Boeing 737 Max disaster looks to a software developer. IEEE Spectrum, 18.
    A Rebuttal to Travis’ article from ACM Risks Digest: https://catless.ncl.ac.uk/Risks/31/21#subj20

Official information provided by Boeing:

Seattle Times Coverage: 

The New Yorker (in collaboration with ProPublica):
MacGillis, A. (2019). The Case Against Boeing. 

Authorship and Project Information and Acknowledgements

The scenarios and roles were conceptualized and written by Aditya Johri. Feedback was provided by Ashish Hingle, Huzefa Rangwala, and Alex Monea, who also collaborated on initial implementation and empirical research. This work is partly supported by U.S. National Science Foundation Awards# 1937950, 2335636, 1954556; USDA/NIFA Award# 2021-67021-35329. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funding agencies. The research study associated with the project was approved by the Institutional Review Board at George Mason University.