Hurricane Katrina, Large Socio-Technical Systems, and Social Responsibility
Engineers are often charged with focusing on individual pieces of a system, but what happens when the whole system fails? The infrastructure meant to protect residents of New Orleans from hurricanes failed during Hurricane Katrina. Part of the Research, Ethics, and Society project, this case can be used to explore how understanding technological systems is an important tool for acting on social responsibilities.
- How might engineering organizations have acted differently before Hurricane Katrina to improve outcomes for the people in New Orleans?
When Hurricane Katrina struck New Orleans in 2005, it became one of the most devastating and complex disasters in U.S. history. The city's Hurricane Protection System (HPS) proved ineffective, but not because of one specific design flaw. Rather, a series of technical and organizational problems surfaced together, resulting in catastrophic flooding of parts of the city. These problems might have been reduced through improved coordination among engineering professionals and the organizations they work for.
Flood Control as a System
The HPS is a large socio-technical system. Its physical infrastructure consists of flood-resistant buildings, flood control levees, reservoirs, and pump stations, overseen by a myriad of organizations. For example, the New Orleans Levee District maintains the levees and floodwalls within the city, while the Army Corps of Engineers monitors the status of levees built with federal money and develops flood models. A lack of coordinated oversight led to breached levees, collapsed floodwalls, and overwhelmed pump stations when Hurricane Katrina made landfall. The levees along the Mississippi River held, but several breaches in floodwalls quickly submerged large sections of New Orleans. Suddenly, residents who needed to evacuate could not leave, and first responders who needed to enter were unable to do so.
Reducing System Failures
Katrina is a complex case, and identifying all the problems that led to more than 1,500 deaths and widespread destruction is difficult. Byron Newberry, an engineering professor at Baylor University, argues that organizational communication problems led to safety issues in several key areas:
Poor communication between government agencies
Roads, rail lines, and pipelines pass through levees in hundreds of places. At the time of Katrina, these intersections had gates that workers were supposed to move into place under flood conditions. However, some of the closure systems were missing or inoperable and offered little resistance to floodwaters. In one instance, five agencies, including the Army Corps of Engineers, shared responsibility for the upkeep and safety of a levee-railroad line interface. In the event of a hurricane, someone was supposed to position a steel floodgate to seal the levee shut. After the storm, investigators could not determine which organization had this task. Better communication among engineers in these five agencies might have prevented this failure.
Lack of communication between engineering firms
If two components of a complex system do not interface well, both physically and organizationally, the overall system may fail. In some areas of New Orleans, different engineering firms constructed adjacent flood control structures. When the firms responsible for these “side-by-side” structures did not communicate with each other, the result was a patchwork of flood barriers with different top elevations and building materials. Many breaches occurred at these poorly matched interfaces between individual segments of the levee system. Coordination between engineers across firms during design and construction could have addressed these kinds of issues.
Lack of institutional memory within an organization
As long-term projects move forward, individual engineers and firms may leave a project, for example through retirement. Departing engineers take their knowledge with them, leading to a loss of organizational memory. Newer engineers may not know the reasoning behind established procedures, and without those explanations they often design and plan on the premise that "we have always done it this way."
For example, engineers originally designed the levees to stop flooding from the Mississippi River. Later generations of engineers viewed the levees as a first line of defense against flooding from a hurricane, even though the earlier engineers had not designed them to protect against ocean flooding. Engineers (and local governments) who did not recognize these design limits failed to revisit levee design or to monitor the levees. Better communication among engineers within an organization, across generations, could decrease the odds of this kind of system failure.
To begin fulfilling their social obligations, engineers need to recognize the pitfalls of complex systems and the atypical kinds of system failures they can produce. Addressing these pitfalls through inter- and intra-organizational communication may help reduce failures in large socio-technical systems.
Shared from CITI Program.
This case is based upon work supported by the National Science Foundation under Grant No. 1033111. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Our project team and advisory board read many drafts and provided important insights. Project team: Heather Canary, Joseph Herkert, Jameson Wetmore, Ira Bennett, and Jason Borenstein. Advisory board: Joan Brett, Jim Svara, Richard Fish, Juergen Gadau, Shelli McAlpine, Timothy Newman, Byron Newberry, Patrick Phelan, and Petra Schroeder.