Kenneth L. Carper's Commentary on "Owning up to a Failure"

The fundamental moral concept of honesty is at stake in this case study. Norm Nash, representing the position of management, has decided to deny the possibility of a defective product. The decision is based on concern for public image and ignores the technical opinion of Walt Winters, one of the firm's engineers.

Winters' silence is probably appropriate in the first meeting with the client. His position is one of technical support, not public relations. Also, his suspicions are not yet confirmed, and a premature contradiction of Nash's statement is unwarranted. Winters is correct to raise his objections directly with Nash following the meeting with the client.

Norm Nash's reaction is unfortunate, and Walt Winters should be distressed by it. His first move should be to disassemble the equipment, if possible, to confirm his diagnosis. If the evidence supports his hypothesis, he should then press Nash vigorously to deal honestly with the client.

While this one experience with one executive may not be indicative of the attitudes of all management executives in the corporation, Winters should observe corporate management decisions carefully for other moral deficiencies. The notion that this is merely a "management problem" of little concern to technical staff can lead to serious consequences. If management decisions routinely overrule factual technical information, placing public relations above honesty, the stage has been set for potential moral disaster. There are many examples from all engineering disciplines. One well-documented case is Morton Thiokol's treatment of the events leading up to the Challenger Space Shuttle accident (Boisjoly 1987).

One puzzling question comes to mind: What is the cost of honesty here? The relationship between R&M and XYZ is firmly established, based on years of reliable service. An honest admission of equipment failure will not damage such a relationship. Confidence is built, not destroyed, by honesty and integrity. Under Nash's denial, however, the client is left with unanswered questions: Is this an equipment deficiency? Is it an installation problem? Did the breakdown occur because of operator error or improper maintenance? These unanswered questions may lead to suspicions, and they are far more likely to undermine client confidence than an honest admission of potential manufacturing defects. And Nash has already agreed to replace the equipment at no cost to the customer. What possible economic cost could honesty demand beyond this?

It is precisely the lack of economic cost that makes this case so disturbing. The lessons for Winters, potentially a future manager, are clear: If honesty can be compromised in such a trivial instance, why should one insist on integrity when the costs are high? Honesty is not always this inexpensive. Sometimes it costs a great deal. When the stakes are high, surely it will be easier to dismiss moral commitments.

The image of infallibility cultivated by managers like Nash, and their unwillingness to admit fault, lead to unrealistic expectations among clients. When failures do occur, society is unprepared for the consequences.

The concept of risk is not at all well understood by the public (Martin and Schinzinger 1989). Instead of providing assistance in understanding this concept, many engineers and managers like Nash have encouraged unrealistic expectations by their attitudes. The public has become more intolerant of failure and more suspicious of the technical experts who are unable to deliver the promised risk-free society.

In fact, the very foundation of engineering design rests on trial-and-error experience. The state of the art cannot advance without failure (Petroski 1985). If failures never occur, the implication is that technology is not advancing. When products do not fail once in a while, one must conclude that they are inefficient and over-designed.

Technical professionals and product manufacturers have a clear ethical responsibility to communicate honestly about failures, thus contributing to the safety and reliability of products and the advancement of engineering design practice (Carper 1989, 1986, Gnaedinger 1987). Admittedly, this communication has been greatly hindered by the expanding litigiousness of contemporary American society.

Finally, some additional questions ought to be considered. It has been noted that the cost of honesty is very small in this case. What if the anticipated cost were higher? What if XYZ were a prestigious new client, with no established business relationship? An honest admission of fallibility might destroy the relationship in its infancy, with implications for many employees of R&M. What if the equipment failure had resulted in great economic losses to XYZ because products and other equipment were damaged? What if serious injuries, or even deaths, were caused by failure of this equipment? Should the actions of Nash and Winters be any different?

Do these more serious consequences and potential costs create an intrinsically different moral situation, or is the situation merely made more complex by the legal implications? Does the fear of litigation dictate the appropriate moral response?

Unfortunately, the example provided by Norm Nash gives Walt Winters very little to encourage principled moral reasoning.

Suggested Readings:

  1. Boisjoly, R. M. 1987. "Ethical Decisions: Morton Thiokol and the Space Shuttle Challenger Disaster," presented at the Winter Annual Meeting, American Society of Mechanical Engineers, Boston, MA, December 13-18.
  2. Carper, Kenneth L., ed. 1989. Forensic Engineering, Elsevier Science Publishers, New York, NY, pp. 1-31, 347-348.
  3. Carper, Kenneth L., ed. 1986. Forensic Engineering: Learning from Failures, American Society of Civil Engineers, New York, NY.
  4. Gnaedinger, John P. 1987. "Case Histories: Learning from Our Mistakes," Journal of Performance of Constructed Facilities, American Society of Civil Engineers, New York, NY, Vol. 1, No. 1, pp. 35-47.
  5. Martin, Mike W., and R. Schinzinger. 1989. Ethics in Engineering (2nd edition), McGraw-Hill, Inc., New York, NY, pp. 106-142.
  6. Petroski, Henry. 1985. To Engineer Is Human, St. Martin's Press, New York, NY.