Chapter 2: Laboratory Practices
Chapter 2 of "An Instructor's Guide to Ethical issues in Physics
Section 2.1: Introduction
This chapter addresses ethical issues related to how experiments are performed. A number of well-documented cases can be used to provide students with insight into why carefully designed procedures are important in the responsible conduct of research. Such procedures can both reduce the opportunity for research misconduct and increase the level of confidence others have in the research. For several decades, the computer has been an essential part of physics research, so issues in computational physics will be addressed as well. While the dividing line between laboratory procedures and data management is fuzzy, the latter will be addressed in the next chapter.
It is noteworthy that a clear connection exists between the elements of many scientific codes of ethics and a variety of descriptions of the nature of science as a discipline. In Elements of Ethics for Physical Scientists,[1] Sandra Greer lays the foundation for subsequent discussions by asking the question, “What is Science?” Instructors may want to consider assigning that chapter early in a course on ethics. Alternatively, the American Association for the Advancement of Science has an online book called Science for All Americans. Its first chapter, The Nature of Science,[2] concisely describes what defines science as a discipline. While this information will seem straightforward to the seasoned scientist, it can be helpful to use a reading like this to present these ideas to students in an organized way. Both of these sources can help students address questions like, “What are some of the considerations in designing a good experiment?”
Section 2.2: Fabrication and falsification, and how they harm the scientific community
Most definitions of research misconduct in science refer explicitly to fabrication and falsification. The Office of Research Integrity, whose charge is to direct U.S. Public Health Service research integrity activities, defines fabrication as “making up data or results and recording or reporting them” and falsification as “manipulating research materials, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record.”[3] The use of fabrication or falsification is considered research misconduct for several reasons, which include:
- It misleads other members of the scientific community.
- It undermines trust in the research record.
- It can cause others to waste resources attempting to replicate the falsified or fabricated experiments.
- In some cases, it may have a direct, negative impact on society, as in the case of the alleged link between vaccinations and autism.[4]
In the early 2000s, there were two highly publicized cases in the physics community resulting in findings of research misconduct: the cases of Victor Ninov and Hendrik Schön.
Ninov
Very concisely, Victor Ninov was part of a collaboration at Lawrence Berkeley National Laboratory that in 1999 reported having discovered two new elements, 116 and 118. When other labs were unable to reproduce these results, LBNL re-examined their original data and tried rerunning their experiment. These efforts triggered an internal investigation that found that Ninov had created data for events that did not in fact exist.
Many accounts of the Ninov case are available. David Goodstein has a brief discussion in chapter 6 of his book On Fact and Fraud: Cautionary tales from the front lines of science.[5] While Goodstein’s discussion of the incident is short, it is worth pointing out that he did have access to the investigation report, most of which has remained confidential. Physics Today has a concise but fairly comprehensive treatment.[6] Students reading the New York Times article[7] on this incident will be able to see how technical issues are covered in the popular press and gain insight into how research misconduct may impact public trust in science.
Discussion Suggestions
The ethical breach in this case is obvious: falsification of the research record. One aspect of the case that can be usefully explored in a classroom discussion is what it was about the research environment that might have led Ninov to do what he did. It is likely that Ninov fully expected elements 116 and 118 would eventually be discovered. He presumably also recognized that in the first experiment, he was the only one who had direct contact with the data and with the data analysis program. In the follow-up analysis and experiments, the fact that others had direct contact with the data was integral to their arriving at the conclusion that the research record had been falsified. Thus the Ninov case can be used as a springboard to discuss group dynamics in research collaborations. Collaborations are usually put together because no one person has both the time and expertise to complete all aspects of a project on their own. When trying to maintain the integrity of the project, what are the tradeoffs between relying primarily on trust that each collaborator will act both responsibly and competently, as opposed to building a robust system of independent checks within the group? Some students may be tempted to propose extensive independent checking; they could be reminded that science operates with limited resources, so efficiency is nearly always a concern. Students might also be encouraged to reflect on their personal experiences either in research or working with a partner in an advanced lab course.
A second aspect of this case that can be probed is the response of the research group to concerns that there might be a problem with the experiment. Often, ethics case studies focus on situations involving ethical ambiguity and/or unethical behavior. However, we should not overlook discussions of situations that demonstrate proper actions in challenging settings. For instance, asking the students to discuss how the evidence of falsification was obtained will give them insight into how scientists can police themselves. In addition, the prompt action of Ninov’s collaborators in retracting the publication once they realized the data were problematic shows respect for the publication process.
Schön
Hendrik Schön was a researcher at Bell Labs, focusing primarily on the electrical properties of materials. He appeared to have an extraordinarily productive period there, publishing papers at a very high rate, including several in Nature and Science. There were, however, reproducibility problems with his work—not only were others outside of Bell Labs unable to reproduce much of what he reported, but even colleagues inside Bell Labs were having difficulty getting sufficient cooperation from him to attempt to reproduce his results or to perform independent tests to further study his samples. Concerns were also raised about his results being improbable and about the fact that it appeared he represented a single data set as arising from two different experiments. In 2002, Bell Labs commissioned an external investigation of his work, leading to findings of research misconduct and to his termination.
Three different approaches can be taken to studying this case. The quickest way to get a handle on it is to read two news articles from Physics Today.[8],[9] Both include links to earlier articles in Physics Today that reported on Schön’s research in a favorable light. A second approach is to read the report issued by the investigation committee.[10] While the entire report is 129 pages long, the main body is only nineteen pages, with the rest being appendices, most of which would probably be optional reading. The report describes not only findings but also methods, so it would give students some insight into how research misconduct investigations are conducted.
An in-depth discussion of the Schön case is found in Eugenie Samuel Reich’s book Plastic Fantastic: How the biggest fraud in physics shook the scientific world.[11] Several chapters in this book can be understood independently and hence can be used as moderate-length reading assignments. Of particular relevance to the topic of research misconduct are:
- Chapter 1: The author describes Schön’s strategy of trying to identify the results people are expecting and then producing “data” to meet those expectations. He explores the assumption that someone determined to fabricate results in science cannot be stopped but will always be caught, and he discusses the role of trust in science.
- Chapter 3: Here, the origins of the standards of reproducibility and peer-reviewed publications are discussed. The chapter concludes with an overview of some of Schön’s earlier publications.
- Chapter 4: This chapter gets at the heart of Schön’s research misconduct, describing his interactions with various colleagues and how Schön produced “data” to meet expectations. This discussion helps address a question that many students have raised about this case: how could the fabrication and falsification of data have gone undetected for so long? The key research misconduct issues in this chapter can be understood without having read any of the preceding chapters, although students may find the large cast of characters a little hard to keep track of.
- Chapter 5: This chapter should probably not be read by itself but would fit well with Chapter 4. It gives insight into job-related pressures on Schön to be productive. It also provides a good example of how internal peer review (in this case, prepublication review by Bell Labs colleagues) can help keep bad science from being published.
- Chapters 7 and 8: Both of these chapters highlight failed attempts by other scientists to reproduce Schön’s work. They could probably each be read individually, but the students are more likely to fully appreciate them after having read either Chapter 1 or 3 to lay the foundation. The chapters illustrate how trying to reproduce the work of others is often a natural part of progress in science and how these attempts play a role in detecting bad science.
- Chapter 10: The author provides historical background on fabrication and falsification, including how the David Baltimore case (from the life sciences) has had a lasting impact on the handling of research misconduct in science. There is a brief reference to Irving Langmuir’s Pathological Science (discussed in Section 2.3 below). The chapter concludes with a discussion of some of the first suspicions that Schön might be fabricating or falsifying his data. Understanding this chapter probably requires reading either Chapter 1 or 4 to place it in context.
- Chapter 11: This chapter is a useful follow-up to Chapter 10 but probably won’t work well on its own. It does a good job of chronicling how Schön’s research misconduct was uncovered by other scientists, prompting Bell Labs to initiate the investigation. The coverage of the case is rather brief after that. Instructors interested in learning about the investigation itself are better off reading the report of the investigation committee, cited above.
Discussion Suggestions
The Schön case is particularly well documented. It is common for students to ask how someone could expect to get away with the type of fabrication and falsification evident in Schön’s work. In addressing this, students can be directed to Reich’s book, which describes how Schön carefully assessed expectations in the field before he produced his results. In this regard, he was not that different from Ninov (who fully expected elements 116 and 118 to be discovered eventually).
Another issue that commonly comes up in discussing this case is the length of time Schön was working at Bell Labs before the problems with his data were brought to light. When we rely on reproducibility checks to weed out bad science (whether based on research misconduct or not), there is a limit on how fast the community can respond. It is rare that a lab can check the work of someone else with just a few days of effort. More commonly, equipment must be reconfigured (or even purchased), samples acquired, and time in a busy lab schedule blocked out in order to make the experiment happen. Moreover, there is not much incentive to merely reproduce someone else’s work because such reproduction is not, in most cases, considered publishable. For an instructor planning to cover the Ninov, Schön, and cold fusion cases, it could be interesting to compare the response times of the community in all three cases.
While the responsibilities of coauthors are a publication issue, they are relevant to discuss at this point as well. It is noteworthy that Schön had a large number of coauthors, yet all of them were apparently unaware of his fabrication and falsification of data. There are some challenging issues here of how to balance working with colleagues in an atmosphere of trust against the need to make sure your trust is not misplaced. As students explore this challenge, you can ask them to place themselves in the role of working in a collaboration. One way of helping to strike the balance is to be very open with your collaborators about your methods, your data, your analysis, and your level of confidence in your results. This can help set the standards for sharing in the collaboration.
Section 2.3: Carelessness and how it harms the scientific community
Carefulness is not usually identified as a principle in ethics codes associated with the physical sciences. However, in professions such as medicine and engineering, most would agree that being careful is an ethical requirement. A doctor or an engineer may have lives in their hands, so the requirement for carefulness is obvious. Put another way, the harm carelessness can do to others is readily apparent in those professions.
How about in physics? Is there harm done to others by a careless physicist? As the cases discussed in this section illustrate, the answer can easily be yes. A physicist who performs research carelessly and then disseminates the results wastes not only the resources that were provided to perform the experiments but also the time and resources of other physicists who try to reproduce or extend flawed lines of research.
The scientific community has several mechanisms designed to detect careless research. One line of defense is peer review, both the formal peer review associated with the publication process and the informal peer review that takes place in seminars, conference talks, and casual conversations. While peer review may catch some carelessness, it will not catch all of it. For instance, a reviewer of a manuscript might easily detect a careless experimental design but is less likely to detect a carelessly implemented experimental procedure. A second line of defense is the principle of reproducibility: if no one else can reproduce the results of an experiment, then the original results are called into question. While this can be very effective at identifying careless research, it can also require a lot of resources. Inefficient use of resources has an impact both on the physics community and on society at large.
To be clear, everyone makes mistakes, sometimes due to lack of expertise and sometimes just because humans are fallible. However, when the mistakes could have been easily prevented by more careful attention to experimental design and execution, that is an indication of carelessness. Louise Bezuidenhout has written an article on the importance of routine procedures in developing useful experimental data.[12] The context of the article is a discussion of the Open Data movement, the effort to get scientists to freely share their raw data. The point she makes is that sharing raw data is of little value unless we fully understand the procedures used to generate those data. The article is written from a life-sciences perspective, and its style could have used one more round of editing, but it can still be a valuable addition to an ethics discussion, especially for students just beginning to do research.
Pathological Science
One consequence of carelessness can be self-deception. Irving Langmuir gave a talk on what he termed “pathological science” in 1953. A transcription of his talk appeared in Physics Today.[13] Langmuir discussed the Davis and Barnes experiment, N rays, mitogenetic rays, the Allison Effect, Joseph Rhine’s ESP experiments, and UFO sightings. These examples illustrate the importance of designing an experimental procedure that minimizes the chance of self-deception. While the focus is on experimental procedures, the same principles apply to some extent to the procedures one might choose to employ in data analysis. The history of N rays can be found in several sources, but perhaps the most relevant one is the paper R. W. Wood submitted to Nature documenting the way in which poorly designed experiments left the researchers open to self-deception.[14]
To illustrate debate in the physics community, one could ask students to read a set of letters to the editor in response to the “Pathological Science” article.[15] Clusters of letters in response to articles often appear in Physics Today. Usually they are followed by a response from the article’s author, but that was not possible in this case since Langmuir was no longer alive when “Pathological Science” was published.
Since the experiments described by Langmuir took place well before his talk in 1953, the equipment and related procedures are dated. Some of the terminology may therefore be a bit challenging for students. For the most part, this issue will not prevent students from realizing that most of these experiments are flawed due to their reliance on humans to detect weak signals directly. When students make this connection, they may tend to dismiss these cases as completely irrelevant, since in the modern lab such data would be acquired by instrumentation, not by eye. There are two ways in which these cases have continuing relevance, however. First, they serve as readily understood reminders of the need for vigilance in avoiding self-deception. Second, they remind us that in less formal settings (i.e., outside of the laboratory) it is common for scientists and non-scientists alike to fall victim to self-deception.
Discussion Prompts
- Discuss the role experimental design played in allowing self-deception to develop in the experiments described by Langmuir.
- How would you redesign procedures in these experiments to reduce the likelihood of self-deception, using where possible the same equipment as in the original experiments?
- Are there examples of laboratory experiments or activities that you have been involved with and that were vulnerable to self-deception?
- Discuss how your opinion of “Pathological Science” changed, if at all, after you read the letters to the editor responding to it.
Cold Fusion
On March 23, 1989, University of Utah chemist Stanley Pons announced that he and Martin Fleischmann had observed room-temperature fusion involving deuterium nuclei in a tabletop experiment. They further claimed to have seen significant excess heat generated by their experiment. Soon after that, Steven Jones of Brigham Young University announced that he, too, had observed evidence for fusion, but in a somewhat different tabletop experiment. These were the first of what came to be known as the cold fusion experiments. A concise overview of some essentials of the cold fusion saga can be found in two Physics Today news articles.[16],[17] The Levi article[16] covers the initial announcement, the inconsistency between the reported excess heat flows and the reported rate of neutrons observed, attempts by groups at other universities to replicate the experiment, and theoretical considerations for the experiment. The subsequent article by Goodwin[17] focuses on the difficulties that arose because most information came through press conferences and newspaper reports rather than through refereed journals. The article then recounts requests by Pons, Fleischmann, and the University of Utah for federal funding to pursue this research. Glenn Seaborg suggested to Energy Secretary James Watkins that a committee be formed to study the validity of the cold fusion claims. The resulting DOE report noted that tens of millions of dollars had been spent on cold fusion research in the short time since Pons and Fleischmann’s announcement. The report concluded that there was no convincing evidence that excess heat generation would be sufficient for commercial applications or that excess heat generation was linked to a nuclear process. Additional discussion can be found in letters to the editor.[18],[19]
Chapter 5 of Goodstein’s book On Fact and Fraud provides a different perspective on the story.[5] In addition to covering much of what is covered in the Physics Today articles, Goodstein goes into some detail describing the activities of a Caltech research group, members of which he knew personally, as they prepared for and presented their conclusions at a May 1, 1989 session of an American Physical Society meeting. They reported that the original experiments could not be replicated and that the claims of Pons and Fleischmann were inconsistent with well-established nuclear theory. Interestingly, Goodstein then discusses at length the work of an Italian group led by Franco Scaramuzzi that seemed to find evidence for some form of cold fusion, although not necessarily by the mechanism that Pons and Fleischmann originally reported. In a classroom discussion of this chapter, it can be useful to compare the way in which Pons and Fleischmann approached cold fusion research with the way Scaramuzzi’s group approached it.
A review of Goodstein’s book appeared in Physics Today.[20] This review was followed a few issues later by two letters to the editor and a response by the author of the review, with the focus being on reproducibility.[21]
John Huizenga, co-chair of a Department of Energy committee tasked with investigating reports of cold fusion, wrote a book on the cold fusion saga entitled Cold Fusion: The Scientific Fiasco of the Century.[22] As a physicist, he is able to provide relevant technical background in a way that an undergraduate physics student can follow. While the entire book is worth reading, it is possible to lead discussions based on selected chapters. Most of the chapters can be understood and put into context based on Chapter I.
- Chapter I provides a summary of the relevant nuclear physics and of the key cold fusion claims that were made in 1989. As such, it is very helpful to read in conjunction with almost any of the remaining chapters.
- Chapter II provides historical background on fusion studies as well as the evolution of the relationship between the Pons and Fleischmann group at the University of Utah and the Jones group at Brigham Young University. It is not essential reading in the context of ethics.
- Chapter III discusses, among other topics, the casual peer review process for the Pons and Fleischmann paper on their experiment. It also goes into some detail about the challenges of doing calorimetry experiments properly. This chapter can be useful in a discussion of the importance of carefully designed experiments.
- Chapters IV and V have interesting discussions of the role of government in funding research, but they are not directly relevant to experimental procedures.
- Chapter VI covers results obtained by other groups during the two months that followed the Pons and Fleischmann announcement. This chapter provides examples of scientific debate at its best, such as research groups trying to verify results reported by others. It also provides examples of scientific debate gone astray by describing groups that reported then retracted results prior to submitting them for peer review. Chapter VI also highlights the general lack of reproducibility in cold fusion experiments, in part due to the reluctance of Pons and Fleischmann to provide details of their procedure.
- Chapter VII discusses the conclusions of the committee co-chaired by Huizenga and Norman Ramsey. It covers the same issues as Chapter VI, but from a somewhat different perspective. Chapter I in combination with either Chapter VI or Chapter VII would form a good foundation for a classroom discussion of cold fusion.
- Chapter VIII focuses on the experiments done to search for the products of the hypothesized fusion reactions. It addresses both issues of reproducibility and issues that arise when some research groups keep key procedural information concealed from others. Chapters VI-VIII serve to illustrate that scientific research is fundamentally a community activity. Research is often done in groups, and our knowledge base is expanded by multiple groups investigating the same phenomenon. As a result, the scientific community has established norms for careful and peer-reviewed reporting of experimental procedures and results. Classroom discussion could focus on connecting what was learned about ethical codes (see Chapter 1 of this Guide) to the cold fusion story in order to provide further insight into why the physics community has these standards.
- Chapter IX discusses a conference on cold fusion co-sponsored by the National Science Foundation. While it makes interesting reading, it does not cover much ethical territory not already addressed in the other chapters.
- Chapter X looks at the interplay between funding, regional influence, and science. The author documents a regional bias in the reporting of positive cold fusion results, with such results most often arising from Utah or Italy. The regional enthusiasm for cold fusion appears to have led some researchers to come up with ad hoc reasons to explain the discrepancy between their findings and those from outside these regions. The chapter also addresses the importance of peer review by describing the external evaluation of the National Cold Fusion Institute.
- Chapter XI makes an interesting comparison between cold fusion and another discredited phenomenon, polywater. However, the material in this chapter probably cannot be fully appreciated without reading much of the rest of the book.
- Chapter XII likewise does a nice job of putting cold fusion in the context of Langmuir’s discussion of pathological science but probably requires reading most of the rest of the book to really appreciate.
- Chapter XIII describes a number of lessons that can be learned from the cold fusion story. While many of these lessons do not directly relate to experimental procedures, this chapter can be a valuable way to conclude a reading assignment that consists of selected chapters from Huizenga’s book.
There are two other noteworthy books on cold fusion that might be relevant to a discussion of ethics: Frank Close’s Too Hot to Handle: The Race for Cold Fusion[23] and Gary Taubes’s Bad Science: The Short Life and Weird Times of Cold Fusion[24]. The book by Close is a good choice for an instructional setting where you can assign the entire book. Its technical detail is at a level similar to Huizenga’s book. The book by Taubes is written from a journalist’s perspective, so students who read it may not connect as closely with the science behind the experiments; on the other hand, it does provide a valuable perspective. In an ideal situation, a classroom discussion could be based on each student having read one of these three books so that all three perspectives are present.
Discussion Prompts
- What are some of the concerns outside groups had regarding the design of the Pons and Fleischmann experiments?
- How did the actions of Pons and Fleischmann inhibit peer review of their experimental procedures?
- In what sense is research a community activity? How can this community aspect be used to improve the quality of experiments?
Section 2.4: Computational physics
Computational physics often takes the form of developing a model of a physical system, reducing that model to a set of mathematical equations, and employing a computer to find numerical solutions to those equations. In the field of computational physics, one might consider the computer(s) to be analogous to the lab and the computer coding analogous to an experimental procedure. The endpoint of carrying out the experimental procedure or running the computer code is a body of data, which then must be interpreted. Just as it is possible to falsify results in experiments, it is also possible to falsify results in computations. Just as it is possible to intentionally design an experimental procedure to generate misleading results, it is possible to intentionally construct a poor model or mathematical rendering of the model to get misleading computational results. Careless computer coding can cause the same inaccuracies as careless experimental execution.
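To make the analogy concrete, here is a minimal sketch in Python (the decay model, parameter values, and function name are illustrative assumptions, not taken from any work cited in this chapter). It reduces a simple physical model to an equation, solves the equation numerically, and then checks the numerical result against the known analytic solution so that a careless choice of procedure, such as too coarse a time step, is caught rather than silently contaminating the “data.”

```python
import math

def simulate_decay(n0, decay_rate, t_final, dt):
    """Model: dN/dt = -decay_rate * N, integrated with the forward Euler method.

    The step size dt is part of the computational procedure and should be
    documented and justified, just as an experimental protocol would be.
    """
    n, t = n0, 0.0
    while t < t_final:
        n += -decay_rate * n * dt  # forward Euler update
        t += dt
    return n

if __name__ == "__main__":
    n0, rate, t_final = 1000.0, 0.5, 4.0
    exact = n0 * math.exp(-rate * t_final)  # analytic solution of this simple model

    # A built-in check: rerun with successively smaller steps and confirm that
    # the numerical result converges toward the known solution. Omitting a
    # check like this is the computational analogue of carelessly executing
    # an experimental procedure.
    for dt in (0.5, 0.1, 0.01):
        approx = simulate_decay(n0, rate, t_final, dt)
        rel_err = abs(approx - exact) / exact
        print(f"dt = {dt:5.2f}  N(t_final) = {approx:8.3f}  relative error = {rel_err:.4f}")
```

The convergence loop plays the role of a control experiment: with the coarsest step the result is off by roughly a quarter, and that error becomes visible only because the check was built into the procedure.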
Since computational physics is a much newer field than experimental physics, far less has been written about it, and ethical codes tend to overlook it. There is, however, an interesting article about computational modeling in mechanical engineering that covers much the same territory that one on computational physics might.[25] The authors develop ethical standards for developing and using computational models and simulations, basing their standards in part on the results of consulting experts in computation in various fields of engineering and science. The authors illustrate the importance of some of their proposed standards by briefly summarizing real cases in which developers or users failed to follow the standards, with disastrous consequences. The paper at times assumes that the code developer and the end user are two different people, which is often not true in computational physics. Even so, some of the considerations raised in the paper are helpful in cases where a computational physicist later passes their own code on to someone else, such as a new member of their research group. Issues addressed by the authors include care in model development, choosing appropriate algorithms to solve the model equations, documenting the code properly, validating the code thoroughly, and representing the results fairly.
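The distinction raised in the first discussion prompt below, between verification and validation, can also be made concrete with a short sketch. The Python fragment that follows is a hypothetical example (the drag-free projectile model, the function names, and the tolerances are assumptions chosen purely for illustration, not anything taken from the article): verification asks whether the code faithfully solves the equations of its model, for instance by comparison against an analytic special case, while validation asks whether that model describes the real system, which requires comparison against measured data.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def simulated_range(v0, angle_deg, dt=1e-4):
    """Integrate drag-free projectile motion with a simple Euler scheme and
    return the horizontal range (distance traveled when y returns to zero)."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        x += vx * dt
        vy -= G * dt
        y += vy * dt
        if y <= 0.0:
            return x

def verify(v0=20.0, angle_deg=30.0):
    """Verification: does the code correctly solve the equations it claims to
    implement? Compare against the analytic range of the drag-free model."""
    analytic = v0 ** 2 * math.sin(math.radians(2 * angle_deg)) / G
    numeric = simulated_range(v0, angle_deg)
    assert abs(numeric - analytic) / analytic < 1e-2, "code does not solve its own model"

def validate(measured_range, v0, angle_deg, tolerance=0.05):
    """Validation: does the model itself (here, no air resistance) describe the
    real launch well enough? That requires a measurement the code cannot supply."""
    predicted = simulated_range(v0, angle_deg)
    return abs(predicted - measured_range) / measured_range < tolerance

if __name__ == "__main__":
    verify()  # passes: the code is faithful to the drag-free equations
    # A validate(...) call would need real launch data; a failure there would
    # point at the model (or the data), not necessarily at the code.
    print("verification check passed")
```

A code can pass verification and still fail validation if the model leaves out physics that matters (air resistance, in this hypothetical), which is one reason that care in model development and thorough testing of the code appear as separate items in the list above.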
Discussion Prompts
- What is meant by verification of code, and how does it differ from validation?
- The authors lay out cases for requiring developers to disseminate their codes and for allowing developers to choose not to disseminate their codes. Summarize these two positions and then discuss which is more persuasive and whether your answer depends on what type of problem is being addressed computationally.
- If you are developing a code to analyze a problem computationally, and you are the only one planning on using the code, what sort of documentation should you have for the code, both within the program itself and external to the program?
- How would your answer to the previous question change, if at all, if you knew that you were going to share the code with other users?
- Which of the issues the authors raise regarding proper presentation of computational results are similar to issues commonly raised regarding proper presentation of experimental results?
Section 2.5: Laboratory safety
When one looks at the broad spectrum of physics experiments, many hazards may be present: high voltage, lasers or other intense light sources, radioactive sources, strong magnets, etc. Adherence to appropriate safety standards, setting a good example, teaching others about safety standards as appropriate, and preventing injury to those who are not a part of the research group are among the ethical principles associated with experimentation.
The consequences of giving insufficient attention to safety can be severe. A letter to the editor of Physics Today describes an accident involving shop equipment that led to the death of a high school physics student.[26] In July of 2004, an intern at Los Alamos National Laboratory was exposed to a brief, intense flash of laser light. The exposure resulted in a small hole being burned into her retina and a consequent degradation of her eyesight. A lengthy investigative report brought up numerous problems with safety procedures in the lab.[27] The report may be a bit much for some undergraduate students to digest, but its critique of the laboratory in question will provide a sense of what a robust safety program should look like. Two employees lost their jobs as a result of this incident.[28]
Richard Feynman relates a story about enriched uranium during the Manhattan Project.[29] Oak Ridge was designing a plant for enriching uranium, and some people involved in the project did not fully appreciate the danger of allowing too much enriched uranium to accumulate in a small space. Despite the fact that the Army wanted to keep this highly classified information compartmentalized, Feynman convinced the officer in charge that safety procedures would be followed more closely if the people working on the plant were given enough background about the nature of nuclear reactions to understand why this aspect of safety was so important. Once the plant workers understood why these procedures were needed, compliance improved significantly.
As far as resources are concerned, research-oriented universities generally have formal safety policies and procedures and one or more safety offices. Numerous safety manuals are available online. For instance, as of this writing, Princeton has a publicly available, comprehensive manual.[30] Due to its length, it would not be feasible to have students read the entire manual. Having the students read one or two sections (and perhaps having different students read different sections) will be enough to promote a discussion of general safety considerations.
Discussion Prompts
- Discuss your institution’s safety policies and procedures.
- Discuss the role of mentoring in promoting a culture of safety. In what contexts, if any, can students serve as mentors?
- Has there ever been a safety procedure you were told to follow whose purpose was not clear to you? If so, did that affect your motivation to follow the procedure?
- If you thought your supervisor was encouraging you to bypass some safety procedures, what options would you have for handling the situation?
Section 2.6: How common is research misconduct in physics?
There are two well-known cases from the field of physics in which findings of research misconduct have been documented: the Schön and Ninov cases, both of which were discussed earlier in this chapter. When addressing well-documented instances of misconduct, it is helpful to keep the perspective that these cases are outliers, both in terms of the impact they have had and the extent to which they are documented. Even though they are more dramatic than most cases of misconduct, they are worth studying because we know so much about them.
To get a feel for how common research misconduct is in the field of physics, it is helpful to examine survey data. A survey of physicists in the United States was taken in 2004 in order to learn about perceptions related to ethics in the community.[31] The response rate of early-career physicists (PhD students and those having recently received a PhD) was higher than that of other categories, and 39% of them reported having direct knowledge of ethical transgressions related to research. They also reported serious concerns over mistreatment of subordinates and how this mistreatment could, at times, foster research misconduct. The article about the survey includes both quantitative and qualitative results from the research. A more recent survey took a different approach, using a much smaller number of subjects but involving in-depth interviews. This survey, of physicists in the United States and the United Kingdom, focused on whether there is a gender difference in the perception of ethical issues.[32] Among the findings were that some physicists view women in physics as being more ethical and men as being more competitive. The interviews indicated a tendency to perceive competitive behavior as having an inherently unethical component. These researchers also found that physicists encounter a wide range of ethically ambiguous situations. The detailed results appear in Science and Engineering Ethics,[33] while a brief version is found in a Physics Today commentary.[34]
Discussion Prompts
- After reviewing the APS Guidelines on Ethics, discuss whether competition among physicists has an inherently unethical component. If your conclusion is no, then discuss whether competition has a tendency to promote unethical behavior.
- Have you ever encountered an ethically grey area in a research or instructional lab? If so, how did you handle it?
- Some early-career physicists in the 2004 survey reported having been pressured into acts of research misconduct by their supervisors. What approaches are there for an early-career physicist to take in order to deal with such pressure? What, if anything, can the APS do to address this issue?
Continue to Chapter 3: Data - Recording, Managing, and Reporting
Acknowledgment
The author is grateful for the time and effort of the anonymous reviewers of this work, and for their numerous helpful suggestions.
[1] Sandra C. Greer, Elements of Ethics for Physical Scientists (MIT Press, Cambridge, MA, 2017).
[2] American Association for the Advancement of Science, Science for All Americans. Chapter 1: The Nature of Science. (American Association for the Advancement of Science, Washington, DC, 1989). http://www.project2061.org/publications/sfaa/online/chap1.htm (accessed September 20, 2019).
[3] The Definition of Research Misconduct, The Office of Research Integrity, https://ori.hhs.gov/definition-misconduct (accessed September 20, 2019).
[4] Jonathan D. Quick and Heidi Larson, “The Vaccine-Autism Myth Started 20 Years Ago. Here’s Why it Still Endures Today,” Time February 18, 2018. https://time.com/5175704/andrew-wakefield-vaccine-autism/ (accessed September 18, 2019).
[5] David Goodstein, On Fact and Fraud: Cautionary tales from the front lines of science, (Princeton University Press, Princeton, NJ, 2010).
[6] Bertram Schwarzschild, “Lawrence Berkeley Lab Concludes that Evidence of Element 118 Was a Fabrication,” Physics Today 55 (9) 15 (2002). https://physicstoday.scitation.org/doi/full/10.1063/1.1522199
[7] George Johnson, “At Lawrence Berkeley, Physicists Say a Colleague Took Them for a Ride,” New York Times October 10, 2002. https://www.nytimes.com/2002/10/15/science/at-lawrence-berkeley-physicists-say-a-colleague-took-them-for-a-ride.html
[8] Barbara Gross Levi, “Bell Labs Convenes Committee to Investigate Questions of Scientific Misconduct,” Physics Today 55 (7) 15-16 (2002). https://doi.org/10.1063/1.1506737
[9] Barbara Gross Levi, “Investigation Finds that One Lucent Physicist Engaged in Scientific Misconduct,” Physics Today 55 (11) 15-17 (2002). https://doi.org/10.1063/1.1534995
[10] Lucent Technologies, “Report of the Investigation Committee on the Possibility of Scientific Misconduct in the Work of Hendrik Schön and Coauthors, September 2002,” https://media-bell-labs-com.s3.amazonaws.com/pages/20170403_1709/misconduct-revew-report-lucent.pdf (accessed September 28, 2019).
[11] Eugenie Samuel Reich, Plastic Fantastic: How the biggest fraud in physics shook the scientific world, (Palgrave Macmillan, New York, NY, 2009).
[12] Louise Bezuidenhout, “Variations in Scientific Production: What Can We Learn from #Overlyhonestmethods?,” Science and Engineering Ethics 21 (6) 1509-1523 (2015). https://doi.org/10.1007/s11948-014-9618-9
[13] Irving Langmuir and Robert N. Hall, “Pathological Science,” Physics Today 42 (10) 36 (1989). https://doi.org/10.1063/1.881205
[14] R. W. Wood, Nature 70 530-531 (1904). https://www.nature.com/articles/070530a0.pdf.
[15] Christopher Cooper et al., “Second Opinions on ‘Pathological Science’”. Physics Today 43 (3) 13-14 and 105-112 (1990). https://doi.org/10.1063/1.2810480
[16] Barbara G. Levi, “Doubts Grow as Many Attempts at Cold Fusion Fail,” Physics Today 42 (6) 17-19 (1989). https://doi.org/10.1063/1.2811042
[17] Irwin Goodwin, “Fusion in a Flask: Expert DOE Panel Throws Cold Water on Utah ‘Discovery’,” Physics Today 42 (12) 43-45 (1989). https://doi.org/10.1063/1.2811241
[18] W. Peter Trower, “Cold Fusion as Seen with X-Ray Vision,” Physics Today 43 (7) 13 (1989). https://doi.org/10.1063/1.2811074
[19] Leaf Turner et al., “Thoughts Unbottled by Cold Fusion,” Physics Today 42 (9) 140-144 (1989). https://doi.org/10.1063/1.2811168
[20] Bernard J. Feldman, “On Fact and Fraud: Cautionary Tales from the Front Lines of Science” (book review). Physics Today 63 (7) 50 (2010). https://doi.org/10.1063/1.3463629
[21] Fred McGalliard, Scott R. Chubb, and Bernard J. Feldman, “Cold Fusion and Reproducibility,” Physics Today 63 (11) 11 (2010). https://doi.org/10.1063/1.3518197
[22] John R. Huizenga, Cold Fusion: The Scientific Fiasco of the Century, (University of Rochester Press, Rochester, NY, 1992).
[23] Frank Close, Too Hot to Handle: The Race for Cold Fusion (Princeton University Press, Princeton, NJ, 1991).
[24] Gary Taubes, Bad Science: The Short Life and Weird Times of Cold Fusion (Random House, New York, NY, 1993).
[25] David J. Kijowski, Harry Dankowicz, and Michael C. Loui, “Observations on the Responsible Development and Use of Computational Models and Simulations,” Science and Engineering Ethics 19 (1) 63-81 (2013). DOI:10.1007/s11948-011-9291-1
[26] Irving E. Dayton, “Student lab safety emphasized,” Physics Today 64 (8) 9 (2011). https://doi.org/10.1063/PT.3.1196
[27] Connon R. Odom, “Los Alamos Laser Eye Injury Investigation,” LA-UR-05-0282 (2004). https://permalink.lanl.gov/object/tr?what=info:lanl-repo/lareport/LA-UR-05-0282
[28] Rebecca Trounson, “Safety and Security Breaches at Los Alamos Nuclear Lab Cost 5 their Jobs,” Los Angeles Times, September 16, 2004. https://www.latimes.com/archives/la-xpm-2004-sep-16-na-alamos16-story.html
[29] Richard P. Feynman, “Surely You’re Joking, Mr. Feynman!”: Adventures of a curious character (W. W. Norton & Company, New York, NY, 1985), pp. 120-125.
[30] Princeton University Office of Environmental Health and Safety. Laboratory Safety Manual. https://ehs.princeton.edu/laboratory-research/laboratory-safety/laboratory-safety-manual (accessed September 2, 2019).
[31] Kate Kirby and Frances A. Houle, “Ethics and the Welfare of the Physics Profession,” Physics Today 57 (11) 42-46 (2004). https://doi.org/10.1063/1.1839376
[32] Elaine Howard Ecklund, “A Gendered Approach to Science Ethics for US and UK Physicists,” Science and Engineering Ethics 23 (1) 183-201 (2017). https://doi.org/10.1007/s11948-016-9751-8
[33] David R. Johnson and Elaine Howard Ecklund, “Ethical Ambiguity in Science,” Science and Engineering Ethics 22 989-1005 (2016). https://doi.org/10.1007/s11948-015-9682-9
[34] Elaine Howard Ecklund, David R. Johnson, and Kristin R. W. Matthews, “Commentary: Study highlights ethical ambiguity in physics,” Physics Today 68 (6) 8 (2015). https://doi.org/10.1063/PT.3.2796