Chapter 3: Data: Recording, Managing, and Reporting

Description

Chapter 3 of "An Instructor's Guide for Ethical Issues in Physics."

Body

Section 3.1: Introduction

This chapter looks at several aspects of recording, managing, and reporting data. In many cases, concrete, specific standards are lacking even though there are widely accepted general principles. To provide more insight into the general principles, the chapter concludes with a discussion of three well-documented case studies. Instructors planning to address any of the issues outlined in Sections 3.1 – 3.5 may wish to look ahead to the case studies to see if any of these cases will support their intended focus.

The National Science Foundation has a policy on Data Management Plans that provides a useful starting point for discussion of data-related issues.[1] The policy indicates that prompt dissemination of results is an expectation of all grant awardees. Investigators are also expected to share their primary data with others, although there are a few exceptions to this part of the policy. There must also be a plan for retaining data for an appropriate length of time. One of the goals of this policy is to ensure that investigators have the information needed to answer questions about their disseminated research. The APS Guidelines on Ethics[2] cover similar issues in Section I: The Research Record and Publication Results, Subsection: Research Results.

Discussion Prompt

What are the areas of overlap between the NSF Data Management Policy and the APS Guidelines on Ethics?

Section 3.2: The lab notebook

There was a time when details about experimental procedures and results were all recorded in a single place, the lab notebook. A variety of guidelines, such as numbering your pages and writing only in ink, were designed to provide a lasting record of the experimental details so that they could be studied later both by the investigators and by others outside the research group. If someone were to question the legitimacy of a published result, the lab notebook could be referred to as the best available documentation for what happened during the experiment. The traditional lab notebook was an essential element in maintaining trust within the scientific community. As electronic instruments began taking over the duties of recording data, it became possible to accumulate larger quantities of data. This development in turn led to the accumulation of bodies of data so large that they would not easily fit into a lab notebook. Now it is common for most data to reside in computer memory while other information about the underlying experiments may be found either in a traditional handwritten lab notebook or in a file, either on a lab-based computer or on a remote server in the cloud. Currently, no single approach is used throughout all of physics for collecting and archiving data.

Similarly, those in the physics community who continue to use lab notebooks do not appear to have a formal consensus on how they should be structured. An article in the American Journal of Physics identifies a fair amount of diversity in the structure of lab notebooks.[3] Some standard procedures for hardcopy lab notebooks include using a bound book with numbered pages, writing in ink, deleting information only by crossing out (not by erasing), dating entries, and writing out complete descriptions of experiments. These procedures are designed to maintain the integrity of the research record and to make it easy for others to review that record. Recognizing that no system is foolproof, a reasonable goal for electronic forms of lab notebooks is that they be as difficult to tamper with as their hard copy predecessors.
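
One concrete illustration an instructor might offer, not drawn from the article cited above, is that electronic notebooks can be made tamper-evident by chaining entries together with cryptographic hashes, so that altering or deleting any earlier entry invalidates everything recorded after it. The Python sketch below is a minimal example of this idea using only the standard library; the entry fields and sample text are hypothetical.

```python
import hashlib
import json
from datetime import datetime, timezone

def add_entry(notebook, text):
    """Append an entry whose hash covers the previous entry's hash,
    so editing or removing any earlier entry breaks the chain."""
    prev_hash = notebook[-1]["hash"] if notebook else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "text": text,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    notebook.append(entry)
    return entry

def verify(notebook):
    """Recompute every hash; return False if any entry has been altered."""
    prev_hash = "0" * 64
    for entry in notebook:
        payload = json.dumps(
            {k: entry[k] for k in ("timestamp", "text", "prev_hash")},
            sort_keys=True,
        ).encode()
        if (entry["prev_hash"] != prev_hash
                or entry["hash"] != hashlib.sha256(payload).hexdigest()):
            return False
        prev_hash = entry["hash"]
    return True

notebook = []
add_entry(notebook, "Calibrated photodiode against reference source.")
add_entry(notebook, "Run 1: laser power 5 mW, room temperature 21.3 C.")
print(verify(notebook))          # True
notebook[0]["text"] = "edited"   # simulate later tampering
print(verify(notebook))          # False
```

As with the hardcopy guidelines, a scheme like this does not make tampering impossible; it only makes undisclosed changes detectable, which is the practical goal stated above.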

While students are used to owning the notebook they keep for lab-based course work, it is important for them to recognize that the research lab notebook is often considered to be the property of the lab, not of the individual who makes entries into it. As such, it should be understandable to others working in that lab, and it should remain in the lab, even after that individual leaves the institution.

Discussion Prompts

  1. If you were to design an electronic lab notebook using commonly available software (such as Microsoft Word), how would you design it to maintain the integrity of the research record to the extent that the traditional hard copy version can? Or is this not possible?
  2. Research commonly-available platforms for maintaining electronic lab notebooks and explore what measures they have in place to maintain the integrity of the research record.
  3. Is it necessary for theorists and computational physicists to maintain an equivalent to the lab notebook?

Section 3.3: Data management and archiving

As noted in Chapter 2, expanding the frontiers of scientific knowledge is a community activity. This is an important principle to keep in mind when acquiring and archiving data. Data should be recorded in a way that is not only understandable to the individual, but also to that person’s collaborators and to others outside the immediate circle. Some inexperienced researchers, in their haste to push through experiments, may make some of the following mistakes:

  • They may not record all of the relevant experimental parameters in a way that clearly connects them to a data set, with the result being that the data set is useless when detailed analysis is later performed.
  • They may organize their data sets in a way that is difficult for others to understand, making the data effectively inaccessible to anyone but the person who acquired the data.
  • They may fail to provide a robust system for backing up their data, risking loss of the original raw data.
  • They may store their data in a format only readily accessible by a single type of proprietary software, risking effective loss of that data if the software is no longer supported (see the sketch after this list for one open-format alternative).
  • They may not have a plan for retaining the data past the point at which it is used in a publication, making it difficult to follow up on inquiries related to the paper.
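
As a concrete point of comparison, and only as a minimal sketch (the file names, parameters, and metadata fields below are hypothetical, not drawn from the cited literature), the Python snippet below stores a small data set as plain CSV with a human-readable JSON "sidecar" file of metadata. Plain-text formats like these remain legible long after any one proprietary analysis package has disappeared, and they keep the experimental parameters attached to the data set they describe.

```python
import csv
import json

# Hypothetical run parameters: recorded alongside the data, not left in someone's memory.
metadata = {
    "experiment": "pendulum period vs. length",
    "operator": "A. Student",
    "date": "2019-09-23",
    "instrument": "photogate timer (assumed)",
    "units": {"length": "m", "period": "s"},
    "notes": "timer checked against lab frequency standard before the run",
}
measurements = [(0.50, 1.42), (0.75, 1.74), (1.00, 2.01)]

# Write the data in an open, plain-text format with labeled columns and units.
with open("run_001.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["length_m", "period_s"])
    writer.writerows(measurements)

# Write the metadata as a separate, human-readable sidecar file.
with open("run_001_metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

A backup and retention plan still has to be layered on top of this; the point of the sketch is simply that open formats and explicit metadata cost very little at acquisition time and address several of the mistakes listed above.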

Not much has been written in peer-reviewed journals about good practices for data management and retention, particularly in the physical sciences. Large research collaborations typically have carefully designed data management plans, and the size of their groups affords them the opportunity to have people who devote a significant amount of effort to maintaining the integrity of the data. By contrast, limited guidance is available for smaller collaborations, which may not have the same resources. Two open access articles provide helpful advice about issues to consider when deciding how to manage data. While they are written from the perspective of the life sciences, much of the content is relevant to all data-intensive fields of science. The first article provides tips for constructing the type of data management plan required by many funding agencies.[4] The second article discusses considerations in deciding how to archive data.[5] Both articles are relatively short and can easily be combined into a single reading assignment. They also provide a starting point for discussing the data management and retention issues that arise in the Schön and Ninov cases discussed below.

Discussion Prompts

  1. List all the forms you are aware of (past and present) for storing data that can be accessed by a computer. Which of those forms are no longer commonly available? What are the implications for data sets stored that way?
  2. Describe a simple physics experiment you have done as a student. Now imagine that this is actually groundbreaking research and the data must be recorded for future analysis by a computer code. Make a list of all of the metadata (information about the nature of the experiment) that should be included with this data set.
  3. Pick a software package that you commonly use for producing plots (e.g., Excel, Origin, MATLAB). Suppose data generated by one of your experiments resides only in a file associated with that particular software package. Would you expect that data to be accessible by someone who does not have access to that same software package? Would you expect that data to be accessible by someone ten years from now, who has access to a wide range of software packages?
  4. Suppose you have developed a complex computer model and are now preparing to run it with different parameter sets to see how it behaves. Each run takes several hours of computer time and generates a large body of data. What aspects of the discussion of data management and archiving apply to data generated by these computational modeling runs?
  5. Discuss what one can do to prevent and/or detect tampering with raw data sets.

Section 3.4: Digital images

The ease with which digital images can be inappropriately manipulated has become a concern among publishers of scientific journals. An article by Parrish and Noonan looks at case histories of research misconduct involving image manipulation.[6] While the focus is on the life sciences, many of the issues raised are relevant to fields of physics where digital images are used.

The Physical Review Letters Information for Authors states, “Figures should accurately present the scientific results. If adjustments to images, such as changing its brightness, are made, state the adjustment in the figure caption.”[7]

Science magazine advises, “Science does not allow certain electronic enhancements or manipulations of micrographs, gels, or other digital images. Figures assembled from multiple photographs or images, or non-concurrent portions of the same image, must indicate the separate parts with lines between them. Linear adjustment of contrast, brightness, or color must be applied to an entire image or plate equally. Nonlinear adjustments must be specified in the figure legend. Selective enhancement or alteration of one part of an image is not acceptable. In addition, Science may ask authors of papers returned for revision to provide additional documentation of their primary data.”[8]
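
For instructors who want a concrete picture of the distinction Science draws, the short NumPy sketch below (an illustration only, using a synthetic image rather than real experimental data) applies a linear brightness and contrast adjustment uniformly to an entire image, which the guideline permits if disclosed, and contrasts it with selectively brightening one region, which the guideline does not allow.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((256, 256))   # synthetic stand-in for a micrograph, values in [0, 1]

# Permitted (if disclosed): a linear rescaling applied to the entire image equally.
uniform = np.clip(1.2 * image + 0.05, 0.0, 1.0)

# Not acceptable: enhancing only one region of the image.
selective = image.copy()
selective[100:150, 100:150] = np.clip(1.8 * selective[100:150, 100:150], 0.0, 1.0)
```

Seeing how little effort the second operation takes, and how hard it can be to spot by eye, helps students appreciate why journals ask authors to retain and, on request, supply the original image files.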

The Office of Research Integrity (ORI), whose charge is to direct U.S. Public Health Service research integrity activities, has a comprehensive set of recommendations for the treatment of digital images in its Guidelines for Best Practices in Image Processing.[9] The main page for these recommendations has individual boxes for each of the twelve guidelines. Note that clicking on any box sends you to a new page with more details about that guideline, making this a more extensive reading assignment than it initially appears.

An instructor wishing to address the topic of digital images might want to have their students read the ORI guidelines as well as the brief statements from Physical Review Letters and Science. The instructor could then draw from the Parrish and Noonan paper some examples of inappropriate digital image manipulation.

Discussion Prompts

  1. Artists who work with photography might make many alterations considered routine processing to enhance, not distort, an image. If a photograph is instead a piece of scientific evidence, which of these alterations might compromise the scientific record?
  2. Compare the guidelines on digital photographs provided by Physical Review Letters to those provided by Science. Are both equally clear? Is one more restrictive?

Section 3.5: Reporting results

One of the most challenging areas to explore from an ethical perspective is that of reporting scientific results. Large grey areas exist when it comes to deciding what can be reported versus what must be reported, as well as how to present results to highlight patterns without suggesting trends for which insufficient evidence exists. In a commencement address, Richard Feynman set a high standard in this regard:[10]

It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things that you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell if they have been eliminated….

In summary, the idea is to try to give all of the information to help others to judge the value of your contribution, not just the information that leads to judgment in one particular direction or another.

It may be valuable for the instructor to acknowledge that Feynman’s suggestion is easier to follow if one’s scientific reputation is already firmly established. Someone trying to make a mark in the field of physics may be hesitant to express any uncertainty about the ideas they put forth. Nevertheless, this standard of scientific honesty benefits the scientific community and, in the long run, the individual scientist.

It is also important to remind students that misleading presentation of research results can have a negative impact on how the physics community relates to society at large, for instance by leading society to make bad choices. Pickett and Roche report results from a survey of the general public indicating a desire for harsh consequences not only for those who commit fraud in research but also for those who mislead the public through selective reporting of results.[11] While the Pickett and Roche paper does bring to light valuable information, some aspects of how it is written (such as overgeneralizations in the introduction) might not be helpful for students, so it is probably best used as background reading for the instructor.

Finally, one of the foundational principles of science is that results of experiments should be reproducible. It is generally expected that results of laboratory-based experiments are not reported until they have been checked for reproducibility by that group. Moreover, one of the primary reasons that research groups are expected to report fully on their experimental methods and to maintain good records of their data is to allow other research groups to try to reproduce their results.

At this time, few articles have been written about ethical representation of data in physics. Instructors wishing to address the issue may find the Millikan oil drop experiment, discussed below, a good way to launch discussion.

Discussion Prompts

  1. Are students encouraged to apply Feynman’s “utter honesty” standard in reporting results in lab courses?
  2. Are there any circumstances under which it is acceptable to selectively exclude data from being reported? [See also the discussion of the Millikan case below.]
  3. Discuss situations outside of the scientific community in which data is selectively reported. [Students may hit on advertisers and politicians as practitioners of selective reporting.]

Section 3.6: Case studies

Ninov

As discussed in Chapter 2, many accounts of the Victor Ninov case are available. David Goodstein’s On Fact and Fraud: Cautionary tales from the front lines of science[12] covers the case briefly, based in part on his access to the investigation report, most of which has remained confidential. For a more comprehensive treatment, see Physics Today.[13] One issue that could be discussed in this case is whether enough safeguards were in place to protect the integrity of the research data. Students may not have enough information to answer that question, however, so posing a hypothetical situation may help: Having read about the Ninov case, imagine you are involved in a collaboration of 5-10 people. The experiment generates large quantities of data that are stored electronically. What steps could your group take to reduce the chances of one group member altering the raw data in an undetectable way?
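
One technical ingredient such a group might consider, offered here only as a hedged sketch (the file names are hypothetical, and organizational safeguards such as independent cross-checks of raw data matter at least as much), is to write a checksum manifest as soon as raw data files are acquired and to deposit copies of that manifest where no single member can quietly rewrite it.

```python
import hashlib
from pathlib import Path

def checksum_manifest(data_dir, manifest_path):
    """Record a SHA-256 checksum for every raw data file so that any later
    alteration of those files can be detected by re-running the check."""
    with open(manifest_path, "w") as manifest:
        for path in sorted(Path(data_dir).glob("*.dat")):
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest.write(f"{digest}  {path.name}\n")

# Example (hypothetical paths):
# checksum_manifest("raw_data/", "manifest_2019-09-28.txt")
```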

Schön

The essentials of the case of Hendrik Schön are covered in two news articles from Physics Today.[14],[15] Schön was employed by Bell Labs, doing research on the electrical properties of materials. Of particular relevance to this chapter is the admission by Schön that he substituted an analytically calculated curve for experimental data. Another good reference is the report issued by the committee set up by Bell Labs to investigate Schön’s research.[16] While the entire report is 129 pages long, the main body is only nineteen pages, with the rest being appendices, most of which would probably be optional reading. Among the issues worth bringing out in a discussion of the importance of proper recordkeeping are:

  • Schön by himself made almost all of the samples on which he performed his experiments. It was often the case that his coauthors never even saw the samples (see page 9 of the report). Note that some people include the samples themselves as part of their definition of “data.”
  • Schön did not maintain systematic records, such as lab notebooks (see page 10 of the report).
  • Schön indicated he had deleted raw data files that the committee requested, and thus, in the eyes of the committee, he could not supply adequate evidence for several of his published papers. His stated reason for the deletions was lack of computer storage space (see page 10 of the report).
  • The committee found numerous cases in which data had been presented in misleading ways. Pages 11-13 of the report as well as pages 3-4 of its Appendix E summarize these findings. The remainder of Appendix E looks at each problematic publication individually.

Millikan

There are a number of sources one can consult on ethical issues related to the Millikan oil drop experiment. Two key concerns have arisen as a result of historians studying his lab notebooks. First, his notebooks record measurements on a larger number of drops than were reported in his 1913 paper.[17] Did he select the data to report in an appropriate way, or was his data selection based on displaying only what would support his hypothesis? Even if one concludes that his data selection was appropriate, a second issue arises from the statement in Millikan’s paper, “It is to be remarked, too, that this is not a selected group of drops but represents all of the drops experimented upon during 60 consecutive days….” In light of the evidence in Millikan’s lab notebooks, is there any way in which that statement can be viewed as truthful?

A paper by Richard C. Jennings examines some of these issues.[18] While some of the conclusions drawn in the paper do not necessarily seem consistent with the factual information introduced, the paper nevertheless serves as a good springboard for discussion. That said, many physicists may be bothered by the way Jennings treats fractional charge. A few people (mostly non-physicists) have suggested that some of Millikan’s omitted data might have contained evidence for fractional charge, and thus that Millikan perhaps hindered the development of quark theory, which involves charges of value e/3 and 2e/3, by not publishing this data. However, current quark theory indicates it would not be possible to observe fractional charge in the oil drop experiment. Moreover, if it were, there would almost certainly have been copious experimental confirmation of this observation during the century that has elapsed since Millikan’s paper was published.

Allan Franklin has written extensively about the Millikan oil drop experiment. In one of Franklin’s works, he introduces the experiment with some basic physics, accessible to first or second year physics students, to help the reader understand how the experiment worked.[19]  He then discusses issues related to how Millikan analyzed his data. Franklin concludes that Millikan’s exclusion of five drops as well as his selective use of analytical techniques could not be justified. He points out, though, that the effect of these actions was not to change significantly the proposed value of the charge of the electron but rather to reduce the apparent uncertainty in that proposed value.

One of the questions a discussion based on this case can address is when it is acceptable to exclude some measurements from reported results. It may be that some students take the position that all measured values should be reported. To help the discussion along, one might ask what is meant by data. To help clarify what must be reported, data can usefully be defined as measured values or other information acquired by following a well-defined experimental protocol. As an extreme case, if a student timing the period of a pendulum dozes off during one trial, causing the measured time to be much too large, that measured value would not satisfy the definition of a data point. Likewise, if a student records a series of measurements before realizing that a critical instrument has not yet been calibrated, those measured values would not be considered data. This definition of data allows one to exclude measured values that can easily be dismissed as irrelevant.

There remains, however, the case of measured values that lie far outside the trend apparent in the other data but for which there is no obvious breach of experimental protocol. One approach to handling such apparent outliers is to report them and then state explicitly that they will be excluded from subsequent analysis. This clarifies exactly how the data set has been narrowed and gives the reader the tools to examine the impact of neglecting those data points.
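
A minimal worked example may make this suggestion concrete (the numbers below are hypothetical and chosen only for illustration): report the full set of measurements, state explicitly which values are being excluded as apparent outliers, and show the effect of the exclusion so the reader can judge its impact.

```python
import statistics

# Hypothetical pendulum-period measurements in seconds; 2.60 s lies far from the
# rest, but no breach of experimental protocol was recorded for that trial.
periods = [2.01, 1.99, 2.02, 2.00, 2.60, 1.98]
suspected_outliers = [2.60]

kept = [t for t in periods if t not in suspected_outliers]

print("all measurements:       ", periods)
print("mean of all:             %.3f s" % statistics.mean(periods))
print("excluded as outliers:   ", suspected_outliers)
print("mean after exclusion:    %.3f s" % statistics.mean(kept))
```

Because both means appear in the report, nothing about the narrowing of the data set is hidden from the reader.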

Three other considerations may also be helpful to bring out in a discussion of the oil drop experiment:

  • Even with modern equipment, this experiment can be challenging. Students often perform the oil drop experiment in intermediate physics lab courses, so it can be helpful to have them share their experiences when possible. Making sure students are aware of the challenges in this experiment can help them take a more realistic view of how data obtained from it should be analyzed.
  • One needs to exercise caution in judging actions of people in the past. In particular, some might argue it is only fair to judge past scientists by the ethical standards commonly accepted at the time, rather than by present ethical standards. This is not to say that the relevant ethical standards for this experiment have changed much in the past century, but rather that it is safer to ask questions along the lines of, “Do you think that Millikan’s approach to data analysis is ethical by our present standards?” rather than asking “Was Millikan ethical in his analysis?” Put another way, in a discussion of ethics involving cases as old as this, it is better to focus on actions rather than individuals.
  • Very few physicists have had their lab notebooks scrutinized as closely as Millikan’s. It is likely that a close examination of almost any lab notebook or similar set of records will at least raise some questions, if only due to lack of clarity in the records.

Acknowledgment

The author is grateful for the time and effort of the anonymous reviewers of this work, and for their numerous helpful suggestions.


[1] National Science Foundation, Directorate of Mathematical and Physical Sciences, Division of Physics, “Advice to PIs on Data Management Plans”, January 2, 2018. https://www.nsf.gov/bfa/dias/policy/dmpdocs/phy.pdf (accessed September 18, 2019).

[2] American Physical Society Guidelines on Ethics (19.1) (2019). https://www.aps.org/policy/statements/guidlinesethics.cfm (accessed September 18, 2019).

[3] Jacob T. Stanley and H. J. Lewandowski, “Recommendations for the use of notebooks in upper-division physics lab courses,” American Journal of Physics 86 (1) 45-53 (2018). https://doi.org/10.1119/1.5001933

[4] William K. Michener, “Ten Simple Rules for Creating a Good Data Management Plan”, PLOS Computational Biology 11 (10): e1004525 (2015). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4619636/ doi: 10.1371/journal.pcbi.1004525

[5] Edmund M. Hart, et al., “Ten Simple Rules for Digital Data Storage,” PLOS Computational Biology 12 (10): e1005097 (2016). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5072699/ doi: 10.1371/journal.pcbi.1005097

[6] Debra Parrish and Bridget Noonan, “Image Manipulation as Research Misconduct,” Science and Engineering Ethics 15 (2) 161-167 (2009). https://doi.org/10.1007/s11948-008-9108-z

[7] Physical Review Letters, “Information for Authors,” https://journals.aps.org/prl/authors (accessed September 23, 2019).

[8] Science, “Instructions for preparing an initial manuscript,” https://www.sciencemag.org/authors/instructions-preparing-initial-manuscript (accessed September 23, 2019).

[9] Office of Research Integrity, “Guidelines for Best Practices in Image Processing,” https://ori.hhs.gov/education/products/RIandImages/guidelines/list.html (accessed September 23, 2019).

[10] Richard P. Feynman, “Surely You’re Joking, Mr. Feynman!”, (W. W. Norton & Company, New York, NY, 1985), p. 341.

[11] Justin T. Pickett and Sean Patrick Roche, “Questionable, Objectionable or Criminal? Public Opinion on Data Fraud and Selective Reporting in Science,” Science and Engineering Ethics 24 (1) 151-171 (2018). https://doi.org/10.1007/s11948-014-9618-9

[12] David Goodstein,  On Fact and Fraud: Cautionary tales from the front lines of science, (Princeton University Press. Princeton, NJ, 2010).

[13] Bertram Schwarzschild, “Lawrence Berkeley Lab Concludes that Evidence of Element 118 Was a Fabrication,” Physics Today 55 (9) 15 (2002). https://physicstoday.scitation.org/doi/full/10.1063/1.1522199

[14] Barbara Gross Levi, “Bell Labs Convenes Committee to Investigate Questions of Scientific Misconduct,” Physics Today 55 (7) 15-16 (2002). https://doi.org/10.1063/1.1506737

[15] Barbara Gross Levi, “Investigation Finds that One Lucent Physicist Engaged in Scientific Misconduct,” Physics Today 55 (11) 15-17 (2002). https://doi.org/10.1063/1.1534995


[16] Lucent Technologies, “Report of the Investigation Committee on the Possibility of Scientific Misconduct in the Work of Hendrik Schön and Coauthors, September 2002,”  https://media-bell-labs-com.s3.amazonaws.com/pages/20170403_1709/misconduct-revew-report-lucent.pdf (accessed September 28, 2019).

[17] R. A. Millikan, “On the Elementary Electrical Charge and the Avogadro Constant,” Physical Review 2 (2) 109-143 (1913), see especially p. 138. https://doi.org/10.1103/PhysRev.2.109

[18] Richard C. Jennings, “Data Selection and Responsible Conduct: Was Millikan a Fraud?” Science and Engineering Ethics 10 (4) 639-653 (2004). https://link.springer.com/article/10.1007/s11948-004-0044-2

[19] Allan Franklin, “Selectivity and the Production of Experimental Results,” Archive for History of Exact Sciences 53 (5) 399-485 (1998), see especially pp. 422-431. https://link.springer.com/article/10.1007/s004070050031

Citation
Marshall Thomsen. Chapter 3: Data: Recording, Managing, and Reporting. Online Ethics Center. DOI: https://doi.org/10.18130/0gmg-ef13. https://onlineethics.org/cases/chapter-3-data-recording-managing-and-reporting.