I’m delighted to be here today to talk to you about ethical issues in research in the social sciences. I have to warn you in advance that my expertise in this area is broad rather than deep – that is to say, I have broad expertise in research ethics, but only shallow expertise in the ethical issues specific to the social sciences. By training, I am a folklorist, an odd field composed of almost equal parts humanities and social sciences, and I have done research with a strict focus on novels and other texts as well as research based on ethnographic interviews. By profession, I am an ethicist – have been for almost ten years – and for the last five or so years my main concerns have been with the ethics of research in the life sciences. I also have a great deal of experience putting together workshops, which is the main reason I was invited to help plan today’s event. I hope you will find my comments, and the whole workshop, useful, and I am sure I will learn a great deal today.
Anyone doing serious research today, especially research involving human subjects, quickly learns about the myriad rules governing research. Rules exist on every imaginable level. There are international conventions, and both federal and state laws and regulations. Universities and colleges, divisions or schools, departments, even individual professors might have rules. Academic and professional societies often have codes of conduct. We are hemmed in on all sides by rules and regulations.
It was not always so. For example, when the great social anthropologist E. E. Evans-Pritchard was getting ready to undertake research in Central Africa early this century, he had trouble finding serious guidance on how to do fieldwork.
I first sought advice from Westermarck. All I got from him was “don’t converse with an informant for more than twenty minutes because if you aren’t bored by that time he will be.” . . . [Haddon advised] that it was really all quite simple; one should always behave as a gentleman. . . . My teacher, Seligman, told me to take ten grains of quinine every night and to keep off women. . . . Finally I asked Malinowski and was told not to be a bloody fool. [Evans-Pritchard 1976:240]
Doing research was much simpler, if not easier, in those days.
There are two fundamental ways of looking at rules. You can take the cynical point of view, which I’m sure most researchers do when they are filling out endless forms and endless revisions of endless forms. The cynical view sees rules as constraints devised by the powerful and inept to keep the rest of us from getting anything done. We have to fill out human subjects forms to keep the bureaucracy happy. After all, administrators have to be able to show something to justify their salaries, and paperwork does the job nicely. Since we fill out forms only for the sake of show, we don’t have to concern ourselves with what the forms say; we just have to write whatever it takes to get approved, and then we can get on with our research.
As I say, this is the cynical view, and although I have sometimes looked at life this way – both as a researcher filling out forms and as a committee member reviewing them – I have to urge you, as I urge myself, to resist the temptation to slide into cynicism.
The other point of view gives the rules, rule-makers, and rule-enforcers the benefit of the doubt. Certainly when I was doing my fourth or fifth revision on a consent form for a very straightforward project, I felt like a lab rat being sent through a maze again and again by the human subjects committee. But even at such moments of frustration I know that there are good reasons for the rules.
One reason is historical. Let me briefly mention a few of the names and events that led to our current wealth of rules and regulations, and then describe one case in some detail.
World War II changed research as it changed much of the world. The trials at Nuremberg in response to Nazi atrocities carried out by physicians and scientists during the war resulted in the Nuremberg Code (1949), the first article of which reads, “The voluntary consent of the human subject is absolutely essential.” (The complete text of the Nuremberg Code, as quoted from “Trials of War Criminals Before the Nuremberg Military Tribunals Under Control Council Law No. 10,” Vol. 2, Nuremberg, October 1946-April 1949 [Washington, DC: US Government Printing Office, 1949], pp. 181-182, can be found on the World Wide Web; see the Works Cited below.) Also of note are the twelve principles adopted by the World Medical Association in the 1964 Declaration of Helsinki (updated 1989) to guide physicians in biomedical research. I suspect that everyone has heard of the Tuskegee Syphilis Study, which, among other terrible effects, made it extremely difficult for researchers to gain the trust of African-Americans. In the social sciences, the research of Stanley Milgram and other social psychologists who routinely deceived their subjects raised significant concern. Milgram’s experiments are infamous and, I hope, need no further comment here.
The case of misconduct in science that I want to describe in some depth took place several years ago, when Robert Sprague of the University of Illinois was working on a project for NIMH intended to assess the effects of neuroleptic drugs on the retarded. (This description of the Breuning case is adapted from Pimple 1996; the following paragraphs paraphrase and quote liberally from Holden 1987.) He took on Stephen E. Breuning as an investigator. Breuning did brilliant work, “gaining considerable prominence in his thinly peopled research field with studies indicating that antipsychotic drugs are overused and that stimulant drugs are more effective in the treatment of hyperactive retarded children.” Breuning’s work had “a ‘significant impact’ on his field, not only on the knowledge base but on social policies concerning the care of the mentally retarded, particularly since his contributions came ‘at a time when most clinical practice and administrative policy bearing on drug treatment were based primarily on anecdote and clinical impression.’” (The internal quotations are from a draft report of an NIMH investigation into allegations of misconduct by Breuning.) Breuning’s work was influential enough to affect public policy, at least in Connecticut, concerning the treatment of retarded children.
In 1983, Sprague began to suspect that there was something wrong with Breuning’s work. The University of Illinois investigated, but not very thoroughly. Sprague continued to press the case in spite of the skepticism, even hostility, of various officials. At least three years after Sprague raised the red flag, NIMH released a report outlining a “chronic career of doctored research results and reports of research that was not conducted at all, dating from the mid-1970s in Chicago to April 1984 when Breuning resigned from the University of Pittsburgh.”
Breuning was employed at the Oakdale (Illinois) Regional Center for Developmental Disabilities for a year after he got his doctorate from the Illinois Institute of Technology. He transferred to the Coldwater (Michigan) Regional Center in 1978, and moved on to the University of Pittsburgh in 1981.
The NIMH review panel was able to authenticate little of the research he claimed to have conducted at any of these institutions. Although he claimed many of the subjects in his publications were studied while he was at Oakdale, investigators could find no evidence, either in Oakdale’s records or from questioning colleagues there, that he had done any research with human subjects while at Oakdale.
Similarly, no raw data could be found for some studies Breuning allegedly conducted at Coldwater, and “no evidence could be found that deliberate drug manipulation according to a protocol, or administration of a placebo as described, was ever carried out there. . . . [N]one of the described studies of psychopharmacologic treatment had been carried out.”
While at Pittsburgh, Breuning got his own NIMH grant to study the effects of stimulant drugs on mentally retarded children. He submitted two progress reports to NIMH; in the second, he reported 6 completed studies and 11 publications published or in preparation. However, the investigation revealed that appropriate subjects were not available at the psychiatric unit in Pittsburgh where Breuning worked at the time. . . . The panel . . . concluded that his “preparation of two grossly distorted, but polished and detailed, progress reports could only have been a deliberate and intentional effort to mislead and deceive the Federal funding agency.” [Holden 1987]
Breuning also wrote a review chapter based on studies of 3,496 subjects who never existed, and listed as co-authors on some of his papers persons who had not worked with him. “In 1988, Breuning became the first independent researcher in the United States to be indicted on research fraud, and ultimately pled guilty to two charges of filing false reports” (Goodman 1996:4).
This is just one example of misconduct in science, and it is egregious. Breuning wrote reports based on experiments he never did. He defrauded the Federal government of thousands of dollars. His fictitious publications led to changes in how mentally retarded children were treated – in other words, his fraud hurt just about the most vulnerable population in America. This is serious stuff.
This and a number of other highly publicized cases of misconduct in science led the United States Public Health Service and National Science Foundation (two of the largest federal supporters of research in the United States) in the late 1980s to adopt similar official definitions of “scientific misconduct” – perhaps the most important rule regarding the conduct of research – and similar policies regarding responsibilities and procedures for investigating allegations of misconduct in science. Both agencies essentially defined misconduct in science as “fabrication, falsification, plagiarism, or other serious deviation from accepted practices.” Thanks in part to federal pressure, most research universities have since adopted explicit definitions of misconduct in science and procedures for handling allegations of misconduct. The early history of investigations is one of very shoddy work, investigations that looked like cover-ups, and tremendous denial on the part of the scientific community. We have come a long way since 1989 in that regard. This is not to say that all of the procedures now in place are perfect, but they are certainly much better than they were initially. We have learned a great deal from our mistakes.
This historical framework, then, suggests one answer to the question, “Why all these rules?” The rules are designed to make researchers accountable to the public. There may have been a time when rules were not needed to ensure that researchers acted responsibly, but the historical record of the past several decades shows clearly that that time is past.
The public has two broad interests in research. First, in large part, academic research, including research in the social sciences and education, is paid for by the public, whether through federal grants or through the salaries of professors and graduate researchers at state institutions like the University of Minnesota. If the public pays for research, the public has a right to demand a certain level of responsibility on the part of researchers. Second, research has an impact on the public, for better, or, as in Breuning’s case, for worse.
There are other reasons to have rules and to respect them, of course. In the first instance, in any society, the default stance is that disobeying rules and laws is unethical and immoral. This is particularly true in a democracy, where it is possible to improve or remove flawed rules and laws. In other words, generally speaking, it is simply wrong to disobey rules and laws. The qualifier “generally speaking” is important here, because we all know that there are bad rules and unjust laws, and that not all rules and laws are created carefully enough to cover all relevant situations. However, those instances are exceptions. Another way of making the point is to say that if someone decides to break a rule or a law, the burden is on her or him to justify that violation. No justification or excuse is needed to obey a rule or law, but justification is needed for breaking one. Rules governing research also help to protect everyone involved in the research project – universities, departments, researchers, students, human subjects, eventual consumers of research, and both public and private funding agencies. I will not belabor the obvious ways in which rules provide protection.
My final major point is this: Rules regarding research have two aspects, regulation and education. We tend to think primarily in terms of regulation, of forcing people to act a certain way. But the educational aspect is probably more important. In many instances – ideally in all instances, but this is not an ideal world – researchers learn good research practices from rules, “good” both in the sense of “effective” and in the sense of “ethical.” Rules help save researchers from re-inventing the effective and ethical wheel by becoming repositories of accumulated experience and wisdom. Filling out forms helps researchers think clearly and carefully about what they are going to do, and demands for revisions by granting agencies and other committees often result in improvements in the research that the researcher did not think of and might not have thought of without such input. In other words, gaining approval for research can be seen as an unnecessary irritant or as a chance to get real help from people with valuable expertise.
I opened by quoting Evans-Pritchard on how difficult it was in his day to get advice on doing fieldwork. To close, I want to quote another passage from the same essay. After observing that there are many countries in which there is “a hostile attitude to anthropological inquiries” because “they suggest that the people of the country where [such inquiries] are made are uncivilized savages” and “anthropology smells to them as cultural colonialism” (Evans-Pritchard 1976:250), Evans-Pritchard shares some advice on fieldwork he gave to his students: “I have for many years advised students about to embark on fieldwork to claim that they are historians or linguists, subjects which no one can take offence at; or they can talk vaguely about sociology” (Evans-Pritchard 1976:251). In other words, he advises his students to lie.
I do not want to suggest anything about Evans-Pritchard’s moral character, and certainly arguments could be made that this is a harmless, even justifiable, lie. But today anyone embarking on fieldwork who received advice like this would also, almost certainly, hear a great deal about the ethics of fieldwork. I hope that makes it harder to decide to lie to the people we study.
Thank you.
Works Cited and Other Resources
American Anthropological Association (http://www.ameranthassn.org/) ethics site. http://www.ameranthassn.org/ethics.htm.
American Council of Learned Societies (http://www.acls.org/jshome.htm). Many academic societies and associations have codes of ethics, most of which can be found on the Web. A few societies and associations are listed in this bibliography; if you can’t find yours here, try the ACLS Web site.
American Historical Association (http://www.theaha.org/) Statement on Standards of Professional Conduct. http://www.theaha.org/pubs/standard.htm.
American Political Science Association (http://www.apsanet.org/) ethics site. http://www.apsanet.org/PS/ethics.html.
American Psychological Association (http://www.apa.org/) ethics site. http://www.apa.org/ethics/.
American Sociological Association (http://www.asanet.org/) code of ethics. http://www.asanet.org/ecoderev.htm.
Association for Practical and Professional Ethics. Includes links to many useful ethics-related sites. http://php.ucs.indiana.edu/~appe/home.html.
Ethics Codes Collection. The Center for the Study of Ethics in the Professions at the Illinois Institute of Technology received a grant from the National Science Foundation in June 1996 to put its collection of over 850 codes of ethics on the World Wide Web. Included are codes of ethics of professional societies, corporations, government, and academic institutions. Earlier versions of codes of ethics of some organizations represented will be available so people can study the development of codes. A literature review, an introduction to the codes, and a User Guide will also be available.
Declaration of Helsinki. http://www.faseb.org/arvo/helsinki.htm.
Evans-Pritchard, E. E. 1976. “Some Reminiscences and Reflections on Fieldwork.” Appendix IV of Witchcraft, Oracles, and Magic among the Azande, by E. E. Evans-Pritchard (abridged by Eva Gillies), pp. 240-254. Oxford: Clarendon Press.
Goodman, Billy. 1996. “Scientific Whistleblowers Stress that the Media are a Last Resort.” The Scientist, March 18, pp. 1 & 4.
Holden, Constance. 1987. “NIMH Finds a Case of ‘Serious Misconduct.’” Science 235 (27 March):1566-1567.
Nuremberg War Crimes Trials. http://www.yale.edu/lawweb/avalon/imt/imt.htm.
Pimple, Kenneth D. 1996. “A Few Key Issues in Research Ethics.” Presented at the Whitaker Foundation Biomedical Engineering Research Conference, August 10, 1996. http://php.ucs.indiana.edu/~pimple/whitaker.html.
Pimple, Kenneth D. 1997. “Defining Misconduct in Science: Some Reflections on the American Experience.” Presented for the National Committee for Research Ethics in the Social Sciences and the Humanities, Oslo, Norway, March 1997. http://www.indiana.edu/~poynter/tre4-2a.html.
Teaching Research Ethics. Includes back issues of the newsletter Trends and other useful information. http://www.indiana.edu/~poynter/tre.html.