Anonymous
Questions 1-3
The objective of this case study is to provoke discussion of two issues: 1) a possible conflict of interest between mentor and student and 2) the dissemination of information within academia.
Common reasons for attending a research conference are to learn what the competition is focusing on and to assess the progress of competitors' research. This valuable information is used to avoid being "scooped" and to gain technical knowledge that can provide a "leg up" on the competition. What Dr. Smith has done is not unfamiliar to a graduate student attending his or her first conference, and it is something seemingly taken for granted by more senior students and, unfortunately, researchers.
But by what authority does Dr. Smith act? Often, such actions are taken under the pretense that they are in the student's best interest (i.e., so Lisa won't be "scooped" and can publish). It is very easy for a mentor to become increasingly occupied with the success of the lab, as judged by its size and amount of funding. This focus can easily lead to the belief that the success of the lab justifies actions believed to improve its chances of success. In other words, the end (i.e., promotion for Dr. Smith and publications for Lisa) justifies the means (i.e., instructing a student to compromise her relationship with a friend and to withhold knowledge from the scientific community).
But doesn't the position of a mentor entail fostering personal growth and the teaching of students? Does this responsibility include teaching students that publishing supersedes friendships and the sharing of information with peers? The situation is further complicated by the fact that this information involves Lisa's thesis project, and Dr. Smith has a significant influence over when and if she will complete her degree. In most cases, a successful graduate career is based on the number of published articles. The bibliography, along with a letter of recommendation from the mentor, will greatly influence a graduate student's career. Should a person's thesis be based on published data? If the student is "scooped," should that damage the student's ability to graduate? Is Dr. Smith acting in Lisa's best interest, or is he thinking about his ability to get future funding? How much influence should a mentor have on when and if a student should graduate?
Steve is Lisa's peer, and collaborations are often based on previous relationships. Is it possible that by withholding information, Lisa is jeopardizing her future potential to set up fruitful collaborations?
Question 4
The quest for research funding has created a highly competitive environment where advantages are sought and adamantly held. In addition, because of the duration of most grants, many scientists plan no further than three to five years ahead, which masks the long-term consequences for the scientific community of withholding information. These actions serve to impede the overall advancement of science. In addition to generating data, the ability to conceal possible advantages is now a determinant of success, and this situation jeopardizes the advancement of knowledge as a whole. Taken together, these issues are possible conflicts of interest.
Isn't Dr. Smith's responsibility to his student and the scientific community greater than his responsibility to his lab? Dr. Smith assumes that his actions have no ill effects and are simply part of today's cutthroat research environment. In truth, his actions serve to propagate unhealthy practices that only hinder scientific progress.
The main goal of this case is to stimulate discussion of activities that fall into the category of questionable research practices. The National Academy of Sciences states:
Questionable research practices are actions that violate traditional values of the research enterprise and that may be detrimental to the research process. . . . Questionable research practices include activities such as the following: Failing to retain significant research data for a reasonable period; Maintaining inadequate research records, especially for results that are published or relied on by others; . . . Inadequately supervising research subordinates or exploiting them. (National Academy of Sciences 1992, 1-16)
While questionable research practices do not endanger the research process as critically as outright scientific misconduct, they do erode the integrity of the scientific institution as a whole.
Part 1 of this case attempts to present a scenario that is difficult to interpret definitively as a questionable research practice. Rather than a blatant statement of scientific misconduct, the reader is presented with a suspicion of inappropriate behavior. The intent is to mimic potential real-life situations where, quite frequently, there is no initial concrete evidence to support the decisions that must be made by the parties involved. Additionally, this case attempts to establish an environment where there is intense pressure on graduate students to produce publishable results quickly. This type of pressure is often encountered in labs conducting biomedical research, and it arises not only from the principal investigator, but from the graduate students themselves.
In this case, several factors contribute to the stressful environment in Dr. Larson's lab. For example, it is stated that neither Peter nor Sally had managed to publish a paper, which caused them both to worry about obtaining postdoctoral positions. It is also stated that other labs were attempting to develop the same knockout mouse. Dr. Larson's assertion that Peter and Sally have the chance to publish in Nature only adds to the pressure. In such an environment, even normally careful researchers can be tempted to cut corners, and thus engage in inappropriate scientific conduct.
Questions 1-3
These questions focus on the decisions Peter must make. Although he has no solid evidence that Sally has done anything inappropriate, his suspicions are aroused by the phone conversation in which Sally states that her data deviate from the previously observed trend, by the fact that she disposed of the remaining cells so that their identity could not be determined in an unbiased manner, and by the absence of sufficient documentation in her lab notebook. There is no evidence that Sally has falsified data, which would constitute scientific misconduct. However, Sally's work behavior does fall into the category of questionable research practice.
Peter should first attempt to initiate better communication with his potential co-author by asking her to review her data with him, not accusing her of wrongdoing. If this conversation does not alleviate his suspicions, he should approach Dr. Larson with his concerns. As first author of the manuscript, Peter is ultimately responsible for its entire content. It is imperative that he feel confident in the data. The scientific process relies upon the publication of unbiased data generated via sound experimental designs.
Question 4
As a contributing author, Sally has a responsibility to maintain her lab notebook in such a way that her experimental procedures and raw data are easily located and identified. She is responsible for retaining any raw data or samples until the lab is reasonably confident that they are no longer needed. She also has a responsibility to honestly communicate any procedural problems to her co-authors. In situations where multiple researchers contribute to a final manuscript, each must be able to assume the honesty of the others and the unbiased nature of their results. There are many situations in which it is almost impossible to identify data that have been obtained in error or altered on purpose. In Sally's defense, she could well have the raw data in another notebook, and she may have thrown out the remaining cells by mistake, but her behavior raises suspicions about her scientific conduct.
Question 5
Dr. Larson's actions contributed to the problem in several ways. First, although it may be a reality that other labs are competing to produce the same results, he should attempt to set an example as a mentor by making strict adherence to careful lab practices a matter of utmost importance. His statement that Peter and Sally may be able to publish in Nature if they beat the competition went a long way toward establishing a stressful working environment, where inappropriate conduct is more likely to occur. Secondly, he tells Sally that she will be included on Peter's paper only if her results are informative. That may well encourage Sally to falsify data to produce the desired results. A better approach would have been to tell Sally that her efforts would be rewarded with second-author status, even if the results were not what was predicted. As Sally's research adviser, Dr. Larson should be stressing that honesty in science is required and expected of his students. After all, "Graduate school is the place to learn that one does not publish research results and conclusions until one is certain of their accuracy." (Sigma Xi, 1994, 6)
Thus, even Dr. Larson's behavior could be classified as a questionable research practice in that his supervision of his students was inadequate. He appeared more interested in the results than in the methods used to obtain them.
Part 2 introduces more evidence that Sally had actually engaged in scientific misconduct. Although Peter does not find concrete evidence that her graphed data was falsified, circumstantial evidence is gained from the results of the competing labs. At this point, after his paper has already been published, the best course of action would be first to discuss his concerns with Dr. Larson (whose reputation is also at stake). After that, it may be possible for Peter to repeat key experiments done by Sally. If that indicates that the initial published results were in error, the best course of action would be for Dr. Larson and Peter to notify Sally and then submit a retraction or correction to the journal.
References
- National Academy of Sciences. Responsible Science: Ensuring the Integrity of the Research Process. Washington, D. C.: National Academy Press, 1992, pp. 1-16.
- Sigma Xi, The Scientific Research Society. Honor in Science. Research Triangle Park, N.C.: Sigma Xi, 1994.
This case addresses the issues of mentor responsibility to the student as well as a scientist's right to maintain scientific freedom. Julie is faced with a dilemma that could have been avoided had her mentor gone by the book and presented her with the contract before she began her research project. But even then, she would have had to decide whether to abide by ABC's unwritten agreement. However, now that she is placed in this difficult situation, all the parties involved are at risk of losing.
If she refuses to sign the contract, Julie stands to lose all the research work she has put in so far toward obtaining her degree, unless she can find another funding source. As she has nearly completed her data analysis, this loss could be substantial. She also stands to lose the support of her mentor, Dr. Angstrom. By creating friction with ABC, Julie could fall from favor with Dr. Angstrom, which could jeopardize the amount of knowledge she could gain from working with him, as well as the contacts that he could make for her when she begins looking for a job.
Dr. Angstrom and possibly other researchers at the university stand to lose a close relationship with a funder, ABC. If Julie and ABC are not able to work out their differences, this incident could create distrust or negative feelings toward the university from ABC's perspective. That could in turn result in ABC granting fewer contracts to the university.
Dr. Angstrom may find it necessary to take sides since he was the primary contact person between ABC and Julie. By siding with Julie, he may lose a significant source of funding. By siding with ABC, he may lose the respect of a graduate student. If Julie decides to make this incident public, he may lose respect within the university as well.
ABC stands to lose significant profits and reputation if its drug is proven less cost-effective than competitors' drugs and Julie publishes these results. If ABC agrees to allow Julie to publish regardless of the results, the company runs the risk of funding a project that may severely damage it financially.
ABC may hold Joni responsible for this damage, and her job and reputation are also at stake.
Based on federal guidelines, Julie's academic freedom is legally protected from clauses such as the one presented by ABC. (Kodish 1996) However, the situation becomes difficult when she realizes that the way in which she decides to deal with ABC at this point could affect not only her professional career, but that of her mentor as well.
It appears that Julie's dilemma is not uncommon, given the increasingly closer relationships between academia and industry. (Blumenthal 1996) A potential conflict of interest exists in this situation because the way that Julie writes up her results is likely to be influenced by a secondary interest -- that of ABC. While Joni was not asking Julie to falsify the data or distort the results of the data analysis, she was implying that ABC would like Julie to provide more discussion of the positive results for ABC's drug. Julie could easily refuse to do so and still be protected by law -- she would not have to worry about losing the funding for her particular project. However, as a student of Dr. Angstrom's, Julie represents him as well, especially since she obtained this contract through his relationship with ABC. If Julie breaks the informal agreement made with ABC, it would appear to ABC that Dr. Angstrom has broken the informal agreement as well, since he oversees Julie's research project. His relationship with ABC could be forever tarnished by Julie's actions.
Julie could go along with the informal agreement, but that response would raise the issues of academic freedom and conflict of interest. Can Julie truly abide by the agreement without a loss of freedom?
A conflict of interest clearly exists, yet there is a fine line as to the extent of conflict and its ramifications. What could happen if her study were taken out of context due to a "skewed" manuscript? One possibility is that within a hospital drug formulary, ABC's drug could be chosen over a cheaper, equally effective AIDS treatment, and significantly higher drug costs would result in fewer people having access to the drug. In the most extreme case, death might occur earlier due to inadequate treatment because a patient could not afford the medication.
Another option for Julie is to explain her reservations to Dr. Angstrom and ask him for advice. This course of action could solve all her problems or make a decision even more difficult, depending on how Dr. Angstrom handles her request for advice. If Dr. Angstrom truly finds nothing ethically wrong with writing manuscripts in conjunction with ABC, it is likely that he would not understand Julie's concern, and he would suggest she sign the contract and agree with the informal agreement. Since he himself has had a similar relationship with ABC, that is the most likely case. However, if she is able to convince him that she has a conflict of interest, a possible course of action for Dr. Angstrom would be to help her to find an alternate source of funding for the project that is nearly completed.
If Julie were to perform a quick analysis of the data before making any further decisions, she might solve her own immediate problem, but she would not really be addressing the ethical issue that she is facing. Throughout her whole career it is likely that she will be confronted with similar conflicts of interest, and it may be more appropriate to set a precedent for how she will carry herself in these future situations. Also, she must consider whether it is fair to future graduate students of Dr. Angstrom to be placed in the same situation, when she could have addressed the issue and perhaps come up with a solution.
Julie could always refuse to sign the contract with full knowledge that by doing so, she alienates herself from ABC and possibly from Dr. Angstrom as well. A more immediate concern would be how she would obtain funding for her project. If no funding is available, she faces the possibility of developing a completely new project for which she could obtain funding. If her relationship with Dr. Angstrom is tarnished because of this incident, finding a new funding source may prove to be difficult.
It is important in this case that the ramifications of all possible actions are explored and weighed individually. Consequences of Julie's actions affect not only herself, but the careers of others as well, and this consideration should weigh on her decision.
References
- Blumenthal, D. "Ethics Issues in Academic-Industry Relationships in the Life Sciences: The Continuing Debate." Academic Medicine 71 (12, December 1996): 1291-1296.
- Kodish E., T. Murray and P. Whitehouse. "Conflict of Interest in University-Industry Research Relationships: Realities, Politics and Values [Comment]." Academic Medicine 71 (12, December 1996): 1287-1290.
This case revolves around an interdepartmental and cross-disciplinary research discussion group dynamic found at many medical schools and medical research centers. The situation allows for the discussion of several issues depending on the audience and the time available for discussion. The most obvious ethical concern is that Dr. Kent presented data that originated from another lab without the consent of that lab chief (Dr. Barry). However, additional, more subtle secondary issues can also be addressed. The overall message from this case study is the need to establish defined roles involving dissemination and control of data in a research discussion group or joint lab meeting environment. The perspectives of each person involved in the case (i.e., Dr. Barry, John, Dr. Kent and Jim, who represents the greater scientific community) are discussed below.
Dr. Barry has a responsibility to John to ensure that he receives proper credit for his work, particularly since, as a graduate student, John will be significantly affected if he and Dr. Barry are scooped by competitors. More importantly, however, she has a responsibility to be sure John understands what defines appropriate scientific practice. If she does not address Dr. Kent's unethical behavior, John might get the message that Dr. Kent's actions are acceptable in the scientific community.
Dr. Barry must also consider that, as a junior faculty member, she is under a great deal of pressure to publish multiple articles in first-rate journals and to actively pursue extramural funding. If John's findings are reproducible, then she must weigh her responsibilities to her own development as a scientist and tenure-seeking faculty member against her responsibilities as John's mentor. A consulting and/or collaborative connection with a major pharmaceutical company would no doubt be a lucrative relationship. However, she must determine the impact of such a decision on John and the other students and post-doctoral fellows for whose training she is currently responsible. An additional aspect of establishing an association with the pharmaceutical company that approached Dr. Kent is that she would be strengthening her ties with Dr. Kent. Considering his previous unethical behavior, aligning herself with Dr. Kent is probably not a prudent choice. Moreover, she does not know how John's replicate experiments will turn out. If they do not reflect Dr. Kent's presentation results, then the pharmaceutical company will probably rescind its offer to collaborate and Dr. Barry will be left with a tainted reputation. These are important discussion points for students, but they can also be elaborated on by faculty.
While it is justified to condemn Dr. Kent for his actions, it is also possible to use him as an example of the enormous pressure under which medical school faculty function. This is another opportunity to bring faculty into the discussion for comments on how to deal with such temptations. As the Director of the Breast Cancer Center, Dr. Kent is under more pressure than most to be a productive physician-scientist. He probably has substantial clinical duties in addition to his research activities. Since his lab has not been particularly productive in the past few months, it is possible that he simply made a bad decision in presenting John's findings.
However, Dr. Kent's culpability is compounded by his apparent fabrication of data. At the discussion group meeting, John clearly presented his findings as preliminary, with one set of replicates (three mice per treatment group); however, Dr. Kent presented the results of multiple experiments in a bar graph format. Either Dr. Kent miraculously replicated the experiments in a matter of weeks, or he fabricated the replicate data. Unfortunately, the latter is most likely, as in vivo experiments often require months to complete. This point is not explicitly stated in the case study, and it offers an opportunity to play out scenarios for discussion (i.e., have participants consider what would change if Kent did or did not fabricate the data).
A more subtle point is that Kent is trained as a physician, not a scientist; that might have a dramatic effect on Kent's perspective. Physicians often have different notions regarding the communal use of data within a research group. This point might also generate discussion on the scope of research ethics training at the institutional level (i.e., all persons engaged in research activities would benefit from such training, not just graduate students). Physicians are not likely to be as sensitive to the competitive nature of science as basic science faculty members might be. Moreover, MDs and MD/PhDs are more likely to receive funding for clinically relevant research grants; Kent may not be aware of the intense competition that new or younger PhDs in the basic sciences face to obtain and sustain funding. Second, Kent probably has never trained a graduate student and is not familiar with the role of a mentor in graduate student research training. Thus, he might be ignorant of the value of John's work to his future as an independent scientist. Finally, physicians are often more concerned with expediting the flow of information, particularly novel, efficacious therapeutics, from the bench to the bedside. Dr. Kent's comments to Dr. Barry are an attempt to stimulate this line of discussion. Participants can debate the pros and cons of such motivations.
Another perspective to consider is that of John. This case places John in a precarious position. He must trust Dr. Barry to represent his interests with Dr. Kent and to assert her (and his) right to control the dissemination of the data. Dr. Kent's premature presentation has left John in the position of having to publish these data as soon as possible, ideally before any competing labs can perform similar experiments. A point of discussion revolves around the consequences of John's project turning into a collaboration with the pharmaceutical company. This possibility leads to a host of issues, including publication rights and sources of research dollars, among others. Each of these topics can be integrated into the case study depending on the time allotted for discussion and the audience.
Jim represents the greater scientific community and researchers in the breast cancer field in particular. Clearly, other breast cancer investigators have a vested interest in obtaining data and information like John's research. The practice and advancement of science depend upon the publication and dissemination of new results. However, if Kent fabricated a portion of the results he presented, then Jim and the rest of the scientific community cannot depend on the research to guide their own. For the greater community of researchers, it is more useful to have complete sets of data with valid results and conclusions that might not be very interesting than to have incomplete or invalid sets of data with erroneous conclusions that appear more exciting. In the latter case, investigators will waste time, energy and resources following up an artifact.
Taking all these perspectives and issues into consideration, Dr. Barry has a few options for a plan of action. As a junior faculty member she is in rather dangerous territory. However, since Dr. Kent does not hold an appointment in her department, he has no tangible control over her professional future at the university (i.e., tenure decision, etc.). Barry will first need to solicit the opinion and support of other members of the Breast Cancer Group. Her next move should be to contact her department chairperson and discuss the incident. This way she may be able to gain support from other faculty who are on more even ground with Kent (tenured full professors). Next, or alternatively, depending on the relationship with her chair, she should report the matter to the Office of Research or the Office of Research Integrity. This is an excellent opportunity to discuss the appropriate institutional policies and procedures regarding issues like scientific misconduct and whistle blowing. As a last resort, Barry can also contact the International Breast Cancer Meeting organizing committee or society directly. However, before taking such action with an organization outside the university it is best to go through the proper institutional channels. The organizing committee and/or society could then be contacted in an official statement from the university. This way Dr. Barry would not need to be mentioned specifically. This anonymity might be important as she could be discriminated against in future dealings with the International Breast Cancer organization.
The onus rests with Dr. Barry; she is confronted with a number of dilemmas and has a variety of responsibilities. Dr. Kent's actions are clearly unacceptable and highlight what can happen when ground rules for control of data are not established in a group meeting or joint lab meeting setting. It is important to include options that will help to avoid such situations. One choice is to refuse to participate in discussion groups. That is not a very realistic option since much can be learned in such meetings. A better option is for Kent to confer with Barry regarding his upcoming presentation and ask for her permission to mention the findings. Perhaps the two of them could develop a more traditional collaboration on the project. Finally, this case highlights the need for the development of clear guidelines for the discussion group's operation (i.e., Kent's role in handling dissemination of findings) before the first research presentations. This way each investigator is aware of the ground rules for the group, and situations like the one described can be avoided.
Part 1
The objective of this case was to create a situation where two aspects of being a successful and ethical scientist come into conflict. In this case, maintaining reviewer confidentiality challenges the scientist's ability to honor her responsibilities as a mentor. In an ideal situation, this case will engender a discussion of both the importance of reviewer confidentiality and the specifics of being a responsible mentor. It may also help the discussants to think about situations where they will be forced to juggle the various aspects of a scientific career. These situations may push the ethical scientist to look for solutions that may not be obvious at first.
Question 1. This question brings up the inherent conflict of the skilled reviewer. A scientist who is knowledgeable and prominent within a particular field will often be working on questions related to the work being reviewed. The individuals most qualified to review a paper will also be those who stand to gain or lose the most from the information it contains. A discussion focusing on this question should address the fact that there are no universal criteria detailing the situations in which it is acceptable to review a paper. It is almost entirely up to the individual reviewers to decide whether they will be able to maintain objectivity or whether reviewing the paper will present a conflict of interest. How do those reading this case think they would make a decision like this? Can any absolute criteria or considerations be identified?
Question 2. This question addresses the heart of the conflict. If Dr. Ethicos suggests that Sarah add GFX to the cells, she will be acting on information received in confidence. Furthermore, her suggestion will open a can of worms if Sarah ever wants to publish her results, especially if the original reviewed paper remains unpublished. Essentially, Dr. Ethicos and Sarah would be claiming credit for an idea that belonged on some level to someone else.
That brings up another very tricky question that often causes problems in scientific research: Who (if anyone) owns an idea? The discussion of this question could focus on who would be affected by Dr. Ethicos' decision and how they would be affected. Clearly, in the short term Sarah (and Dr. Ethicos by extension) would probably benefit from the suggestion to add GFX, or at the very least they certainly would not be harmed. However, Dr. Spacely could certainly be harmed by Dr. Ethicos' decision. The scientific community itself could be harmed if there were a general perception that reviewer confidentiality was not being honored.
This scenario has the added complication that Dr. Ethicos wouldn't be proposing that Sarah analyze the interaction between GFX and survivin (which was the essence of the reviewed paper) but simply that she use GFX as a tool to help with her cultures. Even though Dr. Ethicos would only be mentioning the idea to Sarah in the hopes of getting her cultures up and running, without intending for Sarah to focus on the GFX-survivin interaction itself, this is still starting down a slippery slope. The questions of where ideas come from and whether or not the origin of ideas can be regulated are additional powerful, subtle and tricky questions that the discussants may or may not wish to take up.
Question 3. The discussion of this question should bring up the mentoring side of the conflict: it would be better to tell Sarah as soon as possible. It should also bring up the reviewer side: if Dr. Ethicos is to remain an ideally ethical reviewer, she should wait until the paper is published or the information is made public at a conference, a talk or some similar venue. In some ways, this is a lose-lose situation. No matter how long Dr. Ethicos decides to wait, someone ends up losing out.
Question 4 simply serves to add even more weight to the scale on the side of mentor responsibilities without decreasing Dr. Ethicos' reviewer responsibilities. This question is designed to show how frustrating and tricky a situation like this could be. If the discussion is confined to the conflict itself, the discussants may find they have hit a wall. At first glance there seems to be no ideal ethical solution.
Question 5 throws another red herring of sorts into the discussion. One could see how easy it would be for Dr. Ethicos to convince herself that no real harm would be done by mentioning GFX to Sarah and that the information contained in the paper she reviewed will get out into the public domain somehow or another. However, essentially this rationalization does nothing to address the inherent conflict. Even if some other reviewer or individual had indeed broken reviewer confidentiality, that should have no bearing on Dr. Ethicos' ethical decision. To quote a useful cliché: Two wrongs don't make a right.
It is possible that the discussion will break down at some point. Some of the group may feel that reviewer confidentiality should really take precedence in this case, while others might feel mentor responsibilities should predominate. This case is designed in some ways to lead to this type of standoff.
However, at some point during the discussion a successful resolution to this dilemma might be brought forward with the realization that it is in fact possible to break reviewer confidentiality in special cases and reveal oneself to the author of the reviewed publication (ideally with the blessing of the journal editor). If Dr. Ethicos revealed herself to Dr. Spacely and explained her desire to assist Sarah in a way that should not harm Dr. Spacely's research, a satisfactory solution might be achieved.
This type of solution would preserve the inherent tenet of reviewer confidentiality, which is essential to a functional peer review system in any field, scientific or otherwise. It would also be an example of exemplary mentor responsibility, where a scientist is willing to expend extra time and energy to help further the prospects of her student while still remaining an admirable role model for that student. Lastly, this situation emphasizes the value of what might be considered lateral thinking. Ethical situations are complex and often require creative solutions that may not be immediately obvious to those facing the ethical dilemma. In some ways practice, in the form of the discussion of hypothetical situations, is the only way to try to prepare oneself to handle the inevitable ethical conflicts that will arise.
It is possible that the discussants may arrive at this solution early in the discussion before they have seen the zero sum game nature of the original conflict. The discussion leader might then want to introduce the question of what to do if Dr. Spacely refuses to allow Dr. Ethicos to mention the GFX to Sarah, which brings the discussion back to the initial problem.
This case study is intended to highlight the differences between "advisers" and "mentors" and to show the positive effects a good mentor can have on a graduate student. Because mentoring can be construed differently across disciplines, clarification is needed. In academic settings, the term "mentor" is often simultaneously associated with the term "faculty adviser." In this case, however, the research adviser and mentor are not only two different people, but also come from different disciplines.
The Committee on Science, Engineering, and Public Policy stated that "A fundamental difference between mentoring and advising is [that mentoring is] more than advising; mentoring is a personal, as well as, professional relationship." (Committee on Science, Engineering, and Public Policy, 1997, 1) Positive mentoring requires effort from both parties involved. A motivated graduate student helps the process of mentoring along, while the professor feels that she is not wasting anyone's time. Unfortunately, there is no optimal formula for positive mentoring. Each situation is complex, with many different factors entering the formula. Mentoring can differ on the basis of discipline, personality type, gender, ethnicity, knowledge of subject matter, and status of graduate student and professor.
The original concept of mentoring is an ancient one. Homer describes the first mentor as the "wise and trusted counselor" left in charge of Odysseus' household during his travels. Athena acted as the mentor and became the guardian and teacher of Telemachus, the son of Odysseus. In the context of today's higher education, mentoring has many different facets. A mentor's primary responsibility is to help a graduate student and to take an interest in the student's professional development. This responsibility requires patience, trust, effective communication, good role modeling and understanding from both parties involved. It also requires that both the professor and graduate student fully understand the ethics of research and abide by federal and institutional regulations and guidelines.
Swazey and Anderson suggest that a good mentor be skilled in interpersonal relationships and genuinely interested in the mentee's professional development. In addition, they suggest that the mentor be involved in teaching effective communication skills to the mentee. It is not surprising that research has shown that both faculty and graduate students consider mentoring relationships rare. (Friedman 1987)
An adviser, by contrast, performs more narrow or technical functions such as "informal advising about degree requirements, periodic monitoring of an advisee's research work and progress toward his/her degree" (Swazey and Anderson 1996, 6). In addition, the adviser usually serves as the principal investigator and/or laboratory director for the graduate student's project. In this capacity, the adviser instructs the graduate student on design, methodology, literature review, proposal and other aspects of the dissertation research.
This case study demonstrates the differences between adviser and mentor by suggesting that the two need not be the same person, or even come from the same discipline. Simpson's egregious ethical mistake undermines his position as adviser. Simpson's behavior effectively demonstrates the term "toxic mentoring" coined by Swazey and Anderson (1996). They cite four types of undesirable or "toxic" mentors: "avoiders" - mentors who are neither available nor accessible; "dumpers" - mentors who force novices into new roles and let them "sink or swim"; "blockers"- mentors who continually refuse requests, withhold information, take over projects, or supervise too closely; and "destroyers or criticizers" - mentors who focus on inadequacies. (From Darling 1986, quoted in Mateo et al. 1991, 76)
Although this case study raises several issues, such as whistle blowing and the vulnerable position of being both an advisee and employee, it is important to underscore the differences between the mentor/mentee and adviser/advisee relationship as it may affect the ethical environment for both faculty member and student. Effective communication is paramount in both relationships. Interestingly, a recent survey of graduate students at one university reported that just over half of all graduate students surveyed (52%, with 40% agreeing and 12% strongly agreeing) believe that communication between faculty and graduate students is satisfactory. While that result is gratifying, the survey raises questions about why 48% found communication between graduate students and faculty unsatisfactory.
A positive mentoring relationship can be an important asset to the graduate school process. If properly mentored, graduate students can expect to grow academically, professionally and personally and develop the skills necessary to become mentors themselves in the future. The mentor/mentee relationship cannot be ignored in higher education and should not be confused with the adviser/advisee relationship.
References
- Adams, Howard G. Mentoring: An Essential Factor in the Doctoral Process for Minority Students. Notre Dame: National Consortium for Graduate Degrees for Minorities in Engineering and Sciences, Inc. (GEM), 1992.
- Burke, Ronald J., Carol A. McKeen and Catherine S. McKenna. "Sex Differences and Cross-Sex Effects on Mentoring: Some Preliminary Data." Psychological Reports 67 (1990): 1011-1023.
- Burke, Ronald J., Catherine S. McKenna and Carol A. McKeen. "How Do Mentorships Differ From Typical Supervisory Relationships?" Psychological Reports 68 (1991): 459-466.
- Bushardt, Stephen C., Cherie Fretwell and B. J. Holdnak. "The Mentor/Protégé Relationship: A Biological Perspective." Human Relations 44 (No. 6, 1991): 619-639.
- Carden, Ann D. "Mentoring and Adult Career Development: The Evolution of a Theory." The Counseling Psychologist 18 (No. 2, April 1990): 275-299.
- Committee on Science, Engineering, and Public Policy. Adviser, Teacher, Role Model, Friend: On Being a Mentor to Students in Science and Engineering. Washington, D. C.: National Academy Press, 1997.
- Darling, L. A. "What to do About Toxic Mentors." Nurse Educator 11 (No. 2, 1986): 29-30.
- Edlind, Elaine P., and Patricia A. Haensly. "Gifts of Mentorship." Gifted Child Quarterly 29 (No. 2, Spring 1985): 55-60.
- Friedman, N. Mentors and Supervisors. IIE Research Report No. 14. New York: Institute of International Education, 1987.
- Gilbert, Lucia Albino, and Karen M. Rossman. "Gender and the Mentoring Process for Women: Implications for Professional Development." Professional Psychology: Research and Practice 23 (No. 3, 1992): 233-238.
- LaPidus, Jules B., and Barbara Mishkin. "Values and Ethics in the Graduate Education of Scientists." In William W. May, ed. Ethics and Higher Education. American Council on Education. New York: Macmillan Publishing Company, 1990.
- Mateo, M. A.; K. T. Kirchoff and M. G. Schira. "Research Skill Development." In M. A. Mateo and K. T. Kirchoff. Conducting and Using Nursing Research in the Clinical Setting. Baltimore: Williams and Wilkins, 1991.
- Newby, Timothy J., and Ashlyn Heide. "The Value of Mentoring." Performance Improvement Quarterly 5 (No. 4, 1992): 2-15.
- Noe, Raymond A. "An Investigation of the Determinants of Successful Assigned Mentoring Relationships." Personnel Psychology 41 (1988): 457-479.
- Council of Graduate Schools. Research Student and Supervisor: An Approach to Good Supervisory Practice. Washington, D.C.: Council of Graduate Schools, 1990.
- Pimple, Kenneth D. "General Issues in Teaching Research Ethics." In Robin Levin Penslar, ed. Research Ethics: Cases and Materials. Bloomington: Indiana University Press, 1995.
- Roberts, Priscilla, and Peter M. Newton. "Levinsonian Studies of Women's Adult Development." Psychology and Aging 2 (No. 2, 1987): 154-163.
- Schockett, Melanie R., and Marilyn Haring-Hidore. "Factor Analytic Support for Psychosocial and Vocational Mentoring Functions." Psychological Reports 57 (1985): 627-630.
- Swazey, Judith P., and Melissa S. Anderson. "Mentors, Advisers, and Role Models in Graduate and Professional Education." Washington, D. C.: Association of Academic Health Centers, 1996.
- Wigand, Rolf T., and Franklin S. Boster. "Mentoring, Social Interaction and Commitment: An Empirical Analysis of a Mentoring Program." Communications 16 (1991): 14-31.
At first glance, this case may lead the reader to focus on common practices between students and mentors. Traditionally, an adviser reviews a graduate student's proposal before it is presented publicly. Dr. Edgar did read and revise Janet's proposal, but he then failed to properly advise her and prepare her for the proposal defense. However, the ethical dilemma in this case is not only the one that Janet faces, but also the dilemma that Dr. Edgar faces.
This scenario was written with two goals in mind: 1) to promote discussion between graduate students and faculty on student-mentor relationships and 2) to encourage discussion between academicians about professional responsibility. Since this case involves two different issues, it would be useful to address each of these issues separately.
Student-Mentor Relationship
One common problem that arises between graduate students and their advisers is that the two parties fail to discuss their responsibilities and roles at the outset. Since both the graduate student and adviser have responsibilities to each other and to the other faculty, establishing rules at the beginning of the student-mentor relationship might avoid negative consequences for both concerned parties later.
Question 1 was written to encourage discussion of the responsibilities involved in the student-teacher relationship by focusing on Dr. Edgar's behavior toward Janet. Clearly, Dr. Edgar misinformed Janet about the seriousness of the proposal meeting. He also failed to inform the committee how he had advised Janet, omitting the fact that he gave Janet the revisions just a few hours before the meeting and that he was aware of the design flaw in an earlier draft.
Should Dr. Edgar own up to his responsibilities? Or is it Janet's responsibility to inform her committee about Dr. Edgar's late revisions?
How would the answer to Question 1 change if Janet were in her last year of graduate school? Or if her funding were cut off as a result of this meeting? Conversely, how would the reader's opinion change if we find that Janet is a student who does sloppy work? Is it still Dr. Edgar's responsibility to help Janet in her progress through the program?
According to a recent report produced by the Committee on Science, Engineering, and Public Policy, the mentor's primary goal or obligation is to further the student's education (National Academy of Sciences, National Academy of Engineering and Institute of Medicine, 1997). Thus, Dr. Edgar not only has a responsibility to help Janet through the program, but he has an obligation (regardless of other commitments) to help her through. Graduate students and faculty advisers assume different roles within the department and the university, and these roles might conflict. In this case, Dr. Edgar is Janet's adviser, but he is also now an administrator at the university. His conflicting roles create time pressure in his life, and each role takes time away from the others.
The reader could be asked how this situation would change if Dr. Edgar did not have an administrative position. Certainly it would be easier for Janet and Dr. Edgar to work together because Dr. Edgar would have one less role at the university. But the reality is that professors have many roles. Oftentimes they have to spend less time on research and advising duties to complete committee or administrative work. Question 2 was written to encourage discussion about how the role of adviser should be maintained. Professors often accept new students for whom they do not really have time and with whom they spend very little time. Perhaps Dr. Edgar should not have accepted Janet and the new administrative position at the same time. Effective advisers are good listeners, good observers and good problem solvers (NAS et al., 1997). In addition, effective advisers keep in touch with each graduate student and respect the goals and interests of good students. Thus, regardless of Dr. Edgar's new position, one of his responsibilities is advising graduate students, and he should assume this role with commitment and dedication.
Professional Responsibility
After learning that Tom experienced a similar situation, Janet decides that it is her professional responsibility to inform the department head of Dr. Edgar's behavior. Questions 4, 5, 6 and 7 all focus on this issue of professional responsibility. These questions were written to explore whether the consequences of Dr. Edgar's behavior (both for Janet and Tom) should influence Janet's decision to speak to the head.
The reader should consider whether Dr. Edgar is a tenured faculty member. If he is untenured, then Janet's decision can affect his future at that department. Should his mentoring abilities (or lack of abilities) affect his ability to obtain tenure?
In addition, Janet's decision will affect her own future. Most likely she will no longer be advised by Dr. Edgar, and she might eventually feel forced to leave the department.
Consider Dr. Edgar's position. What is it like to face your fellow faculty members after realizing your faults as an adviser? Should Janet give Dr. Edgar some consideration before approaching the department head? The answer to this question is, of course, yes. I am not arguing that Janet should behave altruistically or even that she should do unto others as she would have others do unto her. However, her position at the university is dependent on Dr. Edgar. So most likely she will ruin or, at the very least, damage her career by reporting Dr. Edgar.
A final position to consider in this case, in terms of professional responsibility, is Dr. Smith's position. As head of the department, is it his responsibility to look after Janet's concerns? If Dr. Edgar is untenured, is it fair to Dr. Edgar to bring this issue before the tenure committee? Dr. Smith's position is tenuous. Suppose he is elected by the faculty. Does he have the same responsibilities to the graduate students as he does to the faculty?
In summary, this case concerns two general issues -- the student-mentor relationship and professional responsibilities. With both issues, all positions and relationships should be considered before Janet can make a final decision. However, it is fairly clear that Janet must do something. She must find a way to protect her career interests and to address these issues without purposefully damaging Dr. Edgar's career.
References
- National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. Adviser, Teacher, Role Model, Friend: On Being a Mentor to Students in Science and Engineering. Washington, D. C.: National Academy Press, 1997.
Expectations and Responsibilities of Interested Parties
New graduate students will have certain expectations from their advisers, such as guidance and training. When problems arise, most students assume the faculty member can offer support. Anna initially assumes that her relationship with Professor Creasin is fiduciary, and that each of them is receiving mutual benefits based on an implied trust. After their discussion in Professor Creasin's office, it is clear that their relationship is more paternalistic with Professor Creasin assuming he will make decisions on this issue. Anna is concerned about the safety of others in the lab, including Dan and Professor Creasin. She must also think of physical harm to the other lab students, particularly the students using the range top oven, as well as herself and her fetus. Future graduate students could be exposed to the lead compound, as well as other people who have reason to walk through the laboratory space. Additionally, Anna has reason to be concerned with the economic status of low-income graduate students who are financially dependent on Professor Creasin. Anna also has a responsibility to Dr. Moore to inform him of situations she is now trained to recognize as unsafe. Anna's final concern is directed toward Professor Creasin's university appointment, since she has agreed to work under his guidance and do research for him that supports his overall goals. Anna also has expectations about her own career goals and financial security while enrolled in graduate school. She has reason to be concerned that these could be compromised if she is forced to report the situation to the Materials Safety and Policy Department against Professor Creasin's wishes.
Professor Creasin has many responsibilities toward his students, including ensuring their safety and providing the funding that he has promised. While his primary concern should be with the health and safety of his students, he seems to be most concerned with the status of his tenure and research papers. His progress also affects his students, since their work would most likely be discontinued if he were to leave State U.
Professor Creasin knowingly exceeded safe lead exposure levels and failed to comply with biohazard research regulations. Professor Creasin is also aware that students eat and cook food in a laboratory setting. The drilling of the solid compound is another unacceptable practice because it creates airborne particles. Professor Creasin describes these unsafe practices as "a small problem" and says he will consider looking into the situation only after his tenure is assured. Correcting the situation would inconvenience him financially and professionally.
Professor Creasin expects that his students will work for him, since he is supporting them financially, and that they will contribute to his research. He expects to have the final decision on matters in conflict, and he assumes that the students will not go over his head when there is a disagreement.
Dan expects to benefit from the publication of ground-breaking research and assumes that the project will continue. He would lose time and effort if the project were shut down. He expects Professor Creasin to act as a mentor to him and assumes that the professor will consider his safety and well-being. Dan's main concern is unknown. He could be very upset with Professor Creasin for allowing unsafe lead levels, or he might agree that the deviations are irrelevant and that, since he will only be working on the project for a limited time, the professional gain will outweigh these risks.
Options for Actions and Arguments for Acting
Anna has several choices, the simplest of which is doing nothing. Frequently the best choice is not the easiest. Anna appears to have taken the safety lectures seriously. She has made an initial attempt to correct the situation by informing Professor Creasin of the safety hazard. She also notices that students are using the oven in an unsafe manner when she returns to the lab. Although she is a new graduate student, she has probably witnessed this practice before, but she was unaware of the hazards of airborne lead particles. The other students have not benefitted from the safety seminar and probably assume that cooking in the oven is safe. There is now a differential in knowledge between Anna and the other students. The only other person who is informed about the hazard is Professor Creasin, and he will not be pursuing the problem for a while, if at all.
Keeping quiet does not seem to be the option that Anna would be the most comfortable with, in light of the problems it can mean for her and the other students. If Anna believes that she can still maintain a fiduciary relationship with Professor Creasin, then she could try approaching him again with notes she has taken from the seminar, explaining that these are the guidelines set up by Dr. Moore and not her own arbitrary standards. Anna is now forced to decide whether she will break the relationship by speaking to someone in authority about her concerns or succumb to the pressure Professor Creasin is placing on her.
An intermediate option is to tell all of the graduate students in the lab about the safety lectures and not mention her discussion with their faculty adviser. This course of action allows Anna to remove the knowledge differential and makes all the students responsible for their own decisions. If Anna does not mention that she has spoken with Professor Creasin, she can later say that she spoke to the students before he told her not to mention it to anyone else. Although she would be intentionally lying, it can be argued that the moral rule of not hurting others imposes a higher burden than not lying. Although lying about when the students were told the truth and intentionally failing to inform the students about the safety risks can both be classified as deception, utilitarian ethics would classify the lie as less deceptive. Although Anna would be lying to Professor Creasin, the greatest good might arise from informing the students.
Professor Creasin has deceived the students in his lab. He designed the experiment even though he realized that it would be potentially harmful to the student working on it and other students in the vicinity. His anger implies he might retaliate against Anna if she were to blow the whistle; retaliation would fall under the category of misconduct. The consequences of Professor Creasin's unsafe practices can harm individuals inside and outside of the lab setting.
Arguments for Doing Nothing
The case study is written from the perspective of a single person, Anna. It is Anna who considers Professor Creasin to be petty and easily upset. Only Anna attended the seminar and had the initial perception that a problem existed. Anna has not yet discussed the problem with Dr. Moore, so her conclusions are based only on information from the seminar. Professor Creasin has explained to her that the violation is only a small problem and that he will look into it later. Perhaps Anna has been wrong about her assessments of Professor Creasin and the problem with the lead. Perhaps he will correct the situation at the end of the semester. Professor Creasin has more knowledge about the material science field in general and specifically with the experiment that he designed. Anna stands to lose her relationship with Professor Creasin and disrupt the lab during an investigation if her analysis of the situation is incorrect.
This case is intended for a graduate level audience in any field of investigative research. Researchers can become very protective of their own work. Their ultimate goals do no necessarily depend on the success of their peers. Credit is often assigned to a few, maybe only one person - the fewer the persons, the more prestige. A successful dissertation also requires sufficient results. At the same time, it is vital that the research group work well together. A lot depends on a successful working relationship, including the group's reputation, advancement of research and, ultimately, getting publications/grants.
What this case really gets at is the issue of ownership of ideas. Ideas (in theory) cannot be copyrighted or patented, and university research should be open and available to all (a debatable point). Most professors would probably read this case and state, at worst, "There is no problem here," or, at best, "The problem is just a lack of communication." The professor often believes that he or she owns the work and that problems of this sort should be solved by the group members. Graduate students implicitly have very little power when it comes to owning ideas; however, they do have a lot of work at stake in these ideas. In many cases, interpersonal relations within the group are not a sufficient mechanism to solve these subtle ownership issues.
The case is designed to start discussion about the hierarchy of control within a group - who should control research, what should be understood by every group member about the mechanism of control? Research groups have many implied social and professional arrangements. It is important to know who is in charge and to what degree they control the work before one begins a research project.
Swen's action, while it seemed innocent enough to him, has harmed members of the group. Discussion should involve the consequences of Swen's decision, which should help to illustrate the potential problems. One obvious problem is that Jeremy will now receive credit for part of Beverly's original idea. This credit is valuable, which raises the issue of what kind of value should be used to measure work at the university. Has Beverly been deprived of something valuable without her knowledge or consent? Furthermore, Swen has used his position to steal Beverly's ideas, in some sense. To what degree this is theft is debatable. Whether Swen had an obvious motive, or whether he had prior knowledge of Beverly's attachment to her work, does not really change this essential question.
A poster on the door of a biologist's office reads, "You wouldn't be here to protest animal experimentation, if it weren't for animal experimentation." Whether or not that statement is true, we all enjoy the benefits of animal experimentation. The drugs we take were tested first on animals. Many medical advances that guide physicians and prolong our lives owe their discovery to the animal "models" chosen by the researchers. But is that right?
A philosopher friend of mine described a thought experiment he conducted to answer the question. Suppose you were in a room with fifty puppies. In the next room were all of the members of your family being held hostage by terrorists. You could foresee the future (important since terrorists are not to be trusted), and you knew that the terrorists would turn themselves over to the authorities and release your family unharmed, if you submitted to their demand to break the necks of all fifty puppies. He said that he would do it. He argued that this situation was analogous to animal experimentation and clarified the value of human life.
But what if the puppies were human babies? There is a strong tendency to want to preserve human life over animal life and to preserve familiar lives preferentially. But should feelings and beliefs be our ethical guides? I would like cold, hard reason to guide these sorts of decisions. In the case of the puppies, it is clear that losing one's entire family would be a greater personal loss for most people than losing fifty cute but unfamiliar dogs. I'm sure one could get over the trauma of killing the dogs with the help of family. From a personal perspective, the best decision is to kill the dogs.
That seems to be the level at which almost all moral reasoning occurs: the personal level. One usually chooses the answer first and then seeks to justify it using argument. If one steps back a little from the personal and looks at moral scenarios as an outsider, then the apparent clarity of moral problems begins to disappear. If one were not human, what would be the correct answer in the puppy problem? Is it right to kill fifty members of Species A to serve ten members of Species B?
There is a genuine lack of objective criteria for the ethical treatment of living beings. What exactly is it that makes it acceptable to perform experiments on certain animals, but not on people? Biologists have found many similarities among animals. We all have cells. Many of our main tissue types are nearly identical across family lines. There are also many differences, of course. But which of these differences are important in determining which types of experiments (if any) are acceptable if performed on a given animal? Unfortunately, animal experimentation may be necessary to provide the information that will enable us to answer these sorts of ethical questions.
The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research, issued in 1979, elucidates three comprehensive principles that are relevant to the ethical practice of human subjects research: 1) respect for persons, 2) beneficence and 3) justice. The first principle, respect for persons, is particularly relevant to the question of deception in research. The report claims that "respect for persons demands that subjects enter into the research voluntarily and with adequate information" (p. 4). It goes on to apply this principle to formulate the requirement that subjects must give their informed consent to participate in research. This requirement of full and complete disclosure is waived, however, when
informing subjects of some pertinent aspect of the research is likely to impair the validity of the research. In many cases, it is sufficient to indicate to subjects that they are being invited to participate in research of which some features will not be revealed until the research is concluded. Such research is justified only if it is clear that 1) incomplete disclosure is truly necessary to accomplish the goals of the research, 2) undisclosed risks to subjects are no more than minimal, and 3) there is an adequate plan for debriefing subjects, when appropriate, and for dissemination of research results to them. (p. 6)
The report uses the phrase "incomplete disclosure" to indicate that its criteria apply not only to instances of outright deception in research but also to cases in which the researcher has misled subjects or has given them only partial information. I use the term "deception" here to describe all such situations in which subjects consent to participate in research on the basis of less-than-complete information. My analysis does not include an admittedly relevant question, whether the degree of disclosure makes a difference in deciding the ethical questions. In each of the cases outlined above, the researcher proposes to use some form of deception as a way of obtaining valid research results. Following, I analyze each of the three cases in light of the Belmont Report's criteria for ethically responsible research involving deception of subjects.
In Case 1, the researcher justifies deception on the grounds that awareness of her purposes will bias subjects' responses. Research in the field of social psychology has demonstrated that subjects' self-reports of attitudes can be influenced by a number of factors, including the subjects' desire to please the experimenter. It seems, therefore, that the research in this case meets the report's first criterion, that incomplete disclosure is necessary to accomplish the purposes of the research. The proposed research also meets the second and third criteria: This sort of attitude research does not seem to involve potential harm to subjects, and the researcher has included a plan for debriefing subjects following their participation.
Cases 2 and 3 similarly seem to require an element of deception to accomplish the purposes of the research. In Case 2, the study of conformity requires that subjects not be fully informed, or their behavior would not be spontaneous. The same reasoning applies to Case 3 -- subjects who knew that the research was measuring helping behavior quite naturally would help! Cases 2 and 3 differ, however, on the second criterion, undisclosed risks that offer the potential of harm to subjects. The research on group conformity is not likely to pose a risk to subjects; they are merely discussing a controversial issue, then reporting their attitudes. The research on helping behavior, on the other hand, is likely to entail some degree of harm to subjects. The experimental setup involves placing subjects in a situation that requires a difficult choice (to act or not to act) and then complicating that choice with the powerful influence of others. The subjects are likely to experience mild to extreme distress in such a situation. Case 3, therefore, does not meet the Belmont Report's second criterion of avoiding all but minimal risk to subjects.
With regard to the principle of voluntary consent, both Cases 2 and 3 are suspect. The researcher is also the instructor for the course, which presents a dilemma for students who may be uncomfortable about participating in an experiment. Although the researcher has included an alternative to participation (the 50-page paper), does this option constitute a true alternative? That is, is the option of not participating equally palatable from the student's standpoint? Consider that students may choose to participate in the experiment in spite of their apprehension because the paper option presents a heavy addition to the students' workload when compared to the one-time, one-hour appointment with the researcher.
These issues are complicated when the debriefing of subjects is considered. In Case 2, I noted that this experiment on group conformity was not likely to entail harm to subjects. That is true of the experiment itself -- but possibly not true for the debriefing. The debriefing in this case may do what Diana Baumrind has called "inflicting insight" upon subjects (quoted in Murray, 1980): When they are told that the researcher was actually studying group conformity, subjects who conformed may gain knowledge of themselves they would prefer not to have. Participation in this experiment, for these subjects, provides direct evidence of character traits most of us like to think we don't hold. We believe that we have minds of our own, that we don't bend too easily to outside pressure, etc. Gaining knowledge to the contrary (which, remember, was knowledge that subjects did not consent to gain) may cause subjects embarrassment or a lowering of their self-esteem. The effects of debriefing in Case 3 are similar, but the ramifications of unrequested knowledge are potentially still more serious. It could be quite disturbing for subjects to learn that in an emergency, when someone else needs help, they could be so easily swayed to inaction. Again, subjects may attribute their behavior in the experiment to flaws of character; unknown to the experimenter, some subjects may already struggle with low self-esteem, and their participation in such an experiment could be devastating. Only in the first case is debriefing not likely to introduce or add to the potential of psychological harm to subjects.
We have, therefore, complicated our consideration of the criteria for ethically responsible research involving deception, particularly in Cases 2 and 3. The Belmont Report's second and third criteria appear to conflict: The debriefing process, which is intended in part to "consolidate the educational and therapeutic value" (Sieber, 1992, 39) of research for subjects, is in fact an element of the research that either introduces or magnifies the risk of harm to subjects. Clearly too, deception research violates the principle of informed consent: Subjects in such cases may be understandably angry when the debriefing process "inflicts insight" about themselves that they neither wished to nor consented to gain.
Note that the report's third criterion includes "an adequate plan for debriefing subjects, when appropriate" [emphasis added]. We might conclude that when debriefing introduces or magnifies harm to subjects, as it does in Cases 2 and 3, a debriefing procedure is inappropriate. In such cases, it may be better for subjects not to know what was really being measured by the study. However, the problem of paternalism arises in judging for the subjects what constitutes a harm, and in deciding what is "best" for them. Further, this position seems to violate the concept of respect for persons, a central principle of ethically responsible research with human subjects. In addition to its educational and therapeutic value, the debriefing process also seems to be a gesture of respect for the subjects of research, built on the understanding that subjects have a right to know the true nature of the research in which they participated. We are then left with a difficult choice between introducing or magnifying the risk of harm to subjects by a debriefing process, or sending subjects on their way, never knowing what was actually done to them, an unpalatable option for responsible researchers who believe in honesty in research and who regard "subjects" as partners in the research process.
Options exist, however, for making such a choice, if still difficult, at least less difficult. A sensitive debriefing can go a long way toward alleviating the psychological harm that the process may introduce to subjects. In Case 2, the researcher could make clear that the responses of subjects who conformed are in no way unusual and could briefly explain some of the mechanisms that make group influence so powerful. In Case 3, again, the researcher should point out to subjects that the majority of those studied did not help. The researcher should summarize the research done to date on helping behavior and outline what is known about why people do not help in emergencies. In both cases, an explanation of how the current research is expected to add to the knowledge of group conformity or of helping behavior and a brief statement of the ways in which greater knowledge of these social phenomena may benefit others will also increase subjects' sense of well-being following the experiment.
Another option to minimize the risks of deception research is to anticipate some of the difficulties and adopt a research plan including a milder form of deception. Sieber (1992, 67-68) notes that deception in research takes one of five forms, with each succeeding form removing more of the subjects' right to self-determination and lessening the knowledge that is the basis for their consent to participate:
- informed consent to participate in one of various conditions: subjects know that they will not know which research condition they will participate in (e.g., treatment or control, experimental drug or placebo);
- consent to deception: subjects know there is some aspect of the study that will not be fully disclosed;
- consent to waive the right to be informed: subjects waive their right to be informed and thus are not told of the possibility of deception;
- consent and false informing: subjects give consent but are falsely informed about the nature of the research;
- no informing and no consent: subjects do not know they are subjects in any form of research (as when "real-life" situations are studied, or a seemingly real incident is contrived and then observed).
Each of the three cases analyzed here could be considered an example of consent and false informing: In each case, subjects have given consent but are not told what is actually being studied. Case 1 illustrates what one might consider a mild form of false informing -- that is, subjects are not fully informed because of the vagueness of the explanation of the study's purpose, but neither are they lied to outright. Yet because subjects have not consented to any form of deception (they do not know that they are not being given full and adequate information), the case is still an example of consent and false informing. Cases 2 and 3 are clear-cut examples of consent and false informing.
The question then becomes, "Could the research purposes in these three cases be accomplished by employing a 'lesser form' of deception, one that preserves to a greater degree subjects' rights of self-determination and knowledge of the research?" In Case 1, it is questionable whether the accuracy of subjects' attitudinal responses would be compromised if they knew that the researcher could not tell them exactly what was being measured. If they were told that they weren't "getting the whole story," would their responses differ from the responses they would make when they were trying to guess at the purpose of the research? It seems that a milder form of deception might be feasible in Case 1; a well-informed researcher must make that judgment. In Cases 2 and 3, it is more difficult to imagine that any milder form of deception than consent and false informing would result in subjects behaving as they would when they were unaware of the study's purposes. In the study on helping behavior, if subjects were at all aware that they had not been fully informed, they would be quite likely to recognize immediately that the "emergency" was contrived. In the study on group conformity, it is possible that subjects would be so busy trying to figure out what was really being measured that they would not behave at all spontaneously or naturally in the group. It seems, then, that in at least two of the cases, the research cannot be accomplished without deception that limits subjects' autonomy.
However, a further determination must be made before the use of deception in research can be justified. The Belmont Report does not consider the worth of the research as a criterion for justifying the employment of deception. The report's criteria exclude any deception research that involves risks to subjects that are "more than minimal." Notice, however, that in this group of cases, as the risks to subjects escalate in severity, the potential benefits of the research increase as well. The study involving the greatest risk of harm to subjects, the helping behavior study, has enormous potential for increasing our understanding of the reasons people fail to help in emergencies, thereby increasing the possibility that we can develop strategies to combat those reasons. The research on group conformity has potentially beneficial aspects as well -- in increasing our understanding of the ways in which gangs operate, for example. It seems that in making decisions to undertake research involving deception, the potential costs to subjects must be weighed against the potential benefits for society.
Such a judgment is difficult to make. As Sieber (1992) points out, it is not always possible to identify risks and benefits in advance, and those that are identified are often not quantifiable. How does one weigh present harm to one individual against potential future benefits for many individuals? Sieber suggests that "common sense, a review of the literature, knowledge of research methodology, ethnographic knowledge of the subject population, perceptions of pilot subjects and gatekeepers, experience from serving as a pilot subject oneself, and input from researchers who have worked in similar research settings" (1992, 76) should all inform the assessment of risks and benefits. Imperfect as such judgments may be, they must be made. Trivial research involving any degree of harm to subjects is certainly unjustified; important research, on the other hand, may generate such benefits as to be worth some degree of harm (minimized and alleviated as much as possible) to subjects. The key is that the researcher should not be the sole authority in deciding when benefits outweigh risks: "[N]o single source can say what potential risks and benefits inhere in a particular study. . . . The benefit and justifiability of research depend on the whole nature of the research process and on the values of the persons who judge the research." (Sieber, 1992, 76-77)
Once we agree that the benefits and risks of research involving deception must be assessed together, we must consider what those benefits and risks may be. The discussion above identifies some potential benefits of the cases described here and some of the risks to subjects as well. Researchers must also be mindful of less obvious risks when considering research involving deception. These risks do not concern the potential for harm to the subjects of research, but rather entail negative consequences of such research for the researcher and for the science of psychology itself.
In a self-revealing essay entitled "Learning to Deceive," Thomas H. Murray describes his discomfort at engaging in deception in the course of research he helped conduct as a graduate student in social psychology (a helping behavior study similar to the one described in Case 3). He notes of the debriefing procedure following this study, "While I did reveal the true purpose of the study, I did not always answer all questions honestly; and I seriously doubt that I, or anyone else, could have removed all negative effects of participation" (Murray, 1980, 12). After encountering in debriefing anxious subjects who were shaking, stuttering, barely able to speak, he continues, ". . . you try to forget the queasiness in their smiles, and the uncertainty in their handshakes. You try to convince yourself that, yes, all harmful effects have been removed. But I did not believe it then, and I do not today." (Murray, 1980, 12) Disturbing as such post-study encounters may be, however, Murray identifies what he believes to be a more insidious danger of deception in research: the danger that the researcher will come to adopt an attitude of callousness, to view subjects as means to an end, and to believe that the characteristics and reactions induced by experimental manipulations in fact describe the way people are. Murray asks, "In trying to make our laboratory so much like the world, do we sometimes succeed in making our world like the laboratory?. . . Do we eventually come to see people as so easily duped outside the laboratory as inside it? And if our research induces people to behave inhumanely, do we come to believe that that is indeed the way people are?" (Murray, 1980, 14)
Such negative consequences of research involving deception do not end with the experimenter, however. The science of social psychology can itself be affected by the methods adopted by its disciples. The more prevalent the practice of deception in social psychology, the more the science comes to be associated with the practice, leading to an erosion of public trust in scientists and their purposes in any area of research in the field. Greenberg and Folger (1988) document that some social psychologists have challenged the unquestioning adoption of deception strategies, claiming that the "pool" of naive subjects grows smaller as populations, especially those such as college students who are often called on to participate in research, begin to expect to be deceived, thereby casting doubt on the validity of experimental findings. They also note that the public may acquire an attitude of distrust and suspicion regarding laboratories, scientists and even a profession that relies heavily on deception to make its progress.
A shocking incident at the Seattle campus of the University of Washington in 1973 illustrates one danger of such a widespread awareness of deceptive research methods in psychology. Students on their way to class witnessed a shooting and neither stopped to help the victim nor followed the assailant; when questioned later, some witnesses reported that they thought the incident was a psychology experiment! (Greenberg and Folger, 1988, 48). Although the criticism that "real-life" experiments lead to incidents such as the one above could be leveled as well at the movie and television industry, the example illustrates that deception in research has ramifications both for the subjects and for the science that extend beyond the time and place of the studies for which it is employed.
The discussion above, centered on three cases, illustrates why deception is employed as a research strategy and why its use has been called into question. Some of the dangers of deception are identified for the subjects, for the researcher, and for the science itself. Yet Greenberg and Folger (1988, 56) report eight studies that have indicated that subjects are bothered less about being deceived in the course of research than are the IRBs that review the proposals. If these findings are accurate, is more debate being raised about deception in research than is warranted? I believe that such findings add another element for consideration in the assessment of risks and benefits of research involving deception, but they do not eliminate the need for such consideration. Subjects in some kinds of experiments may not "mind" being deceived, but subjects participating in others may mind very much. In addition, subjects may not always recognize immediately, or ever, the subtle effects of such experimentation on their self-esteem, for example, or on their evaluations of social psychology and of scientists in general. We cannot dismiss the possibility that deception in research may have negative consequences for both subjects and researchers as well as for the science. Scientists considering deception have a responsibility to consider the costs with the benefits, and to minimize unavoidable costs wherever possible should they decide ultimately to deceive their research subjects.
References
- Greenberg, Jerald, and Folger, Robert. Controversial Issues in Social Research Methods. Springer Series in Social Psychology. New York: Springer-Verlag, 1988.
- Murray, Thomas H. "Learning to Deceive." Hastings Center Report 10 (April 1980): 11-14.
- The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Washington, D.C.: Department of Health, Education, and Welfare, 1979.
- Sieber, Joan E. Planning Ethically Responsible Research: A Guide for Students and Internal Review Boards. Applied Social Research Methods Series, Vol. 31. Newbury Park, Calif.: SAGE Publications, Inc., 1992.
Issue 1
- The issue of desecration is complex: Do actions constitute desecration, or do intentions? If remains are accidentally excavated -- e.g., when clearing land for a road -- is collection of that material desecration? Students should consider whether continued work on human remains is ethically justifiable once an interest in repatriation has been expressed.
- 2 and 3. Encouragement and discouragement play an important role in reinforcing and sanctioning behavior. Likewise, decisions on the allocation of resources are important ways to support or oppose projects. What should we do when we see a colleague entering an ethically gray area? What are our responsibilities and opportunities? These questions allow for something other than an all-or-nothing approach. Stipulations on funding can enforce or encourage certain behaviors while bringing about a compromise. Displays can be prohibited, for example, or sponsors might require the blessing of a tribal shaman or approval from tribal authorities. Many options are available; encourage students to explore them.
- It is important to note that not all Native Americans demand the return of remains for reburial. Some -- whether few or many is unclear -- are indifferent; some want remains studied and then returned; others accept preservation in museums if the remains are treated with respect. Interaction need not be hostile, and good-faith cooperation is common. Joint efforts can lead to mutual benefit, even when tribes and institutions act deliberately in their own interests. For example, research can help establish a tribe's cultural identity. In such arrangements, anthropologists get research material for comparative analysis, and descendants acquire useful knowledge. Eventually, remains can be reburied.
Issue 2
- This question raises the issue of foot dragging, a common administrative response by people who disagree with a policy change. How do well-meaning people work in this kind of passive-aggressive environment? It is important here to keep Justus an actor with honest, positive motives who is working in an environment of less-than-forthright management (sometimes obvious, sometimes subtle). This setting puts a good actor in a difficult but realistic situation where she observes ethical breaches and must respond, probably in a hostile climate.
- One of the most difficult dilemmas facing researchers who have accepted repatriation in principle is deciding when materials should be returned -- immediately upon request, after detailed analysis, after casting, etc. Arguments can be made to support each position. However, students of this case study should appreciate the potential for paternalism evident in the latter two options: only after a researcher has transferred the value of the material to a cast or data set is cooperation forthcoming. That most researchers support repatriation only in such cases -- i.e., only when the original bones are replaced with something that keeps the information intact -- can be a point of discussion among students. Likewise, it is not at all clear that transfer of value is an ethically appropriate response to repatriation. Exhibitions, for instance, often substitute copies of bones for originals. However, some critics believe that displaying these copies is equally reprehensible, arguing that copies of things are essentially like the things they replace. Additional arguments, based on spiritual issues, might claim part or all of an object's essence is transferred during the copying process, e.g., photographs might threaten theft of the soul. Simple copying turns out to be a less-than-simple option.
- This discussion raises the general issues of whistleblowing and combining ethical and prudent actions in a possibly hostile environment. Even though Justus may not be doing anything wrong, she might be in a situation to observe serious ethical breaches. What should a person in this situation do? Be sure to ask students to consider what is at stake for Justus in these circumstances -- to act or not act or to go over her supervisor's head. Encourage them to see the situation through her eyes or to compare similar circumstances in other cases. This case also raises issues of disclosure and the appropriate pace of disclosing information.
Issue 3
- By design, this situation brings Justus into increasingly difficult positions. First, she will need to decide if the excavation was legitimate rescue work or unethical guerrilla excavation. If she believes it to be a legitimate rescue, she must decide, now that she is aware of repatriation policies, whether she needs permission from the Macaques to proceed and whether the museum can claim proprietorship of material collected and undisclosed in this way.
- Sometimes it's easier to decide what clearly should not be done than it is to decide what should be done.
- This discussion explicitly raises the issue of use of data that might be construed as ill-gotten gain. An extreme position in repatriation is that all archaeological and anthropological materials are ill-gotten gain (the result of unauthorized grave robbing), but before exploring that issue, begin with a less extreme example, especially if students believe the collection of this material is unethical or illegal. Students should be encouraged to consider larger issues of ill-gotten gain: medical experiments where data is collected without patient consent, experiments on prisoners or through torture, art history on stolen or looted art work, etc.
Issue 4
- Are felonies ever justifiable? Were Hops' stalling tactics provocative? The point here is to discuss whether her response should be assessed on the basis of the desired effect (return of the material) or on principle (no one contests that the Macaques ultimately have proprietary rights to the remains). Ethical and unethical decisions are not made in vacuums and have real implications for people. Decisions to act -- or not to act -- have downstream effects, and sometimes timing and impressions are crucial. What should be the connections between decisions on ethics (the way we should act) and actions (the way we do act)? Who is responsible when decisions trigger reactions that have negative consequences? Are Hops and the museum victims, or did they get what was coming to them? What about Ten Killer and Strong Jaw? This issue can also be raised in the context of animal rights and lab break-ins, and perhaps also in relation to anti-abortion activism, chemical use, nuclear power, weapons research and medical research relating to HIV.
- Explicit distinctions are drawn here between the letter and the spirit of the law, and between legal and ethical principles as the foundation for action.
- Encourage students to separate short-term from long-term issues and goals. This discussion offers a good opportunity for student participation, perhaps through negotiation scenarios and role playing. Be sure also to discuss the different consequences of settlements that are enforced and those that grow from consensus.
Issue 5
- Resolving this issue may be simpler than students expect. The original intent of this section was to present a dispute over analysis rather than an issue of misconduct. Justus is confident in her analysis and should defend her expert opinion. Research involves the process of assessing competing interpretations. If Justus has confidence in her analysis and has managed to convince some colleagues, why should she back down? Options -- including some unethical ones -- are available to her, but they need not be pursued if she acts with confidence in her own work.
This point provides an opportunity to discuss power relations between junior (Justus) and senior (editor and reviewers) professionals: what Justus perceives, what her senior colleagues are hoping to impose, what she'll allow, and what she wants to create. Withdrawing the paper would mean totally placating the senior scholars. What is the power dynamic at work here? What are students' experiences in this regard? How is this situation best negotiated? The power relation is produced by a dialogue among different actors, in which perceptions play crucial roles. Junior colleagues should be encouraged to engage in active negotiation of power relations, not just remain passive recipients or soldiers.
2. and 3. Justus has a variety of options, but they are constrained by issues relating to the destruction of unique and contested materials, human remains, and the destruction of materials without the explicit consent of those with proprietary rights to the materials.
3. Is Justus the person who should be making this decision, especially now that proprietary rights are contested? Is she willing to stand by her original analysis? Is there an alternative means for dating these samples that does not involve destruction of materials?
Issue 6
- This situation again raises the distinctions between the letter and the spirit of the law, and between legal and ethical obligations of researchers and professionals. What are the limits of professional responsibilities? This discussion raises basic issues of artificial boundaries and jurisdiction in human society and their constraints on ethical principles. Do ethical principles transcend geo-political boundaries? If they do, what don't they transcend?
General Issues
- If excavating graves is a form of desecration or theft, should data taken from grave sites be banned from use in research? What about use of materials from plundered sites or sites where excavation was sanctioned by descendants? Some people claim that anthropologists' use of data collected from Native American graves is comparable to the Nazi use of data from medical experiments on prisoners. Is this comparison justified?
- NAGPRA is a federal law and sets a minimum standard applied only to institutions that receive federal funds. (It exempts the Smithsonian Institution, which is covered by other repatriation legislation.) If your research is funded without federal moneys, are you ethically bound to the same principles? Are there circumstances that would exempt you entirely from repatriation's ethical prescriptions? Which standards should determine your obligation: ethical ones or legal ones?
- NAGPRA applies only to the repatriation of Native American remains and sacred objects. If your research and collections concentrate on other peoples (e.g., Polynesians, Africans, etc.), are you still bound by NAGPRA's ethical prescriptions?
- NAGPRA mandates the repatriation of only those remains for which specific hereditary or cultural affiliations can be established. For remains that cannot be so identified, a committee of interested parties is empowered to determine disposition. If you sat on that committee, what would you say should be done with remains -- fragments of bones and small fractions of skeletons -- that cannot be clearly identified? Who should assume proprietary rights over those remains? What is your ethical responsibility regarding artifacts and remains that cannot be identified?
- Of the 500 original nations and innumerable bands of Native Americans, many are now extinct. Who should possess proprietary rights over remains associated with those extinct groups? Are there any circumstances in which researchers' claims for proprietary rights over excavated human remains should prevail over claims made by Native American groups?
- In considering repatriation, what is the best way to balance 1) the essentially spiritual desire for reburial of human beings and 2) the essentially materialistic desire to acquire empirical knowledge from anthropological artifacts?
Further Reading
The National Park Service has an office assigned to implement NAGPRA and oversee the federal repatriation process. For information about regulations, policy changes, implementation strategies and updates:
Archeology and Ethnography Program
National Park Service
PO Box 37127
Washington, DC 20013-7127.
www.cr.nps.gov
Publications of professional associations with members directly affected by NAGPRA -- American Association of Museums, American Anthropological Association, etc. -- contain considerable discussion of repatriation issues, as do many Native American-oriented publications. A useful collection of resources and news on recent NAGPRA developments can be found on the WWW site: http://www.usd.edu/anth/repat.htm.
This site offers the full text of the original NAGPRA law, implementing regulations from the National Park Service and extremely helpful bibliographies of professional and Native American literature on the legislative and operational aspects of the law plus much more.
This case raises several important issues, including collaboration, authorship and supervisor-trainee relationships. Discussions may focus on one or more of these general areas, depending on the interests of the participants. It might be particularly interesting to talk about this case in a group that included people at different points in their scientific careers, i.e., graduate students, post-doctoral fellows, junior faculty and senior faculty.
The interests of this case's characters include the following:
Melissa: She expects Sharon to do the best research she can and to prepare the best possible publications. She is concerned about her own tenure process, and she expects Sharon to do work that supports her advancement.
Sharon: She expects Melissa, as her supervisor and a more experienced researcher, to guide their publications and collaborations. She expects Melissa and Adam to keep her career in mind with regard to publications, exposure within the scientific community, etc.
Adam: He expects Melissa and Sharon, as collaborators, to maintain open communication regarding the progress and presentation of the work they do as part of their joint project.
Conflicts arise between:
- Melissa's desire for tenure, and thus her desire to please the conference organizers, which leads her to want to include in the review all the latest research and results from the collaborative experiments
- Adam's desire for proper credit and acknowledgement and
- Sharon's desire to please both Melissa and Adam, and to do the right thing in the context of the inherent power inequalities. She needs to learn the proper procedures for publishing and collaborating, but she also needs publications and letters of recommendation for her future career. She cannot afford to jeopardize either relationship.
Potential actions for Sharon:
1. Since Melissa clearly does not want to bring Adam into this situation, Sharon could refrain from contacting Adam and:
1a) keep the focus of the paper as it is, on the work that has already been published, and not mention the new model;
1b) suggest the model in a cartoon, with reference to a manuscript in preparation by Sharon, Adam and Melissa; or
1c) include the model in detail, in essence using this review to introduce it, with reference to a manuscript in preparation by Sharon, Adam and Melissa.
2. Sharon could call Adam and ask his advice, even though Melissa doesn't want her to.
3. Sharon could ask another faculty member who is an experienced author for advice.
Consequences of these prospective actions:
1a) This strategy will ensure that there will be no problem in publishing the future biochemical paper. However, Melissa will be just as unhappy as when the initial conversation began. If Sharon opts for this action, she will have to explain why she feels it's inappropriate to mention the new model without consulting Adam. This brings up the more general question of how to resolve disagreements between supervisors and trainees, where there are inherent power disparities. Sharon could present her thoughts to Melissa in the context of ethics, proper accepted practice for publication, and/or specific journals' rules of publication. She should have learned some of these ideas earlier in her career. If Melissa insists on including material that Sharon thinks should not be in the paper, Sharon can insist that her name be removed from the list of authors. This course would have negative consequences for her publication list and probably for her future relationship with Melissa.
1b) This option may be the most obvious compromise for Sharon with regard to the actual material contained in the review and biochemical papers, and Melissa may agree to it. However, it still leaves open the question of whether Sharon and Melissa should contact Adam before referring to their collective unpublished work.
1c) Melissa would probably prefer this option, for the sake of the publication and her tenure process. Before discussing newer, unpublished work in this sort of detail, however, Sharon and Melissa clearly need to contact Adam. Discussion of this option could focus on the conventions about unpublished data and future ability to publish within particular fields of research. It could include the proper acknowledgement of contributions, allocation of credit, and the responsibilities of authors, determined according to journal rules, the field's conventions or conversations between collaborators.
2) Melissa has made it clear that she does not like this option. This scenario also raises the question of how Sharon should present this information to Adam. She could preserve much of the three-way collaborative relationship by mentioning it casually, and asking for his advice on this publication matter with which she is inexperienced. On the other hand, she would probably damage the relationship between Melissa and Adam by saying, "I thought you should know that Melissa is trying to publish without giving you credit." The question also arises of whether Sharon should tell Melissa before, after or at all that she is discussing this question with Adam. Readers of this case would probably wonder about additional information, such as why Melissa doesn't want to contact Adam. What is the past history of their relationship? How does Melissa expect Adam to respond?
3) As in 2), we wonder how Sharon should present the information to the faculty member, with what sort of tone, and whether she should mention to Melissa that she has spoken or is planning to speak to the faculty member. To understand the complexity of Sharon's position, we must consider that post-docs need publications and letters of reference. They also need to make and maintain solid connections and collaborations with more senior researchers. In addition, post-docs often have few institutional advocates or formal channels of support, i.e., there is no postdoctoral correlate to the graduate student thesis committee or council.
This case study raises some of the ethical questions surrounding one of the "housekeeping" details of research, the assignment of authorship of a journal article. The issue is of enormous importance to researchers since decisions about promotion, tenure and the funding of grants are very often based upon the number of articles one has published. Researchers facing pressure to "publish or perish" undeniably have a vested interest in having authorship credit on as many articles as possible, and this pressure may lead to the inclusion of their names even where inclusion is not warranted by their contributions to the research project - a practice known as "unjustified" authorship. (Epstein 1993) Research demonstrates that the average number of authors listed on articles in various prestigious scientific journals has increased over the years (de Villiers 1984; Huth 1986), lending some support to the notion that unjustified authorship is widespread.
In an effort to curb this and other ethically questionable authorship practices, the International Committee of Medical Journal Editors (ICMJE) revised its "Uniform requirements for manuscripts submitted to biomedical journals" in 1988 and included stringent guidelines to be followed in assigning authorship of journal articles. These criteria, also known as the Vancouver Convention (since the ICMJE met in the city of Vancouver), are the most widely referenced criteria for authorship in scientific journals; currently, more than 500 journals require adherence by their contributors. (International Committee of Medical Journal Editors 1997) The Vancouver Convention is currently in its fifth edition.
Despite the prevalence and importance of the Vancouver Convention, many junior researchers are unaware or only dimly aware of its criteria, and few have given much thought to their limitations and problems. The questions posed in this case attempt to give the reader experience in applying the Vancouver Convention, as well as in examining whether these criteria are a culturally neutral expression of widely shared beliefs about what should constitute authorship or whether they may be inappropriate in some circumstances.
An analysis of these questions might best begin with an inquiry into whether the other members of Williams' lab can legitimately be included as authors on Charles' manuscript (and he on theirs) under the Vancouver Convention, as the proposed journal requires. Evaluating questions of authorship begins with a determination of the specific contributions of each researcher on a project, including both the type and extent of contribution. Next, the governing body of rules must be consulted and the meaning of its various provisions determined. Finally, the contributions made by each member must be evaluated with respect to the rules in order to determine whether the individual deserves to be listed as an author.
In this case, the determination of authorship is somewhat hampered by a lack of detail. Nevertheless, some conclusions may be drawn. None of the other lab members, it would seem, contributed to either the conception of the research project or to its original design. They were not involved in any data collection or in any of the routine work involved in the project. Their contributions primarily consist of making suggestions on how to overcome problems associated with the research and on how to improve it. This effort may have involved some participation in the analysis or interpretation of data. Unfortunately, neither the quality nor quantity of these suggestions can be determined. They probably contributed little to the writing since they did not help draft the article, although they may have contributed helpful comments when the draft was circulated. Given their limited involvement in the project, one doubts they would have felt comfortable doing much editing or revising. They each did receive a copy of the draft and presumably are expected to participate in the final approval of the version to be published.
The criteria of the Vancouver Convention require "substantial" contributions to each of the three specified areas, yet precisely what constitutes a "substantial" contribution is not specified. Accordingly, whether or not a contribution is enough to satisfy this requirement becomes a question to be resolved by each research group on each research project. While the Convention does not say why it does not define the term "substantial" (probably due to practical considerations), this silence would seem to allow for one of two interpretations: 1) The Convention implicitly presumes the existence of some sort of objective and universally applicable standard of what is "substantial," which any assignor of authorship could use in making the determination, or 2) the Convention intentionally makes allowance for cultural variation, since individuals from different societies may assign a different value to any given contribution when determining whether it is adequate for authorship.
If we follow the former interpretation and view the facts from the perspective of a reasonable person in the United States, the contributions of the lab members would likely be judged insufficient. The third prong of the criteria, which requires would-be authors to approve of the final version of the manuscript, would seem to be satisfied here. However, the first two prongs probably are not satisfied. Few people would regard periodically offering suggestions at lab meetings and supplying a few comments to a manuscript as enough to qualify for authorship. (The reader is encouraged to conceive of scenarios in which such contributions might arguably be deemed "substantial". For example, if a suggestion resolved a problem that prevented the research from progressing, would that be enough to qualify for authorship?) While we do not know the specific contributions of each lab member -- and so we cannot determine whether some individuals might deserve authorship -- it would seem that none have made "substantial" contributions to conception and design or analysis and interpretation of data, or important contributions to the intellectual content of the manuscript.
The second interpretation of the Vancouver Convention, that it was meant to allow for variation by the country and culture of origin of the researcher, would permit the lab members to be included as authors. Williams' conduct indicates that he clearly feels the other researchers have contributed enough to be included as authors on the paper, and they apparently agree. Since we have no reason to believe this approach does not represent the local standard of Wonkaland, we would have to presume authorship is appropriate here; indeed, if such an interpretation of the Vancouver Convention were correct, one could not easily accuse it of cultural bias. This interpretation, however, is almost certainly not correct. The chaos resulting from each country applying its own standard would subvert the standardization that the Convention clearly attempts to achieve. Moreover, a local interpretation of "substantial" might render the criteria meaningless: if nearly any contribution qualified one for authorship, there would be little need for elaborate rules defining who is entitled to it.
By failing to allow for cultural diversity, the Vancouver Convention risks criticism that it amounts to a kind of unethical cultural imperialism by the ICMJE, just as Williams argues. One might well ask, however, how guidelines on authorship might be fashioned so as to be culturally sensitive and yet still reward scientists for their effort and assign public responsibility for what is published, the two main goals of authorship. As suggested above, in this case the reader might speculate that the strong emphasis on the group in Wonkaland results in a nearly automatic authorship credit for any group member. Among the positive benefits of such a system might be that vesting the other lab members with an interest in the success of the lab's other projects would stimulate their contributions. This system would also minimize legalistic squabbling about whether someone qualified for authorship under the Convention and thus preserve group harmony. Obviously, however, it would reduce the amount of credit awarded to those who actually did the bulk of the work by diffusing it over a greater number of persons. It also would detract from the public responsibility function, since many researchers might not know enough about the research to defend it effectively.
Returning to the case, we see that the only way the lab members might be included as authors is through the use of a local definition of the criteria. This goal would be most easily accomplished if Charles can persuade Williams to select a journal that does not require adherence to the Vancouver Convention. If Williams insists on submitting the manuscript to the proposed journal, Charles will be placed in a difficult position. As a graduate student who requires Williams' continued patronage to finish his research -- and, indeed, the good will of the entire lab team -- he may have no choice but to add the names. His future professional contacts may be jeopardized if he refuses. However, submitting the manuscript with the lab members' names added will amount to lying. Charles may have to make a difficult decision.
Williams might try to claim that lying should be allowed in these circumstances. If most or all of the reputable journals follow the Vancouver Convention, he may argue that one has little choice but to lie if one wants to be published. Still, one might respond that he ought to follow the Convention in its present form while focusing his efforts on modifying it to allow for local interpretations, or otherwise working to resolve the problem. This approach would mean submitting the paper to the proposed journal with only his and Charles's names attached and crediting the contributions of the other lab members in the acknowledgments.
The issues raised in this case illustrate some of the difficulties involved in trying to establish authorship criteria that are culturally neutral and fair to all parties and still achieve the goals of giving appropriate credit and assigning responsibility. One recent proposal suggests replacing the notion of "authorship" with one of "contributorship" in which each contributor (defined as one who has added usefully to the work) spells out his or her contribution in the paper. At least one person would be required to take public responsibility for the work, and this role would be indicated in the article. (Rennie, Yank and Emanuel 1997) This approach might be one useful way of resolving many of the problems of the Vancouver Convention.
References
- de Villiers, F. "Publish or Perish -- the Growing Trends towards Multiple Authorship." South African Medical Journal 66 (1984): 882-83.
- Epstein, R. J. "Six Authors in Search of a Citation: Villains or Victims of the Vancouver Convention?" British Medical Journal 306 (1993): 765-67.
- Huth, E. J. "Irresponsible Authorship and Wasteful Publication." Annals of Internal Medicine 104 (1986): 257-59.
- International Committee of Medical Journal Editors. "Uniform Requirements for Manuscripts Submitted to Biomedical Journals." Journal of the American Medical Association 277 (1997): 927-34.
- Rennie, D.; Yank, V.; and Emanuel, L. "When Authorship Fails: A Proposal to Make Contributors Accountable." Journal of the American Medical Association 278 (1997): 579-85.
The process of peer review is based on the premise that accomplished scientists, or "experts," in a particular field of study are the most qualified to evaluate the scientific work or proposed research in that field. While most scientists would agree that this premise is valid as the basis for an effective system of peer review, it has become clear that the system has some inherent problems. For the peer review system to be effective, reviewers must be able to evaluate the proposed or completed work honestly and objectively, and they must respect the confidentiality of the work being reviewed. Indeed, most ethical problems encountered during peer review stem from the difficulty, in some situations, of avoiding conflicts of interest and maintaining confidentiality.
Many of the ethical dilemmas faced by reviewers arise from the fact that guidelines for avoiding conflicts of interest and maintaining confidentiality are often lacking or inadequate. There is a clear need for granting agencies and scientific journals to develop more explicit guidelines for reviewers in dealing with these issues, and many are beginning to adopt such policies. One starting point might be for organizations to model guidelines on those developed by the National Institutes of Health (NIH) designed to avoid conflicts of interest and maintain confidentiality during the scientific review of grant proposals. The NIH provides explicit instructions to reviewers to avoid conflicts of interest during initial review group meetings. These instructions state the following:
A member must leave the room when an application submitted by his/her own organization is being discussed or when the member, his/her immediate family, or close professional associate(s) has a financial or vested interest even if no significant involvement is apparent in the proposal being considered. If the member is available at the principal investigator's institution for discussions; is a provider of services, cell lines, reagents, or other materials, or writer of a letter of reference, the member must be absent from the room during the review. Members are also urged to avoid any actions that might give the appearance that a conflict of interest exists, even though he or she believes there may not be an actual conflict of interest. Thus, for example, a member should not participate in the deliberations and actions on any application from a recent student, a recent teacher, or a close personal friend. Judgment must be applied on the basis of recency, frequency and strength of the working relationship between the member and the principal investigator as reflected, for example, in publications. Another example might be an application from a scientist with whom the member has had long-standing differences which could reasonably be viewed as affecting the member's objectivity. Another example which might be considered is the review of a project which closely duplicates work ongoing in the member's laboratory. (National Institutes of Health 1995)
With respect to maintaining confidentiality, the NIH guidelines state:
All materials pertinent to the applications being reviewed are privileged communications prepared for use only by consultants and NIH staff, and should not be shown to or discussed with other individuals. Review group members must not independently solicit opinions or reviews on particular applications or parts thereof from experts outside the pertinent initial review group, and privileged information in grant applications shall not be used to the benefit of the reviewer or shared with anyone. (National Institutes of Health 1995)
These statements offer reviewers clear guidelines for ensuring that they do not have a conflict of interest, and for maintaining confidentiality during the grant review process. However, many journals do not provide such specific guidelines for the review of scientific manuscripts. For example, in response to the claim and subsequent lawsuit by Cistron Biotechnology that scientists at the Immunex Corp. "improperly used information from a paper they reviewed for Nature in their own research," Nature editor John Maddox commented that the journal does not explicitly define confidentiality. The only policy statement regarding confidentiality states that "colleagues may be consulted (and should be identified for us), but please bear in mind that this is a confidential process." (Marshall 1995, p. 1913) Furthermore, Nature does not require reviewers to identify potential conflicts of interest. Maddox maintains, however, that there are unwritten rules, generally understood by reviewers: the contents of manuscripts are not to be disclosed to the public and are not to be used to further the reviewer's own research.
Like Nature, many journals provide their reviewers with vague statements regarding confidentiality. Guidelines for avoiding conflicts of interest and maintaining confidentiality vary considerably from journal to journal. This lack of consistency is problematic: When explicit guidelines are not provided, it is difficult for reviewers to know what actions are appropriate. Furthermore, even when explicit guidelines are provided, there are many situations where the appropriate action is not obvious. One way for research groups to handle these issues would be to establish their own review procedures to help guarantee a fair and unbiased review.
Phase 1
In Phase 1 of this scenario, John Slater receives a manuscript from a competitor's laboratory to review; the title of the manuscript suggests that the work is closely related to ongoing research in Slater's laboratory. Slater should immediately recognize that there is a potential conflict of interest in his reviewing the manuscript. Slater's appropriate course of action would be to inform the editor of the journal of the potential conflict of interest prior to reviewing the manuscript.
However, when such situations arise, the editor will often ask the reviewer to review the manuscript despite the potential conflict of interest, with the understanding that the reviewer will remain honest and objective. This outcome is especially likely to occur in situations where relatively few "experts" in the particular field of study are available to review manuscripts.
Phase 2
In Phase 2, Slater decides that he can be objective in his review of the manuscript. He asks Alice Parker, a graduate student in his lab, for her evaluation of the manuscript. In this scenario, Slater's motives are only to solicit Parker's comments, as she is intimately familiar with this field of research. However, it should be noted that Slater could have shown Parker the manuscript for the sole purpose of providing her with confidential information that could benefit her research. Some journals explicitly state that reviewers may consult with colleagues regarding a manuscript as long as the reviewer discloses to the editor the names of those who were consulted. However, many journals do not explicitly state such guidelines. Furthermore, guidelines for general disclosure of the contents of a manuscript, where colleagues are not consulted for their expert opinion on the research, are often absent or extremely vague.
Many scientists would argue that disclosure within the reviewer's research group, or even within the reviewer's own institution, does not constitute a public disclosure of information. On the other hand, some reviewers adhere to a strict definition of confidentiality and do not discuss the contents of a manuscript even within their research groups, except when a colleague is consulted for his or her expertise. However, when a manuscript contains information that is relevant to the research interests of the reviewer's own laboratory, it may be very difficult, if not impossible, to keep the information from the research group. Furthermore, it could be argued that keeping such information confidential would conflict with the collaborative basis of scientific research.
Phase 3
Phase 3 presents the greatest ethical dilemma for Slater. In the course of reviewing the manuscript, Slater and Parker discover that the manuscript describes a novel technique that could potentially benefit their own research efforts. In this scenario, Parker uses the technique in her research, which proves to be beneficial and results in the publication of a manuscript. Scientists generally agree that the contents of manuscripts submitted for publication are privileged information and should not be used by reviewers to further their own research efforts. However, is it reasonable to ask reviewers to disregard information that could potentially benefit their own research? Which is more important -- individual researchers' right to confidentiality and credit for their own work, or researchers' commitment to the collaborative basis and overall mission of the scientific enterprise?
It is not clear whether Slater attempts to credit the competitor's group for the use of the technique. In this situation, how should the reviewer cite the source of the information? Consider a situation where a reviewer does not recommend the manuscript for publication, but recognizes that both groups may benefit from a collaboration. Would it be unethical for the reviewer to contact the competitor to discuss this possibility?
This case study illustrates some of the common ethical dilemmas encountered during the peer review of manuscripts submitted for publication in scientific journals. The most common ethical dilemmas appear to revolve around attempts to avoid conflicts of interest and to maintain confidentiality during the peer review process. There is clearly a need for scientific journals to develop more explicit guidelines for handling potential conflicts of interest and safeguarding confidentiality, but as this case study illustrates, explicit guidelines may not address every ethical dilemma that may arise. For this reason, it is necessary for all scientists to have a good understanding of the ethical issues inherent in the peer review process, so that they can make sound ethical decisions when these types of situations are encountered.
References
- Marshall, Eliot. "Peer Review: Written and Unwritten Rules." Science 270 (1995): 1913.
- National Institutes of Health (NIH). "Review Procedures for Initial Review Group Meetings." Issued January 1995; revised April 1997. http://www.drg.nih.gov/guidelines/proc.htm.
This case raises many issues. One may start with the accepted practice of listing only senior members of a laboratory as authors of an abstract if they are presenting data generated by members of their laboratories. Although this practice is generally considered appropriate, many scientists do not use this convention, instead listing all contributing researchers as authors.
A second issue concerns mentioning the contributions of undergraduate laboratory assistants. Many principal investigators (PIs, analogous to laboratory directors or senior members of a laboratory -- the persons who secure funding for the research) do not acknowledge the contributions of undergraduates unless they consider them significant. If the contribution consisted mainly of technical assistance, with no role in designing the experiments, only the supervisor of that "technician" is mentioned. This convention is the prevailing standard for authorship. Although it may not seem fair or appropriate, and therefore may be worth discussing, it is not the main issue raised by this case.
The second stage of development of the case is the original publication. Review articles often have only a single author, especially if they are written as an overview of recent advances in a particular field. They normally include only published data, but unpublished findings may be included with the permission of the experimenter and listed as "personal communication." If the author of the review article is the experimenter who generated the new data, they will be listed as "unpublished data." It is therefore accepted in the field for Gump to be the only author on the review, provided that he has permission to publish any new findings and includes the appropriate references.
Later in the case, however, we learn that permission was not obtained. The question now is, does a PI require permission to publish or discuss the data generated by researchers in his or her laboratory? That is a difficult issue to resolve. Strictly in terms of maintaining good communication in the laboratory and as a matter of etiquette, the answer is probably yes. In this case the students and post-docs conducting the research will probably hesitate before sharing their findings with Gump. In research there is always a chance that someone else will complete the important experiments and publish their data first. This experience is commonly referred to as being "scooped." If it is acceptable for the PI to publish any data generated in his/her laboratory, then the conditions are set for a race between the researchers (students or post-docs) and the PI to publish first. If the students lose this race and are not included in the authorship, have they been "scooped" by the PI? If the student were writing the review for publication, used an appropriate and considerate manner of referencing and obtained permission to include unpublished results from each researcher, should the PI be included as an author? Convention in the biological sciences says that a student does not publish research without including the PI as an author. The theory is that the PI has helped to shape and direct the research and therefore has made a significant contribution even if he/she hasn't performed any experiments.
Another question arises when one considers that Gump is writing a research article without any new data. Although he does include a couple of new figures that haven't been published previously, most of the results presented have been published in abstracts or other publications. It is standard to develop a full-length manuscript out of work that has already been presented at meetings and included in abstracts. If every bit of the research is referenced to an abstract or previous publication, however, is this considered double publishing? What is and is not publishable? How much new information does one need in order to write a manuscript? This standard will vary by discipline, but the question could stimulate discussion.
This section also implies that the students intend to publish their findings under their own names and in a journal more appropriate to their field. Has Gump lessened their chances of publication by publishing his manuscript? Should the obscure status of the journal be an issue?
Gump's obvious insensitivity to his students and post-docs is demonstrated when he ignores his students' concerns about his "creative" manuscript. If he were a responsible mentor, he would be helping to further their budding careers as scientists. By submitting the research paper as he did, he actually undermined their future publications and did not give appropriate credit.
The main issues of this case are:
- lack of "good mentoring" by the PI and insensitivity to the needs and concerns of his students
- failure to give proper credit to the people making the discoveries, in this case graduate and undergraduate students and post-docs
- concerns about authorship
- "ownership" or proprietary of use of data or discoveries made in a laboratory.
Phase 1
- Johnson and Green should have informed Smith of their results and told him about the work they had submitted for publication as a matter of common courtesy. Although Smith's work may not have been significant, he was a member of the research group and should have been made aware of the group's progress. Furthermore, the confrontation in Phase 2 could have been avoided if all parties had been more open about their findings and intentions.
- Based on the information given in Phase 1, it is reasonable to believe that Smith should be listed in the acknowledgments at the end of the publication. Without evidence to support Smith's contribution, most referees would not accept listing Smith as a co-author.
- As a rule, chemists analyze their data carefully before making claims regarding the chemistry under investigation. In this example, Smith used very poor judgment, especially for a chemist with the experience of a post-doc. For Smith to make a legitimate claim, he would either need to explain the inconsistency in his data or repeat the experiments and obtain valid data. Since Smith took neither of these actions, he forfeited his claim of credit for the discovery.
Phase 2
- To answer this question correctly, we would need detailed information about the research project and the extent to which Smith's idea represented a significant development. An argument could be made that perhaps this idea was only one of several of Smith's ideas about the reaction, and it turned out to be the right answer by chance. However, it is also true that Smith's suggestion undoubtedly saved the group a lot of time and money. Ownership of an idea is sometimes not defined unless the idea has been published formally.
- A patent lawyer could argue that it would be unethical to deprive Smith of any of the royalties because his idea led to the patented process. The lawyer might also argue that Smith should be allowed to file a separate patent on his idea and the process he had investigated. Thus, both parties would receive royalties, but Smith probably would receive a much smaller percentage than Johnson and Green.
- If Smith were to repeat his experiments successfully, demonstrating that his previous claims were legitimate, it would resolve most of the conflicts mentioned in this case. However, Smith may not be willing to do so and may argue that his original work is enough to establish that his idea led to a solution. In a case such as this, a third, independent laboratory would most likely be asked to verify the results of both Smith's work and the work published by Johnson and Green. The results of the independent laboratory's tests would then be used to resolve the question of Smith's contribution.
Part 1
The most salient issue raised by this case is who should profit from Jones's ideas -- whether the institution's investment in Jones is enough to justify its receiving some (or all) of the profit from his business. A second, not entirely separate, issue is the appropriate use of university facilities (computers and phone lines, not consumable items) and time (leave), given that the initial investment in Jones was made by the college. As the case is written, Jones is employed by a private college, but the issues would be even more complex if he worked for a state university. One might want to explore these issues when discussing whether it matters that public funds (a grant) supported Jones while he developed the program.
Jones's responsibilities to his institution are one issue, but he also has responsibilities to his students, which readers may or may not see as a separate issue. What Jones does on his own time is his business. But being a professor is not a 9 to 5 job with clear barriers between on and off time. Jones has a variety of responsibilities as a professor; readers' perceptions of and ranking of those responsibilities will greatly affect their answers to the questions posed in Part 1. Part of the problem is that Jones's priorities have changed mid-stream in Mark's graduate career, and the question arises as to whether or not these conflicts are being approached ethically. Jones's conflicts could become more entangled if the situation is allowed to progress without intervention; if BioProgram goes public, Jones also will have responsibilities to investors.
Mark also confronts some ethical issues. Does his responsibility for his own career conflict with his loyalty to Jones? He needs to remain in Jones's good graces for his future career success, yet he has to bring Jones's behavior up for review in order to graduate in a timely fashion; he is in a bit of a no-win situation. Mark's reluctance to disappoint Jones probably enters into his decisions as well, as he seems to get along well with Jones on a personal level.
Part 2
The issues raised here are mostly issues of responsibility -- the responsibilities of department members to students on whose committees they sit, and their responsibilities to the department for students and for the allocation of resources (Mark's stipend). Underlying these is the question of explicit versus implicit assumptions -- those made by Mark, by Jones and by the rest of the department. Conflicting perceptions of responsibility among the parties in this case led to this ethical dilemma.
Question 1. Peter was added to the case to simulate a frequent occurrence at technical conferences -- salespersons attending conferences to promote products. Many conferences are complemented by trade shows that invite industrial researchers to promote their products. This practice is not problematic as long as the attendees are notified of and aware of the presenters' agendas. However, sales pitches by research associates/salespersons can be biased and should be accepted with caution, especially if proceedings are published.
In this case, the conference was designed for the presentation of theoretical papers. This distinction raises the issue of attendees not knowing the presenter's agenda. The question was added to create an awareness of the possibility that biased results may be presented at a conference. It may also be noted that attendees can usually distinguish sales pitches from genuine research, which can make them uneasy and resentful of the sales staff. Therefore, companies should not use technical conferences for sales promotion.
Question 2. Katherine decided to ignore the findings of William and his team and present an "incomplete" technical paper. William had the same problem but decided to tell Katherine. Or was he just passing the responsibility onto someone else? Either way, Katherine and William were responsible for evaluating the new analyzer, and they found something wrong. Should Katherine have retracted William's abstract? That would have raised concerns with management since Peter was scheduled to attend the conference with William and Katherine. If Katherine had told her supervisor, would she have been fired? Can Katherine and William continue to ignore the problem that they found?
Question 3. Katherine and William's option of continuing to ignore the problem has been eliminated by the professor's question. William must decide what to do. He can pass the responsibility to Katherine, as he did before, by directing the question to her. However, this option would probably get him fired by Katherine.
Will he lie? Should he tell the professor that he looked into the accuracy between the overlapping size ranges and found comparable results? Should he tell the professor that they haven't looked into that aspect of the evaluation? The first lie would be blatant and would create misplaced trust in the analyzer. If William pretends that he did not look into the evaluation, he could delay the inevitable discovery that the analyzer is inaccurate between the size ranges. At least he and Katherine could leave the conference without publicly disclosing the problem before they had a chance to inform management.
The main purpose of this case study is to stimulate a discussion of the criteria for authorship. However, as the scenario unfolds, several issues arise from the actions of the people involved. Several of the questions have been included to help initiate a discussion on how the actions of each character contributed to the eventual conflict.
Questions 1-3
In the first part of this scenario, McClair suggests some experiments that Platt should do for her thesis project. As a committee member, he has a right and a responsibility to suggest experiments that should be performed in order to reach the goals of Platt's thesis proposal. What is questionable is his suggestion that she go to a laboratory where similar work is being performed. McClair's motives are not entirely clear. Perhaps he is aware that Jones is low on funding and believes that going to England is an economical way for Platt to complete the experiments. On the other hand, he could be aware that Gleeson's laboratory has been encountering difficulties in performing experiments that Platt is familiar with. Although we do not know McClair's intentions, it is important to recognize the potential conflict of interest.
It was inappropriate for McClair and Gleeson to tell Platt that she was expected to share her data when her adviser had told her otherwise. McClair and Gleeson should have contacted Jones and clarified the conditions for Platt's trip. In fact, it probably would have been more appropriate for McClair to have approached Jones with the initial suggestion to do the work in Gleeson's lab so that she could evaluate the idea and define the expectations, prior to involving Platt. At the same time, Jones should have been more open with Platt about her concerns. Platt is also at fault in this scenario, however. First, she should not have taken the advice of a committee member over that of her own adviser. Furthermore, she should have insisted that the questions surrounding the sharing of her data and techniques be resolved before she left.
Questions 4-6
The criteria for authorship are not well established, and situations like the one in this case are not uncommon. Scientific journals are becoming more aware of this problem and have begun to set guidelines for authorship. A potential author should play an active role in one or more of the following capacities: 1) formulating the idea, 2) performing the experiments and 3) writing the article. Furthermore, anyone listed as an author should read and understand the entire article and consent to its publication.
Whether or not Platt's presentations are relevant depends on the following factors. First, were the techniques reported in enough detail that someone could reproduce the experiments directly from the information presented, or did the presentation focus on the data, only mentioning the techniques? Furthermore, the guidelines of the meeting are significant. At some meetings, abstracts and "personal communications" are not to be referenced. The intention is that scientists can share scientific knowledge without the fear of being "scooped."
A collaboration allows for groups with varying areas of expertise to come together to solve a common problem. Although it is imperative that each member of a collaboration be involved in the work, the contribution from each group may not be equal. Therefore, the terms and limitations of the collaboration must be well defined in advance. In most cases, co-authorship is implied in a collaboration. This agreement is part of what distinguishes collaboration from cooperation.
Platt should have known what her role was prior to going to Gleeson's laboratory. Despite Jones's warning, Platt shared some of her data and techniques, perhaps within the limits of what she felt Jones was comfortable sharing. Furthermore, it is possible that Platt felt that the help she gave Gleeson's laboratory was reasonable considering the assistance she had received in performing her experiments. We do not know her exact reasoning, but it should be pointed out that Platt's first loyalty should be to her own research laboratory; if she were at all concerned about what was acceptable, she should have contacted her adviser.
Question 7
Looking at Gleeson's reasons for excluding Platt from the paper, we can see arguments for each side. Although Platt did not actually obtain the data presented in the paper, her contribution to the experimental set-up appears to be significant. Furthermore, merely performing the experiments does not guarantee authorship. For example, technicians are commonly excluded from publications because they fail to provide an intellectual contribution. Conversely, collaborators should not be automatically excluded because they didn't perform the experiments. Laboratory heads (i.e., research advisers) rarely do bench work, and yet they are often listed as authors.
It is important to look at Gleeson's second argument. If Platt had presented enough information at scientific meetings for Gleeson's laboratory to plan and perform the experiments, then her assistance was more of a convenience than a necessity. In this case, Gleeson's argument may be valid. However, if his lab's plan was to answer a certain question, and if they lacked a specific technique for doing so, Platt's contribution was crucial to their success, and she should have been given more credit.
Gleeson's final argument is completely invalid. It is improper to include a lab member who didn't contribute to a project, or to exclude one who did, on the basis of enhancing someone's career.
Question 8
A failure to communicate led to this problem. Jones needs to be more open and honest with her students. Her failure to take a stand prior to Platt's trip to England and her refusal to support Platt's pursuit of authorship may suggest that she should not be an academic research adviser. Not all good scientists are good mentors. Greater communication also should have existed between Jones and McClair. Faculty members should not purposely contradict each other. Furthermore, all three professors involved had a power advantage over Platt. It is difficult for a student to ignore the instructions of faculty members. However, when they give opposing directions, the situation becomes impossible. Platt has obvious reasons for wanting to keep all three people content: Jones is her adviser, McClair is on her committee, and she will have to depend on Gleeson while in England.
Although she is in a difficult situation, Platt failed to demand that her role be defined prior to leaving. Because of this failure, Platt is not blameless.