Author's Commentary on "But For the Fear of What You Might Find Out"
This case examines the harm that can follow when the full consequences of one's actions are not taken into account. Here, perhaps out of ignorance, the engineers developing early prototypes of various medical imaging methods failed to appreciate the potential impact of these untested images on their volunteers. The heart of the problem lies in the murky statement of the experiment's intentions. Because this was not a traditional scientific experiment, with a large number of trials, careful prescreening, and a detailed statistical analysis of the results, the "human subjects research" aspects of the examinations are not necessarily clear.
The NIH's "Guidelines for the Conduct of Research Involving Human Subjects" defines "research" as "any systematic investigation designed to develop or contribute to generalizable knowledge." The informal "try it out and see if it works" protocol in place during the tests described in this case may not qualify as a "systematic investigation."
In any case, even if it had been clear to the investigators from the start that the involvement of human subjects demanded precautions, it is not obvious that such precautions would have prevented this situation. The only kind of reasoning that could have prevented, or at least predicted, the outcome described in this case is careful forethought about the full impact of the imaging tests.
This discussion raises two interesting points. First, is it enough merely to "predict" a situation such as the one described in this case? If so, at what point does it become necessary to "prevent" a situation rather than just "predict" it? Is it too much to ask that a woman face the idea that someone may have detected cancer in her breast but cannot be sure? These concerns must be weighed against the fact that, at some point, a new medical device must be tested if it is ever to come into regular clinical use. Second, how can we ensure that adequate forethought precedes every experiment without slowing the research process to a halt? Any given action has an uncountable number of potential effects, though most have very low chances of actually occurring. At what probability of occurrence can one stop worrying about a potential experimental side effect? How does that calculus change with the severity of the side effect? A related danger is that guidelines governing research will become too detailed to be of practical value.
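Though the case offers no formal model, one way to make that tradeoff concrete is as an expected-harm threshold. In the minimal sketch below, the probability p, the severity s, and the tolerance τ are illustrative symbols assumed for this commentary, not quantities drawn from the case:

\[
p \cdot s \;\ge\; \tau
\quad\Longleftrightarrow\quad
p \;\ge\; \frac{\tau}{s}
\]

On this reading, a side effect merits attention whenever its probability-weighted severity exceeds the tolerance, so the more severe the side effect, the lower the probability at which one may stop worrying about it, which is exactly the dependence the question above points to.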
While it may not be common practice among designers of noninvasive medical instruments, the kind of forethought this case calls for is hardly crippling. In fact, it closely resembles the design process used to develop such devices in the first place. The fundamental question is, "What would happen if . . . ?", the same kind of thought experiment that appears throughout engineering design. The only difference is that the "what" is an ethical concept rather than a scientific one.
That means the person asking the "what if" questions must be versed in issues of ethical importance. While that may require additional training for members of the scientific or engineering design team, or even the addition of ethical consultants or overseers on certain projects, this kind of ethical thought experiment offers a significant benefit, just as its scientific counterpart does. Asked by knowledgeable individuals, such questions yield a great deal of insight into how to steer a project's development away from serious ethical problems. In an era of detailed lines of accountability and serious financial repercussions for poor ethical decisions, the extra cost of such ethical training or expertise is easily recouped by the avoidance of even a single crisis. This case can be read as an argument for applying the scientific model to the practice of research ethics.