Pimple 2002
Keywords: research ethics, education in the responsible conduct of research, RCR training
The study of research ethics spans innumerable and diverse fields of inquiry, ranging
from particle physics to oral history, archeology to bionanotechnology, and touches on
issues as localized as student-mentor relationships, and as global as biomedical
research in developing countries. In some cases the stakes are high, impinging on human
health and environmental integrity; in other cases the stakes are, in the grand scheme,
quite low and of interest primarily (if not quite only) to a few researchers involved in a
dispute. As a field of study, research ethics is, not surprisingly, incoherent. This article
offers a framework intended to bring some order to the field.
The United States Public Health Service (PHS) recently issued its PHS Policy on
Instruction in the Responsible Conduct of Research (RCR),1 which states that
research staff [who work on PHS-supported research projects] shall complete a
Address for correspondence: Kenneth D. Pimple, Ph.D., Director of Teaching Research Ethics
Programs, Poynter Center for the Study of Ethics and American Institutions, Indiana University, 618
East Third Street, Bloomington IN 47405-3602, USA;
Email: [email protected]; Web: http://poynter.indiana.edu.
Paper received 17 September 2001; revised 11 March 2002; accepted 30 April 2002.
1353-3452 © 2002 Opragen Publications, POB 54, Guildford GU1 2YF, UK. http://www.opragen.co.uk
For purposes of stimulating discussion and speculation, I suggest that concerns about
the ethics of any particular research product or project can be divided into three
categories: (A) Is it true? (B) Is it fair? (C) Is it wise? The presentation here is
intentionally provocative.
The first question, "Is it true?", concerns the relationship of the research results to
the physical world. Do the data and conclusions really correspond to reality? If data are
made up (fabricated) or fixed up (falsified), they are not true. To a degree, this question
could be restated as, "Is it good science?"
The second question, "Is it fair?", concerns social relationships within the world of
research. In this category belong issues such as relationships among researchers
(authorship and plagiarism); between researchers and human subjects (informed
consent); between researchers and animal subjects (animal welfare); and relationships
between researchers, their sponsoring institutions, funding agencies, and the
government. For example, although true reports can be published without citing
previous publications, or without securing informed consent from human subjects,
these are not fair research practices.
a See also reference 2. As of this writing, the PHS policy on RCR instruction has been suspended,
but the Office of Research Integrity still supports the policy and is proceeding with plans to
implement and support it.3
b The following section is adapted and expanded from reference 4.
The third question, "Is it wise?", concerns the relationship between the research
agenda and the broader social and physical world, present and future. Will the research
improve the human condition, or damage it? Will it lead to a better world, or a worse
one? Or less grandly, which of the many possible lines of research would we be better
off pursuing? We have finite time and money for pursuing research, and the wisdom of
research programs is a valid question in research ethics. These are the kinds of
questions many people have in mind when they debate the ethics of human cloning.
These three questions provide a succinct guide to the responsible conduct of research.
The questions are meant to be organized from the smallest to the largest, from the
(relatively) simple question of whether a research report is true to the much more
complicated question of which research projects are morally acceptable and which are
morally prohibited. They are also meant to stimulate discussion, even controversy (a
useful pedagogical tool). For example, although attention to truth can hardly be
avoided when discussing the responsible conduct of research, the concepts of "truth"
and even "reality" are more problematic today than they were in Bacon's time. Even
people unimpressed by postmodernism will recognize that the conclusions of a falsified
or fabricated report could still be accurate (true), and that some reports that are not
accurate (not true) do not imply any misdeed on the part of the researcher; as strange
as the phrase may seem, there really are honest mistakes. This deliberately over-
simplified presentation should encourage teachers and students to wrestle with the
concepts of truth, accuracy, the goals of science, the source(s) of science's authority,
and other fundamental issues in the responsible conduct of research.
Six Domains
Expanding the three questions into six domains provides a logical, intuitive, and less
simplified way of organizing the responsible conduct of research. The subcategories
are numbered for ease of reference only; the numbers are not intended to convey a
sense of precision about the arrangement of items.
Is it true?
1. Scientific integrity c: The relationship between research and the truth.
1.1. basic technical competence (including experimental design)
1.2. data manipulation
1.3. statistical methods
1.4. falsification
1.5. fabrication
1.6. unintentional bias
c Not all of the phrases I use to describe the six domains carry the same meaning I assign them in
all contexts. For example, research integrity has been defined as "a measure of the degree to
which researchers adhere to the rules or laws, regulations, guidelines, and commonly accepted
professional codes and norms of their respective research areas."5 (p.2) I do not think the names
of the other five domains violate common sense or usage, with the possible exception of Domain
5, "institutional integrity," a phrase I have often heard used but never defined.
Is it fair?
2. Collegiality: Relationships among researchers.
2.1. authorship
2.2. data sharing and timely publishing
2.3. plagiarism
2.4. peer review
2.5. confidentiality
2.6. candor
2.7. mentorship
3. Protection of human subjects: Relationships between researchers and human
subjects.
3.1. the Belmont Report (Ethical Principles and Guidelines for the Protection
of Human Subjects of Research):6 protection from harms through respect
for persons (autonomy); beneficence (plus non-maleficence); justice
3.2. post-Belmont access to treatments
3.3. informed consent
3.4. assent
3.5. confidentiality and anonymity
3.6. deceit
3.7. debriefing
3.8. research risks and benefits
4. Animal welfare: Relationships between researchers and animal subjects.
4.1. the 3 Rs (replacement, reduction, refinement)
4.2. pain and suffering
4.3. enrichment
4.4. animal rights
5. Institutional integrity: Relationships between researchers, their sponsoring
institutions, funding agencies, and the government.
5.1. conflict of interest
5.2. conflict of commitment
5.3. regulatory compliance
5.4. data retention
5.5. institutional oversight
5.6. institutional demands and support
Is it wise?
6. Social responsibility: The relationship between research and the common good.
6.1. research priorities
6.2. fiscal responsibility
6.3. public service
6.4. public education
6.5. advocacy by researchers
6.6. environmental impact
6.7. forbidden knowledge
The six domains are not hermetically sealed. Many items could be placed into
more than one category. Taking perhaps the most obvious example, most of the items
under 3 (the protection of human subjects) and 4 (animal welfare) also fall under 5.3,
regulatory compliance. But concerns with protecting human subjects and with animal
welfare are not precisely synonymous with a concern for regulatory compliance:
Following the rules is not exactly the same as being ethical. Certainly the two often
overlap, and are generally intended to overlap, but sometimes following rules serves no
moral value other than that of following rules. Indeed, sometimes morality demands
more than rules do, and sometimes morality actually demands actions that run counter
to the rules.
The sub-categories are not intended to be exhaustive; I have no doubt that more
could be added and the existing items refined. Furthermore, each sub-category could be
explained in detail, but I do not do so here because my goal is not to provide an
encyclopedia, but a framework or system; the difference is something like that
between a multiplication table and an algorithm for multiplying. Generating the details
and correcting the list are therefore left as an exercise for the reader.
The PHS Core Instructional Areas, being built on hard experience, make sense to
anyone familiar with the public catalog of research abuses over the last quarter century,
but to a novice they might look like a semi-random hodgepodge. The Six Domains
heuristic framework is intended to provide an easy entrée for newcomers and an
organizational metaphor for experts.
The following are three examples of this framework's utility.
Research Misconduct
Since 1989, when the Federal government reluctantly adopted policies to deal with
irresponsible research practices, research misconduct has essentially been defined as
fabrication, falsification, and plagiarism. (An interesting account of the history of the
concept of misconduct in science can be found in reference 7; see also Appendix A.)
Here are the relevant portions of the current, Federal-wide definition of misconduct,
adopted in 2000:8
Research misconduct is defined as fabrication, falsification, or plagiarism in
proposing, performing, or reviewing research, or in reporting research results.
Fabrication is making up data or results and recording or reporting them.
Falsification is manipulating research materials, equipment, or processes, or
changing or omitting data or results such that the research is not accurately
represented in the research record.
Plagiarism is the appropriation of another person's ideas, processes, results, or
words without giving appropriate credit.
The practical difference between fabrication and falsification is one of degree
rather than kind. Both corrupt the research record by misrepresenting a wholly
(fabrication) or partially (falsification) fictitious research report as honestly derived
findings. There can be no question that pawning fiction off as fact is contrary to the
norms, indeed to the very definition, of science.
Plagiarism, however, is a different kind of offense. One ideal in science, identified
by Robert Merton as "disinterestedness,"9 holds that what matters is the finding, not
who makes the finding. Under this norm, scientists do not judge each other's work by
reference to the race, religion, gender, prestige, or any other incidental characteristic of
the researcher; the work is judged by the work, not by the worker. In this light,
plagiarism seems irrelevant to science. No harm would be done to the Theory of
Relativity if we discovered that Einstein had plagiarized it. (A great deal of harm
would be done to Einstein's memory, but that is sociology or history or biography, not
science.)
The Six Domains framework helps clarify what is at stake with plagiarism; it is an
offense against the community of scientists, rather than against science itself. Who
makes a particular finding will not matter to science in one hundred years, but today it
matters deeply to the community of scientists. Plagiarism is a way of stealing credit, of
gaining credit where credit is not due, and credit, typically in the form of authorship, is
"the coin of the realm"10 in science. An offense against scientists qua scientists is an
offense against science, and in its way plagiarism is as deep an offense against
scientists as falsification and fabrication are offenses against science.
Data management
The first Core Instructional Area, "data acquisition, management, sharing, and
ownership," corresponds to Domains 1, 2, 3, and 5, leaving out only 4 (Animal
Welfare) and 6 (Social Responsibility). A point-by-point comparison of the details of
the PHS Core Instructional Areas (see Appendix B) and the Six Domains highlights the
differences in the approaches. The long lists corresponding to "data ownership" and
"copyright laws" may show a gap in the Six Domains (should there be one or two
items, rather than seven, that correspond to these concepts?), or they may indicate that
the concepts of data ownership and copyright laws are tremendously complex, perhaps
more complex than implied by their short names.
Issues related to data management are extremely complex and multivalent. An
argument could be made that data management is the neglected but essential twin to
the scientific method. Descriptions of the scientific method typically mention
observation, hypothesis-forming, and experimentation to test the hypothesis; implied in
all of this, but not stated, is that the observations, hypothesis, experimental details, and
experimental results are recorded (or, in proto-science, remembered). Even more
subtly, the publication of scientific findings is both dependent on adequate data
management and a form of data management. Without publication (as a journal article,
an oral presentation, or what have you), no activity can rightly be thought of as science.
Requiring training in responsible data management is hardly less comprehensive than
requiring training in the responsible conduct of research; it does not quite exist at the
same level of abstraction as the other Core Instructional Areas.
Methods of recording, analyzing, and storing data fall under Domain 1 (scientific
integrity), but retention of data also falls under Domain 5 (institutional integrity).
Likewise, when human subjects are involved, methods of recording, reporting, and
storing data also fall under Domain 3 (protection of human subjects) because of the
importance of informed consent, privacy, confidentiality, anonymity, and the like.
Copyright law and intellectual property both touch on collegiality (Domain 2) as well
as institutional integrity (Domain 5).
The stakes involved in data management are far from intuitive or obvious, and are
therefore difficult to disentangle. Use of the Six Domains framework takes a step
toward simplifying a knotty challenge.
Social responsibility
Finally, none of the PHS Core Instructional Areas clearly correspond to any of the
items in Domain 6 (Social Responsibility). Although this is a disturbing gap, it is also
easy to guess why issues of social responsibility are not included in the Core
Instructional Areas.
First, there is no consensus on the social responsibility of science or scientists
(aside from those considerations covered by the other five Domains, such as basic
competence and truthfulness). Do scientists have an obligation to act as advocates? I
suggest that some do, but I would not expect everyone to agree with me on this point.
Second, the Core Instructional Areas tend to focus on what an individual scientist
can do. Every researcher is wholly responsible for his own acts of fabrication (if any)
and for doing her part to foster collegial relationships. However, no scientist is
responsible for setting the research agenda for the nation or the world.
These facts notwithstanding, I suggest that omitting social responsibility from the
Core Instructional Areas is regrettable and may send an unfortunate message: Scientists
have individual responsibilities, but they have no social responsibilities. Granted that
no one scientist can bear the burden alone, it is still true that each scientist has an
obligation to carry some part of the burden. I would not suggest in the abstract that
every, or any given, scientist has a moral obligation to undertake public service (for
example), but clearly science as an institution and scientists as a group do have such a
moral obligation. Although in the end precisely what is demanded of any particular
scientist may be difficult to identify, to me it is clear that scientists should be cognizant
of this general responsibility and that training in social responsibility (even if such
training were to consist of nothing more than describing and exploring science's social
responsibilities) could be beneficial.
Conclusion
Scientific fraud
Examples of research misconduct include but are not limited to the following:
Misappropriation: An investigator or reviewer shall not intentionally or
recklessly
a. plagiarize, which shall be understood to mean the presentation of the
documented words or ideas of another as his or her own, without attribution
appropriate for the medium of presentation; or
b. make use of any information in breach of any duty of confidentiality
associated with the review of any manuscript or grant application.
Interference: An investigator or reviewer shall not intentionally and without
authorization take or sequester or materially damage any research-related
property of another, including without limitation the apparatus, reagents,
biological materials, writings, data, hardware, software, or any other substance or
device used or produced in the conduct of research.
Misrepresentation: An investigator or reviewer shall not with intent to
deceive, or in reckless disregard for the truth,
a. state or present a material or significant falsehood; or
b. omit a fact so that what is stated or presented as a whole states or presents
a material or significant falsehood.
Free scientific inquiry naturally includes proposing hypotheses that may ultimately
prove to be false, offering interpretations of data that conflict with other interpretations,
and making scientific observations and analyses that may prove to be in error. The
Commissions recommendations pose no threat to such inquiry, which is essential to
the advancement of science.
The sanctionable behaviors defined and elaborated here are not intended to limit or
define comprehensively the oversight role of academic and research institutions, which
are free to adopt more demanding standards.14 (pp.13-14) [emphases in original]
Pathological science
A term coined by Irving Langmuir: "Cases where there is no dishonesty involved but
where people are tricked into false results by a lack of understanding about what
human beings can do to themselves in the way of being led astray by subjective effects,
wishful thinking or threshold interactions."15 (quoted in 16) Pathological science is a form
of unintentional bias, usually used to describe incidents in which many researchers are
Research integrity
In some contexts, "research integrity" refers to the extent to which publicly presented
research accurately reflects the scientific findings and is not tainted by fabrication or
falsification. A wider definition was proposed by Nicholas Steneck: "a measure of the
degree to which researchers adhere to the rules or laws, regulations, guidelines, and
commonly accepted professional codes and norms of their respective research
areas."5 (p.2)
Regulatory compliance
This term expresses concern with following the rules (regulations and laws), including
the rules of a lab, research group, department, university, state, or country.
Scientific ethics
"Scientific ethics" is sometimes used to describe a subset of professional ethics: the
professional ethics of scientists.17, 18
REFERENCES
1. Office of Research Integrity (2001) PHS Policy on Instruction in the Responsible Conduct of
Research (RCR). Available online at http://ori.dhhs.gov/html/programs/rcrcontents.asp.
2. Pimple, K.D. (2000) Letter Commenting on the Proposed PHS Policy on Instruction in the
Responsible Conduct of Research (RCR).
Available online at http://php.indiana.edu/~pimple/rcrpolicy.pdf.
3. Office of Research Integrity (2001) Responsible Conduct of Research (RCR) Education.
Available online at http://ori.dhhs.gov/html/programs/congressionalconcerns.asp.
4. Pimple, K.D. (1998) Contexts for Teaching Research Ethics, Trends 5: 5-8. Available online at
http://poynter.indiana.edu/tre5-2.html.
5. Steneck, N.H. (2002) Assessing the Integrity of Publicly Funded Research, in Steneck, N. H. &
Scheetz, M.D., eds. Investigating Research Integrity: Proceedings of the First ORI Research
Conference on Research Integrity, Office of Research Integrity, pp. 1-16. Available online at
http://ori.dhhs.gov/html/publications/rcri.html.
6. National Commission for the Protection of Human Subjects of Biomedical and Behavioral
Research (1979) Ethical Principles and Guidelines for the Protection of Human Subjects of
Research (The Belmont Report). Department of Health, Education, and Welfare. Available
online at http://ohrp.osophs.dhhs.gov/humansubjects/guidance/belmont.htm.
7. Guston, D.H. (1999) Changing Explanatory Frameworks in the US Government's Attempt to
Define Research Misconduct, Science and Engineering Ethics 5: 137-154.
8. Office of Science and Technology Policy (2000) Federal Policy on Research Misconduct.
Federal Register 65: 76260-76264. Available online at http://frwebgate.access.gpo.gov/cgi-
bin/getdoc.cgi?dbname=2000_register&docid=fr06de00-72.
9. Merton, R.K. (1973 [1942]) The Normative Structure of Science, in Merton, R. K., The
Sociology of Science: Theoretical and Empirical Investigations, The University of Chicago
Press, pp. 267-278.
10. Wilcox, L.J. (1998) Authorship: The Coin of the Realm, the Source of Complaints, Journal of
the American Medical Association 280: 216-217.
11. Committee on Science, Engineering, and Public Policy of the National Academies (1992)
Responsible Science: Ensuring the Integrity of the Research Process, Vol. 1, National Academy
Press, Washington, DC.
12. Public Health Service (1989) Responsibilities of Awardee and Applicant Institutions for Dealing
with and Reporting Possible Misconduct in Science, Federal Register 54: 32446-32451.
13. National Science Foundation (1991) Misconduct in Science and Engineering, Federal Register
56: 22286-22290.
14. Commission on Research Integrity (1995) Integrity and Misconduct in Research: Report of the
Commission on Research Integrity. Available online at http://www.faseb.org/opar/cri.html.
15. Langmuir, I. (transcribed and ed., Robert N. Hall) Pathological Science, Physics Today 42 (Oct.
1989): 36-48.
16. Carroll, R.T. (2000) Pathological Science. Available online at http://skepdic.com/pathosc.html.
17. Kovac, J. (1996) Scientific Ethics in Chemical Education, Journal of Chemical Education 73:
926-928.
18. Barden, L.M., Frase, P.A., & Kovac, J. (1997) Teaching Scientific Ethics: A Case Studies
Approach, The American Biology Teacher 59: 12-14.