Published in final edited form as: Integr Environ Assess Manag. 2019 May;15(3):320–344. doi:10.1002/ieam.4119.
Christopher A Mebane,
US Geological Survey, Boise, Idaho, USA
John P Sumpter,
Brunel University London, United Kingdom
Anne Fairbrother,
Exponent, Bellevue, Washington, USA
Thomas P Augspurger,
US Fish and Wildlife Service, Raleigh, North Carolina, USA
Timothy J Canfield,
US Environmental Protection Agency, Ada, Oklahoma
William L Goodfellow,
Exponent, Alexandria, Virginia, USA
Patrick D Guiney,
University of Wisconsin, Madison, Wisconsin, USA
Anne LeHuray,
Chemical Management Associates, Alexandria, Virginia, USA
Lorraine Maltby,
University of Sheffield, Sheffield, United Kingdom
David B Mayfield,
Gradient, Seattle, Washington, USA
Michael J McLaughlin,
University of Adelaide, Adelaide, South Australia, Australia
Lisa S Ortego,
Bayer CropScience, Research Triangle Park, North Carolina, USA
Tamar Schlekat,
Society of Environmental Toxicology and Chemistry, Pensacola, Florida, USA
Richard P Scroggins,
Environment and Climate Change Canada, Ottawa, Ontario, Canada
Tim A Verslycke
Gradient, Cambridge, Massachusetts, USA
Abstract
High-profile reports of detrimental scientific practices leading to retractions in the scientific literature contribute to a lack of trust in scientific experts. Although the bulk of these have been in the literature of other disciplines, environmental toxicology and chemistry are not free from problems. While we believe that egregious misconduct such as fraud, fabrication of data, or plagiarism is rare, scientific integrity is much broader than the absence of misconduct. We are more concerned with the more commonly encountered and nuanced issues of poor reliability and bias. We review a range of topics including conflicts of interest, competing interests, some particularly challenging situations, reproducibility, bias, and other attributes of ecotoxicological studies that enhance or detract from scientific credibility. Our vision of scientific integrity encourages a self-correcting culture that promotes scientific rigor; relevant, reproducible research; transparency in competing interests, methods, and results; and education.
Keywords
Reproducibility; Bias; Transparency; Research integrity; Scientific integrity
INTRODUCTION
Large segments of society are distrustful of scientific and other experts. Some have suggested that we are in a culture in which reality is defined by the observer, objective facts do not change people’s minds, and facts that conflict with one’s beliefs can be dismissed as questionable (Campbell and Friesen 2015; Nichols 2017; Vosoughi et al. 2018). Science and
scientists have been central to these debates, and the boundaries of science, policy, and
politics may be indistinct. In a social climate skeptical of science, the easy availability of
numerous reports of dubious scientific practices gives fodder to skeptics. Because
environmental regulations on use of chemicals and waste management rely heavily on the
Science has long endured questionable science practices and a skeptical public. Galileo’s
criticisms of prevailing beliefs resulted in his issuing a public retraction of his seminal work.
In contrast, purported science “discoveries” such as Piltdown Man, canals on Mars, cold
fusion, archaeoraptor, homeopathic water with memory, arsenic-based life, and many others
have not stood the test of time (Gardner 1989; Schiermeier 2012). As early as 1954, Huff and Geis illustrated how the presentation of scientific data could be manipulated to be technically accurate yet completely misleading (Huff and Geis 1954). Are things worse now? Recent articles in both the
scientific literature and popular print and broadcast venues paint a bleak picture of the status
of science. One does not have to search hard to find plenty of published concerns about the
credibility of science. These include overstated and unreliable results (Ioannidis 2005; Harris
and Sumpter 2015; Henderson and Thomson 2017), conflicts of interest (McGarity and
Wagner 2008; Stokstad 2012; Boone et al. 2014; Oreskes et al. 2015; Tollefson 2015),
profound bias (Atkinson and Macdonald 2010; Bes-Rastrollo et al. 2014; Suter and Cormier
2015a, 2015b), suppression of results to protect financial interests (Wadman 1997; Wise
1997), deliberate misinformation campaigns as a public relations strategy for financial or
ideological aims (Baba et al. 2005; McGarity and Wagner 2008; Gleick and 252 coauthors
2010; Oreskes and Conway 2011), political interference with or suppression of results from
government scientists (Hutchings 1997; Stedeford 2007; Ogden 2016), self-promotion and
sabotage of rivals in hypercompetitive settings (Martinson et al. 2005; Edwards and Roy
2016; Ross 2017), publication bias, peer review and authorship games (Young et al. 2008;
Fanelli 2012; Callaway 2015), selective reporting of data or adjusting the questions to fit the
data (Fraser et al. 2018), overhyped institutional press releases that are incommensurate with
the actual science behind them (Cope and Allison 2009; Sumner et al. 2014), dodgy journals
(Bohannon 2013), and dodgy conferences (Van Noorden 2014).
Such published concerns reasonably raise doubts about science and scientists and could even
lead some to conclude that the contemporary system of science is broken. In writing this
article, we attempt to address some prominent science integrity concerns in the context of
environmental toxicology and chemistry. In our view, there is ample room for improvement
within our discipline, but the science is not broken, and some criticisms are overstated. We do not pretend to have solutions that will overturn insidious pressures on scientists and funders for impressive results, to hold some moral high ground that makes us immune from such pressures ourselves, or to claim that all of our own works are above reproach. Our recommendations are pragmatic, not dogmatic. Our goal is to nudge practices
and pressures on scientists to advance the science, while maintaining and improving
credibility through transparency, ongoing review, and self-correction.
Many of the prominent science integrity controversies have been in the high-stakes biomedical discipline, and in response that discipline probably has done more self-evaluation and taken more steps toward best practices than most other disciplines. Results of
self-reported, anonymous surveys of scientists, mostly in the biomedical fields, have not
been reassuring. In a 2002 survey of early and midcareer scientists, 0.3% admitted to
falsification of data, 6% to a failure to present conflicting evidence, and 16% to changing
study design, methodology, or results in response to funder pressure (Martinson et al. 2005).
A subsequent metaanalysis of surveys suggested problems were more common, with close
to 2% of scientists admitting to having been involved in serious misconduct, and more than
70% reporting that they personally knew of colleagues who committed less severe
detrimental research practices (Fanelli 2009). Overt misconduct can occur in ecotoxicology
just as with any discipline (Marshall 1983; Keith 2015; Enserink 2017; http://retractiondatabase.org, search term “toxicology”) and, when exposed, is universally
condemned and, in many countries, is career ending. In contrast, the ambiguous, more
nuanced issues of science integrity that all of us are likely to experience in our careers
require thoughtful consideration, not condemnation. It is in regard to the latter that we
discuss efforts toward remedies from other disciplines to examine similar issues in
ecotoxicology, focusing on SETAC.
Before we can discuss integrity in ecotoxicology and related environmental science fields,
we must first distinguish what is meant by “science” in this context. Broadly speaking,
environmental science includes the disciplines of biology, ecology, chemistry, physics,
geology, limnology, mineralogy, marine studies, and atmospheric studies, that is, the study
of the natural world and its interconnections. The applications of environmental science
extend to agriculture, fisheries management, forestry, natural resource conservation, and
chemicals management, all of which have associated multibillion-dollar industries and vocal
environmental advocacy groups. The subdiscipline of environmental toxicology or
ecotoxicology, pursued by SETAC scientists, studies in great detail how the natural world is
influenced by chemicals, both natural and synthetic, introduced by human endeavors that are
largely in pursuit of the production of desired goods and services (food, clean water, plastic
products, metals, etc.). Because exposure to chemicals can have negative and sometimes
unexpected consequences for people and the environment, a body of regulation has
developed over the past century to control the kinds and amounts of allowable chemical
exposures. Such regulations necessarily are based on scientific concepts such as Paracelsus’
directive that “the dose makes the poison” and physicochemical properties that influence
transport and fate of substances. Because of the complexity, inexactitude, and uncertainty of
ecotoxicology and associated sciences, rulemaking often is subject to challenge, leading to
accusations of placing profit over people or the environment, or of imposing unreasonably restrictive and burdensome requirements. Scientists are called upon to inform disputes based on their knowledge of underlying principles, or they enter the conversation through self-initiated, in-depth literature review and commentary. Only by conscientiously adhering to fundamental
principles of the scientific method can environmental scientists maintain their integrity and
continue to play a valid role in environmental policy and management.
We do not have to agree with the conclusions, the authors’ interpretations of their implications, importance, or many
other things, but we have to be confident that the procedures described were indeed followed
and all relevant data were shown, not just those that fit the hypothesis. As Goodstein (1995)
put it: “There are, to be sure, minor deceptions in virtually all scientific papers, as there are
in all other aspects of human life. For example, scientific papers typically describe
investigations as they logically should have been done rather than as they actually were
done. False steps, blind alleys and outright mistakes are usually omitted once the results are
in and the whole experiment can be seen in proper perspective.” Indeed, no one wants to
read the chronology of a study. However, should for example, the omissions include
unfruitful statistical fishing trips or anomalous data that were assumed to be in error because
they did not fit expectations, such little omissions may bias the story and the body of
literature.
Broad definitions of scientific integrity center on values such as objectivity, honesty, openness, accountability, fairness, and stewardship (NAS 2017). More specific “research integrity” guidelines define appropriate expectations of individual
researchers and their institutions and may be highly procedural. Protecting the privacy,
rights, and safety of human research participants and animal welfare with institutional
review board clearance requirements is a common element of research integrity guidelines.
Academic research integrity guidelines have been established individually or in aggregate by
research funders and individual institutions (Goodstein 1995; NRC 2002; Steneck 2006;
ARC 2007; Resnik and Shamoo 2011; NRC-CNRC). In most countries, research
institutions are usually responsible for investigating potential breaches of research integrity
by their scientists, although this can create difficult conflicts of interest for the institution
(Glanz and Armendariz 2017). A prominent recent exception is China, which announced
reforms that no longer allow institutions to handle their own misconduct investigations
(Cyranoski 2018).
Whether research integrity guidelines should best be defined narrowly or broadly has been
an area of controversy. As of 2015, 22 of the world’s top 40 research countries had national
research conduct policies, all of which included fabrication, falsification, and plagiarism
(FFP), with some going further. In this context, “fabrication” is making up data;
“falsification” includes manipulating studies or changing or omitting data such that the
record does not accurately reflect the actual research; and “plagiarism” includes the
appropriation of another person’s ideas, methods, results, or words without giving
appropriate credit (ORI 2018). The Research Councils of the United Kingdom has a lengthy
list of misdeeds, including FFP, misrepresentation, breach of duty of care, and improper
dealing with allegations of misconduct, with many subcategories (NAS 2017). In contrast,
from the 1980s to 2000, the National Science Foundation (US) had defined serious science
misconduct broadly to include “…fabrication, falsification, plagiarism, or other practices
that seriously deviate from those that are commonly accepted within the scientific
community for proposing, conducting and reporting research” (Goodstein 1995). The
controversial part was the catchall phrase “practices that seriously deviate from those
commonly accepted…” To the stewards of public science funds, such a catchall phrase was
preferable to an itemized list of all potential avenues of mischief, yet it raised the specter of
penalizing scientists who strayed too far from orthodox thought (Goodstein 1995). In 2000,
this definition of research misconduct warranting debarment was narrowed to just “fabrication,
falsification, or plagiarism in proposing, performing, or reporting research” with lesser
offenses classified as questionable research practices. Other misconduct was defined as
“forms of unacceptable behavior that are clearly not unique to the conduct of science,
although they may occur in the laboratory or research environment.” Yet only FFP research
misconduct findings were subject to reporting requirements to federal science funding
agencies, with questionable science practices or other misconduct handled locally (Resnik et
al. 2015; NAS 2017).
In many countries, there is an active debate about whether a legal definition is appropriate
for something that is really an academic judgment rather than a legal one. Denmark likewise recently narrowed its broad definitions of research misconduct to only FFP, following high-profile cases in which scientists succeeded in having their academic misconduct findings overturned in the courts. Yet if research conduct policies are considered “academic” without
legal weight, institutions may have difficulty enforcing policies, such as when deliberate
intent is required to be shown and the researcher claims “honest mistake.” For instance, the
US Office of Research Integrity found that a tenured professor had committed research
misconduct by inappropriately altering data in 5 images from 3 papers. Yet when the
university sought to terminate her, she fought back contesting the university’s procedures,
and the university ultimately paid her US$100 000 to leave (Stern 2017). In private research,
it is not obvious which scientific integrity concepts have the force of law. In an example
from the United States, testimony of egregious breaches of scientific integrity norms
(including faking credentials and selective publication of only favorable results) was
disallowed in a court dispute between 2 private companies because there was no federal law
on scientific integrity (Krimsky 2003).
Science is a human endeavor and the “other misconduct” that scientists may commit is
diverse and may be horrific, such as bullying and abuse of power; taking advantage of
The US National Academy of Sciences (NAS 2017) recently argued that the definitions of
research misconduct as fabrication, falsification, or plagiarism were too narrow. In
particular, questionable research practices were more than just “questionable,” but were
clear violations of the fundamental tenets of research and were given a less ambiguous label
of “detrimental.” Consensus detrimental research practices were as follows:
substance to the transparency provisions, requiring open access to federally funded research
articles and more importantly, requiring archiving and public availability of the underlying
raw data (Holdren 2013). These broad policies become more specific and procedural in government science agencies and are expanded into codes of scholarly and scientific conduct, such as a list of 19 principles for the US Department of the Interior (USDOI 2014).
We expect that the vast majority of scientists consider themselves to act with scientific integrity, self-defined in terms of honesty, transparency, and objectivity, sticking to the research question, and avoiding bias in data interpretation (e.g., Shaw and Satalkar 2018). Yet most
scientists will encounter ethically ambiguous situations. For instance, some may feel that
they struggle to advance science against a rising tide of administrative requirements
accompanied by declining support for science and increasing competition for funding. When
does cutting through bureaucratic institutional requirements cross the line from being
commendable efficiency to violating research integrity rules? Using grant or project funds
for unrelated purchases or conference travel? Should minor misbehaviors, such as posting one’s article on a website after signing a publication and copyright transfer agreement with the publisher agreeing not to do so, still be considered misbehaviors when done by many?
When does cleaning data become cooking data when, for example, anomalous values are
suppressed? There are many ethically ambiguous situations in which scientists may consider
that doing the “right thing” (compliance with all rules) might need to be balanced with doing
the “good thing,” especially when the welfare of others such as students or subordinates is
involved (Johnson and Ecklund 2016).
To us, scientific integrity can be simplified to cultures of personal integrity plus a few
profession-specific provisions of transparency and reproducibility. At their roots, these
norms are those children are hopefully acculturated to in primary school: Tell the truth, and
tell the whole truth (no data sanitizing or selective reporting, and report all conflicts); tell both
sides of the story (avoid bias); do your own work (no plagiarism); read the book, not just the
back cover before writing your report (properly research and cite primary sources); show
your work for full credit (transparency); practice makes perfect (rigor); share (publish your
work and data in peer-reviewed outlets for collective learning); and listen (with humility and
collegial fraternity to observations and suggestions of others). Finally, the golden rule “do
unto others as you would have them do unto you” should resonate throughout the
professional interactions of environmental scientists, and especially in peer reviewing and
data sharing. When encountering an inevitable science dispute, keep criticisms objective,
constructive, and focused on the work and not the worker; do peer reviews of your rivals’ work as you would hope to receive reviews of your own; reward and recognize good behavior in science; and so on.
The term “conflicts of interest” is commonly defined narrowly in terms of financial conflicts. One definition is “a set of conditions in which professional judgment concerning a primary interest tends to be or could be perceived to be unduly influenced by a secondary interest (such as financial gain).” More simply, a conflict of interest is any financial arrangement that compromises, has the capacity to compromise, or has the appearance of compromising trust (Krimsky 2003, 2007). The term “competing interests” is often used where nonfinancial
factors compete with objectivity, such as allegiances, personal friendships or dislikes, career
advancement, having taken public stances on an issue, or political, academic, ideological, or
religious affiliations (PLoS Medicine Editors 2008; Nature Editors 2018b). Bias in study
design or data interpretation may arise from either conflicts or competing interests and can
be either overt or unrecognized by the scientist (Suter and Cormier 2015b).
Generally, the concern over conflicting or competing interests in science is that secondary
physician is a part owner, or the financial advisor who receives commissions when clients
are steered to financial products offered through their employer. Such arrangements do not
mean that the patient’s care will suffer or that bad financial advice will be proffered, but
these self-interests compete with the interests of those they serve (Cain et al. 2005).
The mere existence of a potential conflict of interest should not alone throw results in doubt
when it is disclosed and acknowledged appropriately. However, although most articles in the
environmental sciences routinely disclose funding sources that could be perceived as
potential conflicts of interest, major omissions have occurred (Oreskes et al. 2015; Ruff
2015; Tollefson 2015; Krimsky and Gillam 2018; McClellan 2018). For instance, the
findings of a study on risks of contamination from natural gas extraction from hydraulic
fracturing of bedrock were undermined when it came out (apparently unbeknownst to the
university) that the research supervisor was being paid 3 times his university salary by
serving as an advisor to an oil and gas company invested in the practice. The failure to
disclose this financial relationship in the publication brought the study’s objectivity and
credibility into question, independent of its substance (Stokstad 2012). Authors and journals
have been criticized for gaming ethical financial disclosure requirements, such as by overly
narrow disclosures or disclosing a conflict in the cover letter to the editor accompanying the
manuscript (which is usually hidden from the reviewers and readers) but not including it in the published article.
It should be noted that the severe conflicts of interest that some academic biomedical
researchers have created for themselves by setting up business interests to directly and
personally profit from their research outcomes (Krimsky 2003) are probably much less of an
issue in the environmental sciences. Dual affiliations and the resultant potential for divided loyalties among university researchers have certainly come to light in the environmental sciences, such as when a scientist maintains a public-facing, disinterested researcher identity but has privately set up spin-off personal business interests (Stokstad 2012; Fellner 2018).
While we are not aware of any systematic review, we think these situations are far less
pervasive in the environmental sciences than in biomedicine. Rather, in ecotoxicology and
environmental chemistry, the more common (and insidious) concern for authors and
institutions is to be self-aware of the potential for funding bias through unconscious
internalization of the interests of their research sponsors. The informative value of conflict
of interest or funding disclosures varies. The shortest (and least informative) statement we
have seen was that “the usual disclaimers apply” (Descamps 2008), while the detailed
disclosures in biomedical literature can go on for pages (Baethge 2013; ICMJE 2016).
Funding sources can be obscured by channeling funding through intermediaries, such as a
critical review of cancer risks from talcum powder funded through a law firm involved in
toxic tort litigation (Muscat and Huncharek 2008). Requirements for highly detailed
disclosures risk diminishing their importance to that of the “fine print” cautions in
commerce that are seldom read. Much like computer software user terms and conditions that
have to be clicked past or the ubiquitous consumer product safety stickers that may be
written more to avoid product liability claims than for practical safety, detailed conflict of
interest disclosures may reach a point of diminishing returns. There is some evidence that
overreliance on conflicts disclosure is ineffective or can give moral license to scientists to be
biased (Cain et al. 2005). Our view is that true financial conflicts of interests should be
avoided, not just disclosed. Yet for most scientists in ecotoxicology and the environmental
sciences who sought and received funding in order to pursue studies, simple, unambiguous
statements of the funding sources should generally be sufficient.
Nonfinancial factors may also compete with scientific objectivity. Factors or values such as
these are usually termed “competing interests,” reserving “conflicts of interest” for financial
conflicts (Nature Editors 2018b; PLoS Medicine Editors 2008). In our observations,
competing interests are rarely mentioned in environmental science publications. Rather, they
are often discussed behind the scenes, such as in correspondence between an editor and
potential reviewers, along the lines of “Yes, I would be happy to review this article and
believe I can be objective; however, you should know that I used to be a labmate of the PI
and we collaborated on an article 3 years ago.” Marty et al. (2010) give such an example of a
disclosure of competing interests based on personal relationships. Whether or how the competing interests or values that affect the assumptions and perspectives of scientists should be more formally stated is an area of rich debate in the philosophy of science literature (PLoS Medicine Editors 2008; Douglas 2015; Elliott 2016).
We reiterate our belief that the existence of a potential conflict or competing interests is a
ubiquitous part of the environmental science landscape and does not indicate poor science.
Most scientists strive to present unbiased data and interpret their data evenhandedly.
However, the varied experiences of scientists can influence their perspectives in ways that
they may not recognize themselves. The transparency in disclosure reminds the reader to
consider perspectives and alternate interpretations when judging the merits of a study,
synthesis paper, or risk assessment.
Bias
Many of the published concerns in the environmental science literature come down to
cognitive bias. Science is not value free, and personal bias in interpreting science is often
related to differing worldviews (Lackey 2001; Douglas 2015; Nuzzo 2015; Elliott 2016). For
instance, the collapse of major fisheries that ostensibly had been scientifically managed for
sustainable yields helped inspire the precautionary principle. This philosophy sought more
cautious management and the reversal of the burden of proof for sustainable exploitation of
natural resources (Peterman and M’Gonigle 1992). Those with precautionary principle or
risk assessment worldviews may interpret the same set of facts very differently. The
precautionary principle adherent may emphasize absence of conclusive evidence of safety,
and the risk assessment adherent may emphasize absence of conclusive evidence of harm
(Fairbrother and Bennett 1999). In such settings, values and biases are interwoven. Even
self-disciplined scientists who seek openness and objectivity carry some biases from
experiences and acculturation (here meaning how working in different environmental
organizations can lead scientists to modify their perception and thinking). Recognizing
sources of bias does not imply ill intent, for just the process of acculturation to a particular
place of employment can bias perceptions and inclinations (Figure 2) (Suter and Cormier
2015a, 2015b; Brain et al. 2016).
Professional societies such as SETAC can serve as a form of acculturation; some of the
authors of this essay have been active members of SETAC for much longer than they have
been employed by any single employer. Even self-disciplined scientists who seek openness
and objectivity carry biases from their experiences. What becomes particularly difficult to
self-regulate is the convergence of cognitive bias, a human nature to seek to please one’s
patron, and the interests of one’s employer or client. For instance, studies funded by drug or
medical device makers tend to find positive effects that favor the company funding the
research (Lexchin et al. 2003; Smith 2006), and the funding effect for studies of chemical
toxicity may lean toward finding negative effects (Krimsky 2003, 2013; Bero et al. 2016).
However, concordance between a funder’s self-interest and research findings does not alone
indicate bias. Alternatively, the industry-funded researchers could have deeper knowledge of
a drug or chemical than the nonprofit-funded academic researcher who might have less
extensive experience, the industry-funded work could have been more thoroughly vetted on
the basis of prior internal research, or the industry-funded scientists might have better ability
to obtain the resources and skill to carry out well-focused and rigorous research (Krimsky
2013; Macleod 2014). It is doubtful that these influences can be completely separated. To us,
disclosure, transparency, and balanced external reviews are presently the best pragmatic
approach to managing cognitive biases.
Tit-for-tat adversarial claims of bias in the scientific literature are unlikely to advance the science. Conflicting perspectives can become personalized and intractable. How to know
which is more credible? Neither? Both? Food nutrition researchers pointed out examples of
selective data interpretations and publication bias in obesity research in relation to
sweetened beverage (soft drink) consumption and in the health benefits of breast feeding.
They termed this distortion of information to further what may be perceived to be righteous ends “white hat bias” (Cope and Allison 2009). However, their conflicted financial
backing from the soft drink industry and from manufacturers of baby formula contributed to
countercriticisms of funding bias (Bes-Rastrollo et al. 2014; Harris and Patrick 2011;
Mandrioli et al. 2016). Unresolved in the claims and counterclaims of bias and financial
conflicts of interest was what advice was most credible.
al. 2008; Kintisch 2010; Raloff 2010; Rohr and McCoy 2010; Benderly 2014), sufficiently
safe levels of Se for fish and birds (Skorupa et al. 2004; Renner 2005), and a dispute that
was maintained for more than 20 y about whether an oil spill resulted in indirect harm to
salmon (Burton and Ward 2012). These intractable, mutual-bias criticisms make it very
difficult for nonspecialist readers to make informed judgments of which is the more credible
science.
influence research findings in ways that damage the credibility of research are present. In
environmental toxicology, risk assessments or critical reviews fit that test and can be
vulnerable to bias. Suter and Cormier (2015a) identified sources of bias in ecological risk
assessment as including personal bias, regulatory capture, advocacy assessment, biased
stakeholder and peer-review processes, preference for standard studies, inappropriate
than Type II error (failing to discover degradation when in fact it is occurring), when the
science is ambiguous. Conversely, the regulatory scientists entrusted to provide scientific
advice to protect environmental quality might be obliged to err on the side of precaution and
be more accepting of risk of Type I error, especially when it is “other people’s” money at
stake.
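The trade-off between Type I and Type II error risks mentioned above can be made concrete with a simple power calculation. The sketch below is a minimal illustration, not taken from any study cited here: the effect size, replication, and significance levels are assumed values, and a normal approximation to a two-sample comparison is used for brevity.

```python
# Minimal sketch of the Type I / Type II error trade-off; effect size,
# replication, and alpha levels are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def type_ii_error(effect_size, n_per_group, alpha):
    """Approximate Type II error (beta) for a two-sided, two-sample comparison."""
    ncp = effect_size * np.sqrt(n_per_group / 2.0)  # noncentrality of the test statistic
    z_crit = norm.ppf(1.0 - alpha / 2.0)            # two-sided critical value
    power = (1.0 - norm.cdf(z_crit - ncp)) + norm.cdf(-z_crit - ncp)
    return 1.0 - power

effect_size = 0.5   # hypothetical moderate effect (Cohen's d)
n_per_group = 10    # hypothetical replication per treatment

for alpha in (0.10, 0.05, 0.01):
    beta = type_ii_error(effect_size, n_per_group, alpha)
    print(f"alpha = {alpha:.2f} -> approximate Type II error (beta) = {beta:.2f}")

# Tightening alpha (guarding against falsely concluding degradation) raises beta,
# the chance of failing to detect real degradation, unless replication increases.
```

Which of the two error risks is more acceptable in an ambiguous data set is the kind of value-laden judgment discussed above; the calculation only makes the trade-off explicit.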
While science ethicists and the NAS (NAS 1992; Krimsky 2005; Boden and Ozonoff 2008;
Elliott 2014) have emphasized industry funding bias risks, these risks are not unique to
industry’s funding of science. For example, many countries have provisions for natural
resource damage assessment and restoration (NRDAR) to compensate the public for lost
opportunities following shipwrecks, oil spills, releases of industrial chemicals, and so on
(Flamini et al. 2004; Descamps 2008; Boehm and Ginn 2013; Goldsmith et al. 2014). These
assessments rely on science to some degree to establish linkages from the release to harm to
the environment. In turn, trustees of natural resources rely on science advisors to assess the
extent and scale of injuries (adverse effects) and the monies needed to restore the lost
services. In large incidents, the responsible parties will inevitably retain their own science
advisors. Complex situations are resolved by either negotiation or adversarial litigation
(Flamini et al. 2004; Goldsmith et al. 2014). This environment produces an atmosphere with
strong incentives for plaintiff/trustee science advisors to maximize the magnitude and spatial
extent of effects to the environment and to downplay uncertainties or the influence of
other potential, non-compensable stressors, and vice versa for those scientists retained to help
defend against claims. Maintaining objectivity and advancing science in such a work
environment would require extraordinary self-discipline by the individual scientists, an
institutional environment emphasizing science credibility, and an openness to external,
disinterested review (Wagner 2005; Boden and Ozonoff 2008; Elliott 2014).
In at least some jurisdictions, monies from NRDARs must go to restoring the damaged
public natural resources (beyond paying for salaries, consulting fees, and expenses to
support claims) and cannot be used to enrich those pursuing the cases. Toxic torts, by
comparison, pursue damages on behalf of private individuals or groups who consider
themselves to have been harmed by exposures to toxic chemicals. Toxic tort cases are
adversarial proceedings with the lawyers expected to advocate only for their client, and
expert witnesses are paid to present testimony to support just one side. These torts may be
highly lucrative for the plaintiff attorneys who select the science testimony. For example, in
the Vioxx litigation the share for plaintiff lawyers was about US$1.5 billion (32%) of the US
$4.85 billion settlement (McClellan 2008), and in successful asbestos litigation the average
share of payouts going to the victims was only 37% (Elliott 1988). The lures and risks of
such immense payouts in toxic torts create strong incentives for biased science. At best, critical reviews or product defense studies conducted for toxic tort litigation should be regarded with skepticism.
Defending science and engineering work in order to protect enterprises that represent large investments and years of devoted work is understandable but becomes dangerous when objectivity is compromised. Case studies such as the Vioxx case, in which the maker of the
drug downplayed increased risks of mortality from a successful product in which they were
deeply vested (Curfman et al. 2005; McClellan 2008), and the cross-claims of blame between
the engineering consultants and the mine operator in the aftermath of the Mount Polley
(British Columbia, Canada) mine tailing dam failure (Topf 2016; Amnesty 2017), remind us
that objective science (including recognizing and disclosing uncertainty, and encouraging
additional science to narrow that uncertainty) is good business.
Academic–Industry Collaborations
The role of industry funding and concerns about perceived conflicts of interest in academic–industry collaborations have been addressed in the literature and are a common element in institutional research integrity policies (Resnik and Shamoo 2011; Elliott 2014). Often
through philanthropic foundations, industry may contribute to basic science education and
research to strengthen regional universities and further the science literacy of the potential
workforce and society. Industry may also support applied ecotoxicology and other
environmental science research to inform specific scientific questions that affect their
business interests. When industrial and academic research interests become at least partially
congruent, academic scientists may actively seek out such interest and support for their
projects and graduate students. Pragmatically, academia–industry collaborations are
necessary because public funding alone may be insufficient to support graduate research or
to address important questions relevant to industry and society. In the United States, about
40% of national research and development is funded by the private sector (NAS 2017). In
the United States, public funding for university research on the effects of chemicals in the
environment has consistently declined since 2000 (Bernhardt et al. 2017; Burton et al. 2017),
which implies that without industry–academia collaborations, there would be much less
substantive university research. The need for sufficient funding to support training and
research can trump concerns over the color of money, as captured in a university president’s
quip, “the only problem with tainted research funding is there t’aint enough of it” (Krimsky
2003).
Benefits of collaboration run both ways, with expertise from academic and public sectors
helping industry find solutions to lessen or avoid contributing to environmental problems. Effective collaboration between industry and academic scientists requires industry to provide expertise as well as
funds. Collaboration with industry scientists engenders a shared desire to succeed and
creates a sense of ownership of a project (Edwards 2016). The interchange of science among academic, industry, and government scientists is deeply rooted in SETAC culture, and the favorable views of the authors toward working across sectors are undoubtedly influenced by our history with SETAC. However, industry support to academics or
others in support of applied environmental questions may come with inherent conflicts of
interest, and critics may consider scientists as collaborators in the pejorative sense of the
word (Hopkin 2006). This setting requires vigilance from both industrial research sponsors
and recipients to avoid unconscious bias.
While readers might presume that individuals or institutions with strong incentives will shape research findings to be consistent with their financial interests, it is important not to judge a study solely by its funder, nor to presume the sponsor’s preferred
outcome. For example, an energy company sponsored a study to see if they could develop a
scientific case for relief from costly requirements for meeting dissolved oxygen criteria in a
river downstream of its hydroelectric dam. Instead, the testing showed that the existing
criteria could impair hatching salmon (Geist et al. 2006). The company scientists easily
could have buried the results, which could have been discounted as being from novel
techniques. Their path of least resistance would have been to leave the study in the file
drawer, rather than going to the trouble of defending novel science and publishing it in the
open literature. In the long view, a reputation of science credibility may be more valuable for
companies than short-term project benefits.
Other examples include scientists from mining and metals trade groups publishing studies
showing that existing USEPA criteria for Zn and other metals could be underprotective of
aquatic species or entire communities (Brix et al. 2011; DeForest and Van Genderen 2012).
would also be biased to favor the advocacy group’s positions, and they questioned the
researchers’ probity (Blumenstyk 2007). In fact, the Se concentrations projected by these
academics to cause detrimental population-level effects were higher than concentrations
previously derived by industry-funded consultants who themselves had been on the receiving
end of bias implications because they were industry funded (Skorupa et al. 2004; Van Kirk
and Hill 2007). Unfortunately, these favorable collaboration examples are countered by
examples in which studies were funded as part of deliberate strategies to shape the science to
fit business interests. This “tobacco strategy” has been asserted with various substances such
as asbestos, benzene, chromium, lead, vinyl chloride, and more (Krimsky 2003; Sass et al.
2005; Cranor 2008; Michaels 2008; Oreskes and Conway 2011; Anderson 2017).
In keeping with the adage to be careful judging a book by its cover or wine by its label,
judging science by its funder or by presumed interests or leanings of the scientists can lead
to mistaken and unfair perceptions. Brain et al. (2016) pointed out that the career path of
environmental scientists is often ambiguous, and whether scientists end up in careers with
industry, academia, or government has more to do with chance and timing of opportunities
than with a particular desire to work in 1 sector or another. Such is often the case with
academic and government scientists who work with industry to jointly fund or investigate a
science question of mutual interest (Hopkin 2006). The convergence of scientific interests
with financial interests can lead to a good marriage, so long as the parties are principled and
forthright with each other. While there may be a perception that research contracts are highly restrictive, in our experience these agreements establish expectations of academic freedom.
“Interested science” should be viewed with open-minded skepticism, and studies with
immense financial implications warrant a higher level of scrutiny than others (van
Kolfschooten 2002; Krumholz et al. 2007; Suter and Cormier 2015b). It does not necessarily
follow that interested science is wrong or tainted. Ensuring transparency and complete data reporting is one tangible step researchers can take to improve the credibility of, and perceptions toward, industry–academia collaborations.
We think that SETAC is notable for its directed and sustained efforts to balance competing
perspectives in its deliberative processes and other activities. The founding principles of
SETAC set out a tripartisan structure with regulatory, industrial, and academic scientists (Bui
et al. 2004; Menzie and Smith 2018). As a result, SETAC now has well-developed norms for
balancing interests, inclusiveness of differing viewpoints, and neutrality in the reporting.
Augspurger 2014). In contrast to multisector, nonprofit organizations such as the Health and Environmental Sciences Institute (hesiglobal.org), which brings scientists together from
academia, government, industry, and nongovernmental advocacy organizations (NGOs) to
conduct original research in the public domain, SETAC is more a forum for dialogue and
promotion of best practices within the discipline.
The intended balanced representation of industry, government, and academia is not always
achievable, for there are also guidelines for gender equity and geographic representation,
and of course, people have to be willing to volunteer. Further, the tripartisan emphasis
underrepresents scientists from environmental advocacy groups or other NGOs. These
groups are influential for shaping public debate, policy, and law on environmental issues, but
their low participation in the Society suggests that they may not be attracted to or feel
welcomed by a “hard” scientific society such as SETAC, and meeting costs may be a barrier.
Despite these imperfections, the norms of seeking to balance potentially conflicting interests
and to provide a safe forum to express differing scientific viewpoints are deeply ingrained in
the Society’s culture and activities.
Environmental relevance
By definition, environmental chemistry and ecotoxicology are concerned with how
chemicals, both natural and synthetic, pose a threat to or otherwise influence the natural world (Johnson
et al. 2017). Because of pragmatic and ethical constraints, research in this domain is often
done in laboratory environments, testing cultured laboratory organisms or cell lines or other
in vitro surrogates for organisms. However, such research invariably still has some intended relevance to conditions that occur in the environment. We have seen articles
in ecotoxicology literature discussing some novel research based on undertested taxa,
underappreciated endpoints, unexpected multiple stressor effects, or unanticipated indirect
effects via untested commensal microbes. An article may open with an introduction on the ecological importance of the novel work; the work is then reported; and the discussion closes by arguing the ecological importance of the work, how it should change thinking in the field, and its management implications. Yet to obtain the desired experimental effects,
exposure concentrations may have been orders of magnitude higher than those typical in the
real world, or exposure routes, chemical forms, or dilution media may be unlike those that
the organisms could encounter in nature (Johnson and Sumpter 2016; Mebane and Meyer
2016; Weltje and Sumpter 2017). When authors present such studies with a narrative on the
ecological importance of their topic, this may be a form of misrepresentation.
Environmental relevance and regulatory relevance may not always be one and the same. Still,
with studies of potential ecological effects of chemicals, investigators often hope that their
research will inform future regulatory assessments of risk and safety. In practice,
environmental regulators may pass over results of academic studies in favor of research
sponsored by industry. Steps academic ecotoxicologists could take to improve the utility of
their research for informing policy include developing an understanding of environmental
regulatory frameworks; using existing chemical assessments to inform new studies;
conducting and reporting studies to include sufficient rigor, quality assurance, and detail to
enable regulatory use; and placing academic studies in a regulatory context (Ågerstrand et
al. 2017).
Rigor
Funders, journals, and institutions reward novelty, such as the short-lived discovery of a
bacterium that grows with As instead of P (Alberts 2012). Highly selective journals with
article acceptance rates of 10% or less preferentially publish findings that are sensational or
at least surprising. These incentives are influential because universities and research
institutes often hire and promote scientists on the basis of their record of acquiring grant money and on the number of publications weighted by the impact factors of the journals in which they appear (Parker et al. 2016). With finite career opportunities and high network
connectivity, the marginal return for being in the top tier of publications may be orders of magnitude higher than for an otherwise respectable publication record (Smaldino and
McElreath 2016). The editorial quest for novelty has led to publication of questionable
articles in elite journals, such as one positing that caterpillars were the results of accidental
sex between insects and worms (Borrell 2009). Top tier journals also tend to have higher
retraction rates than midtier journals, suggesting that rigor has sometimes been
compromised in the competition for paradigm-shifting results (Nature Editors 2014).
Rigor in experimental ecotoxicology includes attributes such as verification of the exposure, adequate replication, unbiased analysis and reporting of the results, and
repeating experiments that yielded surprising or ambiguous responses. There are ample
opportunities for improvement. For example, Harris and Sumpter (2015) asked a very basic
question of a sample of studies published in 2013 in 3 leading ecotoxicological publications:
Was the concentration of the test chemical actually measured? Of the studies reviewed from
Environmental Toxicology and Chemistry, 20% failed this basic aspect of experimental
credibility, as did 33% and 41% of ecotoxicology studies published in Aquatic Toxicology
and Environmental Science and Technology, respectively (Harris and Sumpter 2015).
Related to the 12 principles described by Harris et al., we suggest 9 considerations that are important to most field-based ecotoxicological studies or environmental effects monitoring.
flowing rivers and impounded reservoirs have very different communities, and study designs that attempt to detect pollution effects on communities across such
disparate habitats may have very low discriminatory power (Buys et al. 2015).
By failing to account for natural variability, adverse pollutant effects could be
obscured (Parker and Wiens 2005; Wiens and Parker 1995).
5. Gradient sampling: Studying a number of locations that vary in the degree of the factor under investigation, such as chemical pollution, may improve the ability to accurately estimate a relationship between exposure to the environmental factor of interest and the effect of that factor, if such a relationship exists (see the sketch after this list).
9. Transparency: Reporting sufficiently detailed methods and raw data for others to
reproduce the analyses or to further examine the data using alternative analyses
is a key attribute and common shortcoming of studies (Duke and Porter 2013;
Schäfer et al. 2013; McNutt et al. 2016).
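As a simple illustration of the gradient sampling consideration in item 5, the sketch below regresses a biological response metric against log-transformed exposure across sites spanning a wide contamination gradient. The site values, response metric, and linear model are hypothetical and chosen only to show the idea; they are not drawn from any study cited here.

```python
# Minimal sketch of gradient sampling (item 5): sites spanning a wide exposure
# gradient allow an exposure-response relationship to be estimated directly.
# All values below are hypothetical.
import numpy as np
from scipy.stats import linregress

# Hypothetical site data: sediment metal concentration (mg/kg) and taxa richness
# at 8 locations ordered along a contamination gradient.
concentration = np.array([5, 12, 30, 75, 150, 320, 600, 1200], dtype=float)
taxa_richness = np.array([24, 23, 21, 18, 15, 11, 9, 6], dtype=float)

# Regress richness against log10(concentration); the slope estimates the change
# in richness per 10-fold increase in exposure, with its strength and uncertainty.
fit = linregress(np.log10(concentration), taxa_richness)
print(f"slope = {fit.slope:.1f} taxa per 10-fold increase, "
      f"r^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.3g}")
```

A design confined to a narrow slice of the gradient, or one that confounds the gradient with habitat differences such as the river-versus-reservoir contrast noted earlier, would have far less power to detect any such relationship.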
Reproducibility
Reproducibility is one indicator of reliable research. However, the inability of researchers to
reproduce influential studies of others or their own has garnered enough attention to be
called a “reproducibility crisis” (Baker 2016a; Henderson and Thomson 2017; Munafò et al.
2017). Yet not all studies are easily reproduced. Environmental data are often messy,
and field studies are more often observational than experimental. Large-scale, ecologically
realistic studies such as long-term, experimental lake studies are difficult to do even once,
and hopefully no one wishes for mishaps such as tailings dam failures or oil spills to study
(Wiens and Parker 1995; Schindler 1998; Parker and Wiens 2005). Such studies require a
logical system for causal inference to separate cause and effect from serendipitous
correlations (Norton et al. 2002; Suter et al. 2002).
Even rigorous laboratory studies may be difficult to replicate due to the highly variable
nature of biological systems and unanticipated responses to unknown factors. Demands for
reproducibility may favor industrial science over academic science. Industry often works
within strict good laboratory practice (GLP) rules and with well-studied species tested
through standardized protocols (Elliott 2016). Academic science is often framed around
education, and grant-funded and graduate student researchers are usually encouraged to go after something new and novel; protocols may be developed as they go, and quality control may be less formal.
Better experimental protocols that are easier to follow are one tangible way to strive for
better reproducibility and transferability of both novel and standard experimental methods
(Figure 4). Multimedia experimental protocols could make it much easier to explain and teach techniques than conventional, densely worded printed protocols. The Journal of
Visualized Experiments (JoVE) is an innovative peer-reviewed, science methods journal in
which the articles are a unique blend of the conventional printed article with professionally
produced videography. Ecotoxicology methods articles have begun to be published in this
format (van Iersel et al. 2014; Calfee et al. 2016). The field would benefit from broader use
of new visualization techniques to document new methods and to improve education and
training on techniques that need to be highly standardized to be repeatable. At the minimum,
with the availability of electronic data repositories and supplemental information in journals,
there is no reason why detailed methods, including video demonstrations, cannot be
published.
Reproducing a statistical summary or model run reported in a scientific publication when the
underlying data and code are provided and explained is one thing. Reproducing an actual
complex experiment is hard and is rarely attempted, unless perhaps the results are novel and
have a high regulatory or societal impact. Even under the best of circumstances, such as
when the original investigators have the resources and motivation to repeat an experiment in the same lab, with organisms from the same culture, using as close to identical
methods as they could manage, and including positive controls, the investigators may be
unable to produce the same result twice (Mebane et al. 2008; Owen et al. 2010). Positive
controls (testing a substance with well-characterized effects) are not always used in toxicity
testing programs, but their routine use can help investigators understand variability in test
results (Glass 2018). Nosek and Errington (2017) caution that if investigator #2 reports that
the results of study #1 could not be reproduced, that by itself does not indicate which is
more credible: result #1, #2, neither, or both. Further, much of the “reproducibility” debate
in the natural sciences is focused on cell biology or human behavior (psychology)
experiments, which may be more tractable to reproducibility studies than messy
environmental observational or experimental studies. Especially with complex biological testing such as multigeneration tests, a “green thumb” husbandry factor brings an element of art as well as science to environmental chemistry and toxicology (Figure 4). Subtle methods
differences, strain differences, or stochastic events can be so puzzling that investigators are
left thinking demons must have snuck into their study and interfered with one treatment but
not others (Hurlbert 1984). (We presume that Hurlbert’s [1984] suggestions of exorcisms or human sacrifice for troubleshooting suspected demonic intrusions might run afoul of some contemporary institutional review board policies.)
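As a brief illustration of the easier kind of reproduction noted at the start of this paragraph, re-deriving a reported statistical summary when the underlying data are archived, the sketch below re-fits an EC50 from deposited raw concentration-response data. The file name, column names, and 2-parameter log-logistic model are hypothetical placeholders for whatever a given study archived, not a prescribed workflow.

```python
# Minimal sketch: re-deriving a reported summary statistic (an EC50) from archived
# raw data. File and column names and the model form are hypothetical.
import pandas as pd
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    """2-parameter log-logistic model: proportion affected rises from 0 toward 1 with concentration."""
    return conc**slope / (conc**slope + ec50**slope)

# Raw data as deposited in a repository or supplemental file: one row per
# replicate, with exposure concentration (ug/L) and proportion affected.
data = pd.read_csv("deposited_raw_data.csv")  # hypothetical archived file

params, _ = curve_fit(
    log_logistic,
    data["concentration_ugL"],
    data["proportion_affected"],
    p0=[data["concentration_ugL"].median(), 2.0],  # starting guesses
)
print(f"Re-derived EC50 = {params[0]:.1f} ug/L; compare with the value reported in the paper.")
```

When the raw data and a short analysis script are archived together, a check of this kind takes minutes; without them, it is not possible at all.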
Still, reproducibility is a core tenet of science, and successful reproduction adds confidence
in the credibility of novel findings. Divergent but individually credible results may further
advance the science by illuminating important aspects that were missed in the initial study
(Owen et al. 2010). If, for instance, an investigator were to find a novel, major adverse effect of a class of chemicals on a previously untested taxonomic group, then other equally diligent and skilled investigators should be able to produce similar effects in other research settings, even if the test conditions were only similar. For instance, a stand-alone paper from the
1970s that reported a snail’s anomalous sensitivity to Pb was skeptically regarded. More
than 30 y later, open-minded skepticism led to follow-on studies from a new generation of
scientists that not only affirmed the anomalous early report of sensitivity but also led to
important advances in comparative physiology and underlying mechanisms of toxicity (Brix
et al. 2012). Similarly, early reports that freshwater mussels and other mollusks were
unusually sensitive to ammonia were not widely persuasive. After repeated studies across
multiple laboratories and species showed similar findings, the issue gained traction with
standardized method development, interlaboratory round-robin testing, and attention by
environmental managers (Farris and Hassel 2006; USEPA 2013).
Individual investigators may not always have the opportunities for self-replication, but best
practices call for repeating what one can (Harris and Sumpter 2015). In field studies,
multiple measures of exposure, multiple years of field data, and so on give credence to
findings. We recognize that all science has practical resource limits and we are not going so
far as to argue that novel findings from small studies should never be published. Rather, the
appropriate conclusion from such studies is along the lines of “if these findings turn out to
be repeatable, they could be an important development.” In our view, novel, major findings
that are supported only by a one-off study are best regarded as tentative.
Transparency
Transparency in reporting research, including all the relevant underlying data that were
relied upon in the paper, has become a critical element of integrity in science. Science’s
claim to self-correction and overall reliability is based on the ability of researchers to
replicate the results of published studies (Nosek et al. 2015). Studies cannot be replicated or
even reconstructed if scientists will not share additional data, information, or materials from
published studies, and we believe that upholding such ethical norms is every scientist’s
responsibility. The embrace of the principle of transparent reporting has been uneven across
disciplines, and the field of ecotoxicology has certainly not distinguished itself as a leader in
this regard (Meyer and Francisco 2013; Schäfer et al. 2013; Womack 2015; McNutt et al.
2016; Parker et al. 2016).
Researchers in ecotoxicology and environmental chemistry have long presented only highly
reduced data summaries. The only “data” included in some publications are crowded figures
and tables with results of statistical outputs, such as F-values, effects concentration point
estimates (EC50, EC10, etc.), or no- and lowest-observed effects concentrations (NOECs,
LOECs). These derived values are not data. Such data-poor publications were
understandable before the early 2000s, when strict page and word limits precluded authors from "wasting" space on data tables. With the provisions for electronic
supplemental material beginning in the 2000s, and dedicated data repositories becoming
widely available at low or no cost to authors in the 2010s, these reasons for opaque
publication are no longer justified. Researchers who choose not to transparently report the
actual data underlying their scientific findings may have other reasons for doing so. They
may be concerned about others scooping them on their own data (McNutt 2016), although
counterintuitively, publishing data may actually help establish priority and reduce scooping
concerns (Laine 2017). Other less charitable reasons why researchers might resist publishing
data include that they have not devoted the needed time to organize their data in a coherent
fashion that is interpretable by others, their reported results might not be able to be
reconstructed from the underlying data, they are not keen to facilitate alternate statistical
analyses or interpretations of their data, they wish to publish unfalsifiable findings, or there
is simply less there than they led readers to believe (Smith and Roberts 2016).
Data sharing may still be regarded more as an imposition from science funders to be complied with than as a universal principle embraced by those who conduct and
publish scientific research (Nelson 2009; Burwell et al. 2013; Holdren 2013; Nosek et al.
2015; European Commission 2016; Collins and Verdier 2017). There are many pragmatic
obstacles to effective data sharing, such as the expertise, extra work, and costs to researchers
to organize, serve, and preserve their data in a comprehensible manner; privacy and
anonymity concerns for environmental data collected from private property or about human
subjects; and balancing intellectual property concerns. Some environmental science research
is intended to be confidential, such as private sector economic geology, agricultural chemical
product development, and innumerable other corporate research efforts that are intended to
develop products and recoup research and development investments. However, in our view,
researchers on such ventures cannot have it both ways, by publishing some outcomes in the
peer-reviewed literature but withholding the supporting data as private. A recent corporate initiative to make available traditionally protected crop safety information is noteworthy in this regard.
Most environmental science journals have policies that encourage and facilitate data sharing.
SETAC journals are probably typical in requiring a statement by the authors about whether
and how the data underlying their analyses are available, with an admonition that authors
should share upon request. A passable statement may be something as feeble as “Contact the
Corresponding Author for data availability.”
The strongest data disclosure policy for journals publishing in the environmental sciences is
probably that developed for the Public Library of Science (PLOS) family of journals. "PLOS
journals require authors to make all data underlying the findings described in their
manuscript fully available without restriction, with rare exception” (PLOS 2014).
Exceptions are limited to privacy or vulnerability concerns such as data on human research
subjects that could not be fully anonymized; locations of archeological, fossil, or endangered
species that could be exploited or damaged; or safety and security considerations. Penalties
for authors who fail to comply include rejection, or if they decline to provide data for an
already published article, the editors could flag their article with a cautionary correction or
even retract it (PLOS 2014). Whether PLOS's stand requiring authors to make available all data underlying their findings will lead other journals to stiffen their resolve, or whether the comparatively lax policies of competing journals will undermine PLOS and other open-
science advocates remains to be seen (Nosek et al. 2015; Davis 2016).
What counts as "data" also bears definition: there are shades of gray between raw and derived values, and the true raw data behind a data point include survey data, unprocessed sensor readings, spectral outputs, and such. Unless the study involves methods comparisons or forensic data audits, usually the researcher just wants the resultant derived values at a level of detail sufficient to reconstruct and further analyze the original results.
While the notion that investigators should preserve and share underlying data is simple, the
reality of doing so is much more complicated and challenging. To us, data preservation and sharing are practices to strongly encourage, for without data, the credibility of science cannot be evaluated. Some
research has shown that the willingness and ability of authors to share data decline
significantly with time, and having a weak data availability policy is only marginally better
than having no policy at all (Vines et al. 2014). Computer servers get replaced, directories
flushed, offices moved, files dumped, and investigators move on, retire, and eventually die.
Rather than mandates, one simple incentive to improving openness in reporting has been for
journals to award prominent open data “badges” for articles verified as being supported by
available, correct, usable, and complete data. When open data badges appear on the issue table of contents and article web page, and a "verified open data" statement is included in the bibliographic indexing metadata, articles without such endorsement may come to be seen as
incomplete. Over time, this might shift the norm toward open preservation and sharing. In at
least one journal, this approach appeared to markedly improve the sharing and preservation
of data through linked, independent repositories (Kidwell et al. 2016; Munafò et al. 2017).
When literature is compiled for reviews or secondary analyses, a host of data reduction, data standardization, and analysis decisions need to be made. These decisions and associated biases may be deliberate and clearly explained, or the analyst may not even recognize that they have made a decision (Roberts et al. 2006; Suter and Cormier 2015a). In some cases, we suspect analysts obscured their decisions. Systematic review methodology is now also being used for some chemical assessments, in which case data synthesis may be highly structured, with criteria clearly defined for data inclusion and search
strategies (Hobbs et al. 2005; Van Der Kraak et al. 2014; Whaley et al. 2016). Other
situations may follow the wending path of the present article: discussions among the authors
(“Have you seen so-and-so?”) and readings that led to other relevant material through
forward and backward citing, along with some specific subject searches. This path led to
much relevant and thoughtful material across many disciplines. But it was hardly systematic
or reproducible.
Literature searches from different sources can yield very different results. For example,
using a 2007 original research article on population modeling of Se toxicity to trout (Van
Kirk and Hill 2007), 4 leading bibliographic indexing services were searched for articles
citing that study. Web of Science (WoS), Elsevier’s Scopus, Digital Science’s Dimensions,
and Google Scholar found 7, 10, 15, and 22 citing publications, respectively. Scopus found
all articles found by WoS, plus articles WoS missed in Human and Ecological Risk
Assessment and Integrated Environmental Assessment and Management. Google Scholar
found all articles found by Scopus and WoS, plus articles in Ecotoxicology Modeling, Water
Resources Research, 3 government reports, 2 books, a thesis, a conference proceeding, a
duplicate, and 2 ambiguous citations. It follows from this 3-fold difference in valid citations
that a critical review of published literature on a topic or a regulatory assessment could miss
relevant science if the assessors relied too heavily on a single search provider.
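In practice, such a coverage check amounts to simple set comparisons on the citing records exported from each service. The short Python sketch below illustrates the idea; the DOI sets are hypothetical placeholders rather than the actual search results.

```python
# Minimal sketch of comparing citing-article coverage across indexing services.
# The DOI sets below are hypothetical placeholders, not the actual search results.
wos = {"10.1000/a1", "10.1000/a2", "10.1000/a3"}                 # e.g., a Web of Science export
scopus = wos | {"10.1000/a4", "10.1000/a5"}                      # Scopus found all WoS records plus more
scholar = scopus | {"10.1000/b1", "10.1000/b2", "10.1000/b3"}    # Google Scholar adds reports, theses, etc.

for name, found in [("WoS", wos), ("Scopus", scopus), ("Google Scholar", scholar)]:
    print(f"{name}: {len(found)} citing records")

# Records an assessor would miss by relying on a single provider
print("Missed by WoS alone:", sorted(scholar - wos))
```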
This simple example was from the “current era” of science, which began by 1996 or so,
depending on which bibliographic indexing service scholars are using. The websites for WoS and Scopus report that their indexing databases are reliable from 1971 and 1996 forward, respectively. Relying exclusively on bibliographic index searching may omit important, relevant
older research.
Thus, we have the indexing bias problem in meta-analyses and assessments (works that are not indexed will not be retrieved), and the related problem of reviewing the secondary source but citing the original. We have seen assessments that omitted seminal research published before the current digital era, which may reflect indexing bias. Ecotoxicology syntheses often rely on variations of species-sensitivity distributions, which may devote more explanation to the statistical characteristics of the data sets, data extrapolations, transformations, and normalizations than to where the data came from in the first place. We have seen
micrograms and milligrams mixed up, and rankings that mistakenly commingled endpoints
such as time to death in hours with effects concentrations. Some of these issues are
undoubtedly related to the online availability of well-curated databases such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Aquatic Toxicity (EAT) database or the US Environmental Protection Agency's ECOTOX database. These compiled databases are valuable resources, but reliance on secondary compilations deprives the
original authors of credit via citations. At least for publicly funded science, citations may be
a way that authors demonstrate the value of their work to the scientific community, and thus
build the case for further funding. Further, reliance on secondary sources is a good way to
introduce or repeat inaccuracies (Rekdal 2014). We echo previous calls for better training
and rigor when conducting and reporting secondary analyses of ecotoxicology and related
literature. Practices from other fields, such as the Cochrane systematic review approach and guidelines for the ethical reuse of data, could be adapted to ecotoxicology practice
(Roberts et al. 2006; Duke and Porter 2013; Suter and Cormier 2015a).
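As an illustration of the data-hygiene steps implied above, the following Python sketch (using hypothetical records, not values from any of the cited databases) converts concentrations to a common unit and screens out noncomparable endpoints before any pooled ranking or species-sensitivity distribution would be attempted.

```python
# Hypothetical literature records; not drawn from the EAT or ECOTOX databases.
records = [
    {"species": "Daphnia magna",       "endpoint": "EC50", "value": 0.45, "unit": "mg/L"},
    {"species": "Lymnaea stagnalis",   "endpoint": "EC10", "value": 12.0, "unit": "ug/L"},
    {"species": "Oncorhynchus mykiss", "endpoint": "LT50", "value": 96.0, "unit": "h"},  # a time, not a concentration
]

TO_UG_PER_L = {"ug/L": 1.0, "mg/L": 1000.0}   # convert all concentrations to micrograms per liter

usable = []
for rec in records:
    if rec["unit"] not in TO_UG_PER_L:        # drop endpoints that are not concentrations
        continue
    usable.append({**rec, "value_ug_L": rec["value"] * TO_UG_PER_L[rec["unit"]]})

for rec in usable:
    print(rec["species"], rec["endpoint"], rec["value_ug_L"], "ug/L")
```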
Objectivity
Beyond the topics discussed, some explicit steps to ensure objectivity bear consideration
from initial planning through the interpretation of results. Even very careful scientists are not
immune from cognitive biases that affect objectivity. These include hypothesis dependency
(alternatives never considered will not be evaluated), hypothesis tenacity (sticking with an
initial hypothesis despite evidence to the contrary), and anchoring, in which beliefs are
excessively influenced by initial perceptions (Norton et al. 2003).
There are some steps scientists could consciously take to avoid these pitfalls. Strive to be
self-skeptical of possible personal predilections toward expected causative factors or
outcomes. Consider the opposite—maybe it is correct. For topics with competing schools of
thought, consider how a scientist from a different school might interpret the data, or how one from a different sector might analyze the problem (Suter and Cormier 2015a). If
considerations such as these are sincere and not done just to anticipate and refute criticisms,
they can help to ensure the objectivity and balance of interpretations.
Environmental chemistry
We discuss environmental chemistry separately because it has different scientific integrity
challenges than does the biological side of ecotoxicology. Unlike the situation in the
biological side of ecotoxicology where serious questions have been aired about the
reproducibility of some of the published research (Scott 2012, 2018; Sumpter et al. 2014;
Sumpter et al. 2016), analytical environmental chemistry does not appear to suffer from such
problems to the same extent. The likely reason for this is that quality assurance mechanisms
are routinely incorporated into analytical projects that involve the measurement of
environmental concentrations of chemicals, thus ensuring that the results are accurate. These
include the use of high-quality standards, which are widely available, the use of high-
specification instruments, and general guidelines proposed by national and international
institutions. That combination enables recovery rates to be determined (preferably at different concentration ranges), intra- and interday precision to be assessed, detection limits to be quantified, and matrix effects (interference from other substances) to be
investigated. These quality assurance procedures are adopted routinely, are always checked
by reviewers of analytical papers, and ensure quality is maintained.
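For readers less familiar with these routine checks, the sketch below works through two of them with made-up numbers: percent recovery of a known spike, and a detection limit approximated as roughly 3 times the standard deviation of blank replicates. This is a common rule of thumb only; formal procedures such as the US EPA method detection limit calculation differ in detail.

```python
# Hedged sketch of two routine QA calculations, using made-up numbers.
import statistics

def percent_recovery(measured, native, spike_added):
    """Recovery (%) of a known spike: (measured - native) / spike * 100."""
    return (measured - native) / spike_added * 100.0

blanks = [0.02, 0.03, 0.01, 0.02, 0.03, 0.02, 0.02]   # hypothetical blank results, ug/L
approx_dl = 3 * statistics.stdev(blanks)              # ~3x SD of blanks, a common rule of thumb

print(f"Spike recovery: {percent_recovery(measured=9.6, native=1.2, spike_added=8.0):.0f}%")
print(f"Approximate detection limit: {approx_dl:.3f} ug/L")
```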
Rather, problems tend to arise in how analytical results are subsequently used: in reviews, in compiled databases, and in citation practices. For example, metadata specifying fundamental details may be missing or misunderstood, such as whether concentrations of metals or other
elements in water are from filtered or unfiltered samples or whether they reflect the total
mass of the element or only 1 speciation state (Sprague et al. 2017). Reported aquatic metals concentrations declined from milligram-per-liter levels in the 1980s to
microgram-per-liter or submicrogram-per-liter levels by the late 1990s. This remarkable,
widespread decline was not due to better pollution controls or global geochemical change,
but to improved recognition and control of ubiquitous contamination in field and laboratory
sampling and analysis methods. There are ongoing debates over the most appropriate
sampling and analysis methods for inorganic water quality constituents, particularly for
environments that are expensive and difficult to sample representatively, such as large rivers
(Horowitz 2013). Such sampling biases and analytical method differences may be
substantial enough to confound analyses.
Organic environmental chemistry data sets have similar pitfalls that can confound
subsequent reviews and secondary analyses. For example, Kolpin et al. (2002a) published a
summary of a survey of different pharmaceuticals, hormones, and other organic
contaminants from 139 streams. This highly influential paper showed that some organic
contaminants were widespread in streams and contributed to heightened concern and
research interest in their potential health and environmental risks. The paper is presently the
most highly cited paper ever published in the journal Environmental Science & Technology
(1st of 46 011 papers published, with 5104 citations in the Scopus database as of 16 August
2018). However, the reported concentrations of at least 1 of the 95 chemicals, 17α-ethinyl estradiol (EE2), were questioned because the median and maximum concentrations
of 73 and 831 ng/L, respectively, were about 10 to 1000 times higher than those from other
surveys or analyses (Ericson et al. 2002; Hannah et al. 2009). Kolpin et al. (2002b)
responded that upon further inspection, they had discovered that their maximum reported
EE2 value of 831 ng/L was indeed incorrect owing to analytical interferences. They further
explained that they had defined “median” in a peculiar way, as the median of detected values
in streams, not in its usual meaning as a central tendency of all values. Because EE2 was
undetectable in 94% of streams sampled, the median of detected values was skewed far
above the median of all streams (<5 ng/L). However, despite the discovery of the mistakes,
no correction was issued for the original publication. The Kolpin et al. (2002b)
acknowledgment of the mistaken values was buried among the other 5103 citing papers, and their subtle, peculiar definition of a "median" was likely overlooked by most readers. As of
August 2018, at least 50 citing papers were identified that re-reported and perpetuated the
exorbitant and mistaken 831 ng/L maximum EE2 value for US streams.
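The effect of that peculiar definition is easy to demonstrate with illustrative numbers (these are not the actual survey data): when 94 of 100 samples are non-detects, the median of the detected values alone sits far above the median of all samples, which is simply a non-detect.

```python
# Illustrative numbers only, not the Kolpin et al. survey data.
import statistics

detection_limit = 5.0                                 # ng/L, hypothetical reporting limit
detects = [20.0, 45.0, 60.0, 73.0, 150.0, 831.0]      # detected concentrations in 6 of 100 streams
non_detects = [0.0] * 94                              # non-detects, substituted as 0 for illustration

print("Median of detected values only:", statistics.median(detects), "ng/L")   # 66.5 ng/L

overall = statistics.median(detects + non_detects)    # median of all 100 samples
if overall < detection_limit:
    print(f"Median of all samples: <{detection_limit} ng/L (a non-detect)")
else:
    print(f"Median of all samples: {overall} ng/L")
```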
Thus, it is easy for authors to misinterpret or perpetuate erroneous values from the literature. The problem of citing unreliable maximum values would be avoided if authors
simply cited extreme statistics, such as percentile concentrations (e.g., the 10th to 90th, 5th to
95th, 1st to 99th) instead of ranges (Weltje and Sumpter 2017). Whereas a single extreme
value defines the range, extreme percentiles are more representative of severe conditions that organisms may actually encounter, are more stable, and are far less vulnerable to error. For instance, Santore et al. (2018, their Figure 9) elegantly summarized about 29
000 paired reports from aggregated data sources of dissolved and total aluminum (Al) in
fresh water. Logically, the dissolved fraction of trace metals in water can be no greater than
the total. In practice, results do not always come out that way, especially when the 2 values
are close. Factors such as differences in sample digestion, differences between instruments,
or slight differences in technique may introduce subtle analytical biases that produce the
dissolved fraction greater than the whole (Paul et al. 2016). In the Santore et al. (2018)
comparison, at least 150 of the 29 000 Al pairs show the dissolved fraction is greater than
the whole. While such logically impossible values usually simply indicate that close to 100% of the total Al was present in dissolved form, some are plainly erroneous, with the dissolved fraction 2 to 3 orders of magnitude higher than the total concentration.
Should an imprudent analyst uncritically report on ranges of dissolved and total Al, they
would report nonsense results. Simply backing off to the 99th or 95th percentile for a large
data set such as this one would still reflect the extremes of environmental conditions but be
somewhat insulated from dubious single values.
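Both practices, reporting extreme percentiles rather than the single maximum and screening paired dissolved/total results for impossible values, are straightforward to implement. The sketch below uses a small synthetic data set, not the Santore et al. (2018) compilation, to show how a single dubious value can dominate the reported maximum while barely moving the 99th percentile.

```python
# Synthetic example only; not the Santore et al. (2018) data compilation.
import random
import statistics

random.seed(1)
total = [random.lognormvariate(3.0, 1.0) for _ in range(1000)]     # hypothetical total Al, ug/L
dissolved = [t * random.uniform(0.3, 1.05) for t in total]         # analytical noise lets a few exceed total
dissolved[10] = max(total) * 500                                   # inject one grossly impossible value

# Screen paired results for physically impossible values (dissolved > total)
flagged = [i for i, (d, t) in enumerate(zip(dissolved, total)) if d > t]
print(f"{len(flagged)} of {len(total)} pairs have dissolved > total")

# A single extreme value defines the range; an extreme percentile is far more stable
qs = statistics.quantiles(dissolved, n=100)        # 99 cut points; qs[98] is the 99th percentile
print(f"Maximum dissolved Al: {max(dissolved):.0f} ug/L (driven by the one dubious value)")
print(f"99th percentile:      {qs[98]:.0f} ug/L")
```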
The counterpart to avoiding pitfalls when working with big data sets is avoiding pitfalls when working with small data sets. Small data sets may be the best that can be acquired from studies in
extreme environments where sampling is difficult, dangerous, or very expensive or where
conditions are ephemeral or rare. Dismissing high-quality data that may have been collected
under heroic or unrepeatable circumstances on pedantic statistical grounds would be
foolhardy. Small data sets can be important, so long as one is cautious about inferences.
Efforts to “improve” small data sets through sophisticated statistical techniques should be
resisted (Murtaugh 2007).
ADVOCACY
Science is the enterprise for answering questions and making predictions about how the
universe works. Science can inform issues, but science can never answer “should” questions.
For example, science cannot tell societies whether they should restrict chemical uses and
releases, whether natural preserves should be set aside from human exploitation, or whether
biodiversity should be protected. These are among the myriad value judgments that societies
must make, and while science can support societies in making these choices through
predictions founded upon a body of knowledge, there are never “scientifically correct”
answers to questions of human values, morals, and ethics (Snyder and Hooper-Bui 2018).
Scientists are humans, and like all people, they hold ethical and moral values that drive
assumptions which may not be explicitly stated, if even recognized. For example, the notion
of “environmental protection” in the environmental toxicology field is rooted in societal
norms, statutes, and international agreements with goals of minimizing harm (a human
concept) from activities such as extraction, manufacture, use, and disposal of chemical
products. Scientists in the field develop informed opinions toward the "should" questions relating to their experiences, which raises the question of whether and how scientists should advocate for particular answers to those questions.
The underpinnings of science are that researchers have no vested interest in the results of
their observations, that they objectively record and analyze these results, and that they fairly
report the outcomes in the peer-reviewed literature. Advocacy can compromise these
underpinnings, at the cost of scientists’ credibility (Fenn and Milton 1997). Scientists tend to
be passionate about their science, which has led to controversy over the role that scientists
should play in related public policy debates. While we think most scientists would agree that
advocacy for science having a role in environmental policy debates is appropriate, there is
likely much less agreement about whether it is appropriate for scientists to advocate for
particular outcomes in policy debates. If the policy debate turns on questions of science
central to a scientist’s particular area of study, probably no one is better positioned than that
scientist to lay out the evidence for or against a particular course of action. If the scientist is
regarded as a neutral and informed voice, their advice may be valued by all sides in a policy
dispute (Sedlak 2016). However, if the scientist's experience or analyses lead them to the strong conviction that one policy direction is more correct and should be adopted, then they face the choice of whether to advocate for that outcome, at the risk of no longer being regarded as a neutral voice.
Institutional constraints aside, how scientists balance these competing issues and choose
when or whether to engage in advocacy is a deeply personal choice and is situational (Meyer
et al. 2010; Sedlak 2016). However, just as science journals discourage commingling original research results and commentary, scientists should keep science and advocacy distinct in
their publications and speaking. In particular, we argue that scientists should be watchful for
stealth policy advocacy. Stealth advocacy is the use of value-laden language in scientific
writing that assumes a policy preference (Lackey 2007; Pielke 2007). Rather than being openly disclosed, assumed values or policy preferences may be unconsciously (or deliberately) cloaked through normative science. Normative science is science developed,
presented, or interpreted on the basis of an assumed, usually unstated, preference for a
particular policy or class of policy choices. This covert advocacy may be reflected in word
choices, and such advocacy is not always apparent even to the advocate. For instance, value-
laden words such as “stressors,” “impacted,” “degraded,” “improved,” “good,” and “poor”
may be used to describe habitats or other environmental features. Less value-laden words
would be “factors,” “exposed,” “altered,” “changed,” “increased,” or “decreased.” The use
of normative science is potentially insidious because the tacit, usually unstated, preference
for a particular policy or class of policy choices is not perceptibly normative to policy
makers or even to many scientists (Lackey 2007).
Criticisms of normative science can be excessive because, taken literally, the entire
discipline of conservation biology could be considered too normative. Similarly, the mission
statement of SETAC “to support the development of principles and practices for protection,
enhancement and management of sustainable environmental quality and ecosystem
integrity” could be too much for some. Science is normative, with topics and study questions
influenced by normative treaties and laws. Areas of study or techniques once considered appropriate for scientific inquiry, such as craniometry, eugenics, or experimentation on human subjects without informed consent, are no longer considered to be within the norms of
ethical science. Within environmental toxicology, pressure to reduce the use of animal
testing might be an example of normative science.
Our point is not to argue for or against scientists engaging in overt policy advocacy, which is
a personal decision, but for clarity and transparency. Just as original results, opinions, judgments, and speculation should not be blended in a scientific paper, science and advocacy
need some separation (Scott and Rachlow 2011). Covert advocacy is a form of bias.
Environmental scientists should clearly differentiate between research findings and policy
advocacy based upon those findings.
We recognize that "scientific integrity" discussions can easily devolve into the path carved by "sound science" strategic initiatives, which often boiled down to campaigns to call "my science good science and your science junk" (McGarity 2003;
Doremus 2007; Oreskes and Conway 2011; Kapustka 2016). The goal may be to recast
policy, ideological, or economic disputes as scientific doubts or conflicts. In countries with a
tort-based, adversarial legal system for resolving injuries or damages, science-based
information becomes just another tool for dueling experts, who often have primary
responsibility for advocating for the interests of their client (Wagner 2005). Research
integrity policies or requirements for data transparency can be used as weapons to bury
public university or government scientists with vexatious, intrusive, and costly demands for
records such as raw laboratory notebooks, instrument calibration records, emails between
coauthors, working drafts, and peer comments and responses. Such demands can be effective
tools for interfering with the work of public-sector scientists, including academics in public
institutions (Folta 2015; Halpern and Mann 2015; Kloor 2015; Kollipara 2015;
Lewandowsky and Bishop 2016) or academics in private institutions who receive research
support from public sources (Hey and Chalmers 2010; Shrader-Frechette 2012). For
example, Deborah Swackhamer, an environmental chemist at the University of Minnesota,
USA, was targeted under state open records laws with legal demands for raw unpublished
data, class notes, purchase records, telephone records, and more from a 15-y period.
Ironically, the identities of those seeking the information were themselves shielded from
disclosure (Halpern 2015). Some scientists have learned to use transparency laws against
their peers in the highly competitive arena of grant funding. Through freedom of information
demands for competing grant proposals, scientists have been able to obtain details on
competitors’ new research direction, preliminary results, and cost structure. For those
targeted scientists, such information gathering may be seen as research espionage under the
rubric of transparency (Carey and Woodward 2017).
The sunshine laws enacted in many jurisdictions were intended to illuminate the business of
government officials; it is doubtful they were intended by their crafters to sweep up
university professors. Nevertheless, some see scientists as fair targets of such tactics, given
that inspections of their erstwhile private communications have uncovered peer-review
misconduct, undisclosed conflicts of interest, or bias (e.g., Russell et al. 2010; Fellner 2018;
Krimsky and Gillam 2018). Privately funded research is generally shielded from such
practices (Wagner and Michaels 2004; Brain et al. 2016). However, researchers at private
institutions may also be subject to baseless litigation to intimidate scientists and deter others
by inflicting long and costly legal processes, disruption, and threats of personal financial
liability. Such harassing lawsuits have been employed often enough to get a name, “SLAPP
suits,” for Strategic Litigation Against Public Participation (Johnson 2007; Nature Medicine
Editors 2017; Robbins 2017). Although legal, such strategies represent detrimental practices
cloaked in the vernacular of transparent science (Wagner and Michaels 2004; Johnson 2007;
McGarity and Wagner 2008; Levy and Johns 2016).
It is one thing to realize that there is a problem, but quite another to find effective solutions
to that problem. For high scientific integrity to be the norm, the culture of science has to
systematically embrace exemplary practices and discourage bad behavior (Benderly 2010).
However, sometimes the system and its incentives are part of the problem. Many of the
reliability concerns and detrimental behaviors we have discussed are related to the perverse
incentives under which scientists may operate. These incentives are grants and publications.
In the case of grants, the more, and bigger, they are, the better, as far as institutions are
concerned. In the case of publications, the number of these seems to be much more
important than their quality. This is probably because assessing the quality of a scientific
article is not easy; there is no established, widely accepted way of doing this. The "status" of the journal in which an article is published, most often taken to be the impact factor of that journal, is also an important consideration. Hence, scientists strive to get
their papers published in journals with high impact factors and may act unethically to do so
(Nature Editors 2014). These incentives, particularly those concerning publications (“the
more the merrier”), probably contribute to many of the lapses in integrity in ecotoxicology.
This problem became particularly severe in China with its high volume of scientific output
and its system of tying substantial cash bonuses or other overt rewards for researchers to the
impact factors of journals in which their articles were published. This contributed to
ingenious and widespread misconduct (Hvistendahl 2013). In response, China recently
initiated sweeping reforms with strong disincentives for academic misconduct: Institutions
will no longer investigate themselves, funding will be withheld from institutions at which
misconduct occurred, researchers will be deterred from publishing findings in journals that
are deemed to be of poor academic quality and set up merely for profit, and scientists found
to have conducted willful misconduct will face severe penalties (Cyranoski 2018; Nature
Editors 2018a). How the education for and enforcement of these initiatives develops is likely
to influence research communities and funders elsewhere. Likewise in ecotoxicology,
moving to more ethical practices in which integrity is central to any endeavor will not be
easy to accomplish, nor will it be achieved quickly.
Education of ecotoxicologists, both young and old, is a key way forward toward a better
culture of integrity in our discipline. That education can be delivered in a variety of ways,
the two most obvious and practical being 1) the publication of articles in journals in which
integrity and ethics are discussed and 2) courses run by scientific societies such as SETAC.
The present article is an example of a very direct attempt at highlighting integrity issues in
our field, with the hope that by making ecotoxicologists aware of these unethical practices
they will change their behavior and act more ethically. Other, less direct attempts have
involved the publication of papers covering suggestions for how to improve the quality of
ecotoxicology research, from the planning stage (Harris et al. 2014) to the publication stage
(e.g., Moermond et al. 2016; Hanson et al. 2017). However, it seems unlikely that published
papers alone will have a significant influence on the quality of ecotoxicology research
because few scientists will be aware of them, and even fewer will read them carefully and
subsequently act on the advice in them.
Although there has been some public discussion about what training and skills the
ecotoxicologists of the future will require (Harris et al. 2017), this crucial aspect of
producing better ecotoxicologists, capable of doing better, and hence more useful, research
has rarely been addressed. Yet there are undoubtedly things that could be done to better
educate the ecotoxicologists of the future. A radical proposal would be to require aspiring
ecotoxicologists to pass examinations before they are allowed to practice ecotoxicology,
either as researchers or regulators. Many professions do insist that their practitioners pass examinations before they are allowed to practice: doctors, dentists, accountants, and lawyers
are obvious examples. This strategy ensures that practitioners are adequately trained. As a
first step toward the goal of ensuring all ecotoxicologists are appropriately trained, specific
courses could be introduced and attendance could become mandatory. Courses on topics
such as experimental design, statistical analysis, data presentation, and how to write a
scientific paper could be designed easily. In fact, many research organizations and some
industries already run “in house” courses on these topics, which may include short,
memorable messages such as David Glass’s series of animations on common pitfalls in
experimental design (Glass 2018). Munafò et al.'s (2017) "manifesto for reproducible science" offers further material that such courses could draw on.
In summary, identifying that there are problems with the way ecotoxicologists are currently trained about integrity issues in their discipline is only the first step. Better education of
ecotoxicologists (both those starting their careers and those already well established) is
needed. Such education will need to be provided in a range of formats, to maximize its reach and effectiveness.
4. Studies that are not supported by primary data released through data repositories
or detailed supporting information are not fully credible.
9. Scientists, like all people, have moral and ethical assumptions, based upon their
values. These should not be intermixed with their interpretations and reporting of
science. If scientists’ values lead them to cross the lines from analysis to
advocacy, they need to be particularly careful about distinguishing between
science, values, assumptions, and opinion.
published works.
11. Professional societies such as SETAC could support a standing training seminar
on principles of scientific integrity, the transparent conduct of science, and best
practices for peer review in conjunction with its annual meetings.
Supplementary Material
Refer to Web version on PubMed Central for supplementary material.
Acknowledgment
We thank the 4 anonymous peer reviewers and the 2 institutional reviewers for their constructive criticisms and
insights. This commentary and perspectives followed discussions by the Society of Environmental Toxicology and
Chemistry (SETAC) ad hoc subcommittee on Scientific Integrity (Fairbrother 2016). Author roles: CA Mebane
wrote most of the paper, with contributions by JP Sumpter and A Fairbrother. All other authors participated in
discussions, reviewed and edited the drafts, supported the recommendations, and approved of its publication.
Disclaimer
The views expressed in this article are those of the authors and do not necessarily represent the views or policies of
SETAC, the US Environmental Protection Agency, or the US Fish and Wildlife Service. Affiliations are given for
identification, but do not necessarily imply endorsement. Author affiliations reflect a variety of academic, business, and governmental employment, and all authors have competing interests from their previous experiences and work environments, but hopefully these were somewhat balanced out. No specific funding was provided for the writing of
this manuscript and no financial conflicts of interest were declared by any author.
REFERENCES
[AAAS] American Association for the Advancement of Science. 2000 The role and activities of
scientific societies in promoting research integrity. In: A report of a conference; 2000 Apr 10–11;
Washington, DC 24 p. https://www.aaas.org/sites/default/files/content_files/The%20Role%20and
%20Activities%20of%20Scientific%20Societies%20in%20Promoting%20Research
%20Integrity.pdf.
Ågerstrand M, Sobek A, Lilja K, Linderoth M, Wendt-Rasch L, Wernersson AS, Rudén C. 2017 An
Bero L, Anglemyer A, Vesterinen H, Krauth D. 2016 The relationship between study sponsorship,
risks of bias, and research outcomes in atrazine exposure studies conducted in non-human animals:
Systematic review and meta-analysis. Environ Int 92–93: 597–604. 10.1016/j.envint.2015.10.011
Bes-Rastrollo M, Schulze MB, Ruiz-Canela M, Martinez-Gonzalez MA. 2014 Financial conflicts of
interest and reporting bias regarding the association between sugar-sweetened beverages and
weight gain: A systematic review of systematic reviews. PLoS Med 10(12): e1001578 10.1371/
journal.pmed.1001578
Blumenstyk G 2007 When research criticizes an industry. Chron High Educ 54(4): A21.
Boden LI, Ozonoff D. 2008 Litigation-generated science: Why should we care? Environ Health
Perspect 116(1): 117–122. [PubMed: 18197310]
Boehm PD, Ginn TC. 2013 The science of natural resource damage assessments. Environ Claims J
25(3): 185–225. 10.1080/10406026.2013.785910
Bohannon J 2013 Who’s afraid of peer review? Science 342(6154): 60–65. 10.1126/
science.342.6154.60 [PubMed: 24092725]
Boone MD, Bishop CA, Boswell LA, Brodman RD, Burger J, Davidson C, Gochfeld M, Hoverman JT,
Neuman-Lee LA, Relyea RA et al. 2014 Pesticide regulation amid the influence of industry.
BioScience 64(10): 917–922. 10.1093/biosci/biu138
Bornstein-Forst SM. 2017 Establishing good laboratory practice at small colleges and universities. J
Microbiol Biol Educ 18(1): 181.10. 10.1128/jmbe.v18i1.1222
Borrell B 2009 8 24 National Academy as National Enquirer? PNAS publishes theory that caterpillars
originated from interspecies sex. Scientific American (The Sciences). https://
www.scientificamerican.com/article/national-academy-as-national-enquirer/
Brain R, Stavely J, Ortego L. 2016 In response: Resolving the perception of bias in a discipline
founded on objectivity—A perspective from industry. Environ Toxicol Chem 35(5): 1070–1072.
10.1002/etc.3357 [PubMed: 27089441]
Brix KV, DeForest DK, Adams WJ. 2011 The sensitivity of aquatic insects to divalent metals: A
comparative analysis of laboratory and field data. Sci Total Environ 409(20): 4187–4197. 10.1016/
j.scitotenv.2011.06.061 [PubMed: 21820156]
Brix KV, Esbaugh AJ, Munley KM, Grosell M. 2012. Investigations into the mechanism of lead
toxicity to the freshwater pulmonate snail, Lymnaea stagnalis. Aquat Toxicol 106–107(0): 147–
156. 10.1016/j.aquatox.2011.11.007
Bui C, Parrish R, Meredith M, Giddings J, editors. 2004 A history of SETAC Part 1: The beginning.
Pensacola (FL): SETAC 36 p.
Burton GA, Di Giulio R, Costello D, Rohr JR. 2017 Slipping through the cracks: Why is the U.S.
Environmental Protection Agency not funding extramural research on chemicals in our
environment? Environ Sci Technol 51(2): 755–756. 10.1021/acs.est.6b05877 [PubMed: 28033008]
Burton GA, Ward CH. 2012 Editorial preface [letters to the editor on the Exxon Valdez oil spill and
pink salmon]. Environ Toxicol Chem 31(3): 469 10.1002/etc.1741 [PubMed: 21954042]
Burwell SM, VanRoekel S, Park T, Mancini DJ. 2013 Open data policy-managing information as an
asset. Washington (DC): US Office of Management and Budget M-13-13 [accessed 2018 Sep 15].
https://obamawhitehouse.archives.gov/sites/default/files/omb/memoranda/2013/m-13-13.pdf
Buys DJ, Stojak AR, Stiteler W, Baker TF. 2015 Ecological risk assessment for residual coal fly ash at
Watts Bar Reservoir, Tennessee: Limited alteration of riverine-reservoir benthic invertebrate
community following dredging of ash-contaminated sediment. Integr Environ Assess Manag
11(1): 43–55. 10.1002/ieam.1577 [PubMed: 25158124]
Cain DM, Loewenstein G, Moore DA. 2005 The dirt on coming clean: Perverse effects of disclosing
conflicts of interest. J Legal Studies 34(1): 1–25. 10.1086/426699
Calfee RD, Puglis HJ, Little EE, Brumbaugh WG, Mebane CA. 2016 Quantifying fish swimming
behavior in response to acute exposure of aqueous copper using computer assisted video and
digital image analysis. J Visualized Exp (108): e53477 10.3791/53477
Callaway E 2015 8 18 Faked peer reviews prompt 64 retractions. Nature (News). 10.1038/nature.2015.18202
Campbell T, Friesen J. 2015 3 3 Why people “fly from facts”: Research shows the appeal of untestable
beliefs and how they lead to a polarized society. Scientific American (Behavior & Society). https://
www.scientificamerican.com/article/why-people-fly-from-facts/
Carey TL, Woodward A. 2017 9 2 These scientists got to see their competitors’ research through
public records requests. BuzzFeed News (Science). https://www.buzzfeednews.com/article/
teresalcarey/when-scientists-foia
Chapman PM, Brain RA, Belden JB, Forbes VE, Mebane CA, Hoke RA, Ankley GT, Solomon KR.
2018 Collaborative research among academia, business, and government. Integr Environ Assess
Manag 14(1): 152–154. 10.1002/ieam.1975 [PubMed: 29027366]
Christensen SW, Klauda RJ. 1988 Two scientists in the courtroom: What they didn’t teach us in
graduate school. Monogr Am Fish Soc 4: 307–315.
Collins SL, Verdier JM. 2017 The coming era of open data. BioScience 67(3): 191–192. 10.1093/
biosci/bix023
Cope MB, Allison DB. 2009 White hat bias: Examples of its presence in obesity research and a call for
renewed commitment to faithfulness in research reporting. Int J Obes 34(1): 84–88.
Couzin-Frankel J 2017 2 22 Firing of veteran NIH scientist prompts protests over publication ban.
Science (News). 10.1126/science.aal0808
Cranor CF. 2008 The tobacco strategy entrenched. Science 321(5894): 1296–1297. 10.1126/
science.1162339
Curfman GD, Morrissey S, Drazen JM. 2005 Expression of concern: Bombardier et al, “Comparison
of upper gastrointestinal toxicity of rofecoxib and naproxen in patients with rheumatoid arthritis,”
N Engl J Med 2000;343:1520–8. N Engl J Med 353(26):2813–2814. 10.1056/NEJMe058314
[PubMed: 16339408]
Cyranoski D 2018 China introduces sweeping reforms to crack down on academic misconduct. Nature
558(14 June 2018): 171 10.1038/d41586-018-05359-8 [PubMed: 29895918]
Davis P 2016 8 23 Scientific Reports on track to become largest journal in the world Scholarly Kitchen
[blog]. Oakbrook Terrace (IL): Society for Scholarly Publishing https://
scholarlykitchen.sspnet.org/2016/08/23/scientific-reports-on-track-to-become-largest-journal-in-
the-world/
DeForest DK, Van Genderen EJ. 2012 Application of U.S. EPA guidelines in a bioavailability-based
assessment of ambient water quality criteria for zinc in freshwater. Environ Toxicol Chem 31(6):
1264–1272. 10.1002/etc.1810 [PubMed: 22447746]
Descamps H 2008 Natural resource damage assessment (NRDA) under the European Directive on
Environmental Liability: A comparative legal point of view. Océanis 32(3/4): 439–46.
Dixon PM, Pechmann JHK. 2005 A statistical test to show negligible trend. Ecology 86(7): 1751–
1756. 10.1890/04-1343
Doremus H 2007 Scientific and political integrity in environmental policy. Texas Law J 86: 1601–
1653.
Douglas H 2015 Politics and science: Untangling values, ideologies, and reasons. Ann Am Acad Pol
Soc Sci 658: 296–306. 10.1177/0002716214557237
Douglas HE. 2014 Scientific integrity in a politicized world. In: Schroeder-Heister P, Heinzmann G,
Hodges W, Bour PE, editors. Logic, methodology and philosophy of science. Proceedings of the
Fourteenth International Congress Logic, Methodology and Philosophy of Science; 2011 Jul 19–
26; Nancy, France. London (UK): College Publ. p 254–268.
Duke CS, Porter JH. 2013 The ethics of data sharing and reuse in biology. BioScience 63(6): 483–489.
10.1525/bio.2013.63.6.10
Edwards A 2016 Reproducibility: Team up with industry. Nature 531(301): 299–301.
10.1038/531299a [PubMed: 26983524]
Edwards MA, Roy S. 2016 Academic research in the 21st century: Maintaining scientific integrity in a
climate of perverse incentives and hypercompetition. Environ Eng Sci 34(1): 51–61. 10.1089/
ees.2016.0223
Elliott ED. 1988 The future of toxic torts: Of chemophobia, risk as a compensable injury and hybrid
compensation systems. Houston Law Rev 25: 781–799.
Elliott KC. 2014 Financial conflicts of interest and criteria for research credibility. Erkenntnis 79(5):
917–937. 10.1007/s10670-013-9536-2
Elliott KC. 2016 Standardized study designs, value judgments, and financial conflicts of interest in
research. Perspect Sci 24(5): 529–551. 10.1162/POSC_a_00222
Else H 2018 Does science have a bullying problem? Nature 563: 616–618. [PubMed: 30487619]
Enserink M 2014 Sabotaged scientist sues Yale and her lab chief. Science 343(6175): 1065–1066.
10.1126/science.343.6175.1065 [PubMed: 24604172]
Enserink M 2017 3 21 A groundbreaking study on the dangers of “microplastics” may be unraveling.
Science (News). 10.1126/science.aal0939
Ericson JF, Laenge R, Sullivan DE. 2002 Comment on “Pharmaceuticals, hormones, and other organic
wastewater contaminants in U.S. streams, 1999−2000: A national reconnaissance.” Environ Sci
Technol 36(18): 4005–4006. 10.1021/es0200903 [PubMed: 12269756]
European Commission. 2016 10 28 Open access to scientific information | Digital Single Market
Brussels (BE). https://ec.europa.eu/digital-single-market/en/open-access-scientific-information
Extance A 2018 11 5 Chemistry graduate student admits poisoning colleague with carcinogen.
Chemistry World (News). https://www.chemistryworld.com/news/chemistry-graduate-student-
admits-poisoning-co-worker/3009717.article
Fairbrother A 2016 2 18 New subcommittee on scientific integrity. SETAC Globe 17(2). https://
globearchive.setac.org/2016/february/integrity.html
Fairbrother A, Bennett RS. 1999 Ecological risk assessment and the precautionary principle. Hum Ecol
Risk Assess 5(5): 943–949. 10.1080/10807039991289220
Fanelli D 2009 How many scientists fabricate and falsify research? A systematic review and meta-
analysis of survey data. PLoS One 4(5): e5738 10.1371/journal.pone.0005738 [PubMed:
19478950]
Fanelli D 2012 Negative results are disappearing from most disciplines and countries. Scientometrics
90(3): 891–904. 10.1007/s11192-011-0494-7
Farris JL, Hassel JHV, editors. 2006 Freshwater bivalve ecotoxicology. Boca Raton (FL): SETAC and
CRC Pr. 408 p.
Fellner C 2018 6 18 Toxic secrets: Professor “bragged about burying bad science” on 3M chemicals.
Sydney Morning Herald (Investigation). https://www.smh.com.au/lifestyle/health-and-wellness/
toxic-secrets-professor-bragged-about-burying-bad-science-on-3m-chemicals-20180615-
p4zlsc.html
Fenn DB, Milton NM. 1997 Advocacy and the scientist. Fisheries 22(11): 4.
Flamini F, Hanley N, Shaw WD. 2004 Settlement versus litigation in environmental damage cases.
Working paper. Glasgow (UK): Business School - Economics, Univ Glasgow http://
EconPapers.repec.org/RePEc:gla:glaewp:2003_4
Folta KM. 2015 8 16 Transparency weaponized against scientists. Science 20 http://
www.science20.com/kevin_folta/transparency_weaponized_against_scientists-156873
Fraser H, Parker TH, Nakagawa S, Barnett A, Fidler F. 2018 7 16 Questionable research practices in
ecology and evolution. PLoS One 13(7): e0200303 10.1371/journal.pone.0200303 [PubMed:
30011289]
Gardner M 1989 Water with memory? The dilution affair. Skeptical Inquirer 13: 132–141.
Geist DR, Abernethy CS, Hand KD, Cullinan VI, Chandler JA, Groves PA. 2006 Survival,
development, and growth of fall Chinook salmon embryos, alevin, and fry exposed to a variable
thermal and dissolved oxygen regime. Trans Am Fish Soc 135(6): 1462–1477. 10.1577/T05-294.1
Ghorayshi A 2016 6 29 “He thinks he’s untouchable”-Sexual harassment case exposes renowned ebola
scientist. BuzzFeedNews (Science). https://www.buzzfeed.com/azeenghorayshi/michael-katze-
investigation.
Gibbons A 2014 Sexual harassment is common in scientific fieldwork (July 16, 2014). Science https://
www.sciencemag.org/news/2014/07/sexual-harassment-common-scientific-fieldwork
Glanz J, Armendariz A. 2017 3 8 Years of ethics charges, but star cancer researcher gets a pass. New
York Times (Science). https://www.nytimes.com/2017/03/08/science/cancer-carlo-croce.html
Glass D 2018 Experimental design for biologists: 1. System validation. [video]. [accessed 2018 Nov
Hannah R, D’Aco VJ, Anderson PD, Buzby ME, Caldwell DJ, Cunningham VL, Ericson JF, Johnson
AC, Parke NJ, Samuelian JH et al. 2009 Exposure assessment of 17α-ethinylestradiol in surface
waters of the United States and Europe. Environ Toxicol Chem 28(12): 2725–2732.
10.1897/08-622.1 [PubMed: 19645524]
Hanson ML, Wolff BA, Green JW, Kivi M, Panter GH, Warne MSJ, Ågerstrand M, Sumpter JP. 2017
How we can make ecotoxicology more valuable to environmental protection. Sci Total Environ
578: 228–235. 10.1016/j.scitotenv.2016.07.160 [PubMed: 27503632]
Harris CA, Scott AP, Johnson AC, Panter GH, Sheahan D, Roberts M, Sumpter JP. 2014 Principles of
sound ecotoxicology. Environ Sci Technol 48(6): 3100–3111. 10.1021/es4047507 [PubMed:
24512103]
Harris CA, Sumpter JP. 2015 Could the quality of published ecotoxicological research be better?
Environ Sci Technol 49(16): 9495–9496. 10.1021/acs.est.5b01465 [PubMed: 26247126]
Harris D, Patrick M. 2011 6 21 Is “big food’s” big money influencing the science of nutrition? New
York (NY): ABC News http://abcnews.go.com/US/big-food-money-accused-influencing-science/
story?id=13845186
Harris MJ, Huggett DB, Staveley JP, Sumpter JP. 2017 What training and skills will the
ecotoxicologists of the future require? Integr Environ Assess Manag 13(4): 580–584. 10.1002/
Hopkin M 2006 Ecology: Caught between shores. Nature 440(7081): 144–145. 10.1038/440144a
[PubMed: 16525438]
Horowitz AJ. 2013 A review of selected inorganic surface water quality-monitoring practices: Are we
really measuring what we think, and if so, are we doing it right? Environ Sci Technol 47(6):
2471–2486. 10.1021/es304058q [PubMed: 23406404]
Huff D, Geis I. 1954 How to lie with statistics. New York (NY): WW Norton (1993 reissue). 142 p.
Hurlbert SH. 1984 Pseudoreplication and the design of ecological field experiments. Ecol Monogr
54(2): 187–211. 10.2307/1942661
Hutchings JA. 1997 Is scientific inquiry incompatible with government information control? Can J
Fish Aquat Sci 54(5): 1198–1210. 10.1139/f97-051
Hvistendahl M 2013 China’s publication bazaar. Science 342(6162): 1035–1039. 10.1126/
science.342.6162.1035 [PubMed: 24288313]
[ICMJE] International Committee of Medical Journal Editors. 2016 ICMJE form for disclosure of
potential conflicts of interest. 3 p. http://icmje.org/conflicts-of-interest/
Ioannidis JPA. 2005 Why most published research findings are false. PLoS Med 2(8): e124 10.1371/
journal.pmed.0020124
Johnson AC, Donnachie RL, Sumpter JP, Jürgens MD, Moeckel C, Pereira MG. 2017 An alternative
approach to risk rank chemicals on the threat they pose to the aquatic environment. Sci Total
Environ 599–600: 1372–1381. 10.1016/j.scitotenv.2017.05.039
Johnson AC, Sumpter JP. 2016 Are we going about chemical risk assessment for the aquatic
environment the wrong way? Environ Toxicol Chem 35(7): 1609–1616. 10.1002/etc.3441
[PubMed: 27331653]
Johnson BL. 2007 Intimidation of scientists: A HERA experience. Hum Ecol Risk Assess 13(3): 475–
477. 10.1080/10807030701340870
Johnson DR, Ecklund EH. 2016 Ethical ambiguity in science. Sci Eng Ethics 22(4): 989–1005.
10.1007/s11948-015-9682-9 [PubMed: 26169696]
Kapustka L 2016 Words matter. Integr Environ Assess Manag 12(3): 592–593. 10.1002/ieam.1767
[PubMed: 27332929]
Keith R 2015 10 9 13th retraction issued for Jesús Ángel Lemus. Retraction Watch http://
retractionwatch.com/2015/10/09/13th-retraction-issued-for-jesus-angel-lemus/
Kidwell MC, Lazarević LB, Baranski E, Hardwicke TE, Piechowski S, Falkenberg L-S, Kennett C,
Slowik A, Sonnleitner C, Hess-Holden C et al. 2016 Badges to acknowledge open practices: A
simple, low-cost, effective method for increasing transparency. PLoS Biol 14(5): e1002456
10.1371/journal.pbio.1002456 [PubMed: 27171007]
Kintisch E 2010 8 19 “I told ya, you can’t stop the rage,” UC endocrinologist Hayes writes to
Syngenta. Science (News). http://news.sciencemag.org/2010/08/i-told-ya-you-cant-stop-rage-uc-
endocrinologist-hayes-writes-syngenta
Kloor K 2015 2 11 Agricultural researchers rattled by demands for documents from group opposed to
Krimsky S 2007 When conflict-of-interest is a factor in scientific misconduct. Med Law 26(3): 447–
463. [PubMed: 17970245]
Krimsky S 2013 Do financial conflicts of interest bias research?: An inquiry into the “funding effect”
hypothesis. Sci Technol Hum Values 38(4): 566–587. 10.1177/0162243912456271
Krimsky S, Gillam C. 2018 Roundup litigation discovery documents: Implications for public health and journal ethics. J Public Health Policy 39(3): 318–326. 10.1057/s41271-018-0134-z [PubMed: 29884897]
Krumholz HM, Ross JS, Presler AH, Egilman DS. 2007 What have we learnt from Vioxx? BMJ
334(7585): 120–123. 10.1136/bmj.39024.487720.68 [PubMed: 17235089]
Lackey RT. 2001 Values, policy, and ecosystem health. BioScience 51(6): 437–443.
10.1641/0006-3568(2001)051[0437:VPAEH]2.0.CO;2
Lackey RT. 2007 Science, scientists, and policy advocacy. Conserv Biol 21(1): 12–17. 10.1111/
j.1523-1739.2006.00639.x [PubMed: 17298504]
Laine H 2017 Afraid of scooping: Case study on researcher strategies against fear of scooping in the context of open science. Data Sci J 16: 29 10.5334/dsj-2017-029
Levy KE, Johns DM. 2016 When open data is a Trojan Horse: The weaponization of transparency in
science and governance. Big Data & Society 3(1). 10.1177/2053951715621568
Lewandowsky S, Bishop D. 2016 Research integrity: Don’t let transparency damage science. Nature
529: 459–461. 10.1038/529459a [PubMed: 26819029]
Lexchin J, Bero LA, Djulbegovic B, Clark O. 2003 Pharmaceutical industry sponsorship and research
outcome and quality: Systematic review. BMJ 326(7400): 1167–1170. 10.1136/
bmj.326.7400.1167 [PubMed: 12775614]
Lindenmayer DB, Likens GE. 2010 The science and application of ecological monitoring. Biol
Conserv 143(6): 1317–1328. 10.1016/j.biocon.2010.02.013
Macleod M 2014 Some salt with your statin, Professor? PLoS Biol 12(1): e1001768 10.1371/
journal.pbio.1001768 [PubMed: 24465176]
Mandrioli D, Kearns CE, Bero LA. 2016 Relationship between research outcomes and risk of bias,
study sponsorship, and author financial conflicts of interest in reviews of the effects of artificially
sweetened beverages on weight outcomes: A systematic review of reviews. PLoS One 11(9):
e0162198 10.1371/journal.pone.0162198 [PubMed: 27606602]
Marcus A, Oransky I. 2016 1 21 CRISPR controversy reveals how badly journals handle conflicts of
interest. Boston (MA): STAT (The Watchdogs) https://www.statnews.com/2016/01/21/crispr-
conflicts-of-interest/
Marshall E 1983 The murky world of toxicity testing. Science 220(4602): 1130–1132. 10.1126/
science.6857237 [PubMed: 6857237]
Martinson BC, Anderson MS, de Vries R. 2005 Scientists behaving badly. Nature 435(7043): 737–
738. 10.1038/435737a [PubMed: 15944677]
Marty GD, Saksida SM, Quinn TJ. 2010 Relationship of farm salmon, sea lice, and wild salmon
McNutt M, Lehnert K, Hanson B, Nosek BA, Ellison AM, King JL. 2016 Liberating field science
samples and data. Science 351(6277): 1024–1026. 10.1126/science.aad7048 [PubMed:
26941302]
Mebane CA, Eakins RJ, Fraser BG, Adams WJ. 2015 Recovery of a mining-damaged stream
ecosystem. Elementa 3(1): 000042 10.12952/journal.elementa.000042
Mebane CA, Hennessy DP, Dillon FS. 2008 Developing acute-to-chronic toxicity ratios for lead,
cadmium, and zinc using rainbow trout, a mayfly, and a midge. Water Air Soil Pollut 188(1–4):
41–66. 10.1007/s11270-007-9524-8
Mebane CA, Meyer JS. 2016 Environmental toxicology without chemistry and publications without
discourse: Linked impediments to better science. Environ Toxicol Chem 35(6): 1335–1336.
10.1002/etc.3418 [PubMed: 27216837]
Melvin SD, Munkittrick KR, Bosker T, MacLatchy DL. 2009 Detectable effect size and bioassay
power of mummichog (Fundulus heteroclitus) and fathead minnow (Pimephales promelas) adult
reproductive tests. Environ Toxicol Chem 28(11): 2416–2425. 10.1897/08-601.1 [PubMed:
19604029]
Menzie C, Smith R. 2018 8 9 Scientific integrity must rise above partisanship. SETAC Globe 19(8).
https://globe.setac.org/scientific-integrity-must-rise-above-partisanship/
Meyer JL, Frumhoff PC, Hamburg SP, de la Rosa C. 2010 Above the din but in the fray:
Environmental scientists as effective advocates. Front Ecol Environ 8(6): 299–305.
10.1890/090143
Meyer JN, Francisco AB. 2013 A call for fuller reporting of toxicity test data. Integr Environ Assess
Manag 9(2): 347–348. 10.1002/ieam.1406 [PubMed: 23529809]
Michaels D 2008 Doubt is their product: How industry’s assault on science threatens your health.
Oxford (UK): Oxford Univ Pr. 384 p.
Moermond CTA, Kase R, Korkaric M, Ågerstrand M. 2016 CRED: Criteria for reporting and
evaluating ecotoxicity data. Environ Toxicol Chem 35(5): 1297–1309. 10.1002/etc.3259
[PubMed: 26399705]
Mozur MC. 2012 A global professional society. Integr Environ Assess Manag 8(1): 1 10.1002/
ieam.1270 [PubMed: 22184140]
Mudge JF, Barrett TJ, Munkittrick KR, Houlahan JE. 2012 Negative consequences of using α = 0.05
for environmental monitoring decisions: A case study from a decade of Canada’s Environmental
Effects Monitoring Program. Environ Sci Technol 46(17): 9249–9255. 10.1021/es301320n
[PubMed: 22873710]
Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Percie du Sert N, Simonsohn U,
Wagenmakers E-J, Ware JJ, Ioannidis JPA. 2017 A manifesto for reproducible science. Nat Hum
Behav 1: 0021 10.1038/s41562-016-0021
Munkittrick KR, Arens CJ, Lowell RB, Kaminski GP. 2009 A review of potential methods for
determining critical effect size for designing environmental monitoring programs. Environ
Toxicol Chem 28(7): 1361–1371. 10.1897/08-376.1 [PubMed: 19199371]
Murtaugh PA. 2007 Simplicity and complexity in ecological data analysis. Ecology 88(1): 56–62.
10.1890/0012-9658(2007)88[56:SACIED]2.0.CO;2 [PubMed: 17489454]
Muscat JE, Huncharek MS. 2008 Perineal talc use and ovarian cancer: A critical review. Eur J Cancer
Prev 17(2): 139–146. 10.1097/CEJ.0b013e32811080ef [PubMed: 18287871]
[NAS] National Academy of Sciences. 1992 Responsible science: Ensuring the integrity of the research process. Washington (DC): Natl Acad Pr. 224 p. 10.17226/1864
[NAS] National Academy of Sciences. 2017 Fostering integrity in research. Washington (DC): Natl
Acad Pr. 284 p. 10.17226/21896
Nature Editors. 2014 9 17 Why high-profile journals have more retractions. Nature (News). 10.1038/
nature.2014.15951
Nature Editors. 2018a China sets a strong example on how to address scientific fraud. Nature 558: 162
10.1038/d41586-018-05417-1
Nature Editors. 2018b Nature journals tighten rules on non-financial conflicts. Nature 554(6). 10.1038/
d41586-018-01420-8
Nature Medicine Editors. 2017 Take science off the stand. Nat Med 23(3): 265. 10.1038/nm.4303 [PubMed: 28267711]
Nelson B 2009 Data sharing: Empty archives. Nature 461: 160–163. 10.1038/461160a [PubMed:
19741679]
Nelson MP, Vucetich JA. 2009 On advocacy by environmental scientists: What, whether, why, and
Parker TH, Forstmeier W, Koricheva J, Fidler F, Hadfield JD, Chee YE, Kelly CD, Gurevitch J,
Nakagawa S. 2016 Transparency in ecology and evolution: Real problems, real solutions. Trends
Ecol Evol 31(9): 711–719. 10.1016/j.tree.2016.07.002 [PubMed: 27461041]
Paul AP, Garbarino JR, Olsen LD, Rosen MR, Mebane CA, Struzeski TM. 2016 Potential sources of
analytical bias and error in selected trace element data-quality analyses. Reston (VA): US
Geological Survey. Scientific Investigations Report 2016-5135 68 p. http://pubs.er.usgs.gov/
publication/sir20165135
Peterman RM, M’Gonigle M. 1992 Statistical power analysis and the precautionary principle. Mar
Pollut Bull 24(5): 231–234. 10.1016/0025-326X(92)90559-O
Pielke RA Jr. 2007 The honest broker. New York (NY): Cambridge Univ Pr. 188 p.
PLOS. 2014 Data availability. San Francisco (CA) [accessed 2017 Mar 2]. http://journals.plos.org/
plosone/s/data-availability
PLoS Medicine Editors. 2008 Making sense of non-financial competing interests. PLoS Med 5(9):
e199 10.1371/journal.pmed.0050199 [PubMed: 18828670]
Power M, Power G, Dixon DG. 1995 Detection and decision-making in environmental effects
monitoring. Environ Manage 19(5): 629–639. 10.1007/BF02471945
Raloff J 2010 5 6 Atrazine paper’s challenge: Who’s responsible for accuracy? Science News (Science
& the Public). https://www.sciencenews.org/blog/science-public/atrazine-paper%E2%80%99s-
challenge-who%E2%80%99s-responsible-accuracy
Rekdal OB. 2014 Academic urban legends. Soc Stud Sci 44(4): 638–654. 10.1177/0306312714535679
[PubMed: 25272616]
Renner R 2005 Proposed selenium standard under attack. Environ Sci Technol 39(6): 125A–126A.
10.1021/es053215n
Resnik DB, Neal T, Raymond A, Kissling GE. 2015 Research misconduct definitions adopted by U.S.
research institutions. Account Res 22(1): 14–21. 10.1080/08989621.2014.891943 [PubMed:
25275621]
Resnik DB, Shamoo AE. 2011 The Singapore statement on research integrity. Account Res 18(2): 71–
75. 10.1080/08989621.2011.557296 [PubMed: 21390871]
Robbins R 2017 1 10 A supplement maker tried to silence this Harvard doctor — and put academic freedom on trial. Boston (MA): STAT [accessed 2017 Mar 6]. https://www.statnews.com/2017/01/10/supplement-harvard-pieter-cohen/
Roberts PD, Stewart GB, Pullin AS. 2006 Are review articles a reliable source of evidence to support
conservation and environmental management? A comparison with medicine. Biol Conserv
132(4): 409–423. 10.1016/j.biocon.2006.04.034
Rohr JR, McCoy KA. 2010 Preserving environmental health and scientific credibility: A practical guide to reducing conflicts of interest. Conserv Lett 3(3): 143–150. 10.1111/j.1755-263X.2010.00114.x
Ross T 2017 1 28 A crime in the cancer lab. New York Times (Opinion). https://www.nytimes.com/2017/01/28/opinion/sunday/a-crime-in-the-cancer-lab.html?_r=1
Ruff K 2015 Scientific journals and conflict of interest disclosure: What progress has been made?
Environ Health 14(1): 1–8. 10.1186/s12940-015-0035-6 [PubMed: 25564290]
Russell M, Boulton G, Clarke P, Eyton D, Norton J. 2010 The independent climate change email
review. 160 p. http://www.cce-review.org/
Santore RC, Ryan AC, Kroglund F, Teien HC, Rodriguez PH, Stubblefield WA, Cardwell AS, Adams
WJ, Nordheim E. 2018 Development and application of a Biotic Ligand Model for predicting the
toxicity of dissolved and precipitated aluminum. Environ Toxicol Chem 37(1): 70–79. 10.1002/
etc.4020 [PubMed: 29080370]
Sass JB, Castleman B, Wallinga D. 2005 Vinyl chloride: A case study of data suppression and
misrepresentation. Environ Health Perspect 113(7): 809–812. [PubMed: 16002366]
Schäfer RB, Bundschuh M, Focks A, von der Ohe PC. 2013 To the Editor [authors should make all
their raw data accessible]. Environ Toxicol Chem 32(4): 734–735. 10.1002/etc.2140 [PubMed:
23508403]
Schiermeier Q 2012 7 9 Arsenic-loving bacterium needs phosphorus after all. Nature (News). 10.1038/nature.2012.10971
Schindler DW. 1998 Replication versus realism: The need for ecosystem-scale experiments.
Ecosystems 1(4): 323–334. 10.1007/s100219900026
Scott AP. 2012 Do mollusks use vertebrate sex steroids as reproductive hormones? Part I: Critical
appraisal of the evidence for the presence, biosynthesis and uptake of steroids. Steroids 77(13):
1450–1468. 10.1016/j.steroids.2012.08.009 [PubMed: 22960651]
Scott AP. 2018 Is there any value in measuring vertebrate steroids in invertebrates? Gen Comp
Endocrinol 265: 77–82. 10.1016/j.ygcen.2018.04.005 [PubMed: 29625121]
Scott JM, Rachlow JL. 2011 Refocusing the debate about advocacy. Conserv Biol 25(1): 1–3. 10.1111/
j.1523-1739.2010.01629.x [PubMed: 21251070]
Scott JM, Rachlow JL, Lackey RT, Pidgorna AB, Aycrigg JL, Feldman GR, Svancara LK, Rupp DA,
Stanish DI, Steinhorst RK. 2007 Policy advocacy in science: Prevalence, perspectives, and
implications for conservation biologists. Conserv Biol 21(1): 29–35. 10.1111/
j.1523-1739.2006.00641.x [PubMed: 17298508]
Sedlak D 2016 Crossing the imaginary line. Environ Sci Technol 50(18): 9803–9804. 10.1021/
acs.est.6b04432 [PubMed: 27588331]
Shaw D, Satalkar P. 2018 Researchers’ interpretations of research integrity: A qualitative study.
Account Res 1–15. 10.1080/08989621.2017.1413940 [PubMed: 29140730]
Shrader-Frechette K 2012 Research integrity and conflicts of interest: The case of unethical research-
Sprague LA, Oelsner GP, Argue DM. 2017 Challenges with secondary use of multi-source water-
quality data in the United States. Water Res 110: 252–261. 10.1016/j.watres.2016.12.024
[PubMed: 28027524]
Stedeford T 2007 Prior restraint and censorship: Acknowledged occupational hazards for government
scientists. William and Mary Environ Law Policy Rev 31(3): 725–745.
Stein R, Eilperin J. 2010 12 10 Obama administration issues guidelines designed to ensure “scientific
integrity.” Washington Post (Politics). http://www.washingtonpost.com/wp-dyn/content/article/
2010/12/17/AR2010121705774.html?noredirect=on
Steneck NH. 2006 Fostering integrity in research: Definitions, current knowledge, and future
directions. Sci Eng Ethics 12(1): 53–74. 10.1007/PL00022268 [PubMed: 16501647]
Stern V 2017 10 2 Updated: Why would a university pay a scientist found guilty of misconduct to
leave? Science (News). 10.1126/science.aan7182
Stokstad E 2012 7 24 Fracking report criticized for apparent conflict of interest. Science (News). http://
www.sciencemag.org/news/2012/07/fracking-report-criticized-apparent-conflict-interest
Sumner P, Vivian-Griffiths S, Boivin J, Williams A, Venetis CA, Davies A, Ogden J, Whelan L, Hughes B, Dalton B, Boy F, Chambers CD. 2014 The association between exaggeration in health related science news and academic press releases: Retrospective observational study. BMJ 349. 10.1136/bmj.g7015
Sumpter JP, Donnachie RL, Johnson AC. 2014 The apparently very variable potency of the anti-
depressant fluoxetine. Aquat Toxicol 151: 57–60. 10.1016/j.aquatox.2013.12.010 [PubMed:
24411166]
Sumpter JP, Scott AP, Katsiadaki I. 2016 Comments on Niemuth, N.J. and Klaper, R.D. 2015. Emerging wastewater contaminant metformin causes intersex and reduced fecundity in fish; Chemosphere 135, 38–45. Chemosphere 165: 566–569. 10.1016/j.chemosphere.2016.08.049
Suter GW II, Cormier SM. 2015a Bias in the development of health and ecological assessments and
potential solutions. Hum Ecol Risk Assess 22(1): 99–115. 10.1080/10807039.2015.1056062
Suter GW II, Cormier SM. 2015b The problem of biased data and potential solutions for health and
environmental assessments. Hum Ecol Risk Assess 21(7): 1736–1752.
10.1080/10807039.2014.974499
Suter GW II, Norton SB, Cormier SM. 2002 A method for inferring the causes of observed
impairments in aquatic ecosystems. Environ Toxicol Chem 21(6): 1101–1111. 10.1002/
etc.5620210602 [PubMed: 12069293]
Tollefson J 2015 Earth science wrestles with conflict-of-interest policies. Nature 522: 403–404.
10.1038/522403a [PubMed: 26108831]
Topf A 2016 7 10 Imperial Metals sues engineering companies over Mount Polley disaster.
Mining.com [accessed 2017 Feb 7]. http://www.mining.com/imperial-metals-sues-engineering-
companies-mount-polley-disaster/
US Department of Interior. 2014 Code of scientific and scholarly conduct [poster]. [accessed 2017 Feb
26]. https://www.doi.gov/scientificintegrity/upload/DOI-Code-of-Scientific-and-Scholarly-
Conduct-Poster-December-2014.pdf
[USEPA] US Environmental Protection Agency. 2013 Aquatic life ambient water quality criteria for
ammonia − Freshwater 2013. Washington (DC) EPA 822-R-13-001. 255 p. http://water.epa.gov/
scitech/swguidance/standards/criteria/aqlife/ammonia/index.cfm
Van Der Kraak GJ, Hosmer AJ, Hanson ML, Kloas W, Solomon KR. 2014 Effects of atrazine in fish,
amphibians, and reptiles: An analysis based on quantitative weight of evidence. Crit Rev Toxicol
44(sup5): 1–66. 10.3109/10408444.2014.967836
van Iersel S, Swart EM, Nakadera Y, van Straalen NM, Koene JM. 2014 Effect of male accessory
gland products on egg laying in gastropod molluscs. J Vis Exp 88: e51698 10.3791/51698
Van Kirk RW, Hill SL. 2007 Demographic model predicts trout population response to selenium based
on individual-level toxicity. Ecol Model 206(3–4): 407–420. 10.1016/j.ecolmodel.2007.04.003
van Kolfschooten F 2002 Conflicts of interest: Can you believe what you read? Nature 416(6879):
360–363. [PubMed: 11919595]
Van Noorden R 2014 2 24 Publishers withdraw more than 120 gibberish papers. Nature (News).
10.1038/nature.2014.14763
Vines TH, Albert AYK, Andrew RL, Débarre F, Bock DG, Franklin MT, Gilbert KJ, Moore J-S,
Renaut S, Rennison DJ. 2014 The availability of research data declines rapidly with article age.
Curr Biol 24(1): 94–97. [PubMed: 24361065]
Vosoughi S, Roy D, Aral S. 2018 The spread of true and false news online. Science 359(6380): 1146–
1151. 10.1126/science.aap9559 [PubMed: 29590045]
Wadman M 1997 $100m payout after drug data withheld. Nature 388(6644): 703.
Wagner WE. 2005 The perils of relying on interested parties to evaluate scientific quality. Am J Public Health 95(S1): S99–S106. 10.2105/AJPH.2004.044792 [PubMed: 16030346]
Wagner WE, Michaels D. 2004 Equal treatment for regulatory science: Extending the controls governing the quality of public research to private research. Am J Law Med 30(2–3): 119–154. 10.1177/009885880403000202 [PubMed: 15382750]
Weltje L, Sumpter JP. 2017 What makes a concentration environmentally relevant? Critique and a
proposal. Environ Sci Technol 51: 11520–11521. 10.1021/acs.est.7b04673 [PubMed: 28956595]
Whaley P, Halsall C, Ågerstrand M, Aiassa E, Benford D, Bilotta G, Coggon D, Collins C, Dempsey
C, Duarte-Davidson R et al. 2016 Implementing systematic review techniques in chemical risk
assessment: Challenges, opportunities and recommendations. Environ Int 92–93: 556–564.
10.1016/j.envint.2015.11.002
Wiens JA, Parker KR. 1995 Analyzing the effects of accidental environmental impacts: Approaches
and assumptions. Ecol Appl 5(4): 1069–1083. 10.2307/2269355
Wise J 1997 Research suppressed for seven years by drug company. BMJ 314(7088): 1145–1145.
10.1136/bmj.314.7088.1145 [PubMed: 9146378]
Womack RP. 2015 Research data in core journals in biology, chemistry, mathematics, and physics.
PLoS One 10(12): e0143460 10.1371/journal.pone.0143460 [PubMed: 26636676]
Young NS, Ioannidis JPA, Al-Ubaydli O. 2008 Why current publication practices may distort science.
PLoS Med 5(10): e201 10.1371/journal.pmed.0050201 [PubMed: 18844432]
Figure 1.
Conflicts of interest in science arise when secondary interests, such as financial gain or maintaining professional relationships, compromise the primary interest of upholding scientific norms such as the objective design, conduct, and interpretation of studies and the open sharing of scientific discoveries to advance our collective learning (Reprinted with permission. © Benita Epstein).
Figure 2.
Confirmation bias is the tendency to seek and interpret evidence in a way that confirms
preexisting beliefs and gives less consideration to alternative hypotheses (Reprinted with
permission. © Benita Epstein).
Figure 3.
Large environmental chemistry and toxicology laboratories that use standard methods to
produce results that may be submitted to regulatory agencies usually have a well-established
quality management structure. Quality management in academic research laboratories
focused on novel methods may be more ad hoc, especially if the research work force is
Figure 4.
The brief methods descriptions in journal articles are seldom detailed enough for others to reproduce the work. Step-by-step video documentation of experimental protocols can be published as video articles, uploaded to online repositories, or included as supplemental information.
Video protocols are underutilized in environmental toxicology (Credit: S Harris, sciencecartoonsplus.com).