
The Causes, Prevention and Management of Epidemiological Bias: A Workshop Approach

[Photo: Workshop participants at the International Society for Environmental Epidemiology | Richard Neutra]

By Raymond Richard Neutra, MD, Dr.PH, Retired Chief, Division of Environmental and Occupational Disease Control, California Department of Public Health

At the twenty-sixth annual meeting of the International Society for Environmental Epidemiology (ISEE) in Seattle, Washington, Professors Erik Lebret, Leah Ceccarelli, Bruce Lanphear, Irva Hertz-Picciotto, and I conducted a day-long workshop to try out experiential course materials aimed at raising young epidemiologists' awareness of the causes, prevention, and management of bias in epidemiological research and in its use in legal and political settings. Twenty-two academic, government, and consultant epidemiologists attended the session.

The results of environmental epidemiological research almost always have policy implications that are welcomed by some stakeholders and deemed inconvenient by others. When epidemiologists take sides in the arena of public or legal discourse, whether for ideological or financial reasons, there is a danger of straying from impartiality. ISEE has developed and published ethical guidelines related to this issue [1].

Greenland has advocated the reporting of financial conflicts of interest, not only as an ethical duty but also as a tool for adjusting for the documented general tendency of funded research to please its sponsors [2]. He has also advocated the development of objective criteria, such as transparency about possible sources of bias and balance in the consideration of alternative hypotheses, as procedures that would tend to minimize bias [3].

This workshop was sponsored by the ISEE Policy Committee as the result of several episodes in recent years in which epidemiologists consulting for industry failed to declare financial conflicts of interest and provided testimony that was judged by many to have been imbalanced.

The overall instructional objectives of an eventual course in which these materials would be included could be summarized as:

a) Students, given a case history, will be able to identify scientific players and societal stakeholders and list ideological, financial, political, and scientific-cultural factors that would tend to influence attitudes toward, and perception of, epidemiologically relevant facts in the case.

b) Students will be able to detect lack of balance and lack of transparency as well as unscientific rhetorical tricks in the various stages of scientific study relevant to the case.

c) Students will be able to list the techniques used by ideological and financial stakeholders in a given case that are aimed at influencing the conduct of science in such a way as to cast doubt or manufacture alarm.       

d) Students will be able to list possible approaches to countering unbalanced, non-transparent or rhetorical "scientific" arguments and for countering other techniques for manufacturing doubt or alarm.

The day proceeded as follows. We started with a lecture by University of Washington professor of rhetoric Leah Ceccarelli, based on her recent article [4]. She explained that all arguments involve making a claim about the state of the world (science) or about what one ought to do (law and politics). These claims rest on factual grounds and on general rules of inference that warrant the claim. Unbalanced, non-transparent bias can creep into any of these three parts of an argument: grounds, warrant, and claim. Professor Ceccarelli pointed out that the norms of argumentation in the courtroom differ from the norms of argumentation in science. A standard reference on legal ethics makes this clear:

"The goal of cross examination is to damage the credibility of the adverse witness even if that witness is telling the absolute truth" [5].

A similar attitude prevails in arguments in the court of public opinion. Professor Ceccarelli argued that when scientific claims about what is fail to support what legal contenders or societal stakeholders believe one ought to do, those stakeholders like to shift the criticism to the science. Sometimes it is better for scientists to refocus the discussion on the policy or legal argument hovering in the background. Scientists also make a mistake in public argumentation when they say that issues such as evolution, the viral cause of AIDS, or the human causes of global climate change have already been settled within the realm of scientific discourse, and then stigmatize dissident scientists who disagree with the consensus but are viewed by some stakeholders as their embattled champions. The public expects a "fair" re-argumentation of the science all over again. Attempts to resist this and stigmatize the minority opinion only cost mainstream scientists public support.

I gave a lecture entitled "Causes of Bias and an Example of a Transparent Process for Minimizing Bias." The causes were outlined (see below), followed by a description of the California Electric and Magnetic Field Program (www.ehib/emf). In that case, bias was minimized by agreeing ahead of time about:

1) What general rules of inference were to be used to warrant conclusions.

2) Criteria for entering facts into evidence when summarizing evidence.

3) Unbiased language to describe methodology, volume and quality of evidence.

4) Procedures for achieving balanced arguments.

5) How to express and justify a degree of certainty of causality.

6) How to deal with differing opinions.

7) Separating arguments about what is and what ought to be done in response to differing degrees of certainty.

8) How the results should be summarized for laypeople and scientists.

9) Creating a venue for stakeholder scrutiny and a rigorous public comment period for the risk evaluation guidelines and the risk evaluation itself.

Later in the day we led a structured discussion on the psychological drivers of bias and on the options available to lower the probability of individuals' bias in the generation and interpretation of scientific facts and societal-science arguments. What options are available when bias is suspected? What are their pros and cons? Examples of bias drivers that were discussed included:

Psychological and Sociological Drivers

a) the "publish or perish" pressure that favors the publication of a newsworthy fact or association.

b) the tendency to welcome the confirmation of one’s own previous publications or to resist findings that are contrary to one’s own findings.

c) the tendency to welcome the findings of friends or family and to doubt the findings of those we don't like or who are from another discipline.

d) the tendency to fear being viewed as insufficiently critical of a finding that goes against "common knowledge."

e) the tendency to accept findings of those more academically powerful than the investigator.

f) the tendency to be a contrarian and a critic of others' conclusions.

g) the tendency to favor interpretations that would imply policy actions that accord with one's deeply cherished values and world view and to doubt interpretations that imply policy actions that would violate one's deeply cherished values.

h) the universal tendency to be over-confident about things we know, as in the case of over-confidence in the "important scientist" who is hired by a stakeholder to pronounce on topics about which he/she is not a subject matter specialist.

i) "White Knight Syndrome," a tendency to exaggerate a possible danger with the desire to be viewed as a public hero.

j) the willingness to consciously cause bias to advance one's own interest to the detriment of the public good.

k) the willingness to have one's true but biased beliefs be sponsored and gain academic and public attention while they are used to advance a stakeholder's interest against the common good.

Political and Financial Sources of Bias

l) the tendency to favor interpretations that will please non-scientific stakeholders who have paid for or will be paying for your research.

m) the tendency to favor interpretations that will please non-scientific stakeholders who have paid you to interpret the evidence.

n) the tendency to favor interpretations that will increase the chance of receiving further research funding to follow up on what has been found.

o) the tendency to favor interpretations that will increase the chance of continued funding to your institution.

p) the tendency to favor interpretations that augment and continue the flow of consultation money or expert legal fees to you.

The participants were invited to discuss the pros and cons of a number of responses to bias and the pros and cons of involving other players in the detection and criticism of bias.

Another session was devoted to "Options for Preventing and Responding to Financial and Political Attempts to Systemically Produce Biased Results in Responsible Institutions." After a presentation by Professor Bruce Lanphear on cautionary examples, we held small group discussions of the pros and cons of various ways to prevent and respond to such efforts.

Some of the techniques of ideological or financial stakeholders to influence the scientific process for their benefit include:

a. Direct research funding to individual researchers by wealthy interested stakeholders or politically driven governments.

b. Funding to individual researchers to provide consultation.

c. Funding from stakeholders to support endowed professorial chairs.

d. Provision and rescinding of general university support by stakeholders.

e. Rewarding institutions that announce support only for certain lines of research.

f. Influencing whether research programs are designed to produce policy-relevant results as opposed to supporting diverse researcher interests.

g. Influencing institutions that decide what gets studied.

h. Influencing who does the study.

i. Framing or limiting the study's focus.

j. Influencing if and where something is published.

k. Influencing the content of abstracts and summaries.

l. Influencing content of press releases.

m. Influencing content of editorials.

n. Orchestrating letters to the editor.

o. Planning or conspiring to influence scientific debates or publications.

p. Intimidation through accusations of malpractice.

q. Intimidation by demanding time-consuming re-analysis.

r. Intimidation by legal deposition.

s. Intimidation by violence.

The attendees all agreed that the topic of psychological and systemic drivers of bias was not covered in most epidemiology courses and that the materials presented represented a promising beginning worthy of refinement. The working group and members of the ISEE governing council have agreed to find academic colleagues to try out these materials, supplemented by case histories.

References

[1] Kramer, S., Soskolne, C., Mustapha, B.A., and Al-Delaimy, W., Editorial, "Revised Ethics Guidelines for Environmental Epidemiologists," Environmental Health Perspectives, Vol. 108, No. 2, pp. A299-A301, August 2012.

[2] Greenland, S., "Accounting for Investigator Bias: Disclosure is Informative," Journal of Epidemiology and Community Health, Vol. 63, pp. 593-598, 2009.

[3] Greenland, S., "Transparency and Neutrality: Shared Values or Just Shared Words?" Journal of Epidemiology and Community Health, Vol. 66, pp. 967-970, 2012.

[4] Ceccarelli, L., "Manufacturing Scientific Controversy: Science, Rhetoric and Public Debate," Rhetoric & Public Affairs, Vol. 14, No. 2, pp. 195-228, 2011.

[5] Freedman, M.H. and Smith, A., Understanding Lawyers' Ethics, 4th Edition, LexisNexis, 2010, p. 213.


This article is part of the Summer 2014 issue of Professional Ethics Report (PER). PER, which has been in publication since 1988, reports on news and events, programs and activities, and resources related to professional ethics issues, with a particular focus on those professions whose members are engaged in scientific research and its applications.