Fostering a Culture of Scientific Integrity: Legalistic vs. Scientific Virtue-Based Approaches

Robert T. Pennock is Professor of History and Philosophy of Science at Michigan State University, and a Visiting Scholar at the American Association for the Advancement of Science through the end of June 2015.

The research ethics community has come to a consensus that promoting responsible conduct of research (RCR) cannot be done on a piecemeal basis, but will require the cultivation of an ethical scientific culture (e.g., Gunsalus 1993, Atlas 2009).  One highly cited paper puts it this way: “[A]ll explanations [of research misconduct] seem to share a common denominator—the failure to foster a culture of integrity” (Titus et al. 2008, 981-982).  Focusing on culture is critical, but ethics and culture interact in complex ways, so fostering an ethical culture is not always straightforward.  Science, as C. P. Snow emphasized (1959), has its own distinctive culture and thus its own ways of expressing integrity.  Unfortunately, RCR is often framed in ways that are insensitive to how ethical norms are embodied and transmitted culturally in general, let alone in scientific culture.

Whereas deeply rooted cultural norms organically structure a society or a practice from within, RCR literature and training too often theorize and present research ethics in terms of quasi-legalistic external control.  I suggest an alternative that is explicitly centered instead on internal norms, specifically on scientific character virtues that embody both epistemic and ethical values.  The Scientific Virtues Project has been developing theory and curricula along these lines, running courses, and holding RCR training workshops based on this approach.  Recently, it has been conducting a national survey of scientists to better understand the place of these values in scientific culture.  We shall have a better chance of fostering a culture of integrity if we broaden and reframe research ethics and science education in light of this perspective.  Although there is not space here to lay out this scientific virtue-based approach in detail, it may be illustrated by way of contrast to the legalistic approach.  The paper quoted above that calls for fostering a culture of integrity will serve as a representative example of the latter.  It summarizes the issues in this way:

No regulatory office can hope to catch all research misconduct and we think that the primary deterrent must be at the institutional level. Institutions must establish the culture that promotes the safeguards for whistleblowers and establishes zero tolerance both for those who commit misconduct and for those who turn a blind eye to it.  (Titus et al. 2008, 980)

Such sentences bristle with regulatory and legal terminology.  The paper’s recommendations for fostering an ethical culture in research are put in the same external, legalistic terms: institute “zero tolerance,” whistleblower protections, a clear reporting system, mentor training (specifically so mentors are “more aware of their roles in establishing and maintaining research rules and minimizing opportunities to commit research misconduct”), and alternative oversight mechanisms beyond formal complaints (e.g., institutional auditing of research records).  Even the final recommendation to model ethical behavior is formulated in like manner and focuses mostly on “policies,” “procedures,” and “deterrents” (Titus et al. 2008, 982).  This is not the development of an ethical culture but of an enforcement culture.

As its name indicates, RCR focuses on behavior—how scientists should conduct their work.  Conduct in the RCR literature is typically couched in terms of rule following and rule breaking.  Laws are not the only kind of rules, of course, but because the field arose in response to egregious behavior (Steneck 1994; Steneck & Bulger 2007), it is not surprising that RCR rules were originally theorized and are still largely framed in legalistic terms.  Putting it bluntly, RCR as currently taught is focused not so much on conduct as on misconduct.

A legal framework may be necessary as a way for institutions to deal with misconduct, but it is not the most effective way to foster a culture of integrity.  It is not that rules of conduct are problematic in and of themselves, but in understanding cultural dynamics one must take into account that rules seen as imposed from without are viewed very differently from those that are part of a culture.  This is one reason why scientists sometimes see RCR regulations as interfering with science rather than furthering its aims.

Furthermore, a legalistic approach that focuses on misconduct misses an important feature of culture: culture goes beyond behaviors to include attitudes.  Culture is essentially normative, involving all sorts of values and ideals, including ideals of character.  Put another way, culture involves not only what kinds of behavior I should or shouldn’t engage in, but also what kind of person I should or shouldn’t be.  Thinking in terms of scientific virtues allows one to analyze and promote such values in the culture of science.

Understanding the character traits that make for an exemplary scientist gives one a better understanding of the actions that follow from them.  This is directly related to the notion of research itself.

When one speaks of responsible conduct of research, the tacit assumption is that we are dealing with scientific research, which is characterized by its distinctive aims and methods.  A scientific virtue-based approach begins here.  Aristotle explained how virtues arise in relation to the telos, or end, of a practice: they are those settled dispositions that are conducive to the achievement of excellence in that practice.  The central aim of scientific practice is the discovery of empirical truths about the natural world, and the methods of science reflect its basic epistemic values, such as testability and repeatability.  Scientific virtues are thus those character traits—curiosity and honesty being the most central, together with related virtues of attentiveness, objectivity, skepticism, meticulousness, and some others—that a scientist should try to embody for science to flourish (Pennock 2006).

The final key term in RCR is responsible.  Typically, responsible conduct of research is treated in this context simply as a synonym for ethical conduct of research, but it is worth considering what is implied specifically by the notion of responsibility.  The first question one asks in this regard is “responsible for what?”  Appropriate answers to this question involve an enumeration of one’s duties.  As previously noted, duty in science is not limited to compliance with laws and rules.  A second question is “responsible to whom, or to what?”  This is the more fundamental question, as duties are derivative of it.  I argue that the basic responsibility of the scientist is to science itself, in part because science is based on evidence rather than authority.  The scientist is not responsible to a scientific leader or to any particular person, but rather is responsible first to the values that structure science as a practice and then to humanity as a whole, since all practices ultimately aim at human flourishing.

What this means is that scientific integrity is more than research integrity.  Integrity involves the notion of a unified whole whose parts function together by virtue of the strength of its supporting structure.  Scientists are researchers at base, but they are not only that.  They are also colleagues and mentors.  They interact with actors in other professions and other walks of life.  They are citizens and human beings.  Thus we need to broaden the scope of research ethics accordingly, for there is more to science than just the conduct of research.

As a way to speak about this, my own tendency is to retain the traditional sense of RCR with its focus on research integrity and think of that as one core part of a broader category of science ethics, which should be seen as also encompassing the scientific virtues and other topics that may be linked to but are not directly a part of basic research.  But one does not need to legislate terminology; research ethics is already a rather broad term.  As Pimple points out, it may even be said to be an “incoherent” field, with subject matter that encompasses “ageless moral truths and recent arbitrary conventions; minute details of particular actions and the broad sweep of public policy; life-and-death issues and matters just the other side of simple etiquette” (Pimple 2002, 198).  Whether we adopt a new term or further expand the scope of the old one, my point is just that we need a broader notion that incorporates this wider perspective and that explicitly includes the character of the scientist.

One advantage of the scientific virtue approach is that it provides a way to systematize some of these disparate aspects of the subject matter.  A scientific virtue approach can be helpful in analyzing traditional issues in RCR such as the just attribution of authorship (Pennock 1996), socially controversial subjects such as human cloning (Pennock 2001), responsible research funding and conflict of interest (Pennock 2002), and general issues such as the responsibility to defend the integrity of scientific methods (Pennock 2006).  It also helps highlight other professional responsibilities that deserve greater attention, including peer review, dissemination, professional development, mentoring, and education.  It helps make sense of interests and conflicts of interest.  It can even help put issues of scientists’ social responsibility (which also goes beyond the traditional legalistic framework) in a new light, as such issues involve relationships between scientific and broader human values.  These and other aspects of the scientific virtue approach deserve further attention, but here my purpose is just to highlight its general utility for developing a culture of integrity.

The scientific virtue approach does not reject the importance of rules, or even of law, as a means of supplementing self-regulation.  Again, the problem is not with rules and laws per se, but rather with whether they are imposed from without or arise as an expression of intrinsic values from within the culture.  The Scientific Virtues Project is making the case that science has an inherent moral structure and that the scientific virtues are a promising organizing principle for reconceiving and expanding science education and RCR.  To foster a culture of scientific integrity, taking seriously the values already inherent in scientific culture is a good place to begin.

References

Atlas, Ronald. 2009. “Responsible conduct by life scientists in an age of terrorism,” Science and Engineering Ethics, 15(3): 292-301.
Gunsalus, C. K. 1993. “Institutional structure to ensure research integrity,” Academic Medicine, 68(9): S33-S38.
Pennock, R. T. 1996. “Inappropriate Authorship in Collaborative Scientific Research,” Public Affairs Quarterly, 10(4): 379-393.
Pennock, R. T. 2001. “The Virtuous Scientist Meets the Human Clone.” In New Ethical Challenges in Science and Technology, Sigma Xi Forum 2000 Proceedings, 117-124.
Pennock, R. T. 2002. “Research Funding and the Virtue of Scientific Objectivity,” Academic Integrity, 5(2): 3-6.
Pennock, R. T. 2006. “Scientific Integrity and Science Museums,” Museums and Social Issues, 1(1): 7-18.
Pimple, Kenneth. 2002. “Six domains of research ethics: A heuristic framework for the responsible conduct of research,” Science and Engineering Ethics, 8(2): 191-205.
Snow, C. P. 1959. The Two Cultures and the Scientific Revolution. Cambridge University Press.
Steneck, Nicholas H. 1994. “Research Universities and Scientific Misconduct: History, Policies, and the Future,” The Journal of Higher Education, 65(3): 310-330.
Steneck, Nicholas H., and Ruth Ellen Bulger. 2007. “The History, Purpose, and Future of Instruction in the Responsible Conduct of Research,” Academic Medicine, 82(9): 829-834.
Titus, Sandra L., James A. Wells, and Lawrence J. Rhoades. 2008. “Repairing Research Integrity,” Nature, 453(7198): 980-982.

Acknowledgments

This material is based in part upon work supported by the National Science Foundation under Cooperative Agreement No. DBI-0939454 and by the John Templeton Foundation.  Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation or the John Templeton Foundation.  Thanks to Mark Frankel for helpful comments on an earlier draft of this paper.

Cite as: Pennock, R. T. July 2, 2015. “Fostering a Culture of Scientific Integrity: Legalistic vs. Scientific Virtue-Based Approaches.” Professional Ethics Report, 28(2): 1-3. DOI: 10.1126/srhrl.acr8257