Stephanie J. Bird is co-Editor of Science and Engineering Ethics and an independent consultant. She is a laboratory-trained neuroscientist and former Special Assistant to both the Provost and the Vice President for Research of the Massachusetts Institute of Technology (MIT), where she was responsible for developing educational programs on the responsible conduct of research, engineering ethics, and professional ethics more generally.
The general view has been that science education should emphasize scientific concepts and principles. To the extent that ethics education was given any thought within the science community, it was assumed that students would learn about responsible research conduct and other ethical concerns by observing good examples and through courses and education outside the core of science itself. It has become apparent that this approach is inadequate and serves neither the needs of the research community, nor those of society as a whole.
Presently, the focus of US ethics education in science and engineering tends to be on the individual and the responsible conduct of research (Kline 2013), or microethics [i]. In Europe, ethics education in science and engineering is grounded firmly in the concept of social responsibilities of scientists and engineers (Zandvoort et al. 2013; Bird et al. 2013), or macroethics [ii]. The US focus on microethics rather than macroethics has led to some criticism (Kline 2013; Zandvoort et al. 2013). However, this difference is not only understandable, but appropriate because it reflects the different circumstances that have led to current efforts to introduce and integrate an awareness and examination of ethical values and conflicts into the education of scientists and engineers. Neither of these two approaches is sufficient on its own, but together they are complementary, and each makes important contributions to the education of scientists and engineers in a global and technological society.
Scientific Research: Misconduct and the Responsible Conduct of Research
Members of the scientific community rely on their colleagues to produce accurate, reproducible research that can serve as a solid foundation upon which other researchers can build. In general, scientists are intensely curious about the nature of the universe. They expect that their work will contribute to "a common fund of knowledge" and that, in the aggregate, this knowledge will make the world a better place. Research in basic science investigates topics like how nerve cells communicate with each other and what a memory is. Research in applied science builds on basic science, for example, investigating how the memory process might be enhanced. While basic science and applied science are different, like most things, the difference is most apparent from a distance and much less distinct at the interface.
In the 1980s, the highly publicized examples of research misconduct were of fabrication, falsification, plagiarism, sexual misconduct/harassment of graduate students by US investigators doing research outside of the US, and similar serious deviations from accepted, and acceptable, research practice. Although not uniform in their views, members of the scientific community generally recognized that research misconduct represented a significant internal threat to the research enterprise because of the real possibility that it could undermine not only public trust, but also confidence in the research process within the community itself (National Academy of Sciences 1992).
It is readily apparent that egregious misconduct undermines the fundamental assumptions that one can rely on the work of colleagues and build on it, and that one will receive fair credit for one's contribution to a project. These microethical issues can be addressed within the everyday context of research: through education, through explicit policies developed by laboratory and department heads, university administrators, and disciplinary societies, and in the relationships between individuals. The elements of the responsible conduct of research are part of the research environment with which researchers are (or become) familiar and over which they have some control.
On the other hand, macroethical issues arise out of the use and potential misuse and abuse of research findings. These include their use in the development or support of public policy, and in the design and diffusion of technologies. Generally, the uses of scientific research are determined not by the researchers themselves but by employers, commercial private sector entities, government agencies, including the military, healthcare workers, the media, other members of the public, or in any case, individuals or groups that are only indirectly related, or even completely unrelated, to the researchers. Basic science researchers have little, if any, control over the uses or misuse of their research. The general perspective within and even beyond the research community has long been that it is the user, not the researchers, who should be held responsible for how research findings are used, a view that remains widely held (Kline 2013). But, again, not all research is the same. As a college student a few decades ago, I remember a philosophy professor emphatically explaining that it was the military and the politicians, not Robert Oppenheimer and his fellow scientists, who should be held responsible for the death and destruction wrought by the atomic bomb. Yet it is one thing to investigate the secrets of the atom, which may lead to applications beyond the wildest imaginings of the researcher; it is another to work specifically to apply those findings to develop a bomb with only one obvious use. The nature of the connection between research and its product is an important and substantial difference between basic and applied research.
It is not surprising that research ethics in the US emphasizes responsible conduct of research both because the elements of research conduct are immediately relevant to the day-to-day research environment, and because they are within the sphere of influence of the researcher. A fundamental tenet of adult education is that it needs to start where people are. The potential applications of much of basic science are beyond the ken of the average researcher, never mind the possible economic, legal, cultural and ethical implications of those applications.
However necessary the microethics approach, it is not sufficient. The research community is a part of, not apart from, the larger society. Like other professionals, scientists contribute to society through their work in a manner that reflects their interests, talents and expertise. Some benefits and privileges accompany their professional role, as well as some responsibilities. The social responsibilities of researchers arise not simply because research is funded (directly or indirectly) by the public. Research is carried out in the name of society as an expression and reflection of the society's needs, interests, priorities and expected impacts. As with anyone claiming to act in the name or interests of society, there is a largely unwritten, unexpressed contract. While researchers are compensated financially, as well as with intellectual rewards and social status, society expects more than a high quality product. This expectation is expressed to some degree in the "broader impacts" criterion for evaluation and funding of National Science Foundation grant proposals, the inclusion of "significance" as a criterion for evaluating National Institutes of Health applications, and in the various formulations of the America COMPETES Act (2007).
Social responsibility has been identified as the responsibility embodied in the paramountcy principle, the fundamental and primary ethical principle of engineering included in the professional engineers' code of ethics: "Engineers, in the fulfillment of their professional duties shall hold paramount the safety, health and welfare of the public" (NSPE 2003). The social responsibility of scientists requires that they also attend to the foreseeable societal impacts of their work, particularly as these impacts affect the safety, health or welfare of the society. In part that responsibility flows from privileged status. For example, researchers are allowed to carry out experiments as they deem appropriate with relatively little oversight. An exception is research involving research subjects, whether laboratory animals or humans, whose humane treatment is a responsibility that goes with the privilege and is explicitly expected under the rubric of the responsible conduct of research, as well as spelled out in regulations that codify the principles of bioethics. But the social responsibilities of researchers extend beyond upholding the ethical standards of society. The Uppsala Code of Ethics for Scientists highlights the responsibility of scientists to refrain from, and speak out against, weapons research and other scientific research with the potential for detrimental consequences for the environment, and for present and future generations (Gustafsson et al. 1984). Furthermore, researchers' special knowledge that comes from their work, education and expertise enables them to understand the limits of the science and when its application (e.g., in the development or support of public policy) is a misuse or even abuse of the science. Researchers have a responsibility to oppose the misuse of their work.
Moreover, because of their special knowledge researchers are in a position to contribute substantially to public understanding of science and technology, and thereby to a democratic society, by promoting an informed citizenry. It seems plausible that these larger notions of responsibility underlie the relatively recent addition of discussions of "the scientist as a responsible member of society, contemporary ethical issues in biomedical research, and the environmental and societal impacts of scientific research" (NIH 2009) as appropriate elements of education in the responsible conduct of research.
The European Perspective
While research scandals have been receiving increasing attention in the US, in Europe there has been growing awareness and concern regarding the intended and unintended environmental and societal impacts of technological and scientific advances (of which the Uppsala Code is an example). The European macroethical approach to science ethics education arises from a full-throated declaration of the goals and role of higher education in society. In the last 10 years, as part of an effort to harmonize educational requirements at institutions of higher learning across Europe, an overarching educational framework has been adopted that highlights the widespread and strongly-held European view of social responsibility (Bologna Process 2005). The framework of qualifications for the European Higher Education Area (EHEA) includes the expectation that all graduates, including those in science and engineering, "have the ability to gather and interpret relevant data (... within their field of study) to inform judgments that include reflection on relevant social, scientific or ethical issues" (at the bachelor's level) and "have the ability to integrate knowledge... and formulate judgments ... that include reflecting on social and ethical responsibilities linked to the application of their knowledge and judgments" (at the master's level).
Consistent with the idea that part of the social responsibility of researchers is to contribute to the development of an informed citizenry, the EHEA framework indicates that all graduates are expected to be able to "communicate information, ideas, problems and solutions to both specialist and non-specialist audiences" (at the bachelor's level), "communicate their conclusions, and the knowledge and rationale underpinning these [conclusions], to specialist and non-specialist audiences clearly and unambiguously" (at the master's level), and "communicate with their peers ... and society in general about their areas of expertise" (at the doctoral level).
Ethics Education in Science: Recommendations
Educational programs in science ethics in Europe and the US approach the topic from different directions for understandable reasons. In Europe, the macroethical approach is primary. However, providing "the big picture" is not enough. Trainees need to understand what is expected of them by peers in their field and what they themselves can expect as members of the global research community. At a minimum, this is responsible research conduct: the research integrity that all researchers around the globe rely on and require of their colleagues. It is noteworthy that the EHEA framework expects that graduates at the doctoral level "have demonstrated the ability to conceive, design, [and] implement... research with scholarly integrity." This is an important first step but more is needed, including in-depth discussions of the standards of the research community and responsible research conduct, potential for miscommunication, reasonable and unreasonable expectations, the potential for self-deception, and long-held, unconscious and possibly invalid assumptions or bias.
The microethical approach in the US is an essential beginning because it fulfills the fundamental educational criterion of starting where people are, in their own environment, acknowledging and validating the foundational ethical principles that underlie the research process. But it is insufficient because it does not adequately recognize the larger societal context of which research (and engineering) are a part. James Rest (1986, 1988), Muriel Bebeau (1991) and others have shown that moral development can and, with attention and nurturing, does continue throughout formal education as individuals learn about and appreciate their personal and professional role in society as members, professionals, contributors, and citizens. Yet, as employers, students and their families have called for an increasingly specialized education to meet the needs of the modern, technological world, US higher education has moved toward specialization at the expense of "breadth requirements" - that is, the very courses outside the core of science where students might learn about responsible research conduct and other ethical concerns. These are the elements of higher education, analogous to the multidisciplinary/cross-disciplinary/interdisciplinary components of ethics education in science and engineering in many European programs, that equip specialists to see and understand the societal context of their work. At the same time, while it is clear that science and engineering curricula should include stand-alone courses examining the ethical, legal and social policy implications of science and technology, these issues, along with responsible research conduct, must also be integrated into core courses in the form of modules, problem sets, exam questions and as elements of graduate theses.
Social responsibility and responsible research conduct are the two essential sides of ethical science. Both are necessary for an adequate education in science and engineering.
[i] In the US, it is not entirely true that science and engineering ethics education deals purely with microethical issues since programs and courses in science, technology and society (STS) are available at many universities and have been for decades, but they are not usually required for students majoring in science and engineering.
[ii] John Ladd (1980) is the source of this useful nomenclature that has been expanded and enhanced by Joseph Herkert (2005). It should be noted that macroethics includes not only collective professional responsibility, but also the decisions made by society about technology.
America COMPETES Act of 2007 (2007). https://www.govtrack.us/congress/bills/110/hr2272#summary (see Sections 7008 and 7009 [Accessed 30 June 2014]).
Bebeau, Muriel (1991) Can Ethics be Taught? A Look at the Evidence. Journal of the American College of Dentists 58 (1): 100-115.
Bird, Stephanie J., Zandvoort, Henk, Børsen, Tom & Deneke, Michael (2013). European Perspectives on Teaching Social Responsibility in Science and Engineering. Science and Engineering Ethics 19 (4): 1413-1594.
Bologna Process (2005). Bologna Qualifications Framework http://www.nqai.ie/documents/bolognasummary.pdf (Accessed 27 June 2014) and http://www.ond.vlaanderen.be/hogeronderwijs/bologna/qf/overarching.asp (Accessed 27 June 2014).
Gustafsson, Bengt, Wallensteen, Peter, Ryden, Lars, and Tibell, Gunnar (1984). The Uppsala Code of Ethics for Scientists. Journal of Peace Research 21 (4): 311-316. (see also http://www.codex.uu.se/en/texts/Uppsala%20codex.pdf [Accessed 27 June 2014]).
Herkert, Joseph R. (2005). Ways of thinking about and teaching ethical problem solving: Microethics and macroethics in engineering. Science and Engineering Ethics 11(3): 373–385.
Kline, Ronald (2013). Teaching Social Responsibility for the Conduct of Research, IEEE Technology and Society Magazine Summer 2013, 52-58.
Ladd, John (1980). The quest for a code of professional ethics: An intellectual and moral confusion. In Rosemary Chalk, Mark S. Frankel and S.B. Chafer (eds) AAAS Professional Ethics Project: Professional Ethics Activities in the Scientific and Engineering Societies. AAAS, Washington DC, pp. 154-159.
National Academy of Sciences (NAS) (1992). Responsible Science: Ensuring the Integrity of the Research Process, Vol. I. Washington, DC: National Academies Press.
National Institutes of Health (NIH) (2009). "Update on the Requirement for Instruction in the Responsible Conduct of Research." NIH Guide for Grants and Contracts November 24. http://grants.nih.gov/grants/guide/notice-files/NOT-OD-10-019.html [Accessed 27 June 2014].
National Society of Professional Engineers (2003). Code of Ethics for Engineers. http://www.mtengineers.org/pd/NSPECodeofEthics.pdf (Accessed 27 June 2014).
Rest, James R. (1986). Moral Development in Young Adults, in R.A. Mines and K.S. Kitchener, (eds), Adult Cognitive Development. Praeger, New York.
Rest, James R. (1988). Can Ethics Be Taught in Professional Schools? The Psychological Research. Easier Said Than Done, pp. 22-26.
Zandvoort, Henk, Børsen, Tom, Deneke, Michael & Bird, Stephanie J. (2013). Editors’ Overview - Perspectives on Teaching Social Responsibility to Students in Science and Engineering. Science and Engineering Ethics 19 (4): 1413-1438.
This article is part of the Spring 2014 issue of Professional Ethics Report (PER). PER, which has been in publication since 1988, reports on news and events, programs and activities, and resources related to professional ethics issues, with a particular focus on those professions whose members are engaged in scientific research and its applications.