
Professional Ethics Report: Volume XXIV, Number 1, Winter 2011

Professional Ethics Report (PER), which has been in publication since 1988, reports on news and events, programs and activities, and resources related to professional ethics issues, with a particular focus on those professions whose members are engaged in scientific research and its applications.

Each quarterly issue comprises a cover story addressing a particular issue or event, sometimes written by an expert outside AAAS; a series of timely “In the News” stories; brief updates from the societies; and useful resources and announcements.

PER was first published on the web in the Spring of 1995. Archives of Professional Ethics Report are available here.


Contributing Staff

  • Mark S. Frankel, Editor and contributing author
  • Rebecca Carlson, Deputy Editor and contributing author
  • Rebecca Friedman, Contributing author
  • Brent Hagen, Contributing author
  • Stephen Uyeno, Contributing author

To Subscribe: To receive electronic versions of PER, please fill out the subscription form.

This newsletter may be reproduced without permission as long as proper acknowledgement is given. ISSN: 1045-8808

Cover Story

Reshaping Responsible Conduct of Research Education

Rebecca Carlson and Mark S. Frankel

Rebecca Carlson is the Program Assistant for the AAAS Program on Scientific Freedom, Responsibility and Law, and is the Deputy Editor of PER.

Mark S. Frankel is the Director of the AAAS Programs on Scientific Freedom, Responsibility and Law and on Science and Human Rights. He is also the Editor of PER.

The climate of scientific research is shifting. It is becoming increasingly global and ever more relevant to public policy. As a result, the professional practices of scientists are coming under greater public scrutiny, with heightened expectations for accountability, integrity in research, and access to information. This emerging environment places new demands on current responsible conduct of research (RCR) education. As professional research practices evolve, so must the educational options offered to scientists and their students.

Scientists bear two fundamental types of responsibilities. One focuses on the internal workings of science and the responsibility to uphold community standards for doing science. The other is outward directed, focusing on scientists’ social responsibilities to the larger community, which experiences the risks, costs, and benefits of science. Traditional RCR education has concentrated on the first set of responsibilities, those internal to science. Typically, it covers nine instructional areas: (1) Data Acquisition, Management, Sharing and Ownership; (2) Conflict of Interest and Commitment; (3) Human Subjects; (4) Animal Welfare; (5) Research Misconduct; (6) Publication Practices and Responsible Authorship; (7) Mentor/Trainee Responsibilities; (8) Peer Review; and (9) Collaborative Science [1]. It is virtually silent, however, on the social responsibilities of scientists. As one of the leading commentators on ethics in science and engineering has observed, “Currently, attention to professional responsibility in science and engineering research concentrates more on issues that arise in the conduct of science than in its social influence” [2]. Scientists must be prepared to consider the responsibilities associated with, for example, emerging technologies, public pressure for better communication of and access to scientific work, their role in public policy deliberations, and the relationship between science and human rights.

Emerging Technologies

Society most often experiences the benefits of science in the form of new technologies; it experiences the risks and costs of science in the same way. As novel technologies are developed and introduced into society, they could pose serious dangers, either inadvertently or deliberately. This has been the case in recent years for life sciences research [3], which holds great promise for increasing our understanding of basic biological processes in ways that could improve public health. Such research, however, is not without potentially serious risks. The National Science Advisory Board for Biosecurity (NSABB) is tasked with providing “advice, guidance, and leadership regarding biosecurity oversight of dual use research, defined as biological research with legitimate scientific purpose that may be misused to pose a biologic threat to public health and/or national security” [4]. As novel organisms or devices are created, the possibility arises that they could accidentally leak into the environment or be put to nefarious purposes, such as the weaponization of pathogens. It is therefore critical that researchers understand the potential dual use implications of their work. More broadly, all technologies come with potential risks, and in most cases questions will surface about how accessible new technologies are to various population groups. Scientists need appropriate educational opportunities to consider what ethical responsibilities they have, including how their research might be made available to and used by others.

Science Communication

Science communication is critical for responding to rising public expectations of accountability and transparency in science, and for maintaining public trust, especially when federal funding is involved. Communicating science is not straightforward, however; it has both practical and normative dimensions. “Policy can be severely subverted by science badly presented to the media, sometimes initiated by scientists themselves seeking to exaggerate the significance of their research” [5]. Communicating well is a learned skill, as is understanding and preparing for the ethical dimensions of communicating science to the public, including policy makers. Those ethical issues include: (1) determining the level of inference that is warranted by one’s data; (2) deciding how much to communicate about the confidence in one’s findings as well as the uncertainties associated with them; and (3) balancing the requirements of peer review against growing public demands for access to potentially breakthrough research results and the ubiquity of social networking technologies. A lack of communication can raise its own ethical conundrum; it has been linked to the public’s inflated assessment of the risks of, and concern about, synthetic biology research [6].

Responsible Advocacy

Scientists and engineers have much to contribute to public policy debates. However, what role, if any, they should play as advocates for specific policies is a matter of debate, both inside the scientific community and in society more generally. Advocacy fits centrally into notions of the public accountability of scientists, concerns about research ethics, and scientists’ social responsibilities; as such, it overlaps both types of responsibilities incurred by scientists. Advocacy in the policy arena can raise questions about the integrity of the research as well as of the researcher. For example, does advocacy detract from the objectivity and dispassion typically expected of scientists? When do scientists cross the line from being an independent source of valued information to designing or using their research to support a preconceived policy preference? How can a scientist advocate responsibly? There is growing public concern about the impartiality of scientists working in highly politicized fields, as reflected in the Climategate scandal and in the clash between science and politics surrounding human embryonic stem cell research. Yet there is a gap in the training of scientists on what exactly constitutes advocacy, let alone how to do it responsibly.

Human Rights

Science is an increasingly global enterprise, and scientists have responsibilities that cross national boundaries. One international agreement that speaks to these responsibilities is the International Covenant on Economic, Social and Cultural Rights (ICESCR), whose Article 15 includes the human right to “enjoy the benefits of scientific progress and its applications” [7]. The elements of this right remain to be determined, and scientists can play an important role in defining its provisions and contributing to its global realization. Scientists should also be aware of the human rights implications of their work. A scientist working on drug development, for example, should consider how the results of her work will reach historically underserved populations, such as those in developing countries [8]. There is yet another connection between human rights and the responsibilities of scientists and engineers: the possible use of their research to promote human rights. For example, the Global Alliance for Clean Cookstoves aims to provide healthy and efficient cooking solutions throughout the world using relatively simple technology [9]. Another example is the geospatial technologies project at AAAS, which employs satellite imagery and global positioning systems to collect and analyze data that can be used to document human rights violations [10]. To our knowledge, nothing in current RCR education touches on such important matters.


The evolution of professional practices, and our broadening understanding of the social and ethical responsibilities of scientists, require us to assess responsible conduct of research education and to call attention to the ways in which the curriculum can be reshaped to better prepare scientists for the future. The topics we discuss here are merely examples; we encourage others to add to the list. It is imperative that we give both veteran scientists and students the tools and skills necessary to navigate situations that raise ethical questions, not only in the lab, but also in the increasingly public and globalized world of scientific discovery.


[1] U.S. Department of Health and Human Services, Office of Research Integrity. Responsible Conduct of Research

[2] Rachelle Hollander, “Professional Responsibilities in Scientific and Engineering Research.” In Science, Technology, and Society: An Encyclopedia, edited by Sal Restivo, 414-420. New York: Oxford University Press, 2005.

[3] National Institutes of Health, Office of Biotechnology Activities. Dual Use Research

[4] Charter for the National Science Advisory Board for Biosecurity. March 10, 2010.

[5] Leonard Fleck, in Wrestling with Behavioral Genetics: Science, Ethics, and Public Conversation, edited by Erik Parens, Audrey R. Chapman, and Nancy Press. Baltimore: Johns Hopkins University Press, 2006.

[6] David Rejeski, “Synthetic Biology, the Public and the Media.” Presentation for the Presidential Commission for the Study of Bioethical Issues. July 9, 2010.

[7] International Covenant on Economic, Social and Cultural Rights (ICESCR), Article 15.

[8] For more information, see “Working Towards Global Access to Medicine and Health Technology,” Professional Ethics Report, Vol. 23, No. 3, Summer 2010.

[9] Global Alliance for Clean Cookstoves

[10] American Association for the Advancement of Science, Science and Policy Programs. Geospatial Technologies and Human Rights.


In the News

Recommendations of Guiding Principles for Regulation of Emerging Technologies

On March 11, 2011, John P. Holdren, Director of the Office of Science and Technology Policy, along with Cass R. Sunstein of the Office of Management and Budget and Islam A. Siddiqui of the Office of the United States Trade Representative, issued a Memorandum for the Heads of Executive Departments and Agencies. The memorandum stresses the need for guiding principles for regulation that will assure “appropriate and balanced oversight” of emerging technologies, which it identifies as fields such as nanotechnology, synthetic biology, and genetic engineering, along with related sciences. The White House Emerging Technologies Interagency Policy Coordination Committee (ETIPC) outlined several principles to be considered when implementing regulatory policies for these technologies, so that regulation and oversight protect safety, health, and the environment while sustaining innovation, the development of new technologies, and trade in those technologies.

The principles were developed in the context of Executive Order 13563 of January 18, 2011 [1], which directed agencies to create preliminary regulatory plans within four months. The Executive Order promotes a regulatory system that fosters economic growth through innovation, competitiveness, and job creation. Each agency will be responsible for proposing regulations that impose only a small burden on society while maximizing the net benefits of regulating the technology. Agencies were encouraged to invite public participation, to build innovation and flexibility into their regulations, and to use alternatives to regulation such as economic incentives.

The March 11 memorandum identifies five key principles for developing oversight of emerging technologies:

  • Agencies must base regulations on the most accurate and current scientific evidence available, and new information should be taken into account when it becomes available.
  • The public, especially stakeholders or those affected by the regulations, should have access to and be able to comment on the information. This excludes information involving national security or confidential business matters.
  • Regulatory agencies should clearly communicate the risks and benefits of the new technologies. This includes economic, environmental, or public health risks and benefits.
  • The regulations should allow for flexibility to take into account new evidence and the fluid state and applications of these emerging technologies.
  • The regulations should follow the principles outlined in EO 13563, such as using the best scientific information available, promoting innovative ideas for achieving regulatory goals, considering not regulating technologies if there are no oversight issues, and ensuring that the regulations are performance-based.

A copy of the memorandum can be viewed here:




Dispute Over Human Embryonic Stem Cell Research Continues

Battle lines have been drawn in the controversy surrounding human embryonic stem cell (hESC) research. On January 21, 2011, the U.S. Court of Appeals for the Fourth Circuit upheld a lower court’s dismissal of the plaintiff’s claims for lack of standing in Mary Scott Doe v. Obama [1]. The plaintiff, a human embryo purporting to represent all frozen embryos in the United States, challenged the constitutionality of Executive Order 13505 and the 2009 NIH Guidelines on Human Stem Cell Research, which authorize federal funding for hESC research [2].

To be heard by the court, a plaintiff must establish standing. To this end, the plaintiff must demonstrate: “(1) it has suffered an ‘injury in fact’ that is (a) concrete and particularized and (b) actual or imminent, not conjectural or hypothetical; (2) the injury is fairly traceable to the challenged action of the defendant; and (3) it is likely, as opposed to merely speculative, that the injury will be redressed by a favorable decision” [3]. The court reasoned that the plaintiff failed to identify a particularized harm because “the complaint does not identify any of the named plaintiff’s particularized characteristics.” The court similarly rejected the plaintiff’s argument that the NIH Guidelines threaten all frozen embryos, noting that only research involving stem cells that have been voluntarily donated is eligible for funding. The court did, however, narrowly define the scope of its ruling, leaving the door open for future challenges.

Another case involving a dispute over federal funding of hESC research is currently awaiting a ruling by the U.S. Court of Appeals for the D.C. Circuit, which must decide whether to uphold a District Court’s preliminary injunction barring all federally funded hESC research in Sherley v. Sebelius [4]. On September 9, 2010, the Court of Appeals temporarily stayed the injunction, a decision that will remain in effect until a final opinion is issued.

[1] Mary Scott Doe v. Obama, U.S. Court of Appeals for the Fourth Circuit, January 21, 2011.


[3] (page 5)

[4] Sherley v. Sebelius, 704 F. Supp. 2d 63 (2010), United States District Court, District of Columbia, August 23, 2010.



HHS Inspector General Recommends NIH Address Institutional Conflicts of Interest

In January 2011, the HHS Office of Inspector General (OIG) released a report, Institutional Conflicts of Interest at NIH Grantees. A previous OIG report had identified problems with the oversight of institutional conflicts [1], and this report bolsters OIG’s recommendation that NIH create regulations to address them. Current federal regulations mandate only that each grantee institution have a written policy regarding researcher conflicts; they do not cover institutional conflicts. Various groups, including HHS’ Office for Human Research Protections and the Institute of Medicine, have expressed concern about the potential of such conflicts to bias research. This OIG report examines which grantee institutions have written procedures in place and which, if any, had financial conflicts of interest in FY 2008.

OIG collected information from the 156 grantee institutions that responded to its survey. Almost half of these institutions have some type of written policy in place, with institutional financial interests defined in various ways. Of the institutions with written policies, many had followed 2004 HHS recommendations instructing institutions to establish conflict-of-interest committees [2]. According to OIG, the institutions most likely to report researcher conflicts also had written policies addressing institutional financial interests. Among the more than 38 institutional conflicts reported, OIG found that the conflicts of interest of individual researchers and of the institution as a whole often involved the same company.

OIG noted that 98 of the 156 institutions surveyed had some policy addressing institutional conflicts. The most common mechanism for identifying and managing institutional financial interests was a committee established by the institution’s governing board, and these committees tended to rely on disclosure to address the conflicts.

OIG emphasized the need for national regulations and increased oversight of institutional conflicts. Until such regulations are in place, OIG encouraged NIH to support individually written policies by each grantee institution. The OIG report presented several recommendations to NIH:

  • Establish an agreed-upon definition of institutional financial conflict of interest to avoid disparity in what the term includes
  • Require grantee institutions to develop a written policy to identify and oversee institutional conflicts
  • Develop specific requirements for a policy to identify and manage institutional conflicts
  • Require grantee institutions to report specific information to NIH detailing how institutional conflicts will be managed
  • Specify guidelines for grantee institutions on how institutional conflicts can be addressed

NIH agreed that institutional financial conflicts of interest are an important issue, consistent with its concerns about researcher and institutional bias. NIH said it would take the OIG recommendations and studies into consideration, without taking an official position on whether it would implement them.

To view the full report, visit:





Presidential Commission Seeks Public Comment

The Presidential Commission for the Study of Bioethical Issues (PCSBI) requests public comment on both federal and international standards for protecting human subjects in scientific studies funded by the federal government [1]. The recent discovery of a U.S.-backed research program that intentionally infected Guatemalan citizens with sexually transmitted diseases in the mid-to-late 1940s led President Obama to ask the Commission to review current standards of human subjects protection in government-supported studies [2].

The Commission seeks a variety of perspectives on existing standards for human subjects research in both the U.S. and abroad. In particular, the Commission wishes to learn how U.S.-funded research conducted abroad or through international collaboration proceeds in practice. Also of interest are the ethical and legal issues raised by clinical trials conducted abroad under differing standards of care and access to treatment.

The Commission requests comment by May 2, 2011. For more details see [1].





Federal Judge Claims Existence of Gene for Child Pornography

On January 28, 2011, the U.S. Court of Appeals for the Second Circuit overturned a criminal sentence that had been based on an unsubstantiated claim: that the defendant in a child pornography case possessed a gene that would continue to compel him to view child pornography. Defendant Gary Cossey of United States of America v. Gary Cossey pled guilty in 2009 to possession of child pornography and was subsequently sentenced to 78 months in prison and a lifetime of supervised release.

In justifying the sentence, Judge Gary Sharpe of the U.S. District Court for the Northern District of New York made a number of claims based on his personal opinion. Foremost among these was Sharpe’s assertion that Cossey’s criminal behavior stemmed from “a gene you were born with. And it’s not a gene you can get rid of.” Sharpe conjectured that the science of genetics would bear out his opinion “fifty years from now.”

Furthermore, he dismissed the results of favorable psychological evaluations of Cossey because the “opinions of the psychologists and psychiatrists as to what harm you may pose to those children in the future is virtually worthless here,” and he claimed not to “have a lot of faith in that profession in the first place.” Based on these beliefs, Sharpe concluded that Cossey would be likely to re-offend, and thus issued a long prison term.

On appeal, Judges Kearse, Walker Jr., and Pooler of the U.S. Court of Appeals for the Second Circuit stated that “[i]t is undisputed that it would be impermissible for the court to base its decision of recidivism on its unsupported theory of genetics.”

The sentence was vacated by the Court of Appeals, and remanded to a district court judge other than Sharpe due to “serious concerns over the objectivity of the judge in resentencing Cossey.”

To view the Court of Appeals opinion, visit:



PHS Conflict of Interest Regulations

The long-awaited Public Health Service revised regulation on conflict of interest, first proposed in May 2009, has been sent to the White House Office of Management and Budget for regulatory review. It is tentatively scheduled for final issuance in April 2011.



In the Societies

Guiding Scientists Through Science Communication

Communication is a fundamental tool for the scientific community and the public at large. It provides an outlet for scientists to highlight their achievements and the significance of their work while informing the public, and those with a vested interest (such as policy makers), about the current state of the science. Although effective communication is the nexus between scientists and the public, there are no clear standards to which scientists can refer. Should scientists be expected to communicate with the media? What are their obligations to the public? What are the “best practices” for communicating one’s work? While these questions may not have clear-cut answers, steps have recently been taken to better address the larger issue of science communication.

An Advisory Note released last December by the Committee on Freedom and Responsibility in the Conduct of Science (CFRS) of the International Council for Science (ICSU) emphasizes the increasing importance of science communication, stating: “new information and communication technologies provide both enormous opportunities and new threats for effective science communication. Provided that issues of accuracy, transparency, accountability and openness are taken seriously, then the use of rapid worldwide communication tools can improve public understanding and engagement” [1]. While not a binding set of rules, the Advisory Note offers guidance to scientists and underscores their responsibility to communicate their work accurately. It will likely inform future discussions on this issue.

The Note stresses that scientists must be aware of the impact of their communications and, to that end, should be held responsible for what they say. This is especially true for those asked to respond during a public emergency, when it is critical not to incite public alarm. Furthermore, it is important to recognize that scientific findings are always subject to interpretation. The Committee encourages scientists to portray their data accurately and to address any uncertainties. Similarly, discussion of the impacts and potential implications of scientific findings should be realistic. Scientists can follow this guidance by communicating only those findings that have been subject to peer review. The Note gives special attention to the communication of results that may directly affect human well-being and the environment.

When engaging with the media, scientists should be as transparent as possible, especially with regard to their qualifications. It is necessary to distinguish between a scientist speaking on behalf of a field in which he or she is an expert and one speaking as a scientist generally. The Committee also recommends that scientists be aware of their audience: know to whom you are speaking and tailor your message accordingly. Finally, the Note advises scientists to be prepared to engage in any dialogues or debates that may arise.





Review of “The Lab: Avoiding Research Misconduct”

“The Lab: Avoiding Research Misconduct,” a newly released training DVD produced by the Office of Research Integrity (ORI) in the U.S. Department of Health and Human Services (HHS), is intended as a primer on what to do when confronted with an unfortunate instance of fabrication, falsification, or plagiarism. The film’s narrative centers on an accusation that a respected postdoctoral researcher falsified the data for his highly anticipated scientific paper. To learn about the issues at hand, the viewer is introduced to four people at the university who are affected by the suspected misconduct: Kim Park, a graduate student in the lab; Hardik Rao, another of the lab’s postdoctoral researchers; Aaron Hutchins, the lab’s principal investigator; and Beth Ridgely, the university’s Research Integrity Officer. “The Lab” is designed so that the viewer is responsible for the ethical decisions the characters make, both in the workplace and in their personal lives. Choose correctly, and disaster is thwarted and justice is served. Choose incorrectly, and the lab could see the end of its days.

The authors contributing to this article have varied experiences with research misconduct education. They have independently watched and reviewed this film.

The Early Undergraduate

This film takes a vastly different approach from conventional methods of teaching research ethics. Rather than communicating only the basic ethical standards of research, “The Lab” imparts memorable examples of decisions made on a daily basis. It asks players to make simple decisions that might not seem to have long-lasting ethical ramifications, such as: “Can I afford to cut a few hours off the experiment and go home?” and “How do I divide my time between science and mentoring?” Such decisions can create a negative lab environment that is unable to adequately address problems of research misconduct.

The exercises built around the steps of ethical decision making will likely be the most useful to students learning about ethical conduct in research. Unlike much of the video, which simply pairs good and bad decisions with differing scenarios, the tutorials on ethical decision making lay out steps to take in order to reach the most ethical decision. These steps are coupled with video examples as well as questions to ask oneself before choosing a course of action. By taking on the part of these lab characters, students and researchers will be reminded that each small ethical decision can have multiple consequences.

While the film is more engaging than one would expect from an ethics tutorial, it lacks depth in explaining how to prevent situations of research misconduct from arising in poor lab environments in the first place.


The Recent Graduate

Despite its orientation towards scientists, “The Lab” is not encumbered by discussion of complicated scientific procedures, test tubes and micropipettes; its message could be readily disseminated to students or professionals in other fields. The choose-your-path interactive feature of the film allows the viewer to be engaged and to experience the scenario from multiple perspectives, but is at times hindered by the obviousness of the choices one is given: should Kim read the draft of the article before signing the release form for the use of her data? Why yes, clearly Kim should.

The characters must navigate family pressures, high expectations at work, new professional responsibilities, competition among their peers, and the simple challenges that accompany daily life. Will Aaron fulfill his responsibilities as Principal Investigator and take the time to speak with Kim? Can Hardik put off his experiment to spend some much needed time at home with his wife? Will Beth be able to help her student and combat misconduct at her university? While the four storylines were, at moments, overshadowed by extraneous discussion, the film effectively portrays that being a scientist means making tough choices both inside the lab and at home. For those viewers without experience related to research misconduct, the lessons the film is designed to teach are clear. In addition, the Facilitator’s Guide, with its succinct step-by-step analysis of the characters’ decisions, will likely be a useful tool for instructors [2]. The film focuses on graduate and postdoctoral researchers, but seems better suited for a younger audience (perhaps beginning undergraduate researchers) or for those with little education in research misconduct issues. Although the stories might not be immediately applicable for younger viewers, they provide insight on how best to handle similar situations that may arise in the future, and serve to reinforce the importance of integrity in research.


The Graduate Student

In practice, “The Lab” is a workshop in ethical decision-making. From the perspective of a graduate student and former research assistant, the day-to-day reality of working in a laboratory includes a number of decision-points just like those illustrated in “The Lab,” including the challenge of remaining diligent in your work despite time pressure, the choice of whether to attend helpful seminars when they may take you away from your experiments, and even maintaining proper life/work balance. “The Lab” takes the potentially dry topic of research misconduct and invigorates it by enlisting the viewer as the decision maker in a number of scenarios.

Most choices are not black-or-white. When graduate student Kim Park suspects that a postdoc in her lab may be incorrectly using her research in a prestigious publication, what should she do? Does she think the postdoc has made a careless mistake, or was it deliberate? Should she talk to a trusted coworker, her PI, the postdoc who may be lying, or the university Research Integrity Officer (RIO)?

Here are a few things “The Lab” does particularly well:

  • Presents material in an engaging manner. Time flies by as the viewer places him/herself in the role of a graduate student, postdoc, PI, or RIO, and sees both the immediate and eventual consequences of decisions made.
  • Illustrates the pervasive nature of ethical decision-making in contexts large and small. Even small decisions have consequences. When postdoc Hardik Rao chooses to fudge a cell harvest time point in order to be on time for a dinner with his in-laws, he finds himself committed to a chain of falsified results.
  • Clearly explains the role of the RIO. Before viewing this presentation, many in the audience will have never heard of the RIO, a position required at any university that uses NIH funds. After, they will have seen one in action dealing with a very sensitive situation.

“The Lab” was clearly made by people who understand what it is like to work in a real research laboratory with complicated social dynamics, the need to produce good experimental results, and pressure to compromise standards in the face of conflicting demands.





NSF Releases Online, Searchable Database of Case Closeout Memoranda

For over 20 years, the National Science Foundation (NSF) Office of Inspector General (OIG) has been conducting investigations regarding the reported misuse of NSF funds or actions contrary to NSF policies and procedures. These investigations cover a broad range of misconduct, ranging from violations of human subjects regulations to more overarching themes of intellectual theft and fraud. Upon completion of an investigation, a Case Closeout Memorandum is prepared, outlining the nature of the case, the investigative actions taken, and the recommendations of OIG. The creation of an online, searchable database of case closeout memoranda stems from an in-house commitment to greater transparency and to consistency with past rulings. The database is open to the public and is searchable by case number or category. With this database, NSF can more readily search previous rulings related to current investigations and ensure that its policies, procedures, and investigative practices are followed [1]. Before the database was introduced, longitudinal studies analyzing changes in policy and misconduct were difficult to conduct; researchers had to request historical data through the Freedom of Information Act (FOIA), a lengthy, tedious process.





Call for Ethicists – The Ethics AdviceLine for Journalists is seeking ethicists to volunteer as call-takers. The AdviceLine, a project of the Chicago Headline Club, offers support to journalists who seek advice about professional ethics issues. When on duty (one week out of every four to six), call-takers are expected to check their voice mail throughout the day, respond to any calls, and write a brief report on the ethics questions posed by each caller. For additional information, contact: David Ozar at or Casey Bukro at

Call for Papers – The Society for Ethics Across the Curriculum is seeking paper submissions for its 13th International Conference, to be held November 3-5, 2011. The theme of this year’s conference is ethics in higher education, specifically the moral ideals one might face in a variety of professional environments and their relationship with society at large. The deadline for submissions is September 15. Papers and abstracts can be sent via email to Donna Werner, at: Hard copies should be mailed to Dr. Donna Werner, Chair, Humanities Department, St. Louis Community College, Meramec, 11333 Big Bend Boulevard, St. Louis MO 63122-5799. For more information, see:

Call for Postdocs – The Center for Genetic Research Ethics and Law (CGREAL), Department of Bioethics at Case Western Reserve University, in collaboration with the Department of Bioethics at Cleveland Clinic, is seeking 1-2 post-doctoral trainees with an interest in the ethical, legal, and social issues (ELSI) in human genetics. Trainees will work closely with scholars to gain clinical, disciplinary, and empirical experience and will each be responsible for a research project. Review of applications will begin April 15. For more information about the program and necessary qualifications, visit their website at: Or contact Aaron Goldenberg at (216) 368-8729, or

Conference – The annual AAAS Forum on Science and Technology Policy is the conference for people interested in public policy issues facing the science, engineering, and higher education communities. It is the place where insiders go to learn what is happening and what is likely to happen in the coming year on the federal budget and the growing number of policy issues that affect researchers and their institutions. Come to the Forum, learn about the future of S&T policy, and meet the people who will shape it. The Forum will take place May 5-6, in Washington, DC. To register or view the full agenda, visit:

Conference – On April 15-16, ST Global Consortium, in partnership with AAAS and the National Academies, will hold the 11th annual “Conference on Science & Technology in Society” in Washington DC. Registration for the meeting closes April 13. For more information, see:

Conference – PRIM&R will hold a meeting on social, behavioral, and educational research (SBER) issues April 28-29 in Boston, MA. In addition, there will be two pre-conference programs, Institutional Review Board (IRB) 101℠ SBER and IRB Administrator 101, which will be held April 26-27. For information about registration and for more program details, see:

Conference – May 23-30, 2011, UNESCO will host the International Conference & Courses in Bioethics in Singapore. Topics covered include the history and future of bioethics, bioethics education and methodology, as well as the cultural, social, and legal aspects of bioethics. For additional information, visit:

Course – The 5-day summer course Theory and Skill of Ethics Teaching, taught by Deni Elliott, will be offered this summer, August 1-5, at the University of Montana, Missoula. Space is limited to 20 participants. For more information, visit or contact Dr. Elliott.