A new website offers evaluators of STEM education programs some excellent tools for designing and executing evaluations that are sensitive to context, population, and purpose. The site is the brainchild of Eric Jolly, Ph.D., of the Science Museum of Minnesota, and Patricia B. Campbell, Ph.D., both AAAS members, together with Tom R. Kibler of Campbell-Kibler Associates, Inc., under funding from the National Science Foundation. Campbell and Jolly responded to MemberCentral about the site and the resources it encompasses.

AAAS MemberCentral: The appearance of a website designed to improve education evaluations is welcome news. Was it a natural outgrowth of your longtime collaboration, and how difficult was it to garner support from the NSF?
AAAS members Eric Jolly, Ph.D., and Patricia B. Campbell, Ph.D.: The site is a logical outgrowth of our earlier work on diversity and methods. In 2004, with Lesley Perlman, we published "Engagement, Capacity and Continuity: A Trilogy for Student Success," which grew out of our concern that the many STEM programmatic, instructional, and curriculum successes had not led to the expected increases in the number and diversity of scientists, or even of highly achieving students. We wanted to know why the successes were not translating into more progress.

A similar conundrum led to this site: We wondered why, with all the evaluations of all the STEM programs, we didn't know more about what works for whom in what context. We soon realized that it wasn't just about rigor. Even with the most rigorous designs and methods, if the evaluation did not take into account the needs, issues, and goals of different subgroups, the results were incomplete and often inaccurate.

From the beginning, NSF was very supportive, insisting that the ideas in the site be useful and actionable. With their backgrounds in evaluation and diversity, our former and current program officers were extraordinarily helpful in making sure that happened.

AAAS MC: The website offers lots of tips to improve evaluation with diverse populations, beginning with "Swimming in a Sea of Context." This is appealing to education evaluators, but still an esoteric concern to most researchers who are principal investigators. How do you intend to inform and then engage them?
Campbell and Jolly: Reaching researchers is a challenge even though, for the most part, the site is equally useful to researchers and evaluators. We've started by focusing on academics and others who do both evaluation and research, as well as getting the word out through organizations whose members do applied research, such as the Visitor Studies Association and the Understanding Interventions community. Academic articles and presentations don't seem the most useful way to get researchers involved, so we are starting to reach out to diversity-oriented special interest groups within educational and social-science professional associations, which should be more receptive to the message. We welcome any and all suggestions on other ways to reach researchers.

AAAS MC: Among the novel features of the site are tips to better understand how factors such as participant gender, disability, and military service can influence what data should be collected. Including disability and military service as categories of under-participation addresses a longtime oversight. Please elaborate on what you have in mind here.
Campbell and Jolly: We realized that too often in STEM, the term "diverse populations" was code for populations that primarily consisted of racial/ethnic minorities. Pretty much everything else was ignored—except gender—but even then it was usually gender or race/ethnicity, not gender and race/ethnicity.

As individuals, we are not just one thing. Rather, we consist of many pieces—race, gender, and ethnicity, but also age, geographic location, education, income, disability status, veteran status ... It makes no sense to focus on only one demographic variable. For each study, we need to figure out the characteristics that are integral for analysis and incorporate them in the evaluation design, implementation, analysis, and interpretation of results. On the site, with a lot of help from our friends, we've figured out some ways to do that.

AAAS MC: Another goal of the site is to help program officers and reviewers better critique evaluation plans and assess how well evaluations are working. This is another innovation to broaden the circle of stakeholders, raising consciousness and imparting skills. How have program officers and reviewers responded so far?
Campbell and Jolly: Two years ago, we started by interviewing program officers as well as PIs and evaluators. The tips we developed for program officers grew out of their concerns with the limitations of current processes. We have just begun sharing the website with these federal staff, and the response has been great. We've been invited to NSF and NIH to discuss the site, and several program officers have already sent information about it to their grantees.

AAAS MC: Finally, how will you know that stakeholders are discovering the site and utilizing what they learn? In other words, what measures do you have in mind to help you evaluate the success of the site?
Campbell and Jolly: Although the website is just up and running, we have indications of interest and use. Information about the site has been distributed by a variety of organizations, including the American Physiological Society, the American Society for Engineering Education, AAAS Science Books & Films (SB&F), the Visitor Studies Association, the Women in Engineering ProActive Network (WEPAN), and the Coalition for Science After School. Several faculty members have told us they are using the site in their online evaluation courses.
