New findings reported at two recent conferences demonstrate the range of research being conducted by AAAS’s Project 2061, a long-term science education reform initiative. At meetings of the American Educational Research Association (AERA) and the National Association for Research in Science Teaching (NARST), Project 2061 researchers presented results from three studies on different aspects of science education: how students learn energy concepts, what kinds of curriculum supports teachers need to implement the Next Generation Science Standards (NGSS), and the factors that affect the performance of English learners (ELs) on science assessments. The following provides highlights from and links to their papers and posters.
What do students know (or think they know) about energy and when do they know it? In papers presented at the AERA and NARST conferences, Project 2061’s Cari Herrmann-Abell and George DeBoer reported on data drawn from a study that is developing assessment instruments to measure students’ understanding of energy concepts at the elementary, middle, and high school levels. With support from the U.S. Department of Education’s Institute of Education Sciences, the study involves a diverse sample of more than 20,000 students across the U.S. and is yielding important insights about the progression of students’ learning of increasingly complex energy concepts and about the misconceptions they are likely to hold at each grade level.
Consistent with the core science ideas and crosscutting concepts about energy recommended in the NGSS, the assessment items developed for the study reflect a theoretical model of students’ growth of understanding about energy: from a phenomenological understanding, to being able to explain phenomena using basic energy concepts, to being able to explain phenomena using more advanced (i.e., atomic/molecular) energy concepts. The assessment items also include common energy misconceptions as one or more answer choices. In their papers and presentations, the researchers described how they used Rasch modeling and option probability curves generated for each item to explore the prevalence of particular misconceptions at each grade level, as a way to better understand how students’ knowledge of energy develops over time. Results confirmed the hypothesized progression of understanding, which begins with students’ awareness of simple energy phenomena and moves toward their use of increasingly sophisticated energy concepts to solve energy-related problems.
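Option probability curves of this kind are straightforward to construct from response data. The sketch below is illustrative only, not the study’s analysis code: it assumes a table of multiple-choice responses with an estimated Rasch ability score per student (the column names are hypothetical) and plots the share of students choosing each answer option across ability bins.

```python
# Illustrative sketch: build option probability curves from response data.
# Assumes a DataFrame with columns "item_id", "option", and "ability"
# (a per-student Rasch ability estimate from a prior calibration).
import pandas as pd
import matplotlib.pyplot as plt

def option_probability_curves(df, item, n_bins=10):
    """Plot the proportion of students selecting each option of `item`
    as a function of binned Rasch ability estimates."""
    data = df[df["item_id"] == item].copy()
    # Bin students into ability groups of roughly equal size.
    data["ability_bin"] = pd.qcut(data["ability"], q=n_bins)
    # Proportion of each answer option chosen within each ability bin.
    props = (
        data.groupby("ability_bin", observed=True)["option"]
        .value_counts(normalize=True)
        .unstack(fill_value=0.0)
    )
    midpoints = [interval.mid for interval in props.index]
    for option in props.columns:
        plt.plot(midpoints, props[option], marker="o", label=f"option {option}")
    plt.xlabel("Estimated ability (logits)")
    plt.ylabel("Proportion selecting option")
    plt.title(f"Option probability curves for item {item}")
    plt.legend()
    plt.show()
```

Reading such a plot, a distractor tied to a particular misconception shows up as an option whose selection probability peaks in a specific ability range before giving way to the correct answer at higher ability levels.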
“Although multiple-choice assessments are often criticized, our work points to the kinds of rich and useful data that well-designed items can provide,” said Herrmann-Abell, principal investigator for the study. In addition to increasing teachers’ awareness of misconceptions about energy, the results from the study also validate the progression of energy learning goals from elementary through middle and high school that is specified in national content standards. This information should help teachers, curriculum developers, and researchers design and sequence learning activities that respond to students’ needs.
Can the cognitive complexity of science assessments explain differences in performance of students who are English learners compared to native English speakers? Although some differences in assessment results may reflect real disparities in students’ knowledge, it is also possible that the tests are not fairly evaluating what students know. In a study funded by the National Science Foundation to identify factors that contribute to the underperformance of English learners, Project 2061 researchers George DeBoer, Cari Herrmann-Abell, and Sarah Glassman, along with collaborators at WestEd, are exploring the role that the linguistic and cognitive complexity of test items plays in that underperformance. At the NARST conference, the researchers described their efforts to develop a measure of cognitive complexity that evaluates both the type of knowledge and the type of mental processing required to answer a test question, and they reported results from applying the measure to nearly 500 middle school science items. They also discussed preliminary findings from a linguistic analysis of the same items, which used an automated sentence parser to calculate the syntactic complexity of each sentence in each item.
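As an illustration of the kind of analysis the linguistic strand involves, the sketch below uses an off-the-shelf dependency parser (spaCy) to compute simple syntactic-complexity proxies, parse-tree depth and clause counts, for the sentences in an item stem. The parser, the metrics, and the example item text are assumptions for illustration; the study’s own parser and complexity measure may differ.

```python
# Illustrative sketch: score syntactic complexity of assessment item text
# with a dependency parser. Requires the spaCy model "en_core_web_sm".
import spacy

nlp = spacy.load("en_core_web_sm")

def token_depth(token):
    """Number of steps from a token up to the root of its sentence."""
    depth = 0
    while token.head is not token:
        token = token.head
        depth += 1
    return depth

def sentence_complexity(sentence):
    """Return simple syntactic-complexity proxies for one parsed sentence."""
    max_depth = max(token_depth(tok) for tok in sentence)
    # Dependency labels that typically mark subordinate or clausal structure.
    clause_labels = {"advcl", "ccomp", "xcomp", "acl", "relcl", "csubj"}
    n_clauses = sum(1 for tok in sentence if tok.dep_ in clause_labels)
    return {"tokens": len(sentence), "max_depth": max_depth, "clauses": n_clauses}

# Hypothetical item stem used only to show the output format.
item_text = (
    "Which statement best explains why the temperature of the water "
    "increases when the metal block is placed in it?"
)
for sent in nlp(item_text).sents:
    print(sentence_complexity(sent))
```

Sentence-level scores like these can then be aggregated per item and related, along with the cognitive complexity ratings, to item difficulty for EL and non-EL students.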
“Our work in the first year of this study has focused on seeing whether it is even possible to design measures of cognitive and linguistic complexity that can help explain item difficulty for both EL and non-EL students, and we think we have demonstrated that it is,” said DeBoer, the study’s principal investigator and deputy director of Project 2061.
“Although the variance in item difficulty that is explained is small, it is still significant,” he continued. The research team also noted differences in how item features operate for different topics and will be exploring this as the study continues.
How can curriculum materials help teachers implement NGSS core science ideas, crosscutting concepts, and practices in their classrooms? In a poster session at the AERA conference, Project 2061 researchers Jo Ellen Roseman and Cari Herrmann-Abell described a study in which they developed and tested an NGSS-aligned curriculum unit and investigated the kinds of support teachers need to implement the new standards successfully. Their presentation focused on efforts to improve support for explanation writing, for both students and teachers. The unit engaged students in constructing evidence-based explanations of phenomena involving chemical reactions in non-living and living systems. The five-year Toward High School Biology study was funded by the U.S. Department of Education’s Institute of Education Sciences.
Over several iterations of the new unit, the researchers drew on data from students’ pre- and post-instruction tests, from embedded assessment tasks in student notebooks, and from teacher surveys. They used those data to add scaffolding to the lessons and to develop strategies, including a scoring rubric, for encouraging teachers to provide more extensive feedback to students on their explanations. “By the end of the study, most teachers, especially those who were experienced users of the curriculum unit and had participated in more of the study’s professional development, were able to use the rubric reliably,” said principal investigator Jo Ellen Roseman, who directs Project 2061.
The researchers also suggested strategies to help teachers provide feedback more efficiently. “Some teachers had their students use the rubric to self-score, which saved time,” she said. “That’s important because realizing the NGSS vision of alignment, coherence, and instructional quality may exceed the time and resources that are typically available in U.S. classrooms.”