Students learning English generally score lower on standardized science tests than those who speak English as their native language. Now, a new study by AAAS' Project 2061 will examine whether this gap is affected by the test questions themselves.
Project 2061, in partnership with educational research center WestEd, has received a grant from the National Science Foundation to scrutinize a large set of test questions developed for middle school students. The researchers want to uncover any factors in the test questions—such as unfamiliar non-science vocabulary or the complexity of the sentences—that may be related to the lagging scores of English language learners.
The findings could be especially important with the start of language-intensive testing under the Common Core State Standards and the Next Generation Science Standards. Among other skills, these standards place a stronger emphasis on written and oral communication, both in developing models and engineering solutions and in obtaining and evaluating scientific evidence.
Another concern is that the number of English language learners is growing as well, said Sharon Nelson-Barber, WestEd's Director of Language, Culture and Eco-Literacy, and the grant's co-principal investigator.
"On a broader scale, these are not new issues," she said, "but there is now greater awareness that increasing reliance on test scores to make high-stakes decisions about students may not be so appropriate in the context of an increasingly diverse student population."
The study will draw from the extensive database of test questions available at the AAAS Science Assessment Website, said George DeBoer, the grant's principal investigator and deputy director of Project 2061. The website contains over 800 test questions that target key ideas in 16 science topics, from evolution and natural selection to atoms, molecules, and states of matter. The questions have been field-tested with over 100,000 middle school and early high school students to gauge their ability to assess students' knowledge of the life and physical sciences and to identify students' common misconceptions about the sciences.
The website development team considered linguistic complexity when they developed the assessment items and "emphasized plain language" when possible, DeBoer said. "We made as much use as we could of strategies that experts had proposed for making these items accessible, but we did not systematically study the effects of making these kinds of changes to meet English language learner needs."
[Figure: A diagram showing the noun, verb, and prepositional phrases in an assessment question and their relationships to each other. The number of branches, nodes, and levels in the diagram is used to calculate linguistic complexity. The example comes from a previous WestEd study of American Indian and Alaska Native students' experiences with language complexity in testing. Credit: WestEd]
Under the new four-year grant, which begins 1 June, DeBoer and colleagues will measure the linguistic and cognitive complexity of the questions in the AAAS database, analyzing the number of noun, verb, and prepositional phrases in each question and their relationships to each other. They will then determine whether there is a link between the linguistic complexity of the questions and how well English language learners (ELL students) in the field tests scored on the questions. ELL students made up 8.3% of all the field test participants, a proportion similar to that found in U.S. classrooms.
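The phrase-diagram approach can be sketched roughly in code. The following toy example is an illustration only, not the study's actual methodology: it hand-builds a phrase-structure tree for the noun phrase "the mass of the jar" and computes the three counts mentioned above (nodes, branches, and levels); the class and function names are assumptions for this sketch.

```python
# Illustrative sketch: measuring linguistic complexity by counting
# nodes, branches (edges), and levels (depth) in a phrase-structure tree.
# The tree here is built by hand; a real analysis would use a parser.

class Node:
    def __init__(self, label, children=None):
        self.label = label            # phrase or word category, e.g. "NP", "PP"
        self.children = children or []

def count_nodes(tree):
    """Total number of nodes in the tree."""
    return 1 + sum(count_nodes(c) for c in tree.children)

def count_branches(tree):
    """Total number of branches (parent-child links)."""
    return len(tree.children) + sum(count_branches(c) for c in tree.children)

def depth(tree):
    """Number of levels from the root to the deepest leaf."""
    if not tree.children:
        return 1
    return 1 + max(depth(c) for c in tree.children)

# "the mass of the jar" -> NP(Det, N, PP(P, NP(Det, N)))
tree = Node("NP", [
    Node("Det"),                      # "the"
    Node("N"),                        # "mass"
    Node("PP", [                      # "of the jar"
        Node("P"),                    # "of"
        Node("NP", [Node("Det"), Node("N")]),
    ]),
])

print(count_nodes(tree), count_branches(tree), depth(tree))  # prints: 8 7 4
```

A deeply nested prepositional phrase like this one adds levels to the tree, so by these counts it registers as more complex than a flat phrase with the same number of words.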
Some of the linguistic features that can pose a problem for ELL students include long phrases within sentences and less commonly used verb tenses. These factors increase the "language load" of a question, and it's not just ELL students who can falter under this load.
Nelson-Barber noted that high language load also can cause problems for English speakers who use a different dialect from mainstream English, or who have difficulties with written language. "Moreover, for any student, the more unnecessary complexity added to a test question, the less accessible the question is likely to be."
Questions can be cognitively complex in a number of ways. They may require straightforward knowledge of facts, or they may rely on high-level conceptual knowledge or familiarity with mathematical symbols.
For instance, DeBoer explained, one of the items in the AAAS Science Assessment Website discusses whether a jar with a plant sealed inside it will contain the same mass before and after the plant dies. Answering a question like this "goes beyond facts and requires applying appropriate scientific principles, in this case, the conservation of matter," he said.
Nelson-Barber said the study team wants to find ways "to simplify test language, yet keep items academically rigorous."
In the grant's third year, the team will take what they have learned about the assessment items and rewrite some of the questions with the goal of making them more accessible to ELL students. After that, they will test the new items in classrooms across the country to see whether the changes can improve ELL students' test performance.
"It will be powerful to see what is possible," Nelson-Barber said, "what can give students meaningful access so that they can demonstrate their learning, and what can yield information that will be useful for teachers."