During the 2020–2021 academic year, the Dynamic Learning Maps® (DLM®) Alternate Assessment System offered assessments of student achievement in English language arts, mathematics, and science for students with the most significant cognitive disabilities in grades 3–8 and high school. Due to differences in the development timeline for science, separate technical manual updates were prepared for English language arts and mathematics (see Dynamic Learning Maps Consortium, 2021a, 2021b).
The purpose of the DLM system is to improve academic experiences and outcomes for students with the most significant cognitive disabilities by setting high, actionable academic expectations and providing appropriate and effective supports to educators. Results from the DLM alternate assessment are intended to support interpretations about what students know and are able to do and to support inferences about student achievement in the given subject. Results provide information that can guide instructional decisions as well as information for use with state accountability programs.
The DLM Alternate Assessment System is based on the core belief that all students should have access to challenging, grade-level content. Online DLM assessments give students with the most significant cognitive disabilities opportunities to demonstrate what they know in ways that traditional paper-and-pencil, multiple-choice assessments cannot. A year-end assessment is administered in the spring, and results from that assessment are reported for state accountability purposes.
A complete technical manual was created after the first operational administration in 2015–2016. After each annual administration, a technical manual update is provided to summarize updated information. The current technical manual provides updates for the 2020–2021 administration. Only sections with updated information are included in this manual. For a complete description of the DLM assessment system, refer to previous technical manuals, including the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017).
1.1 Impact of COVID-19 on the Administration of DLM Assessments
The COVID-19 pandemic had a significant impact on instruction, learning, and assessment. Beginning in March 2020, in response to the pandemic, many states and local school districts closed in an effort to slow the spread of the virus, as recommended by the Centers for Disease Control and Prevention (2020a, 2020b). During school closures, students across the country were unable to complete their spring assessments, including the DLM alternate assessments. As a result, on March 20, 2020, the U.S. Secretary of Education used her authority under the Elementary and Secondary Education Act of 1965 (Elementary and Secondary Education Act of 1965, 1965), as amended by the Every Student Succeeds Act (Every Student Succeeds Act, 2015), to invite states to submit 1-year waivers of the assessment and accountability requirements. All 50 states, the District of Columbia, the Commonwealth of Puerto Rico, and the Bureau of Indian Education applied for and received these waivers (Recommended Waiver Authority Under Section 3511(d)(4) of Division A of the Coronavirus Aid, Relief, and Economic Security Act [“CARES Act”], 2020).
Following the complete school and district closures and the halting of assessment administration in the spring of 2020, the reopening of schools in fall 2020 was characterized by variations of remote, in-person, and hybrid instructional models both within and across states. In many states and districts, the degree to which these instructional models were used changed over the course of the school year and depended on multiple factors, including COVID-19 case counts, district size, ages of students within schools, local policy, student needs, and parent choice. While state and local education agencies made every effort to ensure all students had access to instruction and instructional materials regardless of learning environment, it is widely acknowledged that changes to learning inevitably occurred during the 2020–2021 academic year.

Recognizing both the variability of instructional access and state and local need for data on student achievement, on February 22, 2021, the U.S. Department of Education’s Office of Elementary and Secondary Education provided states with guidance regarding assessment, accountability, and reporting requirements for the 2020–2021 school year. The department’s guidance, as it relates to assessments, offered states the option to apply for a 1-year waiver from accountability requirements as well as flexibility in assessment administration. The types of flexibility described in the department’s letter included administering shorter versions of state assessments, offering remote administration where feasible, and extending testing windows. The guidance further explained that the focus of this year’s assessments is “to provide information to parents, educators, and the public about student performance and to help target resources and supports” (Rosenblum, 2021).
This manual presents evidence for the results that were provided in 2020–2021, as well as other administration, test development, and research activities that occurred in 2020–2021 and were unaffected by the COVID-19 pandemic.
1.2 Current DLM Collaborators
In 2020–2021, DLM assessments were available to students in 21 states and one Bureau of Indian Education school: Alaska, Arkansas, Colorado, Delaware, District of Columbia, Illinois, Iowa, Kansas, Maryland, Missouri, New Hampshire, New Jersey, New Mexico, New York, North Dakota, Oklahoma, Pennsylvania, Rhode Island, Utah, West Virginia, Wisconsin, and Miccosukee Indian School.
Three DLM Consortium partners (District of Columbia, Maryland, and Miccosukee Indian School) did not administer operational assessments in 2020–2021.
In 2020–2021, Accessible Teaching, Learning, and Assessment Systems (ATLAS) at the University of Kansas continued to partner with the Center for Literacy and Disability Studies at the University of North Carolina at Chapel Hill and with Agile Technology Solutions at the University of Kansas. The project was also supported by a Technical Advisory Committee.
1.3 Technical Manual Overview
This manual provides evidence collected during the 2020–2021 administration of science assessments.
Chapter 1 provides a brief overview of the assessment and administration for the 2020–2021 academic year and a summary of the contents of the remaining chapters. While subsequent chapters describe the individual components of the assessment system separately, key topics such as validity are addressed throughout this manual.
Chapter 2 provides an overview of the purpose of the Essential Elements (EEs) for science, including the intended coverage with the Framework for K–12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (National Research Council, 2012) and the Next Generation Science Standards (NGSS Lead States, 2013). For a full description of the process by which the EEs were developed, see the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017).
Chapter 3 outlines evidence related to test content collected during the 2020–2021 administration, including a description of test development activities, external review of content, and the operational and field test content available.
Chapter 4 provides an update on test administration during the 2020–2021 year. The chapter describes the DLM policy on virtual test administration and provides a summary of updated Personal Needs and Preferences Profile selections, a summary of administration time and device usage, and teacher survey results regarding user experience, remote assessment administration, and accessibility.
Chapter 5 provides a brief summary of the psychometric model used in scoring DLM assessments. This chapter includes a summary of 2020–2021 calibrated parameters. For a complete description of the modeling method, see the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017).
Chapter 6 was not updated for 2020–2021; no changes were made to the cut points used in scoring DLM assessments. See the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017) for a description of the methods, preparations, procedures, and results of the original standard-setting meeting and the follow-up evaluation of the impact data. For a description of the changes made to the cut points used in scoring DLM assessments for grade 3 and grade 7 during the 2018–2019 administration, see the 2018–2019 Technical Manual Update—Science (Dynamic Learning Maps Consortium, 2019).
Chapter 7 reports the 2020–2021 operational results, including student participation data. The chapter details the percentage of students achieving at each performance level; subgroup performance by gender, race, ethnicity, and English-learner status; and the percentage of students who showed mastery at each linkage level. Due to the confounding factors of assessment administration changes and COVID-19, these results should be interpreted with caution and should not be directly compared to previous assessment administrations. Finally, the chapter provides descriptions of changes to score reports and data files during the 2020–2021 administration.
Chapter 8 summarizes reliability evidence for the 2020–2021 administration, including a brief overview of the methods used to evaluate assessment reliability and results by performance level, subject, domain, EE, linkage level, and conditional linkage level. For a complete description of the reliability background and methods, see the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017).
Chapter 9 describes additional validity evidence collected during the 2020–2021 administration not covered in previous chapters. The chapter provides evidence collected for two of the five critical sources of evidence: test content and response process.
Chapter 10 describes updates to the professional development offered across the DLM Consortium in 2020–2021, including participation rates and evaluation results.
Chapter 11 summarizes the contents of the previous chapters. It also provides future directions to support operations and research for DLM assessments.