AGENCY:
Office of Elementary and Secondary Education, Department of Education.
ACTION:
Final regulations.
SUMMARY:
The Secretary issues final regulations under title I, part B of the Elementary and Secondary Education Act of 1965 (ESEA) to implement changes made to the ESEA by the Every Student Succeeds Act (ESSA) enacted on December 10, 2015, including the ability of the Secretary to provide demonstration authority to a State educational agency (SEA) to pilot an innovative assessment and use it for accountability and reporting purposes under title I, part A of the ESEA before scaling such an assessment statewide.
DATES:
These regulations are effective January 9, 2017.
FOR FURTHER INFORMATION CONTACT:
Jessica McKinney, U.S. Department of Education, 400 Maryland Avenue SW., Room 3W107, Washington, DC 20202-2800.
Telephone: (202) 401-1960 or by email: jessica.mckinney@ed.gov.
If you use a telecommunications device for the deaf (TDD) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1-800-877-8339.
SUPPLEMENTARY INFORMATION:
Executive Summary
Purpose of This Regulatory Action: On December 10, 2015, President Barack Obama signed the ESSA into law. The ESSA reauthorizes the ESEA, which provides Federal funds to improve elementary and secondary education in the Nation's public schools. Through the reauthorization, the ESSA made significant changes to the ESEA for the first time since the ESEA was reauthorized through the No Child Left Behind Act of 2001 (NCLB), including significant changes to title I. In particular, the ESSA includes in title I, part B of the ESEA a new demonstration authority under which an SEA or consortium of SEAs that meets certain application requirements may establish, operate, and evaluate an innovative assessment system, including for use in the statewide accountability system, with the goal of using the innovative assessment system after the demonstration authority ends to meet the academic assessment and statewide accountability system requirements under title I, part A of the ESEA. Aligned with President Obama's Testing Action Plan, released in October 2015, the demonstration authority seeks to help States interested in fostering and scaling high-quality, innovative assessments.[1] An SEA would require this demonstration authority under title I, part B, if the SEA is proposing to develop an innovative assessment in any required grade or subject and administer the assessment, initially, to students in only a subset of its local educational agencies (LEAs) or schools without also continuing administration of its current statewide assessment in that grade or subject to all students in those LEAs or schools, including for school accountability and reporting purposes under title I, part A, as it scales the innovative assessment statewide. Unless otherwise noted, references in this document to the ESEA refer to the ESEA as amended by the ESSA.
On July 11, 2016, the Secretary published a notice of proposed rulemaking (NPRM) for the title I, part B regulations pertaining to the innovative assessment demonstration authority in the Federal Register (81 FR 44958). We issue these regulations to provide clarity to SEAs regarding the requirements for applying for and implementing innovative assessment demonstration authority. These regulations will also help to ensure that SEAs provided this authority can develop and administer high-quality, valid, and reliable assessments that measure student mastery of challenging State academic standards, improve the design and delivery of large-scale assessments, and better inform classroom instruction, ultimately leading to improved academic outcomes for all students.
Summary of the Major Provisions of This Regulatory Action: The following is a summary of the major substantive changes in these final regulations from the regulations proposed in the NPRM. (The rationale for each of these changes is discussed in the Analysis of Comments and Changes section elsewhere in this preamble.)
- The Department has renumbered the proposed regulatory sections, as follows, in the final regulations:
—New section 200.104 (proposed § 200.76) entitled “Innovative assessment demonstration authority.”
—New section 200.105 (proposed § 200.77) entitled “Demonstration authority application requirements.”
—New section 200.106 (proposed § 200.78) entitled “Innovative assessment selection criteria.”
—New section 200.107 (proposed § 200.79) entitled “Transition to statewide use.”
—New section 200.108 (proposed § 200.80) entitled “Extensions, waivers, and withdrawal of authority.”
- The Department has made a number of changes to new § 200.104 (proposed § 200.76), which provides definitions and describes general requirements for SEAs and consortia of SEAs applying for and implementing the innovative assessment demonstration authority:
—Section 200.104(b)(1) has been added to define an “affiliate member of a consortium” to be an SEA that is formally associated with a consortium of SEAs that is implementing the innovative assessment demonstration authority, but is not yet a full member of the consortium because it is not proposing to use the consortium's innovative assessment system under the demonstration authority.
—Section 200.104(b)(3) has been revised to clarify the definition of “innovative assessment system” to indicate that an innovative assessment system:
- Produces an annual summative determination of each student's mastery of grade-level content standards aligned to the challenging State academic standards under section 1111(b)(1) of the ESEA;
- In the case of a student with the most significant cognitive disabilities assessed with an alternate assessment aligned with alternate academic achievement standards (AA-AAAS) under section 1111(b)(1)(E) of the ESEA and aligned with the State's academic content standards for the grade in which the student is enrolled, produces an annual summative determination relative to such alternate academic achievement standards for each such student;
- May include any combination of general assessments or AA-AAAS in reading/language arts, mathematics, or science; and
- May, in any required grade or subject, include one or more types of assessments listed in § 200.104(b)(3)(ii).
—Section 200.104(b)(4) has been added to define a “participating LEA” as an LEA in the State with at least one school participating in the innovative assessment demonstration authority.
—Section 200.104(b)(5) has been added to define “participating school” as a public school in the State in which the innovative assessment system is administered under the innovative assessment demonstration authority instead of the statewide assessment and where the results of the school's students on the innovative assessment system are used by its State and LEA for purposes of accountability and reporting.
- The Department made a number of changes to § 200.105 (proposed § 200.77), which sets forth the application requirements that an SEA or consortium of SEAs must meet in order to receive approval to implement demonstration authority:
—Section 200.105(a) has been revised to require collaboration with representatives of Indian tribes located in the State and to clarify that, in consulting parents, States must consult parents of children with disabilities, English learners, and other subgroups under section 1111(c)(2) of the ESEA.
—Section 200.105(b) has been revised to clarify that the innovative assessment system may be administered to a subset of LEAs or schools within an LEA, and must be administered to all students within the participating LEA or schools within the LEA, except that an LEA may continue to administer an AA-AAAS that is not part of the innovative assessment system to students with the most significant cognitive disabilities, consistent with section 1111(b)(1)(E) of the ESEA.
—Section 200.105(b)(2) has been revised to clarify that the innovative assessment must align with the challenging State academic content standards for the grade in which the student is enrolled. In addition, § 200.105(b)(2)(ii) clarifies that the innovative assessment may include items above or below a student's grade level so long as the State measures each student's academic proficiency based on the challenging State academic standards for the grade in which the student is enrolled.
—Section 200.105(b)(4) has been revised to clarify that determinations of the comparability between the innovative and statewide assessment system must be based on results, including annual summative determinations, as defined in § 200.105(b)(7), that are generated for all students and for each subgroup of students.
—Section 200.105(b)(4)(i)(C) has been revised to clarify that States may include, as a significant portion of the innovative assessment system in each required grade and subject in which both an innovative and statewide assessment is administered, items or performance tasks from the statewide assessment system that, at a minimum, have been previously pilot tested or field tested for use in the statewide assessment system.
—Section 200.105(b)(4)(i)(D) has been added to clarify that States may include, as a significant portion of the statewide assessment system in each required grade and subject in which both an innovative and statewide assessment is administered, items or performance tasks from the innovative assessment system that, at a minimum, have been previously pilot tested or field tested for use in the innovative assessment system.
—Section 200.105(b)(4)(ii) has been added to require that States' innovative assessment systems generate results, including annual summative determinations, that are valid, reliable, and comparable for all students and for each subgroup of students among participating schools and LEAs, which an SEA must annually determine as part of its evaluation plan described in § 200.106(e) (proposed § 200.78(e)).
—Section 200.105(b)(7) has been revised to require that the innovative assessment produce an annual summative determination of achievement for each student that describes—
- The student's mastery of the challenging State academic standards (i.e., both the State's academic content and achievement standards) for the grade in which the student is enrolled; and
- In the case of a student with the most significant cognitive disabilities assessed with an AA-AAAS under section 1111(b)(1)(E) of the ESEA, the student's mastery of those alternate academic achievement standards.
—Section 200.105(d)(4) has been revised to require that each participating LEA inform parents of all students in participating schools about the innovative assessment and that information shared with parents include the grades and subjects in which the innovative assessment will be administered.
—Section 200.105(f)(2) has been added to clarify that a consortium must submit a revised application to the Secretary in order for an affiliate member to become a full member of the consortium and use the consortium's innovative assessment system under the demonstration authority.
- The Department made a number of changes to § 200.106 (proposed § 200.78), which describes the selection criteria the Secretary will use to evaluate an application for demonstration authority:
—Section 200.106(a)(3)(iii) has been revised to clarify that the baseline for setting annual benchmarks toward high-quality and consistent implementation across schools that are demographically similar to the State as a whole is the demographics of participating schools, not participating LEAs.
—Section 200.106(d) has been revised to clarify that each SEA or consortium's application must include a plan for delivering supports to educators that can be consistently provided at scale; will be evaluated on the extent to which training for LEA and school staff will develop teacher capacity to provide instruction that is informed by the innovative assessment system results; and should describe strategies and safeguards to support educators and staff in developing and scoring the innovative assessment, including how the strategies and safeguards are sufficient to ensure objective and unbiased scoring of innovative assessments. Section 200.106(d) has also been revised to provide for the SEA or consortium to include supports for parents, in addition to educators and students, and require States to describe their strategies to familiarize parents as well as students with the innovative assessment system.
- The Department has revised § 200.107 (proposed § 200.79) to clarify that the baseline year used for purposes of evaluating the innovative assessment to determine if a State may administer the assessment statewide is the first year the innovative assessment is administered by a participating LEA under the demonstration authority.
Costs and Benefits: The Department believes that the benefits of this regulatory action outweigh any associated costs to a participating SEA, which may be supported with Federal grant funds. These benefits include the administration of assessments that more effectively measure student mastery of challenging State academic standards and better inform classroom instruction and student supports, ultimately leading to improved academic outcomes for all students. Please refer to the Regulatory Impact Analysis section of this document for a more detailed discussion of costs and benefits. Consistent with Executive Order 12866, the Office of Management and Budget (OMB) has determined that this action is significant and, thus, is subject to review by OMB under the Executive order.
Public Comment: In response to our invitation to comment in the NPRM, 89 parties submitted comments on the proposed regulations.
We discuss substantive issues under the sections of the proposed regulations to which they pertain, except for a number of cross-cutting issues, which are discussed together under the heading “Cross-cutting issues.” Generally, we do not address technical and other minor changes, or suggested changes the law does not authorize us to make under the applicable statutory authority. In addition, we do not address general comments that raised concerns not directly related to the proposed regulations or that were otherwise outside the scope of the regulations, including comments that raised concerns pertaining to instructional curriculum, particular sets of academic standards or assessments, or the Department's authority to require a State to adopt a particular set of academic standards or assessments, as well as comments pertaining to the Department's regulations on statewide accountability systems, data reporting, and State plans.
Tribal Consultation: The Department held four tribal consultation sessions on April 24, April 28, May 12, and June 27, 2016, pursuant to Executive Order 13175 (“Consultation and Coordination with Indian Tribal Governments”). The purpose of these tribal consultation sessions was to solicit tribal input on the ESEA, including input on several changes that the ESSA made to the ESEA that directly affect Indian students and tribal communities. The Department specifically sought input on: The new grant program for Native language Immersion schools and projects; the report on Native American language medium education; and the report on responses to Indian student suicides. The Department announced the tribal consultation sessions via listserv emails and Web site postings on http://www.edtribalconsultations.org/. The Department considered the input provided during the consultation sessions in developing the proposed requirements.
Analysis of Comments and Changes: An analysis of the comments and of any changes in the regulations since publication of the NPRM follows.
Cross-Cutting Issues
Reorganization and Renumbering of the Proposed Regulations
Comments: None.
Discussion: The NPRM included proposed regulatory sections to implement the innovative assessment demonstration authority in §§ 200.75 through 200.80. However, some of these sections contain existing regulations that have not yet been removed and reserved. Accordingly, we are revising the final regulations by renumbering the proposed sections, as follows:
- New § 200.104 (proposed § 200.76) entitled “Innovative assessment demonstration authority.”
- New § 200.105 (proposed § 200.77) entitled “Demonstration authority application requirements.”
- New § 200.106 (proposed § 200.78) entitled “Innovative assessment selection criteria.”
- New § 200.107 (proposed § 200.79) entitled “Transition to statewide use.”
- New § 200.108 (proposed § 200.80) entitled “Extensions, waivers, and withdrawal of authority.”
Changes: We have revised the final regulations by renumbering the regulatory sections, as proposed. As a result, we have added §§ 200.104 through 200.108 in the final regulations, which describe the demonstration authority, in general; application requirements; selection criteria; transition to statewide use; and extensions, waivers, and withdrawal of authority.
Overtesting
Comments: A few commenters raised concerns that the proposed requirements impose new testing requirements. Of these commenters, a few expressed concern that the assessments would serve to punish teachers who work with children who are struggling academically. Others were concerned that the assessments would be inappropriately used for high stakes decisions.
Discussion: Neither section 1204 of the ESEA nor the proposed regulations impose new assessment requirements beyond those required by title I, part A of the ESEA. Accurate and reliable measurement of student achievement based on annual State assessments in reading/language arts and mathematics remains a core component of State assessment and accountability systems under the ESSA. In support of these goals, section 1111(b)(2)(B)(v)(I) of the ESEA requires annual assessments in reading/language arts and mathematics to be administered to all students in each of grades 3 through 8, and at least once between grades 9 and 12. Section 1204 allows a State to pilot new innovative assessments under a demonstration authority, but requires that each State assess all students on the applicable assessments, using either the innovative assessment in participating LEAs and schools or the statewide assessment in non-participating LEAs and schools. No State is required to participate in the innovative assessment demonstration authority. Finally, while States are required to use the results of State assessments in statewide accountability systems, consistent with sections 1111(c) and 1111(d) of the ESEA, there are no further requirements for how assessment results are used, including for teacher evaluation or student advancement and promotion decisions. Decisions about the use of test results for those purposes remain State and local decisions.
Changes: None.
Comments: One commenter commended the Department for allowing States the option to pilot a new assessment in a subset of schools rather than the entire State, but stressed that true innovation is needed to reduce the unnecessary and high stakes associated with assessments in the United States. The commenter encouraged the Department to look for opportunities to reduce testing, particularly for high stakes purposes. Another commenter noted that districts are already required to track student growth through Response to Intervention in kindergarten through grade 5 (K-5), so having State assessments in grades 3-5 is duplicative testing.
Discussion: Section 1111(b)(2)(B)(v)(I) of the ESEA requires that each State administer reading/language arts and mathematics assessments in each of grades 3 through 8 and at least once in grades 9 through 12; while some schools may be required by their LEA or State to use Response to Intervention in grades K-5, there is no Federal requirement to do so. We believe that while the ESEA maintains this core requirement for annual assessment, it also presents States with opportunities to streamline low-quality or duplicative testing. Each State, in coordination with its LEAs, should continue to consider additional action it may take to reduce burdensome and unnecessary testing. We know that annual assessments, as required by the ESSA, are tools for learning and promoting equity when they are done well and thoughtfully. When assessments are done poorly, in excess, or without a clear purpose, they take time away from teaching and learning. The President's Testing Action Plan provides a set of principles and actions that the Department put forward to help protect the vital role that good assessments play in guiding progress for students, advancing equity for all, and evaluating schools, while providing help in reducing practices that have burdened classroom time or not served students or educators well. We plan to issue further non-regulatory guidance to help States and LEAs use the provisions of the ESEA to take actions aligned with the Testing Action Plan to improve assessment quality and reduce the burden of unnecessary and duplicative testing.
Changes: None.
Parental Rights
Comments: One commenter noted the importance of parental involvement in issues pertaining to State assessments under the ESEA, including test design, reporting, and use of test results, and voiced support for parents' rights to make decisions around their child's participation in assessments. Another commenter was supportive of expecting students to take assessments, but concerned—given the decisions some parents make to opt their children out of taking assessments—about requiring that a 95 percent participation rate among students and subgroups of students be a factor for school accountability purposes. The commenter suggested that the final regulations make 95 percent participation a goal, rather than a requirement, and expect States to review participation rates in schools that fail to assess at least 95 percent of their students.
Discussion: We agree with commenters that it is important to seek and consider input from parents when designing and implementing State assessment systems and policies. Accurate and reliable measurement of student achievement based on annual State assessments in reading/language arts and mathematics remains a core component of State assessment and accountability systems under the ESEA. In support of these goals, section 1111(b)(2)(B)(i) and (v)(I) of the ESEA requires annual assessments in reading/language arts and mathematics to be administered to all students in each of grades 3 through 8, and at least once between grades 9 and 12. Section 1111(c)(4)(E) of the ESEA also requires that States hold schools accountable for assessing at least 95 percent of their students. The statute reiterates these critical requirements for holding participating schools in the innovative assessment demonstration authority accountable, as described in sections 1204(e)(2)(ix) and 1204(j)(1)(B)(v)(II), which both reference the requirements in section 1111(c) in the application requirements and requirements for transitioning to using the innovative assessment system statewide. All States, regardless of their participation in innovative assessment demonstration authority, are responsible for ensuring that all students participate in the State's annual assessments and that all schools meet the statutory and applicable regulatory requirements to hold schools accountable for the 95 percent participation rate requirement. The final regulations for the innovative assessment demonstration authority, like the proposed regulations, are designed to assist States in fulfilling this responsibility.
Changes: None.
Comments: A few commenters raised concerns that the proposed regulations will impose new data collection requirements that might lead to data mining. These commenters were particularly concerned about student privacy and the right of parents to protect their students' data from being collected.
Discussion: We agree with the commenters' concern that it is paramount to protect student privacy. New § 200.105(b)(8) (proposed § 200.77(b)(8)) requires that each State and LEA report student results on the innovative assessment, consistent with sections 1111(b)(2)(B) and 1111(h) of the ESEA, including section 1111(b)(2)(B)(xi), which provides that in reporting disaggregated results, the State, LEA, and school may not reveal personally identifiable information about an individual student. Further, new § 200.105(d)(3)(ii) (proposed § 200.77(d)(3)(ii)) requires that any data submitted to the Secretary regarding the State's implementation of the innovative assessment demonstration authority may not reveal any personally identifiable information. We disagree with the commenters that this regulation requires new student-level data to be publicly reported beyond those requirements in the statute; rather, it requires that any State choosing to participate in the innovative assessment demonstration authority continue to meet the reporting requirements of sections 1111(b)(2)(B) and 1111(h) of the ESEA.
Changes: None.
Stakeholder Engagement
Comments: Multiple commenters supported the proposed regulations for prioritizing meaningful consultation with stakeholders in various phases of the innovative assessment demonstration authority, such as in developing States' applications and plans for innovative assessment demonstration authority in proposed § 200.77(a)(2) and in requiring ongoing feedback from stakeholders on implementation in proposed § 200.77(d)(3)(iv). These commenters appreciated that the proposed regulations emphasized a meaningful role for assessment experts; parents and parent organizations; teachers, principals and other school leaders, and local teacher organizations (including labor organizations); local school boards; groups representing the interests of particular subgroups of students, including English learners, children with disabilities, and other subgroups included under section 1111(c)(2) of the ESEA; and community organizations and intermediaries.
Discussion: We appreciate the support for these provisions and agree that meaningful, timely, and ongoing consultation with a diverse group of stakeholders at all phases of the innovative assessment demonstration authority is essential to ensure effective implementation and development of a high-quality innovative assessment system. We strongly encourage States to engage in substantial outreach with stakeholders in developing and implementing an innovative assessment system under the ESSA.[2]
Changes: None.
Comments: Several commenters suggested that evidence of consultation with stakeholders at the time a State is seeking demonstration authority in proposed § 200.77(a) be submitted directly from stakeholders, rather than from the State.
Discussion: We believe the commenters' concern that evidence of meaningful consultation under new § 200.105(a) (proposed § 200.77(a)) is submitted from the State, rather than from required groups, is mitigated by the selection criterion under new § 200.106(b)(3) (proposed § 200.78(b)(3)), which requires a State to submit signatures directly from groups and individuals supporting the application, many of whom overlap with those who must be consulted under new § 200.105(a). As a result, we believe that adding to the provisions for consultation by requiring States to gather and submit further information from organizations and individuals directly would add burden to the application process without providing substantially new information that would aid in the external peer review of a State's application.
Changes: None.
Comments: A few commenters requested that the Department add specific groups of stakeholders to the list of those with which the State must consult in developing its innovative assessment system and application under proposed § 200.77(a)(2). Commenters suggested adding groups such as specialized instructional support personnel, representatives of community-based organizations, and organizations and parents who advocate for the interests of particular subgroups of children or are experts in working with these subgroups. In addition, one commenter representing tribal organizations suggested that tribal leaders be included as a required group for consultation under proposed § 200.77(a)(2). Stakeholders supported including these groups under proposed § 200.77(a)(2) because States would then be required to regularly solicit ongoing feedback from these additional groups under proposed § 200.77(d)(3)(iv) and during the transition to statewide use of the innovative assessment system under proposed § 200.79(b)(3).
Discussion: The list of stakeholders that are part of required consultation under new § 200.105(a)(2) (proposed § 200.77(a)(2)) comes directly from section 1204(e)(2)(A)(v)(I) of the ESEA. The Department added students to the list of required stakeholders, given the substantial and direct impact of implementing a new innovative assessment on the teaching and instruction students will receive and to reinforce related statutory requirements for ensuring students are acclimated to the innovative assessments, as described in section 1204(e)(2)(B)(vi) of the ESEA. While we recognize that the additional groups suggested by commenters for inclusion in the regulations may also provide valuable input in developing the innovative assessment, we believe that the current list, as proposed, already includes broad categories to ensure diverse input, such as “educators” and those “representing the interests of children with disabilities, English learners, and other subgroups.”
We note that a State may always consult with additional groups beyond those required in the regulations in developing its innovative assessment system, and we strongly encourage States to ensure meaningful and ongoing engagement with a diverse group of stakeholders. The Department has issued non-regulatory guidance, generally, on conducting effective outreach with stakeholders in implementing the ESSA, with suggestions and examples of best practices for meaningful stakeholder engagement.[3]
We agree that it would be helpful to emphasize that parents of particular subgroups of students, as well as organizations representing these students, must be consulted, and are revising the final regulations accordingly. The State must consider the appropriate services to ensure meaningful communication for parents with limited English proficiency and parents with disabilities.
In addition, we agree that it would be beneficial to add representatives of Indian tribes to the list of required stakeholders, as some LEAs have a high percentage of their student population who are American Indian or Alaska Native, and these LEAs will be expected to implement the innovative assessment by the time the State transitions to statewide use of the innovative assessment system. This requirement is consistent with the new requirement in title I, part A for States to consult with representatives of Tribes prior to submitting a State plan (section 1111(a)(1) of the ESEA), and the new requirement that certain LEAs consult with Tribes prior to submitting a plan or application for covered programs (section 8538 of the ESEA).
Changes: We have added new § 200.105(a)(2)(iv) to require State collaboration with representatives of Indian tribes and § 200.105(a)(2)(v) to specify that parents who are consulted must include parents of children in subgroups described in § 200.105(a)(2)(i) (proposed § 200.77(a)(2)(i)).
Comments: Several commenters suggested that particular groups or individuals be added to the list of entities for which a State submits signatures under the selection criterion demonstrating stakeholder support for innovative assessment demonstration authority in proposed § 200.78(b)(3)(iv). Commenters suggested that disability rights organizations, community-based organizations, and statewide organizations representing superintendents or school board members also be added. Some of these commenters felt that signatures from other stakeholders listed in proposed § 200.78(b)(3)(iv) should be required, believing these organizations' views were considered as less important than groups representing local leaders, administrators, and teachers. Another commenter recommended that we require teacher signatures where local teacher organizations do not exist to ensure that States have support from teachers in the development and implementation of the innovative assessment system.
Discussion: In proposed § 200.78(b)(3), the Department prioritized requiring signatures from those individuals and organizations that are most directly involved in the implementation of innovative assessments at the local level, such as superintendents, school boards, and teacher organizations, as these are the individuals who will be charged (depending on the State's innovative assessment system design) with developing, administering, or scoring the assessments; thus, their input and support are essential to the successful implementation of the innovative assessment system. We agree with commenters that signatures of support from other individuals, however, can be beneficial and note that while the selection criterion in new § 200.106(b)(3)(i)-(ii) (proposed § 200.78(b)(3)(i)-(ii)) specifically references signatures from superintendents and school boards in participating districts, this does not preclude a State from requesting and including signatures and letters of support from State organizations representing superintendents and school boards, as such groups may be included under “other affected stakeholders” as described in new § 200.106(b)(3)(iv) (proposed § 200.78(b)(3)(iv)). Signatures from disability and community-based organizations may also be included under new § 200.106(b)(3)(iv). Moreover, because these signatures are part of the selection criteria, if a State were to include signatures from a wide range of individuals—including those that are not required, but may be included, as described in new § 200.106(b)(3)(iv)—it would strengthen this component of the State's application. In this way, we believe the requirements, as proposed, provide a strong incentive for a State to seek input and support from a diverse group of stakeholders, and organizations representing those stakeholders, in developing its application, without adding burden to the process for States by including additional required signatures from groups who may not be directly involved in implementation of the innovative assessment system. Similarly, while signatures from individual teachers in participating districts could be a powerful demonstration of support from educators in participating districts, we believe such a requirement would add a significant burden for LEAs and SEAs. A State may choose to collect teacher signatures, but we also recognize it may be more efficient and feasible for SEAs and LEAs to collect signatures from organizations that represent teachers.
Changes: None.
Comments: One commenter recommended that the final regulations require ongoing collaboration with stakeholders, including parents and organizations that advocate on behalf of students, in addition to consultation on the development of the innovative assessment system at the time of the State's application as described in proposed § 200.77(a).
Discussion: New § 200.105(d)(3)(iv) (proposed § 200.77(d)(3)(iv)) requires each State to submit an assurance in its application that it will annually report to the Secretary on implementation of its innovative assessment system, including ongoing feedback from teachers, principals, other school leaders, students and parents, and other stakeholders consulted under new § 200.105(a)(2) (proposed § 200.77(a)(2)) from participating schools and LEAs. As States must collect and report on this stakeholder feedback each year, and the Department will use it to inform ongoing technical assistance and monitoring of participating States, we believe no further requirements related to ongoing consultation are necessary.
Changes: None.
Comments: One commenter supported the provisions for States to include the prior experience of external partners as part of the selection criterion in proposed § 200.78(b), but suggested that we revise the final regulations in proposed § 200.78(d) to include community-based organizations so as to emphasize the need for States to partner with external organizations to provide training to staff and to familiarize parents and students with the innovative assessment.
Discussion: SEAs and consortia of SEAs must submit evidence under new § 200.105(a)(1) (proposed § 200.77(a)(1)) of collaboration in developing the innovative assessment system, including collaboration with experts in the planning, development, implementation, and evaluation of innovative assessment systems, many of whom could be part of external partnerships the SEA or consortium has established. We are revising the regulations in new § 200.105(a)(1) to more clearly describe that external partners may be included as collaborators. The commenter is correct that the selection criterion in new § 200.106(b) (proposed § 200.78(b)) provides for States to describe the prior experience of their external partners, if any. Further, we presume the role of external partners in executing a State's plan for demonstration authority will be fully described, if applicable, in each relevant selection criterion, and do not feel it is necessary to explicitly note that a State may work with external partners in each and every area, as we believe States are best positioned to determine the areas in which their work could benefit from external partnerships, based on their innovative assessment system design. A high-quality plan for supporting educators and students, for example, would include sufficient detail on any external partnerships and resources to accomplish this work, if the State has determined such partnerships are necessary.
Changes: We have revised new § 200.105(a)(1) (proposed § 200.77(a)(1)) to clarify that experts in the planning, development, implementation, and evaluation of innovative assessment systems with whom SEAs collaborate to develop the innovative assessment system may include external partners.
Comments: One commenter encouraged the Department and States to engage local school boards in the process to identify participating districts and schools for the innovative assessment pilot.
Discussion: SEAs and consortia of SEAs must consult with school leaders during the application process under new § 200.105(a)(2)(ii) (proposed § 200.77(a)(2)(ii)). The selection criterion provides for SEAs to submit signatures from LEA superintendents and local school boards participating in the demonstration authority, consistent with new § 200.106(b)(3)(i)-(ii) (proposed § 200.78(b)(3)(i)-(ii)), as a showing of support for the innovative assessment demonstration authority. We believe that these requirements and selection criterion provide opportunities for SEAs to speak with local school leaders, including local school boards, about their plans for and support of innovative assessments. These conversations will also be the time for SEAs to discuss district or school participation with local leaders, including school boards. Given these provisions, we do not think further changes to the regulations are necessary.
Changes: None.
200.104 Innovative Assessment Demonstration Authority
General
Comments: Many of the commenters supported the innovative assessment demonstration authority as an opportunity to move toward more innovative and meaningful systems for assessing student learning, beyond traditional multiple choice exams. In particular, some commenters supported the inclusion of performance- and competency-based assessments. One commenter advocated for a regulation that encourages new ways to assess under an existing system (e.g., embedding technology-enhanced items), different strategies to do what current assessments intend to do but fail to do (e.g., assessing higher-order thinking skills), or new ways to assess student competencies beyond what current assessments can do (e.g., assessing in individualized or real world settings).
One commenter appreciated the opportunity to use the advances in assessment to better measure student learning, but asked the Department to ensure that this focus on innovation does not jeopardize assessment rigor and comparability. Multiple commenters felt that the regulations provided appropriate flexibility with protections to ensure that assessments are high-quality, valid, and reliable measurements consistent with the provisions of ESEA.
Discussion: We appreciate commenters' support of the innovative assessment demonstration authority and believe that this authority can enhance State efforts to measure student mastery of challenging State academic standards and will lead to improved academic outcomes for all students. We also agree that it is essential, even as States are piloting more innovative assessments, that all students, including students with the most significant cognitive disabilities, be held to challenging content standards, and that all assessments be of high quality, producing valid, reliable, and comparable determinations of student achievement, except for alternate assessments for students with the most significant cognitive disabilities, as defined by a State under § 200.6(d)(1) and section 1111(b)(2)(D) of the ESEA, who may be assessed with alternate assessments aligned with alternate academic achievement standards consistent with section 1111(b)(1)(E) of the ESEA.
In developing these regulations, we worked carefully to balance the flexibility offered to States under this authority and the need to provide room for innovation with the responsibility to ensure that States continue to meet the requirements of title I of the ESEA. As long as States meet the requirements of title I of the ESEA, they may explore new ways to assess students beyond what is possible with the current assessments.
Changes: None.
Comments: Several commenters expressed general disagreement with providing States innovative assessment demonstration authority, claiming that the authority would not support students or their learning. Other commenters expressed concern that the regulations, as proposed, require too many assurances and documentation, create too many prescriptive requirements, and impede States' ability to create truly innovative assessment systems.
Discussion: The innovative assessment demonstration authority provides flexibility to States to develop and administer a new system of assessments that may include different types of assessments, such as instructionally embedded assessments or performance-based tasks, that provide useful and timely information for educators to guide instruction and identify appropriate instructional supports. Under the demonstration authority, States may develop new innovative assessments that meet the needs of their teachers and that provide better measures for learning. However, section 1204(e)(2)(A)(vi) of the ESEA requires that assessments be developed so that they are accessible to all students, including English learners and students with disabilities; are fair, valid, and reliable; and hold all students to the same high standards.
We disagree that the requirements are unnecessarily burdensome or too prescriptive. Under section 1204 of the ESEA, the demonstration authority is for those States interested in piloting new innovative assessments and administering the innovative assessments in a subset of schools for the purposes of accountability and reporting instead of the statewide assessment, until a State fully scales use of the innovative assessment among all LEAs and schools. If a State wants to create an innovative assessment outside of the demonstration authority while continuing to use the statewide assessment in all schools and LEAs, the State may do so. Section 1204 of the ESEA further establishes the application requirements for States seeking innovative assessment demonstration authority. The regulations clarify and organize those statutory requirements in new §§ 200.105 and 200.106 (proposed §§ 200.77 and 200.78). Given that the demonstration authority is initially limited to seven States, we believe the selection criteria outlined in new § 200.106 will be particularly important in enabling peer reviewers to distinguish high-quality applications consistent with the requirements of the statute. Moreover, section 1601(a) of the ESEA provides that the Secretary “may issue . . . such regulations as are necessary to reasonably ensure that there is compliance” with the law. The Department also has rulemaking authority under section 410 of the General Education Provisions Act (GEPA), 20 U.S.C. 1221e-3, and section 414 of the Department of Education Organization Act (DEOA), 20 U.S.C. 3474. These regulations are necessary and appropriate to assist States in developing new, innovative assessments while maintaining high expectations, validity, and rigor; further, they are consistent with, and specifically intended to ensure compliance with, section 1204 of the ESEA.
Changes: None.
Comments: One commenter suggested the Department ask States to indicate their interest in the innovative assessment demonstration authority when they submit their consolidated State plan. The commenter noted that under this recommendation a State would share its vision for an innovative assessment without submitting a binding application, allowing the Department to provide targeted technical assistance to interested States.
Discussion: Title I, part B is not one of the programs included in the definition of “covered program” in section 8101(11) of the ESEA as it applies to the consolidated State plan. Accordingly, we do not believe it is necessary to include a requirement for States to indicate their interest in the demonstration authority in the consolidated State plan.
Changes: None.
Comments: None.
Discussion: In reviewing the proposed regulations, the Department believes it would be helpful to establish definitions of “participating LEA” and “participating school.” At some points during implementation, States may have both participating and non-participating LEAs and schools, and this change provides clarity about what it means for an LEA or school to be participating in the demonstration authority.
Changes: We have added § 200.104(b)(4) to define a “participating LEA” as an LEA in the State with at least one school participating in the innovative assessment demonstration authority. We also have added § 200.104(b)(5) to define “participating school” as a public school in the State where the innovative assessment system is administered under the innovative assessment demonstration authority instead of the statewide assessment under section 1111(b)(2) of the ESEA and where the results of the school's students on the innovative assessment system are used by its State and LEA for purposes of accountability and reporting under sections 1111(c) and 1111(h) of the ESEA. We have made conforming edits in new §§ 200.105 and 200.106.
Defining Innovative Assessment
Comments: Many commenters requested clarity concerning which parts of the innovative assessment system need to meet the requirements of section 1111(b)(2) of the ESEA. Specifically, commenters asked the Department to be clear that it is the innovative assessment system that must meet the requirements, not each individual innovative assessment. The commenters noted that a grade-level innovative assessment may be comprised of multiple parts, each of which may be a stand-alone assessment (e.g., an interim assessment, a performance-based assessment, or a competency-based assessment), which sum to an annual, summative grade-level determination of how a student performed against the challenging State academic standards. Commenters suggested that individual assessments should not be required to meet the requirements of peer review or section 1111(b)(2) individually.
Discussion: The Department believes there may have been some confusion about the meaning of innovative assessments in the context of an innovative assessment “system.” The Department considers an assessment system to be inclusive of all required assessments under the ESEA, such as the general assessments in all grade levels in reading/language arts, mathematics, and science, and the AA-AAAS. A grade-level innovative assessment, on the other hand, refers to the full suite of items, performance tasks, or other parts that sum to the annual, summative determination.
The Department, through its peer review process, will review the innovative assessment system overall, including a review of documentation and evidence provided for the innovative assessment at each grade level that comprises the innovative assessment system. The provision in new § 200.107(b) (proposed § 200.79(b)), which requires an innovative assessment to meet all of the requirements of section 1111(b)(2) of the ESEA, does not mean that each part of a grade-level innovative assessment (e.g., an interim assessment, a performance-based assessment, a competency-based assessment) must meet those requirements. Accordingly, the Department will not review each part of the grade-level innovative assessment (e.g., a single performance task that makes up part of the State's innovative 4th-grade mathematics test) to ensure that it meets the requirements in § 200.2(b) and, therefore, the peer review will not result in a determination that a single grade-level assessment does or does not meet the requirements of peer review. We do note, however, that, as a component of the peer review, a State must submit grade-specific documentation, such as alignment evidence, test blueprints, or documentation outlining the development of performance tasks or other components, and documentation about the validity of the inferences about the student.
To provide further clarity, we are revising the definition of “innovative assessment system” in new § 200.104(b)(3) (proposed § 200.76(b)(2)) to specify that an “innovative assessment system” produces an annual summative determination of each student's mastery of grade-level content standards aligned to the challenging State academic standards under section 1111(b)(1) of the ESEA, or, in the case of a student with the most significant cognitive disabilities assessed with an AA-AAAS under section 1111(b)(1)(E) of the ESEA and aligned with the State's academic content standards for the grade in which the student is enrolled, an annual summative determination relative to such alternate academic achievement standards for each such student. We also are revising the definition of “innovative assessment system” to specify that an innovative assessment may include, in any required grade or subject, one or more types of assessments, such as cumulative year-end assessments, competency-based assessments, instructionally embedded assessments, interim assessments, or performance-based assessments.
Changes: We have revised the definition of “innovative assessment system” in new § 200.104(b)(3) (proposed § 200.76(b)(2)) to indicate that an innovative assessment system:
- Produces an annual summative determination of each student's mastery of grade-level content standards aligned to the challenging State academic standards under section 1111(b)(1) of the ESEA, or, in the case of a student with the most significant cognitive disabilities assessed with an alternate assessment aligned with alternate academic achievement standards under section 1111(b)(1)(E) of the ESEA and aligned with the State's academic content standards for the grade in which the student is enrolled, an annual summative determination relative to such alternate academic achievement standards for each such student;
- May include any combination of general assessments or alternate assessments aligned to alternate academic achievement standards (AA-AAAS) in reading/language arts, mathematics, or science; and
- May, in any required grade or subject, include one or more types of assessments listed in new § 200.104(b)(3)(ii).
Comments: Two commenters asked the Department to be more explicit in the regulations that the innovative assessment could be an innovative general assessment, an innovative AA-AAAS, or both.
Discussion: As we stated in the preamble of the NPRM, an SEA or consortium of SEAs may propose an innovative general assessment in reading/language arts, mathematics, or science; an innovative AA-AAAS for students with the most significant cognitive disabilities, as defined by a State under section 1111(b)(2)(D) of the ESEA and § 200.6; or both. The definition of “innovative assessment system” in new § 200.104(b)(3) (proposed § 200.76(b)(2)) also specifies that a State's innovative assessment system may include assessments that produce an annual summative determination aligned with alternate academic achievement standards for students with the most significant cognitive disabilities. In such cases, a State's application would demonstrate that an innovative AA-AAAS has or will meet all requirements, including for technical quality, validity, and reliability, that are included under section 1111(b)(2)(B) of the ESEA. We are further revising new § 200.104(b)(3) to clarify that the innovative assessment system may include any combination of general assessments or AA-AAAS in any required grade or subject.
Changes: We have revised new § 200.104(b)(3) (proposed § 200.76(b)(2)) to specify that the innovative assessment system may include any combination of general assessments or AA-AAAS in reading/language arts, mathematics, or science that are administered in at least one required grade under section 1111(b)(2)(B)(v) of the ESEA.
Defining Types of Innovative Assessments
Comments: Multiple commenters asserted that the terms used in proposed § 200.76(b)(2) to define an innovative assessment, such as competency-based assessments, instructionally embedded assessments, and performance-based assessments, are too open to interpretation and may, in fact, limit assessment options. Commenters recommended that proposed § 200.76(b)(2) provide more specific examples, such as essays, research papers, science experiments, and high-level mathematical problems.
Discussion: The definition of “innovative assessment system” in new § 200.104(b)(3) (proposed § 200.76(b)(2)) is consistent with the definition in section 1204(a)(1) of the ESEA. We note that essays, research papers, science experiments, and high-level mathematical problems may be examples of performance-based assessments, competency-based assessments, or instructionally embedded assessments. However, we do not believe it is necessary to provide that level of specificity in the regulations. We think that this kind of detailed clarification can be more effectively provided in non-regulatory guidance.
Changes: None.
Demonstration Authority Period
Comments: Multiple commenters agreed with the proposed regulation as written and believe that a requirement for immediate implementation of the innovative assessment system will ensure that States receiving authority commit time and resources to develop a successful innovative assessment system.
Discussion: We appreciate the support of commenters for innovative assessments and for the timeline for implementation. States only need demonstration authority when they are ready to use the innovative assessment, including for accountability and reporting purposes, in at least one school and at least one required grade or subject instead of the statewide assessment; prior to that, States have discretion to consider and test different innovative models to subsequently propose under this authority.
Changes: None.
Comments: Numerous commenters expressed concern about the requirement that States be ready, upon receiving demonstration authority, to immediately implement a new innovative assessment in at least one school. Commenters believe States may be unwilling or unable to commit time and resources to the development of an innovative assessment system without an assurance that the Department would consider their approach to an innovative assessment system. These commenters suggested the Department consider a two-stage application process in which applicants may receive conditional approval that would allow time for planning prior to administration of the innovative assessment system in at least one school. One commenter noted that this would be an opportunity for States to work directly with the Department and receive feedback and technical assistance.
One commenter stated that, were the Department to consider a conditional approval process, it might risk exceeding the seven-State limitation during the initial demonstration authority period if the Department received more than seven high-quality applications that met all of the application requirements and selection criteria. The commenter proposed a contingency plan to rank the applications in the event that the number of applications exceeded the cap.
Several commenters suggested that this requirement means the Department drafted the proposed rule to accommodate specific States or may favor the participation of specific States. One of these commenters recommended the Department commit to granting demonstration authority so that States may pursue assessment innovation without the burden of sanctions or the threat of losing funds.
Discussion: We recognize that many States need time to develop and implement an innovative assessment system. However, a State does not need demonstration authority to plan for, develop, or pilot an innovative assessment system. The authority is only needed once the State is ready to administer an innovative assessment in at least one school and will administer the innovative assessment in place of the statewide assessment, including for purposes of accountability and reporting under title I, part A.
If the Department were to grant demonstration authority, even on a conditional basis, to seven States in the first year, there would be no additional opportunities for other States to pursue authority until the initial demonstration period ends. The Department is concerned that providing conditional approval to States that are not ready to implement an innovative assessment system in at least one school may, as a result, take an opportunity away from a State that is close to being ready but waits to submit an application to the Department, even though that second State may ultimately be ready to begin implementing its innovative assessment system sooner than the first State. In addition, because we know there is a tremendous amount of work involved in developing an innovative assessment system, we think it is possible that a State with conditional approval may subsequently encounter unanticipated delays, challenges, or the need for substantial redesign. If this were to happen, it could negatively affect the Department's ability to evaluate the initial demonstration authority before determining whether to expand the innovative assessment demonstration authority, as required by section 1204(c)(3) of the ESEA.
We encourage States to consider several options for how they may develop, implement, and scale an innovative assessment. If a State plans to pursue demonstration authority immediately, it might partner with an LEA or school that already has an innovative assessment model in place at the local level, begin piloting that model, and, once the State is granted innovative assessment demonstration authority, use the model for accountability and reporting purposes under the ESEA in that LEA or school, with the intention of moving statewide. Alternatively, a State may choose to start small with a focus on a single grade and content area, like 8th-grade science. If the Department does not receive and grant demonstration authority to seven States in the first year, we anticipate that there will be additional opportunities for States to apply for demonstration authority until seven States have been approved.
Finally, the regulations are not designed to favor the participation of certain States. We will hold all applicants to the same high expectations, outlined in new §§ 200.105 and 200.106 (proposed §§ 200.77 and 200.78), based on external peer review of applications, before granting innovative assessment demonstration authority.
Changes: None.
Comments: Several commenters objected to proposed § 200.76(b)(1), which would require States to use the innovative assessment system for purposes of accountability during the demonstration authority period. These commenters cited section 1204(h) of the ESEA, which provides that States may use the innovative assessment system for accountability during the demonstration authority period. The commenters believed that requiring immediate use for accountability would limit innovation and may discourage States from applying until they are ready.
Discussion: Schools and LEAs in a State that are participating in an innovative assessment must continue to be included in the State's accountability system to ensure transparency to educators, parents, and the public about school performance. Section 1204(e)(2)(C)(iii) requires an SEA's plan for innovative assessment demonstration authority to include a description of how the SEA will hold all participating schools accountable for meeting the State's expectations for student achievement. The manner in which an SEA holds schools accountable for meeting the State's expectations for student achievement is through the statewide accountability system under section 1111(c) of the ESEA. A State may elect, pursuant to section 1204(e)(2)(B)(i) of the ESEA, to use the statewide academic assessments required under section 1111(b)(2) of the ESEA in the participating schools and participating LEAs for accountability purposes while piloting the innovative assessment system. In the alternative, the State may use its innovative assessments, instead of the statewide academic assessments, in reading/language arts, mathematics, or science for accountability purposes under the demonstration authority if the innovative assessment meets all of the statutory requirements.
If a State does not wish to use an innovative assessment for accountability and reporting purposes, it does not need demonstration authority to pilot its innovative assessments. Only those States that wish to use the innovative assessment in place of the statewide assessment, including for the purposes of accountability and reporting under title I, part A, in at least one school, require innovative assessment demonstration authority.
Changes: None.
Comments: Several commenters strongly supported the option in proposed § 200.77(b)(1) for SEAs to use the statewide academic assessments for accountability should they choose not to use the innovative assessments for such purposes.
Discussion: We appreciate the commenters' support.
Changes: None.
Community of Practice
Comments: Multiple commenters expressed support for a process that encourages States to undergo careful planning, gather technical expertise, and engage stakeholders before piloting an innovative assessment. One commenter supported the idea of having a community of practice to provide feedback and support to States in their planning for an innovative assessment system. However, the commenter noted that the lack of funding for the community of practice does not indicate a high level of support for States in the development of an innovative assessment system.
Discussion: We appreciate the support of commenters for planning time and a community of practice that provides technical assistance in the planning and development of an innovative assessment system. We agree that a community of practice would give States that are not yet ready to apply for demonstration authority an opportunity to work together, and with the Department and experts in assessment and accountability, to share information on challenges faced, lessons learned, and promising and best practices that support continuous learning and strengthen student assessments. The Department will strive to work collaboratively with States and other interested parties to provide technical assistance and support to all interested States.
Changes: None.
Peer Review of Applications
Comments: Commenters recommended that teachers be included in the list of peer reviewers on the basis that teachers have experience developing and implementing innovative item types and may be implementing the innovative assessment systems that will be under consideration in peer review. In addition, commenters suggested that principals and parents also be considered as peer reviewers.
Discussion: We agree with commenters that educators, including teachers and principals, should be considered as external peer reviewers. The experience of principals and teachers, especially of those already implementing innovative assessments in their schools and classrooms, is valuable in the peer review process to evaluate the strength of the application and its supporting evidence. In new § 200.104(c)(2) (proposed § 200.76(c)(2)), the Department specifies that peer review teams will consist of individuals with expertise in developing and implementing innovative assessments, such as psychometricians, researchers, State and local assessment directors, and educators—which includes teachers and principals. Therefore, this is already addressed in the regulations.
We do not agree that parents in general should be added to the list of peer reviewers in new § 200.104(c)(2). The very technical nature of these reviews requires that peer reviewers have the experience and expertise to evaluate an SEA's application, with an emphasis on knowledge of and experience with the development and implementation of innovative assessments and assessment technical requirements such as test design, comparability, and accessibility. Certainly, if a parent meets these requirements, including the level of expertise expected in the development and implementation of innovative assessments, that person could be considered to serve as a peer reviewer for the innovative assessment demonstration authority.
Changes: None.
Comments: One commenter recommended that tribal representatives be included in the list of peer reviewers of State applications for demonstration authority.
Discussion: As stated above, peer reviewers will be selected based on the individual's experience and expertise, with an emphasis on knowledge of and experience with the development and implementation of innovative assessments. Peer reviewers may also be individuals with past experience developing innovative assessment systems that support all students, including English learners, children with disabilities, and disadvantaged students (ESEA section 1204(f)(2)). Prior to selecting peer reviewers, the Department will publish a notice seeking peer reviewers and will reach out to a wide variety of stakeholders with such experience. We encourage tribal representatives with experience and expertise in the development and implementation of innovative assessments to apply to be peer reviewers.
Changes: None.
Granting Demonstration Authority
Comments: Commenters expressed concern that proposed § 200.76(d), which stated that the Secretary may award demonstration authority to “at least one” State, suggests that the Secretary might reject eligible applicants or limit the pilot to fewer States than the seven-State limit set forth in the statute during the initial demonstration period. Commenters asked that § 200.76(d), and other sections of the regulations, as appropriate, be changed to clarify that any State that meets the eligibility criteria will receive demonstration authority, not to exceed the seven-State limit.
Discussion: We intended new § 200.104(d) (proposed § 200.76(d)) to provide that the initial demonstration period is the three years beginning with the first year in which the Secretary awards at least one State or consortium demonstration authority under section 1204 of the ESEA. This is important to clarify because, during the initial demonstration authority period, the Secretary may not grant demonstration authority to more than seven States, including States participating in a consortium. We do not believe additional clarification is needed in the regulation as the Department references “at least one State” to indicate when the initial demonstration authority period begins (i.e., it is when at least one State is granted the authority and begins implementing in at least one school, not when a full cadre of seven States has been granted the authority).
Each State that applies for the demonstration authority will undergo peer review, as identified in the statute and regulations. The peers will review the strength of the State's application and evidence against the application requirements and selection criteria before providing recommendations to the Secretary.
Changes: None.
Developing Innovative Assessments
Comments: One commenter recommended that the Department include a requirement that SEAs or consortia of SEAs use competitive bidding to identify and select developers for innovative assessments under the innovative assessment demonstration authority. The commenter asserted that such a requirement would ensure that SEAs or consortia of SEAs consider the expertise of a wide range of entities experienced in the design and development of assessments, including the types of assessments likely to be included as part of an innovative assessment system. Finally, the commenter noted that this requirement would not be burdensome as many State procurement laws specifically require this type of process.
Discussion: We believe it is important that each SEA or consortium of SEAs consider the expertise and experience of both LEAs within the State and any external entities that will be supporting the development and implementation of innovative assessments. As noted by the commenter, many State procurement laws already govern the process that States must use to identify and select external partners. We do not believe it is necessary or within the scope of these regulations for the Department to require specific procurement processes. Therefore, the Department declines to include additional requirements.
Changes: None.
Consortia
Comments: One commenter recommended that tribes be allowed to apply for innovative assessment demonstration authority, and that tribes be allowed to participate in a consortium of SEAs without counting against the four-State limitation on consortium membership. The commenter also requested that tribes be considered and included in State innovative assessment pilots.
Discussion: Under section 1204 of the ESEA, the Secretary may provide an SEA, or a consortium of SEAs, innovative assessment demonstration authority. An SEA is defined as “the agency primarily responsible for the State supervision of public elementary schools and secondary schools” (section 8101(49) of the ESEA), and “State” is defined for purposes of title I, part B as the 50 States, the District of Columbia, and the Commonwealth of Puerto Rico (section 1203(c) of the ESEA). The law does not provide for separate eligibility for tribes so we are unable to make that change in these regulations. We note that these regulations only govern States and their school districts, and not schools funded by the Bureau of Indian Education (BIE) or by tribes. We also note, however, that title I, part B does provide a specific set-aside of funds for the BIE for assessments (section 1203(a)(1) of the ESEA), and nothing in the law prohibits those funds from being distributed to tribes for the development of assessments.
Many State-funded public school districts serve substantial populations of American Indian/Alaska Native students, and some individual State-funded public schools are operated by a tribe (as in the case of some charter schools); such districts and schools in a State granted the demonstration authority would be eligible to participate in the innovative assessment system. We agree that, in such States, collaboration with tribal communities is essential. Therefore, we strongly encourage interested States to work closely with any tribes located in their State when developing and administering innovative assessments. To prioritize this collaboration, and as previously described, we are requiring, in new § 200.105(a)(2) (proposed § 200.77(a)(2)), State collaboration with representatives of Indian tribes located in the State in the development of the innovative assessment.
Changes: None.
Comments: One commenter appreciated the allowance in proposed § 200.76(d)(2), which provides that an SEA that is affiliated with a consortium but not planning on using its innovative assessment under the demonstration authority would not count toward the four-State limit on consortium size. The commenter believed that this would create an opportunity for some States to receive technical assistance and additional time for planning prior to implementation of an innovative assessment system. The commenter suggested the final regulations include information about how affiliate members transition to become full, participating members in a consortium, including requiring these members to receive approval through the Department's peer review process before implementing innovative assessment systems for accountability purposes.
Discussion: An SEA may be affiliated with a consortium in order to participate in the planning and development of the innovative assessment, but is not considered a full member of the consortium unless the SEA is using the innovative assessment system in at least one LEA for the purposes of accountability and reporting under title I, part A of the ESEA instead of the statewide assessment. Affiliate members do not need to be included in the application for demonstration authority, nor do they count toward the four-State limitation on consortium size. The Department believes that it is the responsibility of the consortium of States and the affiliate State to determine when the affiliate State is ready to transition to full membership in the consortium and begin using the innovative assessment system, consistent with the innovative assessment demonstration authority requirements. At that point, the consortium, in partnership with the State seeking to transition from affiliated to full-member status, must apply for and receive authority from the Secretary to use the innovative assessment system for accountability and reporting purposes in place of the statewide assessment system in participating LEAs.
The Department believes it would be helpful to establish a definition of “affiliate member of a consortium.” A consortium of States may have both full members and affiliate members, and we believe it is necessary to clarify that a State is not a full member of a consortium unless it is proposing to use the consortium's innovative assessment system. In addition, we agree with commenters that it is necessary to provide detail on how an affiliate member of a consortium becomes a full member with authority to administer the consortium's innovative assessment system under demonstration authority.
Changes: We have added § 200.104(b)(1) to include a definition of “affiliate member of a consortium” to be an SEA that is formally associated with a consortium of SEAs that is implementing the innovative assessment demonstration authority, but is not yet a full member of the consortium because it is not proposing to use the consortium's innovative assessment system under the demonstration authority. We have made corresponding edits to new § 200.105(f)(1)(i) (proposed § 200.77(f)(1)(i)). We also have added § 200.105(f)(2) to clarify that the consortium must submit a revised application to the Secretary in order for an affiliate member to become a full member of the consortium and use the consortium's innovative assessment system under the demonstration authority.
200.105 Demonstration Authority Application Requirements
General
Comments: One commenter suggested that the innovative assessment system incorporate expanded learning time or other strategies that emphasize out-of-school time as part of a coordinated effort to provide students the opportunity to demonstrate mastery anytime, anywhere, including by adding new requirements for SEAs and consortia of SEAs throughout proposed §§ 200.77(b) and 200.78(a) to incorporate after-school and expanded learning time programs.
Discussion: This regulation is intended to support States as they apply for and implement innovative assessment demonstration authority under section 1204 of the ESEA, which includes the development and expansion of an innovative assessment system that can, at the conclusion of the demonstration authority period, meet requirements for statewide assessment and accountability systems under title I, part A. As there are no requirements regarding instructional programming or learning opportunities for students outside of the school day related to assessments and accountability systems under title I, part A, nor in section 1204 of the ESEA, we believe that decisions related to how extended learning time may support implementation of the innovative assessment system are best left to SEAs and LEAs.
Changes: None.
Comments: None.
Discussion: The Department believes it would be helpful to States interested in innovative assessment demonstration authority for the regulations to reiterate the statutory requirement in section 1204(e) of the ESEA that an SEA or consortium's application for demonstration authority must be submitted to the Secretary “at such time” and “in such manner” as the Secretary reasonably requires. Given that the innovative assessment demonstration authority is a new flexibility permitted under the ESEA, and that commenters, as previously described, and stakeholders have asked questions and requested greater specificity on the application process, we believe this revision would better align the final regulations to the statute and provide further clarity for States, LEAs, and interested stakeholders.
Changes: We have added to the introductory paragraph of new § 200.105 (proposed § 200.77) to clarify that applications for innovative assessment demonstration authority must be submitted to the Secretary at such time and in such manner as the Secretary may reasonably require.
Comments: None.
Discussion: In reviewing the proposed regulations, the Department believes that adding to new § 200.104(c)(2) (proposed § 200.76(c)(2)) a statement that the external peer review process will evaluate how the SEA's application “meets or will meet” each of the requirements in new § 200.105 will improve consistency with the application requirements in new § 200.105(b) (proposed § 200.77(b)), which require each application to demonstrate how the innovative assessment system does or will meet certain requirements for alignment, validity, reliability, and quality.
Changes: We have added to new § 200.104(c)(2) (proposed § 200.76(c)(2)) to specify that the peer review of SEA applications will be used to determine whether an application “meets or will meet” each of the requirements in § 200.105.
Comments: None.
Discussion: We further believe it is necessary to clarify certain application requirements pertaining to the assurances a State must include relating to annual reporting of information on the demonstration authority. First, we believe it would be helpful to clarify in new § 200.105(d)(3) (proposed § 200.77(d)(3)) that States must provide this information at such time and in such manner as the Secretary may reasonably require—which is consistent with the requirement in new § 200.104(c) for the submission of applications. Second, because new schools within participating LEAs and new LEAs may join the demonstration authority annually, we believe it would be helpful to clarify in new § 200.105(e)(2) (proposed § 200.77(e)(2)) that LEAs must annually assure they will follow all requirements in § 200.105 and to add to new § 200.105(d)(3)(i)(B) (proposed § 200.77(d)(3)(i)(B)) that the State must include these updated assurances in its annual reporting to the Secretary. Finally, in order to ensure consistent reporting between participating and non-participating schools, we believe States should annually report data on student achievement on the innovative assessment system to the Secretary in a way that is consistent with requirements for State and LEA report cards required under section 1111(h) of the ESEA, which includes reporting on student achievement and progress toward meeting long-term goals. We are revising § 200.105(d)(3)(ii) accordingly.
Changes: We have added to new § 200.105(d)(3) (proposed § 200.77(d)(3)) to specify that annual reporting is required at such time and in such manner as the Secretary may reasonably require. We have further added to new §§ 200.105(d)(3)(i)(B) and 200.105(e)(2) (proposed § 200.77(e)(2)) to require States to include updated assurances from each participating LEA annually that the participating LEA will meet all requirements in new § 200.105. Finally, we have added to new § 200.105(d)(3)(ii) to specify that reporting on the performance of all students in participating schools must be consistent with reporting student achievement and participation data on State and LEA report cards under section 1111(h) of the ESEA.
Innovative Assessment Design and Alignment
Comments: One commenter expressed support for proposed § 200.77(b)(1), which would allow States flexibility in selecting specific grades or subject areas to administer innovative assessments, rather than assessments in all required grades or subject areas.
Discussion: We appreciate the support for providing flexibility for States to propose an innovative assessment system in any, or all, required grades and subjects under section 1111(b)(2)(B)(v) of the ESEA, as it enables States to implement the innovative assessment demonstration authority at a scope that meets their needs and priorities.
Changes: None.
Comments: A few commenters encouraged the Department to clarify in proposed § 200.77(b)(1) that the innovative assessment must be administered to all students and all student subgroups within participating schools, believing that it is critical to emphasize that all students in each school are expected to participate in the innovative assessment.
Discussion: We agree with commenters that it is important for all students, including all students within particular subgroups, to be administered the innovative assessment in each participating school, and the intent of proposed § 200.77(b)(1) was to require all students in each participating school to take the innovative assessment, if an innovative assessment was developed for a subject or grade in which they were enrolled under the demonstration authority. Given the concerns of the commenters, we are revising the regulations to more clearly state that all students in each participating school must take the innovative assessment in each grade and subject in which an innovative assessment is being piloted. However, we note that, taken together, final § 200.105(b)(1)(i) and (ii) (proposed § 200.77(b)(1)(i) and (ii)) do not require States to develop an innovative AA-AAAS for students with the most significant cognitive disabilities for each innovative general assessment; a State only developing an innovative general assessment would be required to continue administering its statewide AA-AAAS to students with the most significant cognitive disabilities, consistent with applicable statutory and regulatory requirements under title I, part A. All children with disabilities who are not eligible for the AA-AAAS and who are enrolled in a participating school in a grade and subject for which the State has an innovative assessment should participate in the innovative assessment.
Changes: We have added to new § 200.105(b)(1)(i) (proposed § 200.77(b)(1)(i)) to clarify that the innovative assessment must be administered to all students in a subset of participating LEAs or a subset of participating schools within a participating LEA.
Comments: One commenter recommended that proposed § 200.77(b)(1)(i), which exempts States from administering the same assessment to all elementary and secondary students in the State once it has been granted demonstration authority, be clarified, as it suggests States may simultaneously pilot multiple innovative assessments even within the same grade or content area. If that was the Department's intent, the commenter suggested that multiple innovative assessments should each meet all applicable regulatory requirements.
Discussion: We appreciate the commenter's suggestion for clarification in this area. The Department intends for the demonstration authority to be used to pilot a single innovative assessment system, which—if successful—will replace the current statewide assessment. It was not meant to allow for a State to try out multiple different innovative assessment systems simultaneously; accordingly, we are adding to new § 200.105(b)(1)(i) (proposed § 200.77(b)(1)(i)) to clarify that a State with demonstration authority may implement a single innovative assessment system, rather than “innovative assessments,” and that the requirement to administer the same assessment to all public school students in the State does not apply during the demonstration authority period, extension period, or waiver period, but does apply once the innovative assessment system is used statewide consistent with new § 200.107 (proposed § 200.79).
Changes: We have added to new § 200.105(b)(1)(i) (proposed § 200.77(b)(1)(i)) to specify that a State with demonstration authority may implement an “innovative assessment system” initially in a subset of LEAs, or a subset of schools within an LEA, during the demonstration authority period, extension period, or waiver period, but must administer the same assessment to all public school students upon transition to statewide use consistent with new § 200.107 (proposed § 200.79).
Comments: One commenter suggested that proposed § 200.77(b)(2) be modified to more clearly specify that all innovative assessments, including an innovative AA-AAAS for students with the most significant cognitive disabilities, align with challenging academic content standards for the grade in which the student is enrolled, similar to proposed requirements for statewide assessments under part A of title I of the ESEA.
Discussion: The regulations in new § 200.105(b)(1) (proposed § 200.77(b)(1)) require that the innovative assessment system meet the requirements of section 1111(b)(2)(B) of the ESEA, including demonstrating that it is aligned with the challenging State academic standards and provides information about student attainment of such standards and whether the student is performing at the student's grade level. The requirement in new § 200.105(b)(2)(i) (proposed § 200.77(b)(2)) applies to any innovative assessment developed under the demonstration authority, including an innovative AA-AAAS for students with the most significant cognitive disabilities.
We agree with the commenter that it is critical for requirements related to alignment of assessments with academic content standards to be the same for the innovative assessment demonstration authority under part B of title I as they are for statewide assessments under part A of title I; like statewide assessments, all innovative assessments must be aligned with the breadth and depth of the challenging State academic content standards. To improve consistency between these regulations and requirements for State assessment systems under title I, part A and to reiterate uniform expectations for alignment, we are revising these regulations by adding “challenging” to the reference to the State's academic content standards and removing “full” as a modifier of the depth and breadth of the State academic content standards. We also agree with commenters that it would be helpful to clarify that these standards apply to the grade in which a student is enrolled, which also improves alignment of these requirements with those in section 1111(b)(2)(B) of the ESEA.
Changes: We have added § 200.105(b)(2)(i) to clarify that the innovative assessment must align to the challenging State academic content standards under section 1111(b)(1) of the ESEA, including their depth and breadth, for the grade in which a student is enrolled.
Comments: One commenter appreciated the clarification and the flexibility in the proposed regulations to allow implementation of the innovative assessment pilot in a subset of LEAs or schools in one or more LEAs. Another commenter, however, objected to this flexibility, believing that participating LEAs should be required to administer the same assessment in all schools in the LEA each year. The commenter was concerned the requirement would set a precedent for incomparable assessment results and different expectations among schools in a single school district.
Discussion: We appreciate commenters' feedback, but continue to believe that it is helpful to provide States and LEAs with flexibility to determine whether it is best to pilot the innovative assessment system in all schools within an LEA in the same year, or whether an LEA would be able to better support high-quality implementation if it has multiple years to expand the pilot within the LEA to all schools. In particular, we believe this flexibility will benefit especially large LEAs that will need to support hundreds of schools in implementing a new—and potentially quite different—system, which will require shifts in instruction, new professional development, and other significant investments of time and resources.
Further, we believe that the statutory and regulatory requirements that ensure valid, reliable, and comparable annual summative determinations, based on the State's academic standards, between the innovative assessment system and the statewide assessment, particularly in new § 200.105(b)(2)-(4), allay the commenter's concern that this flexibility will result in incomparable data and disparate expectations for students in participating and non-participating schools. To that end, we are adding to new § 200.105(b)(3) (proposed § 200.77(b)(3)) to clarify that the innovative assessment system must express student results “consistent with” the “challenging” State academic achievement standards; we are making these changes given that, as proposed, the provision to express results “in terms consistent with” the State's academic achievement standards could have been misinterpreted to only require that the same labels be used to describe student achievement on the innovative assessment as are used to describe student achievement on the statewide assessment—even if those labels carried very different meaning in terms of students' mastery of the challenging State academic achievement standards. We believe that removing “in terms” and adding “challenging” to new § 200.105(b)(3) helps clarify that the academic achievement standards must be consistent and comparable between the innovative and statewide assessment systems. This requirement is also reiterated in new § 200.105(b)(4)(ii), as discussed in response to comments on comparability of the two assessment systems.
Changes: We have added to new § 200.105(b)(3) (proposed § 200.77(b)(3)) to clarify that the innovative assessment system must express student results or competencies “consistent with” the “challenging” State academic achievement standards.
Comments: One commenter suggested the Department require SEAs to include demographically diverse LEAs or schools in the innovative assessment pilot from the very beginning of the demonstration authority period, as opposed to the requirement in the proposed regulations under which SEAs must ensure they are moving toward including demographically diverse LEAs over the course of the demonstration authority. The commenter pointed out that the inclusion of different types of LEAs from the outset, such as urban, suburban, and rural LEAs, will ensure that SEAs understand the needs of different types of districts and schools as they implement an innovative assessment system. Another commenter supported the intent of proposed §§ 200.77(d)(3)(ii) and 200.78(a)(3)(iii), but suggested the final rule strengthen the selection criterion so that a State must use the demographic composition of its public school students, rather than its initially participating LEAs, as the baseline to measure progress toward a more demographically representative subset of schools participating in the innovative assessment system.
Discussion: The Department shares a commitment to ensuring that SEAs include demographically diverse LEAs and schools in their innovative assessment systems over time, but we continue to believe that it is necessary to provide States with reasonable flexibility in how they scale their innovative assessment system statewide during the demonstration authority period. While it is critically important for States to implement and pilot their new assessment systems in demographically diverse LEAs and schools as soon as possible in order to make sure the assessment system is viable and effective in a wide range of contexts, requiring implementation in demographically representative LEAs and schools in the first year could result in rushed implementation in LEAs and schools that are not fully prepared for the significant changes an innovative assessment system may require. With gradual implementation, SEAs may be better able to recruit districts and schools that are willing and prepared to try the innovative assessment system first, which can serve as proof points for other districts and help set the entire State and its schools up for success. Nonetheless, all participating States must demonstrate in their application under new § 200.105(b)(5) (proposed § 200.77(b)(5)) that the innovative assessment system will provide for the participation of, and be accessible to, all students, including children with disabilities and English learners, and provide appropriate accommodations consistent with section 1111(b)(2) of the ESEA.
Further, we believe that States will be most likely to succeed in scaling their innovative assessment if they can develop rigorous criteria for determining when to add new LEAs or schools, with a plan that includes annual benchmarks, as described in new § 200.106(a)(3)(iii) (proposed § 200.78(a)(3)(iii)), to achieve implementation in demographically diverse settings over time. We are, however, revising new § 200.106(a)(3)(iii) to clarify that the benchmarks are intended to achieve high-quality and consistent implementation across all participating schools that are similar demographically to the State as a whole during the demonstration authority period, using the demographics of participating schools as the baseline. Our intent in specifying that the demographics of initially participating schools must serve as the baseline in setting these benchmarks is to signal that the demographics of initial participants, which may be a subset of schools within an LEA, are the starting point—while the demographics of all students and schools in the State serve as the end point for these benchmarks.
Changes: We have added to new § 200.106(a)(3)(iii) (proposed § 200.78(a)(3)(iii)) to clarify that the baseline for setting annual benchmarks toward high-quality and consistent implementation across schools that are demographically similar to the State as a whole is the demographics of participating schools, not LEAs.
Comments: One commenter requested that the Department require innovative assessments to include items and tasks that are the same across all participating LEAs and schools. The commenter argued that administering identical assessments is a critical equity lever to ensure that all students are receiving rigorous instruction, and that schools are being held accountable for the performance of all students on high-quality assessments.
Discussion: Under new § 200.105(b)(1) (proposed § 200.77(b)(1)), the innovative assessments included within a State's innovative assessment system under the demonstration authority must meet the requirements of section 1111(b)(2)(B) of the ESEA. As section 1111(b)(2)(B) and corresponding regulations do not require a State to use the same items or tasks on an assessment administered statewide under part A of title I and allow for multiple forms of the statewide assessment, we believe it would be inappropriate, and counter to the purpose of encouraging assessment innovation and flexibility, to include such a requirement for assessments developed under the innovative assessment demonstration authority. In addition, we note that the requirements for valid, reliable, and comparable annual summative determinations, based on the State's academic standards, between the innovative assessment system and the statewide assessment, particularly as set forth in new § 200.105(b)(2)-(4) (proposed § 200.77(b)(2)-(4)), help ensure that accountability and data reporting will be consistent between participating and non-participating schools and help to protect equitable expectations for all students.
Changes: None.
Comments: A few commenters recommended that the regulations explicitly require that a State be able to calculate student growth from its innovative assessment system. Another commenter suggested that the peer review process should be used to make a determination on whether the innovative assessment system may be used to calculate student growth.
Discussion: The Department appreciates the commenters' views on the use of innovative assessments to estimate student growth, and encourages States to strongly consider if it will be beneficial for the innovative assessment to measure student growth when designing the system. However, the Department believes it is more consistent with both the requirements for State assessments under section 1111(b)(2)(B)(vi) of the ESEA, and the prohibition in section 1111(e)(1)(B)(iii)(III) of the ESEA, for the innovative assessment demonstration authority not to include a requirement for innovative assessments to measure student growth or for peer reviewers to make a determination of whether the innovative assessment system may be used to measure student growth.
Changes: None.
Comparability
Comments: Several commenters supported the requirement in proposed § 200.77(b)(4) that States demonstrate comparability of the innovative assessment results to the statewide academic assessment. One commenter, while providing general support for the requirement, also encouraged the Department to avoid adding burden with overly prescriptive requirements for comparability and for the design and implementation of an innovative assessment system. Another commenter did not agree with the requirement that the innovative assessment must provide comparable, valid, and reliable results to the statewide assessment.
Discussion: The Department agrees that comparability is key to the development of a valid and reliable innovative assessment system that meets the statutory requirements for innovative assessment demonstration authority. Additionally, the Department solicited feedback from the public during the notice and comment period of the NPRM to gather additional ideas on how the Department can ensure comparability between existing statewide assessments and innovative assessments a State may pilot. Section 1204(e)(2)(A)(iv) of the ESEA requires that a State's innovative assessment system generate “results that are valid and reliable, and comparable, for all students and for each subgroup of students” compared to the results for those students on the statewide assessment under title I, part A. Section 1601(a) of the ESEA provides that the Secretary “may issue . . . such regulations as are necessary to reasonably ensure that there is compliance” with the law. The Department also has rulemaking authority under section 410 of the GEPA, 20 U.S.C. 1221e-3, and section 414 of the DEOA, 20 U.S.C. 3474.
We firmly believe that the requirements for comparability are necessary to reasonably ensure that States meet the requirement in section 1204(e)(2)(A)(iv) as well as other statutory requirements under section 1204(e)(2)(A)(xi) of the ESEA, such as the requirement “to validly and reliably aggregate data from the innovative assessment system” for purposes of school accountability and data reporting under title I, part A. Thus, these regulations are consistent with, and specifically intended to ensure compliance with, section 1204 of the ESEA.
The Department acknowledges that the requirements for comparability for innovative assessment systems are rigorous in these regulations, but believes they are reasonable because setting clear expectations for comparability will lead to stronger evidence of validity and reliability from States. While the Department appreciates the need to allow States flexibility in designing innovative assessments, this flexibility must be balanced with the imperative that States meet all of the statutory provisions and ensure their innovative assessment systems are valid, reliable, fair, and of high quality. In addition, by providing multiple paths to demonstrating comparability, including a State-determined method, we believe we are providing sufficient flexibility to States in how they may demonstrate comparability.
Changes: None.
Comments: One commenter urged the Department to ensure that the comparability requirements in proposed § 200.77(b)(4) provide for the evaluation of new innovative assessments in terms of their ability to allow for the comparison of student performance against the challenging State academic standards across districts and among subgroups of students.
Discussion: The Department agrees that it is important to establish comparability of student performance on the innovative assessment system with the statewide assessment, and believes the regulations sufficiently address the commenter's concern. New § 200.105(b)(2)-(3) (proposed § 200.77(b)(2)-(3)) requires the innovative assessment system to be aligned with the same academic content and achievement standards with which the statewide assessment is aligned, and as previously described, we are revising new § 200.105(b)(2)-(3) to further clarify these expectations. In addition, new § 200.105(b)(4)(i) (proposed § 200.77(b)(4)) will ensure that States plan, as described further in the selection criterion related to evaluation and continuous improvement in new § 200.106(e) (proposed § 200.78(e)), for how they will demonstrate that the annual summative determinations for students (which are based on the challenging State academic standards) are comparable between the two assessment systems, including for all students and for each subgroup of students under section 1111(b)(2)(B)(xi) of the ESEA.
Changes: None.
Comments: Many commenters requested that the Department make explicit that the requirement for comparability is based on the annual summative determinations of student proficiency on the innovative assessment as compared to the results (i.e., the academic achievement levels) on the statewide assessment.
Discussion: The Department agrees with these commenters that comparability of the innovative assessment to the statewide assessment should be based on annual summative determinations of student proficiency on the innovative assessment system. While the two assessment systems must be aligned to the same challenging State academic content and achievement standards and produce student results that are valid, reliable, and comparable—as described in section 1204(e)(2)(A)(ii)-(iv) of the ESEA—we did not intend to imply that the raw scores or scale score levels must be directly comparable, and we are adding to new § 200.105(b)(4)(i) (proposed § 200.77(b)(4)) to clarify that the requirement for comparability between the two assessment systems is based on results, including annual summative determinations, generated for all students and for each subgroup of students.
Changes: We have added to new § 200.105(b)(4)(i) (proposed § 200.77(b)(4)) to clarify that determinations of the comparability between the innovative and statewide assessment systems must be based on results, including the annual summative determinations, as defined in new § 200.105(b)(7) (proposed § 200.77(b)(7)), that are generated for all students and for each subgroup of students and have made a conforming change to new § 200.106(b)(1)(ii)(C) (proposed § 200.78(b)(1)(ii)(C)).
Comments: A number of commenters urged the Department not to define comparability so narrowly that it would stifle innovation and generally advised the Department not to list specific methodologies for establishing comparability in regulation, but instead provide examples of various approaches in non-regulatory guidance. These commenters also recommended that the Department allow a State to develop an evaluation methodology for establishing comparability that is consistent with the design and context of its innovative assessment system. Similarly, some commenters advised that States should consider multiple approaches to comparability evaluations to provide a more complete picture of the degree of comparability.
Discussion: The Department agrees with commenters that States may need flexibility in establishing the comparability of their innovative assessment system with their statewide assessment system, and that it is important for a State to select a comparability methodology that is best aligned with the design and context of its innovative assessment system. To support these goals, new § 200.105(b)(4)(i)(E) (proposed § 200.77(b)(4)(iv)) allows for a State-designed comparability methodology should the State not wish to pursue one of the other four methods in the regulations; States may propose an alternate methodology that provides for an equally rigorous and statistically valid comparison between student performance on the innovative assessment and the statewide assessment.
However, we also believe that demonstrating comparability between the two assessment systems, as required by section 1204(e)(2)(A)(iv) of the ESEA, is a critical safeguard for fairness and equity during the demonstration authority period, when both assessment systems will be in use throughout the State for school accountability and data reporting purposes under title I, part A for a period of five years or more. If the data from the innovative assessment system are not comparable to the statewide assessment during this time, the integrity and validity of the school accountability system will be jeopardized; schools and students requiring additional supports may go unidentified and not receive the extra resources they deserve; and parents, educators, and community members will lack transparent and clear data about student performance. Because the comparability requirement is paramount to consistently measuring student progress against the challenging State academic standards throughout the State, and recognizing that demonstrating comparability may be technically challenging for States, the regulations include examples of four methods a State may use to demonstrate comparability, in addition to providing the option for a State-designed methodology. We believe providing these examples in the regulations, which were developed based on public comment and recommendations from researchers, assessment experts, States, and other stakeholders, will be helpful to States interested in the demonstration authority for several reasons. Having these examples in the regulation will help States evaluate and adopt rigorous and well-established methods to meet the statutory requirement for comparable assessment systems; will support States in immediate planning for the activities and strategies that will be part of an innovative assessment pilot prior to the release of any Notice Inviting Applications (NIA), peer review guidance, or additional non-regulatory guidance; and will provide context and a helpful comparison if States decide to pursue their own State-designed method to demonstrate comparability. Because a State-designed method for demonstrating comparability between the two assessments is also permitted, we believe the regulations balance the requirement that States must sufficiently demonstrate comparability, as described in section 1204(e)(2)(A)(iv) of the ESEA, with the desire to provide States with flexibility and promote innovation in designing innovative assessment systems.
Changes: None.
Comments: Several commenters provided technical advice to the Department regarding the methodologies for demonstrating comparability. These commenters urged the Department to make judgments on the strength of the theory and evidence provided by States to support comparability for each innovative assessment system and avoid an overly prescriptive approach, offering a detailed list of considerations and decision points States could use in selecting a comparability method. Finally, while agreeing with the technical soundness of the methodologies provided in the regulations, these commenters described a dozen specific research approaches for evaluating comparability under proposed § 200.77(b)(4), such as propensity score matching. These commenters encouraged the Department to not include any specific methodologies in regulation but provide a multitude of methodologies in guidance.
Discussion: The Department appreciates these commenters' analysis and recommendations, but as previously discussed, continues to believe that new § 200.105(b)(4)(i) (proposed § 200.77(b)(4)) should include examples of methods that we believe a State could use in order to meet the requirement in section 1204(e)(2)(A)(iv) of the ESEA to generate results that are valid, reliable, and comparable between the two assessment systems—including a State-designed methodology—as a way to help States develop strong proposals and to clarify what the expectations of the peer reviewers will be, among other reasons. These examples were not intended to be the only methodologies the Department would consider for a State to demonstrate comparability. The Department agrees that there are a number of technically sound methodologies that, if well-designed, could support a State's demonstration of comparability for its innovative assessment system beyond those specified in new § 200.105(b)(4)(i)(A)-(D) (proposed § 200.77(b)(4)(i) through (iii)) and provide for an equally rigorous and statistically valid comparison. Further, we note that several of the specific suggestions (e.g., propensity score matching) from the commenters could be used to evaluate comparability as part of any of the methods included in new § 200.105(b)(4)(i), as these methods consider how a State may use its innovative and statewide assessment systems during the demonstration authority in order to establish comparability between the two systems but do not specify a particular research or evaluation approach. We believe that States should administer the innovative and statewide assessments in participating schools and LEAs in a way that works best for the design of their innovative assessment system, and select an approach and research methodology for demonstrating comparability that is appropriate to that design. We believe that the regulations provide sufficient flexibility for States to do so—including by allowing for a State-determined method beyond the options described in new § 200.105(b)(4)(i)(A)-(D). We will consider providing additional examples in any technical assistance the Department may provide to States and in guidance for peer reviewers.
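For illustration only, and not as part of the regulations or as Department guidance, the following minimal sketch shows how a State's technical team might apply propensity score matching, one of the research approaches commenters identified, to compare annual summative determinations (proficient or not proficient) between students who took the innovative assessment and demographically similar students who took the statewide assessment. The data, covariates, and field names are entirely hypothetical, and an actual comparability study would rely on richer models, subgroup analyses, and expert and peer review.

    import math
    import random

    random.seed(0)

    def make_student(took_innovative):
        # One synthetic record; covariates are loosely correlated with which
        # assessment the student took, so matching has something to correct for.
        return {
            "innovative": took_innovative,
            "econ_disadv": random.random() < (0.55 if took_innovative else 0.45),
            "prior_score": random.gauss(0.2 if took_innovative else 0.0, 1.0),
        }

    students = [make_student(True) for _ in range(400)] + \
               [make_student(False) for _ in range(800)]
    for s in students:
        # Hypothetical outcome model: proficiency depends on prior achievement
        # and poverty status, not on which assessment was taken.
        p = 1 / (1 + math.exp(-(0.8 * s["prior_score"] - 0.5 * s["econ_disadv"])))
        s["proficient"] = random.random() < p

    def fit_propensity(data, steps=800, lr=0.1):
        # Tiny logistic regression for P(took innovative | covariates), fit by
        # gradient descent; a stand-in for a proper statistical package.
        w = [0.0, 0.0, 0.0]  # intercept, econ_disadv, prior_score
        for _ in range(steps):
            grad = [0.0, 0.0, 0.0]
            for s in data:
                x = (1.0, float(s["econ_disadv"]), s["prior_score"])
                pred = 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
                err = pred - (1.0 if s["innovative"] else 0.0)
                for i in range(3):
                    grad[i] += err * x[i]
            for i in range(3):
                w[i] -= lr * grad[i] / len(data)
        return w

    weights = fit_propensity(students)
    for s in students:
        x = (1.0, float(s["econ_disadv"]), s["prior_score"])
        s["pscore"] = 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(weights, x))))

    # 1:1 nearest-neighbor matching (with replacement) on the propensity score.
    innovative_takers = [s for s in students if s["innovative"]]
    statewide_takers = [s for s in students if not s["innovative"]]
    pairs = [(t, min(statewide_takers, key=lambda c: abs(c["pscore"] - t["pscore"])))
             for t in innovative_takers]

    def proficiency_rate(group):
        return sum(1 for s in group if s["proficient"]) / len(group)

    print("Proficient, innovative assessment takers:        %.3f"
          % proficiency_rate([t for t, _ in pairs]))
    print("Proficient, matched statewide assessment takers: %.3f"
          % proficiency_rate([c for _, c in pairs]))
    # A large gap after matching would prompt further study of whether the two
    # systems yield comparable summative determinations for similar students.

A State pursuing a State-designed methodology under new § 200.105(b)(4)(i)(E) would, in practice, document the covariates, matching procedure, and tolerances it uses and submit that design, with supporting evidence, for peer review.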
In response to the additional proposed methodologies, which included a suggestion to allow States to administer items from the innovative assessment to students taking the statewide assessment, we are clarifying in new § 200.105(b)(4)(i)(C) and (D) that States may include items “or performance tasks” from the innovative assessment on the statewide assessment, and vice versa, if their inclusion constitutes a significant portion of the assessment and is appropriate for the State's proposed research design for demonstrating comparability.
Changes: We have added to new § 200.105(b)(4)(i)(C) to clarify that States may include, as a significant portion of the innovative assessment system in each required grade and subject in which both an innovative and statewide assessment is administered, items or performance tasks from the statewide assessment system that, at a minimum, have been previously pilot tested or field tested for use in the statewide assessment system.
We have also added § 200.105(b)(4)(i)(D) to clarify that States may include, as a significant portion of the statewide assessment system in each required grade and subject in which both an innovative and statewide assessment is administered, items or performance tasks from the innovative assessment system that, at a minimum, have been previously pilot tested or field tested for use in the innovative assessment system.
Comments: Some commenters noted that as an innovative assessment system is taken to scale statewide, comparability with the statewide assessment system becomes less important than the comparability of results among LEAs and schools using the innovative assessment system. These commenters urged the Department to modify the regulations to not require an annual comparability evaluation between the statewide and innovative assessment systems; they argued that if the evidence for comparability across the two systems of assessment is strong, comparability of the innovative assessment with the statewide assessment need not be re-evaluated every year.
Discussion: The Department agrees that as the innovative assessment system scales into wider use among LEAs and schools, comparability among the LEAs and schools administering the innovative assessment system will become more important than in the beginning of the demonstration authority period. Further, we note that the comparability, validity, reliability, and technical quality of innovative assessments across participating LEAs and schools will be one critical component of the peer review required to transition to statewide use of the innovative assessment for purposes of part A of title I, as described further in new § 200.107 (proposed § 200.79). Given these comments, the Department is also concerned that the requirement for comparable results within the innovative assessment system was unclear in the regulations, as proposed. As the innovative assessment system will be used during the demonstration authority period for purposes of school accountability and reporting, it is imperative for States to have plans and procedures in place to ensure the quality, validity, reliability, and consistency of assessment blueprints, items or tasks, test administration, scoring, and other components across participating LEAs and schools. To clarify that comparability between LEAs and schools participating in the innovative assessment is required and reinforce that States should take this into account as they develop and implement their innovative assessment system, we are adding new § 200.105(b)(4)(ii) to specify that States must annually determine the comparability of the innovative assessment system, including annual summative determinations that are valid, reliable, and comparable for all students and each subgroup of students, among participating schools and LEAs. This will also be part of a State's plan for evaluation and continuous improvement as described in new § 200.106(e) (proposed § 200.78(e)).
We disagree that an annual demonstration of comparability between the innovative and statewide assessment systems is unnecessary or overly burdensome as States focus on scaling their innovative systems. As provided in section 1601(a) of the ESEA, “[t]he Secretary may issue . . . such regulations as are necessary to reasonably ensure that there is compliance” with the statute. The Department also has rulemaking authority under section 410 of the GEPA, 20 U.S.C. 1221e-3, and section 414 of the DEOA, 20 U.S.C. 3474. Section 1204(e)(2)(A)(iv) of the ESEA requires that the innovative assessment system generate valid, reliable, and comparable results relative to the statewide assessment during the demonstration authority period. We believe that as an innovative assessment system goes to scale, the statewide assessment will remain a valuable reference point for monitoring effective implementation across the increasing number of LEAs and schools that adopt the innovative assessment. Further, annual information on comparability will enable the Department to better support and work with States to make needed adjustments over time to maintain a high level of comparability between the two assessment systems, which is not only required by the statute, but also critical to maintaining fair and valid school accountability determinations and transparent data reporting while both assessment systems are in operation during the demonstration authority period. Finally, these final regulations are consistent with, and specifically intended to ensure compliance with, section 1204 of the ESEA.
For example, the evidence a State will provide to demonstrate that its statewide and innovative assessment systems are comparable may need to change little from one year to the next, particularly in any year of the demonstration authority period in which the innovative assessment has not expanded to a large number of new schools or implementation has been relatively stable—in such cases, providing this information will require minimal work for SEAs and will assure the Department that the SEA continues to comply with the requirements for demonstration authority. However, there are many cases in which implementation from one year to the next will not be as stable, leading to variation in the results between the two assessments over time. For instance, comparability could be strengthened in later years if the State modifies its performance tasks to better align with the State's academic content standards or improves the inter-rater reliability and training of evaluators. Conversely, comparability could decline in later years of the demonstration authority period if the initial participating LEAs had greater prior experience with the innovative assessment system and newly added LEAs struggle to implement the innovative assessment system with the same fidelity as early adopters. Similarly, if initially participating schools are not demographically representative of the State as a whole, the comparability of the innovative assessment system results to the statewide assessment could change as greater numbers of students take the innovative assessment, including children with disabilities and English learners. Without annual information on comparability between the statewide and innovative assessment systems, the Department would not be able to provide the necessary technical assistance to States that see these fluctuations over time and would not have essential information to ensure compliance with the statutory requirements in section 1204 for the demonstration authority.
Changes: We have added § 200.105(b)(4)(ii) to require that States' innovative assessment systems generate results, including annual summative determinations, that are valid, reliable, and comparable for all students and for each subgroup of students among participating schools and LEAs, which an SEA must annually determine as part of its evaluation plan described in § 200.106(e).
Accessibility
Comments: A few commenters supported proposed § 200.77(b)(5), which would require SEAs to ensure that the innovative assessment systems provide for the participation of, and are accessible to, all students, including students with disabilities and English learners. One commenter also expressed support for the provision that the innovative assessment system may incorporate, as appropriate, the principles of universal design for learning (UDL), noting that UDL includes principles for flexible approaches and accommodations in assessment. However, another commenter recommended that the words “as appropriate” be removed, in order to require the use of the principles of UDL in the development of innovative assessments, which the commenter believed would be more consistent with the requirements of section 1204(e) of the ESEA.
Discussion: We appreciate and share commenters' commitment to ensuring that innovative assessments are accessible to all students. We agree that the language should encourage States to incorporate the principles of UDL. We also believe this language should be consistent with how principles of UDL are included in § 200.2(b)(2)(ii) with respect to the requirements for statewide assessments under part A of title I. This will help to reiterate for States that they should develop innovative assessment systems that will be able to meet the title I, part A requirements when the States seek to transition to statewide use of the innovative assessment and undergo peer review under title I, part A, as described in § 200.107 (proposed § 200.79).
We are therefore adding to new § 200.105(b)(5) (proposed § 200.77(b)(5)) to state that the principles of UDL should be incorporated “to the extent practicable” instead of “as appropriate” consistent with section 1111(b)(2)(B)(xiii) of the ESEA.
Changes: We have added to new § 200.105(b)(5) to make clearer the three concepts contained in that section: Participation of all students; accessibility, including by incorporating principles of UDL; and accommodations. We have also specified in § 200.105(b)(5)(ii) that the principles of UDL should be incorporated “to the extent practicable.”
Comments: Multiple commenters advocated amending proposed § 200.77(b)(5) to require specific accessibility standards for digital content, such as Web Content Accessibility Guidelines (WCAG) 2.0, as part of an innovative assessment system.
Discussion: Section 1204(e)(2)(A)(vi) of the ESEA requires all innovative assessment systems to be accessible to all students, such as by incorporating the principles of UDL. The requirement that assessment systems be accessible to individuals with disabilities is also based on the Federal civil rights requirements of section 504 of the Rehabilitation Act, 29 U.S.C. 794, title II of the Americans with Disabilities Act, 42 U.S.C. 12131 et seq., and their implementing regulations, all of which are enforced by the Department's Office for Civil Rights (OCR). In OCR's enforcement experience, where an SEA collects information through electronic and information technology, such as student assessment, it is difficult to ensure compliance with accessibility requirements without adherence to modern standards, such as the WCAG 2.0 Level AA standard. However, we do not think further requirements regarding digital content are appropriate here since the assessment models that States pilot could be quite different depending on a State's specific priorities and goals—some innovative assessments may be heavily dependent on digital content, while others may use very little digital content. Regardless, the baseline requirement under both the ESEA and Federal civil rights laws remains that the innovative assessment system must be accessible for all students, including all children with disabilities. In addition, we note that any innovative assessment system developed under the demonstration authority must, prior to transition to statewide use, undergo a second peer review as described in new § 200.107 (proposed § 200.79) to determine if the system meets the requirements for State assessments and accountability under part A of title I, which include a regulatory requirement related to accessibility and nationally recognized accessibility standards under § 200.2. Thus, it is clear that SEAs' innovative assessment systems will, when implemented at scale, also be subject to these same requirements to incorporate the principles of UDL to the extent practicable.
Changes: None.
Participation Rates
Comments: One commenter opposed the requirement in proposed § 200.77(b)(6) that, for purposes of the State accountability system, the innovative assessment system must annually measure the achievement of at least 95 percent of all students, and 95 percent of students in each subgroup. The commenter believes that this provision would impose an additional requirement taken from section 1111(c)(4)(E)(iii) of the ESEA on participating schools and additional consequences on such schools for not assessing 95 percent of students, contrary to congressional intent. The commenter recommended requiring innovative assessment participation in schools participating in the demonstration authority at a rate that is no less than the participation rate of students in the statewide assessment system. In particular, the commenter does not believe that demonstration authority should be placed at risk because of assessment participation requirements.
Discussion: We believe the commenter's concerns may be addressed by further clarifying the intent of new § 200.105(b)(6) (proposed § 200.77(b)(6)) and related requirements. The commenter is correct that section 1111(c)(4)(E)(iii) of the ESEA requires States to factor 95 percent participation in State assessments into their accountability systems. However, section 1111(c)(4)(E)(i)-(ii) also includes specific requirements for the measurement of academic achievement based on State assessments, including (1) a requirement that States annually measure, for school accountability, the progress of at least 95 percent of all students and 95 percent of students in each subgroup on the State's reading/language arts and mathematics assessments, and (2) a requirement that, for purposes of measuring, calculating, and reporting on the Academic Achievement indicator, the denominator must always include either the number of students with valid assessment scores or 95 percent of students enrolled in the school, whichever is greater. New § 200.105(b)(6) (proposed § 200.77(b)(6)) and related requirements for 95 percent assessment participation in the final regulations for innovative assessment demonstration authority were intended to clarify how these statutory requirements for measurement of academic achievement related to school accountability apply to participating schools in the demonstration authority.
Section 1204(e)(2)(A)(ix) of the ESEA requires that the innovative assessment system annually measure the progress of “not less than the same percentage” of all students and students in each subgroup in participating schools as were assessed by schools administering the statewide assessments and “as measured under section 1111(c)(4)(E)” (emphasis added). As explained previously, the percentage of all students and students in each subgroup whose performance on assessments must be measured for accountability under section 1111(c)(4)(E)(i) of the ESEA is 95 percent of students and 95 percent of students in each subgroup; the requirements in section 1111(c)(4)(E)(ii) of the ESEA reinforce this further by requiring that at least 95 percent of all students and students in each subgroup be included in calculating the Academic Achievement indicator. As a result, “not less than the same percentage” will always be 95 percent, because the Academic Achievement indicator—“as measured under ESEA section 1111(c)(4)(E)”—will always measure the performance of 95 percent of all students and 95 percent of students in each subgroup enrolled in a school.
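To make the arithmetic concrete, the following minimal sketch applies the denominator rule described above; the function name and the enrollment and score counts are hypothetical and do not reflect any State's actual accountability calculations.

```python
# Illustrative sketch of the Academic Achievement indicator denominator rule:
# the denominator is the greater of (a) the number of students with valid
# assessment scores or (b) 95 percent of enrolled students.
def achievement_denominator(valid_scores: int, enrolled: int) -> float:
    return max(valid_scores, 0.95 * enrolled)

# A school enrolling 200 students but assessing only 180 (90 percent) is held
# to a denominator of 190, so untested students lower its measured proficiency.
enrolled, valid_scores, proficient = 200, 180, 120
denominator = achievement_denominator(valid_scores, enrolled)  # max(180, 190) = 190.0
print(f"Proficiency for accountability: {proficient / denominator:.1%}")  # 63.2%
```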
New § 200.105(b)(6) does not prescribe how each State will factor participation rates into its accountability system for all public schools, as required under section 1111(c)(4)(E)(iii) of the ESEA. This requirement would still apply to all schools in the State, including schools participating in the innovative assessment demonstration authority, because of requirements in section 1204(e)(2)(A)(xi) and (C)(iii) of the ESEA to maintain consistent, valid, and reliable accountability for all schools, but the actions for holding schools accountable for improving school participation rates are determined by the State as described in the statutory requirements for statewide accountability systems. While the commenter is correct that the Secretary may withdraw demonstration authority for a number of reasons, including when a State cannot provide evidence that it is meeting the requirements under new § 200.105, this does not mean low assessment participation in a school or LEA will automatically result in withdrawal of demonstration authority. In order for a State to meet the requirement under new § 200.105(b)(6), the State would need to hold participating schools accountable for 95 percent participation in assessments in the same way as it does for all public schools, including the calculation of the Academic Achievement indicator and the way the State determines it will factor the 95 percent participation requirement into its overall accountability system consistent with section 1111(c)(4)(E) of the ESEA. We believe the requirements in new § 200.105(b)(6) help clarify the statutory language and ensure fairness and consistency in accountability determinations between participating and non-participating schools, without creating any new requirements for participating schools.
Changes: None.
Annual Summative Determinations for Students
Comments: Several commenters supported requirements in proposed § 200.77(b)(7) regarding annual summative determinations for student performance on the innovative assessment. These commenters noted the importance of providing students and families an indicator of grade-level mastery of the State's academic content standards and making sure that all students are held to the same academic standards. One commenter also noted this requirement will help ensure comparability in student results between the statewide annual assessment and the innovative assessment. A few commenters requested further clarification in proposed §§ 200.76(b)(2) and 200.77(b)(1) that innovative assessments may assess a student on content that is above or below the content standards for the grade in which the student is enrolled, citing section 1111(b)(2)(J) of the ESEA, which allows computer-adaptive assessments to include items above or below grade level. These commenters believe that innovative assessments should be able to use a different approach for measuring student academic proficiency, while maintaining an annual grade-level determination of proficiency. Another commenter was concerned that the proposed requirements to produce an annual grade-level determination would mean innovative assessments would not also produce a valid result for a student's performance above or below that standard.
Discussion: Given that the assessment requirements in title I, part A of the ESEA focus on the alignment of the assessment system to the challenging State academic standards and these academic standards also apply to innovative assessments as described in section 1204(e)(2)(A)(ii)-(iii) of the ESEA, we believe it is both consistent with the statute and critically important to continue this focus within the demonstration authority. While we support the need for better and more valid assessments of student knowledge, we do not think that these assessments should set a different or lower expectation for student achievement. In addition, it is vital that the innovative assessment system provide valid, reliable, comparable, and fair determinations of student achievement against the challenging State academic standards for the student's grade, because the innovative assessments (1) will be used in place of the statewide assessments that are administered to meet the requirements in section 1111(b)(2)(B) of the ESEA; (2) will be required to meet these same requirements as described in section 1204(e)(2)(A)(i) of the ESEA; and (3) will be used in the State's accountability system for participating LEAs and schools.
There is nothing in these regulations that would preclude a State from including additional content to measure a student's mastery of content other than the content for the grade in which the student is enrolled, and we are revising the final regulations to make this clear. A State is able to include such content, whether through a computer-adaptive design or some other innovative design, provided the innovative assessment system meets the statutory and regulatory requirements, including by producing an annual summative determination that describes the student's mastery of the State's grade-level academic content standards based on the State's aligned academic achievement standards.
Changes: We have added new § 200.105(b)(2)(ii) (proposed § 200.77(b)(2)) to clarify that innovative assessments may include items above or below the State's academic content standards for the grade level in which a student is enrolled, so long as, for purposes of reporting and school accountability consistent with new § 200.105(b)(3) and (7)-(9), the State measures a student's academic proficiency based on the challenging State academic standards for the grade in which a student is enrolled.
Comments: One commenter recommended that the regulations clarify more specifically that the annual summative determination under proposed § 200.77(b)(7) be based on the State's academic achievement standards that are aligned to grade-level academic content standards. One commenter specifically recommended that proposed § 200.77(b)(7) be modified to state that the achievement standards must be “aligned” to the State's grade-level academic content standards, believing such an addition was especially critical if a State adopts an innovative AA-AAAS.
Discussion: The Department agrees that any innovative assessment (including an innovative AA-AAAS) must produce an annual summative determination for each student that describes the student's mastery of grade-level academic content standards, using either the State's academic achievement standards or, for students with the most significant cognitive disabilities, the State's alternate academic achievement standards. Section 1111(b)(1) of the ESEA requires that challenging State academic standards include academic content standards and aligned academic achievement standards, and these requirements apply whether or not a State applies for or receives innovative assessment demonstration authority. To clarify this in the final regulations, we are adding to new § 200.105(b)(7) to specify that (1) the annual summative determination of achievement for a student on the innovative assessment describes the student's achievement of the challenging State academic standards (i.e., both the State's academic content and achievement standards) for the grade in which the student is enrolled; and (2) in the case of a student with the most significant cognitive disabilities assessed with an innovative AA-AAAS aligned with the challenging State academic content standards for the grade in which the student is enrolled, the innovative AA-AAAS must provide an annual summative determination of the student's mastery of the alternate academic achievement standards for each such student.
Changes: We have added to new § 200.105(b)(7) (proposed § 200.77(b)(7)) to require that the innovative assessment produce an annual summative determination of achievement for each student that describes the student's mastery of the challenging State academic standards (i.e., both the State's academic content and achievement standards) for the grade in which the student is enrolled, or, in the case of a student with the most significant cognitive disabilities assessed with an alternate assessment aligned with alternate academic achievement standards under section 1111(b)(1)(E) of the ESEA, the student's mastery of those standards.
Reporting to Parents
Comments: Multiple commenters expressed strong support for the requirements in proposed § 200.77(d)(4). This section would require an SEA to provide an assurance that it will ensure each LEA provides information to parents in a timely, uniform, and understandable format. In particular, commenters asserted the importance of providing assessment information for non-English speaking parents in their native language. While appreciating the requirement to provide oral translations to parents with limited English proficiency when written translations are not practicable, one commenter suggested the regulations require LEAs to secure written translations for the most populous language spoken, other than English, by participating students. Another commenter, however, recommended removing altogether requirements related to written and oral translations and to alternate formats in proposed § 200.77(d)(4)(ii)-(iii), expressing concern about the financial burden placed on large urban districts with students and families who speak many different languages.
Discussion: We appreciate the strong support for proposed § 200.77(d)(4) and agree these regulations are critical to ensure that a parent receives needed information about a child's academic progress on State assessments. Section 1111(b)(2)(B)(x) of the ESEA requires a State to provide information to parents in an understandable and uniform format, and to the extent practicable, in a language that parents can understand. These requirements also apply to innovative assessment systems developed under the demonstration authority, consistent with section 1204(e)(2)(A)(i) of the ESEA and new § 200.105(b)(1) (proposed § 200.77(b)(1)). In addition, the statute includes these same requirements for accessibility of notices to parents under section 1112(e) of the ESEA, which requires LEAs to provide certain information to parents each year, including information pertaining to testing transparency. We believe the clarifications provided by new § 200.105(d)(4) (proposed § 200.77(d)(4)) will help parents take an active role in supporting their children's education, improve transparency and understanding of the innovative assessment system, and provide consistency among the statutory requirements, regulations, and applicable civil rights laws, as explained below.
We disagree with the commenter who recommended removing the requirements related to written and oral translations and to alternate formats. Parents with disabilities or parents who are limited English proficient have the right to request notification in accessible formats. Whenever practicable, written translations of printed information must be provided to parents with limited English proficiency in a language they understand, and the term “language” includes all languages, including Native American languages. However, if written translations are not practicable for a State or LEA to provide, it is permissible to provide information to limited English proficient parents orally in a language that they understand instead of a written translation. This requirement is consistent with Title VI of the Civil Rights Act of 1964 (Title VI), as amended, and its implementing regulations. Under Title VI, recipients of Federal financial assistance have a responsibility to ensure meaningful access to their programs and activities by persons with limited English proficiency. It is also consistent with Department policy under Title VI and Executive Order 13166 (Improving Access to Services for Persons with Limited English Proficiency).
We decline to further define the term “to the extent practicable” under these regulations, but remind States and LEAs of their Title VI obligation to take reasonable steps to communicate the information required by ESEA to parents with limited English proficiency in a meaningful way.[4] We also remind States and LEAs of their concurrent obligations under Section 504 and title II of the ADA, which require covered entities to provide persons with disabilities with effective communication and reasonable accommodations necessary to avoid discrimination unless it would result in a fundamental alteration in the nature of a program or activity or in undue financial and administrative burdens. Nothing in the ESSA or these regulations modifies those independent and separate obligations. Compliance with the ESEA, as amended by the ESSA, does not ensure compliance with Title VI, Section 504 or title II.
Changes: None.
Comments: Some commenters suggested that if an LEA begins to administer a general innovative assessment in some or all schools under the demonstration authority, the LEA should be required to notify parents of students with significant cognitive disabilities that their child will be assessed using an assessment other than the innovative assessment system and provide detail on that assessment.
Discussion: Section 1112(e) of the ESEA requires each LEA to provide annually to parents information on assessments required in their LEA, which would include, in the case of an LEA administering an innovative general assessment and the statewide AA-AAAS, details on the purpose of both assessments, the grades and subjects in which they are administered, and other information. In addition, section 1111(b)(2)(D)(i)(II) and related regulations require that parents of students assessed using an AA-AAAS receive information about that assessment. Accordingly, we believe that new § 200.105(d)(4) (proposed § 200.77(d)(4)) ensures that parents in participating schools will receive transparent information about all required assessments administered to students in the school; however, we are adding to new § 200.105(d)(4) in the final regulations to specify that this information must be sent to “all” parents of students in participating schools and include the grades and subjects in which the innovative assessment will be administered, to further clarify that an LEA must (1) include all parents in these notices, even if their student is not being assessed using an innovative assessment in the upcoming school year, and (2) provide information on any required statewide assessments that are still being given in other grades and subjects, including an AA-AAAS for students with the most significant cognitive disabilities.
Changes: We have added to new § 200.105(d)(4) to clarify that notices must be sent to parents of all students, including in a manner accessible to parents and families with limited English proficiency and those with disabilities, in participating schools and include specific information on the innovative assessment in each required grade and subject in which it is being administered.
200.106 Demonstration Authority Selection Criteria
General
Comments: One commenter supported the general depth of the selection criteria in the proposed regulations and believes the criteria, particularly for a timeline and budget, hold States accountable for their financial capacity and technical expertise to develop an innovative assessment system. The commenter further encouraged the Department to provide sufficient notice of application requirements and selection criteria so that States can undergo extensive planning. Another commenter expressed general support for holding States to a high bar prior to awarding demonstration authority (including a rigorous evaluation and peer review of applications) and expressed strong support for the selection criteria, especially prior experience, capacity, and stakeholder support.
Discussion: We share the commenters' views that States should be held to rigorous expectations in the development of a valid, reliable, and comparable innovative assessment system and that the requirements and selection criteria—which will be outlined in any future NIA—will both support States in planning and developing strong, thorough proposals and assist the Department and peer reviewers in reviewing and approving applications that are likely to be successful.
Changes: None.
Comments: Due to the small-scale nature of the pilot, the limited number of test items available, and the cost of developing innovative items, one commenter stated that testing irregularities and breaches of test security pose a greater risk to innovative assessment pilots, and requested additional emphasis on test security measures. The commenter suggested an additional selection criterion outlining an SEA's or consortium's plans for test security, including a description of the security measures used to protect test content and ensure test validity and reliability.
Discussion: We appreciate the commenter's concern about the heightened risk of testing irregularities and security breaches. However, we do not believe it is necessary to add an additional selection criterion for SEAs or consortia of SEAs with respect to test security measures. We believe that SEAs are aware of the test security risks, and will develop their implementation plans accordingly. In addition, SEAs are required to submit evidence of test security and monitoring practices, as described in the Department's current State assessment peer review guidance, to meet the requirements for State assessments in section 1111(b)(2)(B) of the ESEA. Because SEAs are aware that their innovative assessment systems will be subject to these requirements when transitioning to statewide use as described in new § 200.107 (proposed § 200.79), we believe there is sufficient incentive in the regulations, as proposed, to develop an innovative assessment system that considers and accounts for test security and necessary protocols. We strongly encourage SEAs and consortia to consider these peer review criteria when developing their innovative assessments under the demonstration authority.
Changes: None.
Prior Experience
Comments: Several commenters expressed strong support for proposed § 200.78(b)(1)(ii)(A), which creates a selection criterion for prior experience, and specifically any experience the SEA or its LEA has in developing or using effective supports and appropriate accommodations for administering innovative assessments to all students, including English learners and children with disabilities.
Discussion: We appreciate the support of these commenters, and agree that an important criterion for evaluating the strength of an application from an SEA or consortium of SEAs, and its ability to effectively implement and scale up a high-quality innovative assessment system, will be ensuring that appropriate accommodations are provided on the assessments so that all students may participate.
Changes: None.
Comments: One commenter recommended we revise proposed § 200.78(b)(1)(ii)(C) to require independent reviewers to provide an unbiased judgment of the validity, reliability, and comparability of scoring rubrics.
Discussion: We disagree that it is necessary to revise this selection criterion to provide for evaluation by an independent reviewer under new § 200.106(b)(1)(ii)(C) (proposed § 200.78(b)(1)(ii)(C)). Because all of the information pertaining to each selection criterion is submitted as part of the SEA or consortium's application for the demonstration authority (see § 200.105(c)) and because the application is subject to external peer review as part of the approval process (see § 200.104(c)), the recommended addition of an independent review requirement in new § 200.106(b)(1)(ii) is redundant. Any prior experience with developing or using scoring rubrics would be evaluated by independent, unbiased teams of external peer reviewers who will examine the evidence submitted by States that documents validity, reliability, and comparability of student determinations using standardized and calibrated scoring rubrics.
Changes: None.
Supports for Educators
Comments: Multiple commenters supported the proposed selection criterion in proposed § 200.78(d), which provides for an SEA to describe available supports for educators to help them understand and become familiar with the innovative assessment system. Some of these commenters further requested that the selection criterion be revised to provide for SEAs to include in their applications a detailed professional development plan to support the implementation of the innovative assessment system. According to the commenters, this plan should address how the State will, among other things: Scale its system of professional development to more LEAs over time; provide sufficient time for teachers and school leaders to participate in professional development; partner with educator preparation programs to ensure pre-service and in-service training is sufficiently preparing educators to implement and use data from the innovative assessment system to inform instruction; and use Federal funding under title II, and other public sources of funds, to provide supports for educators described in its plan. These commenters also suggested the Department issue additional non-regulatory guidance that could be beneficial to support effective professional development for educators as part of the demonstration authority. Similarly, other commenters requested that the Department add a requirement that SEAs include a description of the State's efforts to increase teacher and principal assessment literacy and provide incentives to teachers participating in professional development on the innovative assessment system.
Discussion: We appreciate the feedback on ways to clarify and strengthen the supports an SEA or consortium must provide to educators who will be implementing the innovative assessment demonstration authority and agree that this will be a critical component in effectively scaling a State's innovative assessment system. As proposed, the selection criterion would allow States to provide this type of information. However, we are adding to new § 200.106(d) (proposed § 200.78(d)) to clarify that each SEA or consortium's application must include a plan for delivering supports to educators that can be consistently provided at scale, recognizing the commenter's suggestion that successful implementation will require a comprehensive plan for professional development and that States consider whether their plan can feasibly be delivered in all LEAs during the demonstration authority period, even if only a few LEAs are initially participating. We also are adding to new § 200.106(d)(1) to provide for applications to be evaluated on the extent to which an SEA or consortium's training for LEA and school staff will develop teacher capacity to provide instruction that is informed by the innovative assessment system and to use the results the system produces. Further, we are adding to new § 200.106(d)(4) to provide for SEAs to describe their strategies to support teachers and staff in carrying out their responsibilities under the State's chosen innovative assessment model, which may include developing, designing, implementing, and “validly and reliably” scoring the assessment results. We also note that the information in each application under the selection criteria for timeline and budget and evaluation and continuous improvement described in new § 200.106(c) and (e) (proposed § 200.78(c) and (e)), respectively, will include how the SEA or consortium plans to fund and support any evaluation of its professional development plans and activities, so it is unnecessary to add these elements to the selection criterion in § 200.106(d). Finally, we appreciate commenters' suggestions for additional non-regulatory guidance in this area and will take them into consideration as the Department moves forward with implementation of the innovative assessment demonstration authority.
Changes: We have added to the selection criterion in new § 200.106(d) to:
- Provide for each SEA or consortium's application to include a plan for delivering supports to educators that can be consistently provided at scale;
- Clarify that the SEA's or consortium's application will be evaluated on the extent to which training for LEA and school staff will develop teacher capacity to provide instruction that is informed by the innovative assessment system and to use the system's results; and
- Clarify that SEAs or consortia should describe strategies that will engage teachers and staff in carrying out their responsibilities under the State's chosen innovative assessment model, which may include “designing,” “implementing,” and “validly and reliably” scoring the assessment results—not just developing and scoring them in general.
Comments: One commenter objected to the reference in proposed § 200.78(d)(4) regarding teachers developing and scoring innovative assessments administered in their school. The commenter was concerned about potential conflicts of interest and the validity and reliability of the resulting scores if educators providing instruction are also developing and scoring the assessments for the students they teach. The commenter suggested revising §§ 200.105 and 200.106 to restrict teacher involvement in item development and scoring.
Discussion: We believe that teachers play a critical role in the development of assessments and should be involved throughout test development. This is true in all test development, but may be especially relevant with respect to innovative assessment systems, given changes in test design and delivery with an innovative assessment that may necessitate changes in instruction and additional or new responsibilities for educators. In addition, restricting teacher involvement in the development of the innovative assessment system or scoring such innovative assessments would place an additional restriction on the development of these assessments beyond what is required of State assessment systems in section 1111(b)(2) of the ESEA—the requirements these innovative assessment systems will need to meet in order to be used for statewide use at the end of the demonstration authority period.
We agree, however, with the commenter that States should establish reasonable safeguards within their assessment systems, including any innovative assessment system. For example, teachers, in general, should not be permitted to score the assessments taken by students for whom the teacher is considered the teacher of record or the assessments taken by students in a school in which the teacher is employed, as this could affect the reliability of the scores and create incentives for improper behavior given that the results will be used in the State's accountability system. We believe that States should have flexibility to design and develop a truly innovative assessment system and do not want to restrict innovation by placing extensive restrictions on the development and scoring of these new assessments. We do want to ensure that States are considering proper safeguards (e.g., quality control procedures, inter-rater reliability checks, audit plans) to avoid any conflict of interest, or the appearance of such a conflict. We note that the innovative assessment system will undergo a peer review process both before a State receives demonstration authority and following the statewide transition of the innovative assessment system, and we are clarifying final § 200.106(d)(4) (proposed § 200.78(d)(4)) to require States to describe in their applications any “safeguards” they are using when teachers are involved in developing or scoring assessments and how those safeguards are sufficient to ensure objective and unbiased scoring of innovative assessments. Further, the Department's external peer review of State assessment systems under title I, part A of the ESEA, which is based on the Standards for Educational and Psychological Testing (developed jointly by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education), includes specific criteria related to the State's plans for scoring assessments and for demonstrating the reliability of assessment scores. To meet these criteria, States need to ensure adequate training, calibration, and monitoring for all scoring conducted within their assessment system. We believe these criteria will serve to mitigate the commenter's concern.
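As an illustration of the kind of routine safeguard discussed above, the following minimal sketch shows a simple inter-rater reliability check on double-scored performance tasks. The scores, the use of Cohen's kappa, and the 0.7 threshold are hypothetical assumptions, not requirements of these regulations.

```python
# Illustrative sketch only: a calibration check a State might run on
# double-scored performance tasks, flagging rater pairs whose agreement
# falls below a chosen (hypothetical) threshold.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 4, 1, 3, 2, 4, 4, 2, 3]   # scores assigned by the first rater
rater_b = [3, 2, 3, 1, 3, 2, 4, 3, 2, 3]   # scores assigned by a second, independent rater

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa for this rater pair: {kappa:.2f}")
if kappa < 0.7:
    print("Agreement below threshold: schedule recalibration training for this rater pair.")
```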
Changes: We have added language to new § 200.106(d)(4) (proposed § 200.78(d)(4)) to include both strategies and safeguards related to the development and scoring of innovative assessments by teachers and other school staff and to require States to describe in their applications how the strategies and safeguards are sufficient to ensure objective and unbiased scoring of innovative assessments.
Comments: One commenter requested the inclusion of specialized instructional support personnel among the list of school staff in proposed § 200.78(d) for which the SEA must demonstrate a plan for training and support, noting the important role that specialized instructional support personnel, such as audiologists and speech-language pathologists, play in providing curriculum and instructional supports for students.
Discussion: The selection criterion in new § 200.106(d) (proposed § 200.78(d)) is intended to ensure that States applying for demonstration authority have carefully considered how they will support LEA and school staff in participating schools during implementation of the innovative assessment system. While the proposed regulations specifically mention that these staff must include “teachers, principals, and other school leaders,” an SEA could certainly respond to this selection criterion by including other LEA and school staff, including specialized instructional support personnel, paraprofessionals, and district administrators, in their plans to support LEA and school personnel in effective implementation—which could likely improve the strength of the SEA's application in this area as it is evaluated by peers. However, we decline to modify the selection criterion to specifically list examples of other LEA and school staff, as enumerating “teachers, principals, and other school leaders” is more consistent with the statutory requirements for demonstration authority, which only reference teachers, principals, and other school leaders.[5]
Changes: None.
Supports for Parents
Comments: Several commenters supported the selection criterion in proposed § 200.78(d) providing for States to detail their strategies to support students in the transition to a new innovative assessment system, believing that these strategies will be critical to ensure a successful transition to a new assessment system. One commenter recommended that the final regulations also require States to describe strategies to acquaint parents with the innovative assessment system, including additional expectations for SEAs and consortia to describe plans to better communicate and explain assessment results to parents and families of students in participating LEAs and schools so that they, too, can play a critical role in using those results to improve academic outcomes for their children.
Discussion: We agree with commenters and appreciate the support for including a selection criterion related to supports for students that will familiarize them with the innovative assessment system. We further agree that States, in order to effectively implement and scale their innovative assessment systems, will need strategies to familiarize parents and families with the new assessments. We are revising the regulations in new § 200.106 to this effect in order to reinforce requirements elsewhere in the regulations for collaborating with parents in the development of the innovative assessment system, soliciting their feedback and input regularly on implementation, and providing annual information to parents about the innovative assessments and the results for their children.
Changes: We have added to the introductory paragraph of new § 200.106(d) (proposed § 200.78) to include references to supports for parents, in addition to educators and students, and § 200.106(d)(2) to provide for States to describe their strategies to familiarize parents, as well as students, with the innovative assessment system.
200.107 Transition to Statewide Use
General
Comments: One commenter stated that the requirement for a full, statewide transition at the end of the pilot makes assumptions about the finality and success of the pilot.
Discussion: The Department appreciates the concern about the requirement for transition to statewide use. However, the Department disagrees that such a requirement presumes that statewide implementation of the innovative assessment system will be successful. The requirements of new § 200.105 (proposed § 200.77) must be met in order for a State to implement the innovative assessment statewide. The Department is establishing these requirements in part to ensure a higher likelihood of successful implementation, but the Department does not believe that success is a foregone conclusion.
The regulations in new § 200.107(a) and (b) (proposed § 200.79(a) and (b)) represent another significant set of criteria that the innovative assessment must meet in order to achieve acceptance as a statewide assessment. Additionally, new § 200.108 (proposed § 200.80) provides that the Department may withdraw the innovative assessment authority from a State when it cannot produce a high-quality plan for transition or evidence that the innovative assessment systems meets specific conditions. Given these provisions, we disagree that these regulations collectively presume that an innovative assessment system which achieves statewide implementation status will automatically be deemed final or successful.
Changes: None.
Comments: One commenter suggested that the Department include additional steps in the transition to statewide use of the innovative assessment to strengthen the transparency and ensure the quality of the system to be implemented. First, the commenter suggested that an SEA be required to affirmatively notify the Secretary and the LEAs in the State of its intention to move forward with the innovative assessment, replacing the statewide assessment. Second, the commenter recommended that the State receive validation that the innovative assessment meets peer review before the State makes the transition, instead of after, as in proposed § 200.79(a)(1).
Discussion: The Department appreciates the concerns voiced by this commenter. The Department believes that the requirements in new §§ 200.105 and 200.106 (proposed §§ 200.77 and 200.78) collectively address the concerns of the commenter regarding LEA notification and transparency. The application requirements in new § 200.105(d)(3), requiring an annual update on the SEA's progress in scaling the innovative assessment system statewide, are sufficient to ensure that the Secretary will be notified when the State begins implementing the innovative assessment system statewide. Specifically, the annual report must include a timeline for and an update on progress toward full statewide implementation of the innovative assessment system. In addition, consistent with final §§ 200.105(d)(3) and 200.106(e), the annual report must include the results of the comparability determination required under final § 200.105(b)(4).
Finally, the peer review of the innovative assessment system required under new § 200.107(a)(1) (proposed § 200.79(a)(1)) for transitioning out of the demonstration authority follows the same requirements that apply to all statewide assessments used to meet the requirements under title I, part A; that is, the peer review is conducted after the first administration of a new statewide assessment, which ensures that all necessary evidence will be available for submission to the Department.
Changes: None.
Comments: One commenter asked the Department to provide greater clarity on what steps the State will need to take if the innovative assessment system does not meet the requirements of proposed § 200.79(b). That section outlines the requirements the assessment system must meet before it can be used for purposes of both academic assessments and accountability under section 1111 of the ESEA. The commenter recommended that in such situations, a State be granted an extension under proposed § 200.80 or be required to return immediately to the previous statewide academic assessment.
Discussion: The Department agrees that States need to follow a clearly defined process in the event that the innovative assessment system does not meet the requirements of new § 200.107(b) (proposed § 200.79(b)). The Department believes, however, that the regulations in new § 200.108(a)-(b) (proposed § 200.80(a)-(b)) provide such a clearly defined process, both for granting an extension and for withdrawing authority and returning to a statewide assessment, and declines to make further changes.
Changes: None.
Flexibility in Scaling Statewide
Comments: Multiple commenters requested that States be permitted to administer multiple assessments as part of the innovative assessment system. Commenters recommended that States should not be required to scale a single innovative assessment.
Discussion: The Department believes that the intent of the statute is to provide States the ability to implement an innovative assessment system as defined in final § 200.104(b)(3) (proposed § 200.76(b)(2)). States have broad flexibility to develop and design their system within the parameters of this definition, which allows for multiple assessments to be given in a single grade, including performance tasks, instructionally embedded assessments, and interim assessments.
Changes: None.
Comments: One commenter requested that States receive flexibility such that at the end of the innovative assessment demonstration authority, once the innovative assessment system has been successfully piloted, peer reviewed, and approved, the State could keep both its statewide assessment system and its innovative assessment system and allow LEAs to choose one for purposes of accountability and reporting.
Discussion: The purpose of innovative assessment demonstration authority under section 1204 of the ESEA is to provide States the flexibility to pilot an innovative assessment system with the purpose of scaling the innovative assessment system to statewide use. Once the State transitions to statewide use, the innovative assessment system must meet the requirements of section 1111(b)(2) of the ESEA. Under section 1111(b)(2)(B), a State must use the same academic assessment system to measure the achievement of all students and evaluate their achievement against the same challenging State academic achievement standards. To meet the requirement under section 1111(b)(2)(B), the State must select either its statewide assessment system or the innovative assessment system; it cannot offer a choice to LEAs. Finally, we note that section 1204(i) of the ESEA grants the Secretary authority to withdraw demonstration authority if the State cannot provide a high-quality plan for transition to full statewide use of the innovative assessment system. Thus, we believe allowing States to offer a choice to LEAs would be inconsistent with this statutory provision as well.
Changes: None.
Evaluation of Demonstration Authority
Comments: One commenter expressed concern about how the proposed regulations define a baseline year for purposes of evaluating the innovative assessment system. Since States may pilot their innovative assessment systems prior to receiving demonstration authority, the first year of innovative demonstration authority may not be the first year the test is administered, but may be the first year the test is administered for accountability purposes.
Discussion: The Department appreciates the commenter's request for clarification. We are adding to new § 200.107(c) (proposed § 200.79(c)) to clarify that the baseline year for an evaluation of the innovative assessment system is the first year the innovative assessment system is administered in an LEA under the demonstration authority.
Changes: We have added to § 200.107(c) to clarify that the baseline year is the first year the innovative assessment system is administered in an LEA under the demonstration authority.
Comments: Several commenters supported proposed § 200.79(b)(2), which would require that the SEA evaluate the statistical relationship between student performance on the innovative assessment and other measures of success. The commenters proposed a clarification to allow for the Department, peer reviewers, and States to take into account measures other than student performance. They strongly encouraged the Department to clarify that student performance should not be the only criterion used to determine that the innovative assessment system is of high quality, can replace the statewide assessments, and can be used for both accountability and reporting.
Discussion: The Department appreciates the commenters' concerns. The requirement to provide evidence of the statistical relationship between student performance on the innovative assessment and student performance on other measures of success is just one requirement in final § 200.107 (proposed § 200.79) for States to demonstrate that their innovative assessments are of “high quality” and may be used for purposes of State assessments and accountability under section 1111 of the ESEA. The analysis of the relationship of student performance on the innovative assessment for each grade and subject to other measures must consider the relationship between the innovative assessment and the measures used in the remaining accountability indicators that do not rely on data from the State's academic content assessments (e.g., the Graduation Rate indicator, the Progress in Achieving English Language Proficiency indicator, and a School Quality or Student Success indicator), and may also examine the relationship of student performance on the innovative assessment to student performance on other assessments, such as NAEP, TIMSS, or college entrance exams, or to measures other than test scores, such as college enrollment rates or success in related entry-level, college credit-bearing courses. This analysis provides validity evidence and is considered in the Department's peer review of State assessments under section 1111(a)(4) of the ESEA, as well as final § 200.107(b)(2). Additional evidence is required in peer review and will be considered in the determination that an innovative assessment system is of high quality. Since other measures would be included in peer review, as reflected in final § 200.107, to evaluate whether an innovative assessment is of high quality, we do not believe it is necessary to clarify that measures other than student performance can be taken into account.
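As an illustration only, the following minimal sketch shows one simple way an SEA's evaluators might begin to examine such a statistical relationship, here as a correlation between innovative assessment scores and a hypothetical external measure. The data and the choice of statistic are assumptions, not requirements of these regulations.

```python
# Illustrative sketch only: correlating innovative assessment results with an
# external measure of success. All data and measure names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
innovative_scores = rng.normal(250, 30, 500)
# Hypothetical external measure, e.g., scores on a nationally administered assessment
external_scores = 0.8 * innovative_scores + rng.normal(0, 20, 500)

r = np.corrcoef(innovative_scores, external_scores)[0, 1]
print(f"Pearson correlation with the external measure: {r:.2f}")
```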
Changes: None.
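As an illustration of the kind of validity evidence discussed above, the sketch below shows in Python (3.10 or later, using only the standard-library statistics module) how an analyst might examine the statistical relationship between school-level performance on an innovative assessment and other measures of success. The data values, the school-level aggregation, and the use of a simple Pearson correlation are assumptions for illustration only; the regulations do not prescribe any particular statistical method, and an actual demonstration would rest on student-level data, larger samples, and the full body of evidence reviewed through assessment peer review.

```python
# Illustrative sketch only: data, measure names, and the choice of a Pearson
# correlation are assumptions, not requirements of the regulations.
from statistics import correlation, mean  # requires Python 3.10+

# Hypothetical school-level results: mean scale score on the innovative
# assessment alongside other measures of success for the same schools.
schools = [
    # (innovative assessment mean score, graduation rate, school quality index)
    (512.0, 0.91, 74.0),
    (488.5, 0.84, 68.5),
    (530.2, 0.95, 81.0),
    (471.9, 0.79, 62.3),
    (503.4, 0.88, 70.9),
]

innovative_scores = [s[0] for s in schools]
graduation_rates = [s[1] for s in schools]
quality_index = [s[2] for s in schools]

# Pearson correlations between innovative assessment performance and each
# other measure; an SEA would report results like these, with appropriate
# caveats about sample size and aggregation, as one piece of validity evidence.
print("r(innovative, graduation rate):",
      round(correlation(innovative_scores, graduation_rates), 3))
print("r(innovative, school quality):",
      round(correlation(innovative_scores, quality_index), 3))
print("mean innovative score:", round(mean(innovative_scores), 1))
```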
200.108 Extension, Waivers, and Withdrawal of Authority
Withdrawal of Authority
Comments: One commenter urged the Department to clearly articulate the Secretary's ability to withdraw innovative assessment authority if a State cannot demonstrate comparability or sufficient quality in order to ensure the innovative assessment system is an objective measure of student performance.
Discussion: Under section 1204 of the ESEA, the Secretary must withdraw a State's authority to implement an innovative assessment system if, at any time during the initial demonstration period or an extension period, the State cannot meet certain requirements, including requirements pertaining to comparability to statewide assessments (section 1204(i)(5) of the ESEA) and system quality (section 1204(j)(1)(A) of the ESEA).
Changes: None.
Extension
Comments: One commenter supported proposed § 200.80(a)(1)(iii) requiring SEAs requesting an extension to address the capacity of all LEAs to fully implement the innovative assessment system by the end of the extension period.
Discussion: The Department agrees with the commenter that SEAs must consider the readiness and capacity of all LEAs in planning for statewide implementation of the innovative assessment system. The regulations in this section help ensure that States are on track to implement the innovative assessment system statewide before receiving an extension.
Changes: None.
Waivers
Comments: Several commenters agreed with proposed § 200.80(c)(2), under which the Secretary may grant a one-year waiver to a State to delay withdrawal of the demonstration authority at the end of the extension period if a State's innovative assessment system has not yet met peer review requirements described in proposed § 200.79. One commenter supported the one-year cap on this waiver because, it asserted, States should not be given unlimited time to transition to statewide use of the innovative assessment system. Another commenter supported this requirement because it would ensure that States cannot operate two separate assessment systems for an extended period of time.
Several commenters requested that the Department remove the provision in proposed § 200.80(c)(2) because they opposed a one-year limitation on such waivers and asserted that this timeline was inconsistent with section 1204(j)(3) of the ESEA, which provides the Secretary with the authority to grant a waiver to delay withdrawal of authority in order to provide the State the time necessary to fully implement the innovative assessment system statewide. Commenters asserted that the variation in structure, design, and complexity of innovative assessment systems requires flexibility for States, and that the Department should not apply a standard expectation to all States and innovative assessment systems.
Discussion: We appreciate that innovative assessment systems will vary in complexity, and that some States may require more time than others to implement the innovative assessment system statewide. However, under the regulations, States have five years within the initial demonstration authority period to implement innovative assessments statewide, and may then request up to two years of extensions beyond that five-year period. Given that a State requesting the waiver would be in its eighth year of implementing the innovative assessments, we believe that a one-year limitation on the waiver is reasonable and appropriate to ensure that States move forward in implementing statewide assessment systems, consistent with the requirements of title I. The purpose of the innovative assessment demonstration authority is to scale innovative assessments statewide, not to allow States to administer two assessment systems indefinitely. In the unlikely scenario that a State needs more than eight years to implement its innovative assessment system statewide, including having the system peer reviewed, the Secretary retains authority under section 8401 of the ESEA to waive requirements of the ESEA.
Changes: None.
Executive Orders 12866 and 13563
Regulatory Impact Analysis
Under Executive Order 12866, OMB must determine whether this regulatory action is significant and, therefore, subject to the requirements of the Executive order and to review by OMB. Section 3(f) of Executive Order 12866 defines a “significant regulatory action” as an action likely to result in a rule that may—
(1) Have an annual effect on the economy of $100 million or more, or adversely affect a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities in a material way (also referred to as an “economically significant” rule);
(2) Create serious inconsistency or otherwise interfere with an action taken or planned by another agency;
(3) Materially alter the budgetary impacts of entitlement grants, user fees, or loan programs or the rights and obligations of recipients thereof; or
(4) Raise novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles stated in the Executive order.
This final regulatory action is significant and is subject to review by OMB under section 3(f) of Executive Order 12866.
We have also reviewed these regulations under Executive Order 13563, which supplements and explicitly reaffirms the principles, structures, and definitions governing regulatory review established in Executive Order 12866. To the extent permitted by law, Executive Order 13563 requires that an agency—
(1) Propose or adopt regulations only upon a reasoned determination that their benefits justify their costs (recognizing that some benefits and costs are difficult to quantify);
(2) Tailor its regulations to impose the least burden on society, consistent with obtaining regulatory objectives and taking into account, among other things and to the extent practicable, the costs of cumulative regulations;
(3) In choosing among alternative regulatory approaches, select those approaches that maximize net benefits (including potential economic, environmental, public health and safety, and other advantages; distributive impacts; and equity);
(4) To the extent feasible, specify performance objectives, rather than the behavior or manner of compliance a regulated entity must adopt; and
(5) Identify and assess available alternatives to direct regulation, including economic incentives such as user fees or marketable permits, to encourage the desired behavior, or provide information that enables the public to make choices.
Executive Order 13563 also requires an agency “to use the best available techniques to quantify anticipated present and future benefits and costs as accurately as possible.” The Office of Information and Regulatory Affairs of OMB has emphasized that these techniques may include “identifying changing future compliance costs that might result from technological innovation or anticipated behavioral changes.”
We are issuing these final regulations only on a reasoned determination that their benefits justify their costs. In choosing among alternative regulatory approaches, we selected those approaches that maximize net benefits. Based on the analysis that follows, the Department believes that these final regulations are consistent with the principles in Executive Order 13563.
We also have determined that this regulatory action would not unduly interfere with State, local, and tribal governments in the exercise of their governmental functions.
In accordance with both Executive orders, the Department has assessed the potential costs and benefits, both quantitative and qualitative, of this regulatory action. The potential costs associated with this regulatory action are those resulting from statutory requirements and those we have determined as necessary for administering the Department's programs and activities.
In this regulatory impact analysis we discuss the need for regulatory action and the potential costs and benefits. Elsewhere in this section under Paperwork Reduction Act of 1995, we discuss burdens associated with information collection requirements.
Need for Regulatory Action
The Department believes that regulatory action is needed to ensure effective implementation of section 1204 of the ESEA, which permits the Secretary to provide an SEA or consortium of SEAs that meets the application requirements with authority to establish, operate, and evaluate a system of innovative assessments. Crucially, and as discussed elsewhere in this document in response to concerns expressed by commenters that the regulations are overly prescriptive or might limit innovation, the Department believes that regulatory action is needed to ensure that these assessments ultimately can meet requirements for academic assessments and be used in statewide accountability systems under section 1111 of the ESEA, including requirements for assessment validity, reliability, technical quality, and alignment to challenging State academic standards. Absent regulatory action, SEAs implementing innovative assessment authority run a greater risk of developing assessments that are inappropriate or inadequate for these purposes, which could hinder State and local efforts to provide all children significant opportunity to receive a fair, equitable, and high-quality education and to close educational achievement gaps consistent with the purpose of title I of the ESEA.
Discussion of Potential Costs and Benefits
The primary benefit of these regulations is the administration of statewide assessments that more effectively measure student mastery of challenging State academic standards and better inform classroom instruction and student supports, ultimately leading to improved academic outcomes for all students. We believe that this benefit outweighs associated costs to an SEA, which may use funds received under the Grants for State Assessments and Related Activities program and funds reserved for State administration under part A of title I to participate in the demonstration authority. In addition, high-quality, innovative assessment models developed by participating SEAs under the demonstration authority can benefit other SEAs by providing examples of new assessment strategies for those SEAs to consider.
Participation in the innovative assessment demonstration authority is voluntary and limited during the initial demonstration period to seven SEAs. In light of the initial limits on participation, the number and rigor of the statutory application requirements, and the high degree of technical complexity involved in establishing, operating, and evaluating innovative assessment systems, we anticipate that few SEAs will seek to participate. Based on currently available information, we estimate that, initially, up to five SEAs will apply.
For those SEAs that apply and are provided demonstration authority (consistent with the final regulations), implementation costs may vary considerably based on a multitude of factors, including: The number and type(s) of assessments the SEA elects to include in its system; the differences between those assessments and the SEA's current statewide assessments, including with respect to assessment type, use of assessment items, and coverage of State academic content standards; the number of grades and subjects in which the SEA elects to administer those assessments; whether the SEA will implement its system statewide upon receiving demonstration authority and, if not, the SEA's process and timeline for scaling the system up to statewide implementation; and whether the SEA is part of a consortium (and thus may share certain costs with other consortium members). Because of the potential wide variation in innovative assessment systems along factors such as these, we did not provide estimates of the potential cost to implement innovative assessment demonstration authority for the typical SEA participant in the NPRM, stating that we believed such estimates would not be reliable or useful. We continue to believe that is the case, and note that we received no comments from SEAs providing specific anticipated costs that could inform our production of estimates.
That said, we received several comments expressing general concern about the potential cost of implementing innovative assessment demonstration authority, including concerns about additional costs to SEAs of implementing innovative assessments while also administering current State assessments in non-participating LEAs. Although we appreciate these general concerns, we remind the commenters that participation in innovative assessment demonstration authority is voluntary and that no SEA is required to develop and implement innovative assessments under this authority. Moreover, an SEA that chooses to participate has considerable flexibility in determining the number, types, and breadth of innovative assessments to include in its system. In selecting its assessments, such an SEA should accordingly be mindful of development and implementation costs, including the extent to which those costs can be supported with Federal grant funds not needed for other assessment purposes.
Regulatory Flexibility Act Certification
The Secretary certifies that these final requirements will not have a significant economic impact on a substantial number of small entities. Under the U.S. Small Business Administration's Size Standards, small entities include small governmental jurisdictions such as cities, towns, or school districts (LEAs) with a population of less than 50,000. Although the majority of LEAs that receive ESEA funds qualify as small entities under this definition, these regulations will not have a significant economic impact on these small LEAs because few SEAs are expected to participate in this voluntary innovative assessment demonstration authority and the costs of participation will be borne largely by SEAs and can be supported with Federal grant funds. We believe the benefits provided under this regulatory action outweigh any associated costs for these small LEAs. In particular, the final regulations will help ensure that the LEAs can implement assessments that measure student mastery of challenging State academic standards more effectively and better inform classroom instruction and student supports, ultimately leading to improved academic outcomes for all students.
Paperwork Reduction Act of 1995
The Paperwork Reduction Act of 1995 does not require you to respond to a collection of information unless it displays a valid OMB control number. We display the valid OMB control numbers assigned to the collections of information in these final regulations at the end of the affected sections of the regulations.
Sections 200.104(c), 200.105, and 200.106 of the final regulations contain information collection requirements. The Department will develop an Information Collection Request based upon these final regulations, and will submit a copy of these sections and the information collection instrument to OMB for its review before requiring the submission of any information based upon these regulations.
Intergovernmental Review
This program is not subject to Executive Order 12372 and the regulations in 34 CFR part 79.
Assessment of Educational Impact
In the NPRM we requested comments on whether the proposed regulations would require transmission of information that any other agency or authority of the United States gathers or makes available.
Based on the response to the NPRM and on our review, we have determined that these final regulations do not require transmission of information that any other agency or authority of the United States gathers or makes available.
Accessible Format: Individuals with disabilities can obtain this document in an accessible format (e.g., braille, large print, or electronic format) on request to the person listed under FOR FURTHER INFORMATION CONTACT.
Electronic Access to This Document: The official version of this document is the document published in the Federal Register. Free Internet access to the official edition of the Federal Register and the Code of Federal Regulations is available via the Federal Digital System at: www.gpo.gov/fdsys. At this site you can view this document, as well as all other documents of this Department published in the Federal Register, in text or Adobe Portable Document Format (PDF). To use PDF you must have Adobe Acrobat Reader, which is available free at the site.
You may also access documents of the Department published in the Federal Register by using the article search feature at: www.federalregister.gov. Specifically, through the advanced search feature at this site, you can limit your search to documents published by the Department. (Catalog of Federal Domestic Assistance Number does not apply.)
List of Subjects in 34 CFR Part 200
- Elementary and secondary education
- Grant programs—education
- Indians—education
- Infants and children
- Juvenile delinquency
- Migrant labor
- Private schools
- Reporting and recordkeeping requirements
Dated: November 30, 2016.
John B. King, Jr.,
Secretary of Education.
For the reasons discussed in the preamble, the Department of Education amends part 200 of title 34 of the Code of Federal Regulations as follows:
PART 200—TITLE I—IMPROVING THE ACADEMIC ACHIEVEMENT OF THE DISADVANTAGED
1. The authority citation for part 200 continues to read as follows:
2. Add a new undesignated center heading following § 200.103 to read as follows:
Innovative Assessment Demonstration Authority
3. Add § 200.104 to read as follows:
§ 200.104 Innovative assessment demonstration authority.
(a) In general. (1) The Secretary may provide a State educational agency (SEA), or consortium of SEAs, with authority to establish and operate an innovative assessment system in its public schools (hereinafter referred to as “innovative assessment demonstration authority”).
(2) An SEA or consortium of SEAs may implement the innovative assessment demonstration authority during its demonstration authority period and, if applicable, extension or waiver period described in § 200.108(a) and (c), after which the Secretary will either approve the system for statewide use consistent with § 200.107 or withdraw the authority consistent with § 200.108(b).
(b) Definitions. For purposes of §§ 200.104 through 200.108—
(1) Affiliate member of a consortium means an SEA that is formally associated with a consortium of SEAs that is implementing the innovative assessment demonstration authority, but is not yet a full member of the consortium because it is not proposing to use the consortium's innovative assessment system under the demonstration authority, instead of, or in addition to, its statewide assessment under section 1111(b)(2) of the Elementary and Secondary Education Act of 1965, as amended by the Every Student Succeeds Act (hereinafter “the Act”) for purposes of accountability and reporting under sections 1111(c) and 1111(h) of the Act.
(2) Demonstration authority period refers to the period of time over which an SEA, or consortium of SEAs, is authorized to implement the innovative assessment demonstration authority, which may not exceed five years and does not include the extension or waiver period under § 200.108. An SEA must use its innovative assessment system in all participating schools instead of, or in addition to, the statewide assessment under section 1111(b)(2) of the Act for purposes of accountability and reporting under section 1111(c) and 1111(h) of the Act in each year of the demonstration authority period.
(3) Innovative assessment system means a system of assessments, which may include any combination of general assessments or alternate assessments aligned with alternate academic achievement standards, in reading/language arts, mathematics, or science administered in at least one required grade under § 200.5(a)(1) and section 1111(b)(2)(B)(v) of the Act that—
(i) Produces—
(A) An annual summative determination of each student's mastery of grade-level content standards aligned to the challenging State academic standards under section 1111(b)(1) of the Act; or
(B) In the case of a student with the most significant cognitive disabilities assessed with an alternate assessment aligned with alternate academic achievement standards under section 1111(b)(1)(E) of the Act and aligned with the State's academic content standards for the grade in which the student is enrolled, an annual summative determination relative to such alternate academic achievement standards for each such student; and
(ii) May, in any required grade or subject, include one or more of the following types of assessments:
(A) Cumulative year-end assessments.
(B) Competency-based assessments.
(C) Instructionally embedded assessments.
(D) Interim assessments.
(E) Performance-based assessments.
(F) Another innovative assessment design that meets the requirements under § 200.105(b).
(4) Participating LEA means a local educational agency (LEA) in the State with at least one school participating in the innovative assessment demonstration authority.
(5) Participating school means a public school in the State in which the innovative assessment system is administered under the innovative assessment demonstration authority instead of, or in addition to, the statewide assessment under section 1111(b)(2) of the Act and where the results of the school's students on the innovative assessment system are used by its State and LEA for purposes of accountability and reporting under section 1111(c) and 1111(h) of the Act.
(c) Peer review of applications. (1) An SEA or consortium of SEAs seeking innovative assessment demonstration authority under paragraph (a) of this section must submit an application to the Secretary that demonstrates how the applicant meets all application requirements under § 200.105 and that addresses all selection criteria under § 200.106.
(2) The Secretary uses a peer review process, including a review of the SEA's application to determine that it meets or will meet each of the requirements under § 200.105 and sufficiently addresses each of the selection criteria under § 200.106, to inform the Secretary's decision of whether to award the innovative assessment demonstration authority to an SEA or consortium of SEAs. Peer review teams consist of experts and State and local practitioners who are knowledgeable about innovative assessment systems, including—
(i) Individuals with past experience developing innovative assessment and accountability systems that support all students and subgroups of students described in section 1111(c)(2) of the Act (e.g., psychometricians, measurement experts, researchers); and
(ii) Individuals with experience implementing such innovative assessment and accountability systems (e.g., State and local assessment directors, educators).
(3)(i) If points or weights are assigned to the selection criteria under § 200.106, the Secretary will inform applicants in the application package or a notice published in the Federal Register of—
(A) The total possible score for all of the selection criteria under § 200.106; and
(B) The assigned weight or the maximum possible score for each criterion or factor under that criterion.
(ii) If no points or weights are assigned to the selection criteria and selected factors under § 200.106, the Secretary will evaluate each criterion equally and, within each criterion, each factor equally.
(d) Initial demonstration period. (1) The initial demonstration period is the first three years in which the Secretary awards at least one SEA, or consortium of SEAs, innovative assessment demonstration authority, concluding with publication of the progress report described in section 1204(c) of the Act. During the initial demonstration period, the Secretary may provide innovative assessment demonstration authority to—
(i) No more than seven SEAs in total, including those SEAs participating in consortia; and
(ii) Consortia that include no more than four SEAs.
(2) An SEA that is an affiliate member of a consortium is not included in the application under paragraph (c) of this section or counted toward the limitation in consortia size under paragraph (d)(1)(ii) of this section.
(Authority: 20 U.S.C. 1221e-3, 3474, 6364, 6571)
4. Add § 200.105 to read as follows:
§ 200.105 Demonstration authority application requirements.
An SEA or consortium of SEAs seeking the innovative assessment demonstration authority must submit to the Secretary, at such time and in such manner as the Secretary may reasonably require, an application that includes the following:
(a) Consultation. Evidence that the SEA or consortium has developed an innovative assessment system in collaboration with—
(1) Experts in the planning, development, implementation, and evaluation of innovative assessment systems, which may include external partners; and
(2) Affected stakeholders in the State, or in each State in the consortium, including—
(i) Those representing the interests of children with disabilities, English learners, and other subgroups of students described in section 1111(c)(2) of the Act;
(ii) Teachers, principals, and other school leaders;
(iii) LEAs;
(iv) Representatives of Indian tribes located in the State;
(v) Students and parents, including parents of children described in paragraph (a)(2)(i) of this section; and
(vi) Civil rights organizations.
(b) Innovative assessment system. A demonstration that the innovative assessment system does or will—
(1) Meet the requirements of section 1111(b)(2)(B) of the Act, except that an innovative assessment—
(i) Need not be the same assessment administered to all public elementary and secondary school students in the State during the demonstration authority period described in § 200.104(b)(2) or extension period described in § 200.108 and prior to statewide use consistent with § 200.107, if the innovative assessment system will be administered initially to all students in participating schools within a participating LEA, provided that the statewide academic assessments under § 200.2(a)(1) and section 1111(b)(2) of the Act are administered to all students in any non-participating LEA or any non-participating school within a participating LEA; and
(ii) Need not be administered annually in each of grades 3-8 and at least once in grades 9-12 in the case of reading/language arts and mathematics assessments, and at least once in grades 3-5, 6-9, and 10-12 in the case of science assessments, so long as the statewide academic assessments under § 200.2(a)(1) and section 1111(b)(2) of the Act are administered in any required grade and subject under § 200.5(a)(1) in which the SEA does not choose to implement an innovative assessment;
(2)(i) Align with the challenging State academic content standards under section 1111(b)(1) of the Act, including the depth and breadth of such standards, for the grade in which a student is enrolled; and
(ii) May measure a student's academic proficiency and growth using items above or below the student's grade level so long as, for purposes of meeting the requirements for reporting and school accountability under sections 1111(c) and 1111(h) of the Act and paragraphs (b)(3) and (b)(7)-(9) of this section, the State measures each student's academic proficiency based on the challenging State academic standards for the grade in which the student is enrolled;
(3) Express student results or competencies consistent with the challenging State academic achievement standards under section 1111(b)(1) of the Act and identify which students are not making sufficient progress toward, and attaining, grade-level proficiency on such standards;
(4)(i) Generate results, including annual summative determinations as defined in paragraph (b)(7) of this section, that are valid, reliable, and comparable for all students and for each subgroup of students described in § 200.2(b)(11)(i)(A)-(I) and sections 1111(b)(2)(B)(xi) and 1111(h)(1)(C)(ii) of the Act, to the results generated by the State academic assessments described in § 200.2(a)(1) and section 1111(b)(2) of the Act for such students. Consistent with the SEA's or consortium's evaluation plan under § 200.106(e), the SEA must plan to annually determine comparability during each year of its demonstration authority period in one of the following ways:
(A) Administering full assessments from both the innovative and statewide assessment systems to all students enrolled in participating schools, such that at least once in any grade span (i.e., 3-5, 6-8, or 9-12) and subject for which there is an innovative assessment, a statewide assessment in the same subject would also be administered to all such students. As part of this determination, the innovative assessment and statewide assessment need not be administered to an individual student in the same school year.
(B) Administering full assessments from both the innovative and statewide assessment systems to a demographically representative sample of all students and subgroups of students described in section 1111(c)(2) of the Act, from among those students enrolled in participating schools, such that at least once in any grade span (i.e., 3-5, 6-8, or 9-12) and subject for which there is an innovative assessment, a statewide assessment in the same subject would also be administered in the same school year to all students included in the sample.
(C) Including, as a significant portion of the innovative assessment system in each required grade and subject in which both an innovative and statewide assessment are administered, items or performance tasks from the statewide assessment system that, at a minimum, have been previously pilot tested or field tested for use in the statewide assessment system.
(D) Including, as a significant portion of the statewide assessment system in each required grade and subject in which both an innovative and statewide assessment are administered, items or performance tasks from the innovative assessment system that, at a minimum, have been previously pilot tested or field tested for use in the innovative assessment system.
(E) An alternative method for demonstrating comparability that an SEA can demonstrate will provide for an equally rigorous and statistically valid comparison between student performance on the innovative assessment and the statewide assessment, including for each subgroup of students described in § 200.2(b)(11)(i)(A)-(I) and sections 1111(b)(2)(B)(xi) and 1111(h)(1)(C)(ii) of the Act; and
(ii) Generate results, including annual summative determinations as defined in paragraph (b)(7) of this section, that are valid, reliable, and comparable, for all students and for each subgroup of students described in § 200.2(b)(11)(i)(A)-(I) and sections 1111(b)(2)(B)(xi) and 1111(h)(1)(C)(ii) of the Act, among participating schools and LEAs in the innovative assessment demonstration authority. Consistent with the SEA's or consortium's evaluation plan under § 200.106(e), the SEA must plan to annually determine comparability during each year of its demonstration authority period;
(5)(i) Provide for the participation of all students, including children with disabilities and English learners;
(ii) Be accessible to all students by incorporating the principles of universal design for learning, to the extent practicable, consistent with § 200.2(b)(2)(ii); and
(iii) Provide appropriate accommodations consistent with § 200.6(b) and (f)(1)(i) and section 1111(b)(2)(B)(vii) of the Act;
(6) For purposes of the State accountability system consistent with section 1111(c)(4)(E) of the Act, annually measure in each participating school progress on the Academic Achievement indicator under section 1111(c)(4)(B) of the Act of at least 95 percent of all students, and 95 percent of students in each subgroup of students described in section 1111(c)(2) of the Act, who are required to take such assessments consistent with paragraph (b)(1)(ii) of this section;
(7) Generate an annual summative determination of achievement, using the annual data from the innovative assessment, for each student in a participating school in the demonstration authority that describes—
(i) The student's mastery of the challenging State academic standards under section 1111(b)(1) of the Act for the grade in which the student is enrolled; or
(ii) In the case of a student with the most significant cognitive disabilities assessed with an alternate assessment aligned with alternate academic achievement standards under section 1111(b)(1)(E) of the Act, the student's mastery of those standards;
(8) Provide disaggregated results by each subgroup of students described in § 200.2(b)(11)(i)(A)-(I) and sections 1111(b)(2)(B)(xi) and 1111(h)(1)(C)(ii) of the Act, including timely data for teachers, principals and other school leaders, students, and parents consistent with § 200.8 and section 1111(b)(2)(B)(x) and (xii) and section 1111(h) of the Act, and provide results to parents in a manner consistent with paragraph (b)(4)(i) of this section and § 200.2(e); and
(9) Provide an unbiased, rational, and consistent determination of progress toward the State's long-term goals for academic achievement under section 1111(c)(4)(A) of the Act for all students and each subgroup of students described in section 1111(c)(2) of the Act and a comparable measure of student performance on the Academic Achievement indicator under section 1111(c)(4)(B) of the Act for participating schools relative to non-participating schools so that the SEA may validly and reliably aggregate data from the system for purposes of meeting requirements for—
(i) Accountability under sections 1003 and 1111(c) and (d) of the Act, including how the SEA will identify participating and non-participating schools in a consistent manner for comprehensive and targeted support and improvement under section 1111(c)(4)(D) of the Act; and
(ii) Reporting on State and LEA report cards under section 1111(h) of the Act.
(c) Selection criteria. Information that addresses each of the selection criteria under § 200.106.
(d) Assurances. Assurances that the SEA, or each SEA in a consortium, will—
(1) Continue use of the statewide academic assessments in reading/language arts, mathematics, and science required under § 200.2(a)(1) and section 1111(b)(2) of the Act—
(i) In all non-participating schools; and
(ii) In all participating schools for which such assessments will be used in addition to innovative assessments for accountability purposes under section 1111(c) of the Act consistent with paragraph (b)(1)(ii) of this section or for evaluation purposes consistent with § 200.106(e) during the demonstration authority period;
(2) Ensure that all students and each subgroup of students described in section 1111(c)(2) of the Act in participating schools are held to the same challenging State academic standards under section 1111(b)(1) of the Act as all other students, except that students with the most significant cognitive disabilities may be assessed with alternate assessments aligned with alternate academic achievement standards consistent with § 200.6 and section 1111(b)(1)(E) and (b)(2)(D) of the Act, and receive the instructional support needed to meet such standards;
(3) Report the following annually to the Secretary, at such time and in such manner as the Secretary may reasonably require:
(i) An update on implementation of the innovative assessment demonstration authority, including—
(A) The SEA's progress against its timeline under § 200.106(c) and any outcomes or results from its evaluation and continuous improvement process under § 200.106(e); and
(B) If the innovative assessment system is not yet implemented statewide consistent with § 200.104(a)(2), a description of the SEA's progress in scaling up the system to additional LEAs or schools consistent with its strategies under § 200.106(a)(3)(i), including updated assurances from participating LEAs consistent with paragraph (e)(2) of this section.
(ii) The performance of students in participating schools at the State, LEA, and school level, for all students and disaggregated for each subgroup of students described in section 1111(c)(2) of the Act, on the innovative assessment, including academic achievement and participation data required to be reported consistent with section 1111(h) of the Act, except that such data may not reveal any personally identifiable information.
(iii) If the innovative assessment system is not yet implemented statewide, school demographic information, including enrollment and student achievement information, for the subgroups of students described in section 1111(c)(2) of the Act, among participating schools and LEAs and for any schools or LEAs that will participate for the first time in the following year, and a description of how the participation of any additional schools or LEAs in that year contributed to progress toward achieving high-quality and consistent implementation across demographically diverse LEAs in the State consistent with the SEA's benchmarks described in § 200.106(a)(3)(iii).
(iv) Feedback from teachers, principals and other school leaders, and other stakeholders consulted under paragraph (a)(2) of this section, including parents and students, from participating schools and LEAs about their satisfaction with the innovative assessment system;
(4) Ensure that each participating LEA informs parents of all students in participating schools about the innovative assessment, including the grades and subjects in which the innovative assessment will be administered, and, consistent with section 1112(e)(2)(B) of the Act, at the beginning of each school year during which an innovative assessment will be implemented. Such information must be—
(i) In an understandable and uniform format;
(ii) To the extent practicable, written in a language that parents can understand or, if it is not practicable to provide written translations to a parent with limited English proficiency, be orally translated for such parent; and
(iii) Upon request by a parent who is an individual with a disability as defined by the Americans with Disabilities Act, provided in an alternative format accessible to that parent; and
(5) Coordinate with and provide information to, as applicable, the Institute of Education Sciences for purposes of the progress report described in section 1204(c) of the Act and ongoing dissemination of information under section 1204(m) of the Act.
(e) Initial implementation in a subset of LEAs or schools. If the innovative assessment system will initially be administered in a subset of LEAs or schools in a State—
(1) A description of each LEA, and each of its participating schools, that will initially participate, including demographic information and its most recent LEA report card under section 1111(h)(2) of the Act; and
(2) An assurance from each participating LEA, for each year that the LEA is participating, that the LEA will comply with all requirements of this section.
(f) Application from a consortium of SEAs. If an application for the innovative assessment demonstration authority is submitted by a consortium of SEAs—
(1) A description of the governance structure of the consortium, including—
(i) The roles and responsibilities of each member SEA, which may include a description of affiliate members, if applicable, and must include a description of financial responsibilities of member SEAs;
(ii) How the member SEAs will manage and, at their discretion, share intellectual property developed by the consortium as a group; and
(iii) How the member SEAs will consider requests from SEAs to join or leave the consortium and ensure that changes in membership do not affect the consortium's ability to implement the innovative assessment demonstration authority consistent with the requirements and selection criteria in this section and § 200.106.
(2) While the terms of the association with affiliate members are defined by each consortium, consistent with § 200.104(b)(1) and paragraph (f)(1)(i) of this section, for an affiliate member to become a full member of the consortium and to use the consortium's innovative assessment system under the demonstration authority, the consortium must submit a revised application to the Secretary for approval, consistent with the requirements of this section and § 200.106 and subject to the limitation under § 200.104(d).
(Authority: 20 U.S.C. 1221e-3, 3474, 6364, 6571; 29 U.S.C. 794; 42 U.S.C. 2000d-1; 42 U.S.C. 12101; 42 U.S.C. 12102)
5. Add § 200.106 to read as follows:
§ 200.106 Demonstration authority selection criteria.
The Secretary reviews an application by an SEA or consortium of SEAs seeking innovative assessment demonstration authority consistent with § 200.104(c) based on the following selection criteria:
(a) Project narrative. The quality of the SEA's or consortium's plan for implementing the innovative assessment demonstration authority. In determining the quality of the plan, the Secretary considers—
(1) The rationale for developing or selecting the particular innovative assessment system to be implemented under the demonstration authority, including—
(i) The distinct purpose of each assessment that is part of the innovative assessment system and how the system will advance the design and delivery of large-scale, statewide academic assessments in innovative ways; and
(ii) The extent to which the innovative assessment system as a whole will promote high-quality instruction, mastery of challenging State academic standards, and improved student outcomes, including for each subgroup of students described in section 1111(c)(2) of the Act;
(2) The plan the SEA or consortium, in consultation with any external partners, if applicable, has to—
(i) Develop and use standardized and calibrated tools, rubrics, methods, or other strategies for scoring innovative assessments throughout the demonstration authority period, consistent with relevant nationally recognized professional and technical standards, to ensure inter-rater reliability and comparability of innovative assessment results consistent with § 200.105(b)(4)(ii), which may include evidence of inter-rater reliability; and
(ii) Train evaluators to use such strategies, if applicable; and
(3) If the system will initially be administered in a subset of schools or LEAs in a State—
(i) The strategies the SEA, including each SEA in a consortium, will use to scale the innovative assessment to all schools statewide, with a rationale for selecting those strategies;
(ii) The strength of the SEA's or consortium's criteria that will be used to determine LEAs and schools that will initially participate and when to approve additional LEAs and schools, if applicable, to participate during the requested demonstration authority period; and
(iii) The SEA's plan, including each SEA in a consortium, for how it will ensure that, during the demonstration authority period, the inclusion of additional LEAs and schools continues to reflect high-quality and consistent implementation across demographically diverse LEAs and schools, or contributes to progress toward achieving such implementation across demographically diverse LEAs and schools, including diversity based on enrollment of subgroups of students described in section 1111(c)(2) of the Act and student achievement. The plan must also include annual benchmarks toward achieving high-quality and consistent implementation across participating schools that are, as a group, demographically similar to the State as a whole during the demonstration authority period, using the demographics of initially participating schools as a baseline.
(b) Prior experience, capacity, and stakeholder support. (1) The extent and depth of prior experience that the SEA, including each SEA in a consortium, and its LEAs have in developing and implementing the components of the innovative assessment system. An SEA may also describe the prior experience of any external partners that will be participating in or supporting its demonstration authority in implementing those components. In evaluating the extent and depth of prior experience, the Secretary considers—
(i) The success and track record of efforts to implement innovative assessments or innovative assessment items aligned to the challenging State academic standards under section 1111(b)(1) of the Act in LEAs planning to participate; and
(ii) The SEA's or LEA's development or use of—
(A) Effective supports and appropriate accommodations consistent with § 200.6(b) and (f)(1)(i) and section 1111(b)(2)(B)(vii) of the Act for administering innovative assessments to all students, including English learners and children with disabilities, which must include professional development for school staff on providing such accommodations;
(B) Effective and high-quality supports for school staff to implement innovative assessments and innovative assessment items, including professional development; and
(C) Standardized and calibrated tools, rubrics, methods, or other strategies for scoring innovative assessments, with documented evidence of the validity, reliability, and comparability of annual summative determinations of achievement, consistent with § 200.105(b)(4) and (7).
(2) The extent and depth of SEA, including each SEA in a consortium, and LEA capacity to implement the innovative assessment system considering the availability of technological infrastructure; State and local laws; dedicated and sufficient staff, expertise, and resources; and other relevant factors. An SEA or consortium may also describe how it plans to enhance its capacity by collaborating with external partners that will be participating in or supporting its demonstration authority. In evaluating the extent and depth of capacity, the Secretary considers—
(i) The SEA's analysis of how capacity influenced the success of prior efforts to develop and implement innovative assessments or innovative assessment items; and
(ii) The strategies the SEA is using, or will use, to mitigate risks, including those identified in its analysis, and support successful implementation of the innovative assessment.
(3) The extent and depth of State and local support for the application for demonstration authority in each SEA, including each SEA in a consortium, as demonstrated by signatures from the following:
(i) Superintendents (or equivalent) of LEAs, including participating LEAs in the first year of the demonstration authority period.
(ii) Presidents of local school boards (or equivalent, where applicable), including within participating LEAs in the first year of the demonstration authority.
(iii) Local teacher organizations (including labor organizations, where applicable), including within participating LEAs in the first year of the demonstration authority.
(iv) Other affected stakeholders, such as parent organizations, civil rights organizations, and business organizations.
(c) Timeline and budget. The quality of the SEA's or consortium's timeline and budget for implementing the innovative assessment demonstration authority. In determining the quality of the timeline and budget, the Secretary considers—
(1) The extent to which the timeline reasonably demonstrates that each SEA will implement the system statewide by the end of the requested demonstration authority period, including a description of—
(i) The activities to occur in each year of the requested demonstration authority period;
(ii) The parties responsible for each activity; and
(iii) If applicable, how a consortium's member SEAs will implement activities at different paces and how the consortium will implement interdependent activities, so long as each non-affiliate member SEA begins using the innovative assessment in the same school year consistent with § 200.104(b)(2); and
(2) The adequacy of the project budget for the duration of the requested demonstration authority period, including Federal, State, local, and non-public sources of funds to support and sustain, as applicable, the activities in the timeline under paragraph (c)(1) of this section, including—
(i) How the budget will be sufficient to meet the expected costs at each phase of the SEA's planned expansion of its innovative assessment system; and
(ii) The degree to which funding in the project budget is contingent upon future appropriations at the State or local level or additional commitments from non-public sources of funds.
(d) Supports for educators, students, and parents. The quality of the SEA or consortium's plan to provide supports that can be delivered consistently at scale to educators, students, and parents to enable successful implementation of the innovative assessment system and improve instruction and student outcomes. In determining the quality of supports, the Secretary considers—
(1) The extent to which the SEA or consortium has developed, provided, and will continue to provide training to LEA and school staff, including teachers, principals, and other school leaders, that will familiarize them with the innovative assessment system and develop teacher capacity to implement instruction that is informed by the innovative assessment system and its results;
(2) The strategies the SEA or consortium has developed and will use to familiarize students and parents with the innovative assessment system;
(3) The strategies the SEA will use to ensure that all students and each subgroup of students under section 1111(c)(2) of the Act in participating schools receive the support, including appropriate accommodations consistent with § 200.6(b) and (f)(1)(i) and section 1111(b)(2)(B)(vii) of the Act, needed to meet the challenging State academic standards under section 1111(b)(1) of the Act; and
(4) If the system includes assessment items that are locally developed or locally scored, the strategies and safeguards (e.g., test blueprints, item and task specifications, rubrics, scoring tools, documentation of quality control procedures, inter-rater reliability checks, audit plans) the SEA or consortium has developed, or plans to develop, to validly and reliably score such items, including how the strategies engage and support teachers and other staff in designing, developing, implementing, and validly and reliably scoring high-quality assessments; how the safeguards are sufficient to ensure unbiased, objective scoring of assessment items; and how the SEA will use effective professional development to aid in these efforts.
(e) Evaluation and continuous improvement. The quality of the SEA's or consortium's plan to annually evaluate its implementation of innovative assessment demonstration authority. In determining the quality of the evaluation, the Secretary considers—
(1) The strength of the proposed evaluation of the innovative assessment system included in the application, including whether the evaluation will be conducted by an independent, experienced third party, and the likelihood that the evaluation will sufficiently determine the system's validity, reliability, and comparability to the statewide assessment system consistent with the requirements of § 200.105(b)(4) and (9); and
(2) The SEA's or consortium's plan for continuous improvement of the innovative assessment system, including its process for—
(i) Using data, feedback, evaluation results, and other information from participating LEAs and schools to make changes to improve the quality of the innovative assessment; and
(ii) Evaluating and monitoring implementation of the innovative assessment system in participating LEAs and schools annually.
(Authority: 20 U.S.C. 1221e-3, 3474, 6364, 6571)
6. Add § 200.107 to read as follows:
§ 200.107 Transition to statewide use.
(a)(1) After an SEA has scaled its innovative assessment system to operate statewide in all schools and LEAs in the State, the SEA must submit evidence for peer review under section 1111(a)(4) of the Act and § 200.2(d) to determine whether the system may be used for purposes of both academic assessments and the State accountability system under sections 1111(b)(2), (c), and (d) and 1003 of the Act.
(2) An SEA may only use the innovative assessment system for the purposes described in paragraph (a)(1) of this section if the Secretary determines that the system is of high quality consistent with paragraph (b) of this section.
(b) Through the peer review process of State assessments and accountability systems under section 1111(a)(4) of the Act and § 200.2(d), the Secretary determines that the innovative assessment system is of high quality if—
(1) An innovative assessment developed in any grade or subject under § 200.5(a)(1) and section 1111(b)(2)(B)(v) of the Act—
(i) Meets all of the requirements under section 1111(b)(2) of the Act and § 200.105(b) and (c);
(ii) Provides coherent and timely information about student achievement based on the challenging State academic standards under section 1111(b)(1) of the Act;
(iii) Includes objective measurements of academic achievement, knowledge, and skills; and
(iv) Is valid, reliable, and consistent with relevant, nationally recognized professional and technical standards;
(2) The SEA provides satisfactory evidence that it has examined the statistical relationship between student performance on the innovative assessment in each subject area and student performance on other measures of success, including the measures used for each relevant grade-span within the remaining indicators (i.e., indicators besides Academic Achievement) in the statewide accountability system under section 1111(c)(4)(B)(ii)-(v) of the Act, and how the inclusion of the innovative assessment in its Academic Achievement indicator under section 1111(c)(4)(B)(i) of the Act affects the annual meaningful differentiation of schools under section 1111(c)(4)(C) of the Act;
(3) The SEA has solicited information, consistent with the requirements under § 200.105(d)(3)(iv), and taken into account feedback from teachers, principals, other school leaders, parents, and other stakeholders under § 200.105(a)(2) about their satisfaction with the innovative assessment system; and
(4) The SEA has demonstrated that the same innovative assessment system was used to measure—
(i) The achievement of all students and each subgroup of students described in section 1111(c)(2) of the Act, and that appropriate accommodations were provided consistent with § 200.6(b) and (f)(1)(i) under section 1111(b)(2)(B)(vii) of the Act; and
(ii) For purposes of the State accountability system consistent with section 1111(c)(4)(E) of the Act, progress on the Academic Achievement indicator under section 1111(c)(4)(B)(i) of the Act of at least 95 percent of all students, and 95 percent of students in each subgroup of students described in section 1111(c)(2) of the Act.
(c) With respect to the evidence submitted to the Secretary to make the determination described in paragraph (b)(2) of this section, the baseline year for any evaluation is the first year that a participating LEA in the State administered the innovative assessment system under the demonstration authority.
(d) In the case of a consortium of SEAs, evidence may be submitted for the consortium as a whole so long as the evidence demonstrates how each member SEA meets each requirement of paragraph (b) of this section applicable to an SEA.
(Authority: 20 U.S.C. 1221e-3, 3474, 6311(a), 6364, 6571)
7. Add § 200.108 to read as follows:
§ 200.108 Extension, waivers, and withdrawal of authority.
(a) Extension. (1) The Secretary may extend an SEA's demonstration authority period for no more than two years if the SEA submits to the Secretary—
(i) Evidence that its innovative assessment system continues to meet the requirements under § 200.105 and the SEA continues to implement the plan described in its application in response to the selection criteria in § 200.106 in all participating schools and LEAs;
(ii) A high-quality plan, including input from stakeholders under § 200.105(a)(2), for transitioning to statewide use of the innovative assessment system by the end of the extension period; and
(iii) A demonstration that the SEA and all LEAs that are not yet fully implementing the innovative assessment system have sufficient capacity to support use of the system statewide by the end of the extension period.
(2) In the case of a consortium of SEAs, the Secretary may extend the demonstration authority period for the consortium as a whole or for an individual member SEA.
(b) Withdrawal of demonstration authority. (1) The Secretary may withdraw the innovative assessment demonstration authority provided to an SEA, including an individual SEA member of a consortium, if at any time during the approved demonstration authority period or extension period, the Secretary requests, and the SEA does not present in a timely manner—
(i) A high-quality plan, including input from stakeholders under § 200.105(a)(2), to transition to full statewide use of the innovative assessment system by the end of its approved demonstration authority period or extension period, as applicable; or
(ii) Evidence that—
(A) The innovative assessment system meets all requirements under § 200.105, including a demonstration that the innovative assessment system has met the requirements under § 200.105(b);
(B) The SEA continues to implement the plan described in its application in response to the selection criteria in § 200.106;
(C) The innovative assessment system includes and is used to assess all students attending participating schools in the demonstration authority, consistent with the requirements under section 1111(b)(2) of the Act to provide for participation in State assessments, including among each subgroup of students described in section 1111(c)(2) of the Act, and for appropriate accommodations consistent with § 200.6(b) and (f)(1)(i) and section 1111(b)(2)(B)(vii) of the Act;
(D) The innovative assessment system provides an unbiased, rational, and consistent determination of progress toward the State's long-term goals and measurements of interim progress for academic achievement under section 1111(c)(4)(A) of the Act for all students and subgroups of students described in section 1111(c)(2) of the Act and a comparable measure of student performance on the Academic Achievement indicator under section 1111(c)(4)(B)(i) of the Act for participating schools relative to non-participating schools; or
(E) The innovative assessment system demonstrates comparability to the statewide assessments under section 1111(b)(2) of the Act in content coverage, difficulty, and quality.
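The following sketch is likewise illustrative only and not part of the regulatory text. Assuming hypothetical school names, proficiency rates, and a review threshold, it shows one simple way an SEA might summarize evidence responsive to paragraph (b)(1)(ii)(D) of this section, comparing performance on the Academic Achievement indicator in participating schools relative to non-participating schools. A full comparability demonstration under paragraph (b)(1)(ii)(E) would ordinarily rely on a formal psychometric linking or comparability study rather than a summary of this kind.

```python
# Illustrative only; not part of the regulatory text. School names,
# proficiency rates, and the review threshold are assumptions.

from statistics import mean, stdev

# Hypothetical school-level proficiency rates on the Academic Achievement indicator.
participating = {"School A": 0.62, "School B": 0.58, "School C": 0.66}
non_participating = {"School D": 0.60, "School E": 0.64, "School F": 0.57}

def summarize(rates):
    """Return the mean and standard deviation of a set of school-level rates."""
    values = list(rates.values())
    return mean(values), stdev(values)

p_mean, p_sd = summarize(participating)
n_mean, n_sd = summarize(non_participating)
gap = p_mean - n_mean

print(f"participating schools:     mean={p_mean:.3f}, sd={p_sd:.3f}")
print(f"non-participating schools: mean={n_mean:.3f}, sd={n_sd:.3f}")
print(f"difference in means:       {gap:+.3f}")

# Hypothetical review threshold: flag larger gaps for a formal comparability study.
THRESHOLD = 0.05
print("within illustrative threshold" if abs(gap) <= THRESHOLD
      else "flag for further comparability analysis")
```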
(2)(i) In the case of a consortium of SEAs, the Secretary may withdraw innovative assessment demonstration authority for the consortium as a whole at any time during its demonstration authority period or extension period if the Secretary requests, and no member of the consortium provides, the information under paragraph (b)(1)(i) or (ii) of this section.
(ii) If innovative assessment demonstration authority for one or more SEAs in a consortium is withdrawn, the consortium may continue to implement the authority if it can demonstrate, in an amended application to the Secretary that, as a group, the remaining SEAs continue to meet all requirements and selection criteria in §§ 200.105 and 200.106.
(c) Waiver authority. (1) At the end of the extension period, an SEA that is not yet approved consistent with § 200.107 to implement its innovative assessment system statewide may request a waiver from the Secretary consistent with section 8401 of the Act to delay the withdrawal of authority under paragraph (b) of this section for the purpose of providing the SEA with the time necessary to receive approval to transition to use of the innovative assessment system statewide under § 200.107(b).
(2) The Secretary may grant an SEA a one-year waiver to continue the innovative assessment demonstration authority, if the SEA submits, in its request under paragraph (c)(1) of this section, evidence satisfactory to the Secretary that it—
(i) Has met all of the requirements under paragraph (b)(1) of this section and of §§ 200.105 and 200.106; and
(ii) Has a high-quality plan, including input from stakeholders under § 200.105(a)(2), for transition to statewide use of the innovative assessment system, including peer review consistent with § 200.107, in a reasonable period of time.
(3) In the case of a consortium of SEAs, the Secretary may grant a one-year waiver consistent with paragraph (c)(1) of this section for the consortium as a whole or for individual member SEAs, as necessary.
(d) Return to the statewide assessment system. If the Secretary withdraws innovative assessment demonstration authority consistent with paragraph (b) of this section, or if an SEA voluntarily terminates use of its innovative assessment system prior to the end of its demonstration authority, extension, or waiver period under paragraph (c) of this section, as applicable, the SEA must—
(1) Return to using, in all LEAs and schools in the State, a statewide assessment that meets the requirements of section 1111(b)(2) of the Act; and
(2) Provide timely notice to all participating LEAs and schools of the withdrawal of authority and the SEA's plan for transition back to use of a statewide assessment.
(Authority: 20 U.S.C. 1221e-3, 3474, 6364, 6571)
Footnotes
1. For more information regarding President Obama's Testing Action Plan, please see: http://www2.ed.gov/admins/lead/account/saa.html; see also: www.ed.gov/news/press-releases/fact-sheet-testing-action-plan.
2. The Department has issued non-regulatory guidance on consultation under the ESEA, including suggestions and examples of best practices for meaningful stakeholder engagement. See: http://www2.ed.gov/policy/elsec/guid/secletter/160622.html.
3. For more information regarding stakeholder engagement, please see: http://www2.ed.gov/policy/elsec/guid/secletter/160622.html.
4. For more information on agencies' civil rights obligations to parents with limited English proficiency, see the Joint Dear Colleague Letter of Jan. 7, 2015, at Section J. (http://www2.ed.gov/about/offices/list/ocr/letters/colleague-el-201501.pdf).
5. For example, see the following sections of the ESEA: Section 1204(c)(2)(A)(i)-(ii); section 1204(e)(2)(A)(v)(II), (vii), and (viii); section 1204(e)(2)(B)(v), (ix), and (x)(III); and section 1204(j)(1)(B)(iv).
[FR Doc. 2016-29126 Filed 12-7-16; 8:45 am]
BILLING CODE 4000-01-P
Document Information
- Effective Date: 1/9/2017
- Published: 12/08/2016
- Department: Education Department
- Entry Type: Rule
- Action: Final regulations.
- Document Number: 2016-29126
- Dates: These regulations are effective January 9, 2017.
- Pages: 88940-88972 (33 pages)
- Docket Numbers: Docket ID ED-2016-OESE-0047
- RINs: 1810-AB31: Title I, Part B of the Elementary and Secondary Education Act of 1965 (ESEA), Innovative Assessment Demonstration Authority
- RIN Links: https://www.federalregister.gov/regulations/1810-AB31/title-i-part-b-of-the-elementary-and-secondary-education-act-of-1965-esea-innovative-assessment-demo
- Topics: Elementary and secondary education, Grant programs-education, Indians-education, Infants and children, Juvenile delinquency, Migrant labor, Private schools, Reporting and recordkeeping requirements
- PDF File: 2016-29126.pdf
- CFR: 34 CFR 200.104, 34 CFR 200.105, 34 CFR 200.106, 34 CFR 200.107, 34 CFR 200.108