The purpose of these Rules is to establish the school rating system and designate school performance category levels or ratings pursuant to Ark. Code Ann. §§ 6-15-2105 and 6-15-2106 and provide financial awards to public schools that experience high student performance and those with high student academic growth in accordance with Ark. Code Ann. § 6-15-2107.
APPENDIX "A"
In accordance with Ark. Code Ann. § 6-15-2108 (Act 744 of 2017), the School Rating System shall be a multiple-measures approach that includes achievement, growth, graduation rate, English Learner progress, and at least one other indicator. The School Rating System uses the ESSA School Index, which is composed of multiple, robust indicators for each grade span that are responsive to stakeholders and to state and federal requirements.
The ESSA School Index is the sum of weighted indicator scores. The ESSA School Index consists of the following indicators:
* Weighted Achievement
* School Mean Growth plus English Learner Growth
  * Content growth (ELA and math growth scores combined for each student)
  * English Learner progress to English Language Proficiency at a rate proportional to the number of English Learners
* Adjusted Cohort Graduation Rate
  * Four-year Adjusted Cohort Graduation Rate
  * Five-year Adjusted Cohort Graduation Rate
* School Quality and Student Success
Weighted Achievement Score: Measure of Academic Achievement
A Weighted Achievement measure is used to incorporate academic achievement into the School Rating System. To calculate schools' Weighted Achievement scores, point values are assigned to each of the four academic achievement levels on Arkansas's grade level assessments for math and English language arts as described in Table A-1.
Table A-1. Point Values Assigned to Academic Achievement Levels on Arkansas Grade Level Assessments
Achievement Level ACT Aspire | Achievement Level MSAA | Points Earned for Each Achievement Level |
In Need of Support | Level 1 (L1) | 0.00 |
Close | Level 2 (L2) | 0.50 |
Ready | Level 3 (L3) | 1.00 |
Exceeds: Step one: Number of L4 ≤ Number of L1 | Level 4 (L4) | 1.00 |
Exceeds: Step two: Number of L4 > Number of L1 | Level 4 (L4) | 1.25 |
Weighted Achievement increases the point value earned as students move from lower performance levels to higher performance levels relative to grade-level proficiency (criteria ii). Schools earn partial points for students close to grade-level proficiency, a single point for students at grade-level proficiency, and 1.25 points for each student exceeding grade-level proficiency beyond the number of students in the lowest achievement level. If the number of students exceeding grade-level proficiency is not greater than the number of students in the lowest achievement level, schools earn a single point for each of these students. Table A-2 demonstrates how positive movement of students from lower achievement levels to higher achievement levels produces higher Weighted Achievement scores.
Table A-2. How Point Values for Student Achievement Levels Total Weighted Achievement Points Earned
Each row represents the same school in a different year (row one is year one, row two is year two, and row three is year three) and shows the number of students at each of the four achievement levels on the state assessment (columns one through four). Schools that help students attain higher achievement levels earn more points, as do schools that help students move to higher achievement levels than they reached in prior years. The Weighted Achievement Score (column nine) is the number of points a school earned for full-academic-year students at each achievement level divided by the number of full-academic-year students with test scores.
Inclusion Rules for Weighted Achievement
Students completing a full academic year (not highly mobile) and completing the math and/or English Language Arts assessments (ACT Aspire or alternative assessment) are included in the Weighted Achievement calculation.
For schools that test 95 percent or more of students enrolled in the school, the denominator for Weighted Achievement will include the number of math and English language arts tests of full-academic-year students (not highly mobile) for the school. If a school does not test 95 percent of enrolled students, the denominator for Weighted Achievement will be 95 percent of the students expected to test in math and English language arts.
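The Weighted Achievement calculation can be illustrated with a short sketch. The sketch below (in Python, with hypothetical function and variable names) follows one reading of Table A-1, the step one/step two rule for students exceeding grade-level proficiency, and the 95 percent denominator rule; it reports the result on a 100-point scale, which is an assumption made here for comparability with the other indicator scores.

```python
def weighted_achievement_score(n_level1, n_level2, n_level3, n_level4,
                               tests_taken, students_expected_to_test):
    """Illustrative Weighted Achievement sketch (one reading of Table A-1).

    n_level1..n_level4: counts of full-academic-year math/ELA test scores at
    each achievement level; tests_taken: number of FAY math and ELA tests;
    students_expected_to_test: enrolled students expected to test.
    """
    # Table A-1 point values for Levels 1-3.
    points = 0.00 * n_level1 + 0.50 * n_level2 + 1.00 * n_level3

    # Exceeds (Level 4): step one -- Level 4 scores up to the Level 1 count
    # earn 1.00 point each; step two -- Level 4 scores beyond the Level 1
    # count earn 1.25 points each.
    points += 1.00 * min(n_level4, n_level1)
    points += 1.25 * max(0, n_level4 - n_level1)

    # Denominator rule: schools testing 95 percent or more of enrolled
    # students use the number of FAY tests; otherwise 95 percent of the
    # students expected to test in math and English language arts.
    if tests_taken >= 0.95 * students_expected_to_test:
        denominator = tests_taken
    else:
        denominator = 0.95 * students_expected_to_test

    # Reported here on a 100-point scale (an assumption for comparability
    # with the other indicator scores in the ESSA School Index).
    return 100 * points / denominator
```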
School Value-Added Scores: Measure of Student Academic Growth
Student Academic Growth (Content Area Growth)
A student growth model describes the change in student achievement over two or more moments in time.
A student growth model is described as value-added when student growth is attributed to a particular entity such as a school or program.
The Department consulted with stakeholders and the Technical Advisory Committee for Assessment and Accountability over a five-year period to evaluate several measures of student growth. A value-added growth model was piloted and selected in 2015 based on policy considerations such as which question about student growth is meaningful to students, parents, teachers, and other stakeholders, as well as the technical considerations given Arkansas's test transitions. Over the five years of development and advisory meetings conducted by the Department, stakeholders expressed a preference for a simple value-added model (VAM) over other options. This model provides information to answer the question, "How much did this student change or grow in achievement compared to how much we thought he/she would grow based on what we know about him/her?"
The student longitudinal growth model is a simple value-added model that assesses student growth relative to the student's individual score history and the student's expected growth (predicted score). It reflects the difference between observed achievement and expected (predicted) achievement for each student. The computation of the student's value-added score (VAS), which is the difference score (residual), is carried out in two steps.
In the first step, a longitudinal individual growth model is run to produce a predicted score for each student. The individual growth model uses up to five years of prior scores for each student to maximize the precision of the prediction (best estimate) and accounts for students having different starting points (random intercepts). In this value-added model, each student's prior score history acts as the control/conditioning factor for the expectation of growth for the individual student. This has the effect of 'controlling' for factors outside schools' control such as a student's economic status, race, or educational needs status.
In the second step, the student's predicted score is subtracted from his or her actual score to generate the student's value-added score (Actual - Predicted = value-added score). The value-added score indicates the degree to which students did not meet, met, or exceeded expected growth in achievement.
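For illustration, the two steps can be sketched as follows. The sketch uses an ordinary least-squares fit on prior scores as a stand-in for the random-intercept longitudinal growth model described above; the table layout and column names are hypothetical, and complete score histories are assumed.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

def value_added_scores(scores: pd.DataFrame) -> pd.Series:
    """Sketch of the two-step value-added calculation.

    `scores` is a hypothetical table with one row per student, columns
    'prior_1' ... 'prior_5' holding prior-year scale scores, and 'current'
    holding the current-year scale score. An ordinary regression stands in
    for the random-intercept longitudinal model described in the text.
    """
    prior_cols = [c for c in scores.columns if c.startswith("prior_")]

    # Step 1: predicted (expected) score from each student's score history.
    model = LinearRegression().fit(scores[prior_cols], scores["current"])
    predicted = model.predict(scores[prior_cols])

    # Step 2: value-added score = actual score - predicted score (residual).
    return scores["current"] - predicted
```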
Figure A-1 illustrates how a student's score history is used to determine the predicted or expected score and how the student's actual score is compared to the expected score to obtain the growth score.
Figure A-1. Simplified illustration of how a student's value-added score is obtained.
Students' value-added scores range from negative values to positive values with values of zero representing expected growth (when residual = 0 the student achieved as expected) as illustrated in Figure A-2.
Figure A-2. Interpreting student growth from value-added scores.
* Positive value-added scores: If the student has a value-added score with a positive value, the student's achievement exceeded growth expectations for the year. The student had higher than expected growth. The greater the value above zero, the more the student exceeded expectations.
* Value-added scores at/around zero: If the student has a value-added score value of zero, the student's achievement met expected achievement. The student grew at least as much as expected.
* Negative value-added scores: If the student has a value-added score with a negative value, the student did not meet expectations for growth in achievement for the year. The student did not grow as much as expected in achievement. The lower the value of the value-added score, the larger the degree to which the student did not grow as much as expected.
Student Growth in English Language Proficiency
Student value-added scores are calculated for growth in English language proficiency, as well as for the content areas of math and English language arts. Each student receives a value-added score based on the student's score history on the English language assessment: ELDA prior to 2016 and ELPA21 for 2016 forward. The process for calculating student growth in English language proficiency is the same as for math and English language arts growth, as illustrated in Figure A-1, except that the score history is from the English language acquisition test rather than the content assessment. Each student receives a value-added score for growth in English language proficiency that will range from negative to positive values, with zero representing that the student met expected growth, as indicated in Figure A-2 (repeated below).
Figure A-2 (Repeated). Interpreting student growth from value-added scores.
The English language proficiency growth scores have similar meaning to the content scores as explained below.
* Positive value-added scores: If the student has a value-added score with a positive value, the student's growth in English language proficiency exceeded growth expectations for the year based on the score history. The student had higher than expected growth in English language proficiency. The greater the value above zero, the more the student exceeded expectations.
* Value-added scores at/around zero: If the student has a value-added score value of zero, the student's English language proficiency met expected growth in English language proficiency. The student grew at least as much as expected.
* Negative value-added scores: If the student has a value-added score with a negative value, the student did not meet expectations for growth in English language proficiency for the year. The student did not grow as much as expected in English language proficiency. The lower the value of the value-added score, the larger the degree to which the student did not grow as much as expected.
Calculating School Value-added Growth Scores
School Value-added Growth Scores include student growth in the content areas (math and English language arts) as well as student growth in English language proficiency as illustrated in Figure A-3.
Figure A-3. Combining content area student growth and English language proficiency student growth in the School Value-added Growth Score.
The following steps are used to combine content area and English language proficiency growth scores into a single School Growth Score.
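A minimal sketch of this combination, assuming content-area and English language proficiency value-added scores are pooled one-to-one by the number of student scores (the 1:1 ELP-to-content weighting described in Table A-4 and illustrated in Figures A-5a through A-5d); the inputs are hypothetical lists of student value-added scores.

```python
def school_value_added_score(content_vas, elp_vas):
    """Pool content-area and ELP value-added scores 1:1 by number of scores,
    so English Learner progress is weighted in proportion to the number of
    English Learner scores at the school (hypothetical inputs)."""
    all_scores = list(content_vas) + list(elp_vas)
    return sum(all_scores) / len(all_scores)
```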
Transforming School Value-added Scores to Include in Rating
To include the School value-added growth score in the School Rating System, the values must be transformed to a 100-point scale that will work within the total point scale for the rating system. Value-added scores are transformed using the equation below.
School Growth Score = (School Value-added Score x 35) + 80.00
The Department determined the intercept to be 80 through a series of input sessions where stakeholders were asked to indicate what "score" a school should earn if students, on average, were meeting their expected growth. Input ranged from scores of 75 up to scores of 85. Thus, a score of 80 was selected to represent the value-added score of 0. The transformed score of 80 is the School Growth Score that indicates students, on average, met expected growth. Figure A-4 illustrates how the School Growth Scores can be interpreted.
Figure A-4. Interpreting transformed value-added scores.
Sample transformations for mean school value-added scores are provided below.
* Positive value-added score: Mean value-added score = 0.50
School Growth Score = (0.50 x 35) + 80.00
= 17.5 + 80.00
= 97.50
* Zero value-added score: Mean value-added score = 0.00
School Growth Score = (0.00 x 35) + 80.00
= 0.00 + 80.00
= 80.00
* Negative value-added score: Mean value-added score = - 0.21
School Growth Score = (-0.21 x 35) + 80.00
= -7.35 + 80.00
= 72.65
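The transformation can also be written as a one-line function that reproduces the worked examples above (the function name is illustrative).

```python
def school_growth_score(mean_value_added_score):
    """Transform a school's mean value-added score to the 100-point scale."""
    return mean_value_added_score * 35 + 80.00

# Reproduces the worked examples above:
#   school_growth_score(0.50)  -> 97.50
#   school_growth_score(0.00)  -> 80.00
#   school_growth_score(-0.21) -> 72.65
```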
School Growth Scores typically range from 60.00 to 100.00 with scores below 70 representing the extremes of lower than expected growth and scores above 90 representing the extremes of higher than expected growth. Please note that at the student level scores have a much wider range of values that must be interpreted within that wider range of values. Averaging at the school level results in less variation among school growth scores which means that values must be interpreted within this narrower range.
Figures A-5a. through A-5d. provide examples of the calculation for content and ELP value-added scores as the scores are combined to create the school value-added score used in the ESSA School Index.
Figure A-5a. Elementary school example of how the ELP indicator is incorporated and weighted proportional to the population of English Learners at the school.
In Figure A-5a. the school has 13 English learners out of 239 students. If the universal weight of 10 percent were used for the English Learner Progress indicator, this school would not have met the minimum N and the progress of these students would not have been reflected in the ESSA School Index score. Instead, the 13 English Learners had greater than expected growth, on average, and this progress is included in the ESSA School Index score as 5.16 percent of the number of scores included in the growth calculation and contributes 4.27 points to the School Value-added Growth Score. Since growth is 50 percent of the elementary ESSA School Index, or 38.16 points as indicated in Figure A-5a., 1.97 points are contributed to the ESSA School Index score by the 13 English Learners and 36.19 points are contributed by the 239 students' content value-added growth scores. The contribution of English Learners to the total points in the ESSA School Index score reflects the progress (value-added growth in English language proficiency) of these students in proportion to their share of the school population.
Figure A-5b. Elementary school with high proportion of English Learners.
In Figure A-5b. the school has 65 English learners out of 85 students. If the universal weight of 10 percent were used for the English Learner Progress indicator, this school would have met the minimum N, but the progress of these students would have counted for only 10 percent of the ESSA School Index score. Instead, the 65 English Learners had greater than expected growth, on average, and this progress is included in the ESSA School Index score proportionate to the number of English learner value-added growth scores for progress to English language proficiency. In this case, 17.99 of the 41.52 points contributed by the other academic indicator come from English learner progress. This is a much larger contribution than would have resulted from a universal weight of 10 percent for the English language progress indicator.
Figure A-5c. High school with low proportion of English learners.
In the case of high schools, the School Value-added Growth Score is part of the academic achievement indicator and accounts for half of that indicator's 70 percent weight within the ESSA School Index. This results in a nominal weight of 35 percent of the ESSA School Index score for value-added growth in academic achievement. In Figure A-5c., note that the 28 English learners account for 7.04 percent of the students. If the universal weight of 10 percent were used for the progress to English language proficiency indicator, the ESSA School Index score for this school would be inflated by the very high English learner value-added score for progress to English language proficiency. This would bias the growth score disproportionately upward relative to the proportion of these students among the content growth value-added scores. Instead, the small proportion of students accounts for 2.00 points in the ESSA School Index score, appropriately weighting the growth values.
Figure A-5d. High school with high proportion of English learners.
The final example, Figure A-5d., illustrates the weight of the English learner progress to English language proficiency indicator in a high school with a high proportion of English learners. In this case, given the high proportion of English learners (154 out of 178 students) it is appropriate that the English learner progress to English language proficiency have an impact on almost half of the points for the School Value-added Growth score.
Inclusion Rules for School Growth Scores
Students completing a full academic year (not highly mobile) and completing the math and/or English Language Arts assessments (ACT Aspire or alternative assessment) or the English language proficiency assessment (ELPA21) are included in the School Growth Score calculation.
Adjusted Cohort Graduation Rates
The four-year Adjusted Cohort Graduation Rate and the five-year Adjusted Cohort Graduation Rate will be used in the School Rating System. Both the four-year and five-year Adjusted Cohort Graduation Rate will be directly integrated by multiplying each rate by the weight assigned: 10 percent for four-year Adjusted Cohort Graduation Rate and five percent for five-year Adjusted Cohort Graduation Rate. The total points possible for each Adjusted Cohort Graduation Rate would reflect the weight assigned, 10 and five, respectively.
The Adjusted Cohort Graduation Rates would function as continuous values in the total School Rating System, adjusted by the weight for the indicator. For example, a school with a four-year Adjusted Cohort Graduation Rate of 85 would earn 85 points adjusted by the assigned weight of 10 percent, which would result in the four-year Adjusted Cohort Graduation Rate contributing 8.5 points to the overall score. A five-year Adjusted Cohort Graduation Rate of 96 at an assigned weight of five percent would contribute 4.8 points to the overall score.
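The graduation rate contribution can be sketched as follows (the function name is illustrative); the sketch reproduces the example in the paragraph above.

```python
def graduation_rate_points(four_year_rate, five_year_rate):
    """Points contributed to the overall score by the graduation rate
    indicator: each Adjusted Cohort Graduation Rate is multiplied by its
    assigned weight (10 percent and 5 percent, respectively)."""
    return four_year_rate * 0.10 + five_year_rate * 0.05

# Example from the text: a four-year rate of 85 contributes 8.5 points and
# a five-year rate of 96 contributes 4.8 points (about 13.3 points total).
```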
School Quality and Student Success Indicators
Ark. Code Ann. § 6-15-2108(b) and (c) specify that the school rating system shall consider without limitation at least one or more school quality and/or student success indicators provided those indicators allow for meaningful differentiation of schools and are valid, reliable, comparable and applicable statewide. Stakeholders communicated a desire to have multiple measures included in this indicator as soon as possible.
The Department created a student-focused aggregation of indicators that meet these requirements. The measures for this indicator focus on each student meeting important educational milestones (such as reading proficiently), important readiness criteria (minimum ACT score of 19 for Arkansas Academic Challenge Scholarship), and important postsecondary success indicators (attainment of AP, IB, concurrent credits). In essence, the School Quality and Student Success indicator combines measures of engagement, access, readiness, completion, and success criteria. To calculate this indicator a student level table is constructed to include the indicators listed in Table A-3.
For each component of this indicator, a student is included in the denominator of the calculation using a comparable standard. The student engagement component can be used as an example. If a student is in grades kindergarten through 11 and is enrolled at a particular school, then the student is listed as enrolled at that school in the district's annual cycle 7 (June 15) data submission to the statewide information system. The cycle 7 data submission of enrolled students at each school and LEA provides the denominator for the student engagement component. A student-level table is constructed that includes all students enrolled at each school and LEA as of June 15. The cycle 7 data submission includes the number of days absent and the number of days present for each student enrolled at the school.
The number of days absent and the number of days present are used to calculate the attendance rate of the student and that rate is used to determine the risk level for engagement. Chronic absence represents high risk that the student is not engaged in school.
* If a student is absent less than five percent of the days the student is enrolled, the student is considered low risk, and the student receives 1 point for the student engagement component out of 1 point possible.
* If a student is absent from five percent to less than 10 percent of the days enrolled, the student is considered moderate risk and the student receives 0.5 points for the student engagement component out of 1 point possible.
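A sketch of the engagement component is shown below (names are illustrative). Because the text lists only the low-risk and moderate-risk tiers, the sketch assumes that a student absent 10 percent or more of enrolled days (chronic absence, high risk) earns zero points for this component.

```python
def engagement_points(days_absent, days_present):
    """Student engagement points from cycle 7 attendance data (sketch).

    Less than 5 percent absent -> 1.0 point (low risk); 5 percent to less
    than 10 percent absent -> 0.5 points (moderate risk); 10 percent or more
    absent is assumed here to earn 0 points (chronic absence, high risk).
    """
    days_enrolled = days_absent + days_present  # assumes at least one enrolled day
    absence_rate = days_absent / days_enrolled
    if absence_rate < 0.05:
        return 1.0
    if absence_rate < 0.10:
        return 0.5
    return 0.0
```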
The points for all students enrolled (as submitted and certified in cycle 7 data) are summed for the numerator of this component. The number of students enrolled (as submitted and certified in cycle 7 data) is the denominator of this component. The use of submitted, certified cycle 7 enrollment data provides a comparable denominator for this component for schools statewide. This process (determining and summing the points received and points possible for each student) is replicated for each component of the School Quality and Student Success indicator. The final School Quality and Student Success indicator score is based on each student's sum of points earned across all components and sum of points possible across all components. This summation results in a denominator for each component that is standard and comparable across schools and a numerator for each component that reflects the degree to which each student accessed or achieved a desired outcome for the component. To calculate this indicator, a student-level table is constructed that includes the indicators listed in Table A-3.
Table A-3. School Quality and Student Success Indicators Available for Inclusion
To communicate the focus on student access, readiness, and success for this indicator, and to ensure comparability across schools and grade spans, the School Quality and Student Success Indicator is calculated first at the student level.
Each student has a score that is the percentage of points earned out of points possible to earn. These student-level scores are aggregated to the school level. This student-level focus is necessary first because it aligns with the goals of the Vision and second because schools have different grade configurations and students in different grades have different points possible.
The mean percentage of points earned per student is used to calculate a school-level statistic which represents the average earned points per student based on each student's possible points. The following steps were taken to model this student-focused School Quality and Student Success Indicator:
* A student-level table was constructed that included two columns per indicator: points possible and points earned. If an indicator listed in Table A-3 applied to the student, the points possible were set equal to one. If the indicator did not apply, the points possible were set to a null value to exclude them from the total points possible for the student.
* When a student's data record indicated he/she earned a full or partial point the point/partial point was added to the student row for that indicator. If a student's data record showed the student did not meet the criteria to earn a point for the indicator, a zero was assigned for points earned for that particular indicator.
* Students' possible points were summed across all indicators (indicators with a null value did not apply and thus were not included in possible points).
* Students' earned points were summed across all applicable indicators.
* The percentage of points earned out of possible points was calculated for each student.
* School means were calculated for the percentage of points earned per student to produce the school-level School Quality and Student Success indicator.
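The bulleted steps can be sketched as follows; the table layout and column names are hypothetical, and each student is assumed to have at least one applicable component.

```python
import pandas as pd

def school_quality_score(student_table: pd.DataFrame) -> float:
    """Aggregate a hypothetical student-level School Quality and Student
    Success table to the school-level indicator score.

    For each component the table has a 'possible_<component>' column (1 if
    the component applies to the student, null otherwise) and an
    'earned_<component>' column (points earned, null where not applicable).
    """
    possible_cols = [c for c in student_table.columns if c.startswith("possible_")]
    earned_cols = [c for c in student_table.columns if c.startswith("earned_")]

    # Sum points possible and points earned per student; nulls (components
    # that do not apply) are skipped by the sum.
    points_possible = student_table[possible_cols].sum(axis=1)
    points_earned = student_table[earned_cols].sum(axis=1)

    # Percentage of points earned out of points possible for each student,
    # then the school mean of those student-level percentages.
    student_pct = 100 * points_earned / points_possible
    return student_pct.mean()
```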
Grade Spans and Grade Configurations
The Department will identify each school based on the grade span: grades PK-5, grades 6-8, or grades 9-12. Each school is assigned to a grade span based on the grades the school serves (grade range of school). Grade span categories for each grade range are indicated below. The grade spans are determined in a logical manner based on the grade levels assessed on the statewide assessments. If a school's grade range includes the majority of tested grades within a span, then the school is assigned to that grade span with other schools whose majority of grades fall within the same span, for comparability purposes. When a school configuration has an equal number of assessed grades in two grade spans, the school is included in the higher grade span for comparability purposes. This is important given the weights of weighted achievement and growth in the ESSA School Index and the different components of the School Quality and Student Success indicator.
Figure A-6. Grade Spans and Configurations
Total Score and Rating
The Department will weight each indicator according to the table below:
Table A-4. Indicator Weights
Indicator (Grades K-5 & 6-8) | Weight of Indicator within Index (Grades K-5 & 6-8) | Indicator (High Schools) | Weight of Indicator within Index (High Schools) |
Weighted Achievement Indicator | 35% | Weighted Achievement and Academic Growth | 70% total, with Weighted Achievement accounting for half (35%) and School Growth Score accounting for half (35%) |
Growth Indicator (Academic Growth and English Language Progress) | 50% | | |
Progress to English Language Proficiency* | Proportionately weighted in School Growth Score by number of English Learners (1:1 ELP to Content Growth) | Progress to English Language Proficiency* | Proportionately weighted in School Growth Score by number of English Learners (1:1 ELP to Content Growth) |
Graduation Rate Indicator (4-Year Adjusted Cohort Rate and 5-Year Adjusted Cohort Rate) | NA | Graduation Rate Indicator (4-Year Adjusted Cohort Rate and 5-Year Adjusted Cohort Rate) | 15% total (4-Yr = 10%, 5-Yr = 5%) |
School Quality and Student Success Indicator | 15% | School Quality and Student Success Indicator | 15% |
All ESSA School Index scores and indicator scores will be rounded and reported to the nearest hundredth by applying standard rounding rules. If the thousandths place of a decimal is five or greater, or can be rounded to five or greater, the score will be rounded up to the nearest hundredth. (Example: 78.7864 will be reported as 78.79). If the thousandths place of a decimal is four or less, or can be rounded to four or less, the score rounds down to the nearest hundredth. (Example: 98.5442 will be reported as 98.54).
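As an illustration of how the Table A-4 weights and the rounding rule combine, the sketch below computes a grades K-5 or 6-8 index score. The names are illustrative; indicator scores are assumed to be on a 100-point scale, the Progress to English Language Proficiency indicator is assumed to be already folded into the School Growth Score, and rounding halves up is used as a simplification of the rule described above.

```python
from decimal import Decimal, ROUND_HALF_UP

def essa_index_k8(weighted_achievement, school_growth, school_quality):
    """Sketch of the ESSA School Index for grades K-5 and 6-8 using the
    Table A-4 weights: 35% Weighted Achievement, 50% Growth (content plus
    English Language Progress), 15% School Quality and Student Success.
    High schools would instead weight Weighted Achievement and the School
    Growth Score at 35% each, graduation rates at 10% and 5%, and School
    Quality and Student Success at 15%.
    """
    index = (0.35 * weighted_achievement
             + 0.50 * school_growth
             + 0.15 * school_quality)
    # Report to the nearest hundredth, rounding halves up (a simplification
    # of the rounding rule stated above).
    return float(Decimal(str(index)).quantize(Decimal("0.01"),
                                              rounding=ROUND_HALF_UP))
```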
The total score for each school will determine the school rating. The Department will assign the school rating according to the tables below:
Table A-5. K-5 Grade Span Ratings
Rating | Total Score Range |
A | 79.26 ≤ Score |
B | 72.17-79.25 |
C | 64.98-72.16 |
D | 58.09-64.97 |
F | Score < 58.09 |
Table A-6. Grades 6-8 Grade Span Ratings
Rating | Total Score Range |
A | 75.59 ≤ Score |
B | 69.94-75.58 |
C | 63.73-69.93 |
D | 53.58-63.72 |
F | Score < 53.58 |
Table A-7. Grades 9-12 Grade Span Ratings
Rating | Total Score Range |
A | 73.22 ≤ Score |
B | 67.96-73.21 |
C | 61.10-67.95 |
D | 52.95-61.09 |
F | Score < 52.95 |
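A rating can then be read off the tables above. The sketch below maps a total score and grade span to a letter rating using the cut scores in Tables A-5 through A-7 (function and argument names are illustrative).

```python
def school_rating(score, grade_span):
    """Assign a letter rating from the total ESSA School Index score using
    the cut scores in Tables A-5 through A-7. `grade_span` is one of
    'K-5', '6-8', or '9-12'; `score` is assumed to be rounded to the
    nearest hundredth."""
    # Lower bounds for ratings A through D by grade span.
    cuts = {
        "K-5":  [(79.26, "A"), (72.17, "B"), (64.98, "C"), (58.09, "D")],
        "6-8":  [(75.59, "A"), (69.94, "B"), (63.73, "C"), (53.58, "D")],
        "9-12": [(73.22, "A"), (67.96, "B"), (61.10, "C"), (52.95, "D")],
    }
    for lower_bound, rating in cuts[grade_span]:
        if score >= lower_bound:
            return rating
    return "F"
```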
005.27.18 Ark. Code R. 002