What can a single score tell you about a school? Not much.

A couple of months back I devoted a blog post to the way in which we “grade” students at ANCS.  In the post I wrote about why ANCS does not try to boil student performance down to a single number or letter grade because doing so does not capture the nuances of student learning and growth in the same way a narrative report alongside specific student data does (what does receiving a score of 83% tell you about your knowledge and skills as a math student?).  And I also discussed the way that using numeric grades as a motivator through rewards and punishment rarely works with students in any meaningful or long-term way.

And so, as you might imagine, I see some problems with Georgia’s new “College and Career Ready Performance Index” (CCRPI), a single numeric score—on a scale of 0-100—given to each public school in Georgia that will be used, according to the Georgia Department of Education, to “hold schools accountable and reward them for the work they do in all subjects and with all students.”  But before I describe the problems (and a possible solution) as I see them, a little background on CCRPI for those who need it.

On its website, the DOE has posted a good deal of information about CCRPI.  In short, Georgia—like many other states—applied for and received a waiver from the U.S. DOE exempting it from provisions of the federal “No Child Left Behind” Act that called for schools to make “adequate yearly progress” on annual reading and math tests and for all students to demonstrate proficiency in reading and math by 2014.  In exchange for this waiver, states had to develop new accountability systems that would show students’ “college and career readiness”.  For Georgia, that resulted in the creation of CCRPI.  The CCRPI score for a school is derived from a mix of indicators (I’d recommend viewing the full list here) with major emphasis on student performance on the CRCT and EOCT.

There’s been more in the news about CCRPI as of late as the system rolls out.  As you hear more about it, there are, I think, some important issues you should consider.  I admit that I do not know all that was required of Georgia (and other states) by the U.S. DOE for the waiver it received, so whether these are problems that could be easily remedied by Georgia is not for me to say.  Regardless, the issues themselves are clear:

Use of a single grade: As the AJC’s Maureen Downey pointed out last spring in her Get Schooled blog, in a conference call about CCRPI, DOE representatives told her that they “understand that most parents won’t get too far beyond the overall grade”.  And, of course, why would they?  The scores are all that will be reported in the media, and the DOE’s presentation of the CCRPI dashboard for schools and districts on its website does not make it easy to understand what is behind the single score.  Other states that use single scores or grades for schools have encountered challenges, ranging from research showing the shortcomings of such systems to charges that the formulas used to tabulate the grades were changed to improve grades for certain schools.  New York City is doing away with assigning schools letter grades entirely because a single grade or score—devoid of context—provides little meaningful information and mainly sparks ill-informed conversations about and comparisons between schools.

Augmenting the impact of the CRCT and EOCT: Unlike NCLB, which required only that schools move students toward proficiency in reading and math, CCRPI includes student performance on all state standardized tests and accounts for proficiency, growth, and achievement gap closure.  Most people (myself included) applaud including both growth and proficiency measures in looking at student performance for the more complete picture they paint.  However, as I have said before, educational psychometricians will tell you that there are clear limitations on data gleaned from standardized tests, and the use of growth measures for significantly different content (like 5th grade social studies and 8th grade social studies, for example) is statistically problematic.  By increasing the emphasis on standardized test scores, CCRPI increases the impact of these issues.  And the possibility that CCRPI may soon also include teacher and school leader scores from the state’s new evaluation system (scores that are based 50% and 70%, respectively, on student standardized test scores) could further amplify these challenges.

Questionable measures included:  There may be some value in the data collected from all of the many CCRPI indicators.  But I question whether including them in a CCRPI score is useful.  For example, one indicator is the percentage of teachers at a school who access the state’s online “Student Longitudinal Data System” for student data.  Accessing a website is a means to an end—it’s not an outcome to be measured.  Some might say, “Well, just make sure all of your teachers use this website and your score will go up!”  This seems akin to the sort of meaningless “grade grubbing” that happens when the goal is simply to increase a grade—do all you can, regardless of its value, to boost your score.  The indicators also include “school climate” surveys taken by students, teachers, and parents.  Check out the student survey questions for middle school.  More than 75% of the questions ask students about things like whether they have access to or have used pot, cocaine, or cigarettes, or whether they eat well.  A scant few questions actually pertain to the school environment.  This isn’t a measure of “school climate”; it’s a measure of “student health”.  In fact, the survey was developed by the Georgia Department of Public Health.  The results from this survey could be helpful in identifying significant health issues among students, but using it as an assessment of school climate is entirely misleading, especially when schools are incredibly limited in the impact they can have on some of the issues contained in the questions.

Logistical challenges: Last year was a “study year” for CCRPI for the state.  I’m not sure what this meant exactly, but I can tell you that, aside from various emails I received from the state about updates to what would be included in CCRPI or what the formula used to calculate the score would be, I was never invited to give my opinion, nor were any in-person meetings held that I know of to discuss CCRPI implementation.  What I could have said (and still will say) if asked about CCRPI is that implementing so many indicators that require school-by-school reporting and district office support will be bumpy.  As but one example, our school found out midway through last school year that we needed to have students take part in different career-oriented lessons and inventories (another indicator).  Getting information about indicators in ways that aren’t timely or entirely clear doesn’t make me confident about how useful the CCRPI scores for schools will be.

Other indicators award schools points based on student performance in arts and foreign language courses, high school credits earned in middle school, the 21st Century Skills Assessment, and STEM certification.  Not all schools—particularly smaller ones or ones that emphasize “depth over breadth” in electives—are able to offer all of these programs or classes, and their scores suffer as a result.

Using CCRPI on its own as a mechanism to hold schools “accountable”: To me, this is the biggest issue of all with CCRPI and the way it is or will be used.  My guess is there will be the predictable hand-wringing about schools with low CCRPI scores and calls for serious consequences if those scores are not raised.  And those consequences will likely do little to change learning outcomes.

As author Dan Pink explored in his recent book Drive, that’s because decades of research show that this type of rewards-and-consequences approach to motivation does not result in sustained, deep change (if you haven’t read the book or don’t have time, this TED Talk hits the highlights).  This approach works well for simple tasks.  Yet for complex tasks like teaching, learning, and schooling, it is mainly good at promoting behavior that devalues the ultimate goal (lots of test prep rather than emphasis on critical thinking and application skills, for example) or outright cheating in order to meet requirements.

As Pink says, “what’s frustrating, or ought to be frustrating…is that when we see these carrot-and-stick motivators demonstrably fail before our eyes – when we see them fail in organizations right before our very eyes – our response isn’t to say: ‘Man, those carrot-and-stick motivators failed again. Let’s try something new.’  It’s ‘Man, those carrot-and-stick motivators failed again. Looks like we need more carrots. Looks like we need sharper sticks.’ And it’s taking us down a fundamentally misguided path.”

The “carrot and stick” approach is reflected in the Georgia DOE’s recent announcement that all charter school renewal decisions will be based solely on CCRPI performance relative to the school’s district and to the state.  And, being a pragmatist (and the executive director of a charter school), unless and until the state changes the way it uses CCRPI, our school will obviously work to do what we can to live within this flawed system and meet the requirements it sets.  But there are richer, more useful, and more rigorous approaches to reporting on school quality and making decisions about schools, and perhaps Georgia could look to these approaches for its charter schools—or even all public schools.

In Massachusetts (a state often cited as an exemplar for student performance), the public charter school at which I worked several years ago was visited annually by a review team from the state DOE.  Still today, Massachusetts uses a multifaceted set of criteria for evaluating and reporting on its charter schools and employs a comprehensive protocol for site visits that involves observations and focus groups with students, board members, teachers, and staff.  Detailed, holistic information is available about schools for parents to review and for schools to use.  I see no reason why such an approach couldn’t be put into place here in Georgia.

Certainly it is easier and more cost-effective to assign single grades or scores to schools and to make assessments of a school’s quality on the basis of those scores without ever having to set foot in the school.  But I would hope that, for all the rhetoric these days about “doing things in the best interests of students” and education being “the civil rights issue of this generation,” we could put the time, care, and resources into accountability systems that actually reflect that spirit.


Comments

One response to “What can a single score tell you about a school? Not much.”

  1. Cleve Roland

    This is a great article about the flawed system of CCRPI. I work with a rural school system in south Georgia and we feel the pressure of CCRPI. We have very little money due to a low tax base. We can’t compete with a system that may have a one-teacher-to-fifteen-student ratio. We also have ten calendar reduction days (furlough) for our staff. This is problematic because it limits the time for teachers to have professional training days at the beginning and end of the year. We also have a large population of students that qualify for free and reduced lunch (about 80%). We do have a strong community that is very generous in giving to the school. We have raised over $20,000 in the last two years for a new elementary playground. Our CCRPI score was published for our school and was a 53. This one score frustrates our teachers and community, which are already overworked and burdened. This may also impact giving from the community when they see the results of this one score for our school. Our teachers work hard and care for our students, but with this population of students there are many distractions, such as drug and alcohol problems in the home, shootings, and high unemployment rates. Again, we just don’t have the tax revenues like some other systems in the state and can’t afford that extra exploratory teacher or counselor to assist with the needs of our low socioeconomic population.