Tree Rollins, the California Raisins, free throws, and how we use data

[Photo: the author, in middle school, wearing a California Raisins t-shirt alongside Tree Rollins]

Over the past few years, increasing attention has been given to “value-added” measures in schools, districts, and states around the United States.  Georgia’s new school accountability system and its teacher and leader evaluation processes all use a variation of a value-added measure (for brevity, I’ll refer to them from here on out as VAMs).  And for the past couple of years, the Atlanta Public Schools has calculated VAM scores based on CRCT and EOCT data for all of its schools for internal review by school leaders.

So what is a VAM?  There are many varieties and ways of calculating them, but, in short, VAMs use standardized testing data to show how much growth a student or group of students made between one administration of a standardized test and a later test of the same skills.  Just as there are many ways of calculating VAMs, there are many different ways VAMs get used.  Lately, VAMs have been used in attempts to attribute student growth (or the lack of it) to a particular teacher or school.  Unlike straight-up proficiency measures (whether a student meets the standard on the Math CRCT, for example), VAMs show how much growth a student made, regardless of whether he or she met the standard (like gaining 20 points on the Math CRCT).
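To make that distinction concrete, here is a minimal sketch, in Python, of the gain-score idea behind the simplest VAMs.  The student names, scale scores, and the passing cut of 800 are all made up for illustration; real VAMs (including Georgia's) rest on statistical models that adjust for prior achievement and other factors, so treat this as a toy, not the actual calculation.

```python
# Toy "growth" calculation: raw score gains between two test administrations.
# All names and scores are hypothetical; real VAMs use statistical models,
# not simple subtraction.

def gain_scores(year1, year2):
    """Return each student's raw score gain between two tests."""
    return {name: year2[name] - year1[name] for name in year1 if name in year2}

math_2011 = {"Ana": 775, "Ben": 810, "Cam": 845}  # hypothetical scale scores
math_2012 = {"Ana": 795, "Ben": 812, "Cam": 860}

PASSING = 800  # hypothetical proficiency cut score

for name, gain in gain_scores(math_2011, math_2012).items():
    met = "meets" if math_2012[name] >= PASSING else "does not meet"
    print(f"{name}: gained {gain} points, {met} the standard")
```

Note what happens with "Ana": she gains 20 points, real progress, yet still falls short of the cut score.  That is exactly the growth a proficiency-only measure would hide.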

If you’ve read this blog in the past, you know that I share the views of many educational psychometricians in cautioning against over-interpreting or misusing VAMs.  In general, I think VAMs can be one useful data point (mainly in reading and math) in assessing a student or school, especially if lower proficiency rates are masking real progress being made by students (and teachers) in reading comprehension or math fluency.  And it is the phrase “one data point” that I especially want to emphasize here because I think it has real and significant implications for our students and schools.  So significant, in fact, that I am willing to risk the embarrassment of this middle school picture of me in a California Raisins t-shirt with Tree Rollins to make my point and keep you reading.

You see, many years ago, in the fanfare of the grand opening of a fast food restaurant near my home, I was talked into taking part in a free-throw shooting contest.  Grand prize: Atlanta Hawks season tickets.  I nailed 13 straight free throws in my turn, which, at the end of the day-long contest, left me tied with a gentleman in his 30s.  Time for a shoot-off.  Much to the disappointment of my competitor, with ice water in my middle school veins, I bested him.  Probably the crowning moment of my athletic career.  As I posed for my picture with Hawks legend Tree Rollins (apparently neither Dominique nor Spud was available for this event), he said to me, “If the rest of your game is as good as your free throws, we might have to get you to suit up.”  And in Tree’s comment, there is a lesson about our use of single points of data.

At the risk of coming across as immodest, I had pretty good proficiency when it came to free throw shooting.  With a hoop in my driveway, I often shot baskets after school and got good at making them.  But, at that point, I’d never played organized basketball before, so the “rest of my game” probably was not very good at all.  So simple proficiency measures on their own—whether how many free throws you can make or how many questions you get right on the CRCT—cannot give us very complete pictures of complex tasks like playing basketball or learning.  Neither, though, can VAMs.  To extend the basketball reference, consider two NBA players with about the same career free throw percentage.  One showed more growth over the course of his career, improving from under 70% to above 85% in several seasons; the other grew less, from a low season of 69% to a high of 78%.  The player with the larger growth is the aforementioned Mr. Tree Rollins.  The other, “lesser” player in this comparison?  LeBron James.
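If you want to see how a lone growth statistic produces that ranking, here it is as a tiny calculation.  The season percentages are the approximate figures cited above, not exact career data:

```python
# Single-metric comparison: growth in free-throw percentage from a player's
# low season to his high season (approximate figures from the text above).
players = {
    "Tree Rollins": (0.69, 0.86),  # roughly: under 70% up to above 85%
    "LeBron James": (0.69, 0.78),
}

for name, (low, high) in players.items():
    growth = 100 * (high - low)
    print(f"{name}: +{growth:.0f} percentage points")

# By this one measure, Rollins "outperforms" James -- exactly the kind of
# conclusion a single data point can lead you to.
```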

Now, obviously, we would not use this free throw VAM to make judgments about these two players, their coaches, or their teams, and if we did, this one piece of data would have badly misled us.  So why would we do that with any single piece of test data?  Though some people who distribute VAMs do so with the caveat that “this is just one metric and you shouldn’t read too much into it,” in the absence of other meaningful data, what would you expect to happen?  And if we continue to invest our time, energy, resources, and attention only in standardized test data—be it proficiency or VAM—we are missing a ton of other information that matters for our students.

As has been shown time and again, in education and in other fields, you are what you measure.  And right now, by relying on a couple of pieces of data, both tied to standardized tests, we are selling our kids and families short by comprehensively measuring and analyzing only the educational equivalent of free throws—somewhat challenging and somewhat important, but not all there is to the complete game of basketball.  This approach leads schools and teachers to center much of their time and attention on getting students proficient at making free throws and, once they are proficient, on getting better still at free throws so as to increase their growth.  At ANCS, we work to help students reach proficiency on standardized tests in core competencies in reading, writing, and math.  But once a student has demonstrated proficiency on these tests, our goal doesn’t become to focus even more on these types of measures so that the student can answer more questions on the same type of standardized tests.  Instead, we ask students to hone their skills in other important ways, as researchers, creative writers, artists, scientists, and literary critics.  Does this impact our VAM score?  Surely.  But it’s a trade-off we feel is important for our students, especially when we hear from our alumni and their parents about its benefits.

Like an NBA general manager evaluating players and teams, I would venture to guess that most schools and parents would love to have a wider range of meaningful data with which to assess a school.  Coming to agreement on what the best data would be, and coordinating its collection across a whole state, might be challenging.  But what’s to keep us from putting into place a system that has value to schools and families in, say, the Maynard Jackson cluster?  Along with proficiency and value-added data from standardized tests, we could also share common performance assessment data.  Imagine if, across the schools in the cluster, we developed rubrics for assessing common performance tasks in certain grade levels (maybe two a year: an artistic piece and a scientific investigation in 3rd and 6th grades, a mathematical simulation and a conversation in another language in 4th and 7th grades, and a research project and oral presentation in 5th and 8th grades), and teams from APS’s research office and educators from the cluster could assess student performance on these tasks.  A consortium of schools in New York (including one at which I worked) did exactly this.  And, in essence, it’s what happens already with writing via the Georgia Writing Assessment.

Outside of just student performance data, teams made up of individuals from APS, outside experts, and educators from the cluster could conduct an annual visit to each school in the cluster to observe and share findings on school culture, family engagement, safety, or other areas deemed important.  As I shared in a recent blog post, Massachusetts uses such a process for its charter schools, and our area schools could use a similar protocol (focus groups, interviews, classroom observations) to capture significant data about each school.

Certainly, plenty of details would need hammering out, but I put this forward as what I hope will be the start of a conversation.  A holistic approach like this to school data could spark productive conversations among school leaders in the Jackson cluster based on different types of data.  Armed with demographic information, a description of the school’s program, standardized test and performance task data, and a summary of a school visit report—all published in a single report made available through APS—families could have a more complete picture of a school.  And, most importantly, we’d clearly show students that we value more than just “free throws”—we want them to be all-stars.  (Okay, cheesy ending there—but no worse than that California Raisins shirt, right?)


Comments

One response to “Tree Rollins, the California Raisins, free throws, and how we use data”

  1. Good explanation.
    I’m sorry about the t-shirt!
    Mom