Here are practical tips for creating a district report card that is more helpful and meaningful to parents and the community.
By Wayne D’Orio, School Leaders Now senior editor
Most state and district report cards are hard to understand and too vague. That's the finding of a study from the Data Quality Campaign called “Show Me the Data: State Report Cards Must Answer Questions and Inform Action.” The report focuses on states, but the same problems likely exist at the local level, says Paige Kowalski, DQC’s executive vice president.
Here’s a checklist for improving report cards:
1. Stay current
Kowalski says states have improved report cards in the past decade, but admits “some things really surprised us.” For instance, only four states meet all the requirements of NCLB, a law that was passed in 2001. Timeliness was a problem in many reports; 10 states are still showing assessments from the 2012-13 and 2013-14 school years.
2. Be multilingual
Forty-five states produce their report cards only in English, with no resources to translate the information into other languages. By contrast, New York uses Google Translate to offer its dashboard in more than 100 languages.
3. Offer functionality/open data sets
DQC highlighted some states that are creating helpful and informative reports. Washington, D.C.’s clear layout gives an overview of key indicators of student performance. Minnesota’s report allows users to disaggregate data by a wide variety of factors, from gender to grade level to socioeconomic status.
4. Eliminate jargon and acronyms
DQC has been studying state report cards for more than 10 years; Kowalski approached the problem as if she were a parent reading the report card at 10 p.m., trying to find information for an approaching school lottery.
That measuring stick revealed that some report cards that might look good to district officials could confuse or frustrate parents. The report found six different terms used to denote a child of limited income. While listing the percentage of students who receive free and reduced-price lunch is a standard measurement, Kowalski argues that this technical term might not be as easy to understand as you think. Terms like Safe Harbor, AYP, and value-added growth fall into the same category.
Kowalski says DQC will examine local reports in next year’s study. While she expects to find many of the same problems, she says districts can be more nimble than states by addressing issues that are key to their area. While ESSA requires states to disaggregate information about Pacific Islander students, she says that some districts within California can have 30 different nationalities within that group. It makes sense for those districts to further break out stats, she says.
In fact, Kowalski says if districts used open data sets, they could present basic information while allowing users to create specific reports by disaggregating data in specific ways. For instance, if a parent had an African-American daughter in third grade, they should be able to see how the district educates third graders, how African-American students are achieving, and how girls fare versus boys. “I’d want to know the school can handle children like my own,” she says.
5. Provide context
One point that Kowalski emphasizes is that creating easy-to-understand reports will build trust with the public. “People are losing some trust in the system,” she says, mentioning Common Core misconceptions. “If you can’t be clear, you’re going to have people mistrust.”
Kowalski adds that providing context around certain statistics can also help tell your story. While every report mentions high school graduation rate, she urges districts to consider including a postsecondary enrollment rate, too. And while every report includes third grade reading assessments, adding growth data for third graders gives a fuller picture.
With ESSA’s requirement for some type of nonacademic indicator, Kowalski says experimenting with ways to quantify students’ social and emotional learning would be a good place to start. By creating some way to measure this, whether through a school survey or other data, schools will begin quantifying this work. In future years, studying that data could help districts determine whether their social-emotional efforts are effective.