Proposal eliminates some test scores from high school rating policy

CPS officials say they're tossing out results from last year's PLAN and EXPLORE tests -- and not paying the testing company more than a half-million dollars, because the assessments were freely available on the internet.

Citing a dispute with the testing company, ACT, Inc., CPS officials are throwing out some of last year’s test score data and rewriting the formula for rating high schools.

District officials say the EXPLORE and PLAN tests they received, administered and expected to pay $576,000 for last year had been freely available on the internet all along. In fact, it was a teacher at Hyde Park High School, who had used the online test for practice with her students, who noticed the apparent similarities.

“ACT had in fact sold us a test that was previously exposed,” says Janice Jackson, the district’s chief education officer. “We were told this was a brand new, fresh test.”

Now CPS officials are refusing to pay for the two tests — given annually to 9th- and 10th-graders as part of the ACT’s EPAS suite — because they say those results are compromised. Schools where students took a practice version of those assessments scored slightly higher than schools where students had no access to the test, meaning the scores are artificially inflated, says John Barker, the district’s chief of accountability.

Ed Colby, a spokesman for ACT, confirmed that the company is in communication with the district over the issue but said “the nature of those discussions is confidential. We strongly value our relationship with Chicago Public Schools and hope that we can reach a mutually agreeable solution to this matter.”

The troubles come at a moment when standardized testing is increasingly under fire in Chicago and across the country. Supporters of the growing opt-out movement, including the Chicago Teachers Union, say the compromised test results validate their own concerns about the overreliance on standardized assessments in complicated formulas for rating schools and educators.

Earlier this year, for example, the district had to recalculate the 2013-14 evaluations for thousands of educators due to a coding error. And now, because of the problem with the PLAN and EXPLORE results, the district will not only have to change the school rating system – but also make a slight tweak to how high school teachers are evaluated. Barker says the district will eliminate the 5 percent of teacher evaluations that comes from “value-added” student growth from last year’s EPAS assessments.

“We put too much control in the hands of the testing companies,” says Jennifer Johnson, the CTU’s facilitator for teacher evaluations. “This is another example of CPS chaos. Maybe it’s not all their fault this time. But we don’t need to be making a ton of these high-stakes decisions based on standardized tests.”

ACT had announced in January 2014 that it was discontinuing the EXPLORE and PLAN assessments — typically given in students’ freshman and sophomore years to build up to the ACT — as the company turned to a new suite of tests linked to the Common Core State Standards. Colby says the company told all of its customers that it would not create any new versions of either test to be used after the 2013-2014 school year.

CPS knew ACT was discontinuing the assessments, but Jackson says the company promised to deliver tests that had not been “exposed.” (CPS officials did not provide any written documents to that effect from the testing company, saying the matter is in the hands of attorneys.) The district has been using the EXPLORE, PLAN and ACT – all part of the company’s EPAS suite – for more than a decade, and the most recent contract extension ended this summer.

But the versions of EXPLORE and PLAN that the district was given last year were previously used in other school districts and had been posted on the internet, CPS officials say. “We were perfectly willing to pay for the exam if it was a new exam that hadn’t been exposed,” Jackson says.

Similar problems have not been reported in other school districts or states that use the PLAN or EXPLORE tests. Colby says no other school district has raised any concerns “over the Explore or Plan assessments that ACT delivered to them on or after June 30, 2014.”

CPS officials stressed that the ACT assessment itself – the college entrance exam given to all 11th-graders and paid for by the state last year – was not compromised.

Changes to rating policy

But as a result of the problems, CPS officials say they can no longer use growth and attainment metrics from the PLAN and EXPLORE tests for the purpose of rating high schools. At next week’s Board of Education meeting they will seek changes to the School Quality Rating Policy, referred to as SQRP (see page 13 of the Board agenda). It will be the third time in just over a year that the district has changed the rating policy.

“What we saw were increased scores where we knew students had access to this [test] that was available on the internet,” Barker says. “The impact would not be insignificant.”

District officials would not provide Catalyst with a school-level breakdown of the test results.

Under the district’s current rating policy, 40 percent of high schools’ ratings are based on three metrics connected to the EXPLORE, PLAN and ACT tests, which are taken in succession from students’ freshman to junior years. The portion from the tests is currently broken down this way:

  • 20 percent of the rating comes from a measurement of year-to-year student growth from one test to the next, as compared to a national student average;
  • 10 percent measures the growth of students from priority groups, including black and Latino students, diverse learners, and English language learners;
  • And another 10 percent looks at overall attainment.

The remaining 60 percent of high school ratings is made up of nine metrics including attendance, graduation and freshmen on track to graduate on time.

Now the district is proposing to base just 25 percent of high schools’ ratings on assessments.

  • The growth percentile will now account for just 10 percent of a school’s rating. It will be based solely on student improvement from the PLAN test they took in the 2013-2014 school year, to last year’s ACT test, which was not compromised.
  • Growth from the priority groups will account for just 5 percent.
  • And the attainment portion will be based exclusively on ACT performance – not a composite of the three assessments – and count for 10 percent.

Under the proposal, the remaining 75 percentage points will get redistributed among the other nine metrics.
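To illustrate how the reweighting works, here is a minimal sketch in Python. The weights come from the current and proposed formulas described above; the per-metric scores are invented for the example, and the simple weighted sum is a simplification of how CPS actually aggregates SQRP points, so treat this as a hypothetical rather than the district's real calculation.

```python
# Hypothetical illustration of the SQRP reweighting. Weights are from the
# article; the metric scores below are invented for the example.

OLD_WEIGHTS = {
    "test_growth": 0.20,            # year-to-year growth vs. national average
    "priority_group_growth": 0.10,  # growth of priority student groups
    "attainment": 0.10,             # EXPLORE/PLAN/ACT attainment
    "other_metrics": 0.60,          # nine metrics: attendance, graduation, etc.
}

NEW_WEIGHTS = {
    "test_growth": 0.10,            # PLAN (2013-14) to ACT growth only
    "priority_group_growth": 0.05,
    "attainment": 0.10,             # ACT performance only
    "other_metrics": 0.75,          # redistributed among the nine other metrics
}

def weighted_rating(scores, weights):
    """Combine per-metric scores (0-100 scale) into one weighted rating."""
    return sum(scores[name] * weight for name, weight in weights.items())

# Invented scores for a hypothetical school:
scores = {
    "test_growth": 80,
    "priority_group_growth": 70,
    "attainment": 60,
    "other_metrics": 85,
}

print(weighted_rating(scores, OLD_WEIGHTS))  # test-heavy weighting
print(weighted_rating(scores, NEW_WEIGHTS))  # proposed weighting
```

For this hypothetical school, shifting 15 points of weight from the compromised test metrics onto the nine other metrics nudges its overall rating up slightly, since it scores better on attendance-style measures than on attainment.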

“If you look at the way we’ve redistributed the points, schools will actually still be able to get a quality rating based on things that matter in a school like graduation rates and drop-out rates,” Jackson says.

The district expects to release school ratings sometime next month.

Greg Jones, principal of Kenwood Academy High School, says he doesn’t know whether the change in the formula will have a big impact on his school’s rating. His school has historically used practice PLAN and EXPLORE tests during the school year, but he didn’t notice any remarkable increase in scores after the tests were administered last spring.

“The performance and growth was pretty consistent with previous years,” he says.

More changes to come

Jones says the proposed changes to the SQRP come at as good a moment as any, given that the district is currently looking to overhaul the rating policy yet again.

That’s in part because the PLAN and EXPLORE tests won’t be administered this school year — a decision that had been made before this latest snafu came up. A committee of administrators, network chiefs and experts has been meeting monthly all year to figure out what a new SQRP should look like.

“Knowing that we’re no longer offering EXPLORE and PLAN in future years, there will be some shifts anyway,” Jones says. “This is a transitional moment.”

The SQRP committee is supposed to make recommendations to the Board early next year. (Evaluations for high school teachers are also changing this year. Instead of relying on the EPAS tests for the value-added portion of ratings, high school teachers are being rated on two REACH performance tasks.)

Part of the challenge the SQRP committee faces is figuring out what to do about the PARCC, the controversial new state assessment that’s aligned to the Common Core and will be given for the second time this school year. That test counts only for state accountability purposes — but the district is investigating its potential use for other purposes, including the SQRP and teacher evaluations.

In the meantime – that is, during this current school year – school administrators don’t know what tests will be used to determine their ratings.

Already the district has had to make at least two substantial changes to the SQRP since it was first introduced in 2013. Last fall the Board voted to allow high-performing schools to be rated entirely on test scores instead of all the comprehensive SQRP metrics.

A few months later, the Board voted to change the policy again in order to return to a previous iteration of the policy that had fewer “levels.”

The changes make it tough to keep track of what all the levels actually mean – which can be problematic for families looking at school ratings to determine the best options for their children.

“I really recognize the power of a report card to drive behavior in schools, and you hate to see mixed messages come out,” says John Easton, a distinguished senior fellow at the Spencer Foundation who sits on the SQRP committee.

Easton — who was previously the U.S. Department of Education’s “research czar,” overseeing the Institute of Education Sciences, and is a former director of the University of Chicago’s Consortium on Chicago School Research — says he thinks CPS made the right decision in removing the ACT’s PLAN and EXPLORE results from the rating policy calculation. “They can’t use a test if you can’t trust the scores,” he says.