Response to Paul Vallas’s June 30 letter


As Mr. Vallas notes, the Illinois State Board of Education (ISBE) calculates dropout rates in two different ways. Both use data supplied by the Chicago Public Schools and all other school districts in the state. Here is an explanation of both rates and why we relied on the one we did in reporting an increase.

The dropout rates that Mr. Vallas wanted us to use are those published in the annual Illinois school report cards, which by law provide data only on regular schools, not special or alternative schools. Those rates camouflage the phenomenon of students in regular schools being transferred to special schools, from which they then drop out. As Mr. Vallas points out, in Chicago, these alternative schools tend to have higher dropout rates than regular schools do.

Over all, the dropout rate for Chicago’s regular high schools did improve in 1997-98, but the rate that encompasses all its schools and students got worse. In 1996-97, 15.6 percent of the system’s 101,590 high school students dropped out, according to the ISBE’s annual report, “High School Dropouts by Grade, Gender, and Racial/ethnic category.” In 1997-98, 17.5 percent of 98,610 high school students dropped out, the highest annual rate of the decade.

These data are buttressed by the board’s own monthly reports of school membership, which show that the midyear loss of high school students—specifically from October to May—increased in 1997-98.

Like ISBE, Chicago’s own Office of Compliance produces an annual dropout report that encompasses all students. For 1996-97 and the two prior school years, the CPS rates are nearly identical to the ISBE rates. However, the School Reform Board has not released Compliance’s annual dropout report for 1997-98. If the previous pattern holds, the CPS calculation likely will show an increase, too.

Mr. Vallas says that former state school Supt. Joseph Spagnolo—who now works for the Chicago Public Schools—says the all-school ISBE rates are unreliable because “they possibly double count students.” Tom Hernandez, spokesperson for the ISBE, responds that no significant double counting is likely.

Mr. Vallas contends that our main story ignores our own chart showing a downward trend in the four-year dropout rate, beginning with the Class of 1991 through the Class of 1998. However, based on historical trends, the overwhelming majority of students who dropped out in 1997-98 likely belonged to the classes of 2001 and 2002. Indeed, we won’t know for at least one year and probably two or three whether the 1997-98 increase in the annual calculation will have any effect on the four-year rate. This is a point that we should have included in our stories.

Mr. Vallas errs when he says that we were completely silent about unreliable dropout data of the past and the board’s aggressive steps to get accurate information. Click to see our story, “Audit team scours six years of records.”

Several times, Mr. Vallas incorrectly accuses us of overlooking policies and programs put in place since 1996 to help students stay in school. In our June issue alone we wrote about three of the Reform Board programs that Mr. Vallas mentions: Cradle to Classroom, credit recovery and the Truancy Outreach Program. In the past year, we have written about the following programs: dropout retrieval, alternative schools for troublemakers, transition schools, student advisories and the International Baccalaureate program.

Mr. Vallas also accuses us of ignoring the work of Northwestern University Prof. Fred Hess, who is conducting a three-year study of the board’s high school restructuring program. We quoted Mr. Hess from interviews on the subject of dropouts. We did not quote from his report summary because it wasn’t about dropouts. (To see the summary, click here.)

Mr. Vallas contends that we “insinuate” that the decrease in the number of students absent 70 or more days is due to schools dropping students for excessive absences rather than schools’ efforts to improve attendance. What we wrote was that it is unclear how much of the decrease is due to each factor. We included this statement because the school attendance audits available to Catalyst at press time showed a clear pattern: attendance percentages improved as overall enrollment declined.

While Mr. Vallas does not challenge the data in our story on schools dropping students before their 16th birthday, he does question the schools we chose for the study and, it seems, our decision to write the story at all, given the small numbers involved. In selecting the schools, we looked for variety in the demographics of student bodies. We agree that the seven schools were not representative of school achievement levels.

Mr. Vallas reports that Robert Saddler says the quote attributed to him does not reflect his views and was taken out of context. Mr. Saddler’s remarks were taken verbatim from page 15 of the transcript of the April 1998 meeting of the Youth Connection Charter School Board of Directors. (For context, pages 10-15 are posted to our site.)

Mr. Vallas also cites a dropout rate for transition centers, 13 percent, but does not explain that it covers only half a school year. For all of 1997-98, the dropout rate for transition centers was 30 percent, according to calculations by Mr. Hess. This rate contrasts starkly with numbers supplied to Catalyst last year by the administration, which counted only 63 dropouts from transition centers in 1997-98, a rate of about 5 percent.

Switching to our June 1998 issue, Mr. Vallas incorrectly accuses us of writing that test scores went up only because of retained students. In my column in that issue, I cited a number of reasons for increased test scores, including summer school, after-school programs and the school probation program. I also speculated that the scores had been inflated by retention and criticized the board for not addressing that factor. When the Consortium on Chicago School Research subsequently reported that retention had boosted scores at some grade levels but not over all, I wrote a story reporting those findings.

Mr. Vallas also incorrectly contends that Catalyst is trying to defend the old policy of social promotion. When the School Board first announced its student promotion/retention program, we, like almost everyone else, applauded. Since then, our reporting has uncovered negative, unintended consequences that we have said should be addressed. And Mr. Vallas himself is saying it’s time for the program to evolve.

For our response to Mr. Vallas’s broader allegations about bias against his administration, see our editorial: “What we do and why we do it.”