Wednesday, February 15, 2012

School Vouchers in Louisiana...already failing!


An Independent investigation and analysis
By Heather Miller -  Photos by Robin May
[Editor's Note: Read the accompanying editorial, "We Can't Vouch for School Choice," here.]

The state Department of Education is either cooking the books or simply incompetent in establishing the success of the voucher program Gov. Bobby Jindal wants to expand.

Because the numbers just don’t add up.


More than 150 fourth graders in New Orleans took the state’s LEAP test in a private school setting last year through the state-run private school voucher program. If any or all of them failed, there is nothing on the books to stop them from advancing to the fifth grade.

The same can’t be said for fourth graders in Louisiana’s public schools, where achievement standards are clearly outlined by state law and students must meet those benchmarks to move on to the next grade.

The hypothetical scenario — hypothetical because the state would not confirm whether this has actually occurred with voucher students in New Orleans — is one of several critical oversight issues The Independent discovered through an examination of the New Orleans voucher program, also known as the pilot of Gov. Bobby Jindal’s push for a statewide launch of publicly funded private education.

The expansion of vouchers, which fund private school tuition for low-income students, is among the most controversial pillars of Jindal’s education reform agenda. The governor’s plan to potentially add 380,000 students to the private school voucher rolls has raised seemingly endless questions on how the program would work on a statewide level and, more important, whether private schools in New Orleans are showing measurable gains for the students they serve.

But few conclusions can be drawn from the data provided by the state Department of Education, and a further review of the numbers reveals limited measurable data marred by inaccuracies. The state has poured $26.2 million into the 4-year-old program that serves more than 1,800 students. So far, the state can vouch for the success of 50.

In a public records request submitted to DOE Jan. 24, The Independent asked for a grade-level breakdown of the voucher program’s enrollment, specifically the total number of students in third, fourth and fifth grade who are required to take the state’s LEAP test. The records request was, in part, a means of verifying that private schools are really testing the voucher students they accept. Private schools using voucher money must administer the LEAP and iLEAP tests to voucher students, but unlike public schools, private schools face no repercussions for failing scores because they aren’t bound to the accountability standards built into public schools.

Three days after the state received the records request, DOE spokeswoman Rene Greer conceded that there are discrepancies in the data and asked for more time.

The state agency did, however, provide an annual report on the voucher program to supplement The Independent’s research, and at first glance it appeared that the 15-page document would answer most, if not all, of the newspaper’s questions about the program. The same report was presented to the state Board of Elementary and Secondary Education in November. It was referenced by BESE President Penny Dastugue during a Jan. 17 BESE meeting.
[Photo: Gov. Bobby Jindal]

The Times-Picayune presumably received the same data, too, some of which was included in a Feb. 4 Times-Pic analysis of the voucher program’s success.

Using the numbers in the report, The Independent calculated that 600 voucher students should have taken a standardized test in the 2010-2011 school year. Of those 600 students, fewer than half (294) had test scores, according to the report provided by DOE.

Fortunately, the numbers provided by the state Department of Education are dead wrong.

The state attributes the errors to a variety of factors, among them the department’s use of projected enrollment numbers as opposed to actual student counts, Greer says. DOE now says that the number of voucher students who should have tested last year is 498, of which 479 scores were reported.

DOE maintains that the percentages of students scoring at grade level or above are correct in the report, but the dismal figures clearly show that voucher schools have not proven more successful than their failing public school counterparts in New Orleans. And as The Times-Picayune points out, even if the test scores were significantly better than public schools, the state has failed to track the students and raw data in a way that can effectively measure the program.

“The best data the state has made available tracks an initial group of third-grade students who began in the voucher program when it started in 2008,” The Times-Picayune’s Andrew Vanacore writes in that Feb. 4 report. “Although it makes no comparison with students in public schools, it at least shows growth over time among a consistent group of students. But this comparison tracks only 38 students, leaving considerable uncertainty.”

The state department has since retracted that number, replacing the 38 students with an equally unimpressive cohort of 50 voucher students whose progress has been tracked over the duration of the program. Of the 104 students who started in 2008, 70 remain in the program today; with only 50 of those tracked, 20 students are unaccounted for.

Greer confirms that the department has omitted from the cohort the test scores of students who weren’t promoted to the next grade (in all likelihood some of the lowest scores in the batch).



In the program’s first year, almost 16 percent of voucher third graders in private schools did not have test scores reported to the state.

“People are interpreting things differently in accountability, and we need to do a better job of looking at what’s happening even if the laws don’t require it,” Greer says.

State Superintendent John White says the department will accept responsibility for the number of discrepancies, but still denies that the data prove anything other than the program’s intent: Choice.

“A lot of it’s yet to be worked out,” White says. “It’s going to come down to BESE and the legislative process. The program is a pilot for a larger program, and we need to have a strong, regulated program. We need to do better in that area. But I will not accept people’s criticisms that a choice-based system in New Orleans has not worked. Student outcomes have shown this is a positive reform. We can’t levy criticism on schools that are choices for kids. We are providing an option for which government did not. That’s why I say that with this parsing of the data, there can be no gaming.”

But the irregularities in the report clearly show that there are too many choices in the interpretation of what should be raw and irrefutable numbers, the same numbers state officials are using to sell the voucher brand and silence the program’s critics.

Citing the DOE report at a recent BESE meeting, Dastugue took issue with public-education advocate Karran Harper Royal, who raised concerns over the state’s monitoring process of nonpublic schools.

“I have a copy of the annual report for the 2010-2011 scholarship program presented some time in the fall,” Dastugue said. “It basically says that the program retention rates were [that] 95 percent of the families submitted applications to return.”
[Photo: State Superintendent of Education John White]
Dastugue, using what the department maintains is the correct figure for how many voucher school parents submitted applications to return to the program, inaccurately equates the 95 percent figure with the program’s actual retention rate. The slip might never have occurred had DOE included the program’s real retention rate of 86 percent in the report.

“The statement that parents are dissatisfied and leaving the schools is not supported by the report presented by the Department of Education to the board back in the fall,” Dastugue says. “And I will make sure you all get a copy of it.”

But no matter how you “parse” the rest of the numbers in that DOE report, only one conclusion can be drawn: The state has failed to adequately monitor the 1,800 students who attend private school on the public’s dime.

If the only accountability measure we have after three years of data isn’t worth the paper it’s printed on, are taxpayers ready to jump on board with 400,000 more kids? Then again, as DOE has repeatedly implied, it all depends on how you look at it.

Source: The Independent Weekly's article "Incomplete"
