In an interview with this paper last year, UC President Robert C. Dynes outlined his philosophy toward the university’s academic preparation programs: “I’m willing to put academic preparation money — outreach money — into anything that proves successful. No goofy ideas, but anything that actually achieves success.”
High school students at UCSD’s Preuss School gather for a study session. The school is part of the UC system’s outreach efforts, which experts say still lack appropriate accountability measures despite a new UC report detailing the programs’ successes.
However, a decade after the university first introduced the programs, critics continue to argue that neither Dynes nor the UC campuses actually know how to separate “goofy ideas” from successful ones.
Having spent hundreds of millions of dollars on outreach, the university responded to its detractors last week, releasing the first comprehensive study of the efforts. Despite its lofty claims, though, the new data is unlikely to calm the controversy.
“I think what the report provides is kind of a first step in the right direction,” said Anthony Simbol, an outreach skeptic who oversees higher education for the Legislative Analyst’s Office, the state Legislature’s nonpartisan policy evaluator.
When voters passed Proposition 209, banning racial preferences in university admissions, UC faculty and administrators rushed to craft a legal way to target the students they feared would be left behind: the socio-economically disadvantaged and ethnic minorities. The answer was outreach, a term covering a variety of programs such as special tutoring, counseling and preparation for standardized tests.
However, the original framework for outreach lacked meaningful accountability measures, requiring only that programs enroll certain numbers of students and that those students meet specific academic benchmarks. In effect, critics argued, the framework gave administrators an incentive to enroll the top-performing students from disadvantaged subgroups, who would have met the benchmarks anyway, instead of providing real help to those most in need.
One such critic is Gov. Arnold Schwarzenegger. Last year, he proposed eliminating state funding for academic preparation, but later relented under the condition that the university would carry out a new, more thorough evaluation of the programs.
The new report is the UC system’s attempt to satisfy that condition.
Unlike the original framework, the report sets new goals and compares students who participate in the programs with similar students who don’t, attempting to measure success in much the same way doctors test drugs against placebos.
“Research and evaluation demonstrate the effectiveness of [the] programs,” the report stated.
The problem, however, is that there is reason to believe program participants differ from non-participants in other, more fundamental ways.
It could be, for example, that the most academically driven and talented students are also the ones most likely to enroll in UC academic preparation, driving up participants’ scores independently of the programs themselves.
In a medical analogy, if heart-disease patients on a certain drug experience higher rates of cardiac arrest, is the drug to blame or did those taking it simply start out with weaker hearts?
“For many educational researchers, finding causal relationships between things like educational treatments/programs/curricula and student outcomes is the ‘holy grail’ of research — and just as elusive (despite the ‘Da Vinci Code’!),” stated Harold Levine, dean of UC Davis’ School of Education and chair of the university’s Student Academic Preparation and Educational Partnerships Accountability Planning and Oversight Committee, in an e-mail. “So, in short, we do the best we can. … For our SAPEP programs, I think it’s fair to say that we can’t prove a causal effect. Rather, we can look at the correlations (through complex statistical models), make comparisons with similar populations not in the program(s), and draw our inferences.”
Though the university’s data show that students participating in its outreach programs do better than their peers and the statewide average, there is some reason to remain agnostic. The drawbacks of such statistics can be seen in UCSD’s Preuss School, one of the best-studied of the university’s efforts.
For example, 69 percent of 11th graders enrolled at the UCSD-run charter school scored “proficient” or better in English on state standardized tests, compared to 36 percent for San Diego County.
But when Preuss students were compared with a more similar group, students who applied to Preuss but lost the admissions lottery, those differences disappeared, suggesting the students were simply stronger in English before they ever arrived at Preuss.
However, few outreach programs have undergone similarly rigorous evaluations, because such studies are difficult to design and prohibitively expensive.
“It’s a little bit too early to tell; there is really no conclusive evidence,” Simbol, who also serves on the SAPEP oversight committee, said of the programs’ effectiveness.
However, the university points out that its evaluations are some of the best available.
“Taking into consideration accepted standards for measuring effects of educational programs, we believe the university’s evaluation of its programs reliably accounts for the other known factors that might have affected students’ educational outcomes,” stated UC Office of the President spokesman Brad Hayward in an e-mail.
For now, it’s unclear whether the report will satisfy critics. However, it may not have to.
One outreach skeptic, state Sen. Jackie Speier (D-San Francisco/San Mateo), who has publicly questioned whether the programs actually work, has written a letter to the chair of the Senate Budget Subcommittee on Education asking that academic preparation funding be included in next year’s budget. The letter was dated before the report’s release.