Late last month a group called The 21st Century Partnership for STEM Education issued a report on Pennsylvania schools that seemed to utterly miss the point, a problem amply illustrated by the opening paragraph:
“Despite the commonly held belief that higher spending is correlated with higher test scores, the evidence does not support this in Pennsylvania as measured by 11th grade math and reading scores from 2004-2010.”
For those who haven’t memorized the arcane education acronyms wonks assume everyone knows, “STEM” is short for Science, Technology, Engineering and Mathematics. It’s been a hot topic for years as international tests and other indicators keep suggesting U.S. schools are not producing students who are up to snuff in those areas, or are not producing enough of them.
The 21st Century Partnership describes itself as aspiring “to be a regional and national leader in data-based analysis, program planning, innovative curricula and professional development” in STEM education. This report fails in that aspiration.
I have never seen any data or respected research claiming, as the report’s introduction says, that “higher spending is correlated with higher test scores.” Quite the contrary: data have consistently shown that the single biggest factor in test scores is typically the income of the student’s family, which in turn can dramatically affect a child’s nutrition and the resources available in the early learning years, factors that weigh heavily on standardized test results.
Another argument for more education spending – one used by former Gov. Ed Rendell in his push to increase the flow of state money – is parity. District budgets hinge heavily on property taxes, which in turn are tied to demographics. A district full of wealthy people buying high-end homes can expect fuller coffers than one with people living on small incomes in small, old houses.
This argument does end up pushing the “more money makes better students” idea, but it’s secondary to the notion that the quality of a student’s education can be an accident of where that student is born. Rendell’s increased spending was geared almost entirely toward assuring a student in, say, rural Northwest Area got the same opportunities as a student in a wealthy district in, say, booming Monroe County.
In fairness, the report focuses on 11th-grade students, when many of these factors should be mitigated. But mitigating them is precisely where “more money” arguments often come into play, as schools provide everything from free lunches to tutoring for those who came into class behind the curve thanks to income barriers.
And the report seems to do a rigorous job reviewing state test scores to identify 30 Pennsylvania school districts that most consistently improved math and reading results, and 30 that most consistently regressed. No Luzerne County districts made either list.
But it used that data to tackle the wrong issue. The question isn’t “Does more money work?” That’s a question in a vacuum, and we know more dollars, without other factors, guarantee nothing.
The question is “How do we spend the money to make it work?” The partnership not only fails to ask that question, it concedes as much in a press release, in which a co-author is quoted as saying, “We need in-depth studies of these (60) particular districts to understand what actually happened and why.”
Yes, we do.
So why didn’t the partnership do that study?