I decided to plug into Claude a graphic that had just been shared online as evidence that administrative bloat was behind declining educational outcomes.
Part of what I’ve been exploring is what critical reasoning with AI looks like, and how to use it for augmented investigation of issues rather than as an answer machine or a search engine. In particular I want to harness the unique ability of LLMs to surface unknown unknowns — that is, the connections or issues you might not otherwise see or think to ask about. The particular nature of LLM technology makes it well suited to this: not to get too nerdy, but the linguistic representation of your issue is likely to be statistically adjacent to the linguistic representations of related issues. Call it intelligence or not, it can be a useful trick.
Here’s my method and the results.
What is this implying?
I shared the graphic with Claude, along with the quote tweet it came with: “a big part of why your children aren’t getting a good education”
My first question when presenting a piece of evidence is the fairly open-ended “What is this implying?”
I think it’s important to start with an open-ended question like that for two reasons. First, you want to make sure the AI is getting the basics right. Second, sometimes it turns out that you are missing something. Keeping the first question loose makes sure the user and the AI agree on what the evidence is before getting to its significance.
Here’s the reply:
The image shows a graph comparing the growth rates of different staffing categories in US public schools from 2000 to 2021. The data shows that administrative staff has grown by 95% during this period, while the number of teachers has only grown by 5% and students by 10%. The growth in principals and assistant principals is shown at 39%.
The tweet and accompanying comment are suggesting that there is an imbalance in how educational resources are being allocated, with a disproportionate increase in administrative positions compared to teaching staff. The implication is that this top-heavy growth in administration may be detrimental to educational quality, as resources are being directed toward administrative roles rather than classroom teaching positions that more directly impact student learning.
This opens up important questions about educational resource allocation, though it's worth noting that the raw numbers alone don't tell us about the necessity or impact of these administrative roles, the changing demands on school systems, or other factors that might explain these trends.
Initial Toulmin Analysis
I skimmed that and it looked good to me, and then asked for a Toulmin analysis. I like to do a Toulmin analysis because I think it keeps the analysis loose, and prevents using the AI as an “answer machine”.
Do a Toulmin analysis of this chart as evidence that administrative bloat is the cause of supposedly declining educational quality, include rebuttals and follow it with a summary of the strength of the argument vs. the rebuttals.
It gives me a lot here, but keep in mind we’re looking for the unknown unknowns, so we can skim:
I'll break this down using the Toulmin model of argumentation:
CLAIM: Administrative bloat in public schools is causing a decline in educational quality
EVIDENCE (Data):
95% growth in administrative staff from 2000-2021
Only 5% growth in teachers during same period
39% growth in principals/assistant principals
10% growth in students
WARRANT (Underlying Assumption): Resources directed to administration rather than teachers harm educational outcomes because:
Money spent on administrators could be spent on teachers
Administrative overhead reduces organizational efficiency
More administrators don't directly improve classroom instruction
BACKING:
Traditional view that teacher-student interaction is the core of education
Business principle that overhead costs should be minimized
Budget is zero-sum; money spent on one area isn't available for others
QUALIFIERS:
"Big part of the reason" (not the sole cause)
Assumes education quality has actually declined
REBUTTALS:
Correlation vs. Causation:
Graph shows correlation but doesn't prove causation
Educational outcomes influenced by many factors beyond staffing ratios
No direct evidence linking administrative ratios to student performance
Changed Administrative Needs:
Modern schools face new regulatory requirements
Special education mandates require additional coordination
Technology integration needs management
Safety and security concerns require dedicated staff
Mental health and support services need administration
Data Context Issues:
Raw numbers don't show what administrators actually do
Doesn't show if administrative roles have changed/evolved
Doesn't account for changing educational mandates
No breakdown of administrative categories
Doesn't show absolute numbers (could be small base number)
Missing Alternative Explanations:
Socioeconomic factors affecting education
Changes in student needs and demographics
Impact of technology and teaching methods
Family structure and support changes
Funding inequities between districts
STRENGTH ANALYSIS:
The argument's strengths:
Clear correlation in staffing trends
Intuitive connection between resource allocation and outcomes
Zero-sum budget argument is logical
Data spans significant timeframe
Comparison between categories is striking
The rebuttals' strengths:
Multiple alternative explanations
Complex nature of educational outcomes
Well-documented changes in educational requirements
Lack of causal evidence
Missing context about administrative roles
Overall Assessment: While the data presents a compelling correlation and raises valid concerns about resource allocation, the rebuttals significantly weaken the causal argument. The complexity of modern education systems, changing requirements, and lack of direct evidence linking administrative staffing to educational outcomes make it difficult to support the strong causal claim. The argument works better as evidence that resource allocation should be examined rather than as proof that administrative growth is definitely harming education. The data supports asking important questions about staffing priorities but doesn't definitively prove the claim about educational quality decline.
A more measured conclusion would be that administrative growth might be one factor worth examining in educational effectiveness, but it cannot be isolated as a primary cause of educational challenges without additional evidence and context.
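For readers who think better in code, the Toulmin structure Claude produced above can be sketched as a simple data structure. This is my own illustrative sketch, not anything Claude generated; the example values are abbreviated from the analysis above, and the `is_contested` heuristic is a made-up toy, not part of the Toulmin model:

```python
from dataclasses import dataclass

@dataclass
class ToulminAnalysis:
    """Minimal sketch of the Toulmin model, with fields matching the
    labels Claude used: claim, evidence, warrant, backing, qualifiers,
    rebuttals."""
    claim: str
    evidence: list
    warrants: list
    backing: list
    qualifiers: list
    rebuttals: list

    def is_contested(self) -> bool:
        # Toy heuristic: call the argument contested when rebuttal
        # lines outnumber the backing behind the warrant.
        return len(self.rebuttals) > len(self.backing)

analysis = ToulminAnalysis(
    claim="Administrative bloat is causing declining educational quality",
    evidence=["95% admin growth", "5% teacher growth", "10% student growth"],
    warrants=["Money spent on administrators could be spent on teachers"],
    backing=["Teacher-student interaction is the core of education"],
    qualifiers=["'Big part of the reason' (not the sole cause)"],
    rebuttals=["Correlation vs. causation", "Changed administrative needs",
               "Data context issues", "Missing alternative explanations"],
)
print(analysis.is_contested())  # prints True: 4 rebuttals vs. 1 backing line
```

The point of writing it this way is to see how little of the argument lives in the claim itself; most of the action is in the warrant and the rebuttals.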
Updating the Warrant
I didn’t like that the warrant left one thing implicit — that more teachers per student would lead to better outcomes. So I asked Claude to add this explicitly to the warrant:
Add a warrant -- more teachers improves educational quality
Let me add this important warrant and revise the analysis:
NEW/ADDITIONAL WARRANT: More teachers (lower student-teacher ratios) improve educational quality because:
More individual attention per student
Smaller class sizes enable better classroom management
Teachers can provide more detailed feedback
More time for personalized instruction
Better ability to address diverse learning needs
It then reran the analysis and provided a strengthened argument path:
This strengthens the original argument significantly, so let me revise the strength analysis:
STRENGTHENED ARGUMENT PATH:
Resources are being directed to administration instead of teachers (evidence)
More teachers improve educational outcomes (new warrant)
Therefore, by diverting resources from hiring teachers to administration, educational quality suffers
This creates a stronger logical chain than the previous analysis, because:
There is robust research supporting the benefits of lower student-teacher ratios
The 5% teacher growth vs. 10% student growth shows ratios are getting worse
The 95% administrative growth represents resources that could have funded more teaching positions
After a lot more analysis it ended with:
The argument has moved from "suggestive but weakly supported" to "concerning and worthy of serious investigation" with the addition of the teacher-ratio warrant, but still falls short of definitively proving causation.
Correcting a Data Error
When looking at the stronger argument path, I noticed this line:
The 5% teacher growth vs. 10% student growth shows ratios are getting worse
I looked back at the graph to see that evidence:
This is obviously a badly formatted graphic. But on a second viewing, I realized the numbers here were reversed: it’s a 10% increase in teachers, and a 5% increase in students. So I corrected that:
the chart actually shows 5% student growth and 10% teacher growth, rerun the analysis
It then reran the analysis, noting that the chart actually undermines the point it is supposed to make. The chart does not show spending on teachers declining relative to administration and bureaucracy; it shows teachers increasing relative to students, despite that bureaucracy. So if outcomes are really going down (an assumption here), it is not due to resource competition.
STRENGTHENED POSITIVE ASPECTS:
Teacher growth (10%) has outpaced student growth (5%), suggesting improving teacher-student ratios
This undermines the argument that administrative growth has come at the expense of teaching positions
The data actually shows investment in both administrative AND teaching staff above student growth rates
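The swapped numbers matter quantitatively. A quick sketch (mine, using the chart’s growth percentages) shows how much the student-teacher ratio moves under each reading:

```python
def ratio_change(student_growth: float, teacher_growth: float) -> float:
    """Relative change in the student-teacher ratio, given each
    group's growth rate over the same period."""
    return (1 + student_growth) / (1 + teacher_growth) - 1

# Misread version: 10% student growth vs. 5% teacher growth.
misread = ratio_change(0.10, 0.05)    # ratio worsens by about 4.8%
# Corrected version: 5% student growth vs. 10% teacher growth.
corrected = ratio_change(0.05, 0.10)  # ratio improves by about 4.5%

print(f"misread: {misread:+.1%}, corrected: {corrected:+.1%}")
```

Reversing two bars on a chart flips the sign of the conclusion: the ratio goes from “getting worse” to “getting better.”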
Adding further context
This got me interested in how much total staffing had increased or decreased over this period. So I uploaded a graph of that.
Again, I used an open-ended question:
How does this fit into our 2000-2021 timeline and trends?
It confirmed the content:
Let me analyze how this historical staffing data aligns with and informs our discussion of the 2000-2021 trends:
Key Observations for 2000-2021 Period:
Fall 2000: 5.7 million employees
Fall 2016: 6.5 million employees (latest shown)
This represents about a 14% increase in total school employees
Then it noted something interesting: given that administrative staff grew 95% while total staffing grew only 14%, administrative staff must be a fairly small portion of the overall workforce.
That’s a fairly advanced quantitative-reasoning observation that, in my experience, very few people make. It then reran the analysis:
Analysis:
The total workforce growth (14%) falls between teacher growth (10%) and principal growth (39%), suggesting:
Administrative growth wasn't accompanied by equivalent support staff growth
The 95% growth in administrative staff represents a relatively small portion of total employees
Most school employees are likely teachers and support staff, not administrators
Historical Context:
Shows steady growth from 1949 (1.3M) to 2000 (5.7M)
Growth rate actually slowed during our 2000-2021 analysis period
Suggests the modern period isn't unusual in seeing staffing increases
This Weakens the Original "Bloat" Argument Because:
Total staffing growth is modest
The dramatic administrative percentage increase likely represents a smaller absolute number within the total workforce
Growth appears to be a continuation of long-term trends rather than recent mismanagement
Surprising elements
I also uploaded a number of other graphs and charts about various subcategories of professionals, and noticed it was easy to get overwhelmed by all the data. So I tried what turned out to be a very useful question:
What are some of the more surprising things we've found in this analysis so far?
The idea behind asking this question is that the things we have learned since starting the session are probably more important to keep front of mind than the things we came in knowing. I don’t know; it seemed like a good question. In this case it got a good result.
Several surprising elements have emerged from our analysis:
The Student-Teacher Ratio Improvement:
Initially many (including myself in my first analysis) might assume administrative growth came at the expense of teachers
The data actually shows teacher growth (10%) outpaced student growth (5%), meaning class sizes likely improved
This contradicts the narrative that administrative growth harmed teacher staffing
Scale of Special Needs Growth:
The 83% increase in students with disabilities since 1975 is particularly striking
Combined with 29% growth in ELL students and 45% growth in low-income students
These numbers help contextualize the 95% administrative growth - it's more proportional to need than it initially appeared
Historical Staffing Context:
The steady growth in total school employees since 1949 suggests recent growth isn't unusual
The 2000-2021 period actually shows slower total staffing growth than previous decades
This challenges the notion that recent administrative growth represents a unique problem
Total Workforce Perspective:
The 14% total workforce growth between 2000-2016 is much smaller than the 95% administrative growth
This suggests administrative growth, while large in percentage terms, represents a smaller absolute change than it might appear
The dramatic percentage increase may be somewhat misleading without this context
Summary
To be clear, the point here was not to get a quick answer, but to deepen our understanding of the issue. But we did learn some quick and useful things.
First, the graph itself undermines the argument that administrative bloat is a cause of student outcome decline (again, if we accept that assumption, something that Claude consistently warns should be checked). It’s a bit of a self-own actually.
Second, maybe schools could be leaner, and maybe when we look at staff growth relative to population the numbers don’t look as good. But we’re not in the middle of unprecedented growth by a long shot.
Finally, I have skipped over a lot of the Claude responses, but it actually does quite well with explaining how shifts in student population and regulatory requirements can drive increases in staffing. I know that, of course, but seeing it laid out clearly was helpful.
I’m still figuring out how to narrate these sessions without making them overwhelming. Maybe this is something that translates better to video, like Hank Green’s sensemaking sessions. I’ll figure it out. But what I want to get at is that there is a there here: there is a way to work with these tools that can surprise you and enrich your understanding as you work through problems, instead of handing you canned results. And while it currently feels like more than most students can handle, there is going to be value in developing methods that use this tech to augment critical reasoning rather than to provide answers or produce products.