Cory Doctorow
@pluralistic@mamot.fr
Long thread/12

Instead of blaming inadequate funding for poor ambulance response times, politicians blamed "inefficiency," driven by poor motivation. So they established a metric: ambulances must arrive within a certain number of minutes (and they set a consequence: massive cuts to any ambulance service that didn't meet the metric).

12/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/13

Now, "an ambulance where it's needed within a set amount of time" may sound like a straightforward metric, and it was - retrospectively. As in, we could tell that the ambulance service was in trouble because ambulances were taking half an hour or more to arrive. But prospectively, after that metric became a target, it immediately ceased to be a good metric.

13/


Cory Doctorow
@pluralistic@mamot.fr
Long thread/14

That's because ambulance services, faced with the impossible task of improving response times without spending money, started to dispatch ambulance motorbikes that couldn't carry 95% of the stuff needed to respond to a medical emergency, and had no way to get patients back to hospitals. These motorbikes were able to meet the response-time targets...without improving the survival rates of people who summoned ambulances:

https://timharford.com/2014/07/underperforming-on-performance/

14/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/15

AI turns out to be a great way to explore all the perverse dimensions of Goodhart's Law. For years, machine learning specialists have struggled with the problem of "reward hacking," in which an AI figures out how to meet some target in a way that blows up the metric it was derived from:

https://research.google/blog/bringing-precision-to-the-ai-safety-discussion/
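
A quick illustrative sketch (mine, not drawn from the linked post - the names, the 3x payoff, and the "real work vs gaming" split are all invented): hand an optimizer a proxy score instead of the thing you actually care about, and it will pour everything into whatever part of the proxy is cheapest to pump.

```python
# Toy Goodhart's Law demo: a fixed effort budget can be split between
# doing the real work and gaming the measurement. The proxy metric
# rewards both, but gaming pays more per unit of effort.

def true_goal(alloc):
    # What we actually care about: effort that goes into the real outcome.
    return alloc["real_work"]

def proxy_metric(alloc):
    # What gets measured and rewarded: real work counts, but gaming the
    # measurement scores three times as much per unit of effort.
    return alloc["real_work"] + 3.0 * alloc["gaming"]

def best_allocation(score, budget=1.0, steps=101):
    # Brute-force search over ways to split the budget, keeping the
    # highest-scoring split.
    candidates = (
        {"real_work": budget * i / (steps - 1),
         "gaming": budget * (1 - i / (steps - 1))}
        for i in range(steps)
    )
    return max(candidates, key=score)

honest = best_allocation(true_goal)    # optimize the thing itself
gamed = best_allocation(proxy_metric)  # optimize the measurement

print("optimizing the goal  ->", honest)                  # all effort on real work
print("optimizing the proxy ->", gamed)                   # all effort on gaming
print("true goal actually achieved:", true_goal(gamed))   # 0.0
```

Once the measurement is the thing being optimized, it stops telling you anything about the goal it was meant to track.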

15/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/16

My favorite example of this is the AI-powered Roomba that was programmed to find an efficient path that minimized collisions with furniture, as measured by a forward-facing sensor that sent a signal whenever the Roomba bumped into anything. The Roomba started driving backwards, smashing into all kinds of furniture, but measuring zero collisions, because there was no collision-sensor on its back:

https://x.com/smingleigh/status/1060325665671692288
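
Here's a toy version of that hack in code (my own sketch, not the code from the linked thread - the 30% bump rate and episode length are made up). The reward only penalizes collisions the front bumper can feel, so a backward-driving policy hits just as much furniture while logging a perfect score:

```python
import random

def run_episode(drive_forward, steps=100, seed=0):
    # Same simulated furniture layout for both policies.
    random.seed(seed)
    real_collisions = 0    # what actually happens to the furniture
    sensed_collisions = 0  # what the forward-facing bumper reports
    for _ in range(steps):
        if random.random() < 0.3:          # bumps into something
            real_collisions += 1
            if drive_forward:              # the only sensor faces forward
                sensed_collisions += 1
    reward = -sensed_collisions            # the score being optimized
    return real_collisions, sensed_collisions, reward

for drive_forward in (True, False):
    real, sensed, reward = run_episode(drive_forward)
    label = "forward" if drive_forward else "backward"
    print(f"driving {label}: real={real}, sensed={sensed}, reward={reward}")
```

Driving backward earns the maximum possible reward (zero penalty) while crashing exactly as often - the measured number improved, the thing it was measuring didn't.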

16/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/17

Charlie Stross has observed that corporations are a kind of "slow AI" that engage in endless reward-hacking to accomplish their goals, increasing their profits by finding nominally legal ways to poison the air, cheat their customers and maim their workers:

https://memex.craphound.com/2017/12/29/charlie-strosss-ccc-talk-the-future-of-psychotic-ais-can-be-read-in-todays-sociopathic-corporations/

17/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/18

Public services under conditions of austerity are another kind of slow AI. When policymakers demand that a metric be satisfied without delivering any of the budget or resources needed to satisfy it, the public employees downstream of that impossible demand will start reward-hacking and the metric will become a target, and then cease to be a useful metric.

Which brings me, at last, to AI in educational contexts.

18/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/19

In 2002, George W Bush stepped up the long-running war on education with the No Child Left Behind Act. The right hates public education, for many reasons. Obviously, there's the fact that uneducated people are easier to mislead, which is helpful if you want to get a bunch of turkeys to vote for Christmas ("I love the poorly educated" -DJ Trump).

19/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/20

Then there's the fact that, since 1954's Brown v Board of Ed, Black and brown kids have been legally guaranteed the right to be educated alongside white kids, which makes a large swathe of the right absolutely nuts. Then there was the 1962 Supreme Court decision that banned prayer in school, leading to bans on teaching Christian doctrine, including nonsense like Young Earth Creationism.

20/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/21

Finally, there's the fact that teachers a) belong to unions; and, b) believe in their jobs and fight for the kids they teach.

No Child Left Behind was a vicious salvo in the war on teachers, positing the problem with education as a failure of teachers, driven by a combination of poor training and indifference to their students. Under No Child Left Behind, students were subjected to multiple rounds of standardized tests.

21/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/22

Teachers with low-performing students had their budgets taken away (after first being offered modest assistance in improving those scores).

Some of NCLB's standardized tests represented reasonable metrics: we really do want kids to be able to read and do math and reason and string together coherent thoughts at various points in their schooling. But when these metrics became targets, boy did they stop being useful as metrics.

22/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/23

It's impossible to overstate how fucking perverse NCLB was. I once met an elementary school teacher from an incredibly poor school district in Kansas. Many of her students were resettled refugees who spoke a language that no one in the school system could speak, and which had no system of writing. They arrived in her classroom unable to speak English and unable to read or write in any language.

23/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/24

Obviously, these students performed badly on standardized tests delivered in English (it didn't help that they had to take the tests just months after arriving in the classroom, because the clock started ticking on their first test when they entered the system, which could take half a year to place them in a class). Within a couple years, these schools had had most of their budgets taken away.

24/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/25

When the standardized tests rolled around, this teacher would lead her students into the only room in the school with computers - the test-taking room. For many of these students, this was the first time they had ever used a computer.

25/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/26

She would tell them to do their best and leave the room for an hour, while a well-paid proctor (along with the test-taking computers, the only things NCLB guaranteed funding for) observed them as they tried to figure out how a mouse worked. They would all score zero on the test, and the school would be punished.

26/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/27

NCLB was such a failure that it was eventually rescinded (in 2015), but by that time, a new system of standardization had rushed in to fill the gap: the Common Core. Common Core is a set of rigid standardized curricula - with standardized assessment rubrics - that was, once again, driven by contempt for teachers.

27/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/28

The argument for Common Core was that students were failing - not because of falling budgets or No Child Left Behind - but because the unions were "protecting bad teachers," who would then go on to fail students. By taking away discretion from teachers, we could impose "accountability" on them.

28/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/29

The absolutely predictable outcome followed Goodhart's Law to a T: teachers prioritized inculcating students with the skills to pass the standardized tests, and when those test-taking skills crowded out actual learning, learning fell by the wayside.

This continues up to the most advanced part of public education, the Advanced Placement courses that students aspiring to college are strongly pressured to take. If Common Core is rigid, AP is brittle to the point of shattering.

29/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/30

Anyone who's ever parented a kid through the US secondary school system knows how much time that kid spent learning to hit their marks on standardized assessments, to the exclusion of actual learning, and how soul-suckingly awful this is.

Take that staple of the AP assessment rubric: the five-paragraph essay (5PE), bane of students, teachers and parents everywhere:

https://www.insidehighered.com/blogs/just-visiting/kill-5-paragraph-essay

30/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/31

Speaking as a sometime writing teacher and international bestselling essayist, I can tell you that 5PEs are objectively bad essays. Their only virtue is that they can be assessed in a standard way, so the grade any given 5PE is awarded by one grader is likely to be the same grade it receives from any other grader. Grading an essay is an irreducibly subjective matter, and the only way to create an objective standard for essays is to make the essays unrecognizable as essays.

31/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/32

And yet, the 5PE is the heart of assessment for many AP classes, from History to English to Social Studies and beyond. A kid who scores high on any humanities AP will have put endless hours into perfecting this perfectly abominable literary form, mastering a skill that they will never, ever be called upon to use.

32/

Alan Langford 🇨🇦🇨🇦🧤🧤🧊🧊
@alan@mindly.social
Long thread/32

@pluralistic@mamot.fr Oh, that is so untrue. The 5PE is foundational to corporate white papers! Don't ask me how I know.

Cory Doctorow
@pluralistic@mamot.fr
Long thread/33

(The top piece of college entrance advice is "don't write your personal essay as a 5PE" and college professors spend the first half of their 101 classes teaching students not to turn in 5PEs.)

The same goes for many other aspects of AP and Common Core assessment. If you do AP Lit, you'll be required to annotate the literature you read by making a set number of marginal observations on every page of the novels, poems and essays you read.

33/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/34

Again, as a literary reviewer, novelist, and nonfiction writer who's written more than 30 books, I have to say, this is a batshit way to learn to analyze and criticize literature. Its sole virtue is that it reduces the qualitative matter of literary analysis to a quantitative target that students can hit and teachers can count.

34/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/35

And that's where AI comes in. AI - the ultimate bullshit machine - can produce a better 5PE than any student can, because the point of the 5PE isn't to be intellectually curious or rigorous, it's to produce a standardized output that can be analyzed using a standardized rubric.

I've been writing YA novels and doing school visits for long enough to cement my understanding that kids are actually pretty darned clever.

35/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/36

They don't graduate from high school thinking that their mastery of the 5PE is in any way good or useful, or that they're learning about literature by making five marginal observations per page when they read a book.

Given all this, why *wouldn't* you ask an AI to do your homework? That homework is already the revenge of Goodhart's Law, a target that has ruined its metric.

36/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/37

Your homework performance says nothing useful about your mastery of the subject, so why not let the AI write it? Hell, if you're a smart, motivated kid, then letting the AI write your bullshit 5PEs might give you time to write something good.

37/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/38

Teachers aren't to blame. They have to teach to the test, or they will fail their students (literally, because they will have to give them a failing grade, and figuratively, because students who get failing grades will face all kinds of punishments). Teachers' unions - who consistently fight against standardization and in favor of their members' discretion to practice their educational skills based on kids' individual needs - are our best hope:

https://pluralistic.net/2025/03/29/jane-mcalevey/#trump-is-a-scab

38/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/39

The right hates teachers and keeps on setting them up to fail. That hatred has no bottom. Take the Republican Texas State Rep Ryan Guillen, whose House Bill 462 will increase the state's school safety budget from $10/student to $100/student, with those additional funds earmarked to buy one armed drone per 200 students (these drones are supplied by a single company that has ties to Guillen):

https://dronelife.com/2024/12/08/texas-lawmaker-proposes-drones-for-school-security-a-less-lethal-solution/

39/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/40

Imagine how much Texas schools could do with an extra $90/student/year - how much more usefully that money could be spent if it were turned over to teachers. But instead, Rep Guillen wants to put "AI in schools" in the form of drones equipped with pepper-spray, flash bangs, and "lances" that can be smashed into people at 100mph.

40/

Cory Doctorow
@pluralistic@mamot.fr
Long thread/41

The problem with AI in schools isn't that students are using AI to do their homework. It's that schools have been turned into reward-hacking AIs by a system that hates the idea of an educated populace almost as much as it hates the idea of unionized teachers who are empowered to teach our kids.

41/
