But there isn't really a winning move if you're one of the people getting hollowed out — for us, the game is survival until the grift collapses
So I'm thinking about what it looks like to prepare for the Collapse, and I've been drawing an analogy to a culture that depends on gasoline for everything
(I mean, we do, and after "racism", "gasoline politics" is the second good answer to "why are Americans like that")
2/
So I see two poles for how to respond to "Peak Gasoline" in SFF
[bear with me while I digress; coming back to LLMs in a sec]
and they're basically
doomer hoarding and xenophobia (everyone for himself, and f your feelings) — think MAD MAX
vs
Solarpunk invention of new ways of living that... just don't use gasoline (everyone's in this together, so we'd better have a community garden, build transit, and learn first aid) — think THE TERRAFORMERS (thanks @annaleen@wandering.shop !)
3/
So, back to LLMs, or rather, back to prepping for "Peak AI"
(It's coming, let me tell you.)
Those of us currently under attack/usurpation by the push towards AI can respond by prepping toward either pole
The doom-prepper approach is to, I dunno, start planning a consultancy on "unbefunging your company's poorly-thought-out dependency on chatgpt", or building tools to poison the LLMs (or detect that poisoning)
4/
But what is the solarpunk approach to "Peak AI"?
Solarpunk offers a vision, in the face of Peak Gasoline, for planning community gardens, mutual aid societies, bike repair workshops, etc., even as the gasoline culture around us creaks and groans its way bloodily towards a reckoning
What is our (data science, machine learning, UX, design) equivalent of solarpunk for automation?
5/
Can we even imagine a world of automation without the exploitation, the fantasies of infinite growth, and the increasingly unsubtle justifications for "laundering toil away"?
What are compelling visions of the future of automation that _don't_ bury the exploitation behind an app or within a billion-parameter model?
6/fin (I think)