@foolishowl@social.coop
I was looking at this philosophy paper again.
Frontiers | Naturalizing relevance realization: why agency and cognition are fundamentally not computational
https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2024.1362658/full
@foolishowl@social.coop
It reinforces my belief that the idea of LLMs as a step towards AGI is nonsense.
The ability to recognize what is relevant is a precondition of agency.
Agency precedes consciousness.
Consciousness precedes language.
Consciousness does not emerge from language. You cannot have consciousness without agency.
A paramecium has an agency and awareness that an entire data center, for all its catastrophic waste of resources, cannot achieve.