Aron
@Aron@nerdculture.de

@SonstHarmlos@sueden.social @denschub@mastodon.schub.social

Who creates the output when using:
- a typewriter?
- a word processor?
- an LLM with a prompt?
It's a very bad comparison, to my mind.

I agree that LLMs, used as in ChatGPT and similar tools, won't just disappear, but they most probably won't surpass a certain level, as that limit is inherent to how they work internally. One can try to mitigate the drawbacks, but that won't solve the underlying problems. I will happily be proven wrong on this one.

Being open and curious while at the same time calling it "anti-AI sentiment" is more than a bit confusing, as it suggests there is no critical assessment of new technologies. It's good to acknowledge the advantages, but turning a blind eye to the disadvantages and pointing fingers at existing, proven technologies by saying "XY isn't good with those either" suggests you might be buying into a hype.

I think I've laid out all my points in the previous comments, so I won't reply further if it would mean repeating myself.

SonstHarmlos
@SonstHarmlos@sueden.social

@Aron@nerdculture.de @denschub@mastodon.schub.social I rarely use LLMs except for coding experiments. For example, I made the conscious decision to have only a single LLM app on my iPhone, which is European (Mistral Le Chat), and I use it mainly for generating alt text for some images posted here. I also made the conscious decision not to pay for coding LLMs used privately (cost corresponds roughly with token usage/computing power, which in turn corresponds roughly with energy usage/CO2 emissions).