AI6YR Ben
@ai6yr@m.ai6yr.org

πŸ˜‚πŸ˜‚

"A man gave himself bromism, a psychiatric disorder that has not been common for many decades, after asking ChatGPT for advice and accidentally poisoning himself, according to a case study published this week in the Annals of Internal Medicine. "

https://www.404media.co/guy-gives-himself-19th-century-psychiatric-illness-after-consulting-with-chatgpt/

#LLMs #ChatGPT #medicine


Jeremy Janzen :mstdnca:
@jeremyjanzen@mstdn.ca

@ai6yr@m.ai6yr.org Bromism sounds like a religious movement started by podcast bros.

Butch Henderson :ani_clubtwit:
@ButchH11@twit.social

@ai6yr@m.ai6yr.org wow don't do any other research and just believe what the idiot chat tells you...

Lauren Weinstein
@lauren@mastodon.laurenweinstein.org

@ButchH11@twit.social @ai6yr@m.ai6yr.org NEVER blame the users. Expecting people to "do their own research" in the face of answers provided by these systems given the MASSIVE HYPE Big Tech is spewing out about them is just wrong.

MsMerope
@MsMerope@sfba.social

@lauren@mastodon.laurenweinstein.org @ButchH11@twit.social @ai6yr@m.ai6yr.org
I can remember back in the good old days on the Internet, when, if someone asked a very simple question, people would get irritated and say why don’t you just Google it?

AI6YR Ben
@ai6yr@m.ai6yr.org

@lauren@mastodon.laurenweinstein.org @ButchH11@twit.social IT'S A PHD LEVEL AI, YOU NEVER QUESTION THE PHD LEVEL BS SOFTWARE

Lauren Weinstein
@lauren@mastodon.laurenweinstein.org

@ai6yr@m.ai6yr.org @ButchH11@twit.social That is certainly what Big Tech hopes, despite their disclaimers that refuse to take responsibility for all the damage they're doing.

AI6YR Ben
@ai6yr@m.ai6yr.org

@MsMerope@sfba.social @lauren@mastodon.laurenweinstein.org @ButchH11@twit.social LOL isn't that funny. I now ask the Fediverse as often as I look in those engines, because it's all AI generated crap. Most specifically -- it's stuff that is ALMOST correct, looks good, but is missing the specific details that are the most important part. All preceded by three pages of "here's the history of the use of bolts by humankind".

Question: "what are the specifications for part number 12345 and what size bolt does it need?"

Answer: "This is an interesting problem. Throughout history, people have always wondered what the specifications for part number 12345 are. In this page, we'll delve into the history of part number 12345, and learn more about how part number 12345 has changed over the years...."

bbdd333
@bbdd333@infosec.exchange

@lauren@mastodon.laurenweinstein.org @ai6yr@m.ai6yr.org @ButchH11@twit.social

At least the AIs are starting to realize how much they suck

https://www.businessinsider.com/gemini-self-loathing-i-am-a-failure-comments-google-fix-2025-8

Jestbill
@Jestbill@mastodon.world

@ai6yr@m.ai6yr.org @MsMerope@sfba.social @lauren@mastodon.laurenweinstein.org @ButchH11@twit.social

https://udm14.com/ is s'posed to be google w/o ai:
No results found for Question: "what are the specifications for part number 12345 and what size bolt does it need?".

noai.duckduckgo.com Search:
No results found for Question: "what are the specifications for part number 12345 and what size bolt does it need?"