@kawazoe@mstdn.ca
@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social I can't believe it has come to this, but here's a screenshot of your post with every instance of the word "stupid" highlighted in yellow, and every instance of the word "artificial" in red... Not a whole lot of red...
To your "multiple things at once" argument: I can't recall the last time I ran an AI training job and thought it'd be smart to play Cyberpunk on the same computer to kill some time. Sure, I might watch a video while I code and browse the web, but with those things, my CPU barely sees any usage. I see way more impact from badly optimized software like Teams, Outlook, and Windows than from what I'm actually trying to do. In that world, it doesn't matter what chip you buy; they'll all perform badly. You'll get a much stronger effect from changing the software you use. Does that mean single-task benchmarks are useless? No! When I hit compile, it doesn't matter whether Teams is open: I want it done quickly. If a chip compiles faster, it'll compile faster with Teams open too.
@kawazoe@mstdn.ca
@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social With that being said, I'm not too sure what you're arguing for exactly... The point here is that Intel's demise isn't a recent thing. They've been stuck in PR nightmares year after year as their bad decisions caught up with them. This has nothing to do with the competition with AMD. It's just people's expectation that a newer chip will either be faster than the previous one or cheaper; otherwise there's no reason to buy it. Intel's chips were neither of those things for 8 years. They lied to people about it to save face while raising their prices. They literally crashed the game streaming market with their bad chips, got caught, and now no one in the server or enthusiast world trusts them anymore; and we're all worse off for it, as AMD doesn't have any competition to keep them in check.