kawazoe
@kawazoe@mstdn.ca

@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social Benchmarks are not stupid games with useless results. We're talking time to render a scene in Blender, FPS in modern games, SpecFlow Monte Carlo simulations, software compile times, etc. Those are real-life results taken from real-life projects that help you decide which chip will perform best for what you plan to do with it. They make or break a sale. Modern Intel chips are not good at any of those, and that's the problem. They did have many workloads where they performed faster than AMD... for the first 6 months. That's just the outcome of running 1.4 V through a chip that shouldn't continuously run above 1.2 V. It got so bad that Mozilla stopped monitoring bug reports for Firefox on Intel systems during the last European heatwave. It's not about efficiency. It's about selling a product they know will fail because that's their only hope of getting a sale. It's about fraud.

Nazo
@nazokiyoubinbou@urusai.social

@kawazoe@mstdn.ca @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social I didn't use the word "stupid." I used the word artificial.

As in, real-life systems do multiple tasks at once and must adapt in different ways, rather than producing a single pre-determined result over and over.

And most of what they're using for benchmarks isn't even those things. It's stuff like how fast office software loads and performs a few basic operations.

kawazoe
@kawazoe@mstdn.ca

@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social I can't believe it has come to this, but here's a screenshot of your post with every instance of the word stupid highlighted in yellow and every instance of the word artificial in red... Not a whole lot of red...

To your "multiple things at once" argument, I can't recall the last time I ran an AI training job and thought it'd be smart to play Cyberpunk on the same computer to kill some time. Sure, I might watch a video while I code and browse the web, but with those tasks, my CPU barely sees any usage. I see way more impact from badly optimized software like Teams, Outlook, and Windows than from what I'm actually trying to do. In that world, it doesn't matter what chip you buy; they'll all perform badly. You'll get a much stronger effect from changing the software you use. Does that mean single-task benchmarks are useless? No! When I hit compile, it doesn't matter if I have Teams open. I want it to be done quickly. If a chip compiles faster, it'll compile faster with Teams open too.

kawazoe
@kawazoe@mstdn.ca

@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social With that being said, I'm not too sure what you're arguing for exactly... The point here is that Intel's demise isn't a recent thing. They've been stuck in PR nightmares year after year as their bad decisions caught up with them. This has nothing to do with the competition with AMD. It's just people's expectation that a newer chip will either be faster than the previous one or cheaper, or else there's no reason to buy it. Intel's chips were neither for 8 years. They lied to people about it to save face while raising prices. They literally crashed the game streaming market with their bad chips, got caught, and now no one in the server or enthusiast world trusts them anymore; and we're all worse off for it, as AMD doesn't have any competition to keep them in check.