
nixCraft 🐧🐧
@nixCraft@mastodon.social

Intel laid off several employees who were key maintainers of Linux kernel drivers. As a result, many drivers, including those for CPU temperature and the Slim Bootloader, are now "Orphaned" and lack anyone to maintain them: https://www.phoronix.com/news/Intel-More-Orphans-Maintainers
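
("Orphaned" here refers to the status field in the kernel's MAINTAINERS file. As a rough illustration, not taken from the article, here is a minimal Python sketch that lists entries marked Orphan in a kernel checkout; the file path and the blank-line-separated entry layout are assumptions about the file's usual format.)

```python
# Rough sketch: list subsystems marked "Orphan" in a Linux kernel
# checkout's MAINTAINERS file. Assumes entries are separated by blank
# lines and carry status lines of the form "S:<tab>Orphan".
from pathlib import Path

text = Path("MAINTAINERS").read_text(errors="replace")

for block in text.split("\n\n"):
    lines = block.splitlines()
    if not lines:
        continue
    if any(line.startswith("S:") and "orphan" in line.lower()
           for line in lines):
        print(lines[0].strip())  # the first line of an entry is its title
```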


Colin B. 🇨🇦🇨🇦
@swordgeek@mstdn.ca

@nixCraft@mastodon.social I'm almost retired from IT. I've run Cyrix and AMD, but mostly Intel for 35 years.

My next PC (this fall) will be AMD (and Linux), and I just don't see much future for Intel at this rate.

labbatt50
@labbatt50@mastodon.world

@nixCraft@mastodon.social

Typical of Big Tech
sad

Nazo
@nazokiyoubinbou@urusai.social

@nixCraft@mastodon.social What the bleep is going on? How can Intel of all companies be having these kinds of problems?

B05H
@bosh@infosec.exchange

@nixCraft@mastodon.social the age of Intel is over, time for me to get a new chip/mobo

nixCraft 🐧🐧
@nixCraft@mastodon.social

@nazokiyoubinbou@urusai.social they fired 24k employees as Intel is falling behind in its core CPU biz due to competition from ARM, mobile CPUs, Apple, AMD, and others.

Nazo
@nazokiyoubinbou@urusai.social

@nixCraft@mastodon.social These competitors have existed for ages, though. It's quite strange to see this happening so suddenly now. ARM, MIPS, etc. have been around forever, and in mobile devices for probably close to two decades. Apple only switched to Intel relatively recently, then simply switched away again (they used PowerPC before). AMD has been competing with Intel since the 80s (and back then Intel had more than just AMD to compete with, even in the x86 market; let's especially not forget Cyrix, who really gave them a run for their money for a while).

None of that is new.

And it isn't as if there aren't steps they could take to get costs down further. For example, while AMD holds patents, I'm sure Intel could still find some way to do chiplets.


Madagascar_Sky
@Madagascar_Sky@mastodon.social

@nazokiyoubinbou@urusai.social @nixCraft@mastodon.social

Don't quote me on this, but apparently this is the story I heard on the internet.

The problem seems to be that the proper path was being pursued by Pat Gelsinger. It was going to be hard and it was going to be unpopular, but it would've brought the company out of the rut.

But then the share price dropped, the board fired him, and they brought in a yes man. The yes man is just 'maximising profit' and parting out the company to be sold off.

EDIT.


kawazoe
@kawazoe@mstdn.ca

@Madagascar_Sky@mastodon.social @nazokiyoubinbou@urusai.social @nixCraft@mastodon.social there's more to it than that. The last 8 years of Intel CPUs have had serious trouble. Past the 8000 series, their performance stagnated during the early Ryzen era, giving AMD an enormous lead. That's when Pat got hired again, and we saw the 12th, 13th, and 14th Gen CPUs show up with massive performance gains out of nowhere... except that last year, we learned how they did it. The chips are eating so much power that they're frying themselves. The 13700K and 13900K specifically can see damage after only 6 months of use. They corrected the problem for the Ultra 200 series, but as a result, those chips showed a significant regression in performance from their predecessors. No one is buying their chips anymore.

Basically, what I'm saying is, this is not new. The writing has been on the wall for at least 8 years now that they couldn't keep up with AMD. Sadly, this is also why AMD nearly doubled the price of their chips over the same period.

Zimmie
@bob_zim@infosec.exchange

@Madagascar_Sky@mastodon.social @nazokiyoubinbou@urusai.social @nixCraft@mastodon.social It’s a combination of things, but that’s certainly one of them.

Ultimately, Intel was overconfident and took too long to realize they were in trouble. They fell behind on fab tech, and their primary core design ran out of gas for a while (they focused on performance, expecting fab advances to keep power usage in check). Atom was a plan to fix that, but it took longer than they could afford to start getting good.

Now they're killing or selling off all the "non-core" parts of the business like OPA, Barefoot, flash dies and SSD controllers, and so on. I hope they at least keep going with Arc.

Nazo
@nazokiyoubinbou@urusai.social

@kawazoe@mstdn.ca @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social This stupid game Intel and AMD are playing where benchmarks come first is literally killing Intel then... The chiplet design definitely handles this stuff better, and Intel is just throwing more cores in there to get barely higher numbers. Not worth it. It's hard to believe they won't just let this stupid game go. It's not benchmarks that matter, it's real-life results. Intel is better at some things and they should just focus on that. AMD is better at some other things (including what should be a lower cost) and they should focus on that.

Heck, I'd argue the 9000X series basically has its own equivalent to "economy cores." CCD1 in my 9900 can run at insanely low voltages and barely generates any heat, but it runs slightly slower than CCD0.
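
(If you want to check that kind of per-CCD behaviour yourself, here's a minimal sketch that reads the per-CCD temperatures the k10temp driver exposes through hwmon on Linux; the "Tccd1"/"Tccd2" labels and exact sysfs paths depend on the kernel and CPU, so treat them as assumptions.)

```python
# Rough sketch: print per-CCD temperatures exposed by the k10temp
# hwmon driver on Linux (labels like "Tccd1"/"Tccd2" on recent Ryzen
# parts; exact labels and paths vary by kernel and CPU).
from pathlib import Path

for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
    if (hwmon / "name").read_text().strip() != "k10temp":
        continue
    for label_file in sorted(hwmon.glob("temp*_label")):
        label = label_file.read_text().strip()
        value_file = hwmon / label_file.name.replace("_label", "_input")
        millideg = int(value_file.read_text().strip())  # millidegrees C
        print(f"{label}: {millideg / 1000:.1f} C")
```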

kawazoe
@kawazoe@mstdn.ca

@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social Benchmarks are not stupid games with useless results. We're talking time to render a scene in Blender, FPS in modern games, SpecFlow Monte Carlo simulations, software compile time, etc. Those are real-life results taken from real-life projects that help you decide which chip will perform best for what you plan to do with it. They make or break a sale. Modern Intel chips are not good at any of those, and that's the problem. They did have many workloads where they performed faster than AMD... for the first 6 months. That's just the outcome of running 1.4V through a chip that shouldn't continuously use more than 1.2V. It's gotten so bad that Mozilla stopped monitoring bug reports for Firefox on Intel systems during the last European heatwave. It's not about efficiency. It's about making a product they know will fail because that's their only hope of getting a sale. It's about fraud.


Nazo
@nazokiyoubinbou@urusai.social

@kawazoe@mstdn.ca @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social I didn't use the word "stupid." I used the word "artificial."

As in, real-life systems are doing multiple tasks at once and must adapt in different ways, rather than producing a single pre-determined result over and over.

And most of what they're using for benchmarks aren't even those things. They're things like how fast office software loads and performs a few basic operations.


kawazoe
@kawazoe@mstdn.ca

@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social I can't believe it has come to this, but here's a screenshot of your post with every instance of the word "stupid" highlighted in yellow, and of the word "artificial" in red... Not a whole lot of red...

To your "multiple things at once" argument: I can't recall the last time I ran an AI training job and thought it'd be smart to play Cyberpunk on the same computer to kill some time. Sure, I might watch a video while I code and browse the web, but with those things my CPU barely sees any usage. I see way more effect from badly optimized software like Teams, Outlook, and Windows than from what I'm actually trying to do. In that world, it doesn't matter what chip you buy; they'll all perform badly. You'll get a much stronger effect from changing the software you use. Does that mean single-task benchmarks are useless? No! When I hit compile, it doesn't matter if I have Teams open. I want it to be done quickly. If a chip does it faster, it'll do it faster with Teams open.
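
(For what it's worth, that kind of single-task comparison is easy to script. Here's a minimal sketch, with a placeholder `make -j` build standing in as the workload; swap in whatever command you actually care about.)

```python
# Minimal sketch: time a fixed command a few times and report the median
# wall-clock duration, as a crude single-task benchmark. The build
# command below is a placeholder; substitute your real workload.
import os
import statistics
import subprocess
import time

CMD = ["make", "-j", str(os.cpu_count() or 1)]  # placeholder workload
RUNS = 5

durations = []
for _ in range(RUNS):
    start = time.perf_counter()
    subprocess.run(CMD, check=True, capture_output=True)
    durations.append(time.perf_counter() - start)

print(f"median over {RUNS} runs: {statistics.median(durations):.2f} s")
```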


kawazoe
@kawazoe@mstdn.ca

@nazokiyoubinbou@urusai.social @Madagascar_Sky@mastodon.social @nixCraft@mastodon.social With that being said, I'm not too sure what you're arguing for exactly... The point here is that Intel's demise isn't a recent thing. They've been stuck in PR nightmares year after year as their bad decisions caught up with them. This has nothing to do with the competition with AMD. It's just people's expectation that a newer chip will either be faster than the previous one or cheaper, or else there's no reason to buy it. Intel's chips were none of that for 8 years. They lied to people about it to save face while raising their prices. They literally crashed the game-streaming market with their bad chips, got caught, and now no one in the server or enthusiast world trusts them anymore; and we're all worse for it, as AMD doesn't have any competition to regulate them.
