✧✦Catherine✦✧
@whitequark@mastodon.social

on one hand, the upcoming financial crash will make us all even more miserable

on the other one, the third* AI winter can't come quickly enough


Joel Michael
@jpm@aus.social

@whitequark@mastodon.social and on the third hand, there’s going to be a whole heap of very powerful but also very useful hardware going very cheap (see also: FPGAs on ex-cryptocurrency systems now that they’re unprofitable)

Jamie
@jamie@social.memes.nz

@jpm@aus.social i hope this is true and the hardware isn't over-specialized on matrix multiplication of 8-bit floats at this point

Demi Marie Obenour
@alwayscurious@infosec.exchange

@jamie@social.memes.nz @jpm@aus.social I would not at all be surprised if AMD is behind on ML in part because they want to keep performance for HPC workloads high.

Joel Michael
@jpm@aus.social

@jamie@social.memes.nz it appears the spicy autocarrot folks are still in the GPU to FPGA migration that the cryptobros went through a few years ago (and fuck AMD/Xilinx in particular for leaning into that bullshit). Once they hit the FPGA to ASIC migration, the ASICs are usually completely and utterly useless for anything else.

Demi Marie Obenour
@alwayscurious@infosec.exchange

@jpm@aus.social @jamie@social.memes.nz Machine learning skipped the FPGA step and went straight to ASICs. The good news is that the ASICs still have other good uses beyond LLMs, notably in the medical field.