@whitequark@mastodon.social
on one hand, the upcoming financial crash will make us all even more miserable
on the other one, the third* AI winter can't come quickly enough
@whitequark@mastodon.social and on the third hand, thereβs going to be a whole heap of very powerful but also very useful hardware going very cheap (see also: FPGAs on ex-cryptocurrency systems now that theyβre unprofitable)
@jpm@aus.social i hope this is true and the hardware isn't over-specialized on matrix multiplication of 8-bit floats at this point
@jamie@social.memes.nz @jpm@aus.social I would not at all be surprised if AMD is behind on ML in part because they want to keep performance for HPC workloads high.
@jamie@social.memes.nz it appears the spicy autocarrot folks are still in the GPU to FPGA migration that the cryptobros went through a few years ago (and fuck AMD/Xilinx in particular for leaning into that bullshit). Once they hit the FPGA to ASIC migration, the ASICs are usually completely and utterly useless for anything else.
@jpm@aus.social @jamie@social.memes.nz Machine learning skipped the FPGA step and went straight to ASICs. The good news is that the ASICs still have other good uses beyond LLMs, notably in the medical field.