
I think we’re going to see greatly increased use of FPGAs in AI applications. They can be very good at matrix multiplication. Imagine an AI layer that retunes the FPGA based on incoming model requirements: need an LPU like Groq's? Done. I would bet Apple Silicon gets some sort of FPGA in the Neural Engine.
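To make the matmul point concrete, here is a minimal sketch (the function name is hypothetical, and this is not any vendor's toolflow) of matrix multiply written as explicit multiply-accumulate (MAC) steps, the structure that FPGA synthesis tools typically unroll into a parallel grid of DSP blocks:

    # Illustrative sketch only: C = A @ B as explicit multiply-accumulate
    # (MAC) steps. On an FPGA, each inner MAC maps onto a DSP slice, and
    # unrolling the loops yields a parallel (systolic-style) grid of them.
    def mac_matmul(A, B):
        rows, inner, cols = len(A), len(B), len(B[0])
        C = [[0] * cols for _ in range(rows)]
        for i in range(rows):
            for j in range(cols):
                acc = 0
                for k in range(inner):
                    acc += A[i][k] * B[k][j]  # one MAC per (i, j, k)
                C[i][j] = acc
        return C

    print(mac_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]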


But ASICs perform far faster and more efficiently. I doubt even the gain you would get from "retuning" the FPGA would be enough to outweigh the advantage of a general-purpose processor, a GPU, or an ASIC.


Until you need floating point performance.
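For context on the floating-point remark: FPGA hard DSP blocks multiply integers cheaply, so FPGA inference pipelines usually quantize to fixed point rather than spend fabric on float alignment and normalization. A minimal sketch, with hypothetical scale factors chosen only for illustration:

    # Illustrative sketch: int8 fixed-point MAC as an FPGA DSP block would
    # perform it, with one float rescale at the end. Scales are hypothetical.
    def quantize(x, scale):
        q = round(x / scale)           # float -> fixed-point code
        return max(-128, min(127, q))  # clamp to the int8 range

    def fixed_point_dot(xs, ws, x_scale, w_scale):
        acc = 0  # wide integer accumulator (int32 on real hardware)
        for x, w in zip(xs, ws):
            acc += quantize(x, x_scale) * quantize(w, w_scale)  # int8 * int8
        return acc * (x_scale * w_scale)  # dequantize once, outside the loop

    xs, ws = [0.5, -1.25, 2.0], [1.0, 0.75, -0.5]
    print(fixed_point_dot(xs, ws, x_scale=0.02, w_scale=0.01))  # ~ -1.43
    print(sum(x * w for x, w in zip(xs, ws)))                   # -1.4375 exact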



