
Microsoft recently announced that they run ChatGPT 3.5 and 4 on MI300 on Azure and that the price/performance is better.

https://www.amd.com/en/newsroom/press-releases/2024-5-21-amd...



I've used ChatGPT on Azure. It sucks on so many levels; everything about it was clearly dictated by bean counters who see X dollars for Y FLOPS with zero regard for developers. So choosing AMD here would be about par for the course. There's a reason everyone at the top is racing to buy Nvidia cards and pay the premium.


"Everyone" at the top is also developing their own chips for inference and providing APIs so customers don't have to worry about CUDA.

It looks like the price/performance of inference workloads gives providers a big incentive to move away from Nvidia.


There are only like three AI companies with the technical capability and resources to afford that, and two of them either don't offer their chips to others or have gone back to Nvidia. The rest are manufacturers desperately trying to get a piece of the pie.



