> This feels comparable to Intel's Battlemage offering of GPUs.

I'd very much like a card that can run LLMs with Ollama or vLLM without bankrupting me, ideally with reasonably low power usage.
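
For concreteness, this is the kind of workload I mean; a minimal vLLM sketch, assuming vLLM is installed and the card has enough VRAM for the chosen model (the model name is just an example):

    # Minimal offline-inference sketch with vLLM.
    # Assumption: enough VRAM for the model; the model name is illustrative.
    from vllm import LLM, SamplingParams

    llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")
    params = SamplingParams(temperature=0.7, max_tokens=128)
    outputs = llm.generate(["Why does VRAM size matter for LLM inference?"], params)
    print(outputs[0].outputs[0].text)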

Nvidia L4 cards (24 GB, 72 W) seem to fit the bill for power usage and capability, but the prices are so far out there that, for me, they might as well be H100s (I can afford neither).

So I'd very much welcome the Intel Arc Pro B60 cards, or honestly anything I could actually buy online. Until then, I'm stuck throwing money at OpenRouter and other API providers every month.
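
The stopgap API path looks roughly like this; a minimal sketch assuming the openai Python client against OpenRouter's OpenAI-compatible endpoint, with the API key in an environment variable (the model slug is just an example):

    # Stopgap: pay-per-token via OpenRouter instead of local inference.
    # Assumptions: OPENROUTER_API_KEY is set; the model slug is illustrative.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
    resp = client.chat.completions.create(
        model="meta-llama/llama-3.1-8b-instruct",
        messages=[{"role": "user",
                   "content": "Summarize the tradeoffs of local vs hosted LLM inference."}],
    )
    print(resp.choices[0].message.content)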