
> Are gaming GPUs like a 3080 not powerful enough?

It really depends on the model. Taking memory capacity as the first constraint: Meta's SAM ships as a roughly 2.4GB checkpoint with ~636 million parameters, and that trained model fits comfortably on a 12GB 3080 Ti. How fast it can compute predictions on a single 3080 Ti is a different story; in SAM's case it does well, but that ultimately depends on how complex the given model is (not the only variable, but a big one).
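As a rough sanity check, raw weight storage scales linearly with parameter count and precision. A minimal sketch (the ~636M figure is my assumption for SAM's largest ViT-H variant; real VRAM usage adds activations, optimizer state for training, and framework overhead on top of this):

```python
def param_memory_gb(num_params: int, bytes_per_param: int = 4) -> float:
    """Raw weight storage in GB: fp32 = 4 bytes/param, fp16/bf16 = 2."""
    return num_params * bytes_per_param / 1024**3

# Illustrative: a ~636M-parameter model stored in fp32 vs fp16.
print(f"fp32: {param_memory_gb(636_000_000):.2f} GB")     # ~2.37 GB
print(f"fp16: {param_memory_gb(636_000_000, 2):.2f} GB")  # ~1.18 GB
```

So the weights alone sit well under a 12GB card's capacity; whether inference is *fast* is governed by compute, batch size, and input resolution, not just whether the checkpoint fits.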

> don’t understand why there aren’t more options to run locally

I think it's likely that you haven't been looking in the right places for local solutions. The deep learning space is very well represented in open source at the moment, across a wide set of verticals: language models, computer vision, speech recognition, voice synthesis, etc. You don't always get the white-glove UX that SaaS can offer, but that's true of much of the rest of the OSS world as well.

EDIT: Wanted to note that I use both a 3080 Ti and my M2 Max for a variety of DL tasks (both for training and inference).


