Hacker News

It's amazing how much knowledge about the world fits into a 16 GiB distilled model.




This is early days, too. We're probably going to get better at this across more domains.

Local AI will eventually be booming. It'll be more configurable, adaptable, hackable. "Free". And private.

Crude APIs can only get you so far.

I'm in favor of intelligent models like Nano Banana over ComfyUI messes (the future is the model, not the node graph).

I still think we need the ability to inject control layers and have full access to the model, because we lose too much utility by not having it.

I think we'll eventually get Nano Banana Pro smarts slimmed down and running on a local machine.


> Local AI will eventually be booming.

With how expensive RAM currently is, I doubt it.


That's a short-term effect. Long term, Wright's law will kick in and RAM will end up cheaper as a result of all the demand. It's not like we're running into a fundamental bottleneck on how much RAM can be produced; it's just a question of how much capacity is currently set up.
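Wright's law says unit cost falls by a fixed fraction with each doubling of cumulative production. A minimal sketch of that relationship (the 20% learning rate and starting cost are illustrative assumptions, not real DRAM figures):

```python
import math

def wrights_law_cost(first_unit_cost: float, cumulative_units: float,
                     learning_rate: float = 0.20) -> float:
    """Cost of the nth unit under Wright's law: C(n) = C(1) * n**(-b),
    where b = -log2(1 - learning_rate)."""
    b = -math.log2(1.0 - learning_rate)
    return first_unit_cost * cumulative_units ** (-b)

# Each doubling of cumulative output cuts unit cost by the learning rate:
c1 = wrights_law_cost(100.0, 1)  # 100.0
c2 = wrights_law_cost(100.0, 2)  # 80.0 (one doubling: 20% cheaper)
c4 = wrights_law_cost(100.0, 4)  # 64.0 (two doublings)
```

The point of the curve: price spikes from a demand shock don't change the long-run trajectory, only where on it we currently sit.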

I’m old enough to remember many memory price spikes.

I remember saving up for my first 128 MB stick, and the next week it had roughly tripled in price.

Do you also remember when everybody was waiting for crypto to cool off to buy a GPU?

It's temporary. Sam Altman booked all the supply for a year. Give it time to unwind.



Is this a joke?

Image and video models are some of the most useful tools of the last few decades.





