Hacker News

I have been saying since the release of Stable Diffusion that OpenAI is going to struggle as soon as competitors release their models as open source, especially once those models surpass GPT-3 and GPT-4.

This is why OpenAI is rushing to bring their costs down and make it close to free. However, Stable Diffusion is leading the race to the bottom and is already at the finish line, since no one else has been willing to release their model as open source and free.

As soon as someone releases a free and open-source ChatGPT equivalent, this will go just like what happened to DALL-E 2. This is just their way of locking you in; once the paid competitors cannot compete and shut down, the price increases come in.



Stable Diffusion isn’t free if you include the cost of the machine. Maybe you already have the hardware for some other reason, though?

To compare total cost of ownership for a business, you need to compare using someone else’s service to running a similar service yourself. There’s no particular reason to assume OpenAI can’t do better at running a cloud service.

Maybe someday you can assume end users have the hardware to run this client side, but for now that would limit your audience.


Ever heard of federated learning? That's the way this goes. Also, I run training with no matrix multiplication: just 3-bit weights and addition in log space. There's a slight accuracy degradation, but CPU-only training is much faster.
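The comment doesn't share details, but the general trick it alludes to can be sketched: if weights are stored as small log-domain codes, each multiply w * x turns into an addition of log-magnitudes, since log2(|w|*|x|) = log2|w| + log2|x|. A minimal NumPy illustration (all names and the quantization scheme here are my own assumptions, not the commenter's implementation):

```python
import numpy as np

def quantize_log_weights(w, bits=3):
    """Quantize |w| to `bits`-bit codes indexing a small log-domain table."""
    levels = 2 ** bits                          # 8 levels for 3-bit weights
    log_w = np.log2(np.abs(w))
    lo, hi = log_w.min(), log_w.max()
    step = (hi - lo) / (levels - 1)
    codes = np.round((log_w - lo) / step).astype(int)
    log_lut = lo + step * np.arange(levels)     # 8-entry decode table, built once
    return codes, log_lut, np.sign(w)

def log_domain_dot(codes, log_lut, sign, x):
    """Compute w @ x with each multiply replaced by an addition in log space."""
    # multiply -> add: log2(|w_i| * |x_i|) = log2|w_i| + log2|x_i|
    log_prods = log_lut[codes] + np.log2(np.abs(x))
    prods = sign * np.sign(x) * np.exp2(log_prods)
    return prods.sum()                          # accumulation stays in linear space

w = np.array([0.5, -1.25, 2.0, 0.75])
x = np.array([1.0, 2.0, 0.5, 4.0])
codes, log_lut, sign = quantize_log_weights(w)
approx = log_domain_dot(codes, log_lut, sign, x)
print(approx, float(w @ x))   # close, with slight quantization error
```

The "slight accuracy degradation" shows up as quantization error from the 3-bit log codes; the win is that the inner loop needs only table lookups and additions, which is why it can be fast on a plain CPU.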


Okay but I meant generating results, not training. If you're running Stable Diffusion, the weights are given, but it's not going to run on a random PC.


LLM Legend:

OpenAI = closed source not open AI

DogeLlamaInuGPT = open source AI


"not open AI" is redundant with "closed source"


I guess source is connected


huh, I never thought of that, thanks for pointing that out




