
This raises huge red flags for privacy. Presumably OpenAI keeps full logs of every interaction with ChatGPT / GPT-3, but this isn't self-evident when you're using them. And it feels intimate--it feels like you're talking to a person--and that builds trust. To say nothing of applications like therapy, or personal coaching...

This makes me feel that open-source LLMs that can be run efficiently on local hardware are an urgent need. Otherwise we will live in a cyberpunk dystopia with a centralized company that knows everything about you.
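
For concreteness, here is a rough sketch of what fully local inference can look like, using the Hugging Face transformers library with gpt2 as a small stand-in model (both are illustrative choices, not something the comment above specifies). Once the weights are downloaded and cached, prompts and outputs stay on your machine:

    # Rough sketch of local inference; library and model are illustrative choices.
    from transformers import pipeline

    # Weights are downloaded once and cached locally; after that, generation
    # runs entirely on this machine, with no calls to a hosted API.
    generator = pipeline("text-generation", model="gpt2")

    out = generator("I've been feeling anxious lately because",
                    max_new_tokens=40, do_sample=True)
    print(out[0]["generated_text"])

A small model like this is nowhere near ChatGPT in quality, but it shows the privacy property the comment is after: nothing is logged by a third party.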



> that builds trust. To say nothing of applications like therapy, or personal coaching...

I hope (naively?) that we never have a world with actual LLM therapy.

> open-source LLMs that can be run efficiently on local hardware are an urgent need

It’s really quite expensive otherwise, but considering Alexa et al. aren’t truly on-device yet, and LLMs are much more complex… it’ll be a while.


I couldn't agree more, but I feel so defeated by the already egregious privacy violations that pervade bleeding-edge and big tech companies. What's there to be done?



