This raises huge red flags for privacy. Presumably OpenAI keeps full logs of every interaction with ChatGPT / GPT-3, but this isn't self-evident when you're using them. And it feels intimate--it feels like you're talking to a person--and that builds trust. To say nothing of applications like therapy, or personal coaching...
This makes me feel that open-source LLMs that can be trained efficiently and run on local hardware are an urgent need. Otherwise we will live in a cyberpunk dystopia with a centralized company that knows everything about you.
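To make "run on local hardware" concrete, here's a rough sketch of what local inference already looks like with the Hugging Face transformers library (gpt2 is just a small stand-in model for illustration; any open-source LLM checkpoint you can fit in memory would slot in the same way):

    # rough sketch: local text generation with an open model via transformers
    # (gpt2 is only a small placeholder; swap in a stronger open-source LLM)
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = "Open-source language models matter because"
    out = generator(prompt, max_new_tokens=50, do_sample=True)
    print(out[0]["generated_text"])

Nothing leaves your machine in that loop, which is exactly the point.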
I couldn't agree more, but I feel so defeated by the already egregious violations of privacy that pervade bleeding-edge and big tech companies. What's there to be done?
ChatGPT is cool and all, but this breathless hype is starting to feel like “blockchain will take over everything” hype from a few years back. Shiny new toy hype.
I try to be skeptical about new tech. But I honestly don't see GPT as anything like blockchain. Blockchain was always opposed by the establishment; it was a scrappy hacker thing trying to disrupt traditional finance. By contrast, GPT is born of the monopolies, and it has such immediately obvious applications.
This has so many uses in everything I do every day. I use it so much already that I think I might be developing a dependence. That's the sign, at least anecdotally, that the hype is warranted.