Hacker News

You're way, way underthinking the problem.

Right now, people who work at your bank are pasting your personal details into language models and asking whether you deserve a loan. Others will figure out how to get that data back out.

"But the model only uses old training data" there are myriad lawsuits in flight where these companies took information they shouldn't have, in all forms. Prompt engineers have already got the engines to spew things that are legally damning.

And that's before a real hack, which could exfiltrate archived user inputs as well as training data. We're only beginning to imagine the nightmare.

Can people use these language models to get private Google info? Only if Google was dumb enough to include it in the training data. Hint: yes, and it's much, much worse than anyone imagines.
