Hacker News
RobotToaster on Jan 3, 2025
on: Can LLMs write better code if you keep asking them...
IIRC there was a post on here a while ago about how LLMs give better results if you threaten them or tell them that someone is threatening you (for instance, that you'll lose your job or die if the answer is wrong).
__mharrison__ on Jan 3, 2025
The author of that post also wrote this article and links back to it here.