
By how much? I don't understand the cost model here at all.


AIUI the idea is to rate-limit each "solution". A normal human's browser only needs to "solve" once. An LLM crawler either needs to slow down (= objective achieved) or solve the puzzle n times to get n × the request rate.
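
Roughly like this, as a minimal sketch of the general proof-of-work-plus-token pattern (not Anubis's actual code; DIFFICULTY, RATE_LIMIT, and all the names here are illustrative):

    import hashlib, os

    DIFFICULTY = 4     # leading zero hex digits required (illustrative)
    RATE_LIMIT = 100   # requests allowed per solved token (illustrative)

    def issue_challenge():
        return os.urandom(16).hex()

    def solve(challenge):
        # Client side: brute-force a nonce. This loop is the compute
        # cost the visitor pays; a browser pays it once per token.
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith("0" * DIFFICULTY):
                return nonce
            nonce += 1

    def verify(challenge, nonce):
        # Server side: checking a solution is a single hash, effectively free.
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        return digest.startswith("0" * DIFFICULTY)

    # Each verified solution becomes a token with its own request budget.
    budgets = {}

    def allow_request(token):
        remaining = budgets.get(token, RATE_LIMIT)
        if remaining <= 0:
            return False  # budget exhausted: solve a fresh puzzle
        budgets[token] = remaining - 1
        return True

The asymmetry is the point: verifying is one hash, solving is many, and a crawler that wants n × the request rate needs n live tokens, i.e. n solves.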


Let's say that adding Anubis adds 10 seconds of extra compute for the bot when it tries to access my website. Will that be enough to deter the bot/scraper?
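
Back-of-the-envelope, assuming each solved puzzle buys a token good for r requests: scraping N pages costs about (N / r) × 10 s of extra compute. With illustrative numbers r = 100 and N = 1,000,000:

    (1,000,000 / 100) solves × 10 s = 100,000 s ≈ 28 hours

per crawler identity, so "enough" comes down to whether ~28 CPU-hours per million pages is more than the scraper is willing to spend.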


Empirical evidence appears to show that it is ¯\_(ツ)_/¯



