AIUI the idea is to rate-limit each "solution". A normal human's browser only needs to "solve" once. An LLM crawler either needs to slow down (= objective achieved) or solve the puzzle n times to get n × the request rate.
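As a sketch of what I mean (this is not Anubis's actual implementation, just the shape of the idea; the token format, per-token budget, and storage are all invented):

```python
# Rough sketch of per-solution rate limiting: each solved proof-of-work token
# gets its own request budget, so n× the request rate costs n solutions.
import time
from collections import defaultdict

REQUESTS_PER_MINUTE = 60  # hypothetical per-token budget

# request timestamps keyed by the proof-of-work token that was presented
_request_log: dict[str, list[float]] = defaultdict(list)

def allow_request(pow_token: str) -> bool:
    """Allow a request only if this solved token is still under its rate limit."""
    now = time.time()
    recent = [t for t in _request_log[pow_token] if now - t < 60.0]
    if len(recent) >= REQUESTS_PER_MINUTE:
        _request_log[pow_token] = recent
        return False  # this solution is exhausted; solve again or slow down
    recent.append(now)
    _request_log[pow_token] = recent
    return True
```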
Let's say that adding Anubis does the job of adding 10 seconds of extra compute for the bot when it tries to access my website. Will this be enough to deter the bot/scraper?
And if so, by how much? I don't understand the cost model here at all.
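To make the question concrete, here's the kind of arithmetic I have in mind (every number below is an assumption, not a measurement):

```python
# Back-of-envelope framing of the cost question. All inputs are made up:
# solve time from my example above, the rest are guesses.
solve_seconds = 10            # assumed extra compute per "solution"
requests_per_solution = 60    # assumed requests allowed before re-solving
cpu_dollars_per_hour = 0.05   # assumed price of a cheap vCPU

cost_per_solution = (solve_seconds / 3600) * cpu_dollars_per_hour
cost_per_request = cost_per_solution / requests_per_solution
print(f"${cost_per_solution:.6f} per solution, ${cost_per_request:.8f} per request")
```

Is that the right way to frame it, or is the real cost somewhere else (latency, engineering effort, detection risk)?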