Open source devs are fighting AI crawlers with cleverness and vengeance
Sites hosting free and open-source (FOSS) projects make most of their infrastructure publicly accessible, and they tend to run on far fewer resources than commercial products. Many AI crawlers ignore the Robots Exclusion Protocol's robots.txt file, rotate through different IP addresses, and spoof their user agents to pass as ordinary visitors, all of which makes them hard to block. Anubis is a reverse proxy that imposes a proof-of-work check which must be passed before a request is allowed to reach the server: it blocks bots while letting through browsers operated by humans. It has become popular across the FOSS community, since many projects are facing the same problem.
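To make the mechanism concrete, here is a minimal Go sketch of the general idea behind an Anubis-style proof-of-work gate: a request must carry a nonce whose hash meets a difficulty target before the proxy forwards it to the origin. This is a simplification, not Anubis's actual implementation (Anubis serves a JavaScript challenge page and verifies signed results in a cookie); the header name, difficulty, secret, and ports below are all hypothetical.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"log"
	"net"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strings"
)

// Illustrative difficulty: the hash must start with this many zero hex digits.
const difficulty = 4

// Hypothetical server-side secret; a real deployment would issue signed,
// expiring challenges rather than derive them from a static string.
const secret = "change-me"

// validProof checks that sha256(challenge + nonce) meets the difficulty target.
func validProof(challenge, nonce string) bool {
	sum := sha256.Sum256([]byte(challenge + nonce))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	origin, err := url.Parse("http://localhost:8080") // the server being protected
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(origin)

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Tie the challenge to the client IP so solved nonces can't be shared.
		ip, _, _ := net.SplitHostPort(r.RemoteAddr)
		challenge := secret + ip

		// "X-PoW-Nonce" is a made-up header; Anubis itself runs the solver
		// in the visitor's browser via JavaScript.
		if !validProof(challenge, r.Header.Get("X-PoW-Nonce")) {
			http.Error(w, "proof-of-work required", http.StatusForbidden)
			return
		}
		proxy.ServeHTTP(w, r) // valid proof: forward the request upstream
	})
	log.Fatal(http.ListenAndServe(":9090", nil))
}
```

The economics are the point of the design: a human's browser pays the small CPU cost once, while a crawler hitting thousands of pages either pays it on every request or, if it doesn't execute JavaScript at all, never gets past the challenge.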