Read this and then decide.
One possibly workable solution for del.icio.us would be to implement a penalty box for scrapers that pull too often. Give each scraper a quota of x pulls per hour. If a bot goes above that quota, decrement the quota. If it keeps violating the quota, the quota eventually reaches zero and the scraper is locked out for good. As a scraper gets close to its quota, start including a warning message directly in the stream of data... something the users will see that explains the problem. Scrapers that want to play fair can pay del.icio.us for a higher quota and fewer warnings.
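The penalty-box idea above could be sketched roughly like this. Everything here is illustrative: the class name, the thresholds, and the hourly window size are my assumptions, not anything del.icio.us has published.

```python
import time

class PenaltyBox:
    """Sketch of the penalty-box idea: each scraper gets an hourly quota
    that shrinks every time it is exceeded, eventually reaching zero
    (permanent lockout). All numbers are illustrative assumptions."""

    def __init__(self, initial_quota=100, penalty=10, warn_ratio=0.8):
        self.quota = initial_quota    # allowed pulls per hour
        self.penalty = penalty        # quota reduction per violation
        self.warn_ratio = warn_ratio  # start warning at 80% of quota
        self.pulls = 0                # pulls in the current window
        self.window_start = time.time()

    def _maybe_reset_window(self, now):
        # Start a fresh counting window every hour.
        if now - self.window_start >= 3600:
            self.pulls = 0
            self.window_start = now

    def pull(self, now=None):
        """Record one pull; return a (status, message) pair where status
        is 'ok', 'warn', 'violation', or 'locked'."""
        now = time.time() if now is None else now
        self._maybe_reset_window(now)
        if self.quota <= 0:
            return "locked", "Access revoked; contact us to restore it."
        self.pulls += 1
        if self.pulls > self.quota:
            # Violation: shrink the quota toward zero.
            self.quota = max(0, self.quota - self.penalty)
            return "violation", f"Over quota; hourly limit cut to {self.quota}."
        if self.pulls >= self.warn_ratio * self.quota:
            # Warning that would be injected directly into the data stream.
            return "warn", f"{self.pulls}/{self.quota} pulls used this hour."
        return "ok", ""
```

A paying scraper would simply be constructed with a larger `initial_quota` and a `warn_ratio` closer to 1, giving more pulls and fewer warnings.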
Is Technorati playing fair?
jasnell