Please do explain how you'd engineer a site to deal with a barrage of poorly written scrapers descending upon it. After you've done geo-IP routing, implemented several levels of caching, separated read and write traffic, and bought an ever-increasing amount of bandwidth, what is there left to do?
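For what it's worth, the layer that tends to blunt sloppy scrapers the most is aggressively caching anonymous read traffic in front of the origin. Here's a minimal sketch of that idea as a tiny reverse proxy in Go; the origin URL, listen port, 60-second TTL, and "no cookie means anonymous" check are illustrative assumptions on my part, not anything from the comment above.

```go
// Sketch: cache anonymous GET responses in memory so repeated scraper hits
// on the same URLs never reach the origin. Assumed origin, port, and TTL.
package main

import (
	"bytes"
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync"
	"time"
)

type entry struct {
	body    []byte
	header  http.Header
	status  int
	expires time.Time
}

type cache struct {
	mu sync.RWMutex
	m  map[string]entry
}

func (c *cache) get(key string) (entry, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	e, ok := c.m[key]
	if !ok || time.Now().After(e.expires) {
		return entry{}, false
	}
	return e, true
}

func (c *cache) set(key string, e entry) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.m[key] = e
}

// recorder tees the origin response into a buffer while still sending it on.
type recorder struct {
	http.ResponseWriter
	buf    bytes.Buffer
	status int
}

func (r *recorder) WriteHeader(code int) { r.status = code; r.ResponseWriter.WriteHeader(code) }
func (r *recorder) Write(b []byte) (int, error) {
	r.buf.Write(b)
	return r.ResponseWriter.Write(b)
}

func main() {
	origin, _ := url.Parse("http://localhost:8080") // assumed origin
	proxy := httputil.NewSingleHostReverseProxy(origin)
	store := &cache{m: make(map[string]entry)}
	ttl := 60 * time.Second // assumed TTL; short enough to stay fresh

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Only cache anonymous GETs; logged-in and write traffic passes through.
		if r.Method != http.MethodGet || r.Header.Get("Cookie") != "" {
			proxy.ServeHTTP(w, r)
			return
		}
		key := r.URL.String()
		if e, ok := store.get(key); ok {
			for k, v := range e.header {
				w.Header()[k] = v
			}
			w.WriteHeader(e.status)
			w.Write(e.body)
			return
		}
		rec := &recorder{ResponseWriter: w, status: http.StatusOK}
		proxy.ServeHTTP(rec, r)
		if rec.status == http.StatusOK { // don't cache origin errors
			store.set(key, entry{
				body:    rec.buf.Bytes(),
				header:  w.Header().Clone(),
				status:  rec.status,
				expires: time.Now().Add(ttl),
			})
		}
	})
	log.Fatal(http.ListenAndServe(":8000", nil))
}
```

In practice a real deployment would use Varnish, nginx proxy_cache, or a CDN rather than hand-rolling this, but the principle is the same: the scrapers mostly request the same public pages, so serve them from memory and keep them off the database.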
You could also put CloudFlare or some other CDN in front of it, but depending on your size that might not be within your budget. I don't get why the rest of the internet should subsidize these AI companies: they aren't profitable, they live off venture capital, and they drive up operating costs for everyone else.