how they fool the AI while keeping it invisible to the human eye
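Rough idea of the mechanics, as I understand it: the poisoning tools add an adversarial perturbation, a tiny per-pixel change capped by some epsilon, so the picture looks identical to us while the model's features get pushed somewhere else. A minimal FGSM-style sketch below (the model, image and epsilon are placeholders I made up, not what any specific poisoning tool actually does):

```python
# Sketch: nudge each pixel by at most a small epsilon (FGSM-style).
# Too small for a human to notice, but enough to shift the model's output.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18(weights=None)   # assumed stand-in classifier
model.eval()

image = torch.rand(1, 3, 224, 224)      # placeholder image, values in [0, 1]
label = torch.tensor([0])               # its (assumed) true class

image.requires_grad_(True)
loss = F.cross_entropy(model(image), label)
loss.backward()

epsilon = 2 / 255                       # per-pixel budget, below what the eye can see
poisoned = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

# Every pixel moved by at most epsilon, yet the logits can change a lot.
print((poisoned - image.detach()).abs().max())   # <= epsilon
```

The real tools are fancier (they optimize the perturbation against a target concept instead of one gradient step), but the "invisible to humans, loud to the model" part comes from exactly this kind of bounded pixel budget.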
My guess is that AI companies will try to scrape as much as possible without a human ever looking at the data.
When poisoned data starts to become enough of a problem that humans have to look over every sample, that would drive training costs up to the point where it's no longer worth bothering with the scraped data in the first place.