I wonder if the influx of slop contributions can be stemmed by a legal document that makes the contributor legally liable for their submission.
Seems like lawyers have been learning the hard way to be accountable for their slop; perhaps those lessons can be leveraged in the open source community.
Legally liable for what? Just being bad code? How are you going to enforce that against some kid in another country?
It’s time to start putting maintainers’ attention behind a paywall. $50 refundable deposit to submit a PR, forfeited if it’s obvious AI slop
Real “these kids would be very upset if they could read” situation. Who bothers to pick through the whole EULA before submitting?
As with any open source mass-contribution project that's gained too much popularity, you need extra firebreaks between the Open Submission and the Final Product.
That means adding manpower to submission review, QA, etc, which public projects don’t often have.
Sort of the Achilles Heel of the open source community. AI is just making the vulnerability extra glaring.
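For GitHub-hosted projects, one cheap firebreak of this kind is branch protection plus a CODEOWNERS file, so nothing lands without a human maintainer's review. A minimal sketch (the team names here are placeholders):

```
# .github/CODEOWNERS — every path requires a review from the maintainers team
*              @example-org/maintainers

# Security-sensitive code gets a second, stricter gate
/src/crypto/   @example-org/security-team
```

Paired with a branch-protection rule that requires review from Code Owners, this puts a human between open submission and the final product — it doesn't add the missing manpower, but it does stop unreviewed slop from merging by default.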
That would be closing the gate after the horses have escaped.
Letting unverifiable code in would do damage to developers and users that wouldn't be easy to disentangle, and would erode trust in the product, killing it.