Not the hill I'd die on, but I'm not a billionaire.
bread

What a reprehensible, disingenuous representation of what he actually said. I’m not a fan of the guy, but PC Gamer is trash as well. Scary to see how people here are reacting just because it’s about X and AI.

popcar2

Yeah nobody in this thread went past the title, but that’s literally not what he said.

He actually said that demanding X remove AI features is gatekeeping since competitors get to keep them, which is still a dumb take but very, very far from “Tim Sweeney loves child porn”…

@[email protected]

TIL Tim Sweeney is into child porn. Not surprising tbh.

@[email protected]

The fall of Rome… The fall of the perverse…

@[email protected]

Someone beat this man for attempting to defend AI CSAM.

@[email protected]

Somebody is in a certain set of files

@[email protected]

inb4 “In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech”

@[email protected]

There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.

@[email protected]

The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”

@[email protected]

Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.

These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.

@[email protected]

There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit what looks like it could be from the abuse of some child, I guess.’ It means state’s evidence of actual crimes.

@[email protected]

CSAM is abusive material of a sexual nature of a child. Generated or real, both fit this definition.

@[email protected]

CSAM is material… from the sexual abuse… of a child.

Fiction does not count.

queermunist she/her

How do you think a child would feel after having a pornographic image generated of them and then published on the internet?

Looks like sexual abuse to me.

@[email protected]

You’re the only one using that definition. There is no stipulation that it’s from something that happened.

Where is your definition coming from?

@[email protected]

My definition is from what words mean.

We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?

Sas

It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what’s happened. These kids did not consent to have their likeness sexualised.

@[email protected]

Nothing done to your likeness is a thing that happened to you.

Do you people not understand reality is different from fiction?

edible_funk

Deepfakes are illegal. You’re defending deepfake CP now?

@[email protected]

Threats are a crime, but they’re a different crime than the act itself.

Everyone piling on understands that it’s kinda fuckin’ important to distinguish this crime, specifically, because it’s the worst thing imaginable. They just also want to use the same word for shit that did not happen. Both things can be super fucking illegal - but they will never be the same thing.

Sas

My likeness posted for the world to see in a way I did not consent to is a thing done to me.

@[email protected]

Your likeness depicted on the moon does not mean you went to the moon.

@[email protected]

Please send me pictures of your mom so that I may draw her naked and post it on the internet.

@[email protected]

Do you understand that’s a different thing than telling me you’ve fucked her?

LordMayor

Dude, just stop jerking off to kids whether they’re cartoons or not.

@[email protected]

‘If you care about child abuse please stop conflating it with cartoons.’

‘Pedo.’

Fuck off.

Leraje

Someone needs to check your hard drive, mate. You’re way, way too invested in splitting this particular hair.

@[email protected]

Tim Sweeney vocally supports child porn and deepfake porn? He certainly looks like the type of creeper, so I guess I’m not that surprised.

I wonder how many times he’s been to Trump and Epstein’s Pedophile Island 🤔

@[email protected]

Are they removed or just this deep up their own assholes?

@[email protected]

Yet another CEO who’s super into child porn huh?

@[email protected]

Maybe we are the only people that don’t f kids. Maybe this is “H” “E” Double Hockey Sticks.

@[email protected]

Imagine where Epic would be if they had just censored Tim Sweeney’s Twitter account.

It’s like he’s hell-bent on driving people away from Epic. I’m not sure I could be more abrasive if I tried without losing the plausible deniability of not trying to troll.

fonix232

Not just asshole. Nonce asshole.

Sanctus

His opinion is as trash as his gaming storefront that insists it’s a platform.

@[email protected]

If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.

@[email protected]

Nothing made-up is CSAM. That is the entire point of the term “CSAM.”

It’s like calling a horror movie murder.

@[email protected]

It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.

greenskye

I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment at applying this to other forms of porn as well, ones less universally hated.

I’m not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

@[email protected]

I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime

I’m sorry to break it to you, but this has been illegal for a long time and it doesn’t need to have anything to do with CSAM.

For instance, drawing certain copyrighted material in certain contexts can be illegal.

To go even further, numbers and maths can be illegal in the right circumstances. For instance, it may be illegal where you live to break the encryption of a certain file, depending on the file and encryption in question (e.g. DRM on copyrighted material). “Breaking the encryption of a file” essentially translates to “doing maths on a number” when you boil it down. That’s how you can end up with the concept of illegal numbers.

greenskye

To further clarify: my concern is specifically with thought crimes in scenarios where there is no victim being harmed.

If I’m distributing copyrighted content, that’s harming the copyright holder.

I don’t actually agree with breaking DRM being illegal either, but at least in that case, doing so is supposedly harming the copyright holder because presumably you might then distribute it, or you didn’t purchase a second copy in the format you wanted or whatever. There’s a ‘victim’ that’s being harmed.

Doodling a dirty picture of a totally original character doing something obscene harms absolutely no one. No one was abused. No reputation (other than my own) was harmed. If I share that picture with other consenting adults in a safe fashion, again no one was harmed or had anything done to them that they didn’t agree to.

It’s totally ridiculous to outlaw that. It’s punishing someone for having a fantasy or thought that you don’t agree with and ruining their life. And that’s an extremely easy path to expand into other thoughts you don’t like as well. And then we’re back to stuff like sodomy laws and the like.

@[email protected]

It was already a thing in several places. In my country it’s legal to sleep with a 16-year-old, but fiction about the same thing is illegal.

@[email protected]

Sure, I think it’s weird to really care about loli or furry or any other niche the way a lot of people do around here, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can’t have effective safeguards against that harm, it makes sense to restrict it legally.

greenskye

Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

But generating images of adults that don’t exist? Or even clearly drawn images that aren’t even realistic? I’ve seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky.

Like let’s take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone’s fucked up fantasy. Yet lots of people want to make that into a thought crime.

I’ve always thought that if there isn’t speech out there that makes you feel icky or gross then you don’t really have free speech at all. The way you keep free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.

@[email protected]

Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

That is also drawing a certain arrangement of lines and colours.

greenskye

Yes sorry. My original statement was too vague. I was talking specifically about scenarios where there is no victim and the action was just a drawing/story/etc.

I’m not a free speech absolutist. I think that lacks nuance. There are valid reasons to restrict certain forms of speech. But I do think the concept is core to a healthy democracy and society and should be fiercely protected.

@[email protected]

Drawings are one conversation I won’t get into.

GenAI is vastly different though. Those are known to sometimes regurgitate people or things from their dataset, (mostly) unaltered. Like how you can get Copilot to spit out valid secrets that people accidentally committed by typing NPM_KEY=. You can’t have any guarantee that if you ask it to generate a picture of a person, that person does not actually exist.

greenskye

Totally fair stance to take. I’m 100% on board with extra restrictions and scrutiny over anything that is photo realistic.

To me, those aren’t necessarily victimless crimes, even if the person doesn’t actually exist, because they poison the well with realistic looking fakes. That is actively harmful to others, so is not a victimless crime. Instead it becomes just another form of misinformation.

@[email protected]

You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.

@[email protected]

The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.

You are completely wrong.

https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/

“CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”

“Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.

@[email protected]

RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.

@[email protected]

Dude, you’re the only one who uses that strict definition. Go nuts with your crusade of prescriptivism, but I’m pretty sure it’s a lost cause.

@[email protected]
link
fedilink
English
12M

deleted by creator

VeganBtw

Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
[…]
Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and *computer-generated images that appear to involve them*.
(Emphasis mine)

https://en.wikipedia.org/wiki/Child_pornography

@[email protected]

‘These several things are illegal, including the real thing and several made-up things.’

Please stop misusing the term that explicitly refers to the real thing.

‘No.’

edible_funk

Is it a sexualized depiction of a minor? Then it’s CSAM. Fuck all y’all pedo apologists.

@[email protected]

AI CSAM was generated from real CSAM.

AI being able to accurately undress kids is a real issue in multiple ways.

@[email protected]

AI can draw Shrek on the moon.

Do you think it needed real images of that?

rainwall

It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.

The child porn it’s generating is based on literal child porn, if not itself just actual child porn.

@[email protected]

You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

Like combining unrelated concepts isn’t the whole fucking point?

@[email protected]

No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

@[email protected]

True enough - but fortunately, there are approximately zero such images readily available on public websites, for obvious reasons. There certainly is not some well-labeled training set on par with all the images of Shrek.

@[email protected]

It literally can’t combine unrelated concepts though. Not too long ago there was the issue where one (Dall-E?) couldn’t make a picture of a full glass of wine because every glass of wine it had been trained on was half full, because that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.

@[email protected]

And you think it’s short on images of fully naked women?

CerebralHawks

Yes, and they’ve been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI.

Anna’s Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they’re using it to teach AI to copy music. (Note: Spotify uses lower quality than other music currently available, so AA will offer nothing new if/when they ever do release these torrents.)

So, yes, that is exactly what they’re doing. They are training their models on all the data, not just all the legal data.

@[email protected]

It’s big fucking news when those datasets contain, like, three JPEGs. Because even one such JPEG is an event where the FBI shows up and blasts the entire hard drive into shrapnel.

Y’all insisting there’s gotta be some clearly-labeled archive with a shitload of the most illegal images imaginable, in order for the robot that combines concepts to combine the concept of “child” and the concept of “naked,” are not taking yourselves seriously. You’re just shuffling cards to bolster a kneejerk feeling.

@[email protected]

The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”

Were you too busy fapping to read the article?

@[email protected]

Not in the least surprised by this bag of slimy cocks

TachyonTele

Hold up

@[email protected]

Tim Sweeney is into child porn

@[email protected]

I am going to be an absolute crank about this:

CSAM means photographic evidence of child rape.

If that event did not happen, say something else.

The entire point of this term is to distinguish the go-to-jail imagery stemming from unambiguous crimes, versus any form of made-up nonsense. Bart Simpson is not eligible. Bart Simpson does not exist. Photorealistic depictions of real children can be hyper illegal, but unless they are real, they’re not CSAM. Say something else. Otherwise we’ll have to invent some even less ambiguous term for evidence of child abuse, and the fuckers downvoting this comment will also misappropriate that, to talk about shit that does not qualify.

nocturne

From the article:

The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content, such as images created with artificial intelligence tools.”

@[email protected]

They’re wrong.

As evidenced by what those words mean.

Rimu

Strange hill to die on, man.

@[email protected]

What, taking child abuse seriously?

@[email protected]

‘I take child abuse seriously but also think it’s fine to generate nude pictures of real life children.’

Idk man. It’s a weird fuckin thing to admit to.

@[email protected]

Show me where anyone said that. Circle it in red.

@[email protected]

That is not what CSAM means.
