Not the hill I'd die on, but I'm not a billionaire.
@[email protected]

inb4 “In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech”

@[email protected]

There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.

LordMayor

Dude, just stop jerking off to kids whether they’re cartoons or not.

@[email protected]

‘If you care about child abuse please stop conflating it with cartoons.’

‘Pedo.’

Fuck off.

Leraje

Someone needs to check your hard drive, mate. You’re way, way too invested in splitting this particular hair.

@[email protected]

Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.

These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.

@[email protected]

There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit that looks like it could be from the abuse of some child, I guess.’ It means state’s evidence of actual crimes.

@[email protected]

CSAM is abusive material of a sexual nature of a child. Generated or real, both fit this definition.

@[email protected]

CSAM is material… from the sexual abuse… of a child.

Fiction does not count.

queermunist she/her

How do you think a child would feel after having a pornographic image generated of them and then published on the internet?

Looks like sexual abuse to me.

@[email protected]

You’re the only one using that definition. There is no stipulation that it’s from something that happened.

Where is your definition coming from?

@[email protected]

My definition is from what words mean.

We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?

@[email protected]

I already did the “what words mean” thing earlier.

- involves a child
- is sexual
- is abusive (here’s your Simpsons exclusion, btw)
- is material

That’s literally every word of CSAM, and it fits.

We need a term to specifically refer to actual photographs of actual child abuse

Why? You’ve made a whole lot of claims that it should be your way, but you’ve provided no sources nor any justification as to why we need to delineate between real and AI.

Sas

It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what’s happened. These kids did not consent to have their likeness sexualised.

@[email protected]

Nothing done to your likeness is a thing that happened to you.

Do you people not understand reality is different from fiction?

edible_funk

Deepfakes are illegal. You’re defending deepfake cp now?

@[email protected]

Threats are a crime, but they’re a different crime than the act itself.

Everyone piling on understands that it’s kinda fuckin’ important to distinguish this crime, specifically, because it’s the worst thing imaginable. They just also want to use the same word for shit that did not happen. Both things can be super fucking illegal - but they will never be the same thing.

Sas

My likeness posted for the world to see in a way I did not consent to is a thing done to me.

@[email protected]

Your likeness depicted on the moon does not mean you went to the moon.

Sas

Your likeness modified to show you naked being fucked, printed out and stapled to a tree in your neighborhood, is okay then?

@[email protected]

Please send me pictures of your mom so that I may draw her naked and post it on the internet.

@[email protected]

Do you understand that’s a different thing than telling me you’ve fucked her?

@[email protected]

The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”
