Not the hill I'd die on, but I'm not a billionaire.
Lvxferre [he/him]

IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

And it applies to children and adults alike. The only difference is that adults can still consent to having their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

Now, someone else mentioned that Bart’s dick appears in the Simpsons movie. The key difference is that Bart is not a child; he isn’t even a person to begin with, but a fictional character. There’s no victim.


EDIT: I’m going to abridge what I said above, in a way that even my dog would understand:

What Grok is doing is harmful, there are victims of that, regardless of some “ackshyually this is not CSAM lol lmao”. And yet you guys keep babbling about definitions?

Everything else I said here was contextualising and detailing the above.

Is this clear now? Or will I get yet another lying piece of shit (like @[email protected]) going out of their way to misinterpret what I said?

(I don’t even have a dog.)

@[email protected]

What exactly have I lied about?

I’ve never once tried to even insinuate that what Grok is doing is OK. Nor that it should be. What I’ve said is that it doesn’t even matter whether there’s an actual real person being victimized or not. It’s still illegal. No matter how you look at it. It’s illegal. Fictional or not.

Your example of Bart in the Simpsons movie is so far out of place I hardly know where to begin.

It’s NOT because he’s fictional. Fictional depictions of naked children in sexually compromising situations ARE illegal.

Though I am glad you don’t have a dog. It would be real awkward for the dog to always be the smartest being in the house.

@[email protected]

Supporting CSAM should be treated like making CSAM.

Down into the forgetting hole with them!

Lvxferre [he/him]

Nobody here is supporting CSAM. Learn to read, dammit.

@[email protected]

He implicitly is.

EDIT: Wait, what is this about? Did I misphrase something?

Lvxferre [he/him]

Fuck! I misread you. Yes, you’re right, Tim Sweeney is supporting CSAM.

Sorry for the misunderstanding, undeserved crankiness, and defensiveness; I thought you were claiming I was the one doing it. That was my bad. (In my own defence, someone already did it.)


Now, giving you a proper answer: yeah, Epic is better sent down the forgetting hole. And I hope Sweeney gets haunted by his own words for years and years to come.

EldritchFemininity

They mistook your comment as disagreeing with their take on how there are real victims of Grok’s porn and CSAM, and as accusing them of supporting CSAM, rather than agreeing with them and saying that Sweeney is the one supporting it.

@[email protected]

Gasp “Lvxferre! You damn Diddy demon! How could youuuu!”

Lvxferre [he/him]

At this rate I’m calling dibs on your nickname 🤣

@[email protected]

That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

> Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

There ARE victims, lots of them.

@[email protected]

That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

> Non-consensual porn victimises the person being depicted

> This is still true if the porn in question is machine-generated

@[email protected]

> The real thing to talk about is the presence or absence of a victim.

@[email protected]

Which they then talk about, pointing out that victims are absolutely present in this case…

If this is still too hard to understand, I will simplify the sentence. They are saying:

“The important thing to talk about is, whether there is a victim or not.”

@[email protected]

It doesn’t matter if there’s a victim or not. It’s the depiction of CSA that is illegal.

So no, talking about whether or not there’s a victim is not the most important part.

It doesn’t matter if you draw it by hand with crayons. If it’s depicting CSA it’s illegal.

@[email protected]

Nobody was talking about the “legality”. We are talking about morals. And morally there is a major difference.

Lvxferre [he/him]

I wish I was as composed as you. You’re still calmly explaining things to that dumb fuck, while they move the goalposts back and forth.

All of that while they’re still pretending to argue the same point. It reminds me of a video from the Alt-Right Playbook, called “never play defence”: make a dumb claim, waste someone else’s time expecting them to rebut that dumb claim, make another dumb claim, waste their time again, and so it goes on.

@[email protected]

It’s good training for arguing with real-life people, at least. Coming up with a good comeback quickly is hard when you have never properly formulated your thoughts about a subject. I think people often misunderstand things at first, and then when someone points out their mistake, they realize they were wrong but don’t want to admit it, so they just double down. I have been that person before too, though…

@[email protected]

Talking about morals and morality is how you end up getting things like abortion banned. Because some people felt morally superior and wanted to enforce their superior morality on everyone else.

There’s no point in bringing it up. If you need to bring up morals to argue your point, you’ve already failed.

But please do enlighten me. Because personally, I don’t think there’s a moral difference between depicting “victimless” CSAM and CSAM containing a real person.

I think they’re both, morally, equally awful.

But you said there’s a major moral difference? For you maybe.

@[email protected]

If you seriously think that there is no moral difference between someone being sexually abused and them not being sexually abused then maybe you should be in prison for all our safety.

Lvxferre [he/him]

> That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

> Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

> There ARE victims, lots of them.

You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying I didn’t read the text. (I did.)

Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

Is this clear now?

@[email protected]

Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

> The real thing to talk about is the presence or absence of a victim.

Which has never been an issue. It has never mattered in CSAM if it’s fictional or not. It’s the depiction that is illegal.

@[email protected]

Is it so hard to admit that you misunderstood the comment ffs? It is painfully obvious to everyone.

Lvxferre [he/him]

> Yes, it certainly comes across as you arguing for the opposite

No, it does not. Stop being a liar.

Or, even better: do yourself a favour and go offline. Permanently. There’s already enough muppets like you: assumptive pieces of shit lacking basic reading comprehension, but still eager to screech at others — not because of what the others actually said, but because of what they assumed over it. You’re dead weight in any serious discussion, probably in some unserious ones too, and odds are you know it.

Also, I’m not wasting my time further with you, go be functionally illiterate elsewhere.

@[email protected]

Ok. You’re right. You saying it’s ok to depict CSAM if there isn’t a victim is not you arguing the opposite. It’s me lying.

You’re so smart. Good job.
