0 Posts • 30 Comments • Joined 2Y ago • Cake day: Jun 11, 2023


I’d never bother changing whatever default font the editor comes with and I don’t understand why anyone would care to


My only pair of earbuds has a wire. The connector is USB-C and my phone has a jack, so I don’t even use the jack now

3.5mm headphones that have volume controls and a pause/hang-up button can’t seem to last a year. I tried two pairs and they both lasted 3 months



Step up the reading comprehension please :)

It’s pretty funny having you state, re-state and re-re-state the exact same obvious things while not seeing that everyone already gets them, that you’re missing the point, and that you’re yelling into a hole

Why do you keep going?

Are you that painfully unconvincing in real life? Like, c’mon

Step it up



https://www.law.cornell.edu/uscode/text/18/1466A

(a) In General.—Any person who, in a circumstance described in subsection (d), knowingly produces, distributes, receives, or possesses with intent to distribute, a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that—

(1)(A) depicts a minor engaging in sexually explicit conduct; and (B) is obscene; or

(2)(A) depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; and (B) lacks serious literary, artistic, political, or scientific value;

or attempts or conspires to do so, shall be subject to the penalties provided in section 2252A(b)(1), including the penalties provided for cases involving a prior conviction.


You’re really mad that a US-based study is using the US definition of CSAM while also clearly stating the definition of CSAM they’re using, aren’t you?

Sure buddy, it’s a “false equivalence”, they’re totally stating it’s the same. It’s not just your reading comprehension


There’s no definite conclusion on whether consuming and distributing lolicon content could lead some individuals to seek out or create explicit content involving real children

If they rule that out entirely through the scientific method one day, then I’ll join your side

Weebs usually respond to that with “Well, that’s like saying video games cause violence!”, so I’ll jump ahead of you: that would be like saying we should forbid lolicon video games in a society that already has lolicon books, lolicon movies, lolicon cartoons, and where history classes mostly cover instances of countries showing lolicon to each other. That’s not the situation we’re in, and even if it were, it’s still not necessarily comparable. Sexual urges have properties that violence doesn’t share.


They’re studying the prevalence of CSAM under the definition of the country they’re in. It’d be arbitrary to separate the two and draw two different conclusions.

Also you seriously need to take a chill pill


Ah. It depends on the jurisdiction the instance is in

Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason

Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries


The study is transparent about their definition of CSAM. At this point, if you don’t get it, you don’t get it. Sorry dude.


That’s an arbitrary decision to make and doesn’t really need to be debated

The study is pretty transparent about what “CSAM” means under their definition, and they even provide pictures; from a science-communication point of view they’re in the clear


“treating them the same” => The threshold for being refused entry into mainstream instances is just already crossed at the lolicon level.

From the perspective of the fediverse, pictures of child rape and lolicon should both get you thrown out. That doesn’t mean you’re “treating them the same”. You’re just a social network; there’s nothing you can do beyond defederating.


I don’t think there’s anything ridiculous about it. Lolicon should be illegal.


“treating them the same” => The threshold for being banned is just already crossed at the lolicon level.

From the perspective of the park, pissing in a pond and fighting a dude both get you thrown out. That doesn’t mean you’re “treating them the same”. You’re just the park.

Do you get it now?


Who places the bar for “exclusion from a social network” at felonies? Any kind of child porn has no place on the fediverse, simulated or otherwise. That doesn’t mean they’re equal offenses; you’re just not responsible for carrying out anything other than cleaning off your own porch.


I assumed it was the same thing, but if you’re placing the bar of acceptable content below child porn, I don’t know what to tell you.



He invented the stupid take he’s fighting against. Nobody equated “ink on paper” with “actual rape against children”.

The bar to cross to be filtered out of the federation isn’t rape. Lolicon is already above the threshold; it’s embarrassing that he doesn’t realize that.


Repeating over and over again that I’m equating drawings with rape isn’t going to cut it if people can just read what I wrote. Especially when nobody was even talking about rape in the first place


CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you’ll see “cartoons, paintings, sculptures, …” in the wording of the PROTECT Act

They don’t actually need a victim to be defined as such


It’s illegal in a lot of places including where I live.

In the US you have the PROTECT Act of 2003

(a) In General.—Any person who, in a circumstance described in subsection (d), knowingly produces, distributes, receives, or possesses with intent to distribute, a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that—

(1)(A) depicts a minor engaging in sexually explicit conduct; and (B) is obscene; or

(2)(A) depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; and (B) lacks serious literary, artistic, political, or scientific value;

or attempts or conspires to do so, shall be subject to the penalties provided in section 2252A(b)(1), including the penalties provided for cases involving a prior conviction.

Linked to the obscenity doctrine

https://www.law.cornell.edu/uscode/text/18/1466A



Oh, it will go much differently if the porn doesn’t involve depictions of children.


Okay, the former then.

Let’s just think about it, how do you think it would turn out if you went outside and asked anyone about pornographic drawings of children? How long until you find someone who thinks like you outside your internet bubble?

“Nobody wants this stuff that whole servers…”

There are servers dedicated to real child porn with real children too. Do you think that argument has any value with that tidbit of information tacked onto it?




Everybody understands there’s no real kid involved. I still don’t see an issue with reporting it to the authorities, and all the definitions of CSAM make a point of including simulated and illustrated forms of child porn.

https://en.m.wikipedia.org/wiki/Child_pornography


There is a database of known CSAM files and their hashes; Mastodon could implement a filter against it when a post is made and when federating content (rough sketch below)

Shadow-banning those users would be nice too
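For illustration only, here’s a minimal sketch of what that kind of filter could look like, assuming a plain-text list of known-bad SHA-256 digests. The file names and function names are made up, and real deployments typically rely on perceptual-hash services (e.g. PhotoDNA) rather than exact cryptographic hashes, which only catch byte-identical copies.

```python
# Minimal sketch of hash-based media screening, assuming a plain-text file
# of known-bad SHA-256 digests (one hex digest per line). All names here
# are hypothetical; exact hashes only catch byte-identical copies.
import hashlib
from pathlib import Path


def load_blocklist(path: str) -> set[str]:
    """Load known-bad SHA-256 digests into a set for O(1) lookups."""
    return {
        line.strip().lower()
        for line in Path(path).read_text().splitlines()
        if line.strip()
    }


def sha256_of_file(path: str) -> str:
    """Stream the file through SHA-256 so large uploads aren't loaded into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_reject(media_path: str, blocklist: set[str]) -> bool:
    """True if the uploaded or federated media matches a known-bad hash."""
    return sha256_of_file(media_path) in blocklist


if __name__ == "__main__":
    # Hypothetical usage at the two enforcement points mentioned above:
    # on local upload, and again when ingesting media from a federating instance.
    blocklist = load_blocklist("known_csam_sha256.txt")  # hypothetical filename
    if should_reject("incoming_media.jpg", blocklist):
        print("reject the media and flag the account for review")
```

The same check could run on local uploads and again on media pulled in during federation; swapping the exact hash for a perceptual hash would also catch re-encoded or resized copies that a byte-level match misses.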


Okay, thanks for the clarification

Everyone except you still very much includes drawn and AI-generated pornographic depictions of children within the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.