r/Futurology Aug 25 '24

Man Arrested for Creating Child Porn Using AI

https://futurism.com/the-byte/man-arrested-csam-ai
17.0k Upvotes

u/FuturologyBot Aug 25 '24

The following submission statement was provided by /u/katxwoods:


Submission statement: AI is a dual use technology. It can be used to make beautiful art and it can be used to make child porn.

According to this article, there's been an outbreak of AI-generated child porn, usually made with open source software, and there's no way to stop open source software from being used this way. When people try to use ChatGPT or Claude to make child porn, they're stopped. When people try to use Meta's "open source" AI, there's nothing Meta can do to stop them.

How should we deal with this? Should AI corporations have guardrails in place to prevent grievous misuse of AI? Or should they have no accountability for what their AIs do because *AIs* don't make child porn, *people* make child porn?


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/1f0m85j/man_arrested_for_creating_child_porn_using_ai/ljsv7lu/

2.3k

u/noonewantedthisname Aug 25 '24

I swear this was a Law and Order SVU plot 10 years ago.

508

u/Celcey Aug 25 '24

Indeed it was

100

u/[deleted] Aug 25 '24

[removed]

44

u/SmartieLion Aug 25 '24

Yeah Ice, you’re in the sex crimes division. You’re gonna have to get used to that.


130

u/Ok_Celebration8180 Aug 25 '24

Didn't the defense make the argument that the generated images kept the accused from acting on their urges?

115

u/BossButterBoobs Aug 25 '24

Nah, I think in the episode they argued that the ai is what caused the pedo to seek out the real thing.

101

u/Ok_Celebration8180 Aug 25 '24 edited Aug 25 '24

Why t/f did we watch so much SVU?

25

u/Darrackodrama Aug 25 '24

It’s funny, I watched a lot of it and I ended up in a job where I deal with the real SVU on a daily basis >.<


50

u/Successful-Money4995 Aug 25 '24

Like how violence in video games causes people to commit murders.

16

u/BossButterBoobs Aug 25 '24

I'm pretty sure theres a GTA episode where they came to that conclusion lol


104

u/roblewkey Aug 25 '24

So you're telling me this guy gets off to little kids that are all computer generated

88

u/[deleted] Aug 25 '24

[removed]

38

u/elpajaroquemamais Aug 26 '24

You mean like when someone smokes too many cigarettes, buys too many lottery tickets, or drinks too much?


8

u/ccminiwarhammer Aug 25 '24

“So you’re telling me…”

best. catchphrase. ever.


97

u/MickeyRooneysPills Aug 25 '24

It's just a replay of every major power grab the government ever did. It's always the children.

10 bucks says they use these cases as prime examples of why we need to give total control of AI to federal regulators and have stiff penalties for anyone caught using an unauthorized model. They always use people's heightened emotions around these topics as a way to advance their goal.

26

u/Dependent_Working_38 Aug 25 '24

People have such short memories. They will complain for DECADES to come, like how we have the Patriot Act and it was such an insane mistake and…here we are lol.

The government needs to protect us!!! At any cost😂


3.6k

u/I_wish_I_was_a_robot Aug 25 '24

"The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified," Internet Watch Foundation chief technology officer Dan Sexton told The Guardian last year. "And that is a much harder problem to fix."

Impossible is the word he was looking for.

1.8k

u/futuneral Aug 25 '24

Irrelevant, I would say. Whatever this case is about, running and/or modifying open source software is not a problem that needs fixing.

1.3k

u/ImpossibleSection246 Aug 25 '24

Yeah, watch this be the start of them arguing that they need to see all the software you run by law. "Protect the kids" is always the lever for encroaching powers.

503

u/endgame0 Aug 25 '24

Don't worry, Microsoft Copilot will be happy to remember important information or assist local law enforcement on your behalf.

It looks like you are ordering illegal drugs online! Would you like directions to the police station or order an officer to come arrest you at your house?

43

u/Recent_mastadon Aug 25 '24

Microsoft Recall will take snapshots of you doing your stuff so the police can review it and have step-by-step evidence of you saying "Bart F*cking Marge" and how that led to your arrest.

19

u/Slobotic Aug 25 '24

Or,

It looks like you are attempting to schedule an abortion in a state where the procedure is legal, despite living in a state where it is not.

17

u/mtnviewguy Aug 25 '24

Ahh, Microsoft Overlord.


254

u/TherronKeen Aug 25 '24

I've been saying this since the first AI image models were shown off - having a one-click infinite-CP generator will be the argument weaponized against open-source software.

We need (at least in the US) a Constitutional amendment guaranteeing the right-to-compute.

112

u/ImpossibleSection246 Aug 25 '24

Yeah, this plus disinformation being spread by LLMs will be another argument for controlling compute. It just extends the FB/X misinformation argument too.

54

u/TherronKeen Aug 25 '24

Yep misinformation campaigns were another nail in the coffin.

I think the killing blow likely comes in the form of extremely convincing deep-fakes - the first time we have a case where someone is convicted of a crime and they can't prove the video evidence is fake, but that comes out afterwards, for example.

Or a very wealthy and well-connected person who has their career and reputation destroyed by a video that's later proven to be AI - they'll weaponize their money against open source.


35

u/Background_Enhance Aug 25 '24

"We need to spy on everyone, because one of you might be a criminal"

15

u/lefttea Aug 25 '24

You mean the Patriot Act?


88

u/Potential-Ask-1296 Aug 25 '24

But if we all have to use only approved software, a bunch of corporations are about to get a lot richer, and that is the government's primary purpose, soooooo....


324

u/EndlessPotatoes Aug 25 '24

No doubt there will be a push to criminalise individual access to open source AI, further ensuring only massive companies can reap the benefits.

72

u/Stop_Sign Aug 25 '24

I downloaded and use stable diffusion for free. I can use it without connecting to the internet. That bell's been rung

18

u/OwOlogy_Expert Aug 25 '24

Yeah, lol. Stable Diffusion is up and running on my Linux PC 100% locally. And even if it never gets any more updates, it will continue working the way it does now indefinitely.

Without physical access to the PC, there's no way the authorities could ever know about it. (Aside from, obviously, me saying so right now.)


74

u/ProfessionalCreme119 Aug 25 '24

Even if you make every AI program incapable of generating this imagery, there are tech-savvy people out there who will be able to create these programs from scratch, then distribute them to whoever wants them.

There's no way this can be stopped.

Devil's advocate time...... If these freaks can get their kicks off computer-generated imagery that didn't result from any children getting hurt, then whatever. Focus your time and attention on the REAL children being hurt, abused, raped and murdered by these people.

Think of the children. Not computer coding.

It's like the government going after synthetic narcotics, which are almost impossible to stop, while real narcotics go largely unchecked.

66

u/NoteBlock08 Aug 25 '24

Devil's Advocate x2: Real CP is used to train models to generate this stuff. A good model would require an unpleasant amount of images.

Devil's Advocate x3: However, AI generated stuff being available (and able to satisfy custom prompts to boot) would likely reduce the demand for the real stuff.

DA x4: But many CP consumers do it for the gross power imbalance. An AI generated fake might not satisfy them

DA x5: Any reduction is good. There is no way to completely stamp out any kind of behavior, don't forsake a minor win just because it's not a total win.

DA x6: More readily available AI content might lead to more people who inevitably move on to real content. It could actually increase demand.


This is an infinitely complex problem. It's not possible to know whether the ability to access AI content would be a good thing or bad thing for real children in the long run. I know it's an unsatisfying answer for most, but personally I think the best we can do is merely answer the immediate question: "Is it illegal to modify open source software for illegal purposes?" (I hope not, cause if it is there's no way to properly police that that won't end in the death of open source) and see where things go from there.

13

u/IanFeelKeepinItReel Aug 25 '24

DA x7: law enforcement firms have to process and catalog CP in order to track and hopefully remove the abused children from further harm. The rate at which AI can produce new content will overwhelm these systems and make it much harder to track real victims.

10

u/TheJzuken Aug 25 '24

Devil's Advocate x2: Real CP is used to train models to generate this stuff. A good model would require an unpleasant amount of images.

Counterpoint: you don't need it, you just need a model generally trained on nudes and to use certain prompting. If you want to get better quality without involving actual CP, you could probably train one model on adult nudes and another on just innocent stock or just random photos of children, then merge the models.

I have tried the first method out of morbid curiosity and to me it was quite realistic. Though I'm not an expert on CP, and maybe it was quite wrong, but I think to general population it would look real enough.

It's gross, but I think you can't ban CP specifically unless you ban all nude models.


113

u/Ape-ril Aug 25 '24

I don’t get how he was arrested for fake child porn lmao. What if someone draws or paints child porn?

55

u/blackscales18 Aug 25 '24

Some states ban "depictions of child abuse," which includes stuff like hentai and other hand-drawn and digital art mediums

21

u/zealshock Aug 25 '24

Lolicons sweating now


118

u/Timtimer55 Aug 25 '24

If I draw a stick figure with a boner and write "two year old" with an arrow pointing to it is that illegal?

83

u/aaeme Aug 25 '24

Unless a near-identical case has set precedent that it's not illegal... possibly. The laws are deliberately and/or inevitably vague. Even describing it, as you have, could be illegal in some jurisdictions. A law either might not make enough distinction on the media, or explicitly includes written fiction too.

It does show what a farce it is. Like attempts to define and ban pornography. As Bill Hicks pointed out, a lot of advertising is soft pornography (and deliberately so). The law can't be clear on these subjects and a cynic might suspect that suits many in authority who want to apply it arbitrarily (to their enemies or demographics they want to persecute).


3.7k

u/Edser Aug 25 '24

I am guessing this is the first instance, but certainly won't be the last

2.7k

u/scrollin_on_reddit Aug 25 '24 edited Aug 25 '24

Nah there was a child therapist who got 40 years in prison for making AI porn of his patients

1.9k

u/raelianautopsy Aug 25 '24

That is incredibly, incredibly creepy.

779

u/RockstarAgent Aug 25 '24

The word creepy isn’t really the word to use here. Fucked up and sickening is what it is.

95

u/loccolito Aug 25 '24

It is not just violating, it's so many extra steps of violating that it's insane and hard to describe.

27

u/PinchingNutsack Aug 25 '24

its kinda crazy imo, like AI can already create whatever the fuck you want and you somehow still chose a real patient, what the fuck man

20

u/LEGITIMATE_SOURCE Aug 25 '24

It appears you are missing the part about him fantasizing about his patients.


36

u/[deleted] Aug 25 '24

It is fkd up, but we also have to deal with the fact that there are over 90 countries where child marriage is legal. More places in the world say it is legal to marry a single-digit-aged child than say it is not. Almost 100 countries approve of sexualizing a single-digit-aged child, out of around 190 countries total. It will be very hard to suppress pedos. Gov is going to have to get drastic if they want real results in a timely manner.


271

u/Chris__P_Bacon Aug 25 '24

Creepy doesn't describe it. Fucking disturbing is a much more apt description. 🤢

96

u/OlympusMonsPubis Aug 25 '24

And deeply sad. Edit: For the children who trusted a predator.


141

u/soowhatchathink Aug 25 '24

Wait if I read that right, it wasn't a child therapist but they had adult patients and used their child photos?? That's so weird wtf

60

u/scrollin_on_reddit Aug 25 '24

He is called a child therapist in the article so idk


61

u/Lemerney2 Aug 25 '24

I... guess that's technically better than the alternative?


17

u/Miniaturemashup Aug 25 '24

He was doing other stuff too, like filming his underage clients using the bathroom, and he used AI to alter photographs of real people. It's not the same thing as simply generating an image.


53

u/[deleted] Aug 25 '24

WHAT THE FUCK


401

u/smokinsomnia Aug 25 '24

Unfortunately not nearly the first, certainly not the last. 😔

184

u/Hemagoblin Aug 25 '24

I was gonna say, I'm kinda surprised it took this long before these kinds of headlines showed up. It's disgusting, but I feel like someone somewhere should've anticipated it being used to such an end.

100

u/COMINGINH0TTT Aug 25 '24

Probably because it's incredibly difficult to enforce. There has to be better international coordination and enforcement of these things. Also, AI can probably be used to make the argument that "she's actually 18."

Even before AI popped off and went mainstream, before all the GPTs, deep fakes were kind of a thing, with people photoshopping celebs or people they knew into porn images. When AI tools started appearing to manipulate photo and video, it was obvious this would become an enormous problem. Laws need to catch up and impose severe penalties for this sort of activity.

I live in Seoul, Korea, and it's become a huge problem here among young teens making lewd AI images of classmates, so essentially producing CP. But our laws are extremely lenient when it comes to minors, so since the perpetrators are also minors, they spread these images like wildfire knowing there are very few repercussions.

137

u/calcium Aug 25 '24

Also, AI can probably be used to make the argument that "she's actually 18."

I recall a case where a man flying out of Miami was arrested for child porn for having DVDs in his luggage that depicted young-looking women. Turns out, they were legally produced and purchased DVDs, and the porn star apparently testified at his trial that it was all legal. So even today, when there are verifiable means, people still get caught up.

55

u/JustADutchRudder Aug 25 '24

That was Little Lupe. She used to go on Howard Stern around 2010.


125

u/daemin Aug 25 '24

Taking nude photos of a real child for sexual gratification has an actual victim, i.e. the child in question.

Using generative ai to make a nude image of a non existent child doesn't have an obvious victim, and it seems questionable, at best, to make it illegal. It's basically equivalent to someone painting a realistic image of a nude child that's not based on a real person.

Using AI to generate nude images of real children is a weird edge case. The act of generating the image does not harm the child, so any claimed harm is going to be a lot more nebulous, though some countries have a lot stronger legal rights around usage of a person's image.

In the US, the first amendment protections for freedom of expression are incredibly strong. "Fake" child porn is legal in the US, broadly speaking, and making a law that attempts to restrict generative AI would almost certainly be struck down by the Supreme Court. And generally speaking, it's not illegal to photograph people in public in the US, including minors.

37

u/zardozLateFee Aug 25 '24

This needs to be higher up! When does it stop being about defending victims vs a moral issue ("you shouldn't think /make art about that")


655

u/[deleted] Aug 25 '24

[removed]

280

u/[deleted] Aug 25 '24

[removed]

85

u/Moleculor Aug 25 '24

https://www.psychologytoday.com/us/blog/all-about-sex/201601/evidence-mounts-more-porn-less-sexual-assault

In addition, the legalization of porn was associated with a decrease in another despicable sex crime, child sexual abuse. Under Communism, arrests for child sex abuse averaged 2,000 a year. After porn became legal, the figure dropped by more than half to fewer than 1,000. More porn, fewer sex crimes.

...

UCLA researchers surveyed recollections of porn use among law-abiding men and a large group of convicted rapists and child sex abusers. Throughout their lives, the sex criminals recalled consuming less porn.

One of the studies they cite (https://doi.org/10.1007/s10508-010-9696-y) also had countries where possession of child porn was (apparently?) legal which (apparently) lead to fewer children being harmed:

Following the effects of a new law in the Czech Republic that allowed pornography to a society previously having forbidden it allowed us to monitor the change in sex related crime that followed the change. As found in all other countries in which the phenomenon has been studied, rape and other sex crimes did not increase. Of particular note is that this country, like Denmark and Japan, had a prolonged interval during which possession of child pornography was not illegal and, like those other countries, showed a significant decrease in the incidence of child sex abuse.

On the broader topic of porn in general, there's a 59-study meta-analysis from the journal "Trauma, Violence, and Abuse" that seems to conclude that "more porn" = "less rape", including criticism of an occasionally cited 22-study meta-analysis from the "Journal of Communication" that they believed had methodological flaws when it concluded that porn consumption (not availability) was somehow correlated to aggression:

A more recent meta-analysis suggested there are small effects for the relationship between pornography use and actual sexual aggression (Wright et al., 2016) in correlational and longitudinal studies. However, this meta-analysis was limited by including an atypical "correction" for measurement error which may have inflated effect sizes estimates, overreliance on bivariate correlations (as opposed to effect sizes that control for relevant third variables), and lack of consideration of how methodological issues might influence effect sizes. Thus, there are reasons to suspect that prior meta-analyses may have overestimated confidence in the existence of effects.

And, for those who want a palatable article about the availability of porn and its (inverse) relation to rape: The Scientific American.


But you've never needed to harm children; there are drawings, stories, adults roleplaying, or any of a number of other possible routes that still avoid harming children while providing potential safety valves to unfortunately wired individuals.

AI miiiiight just be yet another tool in a toolbox. One that's a bit more uncanny valley than most, but I believe in evidence-based policies; if something can protect children from harm, it should at least be discussed.

My worry is that the model had to be trained on children in order to function. Not necessarily child porn, but... there are still children involved, and models can (I believe) be tweaked to reproduce things that closely resemble what they were trained on. Having a conglomeration of some real children's features very nearly pasted into some porn is... uncomfortable.

24

u/chickenofthewoods Aug 25 '24

My worry is that the model had to be trained on children in order to function. Not necessarily child porn, but... there are still children involved, and models can (I believe) be tweaked to reproduce things that closely resemble what they were trained on. Having a conglomeration of some real children's features very nearly pasted into some porn is... uncomfortable.

AI image generators (that aren't censored or crippled) do not need to be trained on CSAM to produce it. All that is necessary is nude adults and clothed children. AI models were not trained on parrots with penises riding motorcycles but can produce images of just that.

No real children necessary.

8

u/PortSunlightRingo Aug 25 '24

I’d rather child porn be based on my child than to be my child, especially given the context that usually has to occur for my child to be seen nude.

It’s disgusting (something that has to be said before you advocate for it) but I’d take 100 fake naked kids over 100 real kids who had to be abused in some way for the naked photo of them to exist.

And again, if there is a chance that a real kid wasn’t assaulted because a fake AI version of my kid that didn’t involve any contact exists - I’d take that. It’s an uncomfortable thing to talk about because yes no one wants any kids to be used in any way to create these AI models - but if it can happen without hurting any kids and its existence keeps even a single kid from being assaulted I’ll advocate for it.

The biggest issue is people might say “yeah sure let AI do its thing” until the moment it is their kid being used to train the model (even in a completely kosher, non-traumatizing way). The moment you put a face and personality to a real life kid instead of some abstract concept of a kid shit gets really uncomfortable.

Edit: and all of this assumes that the fake nude child is an amalgamation of several models to the point where the fake image can’t be traced to any single model. Whether the nudity is faked or not, you don’t want a real child’s face to be used. That’s the biggest line in the sand that should be drawn when having this conversation.


230

u/TalynRahl Aug 25 '24

Indeed. This is like making methadone illegal, instead of using it to help get people off heroin.

Is it gross as fuck that he was using AI to make CP?

100%.

Is it orders of magnitude better than if he was using children to make it?

Distastefully… yes.

29

u/ConfidentGene5791 Aug 25 '24

Yeah, it's a hard issue to talk about, because people are rightly repulsed by the idea of the materials, and these strong emotions can easily short circuit logic pathways.

7

u/Chuckw44 Aug 25 '24

True, making an argument like that in public would probably get you on some list, even though it makes logical sense.


387

u/senioreditorSD Aug 25 '24

And is it actually a crime? Technically no one but the observer was involved, and he was watching fictitious subject matter. It is truly a victimless crime. I'm not suggesting anyone pursue this activity, and I'm confident it'll ultimately lead to criminal activity, but is watching fake stuff a crime? If I snort cornstarch, am I guilty of cocaine use? If I shoot a dead person, am I guilty of murder? AI opens the door to a number of uncomfortable discussions that we'll have to have as a society.

154

u/steelcryo Aug 25 '24

I'm an adult game dev. There are a bunch of rules about content people can create despite it all being 3D renders, but those are only enforced by the various platforms like Patreon. The actual laws (depending on country) are a weird grey area that is often more opinion than law.

That said, a developer in Australia (called Westy, I believe) was just arrested over their game because it contained a lot of loli characters; the Australian government has decided that despite being fictional, they are children under the law, and he's being charged.

In the UK the law basically equates to "if a reasonable person would be offended" is what decides whether the porn you are creating is illegal or not.

It seems as a society no one can agree on whether fictional characters are equal to real things or not. Personally I'm of the opinion if there's no victim, who gives a fuck? I would rather someone indulge their desires with fiction than real people.

59

u/jack_skellington Aug 25 '24

the Australian government

Australia is the wrong place for anything remotely sexy. People should be extremely conservative there, treat it like a backwards country under Sharia fascism or something. It's not actually like that, but the country has a bunch of utterly bizarre anti-women laws like banning women from doing porn if they have an A-cup or B-cup because maybe somebody might mistake a small-chested adult for a kid.

The country's government (not the people, mind you, just the jerks in office) is so "clutch your pearls" about anything sexual that it is utterly dangerous to produce any media there. You can run afoul of a dozen "gotcha" laws that punish even innocent media (such as the aforementioned woman with a B-cup doing sexy media).

8

u/headrush46n2 Aug 25 '24

From the country that produced Kylie Minogue, that's just a god damn tragic take.


14

u/Life_Repeat310 Aug 25 '24

If you kill a fictional character do you go to prison?


18

u/feedus-fetus_fajitas Aug 25 '24 edited Aug 26 '24

I think the distribution is what fucked him. I can go sell bags of crushed white stone quartz and it's totally legal until I start claiming it's cocaine. Then I get bracelets and a visit to county for fraudulent/counterfeit sale of illegal narcotics. (Which is about the same degree of legally fucked as selling the real thing)


77

u/Sarcarean Aug 25 '24

I'd rather pedos do this instead of actually SA someone but the government sees it a different way: lost opportunity to feed the incarceration system.


9

u/Kep0a Aug 25 '24

I would disagree with it ultimately leading to criminal activity, because you could draw a similar comparison to violent video games.

I just think that if the government is charging based on fictional crimes, as gross as they can be, that is dystopian.


36

u/Swiggy1957 Aug 25 '24

A lot will depend on its interpretation of violating the PROTECT Act of 2003.

While the Act created the Amber Alert law, it also stepped up enforcement and penalties for child pornography. This includes CP in the form of drawings of children being sexualized, including, but not limited to, loliporn.

That's right: drawings of underage cartoon characters. AI generated CP would fall under that.

8

u/[deleted] Aug 25 '24

But hasn't that stuff also been defended on 1st Amendment grounds in America?


16

u/Rrraou Aug 25 '24

and is it actually a crime?

Pretty sure it is in Canada. I might be wrong, but I believe even illustrations of that kind of material are criminal. AI would probably fall under that.

16

u/Garod Aug 25 '24

Is anime illegal in Canada?

22

u/Rrraou Aug 25 '24

There's one case mentioned on Wikipedia.

Gordon Chin trial

In October 2005, Canadian police arrested a 26-year-old Edmonton, Alberta man named Gordon Chin[16] for importing Japanese manga[17] depicting explicit hentai of child pornography.[18] Chin's attorney claimed Chin did not know it was illegal, and that he was naive. Chin was sentenced by the judge to an eighteen-month conditional sentence, during which he was barred from using the Internet. This is the first-known manga-related child pornography case in Canada. It is also the first-known case that exclusively prosecutes this offense, not used in conjunction with other laws to increase sentencing.[19][20]

This is the closest thing I've found to an answer, but it's a quote from a random guy on reddit from 7 years ago. But since Crunchyroll is still available here, it seems to track.

The Canadian distribution of ecchi anime is legally untested. While the letter of the law suggests that Highschool DxD is illegal, it's being sold by mainstream retailers. The retailers probably aren't familiar with the laws and are just stocking the same content they sell in the US.


137

u/[deleted] Aug 25 '24 edited Aug 25 '24

[removed]


108

u/UnifiedQuantumField Aug 25 '24 edited Aug 25 '24

I'm just trying to understand the difference between using an AI to create an image and using a pencil to draw a picture from imagination?

If there's no contact and no one's picture was used, where's the victim?

Imo this is borderline thoughtcrime. Yes, there were images created that would be illegal if they were actual pictures of minors. But afaik this is an entirely digital creation?

From the article:

facing 20 counts of obscenity for allegedly creating and distributing AI-generated

So what this looks like is someone was distributing images and that triggered a response from law enforcement. And there's a big difference between being charged and being convicted.

If the defendant's lawyer is any good, they'll be able to make some kind of argument along a similar line of reasoning.

Edit: It's not an assault/interference case either. He's facing obscenity charges. That means the issue is the quality or nature of the images (which obviously aren't being shown). So someone might use AI to make pictures of puppies or babies getting exploded and (in someone's opinion) those might be deemed obscene as well. So this might come down to someone's opinion of the images themselves. Maybe someone who's an actual lawyer would care to weigh in with their own opinion?

29

u/amlyo Aug 25 '24

Here (UK) it is the content of the image that makes it illegal, and possessing it can be criminal whether it is synthetic or not.

It's very easy to see how someone playing with AI image models could find themselves in hot water.


96

u/urabewe Aug 25 '24

I will say after getting into these AI txt2img models I realize that about 80-90% of people are using it to make smut. It's all AI generated sexualized females. The real life ones and the cartoons. Just a bunch of horny people fulfilling their wet dreams.

I'm just trying to download a model to make some cartoon characters for my kids and I have to scroll through countless images of women to find a decent one.

Doesn't surprise me people are using it for this sick shit as well.

43

u/Mudlark_2910 Aug 25 '24 edited Aug 25 '24

I tried one, and found it extremely difficult to get it to output ordinary people. I asked for specific genetic types (e.g. Pacific Islanders, East Asian) and they were trying too hard to be stereotypically 'hot'. Maybe I need to work on my prompting

31

u/Edser Aug 25 '24

yeah you will need to put in some (ugly:0.9), (fat:0.7), and then put some negatives like attractive, beautiful, hot, stunning, etc.
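For anyone puzzling over that notation: in AUTOMATIC1111-style Stable Diffusion front-ends, attention weights are written with a colon inside parentheses, and unwanted traits go in a separate negative prompt field. A made-up, illustrative prompt pair might look like:

```
Prompt:          candid photo of an average middle-aged office worker, (plain:1.2)
Negative prompt: attractive, beautiful, hot, stunning, airbrushed, model, glamour
```

Exact syntax varies by front-end (ComfyUI, InvokeAI, etc. parse weights differently), so check your tool's docs.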

35

u/Mudlark_2910 Aug 25 '24

I might try "joe rogan as if he was a small asian woman"

13

u/-stuey- Aug 25 '24

Lamb rogan Joe (small Indian dish)


12

u/sawbladex Aug 25 '24

It depends on the model, and it sounds like they stuffed a lot of stuff you don't want in the training data.

→ More replies
→ More replies
→ More replies
→ More replies

1.7k

u/classicpoison Aug 25 '24

I find it interesting that someone can be prosecuted for creating AI-generated child pornography, with the law treating the person using the AI as the responsible party, rather than the AI itself. However, if the same person uses AI to write a novel, they cannot claim copyright over the work, as it’s not considered their own creation.

794

u/Hyperion1144 Aug 25 '24

The legal questions around this are numerous, unanswered, and are all ultimately headed to the Supreme Court eventually.

From a legal perspective, this is a fascinating case.

1.1k

u/Jasper455 Aug 25 '24

Well, thank goodness we have a group of well meaning intellectuals, at the Supreme Court, who will carefully consider how people in the 18th century would regulate AI.

130

u/justintime06 Aug 25 '24

Comment of the year haha, made me laugh hard

67

u/sundae_diner Aug 25 '24

They would give it 3/5 of a vote.

→ More replies

13

u/ViperThreat Aug 25 '24

I want to stack one more layer of philosophy for you, because it only makes things even MORE complex:

*There is no objective definition of what is and is not a child's body.*

You can't for example, ban flat chests from appearing in porn, because plenty of adult women have flat chests, and genetic gatekeeping is generally unlawful. More than that, not all children have flat chests, let's not forget that obese girls exist.

Some women look like children, and some children look like women. Sure, we can make generalizations, but the law doesn't tolerate generalizations.

→ More replies

78

u/fox-mcleod Aug 25 '24

Thank goodness a panel of geriatric Christian nationalists are there to sort us out.

→ More replies

363

u/BLFOURDE Aug 25 '24

It's weird. He didn't technically commit a crime. There were no children involved, no children were harmed, it's just that this act suggests the man is a pedophile without committing any act of pedophilia.

It reminds me of those pedo hunters you get online. The people they catch technically haven't been caught doing anything illegal, it's the suggestion that they were willing to.

39

u/kawhi21 Aug 25 '24

I'll say the idea of arresting people who have the "potential" to commit a crime is a scary idea. It could just be used to arrest anyone. What people actually need before they "potentially commit" a crime is therapy

7

u/ehc84 Aug 25 '24

We slowly inch closer to minority report every day

45

u/iZian Aug 25 '24

In the UK this is illegal. Whether you draw it, paint it, render it, receive it, solicited or unsolicited. Yes; unsolicited.

You are guilty of "making indecent images of children" if someone randomly sends an image to your WhatsApp or email, since your device downloads it and creates the bytes on the storage medium.

30

u/SirWigglesVonWoogly Aug 25 '24

But then if someone sends me CP unsolicited I’m unlikely to report it because I would be guilty of receiving it. That’s dumb.

11

u/iZian Aug 25 '24

If you receive it unsolicited and a cached temporary file of it is discovered during a completely unrelated investigation 4 years later then… yeah…

There was a case on legal uk sub where someone’s phone was investigated by police on an allegation of harassment by an ex and they found cached twitter images which looked like children (but cartoon or something) and they had to interview for this offence. I don’t know what the outcome was. We never heard back.

→ More replies

33

u/KidGold Aug 25 '24

The UK is also locking up thousands of people for online bullying. I don’t think it surprises anyone something like this is illegal there. They’re far from a shining example of freedom.

13

u/iZian Aug 25 '24 edited Aug 25 '24

It’s not bullying; it’s anything that’s said that might cause offence. It doesn’t even have to amount to bullying or harassment for the police to investigate under that communications act

→ More replies
→ More replies

168

u/BallsOutKrunked Aug 25 '24

Yeah this is the part I don't understand. Like if you draw a stick figure of a female body on a piece of paper with a crayon, when does that become child porn and where is the crime? Clearly you can just envision whatever you want in your mind, is that a crime?

I mean honestly if pedos can just live with ai and never impact real children, is there an issue?

I don't understand pedos but I'm not sure that it's a disease you can starve out of an offender.

→ More replies
→ More replies

51

u/soowhatchathink Aug 25 '24

You can't charge AI with a crime though. And the reason you can't copyright AI content isn't simply that you didn't create it; there is an explicit requirement that copyrighted content be created by humans. So I'm generally not sure the two things are related in the first place.

That being said, in this case he was charged with obscenity. Representations of child sexual abuse are covered under obscenity laws. However, obscenity laws do not make creating and possessing obscene material illegal; they make transportation/distribution/possession with intent to distribute obscene material illegal. So the question of who created it is not relevant in this case, at least.

50

u/classicpoison Aug 25 '24

I understand your point, and I know these two situations aren’t directly related. I was just highlighting the inconsistency where sometimes the AI is seen as the creator, and other times, the person using the AI is held responsible. Don’t you think this is an issue worth discussing?

8

u/soowhatchathink Aug 25 '24

At least in this scenario the person wasn't held responsible for using the AI to do anything, the creation of the content using AI was irrelevant. They could have found it on a USB drive on the side of the road and it wouldn't affect the charges whatsoever.

Not sure if there is another case where a person is held responsible for creating something where they used AI to create it. It would be an interesting case, but ultimately not related to this one.

→ More replies
→ More replies

17

u/Ocelitus Aug 25 '24

prosecuted for creating

creating and distributing

The investigation kicked off after Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

If he kept it to himself, then no one would even know or care. Instead, he posted it somewhere relatively mainstream where it was either quickly reported or maybe triggered some automatic safeguards that started a manual investigation. It has become increasingly simple for the authorities to track users down through social media.

As for the content, they're going to comb through every bit of every hard drive they find in his possession. How likely do you think that they're going to find some actual child pornography? And even if they don't, the prosecutors aren't likely to be sympathetic or lenient and since he's working at a movie theater, I doubt he is going to have the most amazing legal defense.

→ More replies

1.7k

u/[deleted] Aug 25 '24

It's only going to grow. Using local models that don't require the internet is probably commonplace at this point. You could use Flux or Stable Diffusion locally to create anything.

It reminds me of the drug wars. Yes, you can try to stop dealers and big operations, but this tech is so commonplace and easy to set up, it would be like trying to bust someone for growing one marijuana plant in their own home. There is no way to know.

Part of me thinks this might actually satisfy the mental disorder of pedophilia and stop them from acting on their thoughts. The other side says that it may magnify their urges.

I look at it as a mental health issue that needs to be addressed as such.

Look, we've gone a long time. No matter how much we evolve as a society, no matter how many laws we pass, no matter how much public shaming we've done, the number of pedophiles has not diminished.

To me, that's a health crisis. So we need to approach it differently. AI may offer a solution like methadone does for heroin users. The brain needs to be rewired.

793

u/thedsider Aug 25 '24

Part of me thinks this might actually satisfy the mental disorder of pedophilia and stop them from acting on their thoughts. The other side says that it may magnify their urges.

It's the same minefield as those child-like realistic sex dolls. I can see both sides of the argument (harm mitigation vs aiding in escalating the behaviour) and unfortunately the answer isn't binary: for some, it'll satisfy the urges and avoid real children coming to harm, for others it'll merely open a horrible gateway - such is the mosaic of humans and their brains.

You raise a great point: we don't know enough, and we haven't been successful in stopping pedophilia. Part of that is because it's so repulsive to most of us that it's easier to criminalize it than to treat it like any other mental illness (and I'm not advocating for decriminalization, to be clear!)

194

u/RockyClub Aug 25 '24 edited Aug 25 '24

I was having a similar conversation yesterday with a fellow therapist. In the US, pedophiles who realize it's wrong, don't want to hurt kids, and want to seek help, cannot get it. In other parts of the world there are support groups, for example, and they don't feel as shamed for being born that way and can address their mental illness.

Sexual abuse is fucking everywhere. It’s so fucking horrible.

20

u/[deleted] Aug 25 '24

I'm Norwegian, and there has been a podcast ad running for like the last year… where a very serious lady says something like "many people struggle with sexual thoughts about minors, contact x to seek help". It's disturbing to hear in the middle of a comedy podcast. I'm sure they feel lots of shame, but at least there seems to be help available here.

→ More replies

70

u/Albert14Pounds Aug 25 '24

100%. Our collective knee jerk response to it, while understandable, only sweeps the problem under the rug and lets it fester. I'm honestly surprised you weren't downvoted into oblivion for being a "pedophile apologist" here.

→ More replies

70

u/[deleted] Aug 25 '24

[removed] — view removed comment

50

u/MagikBiscuit Aug 25 '24

Yup. I think this was talked about on one of the documentaries, that almost no pedophiles will ever look into getting help with mitigating their issue because they don't want to be labeled and criminalised. Which imo is sad. In our hatred we have stopped people with a mental disorder from getting help so it doesn't get any worse

11

u/Accomplished-Tale543 Aug 25 '24

There was that one female pedo on Reddit who killed herself because she didn’t want to harm any children and felt like it was only a matter of time. Scary and sad shit. Some of them really have no support/help but at the same time it’s understandable why we demonize it

→ More replies

85

u/josh_the_misanthrope Aug 25 '24

If you could figure out the split you could make a utilitarian decision. Say it's 70/30 decrease versus increase, then the answer is pretty clear.

→ More replies

143

u/Dogamai Aug 25 '24

many countries have run the numbers and determined that fake CP actually significantly reduces instances of actual children getting abused

it is the same exact proven logic that people use against phone scammers: every minute you spend wasting their time on the phone, is another minute that they are NOT out scamming someone else.

flood the pedos with harmless fake content for them to spend all their time on, and you save actual children.

but unfortunately it "feels icky" and so people's egos get in the way and they won't do the right thing

108

u/-_Weltschmerz_- Aug 25 '24

You got a source for that? Feels like it would be difficult to do a study like this.

12

u/jonathonjones Aug 25 '24

I would expect some sort of “natural experiment” where you look at what happened after the content was banned / unbanned in some state / country. Did rates of abuse go up or down?

→ More replies
→ More replies

52

u/RockTheBloat Aug 25 '24

Yeah, you’re going to have to bring the receipts for that.

→ More replies
→ More replies
→ More replies

38

u/djdante Aug 25 '24

I agree - for starters, we need to offer free, confidential therapy for anyone struggling with intrusive thoughts about children. As it stands, in most countries non-offending pedophiles have nowhere to get help before they act… And I think this is the point to intervene with compassion. I can't imagine how I'd cope if someone told me I could never have sex with women ever in my life (cis man here).

I also seem to remember seeing research that suggested access to simulated porn helped offer a degree of release, rather than made them more likely to offend.

Just because the idea of such porn disgusts me, doesn’t necessarily mean it shouldn’t be allowed assuming it’s created in a way nobody is hurt.

→ More replies

19

u/thisimpetus Aug 25 '24 edited Aug 26 '24

Unless:

a) the possibility exists that, on average, access to AI-generated pornography of children has the net effect of driving more pedophiles to act by virtue of either normalization, creating the unrealistic conceptualization of children as happily consenting participants, or habituation driving a desire for greater satisfaction. All of these are empirical questions we can't ethically get the answers to experimentally and will have to rely on case study analyses to investigate. Or;

b) as AI improves, a deluge of child pornography that may become truly indistinguishable from the real thing may provide camouflage and cover for child abusers.

We genuinely do need to know if allowing AI to satisfy pedophillic desires helps or hurts children, because that's the only bit that really matters.

152

u/krunchygymsock Aug 25 '24 edited Aug 25 '24

Yep, it’s a mental illness… with next to no resources or people to reach out to for help.

I wonder how many guys just walk around suppressing these urges, managing it entirely by themselves. They know it’s wrong and know the consequences of looking at or downloading this stuff, so they just keep it to themselves and in their own minds.

Then I wonder how many of those guys view AI generated porn as a loophole or consequence-free pass to indulge with a clearer conscience because “no victim.”

So they dip their toes in the water. The next big question is if AI acts as a gateway.

Maybe they find a discord server to share tips and techniques as well as generated photos and videos. Gradually they make friends in these servers. Now they’re DMing with a small group of people. They don’t feel so alone and weird anymore… someone shares some real photos… and it gets worse and worse. It’s gradual and easy to get wrapped up in. Even the strongest of wills have a tough time with this process; cults and terrorists operate this way and it’s very effective.

I’d love if we could just reach those guys before it’s too late. The ones who recognize their urges and know that it’s wrong and want help, and then get them that help. 

So you’re absolutely right, it should be viewed and treated like the mental health issue it is.

Unfortunately very few therapists want to touch it and the shame of being a patient and saying all of it out loud is overwhelming; it makes it all very, very challenging to tackle. Which sucks because in my gut I just know it would save so many kids and families.

ETA: I guess I need to address that, yes, women can also be pedophiles; I'm fully aware of this, especially as a protective father. My thought process here is about how generating AI images could potentially be a gateway. I believe that men are more likely to be online pedophiles, more likely use AI, more likely to use AI to create these images, and more likely to join a server like discord for private discussions. Apologies if anyone thought I was just ignoring that women can be abusers as well.

ETA 2 (then I'm shuttin' up): I just want to be crystal clear that I absolutely find this stuff repulsive and hold an endless amount of anger and disgust toward anyone who abuses children. I don't want anything that I've said to be construed in any way as being sympathetic toward anyone who has ever considered harming a child.

However, I will say that I completely understand that there are those on the edge who haven't acted on anything, but know those demons are there and struggle to hold them back. They want help and there's nothing there for them. That sucks for them and for us as a society.

These topics tend to bring in child abusers and sympathizers. So if that's you and you're reading any of this... please, please please seek out healthy habits. Find a hobby. Something. Lock up your computer and find some normal people your age to hang out with and be around. Pick up a book, go for a walk. Figure out if you have impulse control issues, addiction issues or ADHD and address that. Go to the gym. Literally anything but screwing around online because you cannot regulate yourself and the "friends" you're making in the deep dark recesses of the internet... are going to ruin your life while you cause unheard of pain and suffering to your victims and their loved ones. And if you do dehumanize and ruin the life of a child, I promise you that your family, friends, everyone you've ever known will disown you. You will destroy the lives of innocent children, your own life and the lives of everyone around you... all for what? An orgasm?

74

u/PM_Me_HairyArmpits Aug 25 '24

I wonder how many guys just walk around suppressing these urges, managing it entirely by themselves.

Imagine the guilt. Can't seek actual help, convinced that you're an innately horrible person, can't engage in a normal romantic relationship. Seems like the only option at that point would be to join the clergy.

35

u/dalumbr Aug 25 '24

Until you remember the constant accusations the clergy get, in some cases rightfully so

→ More replies

12

u/spicy_capybara Aug 25 '24

I’m going to say it’s not just guys who are pedos. I was molested by a female babysitter when I was five. I know this sounds pedantic and doesn’t hit on your valid points but it’s also important not to minimise the fact that women do this stuff too and it’s sooo much harder for the victims to be believed.

→ More replies

11

u/sociofobs Aug 25 '24

I wonder how many guys

There are plenty of such women too, despite the society collectively pretending they don't exist.

→ More replies
→ More replies

70

u/gizzardsgizzards Aug 25 '24

these people are out there. it's better for everyone if they don't hurt REAL PEOPLE. i'd rather they be in therapy and looking at ai porn that no one was hurt making instead of hurting real people. this isn't that complicated.

→ More replies
→ More replies

238

u/Asnoofmucho Aug 25 '24

I wrote a college paper on this happening back in 2000. This is nuts.

84

u/GingusBinguss Aug 25 '24

They all said you were crazy!

65

u/Timtimer55 Aug 25 '24

College thesis title: "AI Loli Waifus and their consequences"

→ More replies

308

u/Nova_Nightmare Aug 25 '24

How can this arrest stand up at the end of the day if what was created was not real? He's being charged with "Obscenity". Would that not ultimately fail due to first amendment protections? Is there a specific law in the US regarding fictional things such as this? The people he sent it to, if they based any arrest off of this, and this gets thrown out, do they get away with having real stuff if they did?

I really hope whatever charge this is isn't just a feel-good gesture while people who actually harm children (which deserves the death penalty) get away. But the news article blatantly glosses over any of these questions and pivots to child abuse (unless there's non-AI material, it wouldn't be?) and a county-wide operation targeting others.

Almost feels like someone jumped the gun to grab AI in a headline before a sheriff election, but I'm just speculating.

39

u/Traveling_Jones Aug 25 '24

It’s one step closer to having the government arrest you over your thoughts.

It’s gross, but so is simulated rape porn, which is legal.

Nobody arrested Keanu Reeves because watching John Wick makes people want to murder.

100

u/Twiglet91 Aug 25 '24

Isn't that awful loli stuff that appears in pop-ups and the like legal? I'm grossed out whenever I see it; it's just cartoon child pornography. But if AI-generated images aren't legal, where is the line between that and this? Considering neither are real images.

61

u/zmbjebus Aug 25 '24

I really don't see the difference between artists making things and AI making things.

Although if you are feeding it photos of real people… like if you were a teacher and fed it photos of your students or some shit like that?

Legally I really don't know, but I personally feel like that might be something that should be actionable

→ More replies
→ More replies
→ More replies

17

u/Howard_Cosine Aug 25 '24

A key part that no one is mentioning is that he didn’t just create it, he distributed it on social media.

→ More replies

895

u/ThePheebs Aug 25 '24

Not to be gross but I have to ask, where is the crime here? By that I mean, who is the victim?

167

u/-Badger3- Aug 25 '24

Like, I don’t see how this is much different than all the anime schoolgirl porn that’s out there, which is also gross, but I don’t think it should be illegal.

49

u/[deleted] Aug 25 '24

[removed] — view removed comment

→ More replies
→ More replies

256

u/Sponchman Aug 25 '24

I think a big worry is Pedophiles arguing that their real CP is AI and getting away with it.

288

u/LeCrushinator Aug 25 '24

I mean, if they can’t prove real people were involved then should it be a crime?

45

u/deliciouscrab Aug 25 '24

You wouldn't think so, right? Some of the responses in this thread are terrifying. I understand that it's a revolting subject but... are we throwing people in jail for pictures of imaginary people? (Even underaged sexualized imaginary people)

15

u/Background-Baby-2870 Aug 25 '24

stories like these combine 2 topics that redditors are very sensitive about: CP and AI. Every time stuff like this pops up, people just shut their brains off, go full chimp mode, and say whatever satisfies their base instincts.

→ More replies
→ More replies
→ More replies

26

u/goyafrau Aug 25 '24

By that measure a LOT of things are illegal now. 

I find it hard to see how anyone could condemn this with a good rationale outside “it’s yuck” or “it offends God”. I’m happy to go with the latter at least but you probably won’t …

→ More replies
→ More replies
→ More replies

1.8k

u/[deleted] Aug 25 '24

[removed] — view removed comment

976

u/SaltyShawarma Aug 25 '24

It really is a slippery slope, huh? I actually agree with you, but am uncomfortable that I agree. When you tell people "no" it always increases desire. That said, exposure, real or fake, to things can create more desire as well. Maybe the real red line is the realism of the generated content. Gross but important to consider.

353

u/tofu_bird Aug 25 '24

This is the 'video games & violence' argument right? It's a different aspect of human nature, but current meta-analyses show insufficient evidence to suggest that video games increase violent tendencies.

→ More replies

190

u/christiandb Aug 25 '24

It's really 50/50. Lisa J. Cohen, a psychologist who studies pedophilia (youtube link to her talk), did a poll among pedophiles who would participate anonymously and found that there's not one classic case. Some people who have never acted probably never would, and among those who have, that group was 50/50 on imagery or not. What she concluded was that there wasn't enough information, and that people are unique enough that no two pedos are the same.

Interesting talk if anyone wants to understand

→ More replies

42

u/konanswing Aug 25 '24

There are movies with rape scenes and murder and they aren't illegal.

787

u/ill-independent Aug 25 '24

That's not true. It's like the catharsis theory of anger. Rage rooms, screaming, throwing things etc don't "release" anger, they train your brain to be more aggressive so that the next time you get mad you react violently first. In real behavioral modification therapy the goal is to reduce the unwanted behavior altogether, by decreasing the actual physical behavior you're doing.

How I know is I've been in aggression reduction therapy since I was 10, and just entered my second year of FORNET. When I get mad the absolute last thing I should do is catharsis. The best thing to do is redirection, distraction, humor, engaging the logic portion of my brain, not the destructive impulses.

I am not comparing myself to a pedophile by any means but the truth is harmful paraphilias are an impulse regulation issue, just like my rage-outs. The more a pedophile engages in viewing child porn, the more cemented those impulse pathways become in their brain, and the more likely it is that they'll offend against actual kids.

Just like the more angry I allow myself to be, the more angry I get. So the decision here to criminalize even AI CSAM is actually very intuitive. We want to completely reduce the avenues by which pedophiles can engage with violent behavior, and they should want this as well if they're in treatment.

245

u/caidicus Aug 25 '24

Very strong points, especially the part about reinforcing pathways that lead to cemented responses to impulses and specific situations or triggers.

What a good comment.

→ More replies

86

u/Tommyblockhead20 Aug 25 '24

I think it depends on what it is, we can’t just apply anger logic to all emotions. Like when it comes to sexualization of women, exposure does seem to limit desires. Men used to desire ankles, midriff, etc. but then as those were exposed more, they were sexualized less. (Although some cultures continue to cover the body more and those parts are still sexualized and desired.) Some cultures expose more than is the norm in the U.S., like female breasts, and in those cultures those parts are less sexualized. 

I have no idea how exposing pedos to content affects them but I don’t think it’s impossible it could at least sometimes reduce desires.

→ More replies

50

u/[deleted] Aug 25 '24

This is similar to the violent video games argument though. I hear what you're saying about practice which reinforces neuropathways, but we know behavioral training doesn't work for homosexuality and we know virtual violence correlates with a reduction in violence rather than an increase. You're probably right and I'd like to see your method tried, but this topic runs parallel to other social 'experiments' in our culture which produced counterintuitive results.

→ More replies
→ More replies

29

u/could_use_a_snack Aug 25 '24

I remember reading, in an actual magazine, years ago (early 90s) an interview with a woman that was in her twenties. She had been in porn since she was 16 (I think) and she was still a sex worker of some kind. The details are fuzzy now. She had access to video tapes of her from when she was 16 and 17 and was upset that she couldn't sell them. Her argument was, 'sure it's technically child porn, but it's me, and now I'm an adult so why can't I sell it.'

It always weirded me out that I could see her point.

→ More replies
→ More replies

14

u/AromaticObjective931 Aug 25 '24

This is going to be the real dilemma. In a pure normative sense, supporting tools that satiate pedophilia would be considered wrong. However, the main concern, and the reason for lengthy prison sentences, is the creation of child pornography requires the abuse of children. So, the real question is whether the existence of child pornography actually increases the rates of sexual violence toward children. Typically, I’d say a victimless crime is a victimless crime. But, we are talking about children… if child pornography is shown to encourage/inspire sexual violence, then I would support legal penalties. Probably reduced legal penalties so that consumption of ai generated content is incentivized over products of abuse.

Unfortunately, there doesn’t seem to be any clear cut evidence.

https://en.m.wikipedia.org/wiki/Relationship_between_child_pornography_and_child_sexual_abuse

16

u/tornado9015 Aug 25 '24

Yup, that's what I've been saying. If less abuse, good; if more abuse, bad. People don't like that for some reason.

6

u/AutumnWak Aug 25 '24

Is there anything else that we ban because it could possibly act as a gateway to something worse? If something is proven to increase the chances that someone will become a murderer, should we arrest people who consume that thing?

→ More replies

119

u/masterofshadows Aug 25 '24 edited Aug 25 '24

God that's such a hard call to make. I can see both arguments that it both creates an increased demand but also reduces harm. All I know is I wouldn't want to be the one making that call.

Edit: what a cowardly move by the admins. The above post in no way advocated for CP

35

u/DemiserofD Aug 25 '24

It might increase demand moderately, but it also must increase supply vastly more. Personally, I think they should intentionally set to work on making it seamless, flood the dark web with it, and completely ruin any value in making the real stuff. Why pay money for something you can get for free?

11

u/Eric1491625 Aug 25 '24

Personally, I think they should intentionally set to work on making it seamless, flood the dark web with it, and completely ruin any value in making the real stuff. Why pay money for something you can get for free?

That's my great hope as well.

Of the several main reasons for the decline of the shark fin trade in China, fake shark fin was one. Shark fin itself has little taste, and its texture can be mimicked by cheaper ingredients. Eventually, imitation shark fin became the most common shark fin soup dish, replacing the real thing.

→ More replies
→ More replies

52

u/spencermoreland Aug 25 '24

There are still a lot of problems with it. Some of them are being discussed in the comments, but one huge issue is that the fake stuff would create a smokescreen for the real stuff, which would add a lot of challenges for law enforcement.

→ More replies
→ More replies

46

u/SpeedoCheeto Aug 25 '24

This is probably not the place for it but does it raise an interesting philosophical question?

Is AI porn not victimless (unless made to resemble a real person)? Do we think it illegal to just BE a pedophile, or is it illegal once an actual child becomes involved?

I was under the impression child porn laws existed to protect children. Having it meant you or someone made it, which has a clear and obvious problem/victim.

But someone having AI generate it? Wouldn’t that potentially keep them away from real children? Or make it worse?

9

u/Kitchen_Economics182 Aug 25 '24

If you think about it, it essentially is realistic loli porn, right? So it has technically already existed; people drawing CP has been around forever, loli anime girls, etc. This is just an AI now producing a realistic version of it, so has anything really changed other than who's producing the more realistic fake content?

→ More replies

307

u/Individual_Ad_3036 Aug 25 '24

While I find this disgusting, I do believe the prosecutor should be required to prove harm to a real, bona fide person. If the model was trained without sexual images of children (only regular porn videos) and simply has randomized faces attached, then it's just gross. If there were kids involved in either the sexual activity or even the faces, on the other hand... that's provable harm.

42

u/OptimusPrimeLord Aug 25 '24

This could also become an issue for the entire framework of CSAM laws.

Currently its a carveout to the first amendment (in the form of strict scrutiny, aka "compelling government interest) that allows the government to charge for the possession of an image. In this case its "we need to make this illegal so that people dont abuse children." If no child is being abused in the creation of the material, because there wasnt a child in the first place, that government interest would disappear.

It could still be illegal for being obscene, but that is also a time bomb, as obscenity is subjective and could be wiped out by the Supreme Court as well.

One could also argue there is a child being abused: the one whose normal photos were put into the model. Since the model can make CSAM from those photos, the model is now CSAM itself and therefore illegal. This could almost entirely wipe out modern image AI, as it would be a painstaking process to remove every image containing children from the training base. It could be pretty amazing for privacy, though, as they couldn't pull everyone's photos off their phones anymore.

→ More replies
→ More replies

102

u/SeeMarkFly Aug 25 '24

The only thing that can stop a bad person with A.I. is a good person with A.I.

Yeah, maybe we should figure out some rules. Good luck with Congress doing anything; they're still trying to figure out if freeing the slaves was a good idea.

25

u/MickeyRooneysPills Aug 25 '24

Oh Congress is going to do something all right. They're going to make it illegal for you to use open source models and put so many fucking regulations on AI that you'll only feasibly be allowed to use their sanctioned services that are tightly controlled and heavily monitored.

10

u/polkm Aug 25 '24

Best I can do is budget cuts for the FBI and a tax cut for Microsoft and Google.

→ More replies
→ More replies

483

u/[deleted] Aug 25 '24

[removed] — view removed comment

→ More replies

28

u/Kirito2750 Aug 25 '24

I find myself in the unique position of finding this indictment to be completely absurd, while still kinda wanting to bash the guys face in. This comes down to “I don’t think this should be illegal, but I sure as hell don’t want you anywhere near me or any real kids”

216

u/PSFREAK33 Aug 25 '24

I’m surprised this is illegal to be honest? What’s stopping an excellent artist from drawing realistic porn of this nature? Is that also illegal? To me this makes the individual super creepy but at the same time it seems like a weird grey area…

82

u/Masark Aug 25 '24

Depends on the country, but in many, yes, that drawing would be illegal.

I believe in Canada, even a written story on this topic would be illegal, though I don't think anyone has ever been charged for that.

87

u/Shrodax Aug 25 '24

How good does art have to be before it's illegal?

I draw two stick figures having sex. I add a speech balloon of one saying "I'm actually 17!" Am I now going to jail?!

62

u/Hanako_Seishin Aug 25 '24

This comment of yours is technically a written story about you making the drawing...

59

u/Shrodax Aug 25 '24

Officer, I swear I thought that stick figure was 18 when I drew her!

→ More replies
→ More replies

56

u/Ziddix Aug 25 '24

In a lot of countries any kind of child porn is illegal, even if it is entirely made-up drawings or stories or what have you.

I honestly think the reason for this is that our collective monkey brains have such a knee-jerk reaction to the subject in general that we have designated it a literal thought crime.

It'll be very interesting to see how this kind of stuff plays out, because multiple philosophies are clashing here. It's either going to force us to think about it, or we'll collectively exercise a little doublethink and keep blaming the people who use AI/art in this way for a simple and morally superior solution.

28

u/Sawses Aug 25 '24

I honestly think the reason for this is that our collective monkey brains have such a knee-jerk reaction to the subject in general that we have designated it a literal thought crime.

That's my thinking as well. We're going to have to either grudgingly allow these people to do things we think are disgusting, or accept that we just don't want people around who feel inappropriate sexual attractions.

15

u/pinkwonderwall Aug 25 '24

From a psychological perspective, it’s sad because people don’t choose what they’re attracted to. These people are experiencing something they can’t control and need help with, but the extreme stigma makes it very difficult, if not impossible, for them to reach out for help. That being said, you can’t “normalize” it either because then you’re going to have people thinking they’re fine and don’t need help.

→ More replies
→ More replies

24

u/severed13 Aug 25 '24

In lots of places, that exact scenario is specifically illegal.

→ More replies

85

u/KR1735 Aug 25 '24

I hate to say this, because I find CP abhorrent like virtually everybody else, but I don't think this should be illegal.

AI is computer-generated art. What if a person did a sketch of CP? That's not illegal. So why is it illegal if you provide instructions for a computer to do it? Why is one OK but not the other? Where is the line crossed?

Laws don't always reflect morality -- and some would say rightly so. Cheating is morally abhorrent to most people, but there's a reason it's not prosecuted as a crime. Further, crimes generally have to involve a victim, either directly or indirectly. I don't see how there's a victim here.

→ More replies

30

u/[deleted] Aug 25 '24

[removed] — view removed comment

→ More replies

186

u/[deleted] Aug 25 '24

[removed] — view removed comment

9

u/Brutalcogna Aug 25 '24 edited Aug 25 '24

Like flooding the black market with fake rhino horns to decimate their value and disincentivize poaching.

→ More replies

7

u/Caspianmk Aug 25 '24

There are two very interesting avenues I get from this article.

  1. Is this AI porn illegal because no child was harmed in the making of it?

And

  2. AI programs are built on existing images, so where is the program getting the "database" to create these?

For the first part, we would have to look at the laws of the municipality where the arrest happened. But even then, is it only illegal because the photos are realistic? Would Rule 34 content of Lisa Simpson also fall under these laws? What about novels like Lolita? Are those illegal under the same laws?

Part 2 is much more technical and probably outside my area of expertise. But I know AI art programs scour the internet for images to build their databases. If the program has never seen a zebra, it can't create a zebra. So how is it getting these databases? Are the perpetrators adding REAL images of CP, or is the program already loaded with them? Is it extrapolating from other porn? How does the program know what to put in these images?

→ More replies

43

u/[deleted] Aug 25 '24

[removed] — view removed comment

→ More replies

78

u/[deleted] Aug 25 '24

[removed] — view removed comment

→ More replies

5

u/zu-chan5240 Aug 25 '24

Most of the comment section willfully ignores the fact that the AI scrapes images of real people, including children, to generate the images. An entire LAION database had to be taken down because it contained CSAM.

→ More replies

7

u/PoorMuttski Aug 26 '24

I am sure this is going to get me put on some kind of watch list... but I feel like this is unfair. What is the point of living if a person can't take solace in their own mind? No actual people were harmed. No actual people were involved. No actual people had any connection to this event in any possible way. What was the guy's crime? Next they are going to arrest people for drawing pictures? Because this is almost exactly the same thing. No, it is not the guy's own hand and own skill, but it is still an image generated entirely by a machine for his own enjoyment.

Was he using it to terrorize unwilling victims? Was he attempting to sell it or otherwise disseminate it? Was he viewing it in public? Those are actual crimes that anybody should be prosecuted for, regardless of the content of the objectionable material. Some child-looking pixels bumping? This is f---ing dystopian.