
More AI Student Nudes Showing Up At High Schools

We’ve seen this issue crop up before, to the consternation of parents, educators, and law enforcement, and unfortunately it doesn’t appear to be going away any time soon. Parents of students at Laguna Beach High School in Orange County, California, were informed last week that AI-generated nude images of female students at the school were making the rounds, and some of the girls had learned of them and were understandably distraught. An investigation is ongoing, but little information has been released. This is taking place not all that far from a school in Beverly Hills where the same thing happened last month, and other incidents have been reported elsewhere. But can anything be done that will prove effective in stopping it? (CBS News)

Laguna Beach High School leaders with the help of Laguna Beach Police are investigating reports of nude photos of students created with artificial intelligence being distributed to the student body.

The photos might have been shared through text messages, and parents were informed about them last week.

Principal Jason Alleman sent a letter to parents Monday about the investigation into the photos, saying such incidents can have far-reaching negative impacts on the campus culture.

“These actions not only compromise individual dignity but also undermine the positive and supportive environment we aim to foster at LBHS,” Alleman said in the letter.

Having things like this happen, particularly when children are involved, is beyond disturbing, and we shouldn’t minimize the trauma it can inflict on young girls. Those responsible should be held accountable, assuming they can be identified. With all of that said, however, this is a very tricky situation with few easy solutions on offer. Let’s first keep in mind the fundamental nature of post-pubescent children, particularly high school boys. Their minds obviously turn to the topic of girls, and this sort of material acts like a powerful magnet for many of them. That’s not intended as an excuse. It’s just the reality of the situation.

At this point, AI image-generating tools are freely available on multiple platforms, and you don’t need any programming knowledge to use them. While not perfect, the images they generate are growing ever more realistic; many are indistinguishable from real photographs at a casual glance. Most high school students now have cell phones (and therefore cameras) along with social media accounts, for better or worse, and they are constantly taking selfies and pictures of their friends and posting them. It’s only a small step from there to some boy (because let’s admit it… it’s almost always the boys) clipping out a female classmate’s face, feeding it into an AI program, and asking it to generate a nude or pornographic image. Once that image is shared a single time, it’s off to the races, and the girl in question is no doubt horrified that others will see it and believe it’s real.

Finding a way to deal with this is seriously complicated. First of all, it’s still unclear whether any law is even being broken, because we are in uncharted legal waters here. The girl in the example above was never photographed nude, so charges of possessing or distributing child pornography may not apply. The images are not real; they’re simply realistic. And if the faces and backgrounds were taken from photos that the subjects willingly published and made available to the general public, there may not even be a question of copyright infringement or invasion of privacy.

This is one of those cases where most people with any sense of decency can immediately say that something very wrong is going on and that young girls are legitimately being victimized. But our laws are nowhere near catching up with this technology to the point where we can say precisely what is being done that’s illegal. There are clear laws against publishing nude images of children, but in the case described above, no such actual images ever existed. Cartoons depicting people of various ages in different stages of undress, or even sexual acts, have been making the rounds for ages, but those were cartoons, and it was rare for anyone to be prosecuted over them. These are just 21st-century cartoons. The problem is that they are nearly indistinguishable from the real thing. If anyone has a brilliant solution to this conundrum, I’m all ears. But honestly, I don’t see how we address this currently aside from banning all children from having cell phones, other technology, and access to social media. That’s being attempted in places like Florida, but it requires cooperation from the parents. And even there, the restrictions only cover children up to the age of 14 and likely wouldn’t have stopped what happened in Laguna Beach. That horse may have already left the barn.
