by InTrieste
Italian authorities have opened an investigation into a website accused of publishing artificial intelligence–generated pornographic images of dozens of well-known Italian women without their consent, the latest in a string of online scandals targeting women in the country.
The inquiry, led by Italy’s postal police, began after journalist and television presenter Francesca Barra revealed on Instagram that she had discovered explicit AI-generated images of herself on an adult website. Barra condemned the incident as “an act of violence and an abuse that undermines dignity, reputation, and trust.”
“I thought of my children and felt embarrassed and afraid of what they might hear or read if those images fell into the wrong hands,” she wrote, adding that she also thought of “the girls who suffer the same digital violence and who perhaps don’t have the same tools to defend themselves or the strength to fight back.”
The website under investigation, called SocialMediaGirls, reportedly offers pornographic and erotic content using AI software. Among its paid features is “Undress AI,” a service that creates synthetic nude images based on real photographs available online.
Public Figures Targeted
The site includes an “Italian Nude VIPs” section featuring AI-generated images of prominent figures from entertainment, media, and politics. Those named include influencer Chiara Ferragni, sports presenter Diletta Leotta, actress Sophia Loren, journalists Francesca Fagnani and Selvaggia Lucarelli, television personalities Michelle Hunziker and Elisabetta Canalis, singers Elodie, Angelina Mango, and Arisa, and politician Maria Elena Boschi.
Lucarelli, who has been researching the site for weeks, wrote on Instagram that “there are more than 50 well-known Italian women on the site with nudes created with AI,” hosted on a forum with “more than seven million users.”
In an article published Monday, Lucarelli reported that the website also includes videos of women filmed without their knowledge, some allegedly captured by security cameras, hidden cameras, or unprotected household devices such as baby monitors. She suggested that some of the material may have been obtained through hacked servers.
Political Reaction
The revelations have sparked strong political backlash. Licia Ronzulli, vice president of the Senate and a member of the center-right Forza Italia party, condemned the website as “another disgusting and horrifying site that uses technology to rape women.”
“‘Stripping’ a face, a body, a life, without consent using artificial intelligence is not entertainment, it’s virtual rape,” Ronzulli said in a statement. She added that a recently approved law introducing penalties for the creation of deepfake pornography could carry sentences of up to five years in prison.
Senators Raffaella Paita and Daniela Sbrollini, both of the centrist Italia Viva party, also described the case as “unacceptable violence.”
A Wider Pattern
The case follows several recent scandals in Italy involving nonconsensual sharing of sexualized images. Earlier this year, authorities shut down a Facebook group called Mia Moglie (“My Wife”), where users exchanged private photos of women, and a website called Phica, which hosted doctored explicit images of female public figures.
Among those targeted on Phica was Prime Minister Giorgia Meloni, who said she was “disgusted” by the site and demanded that those responsible be punished “with the utmost firmness.” Both platforms were taken offline following public outrage.
According to Italian media outlet Il Post, the investigation into SocialMediaGirls presents greater challenges because it is an international platform, making it more difficult to identify those behind it. Corriere della Sera reported that the domain has been active for over a decade and is configured to ensure anonymity, a hallmark of websites that prioritize privacy for both users and administrators.
The investigation comes amid growing global concern over the misuse of artificial intelligence to create sexually explicit content without consent — a form of digital abuse that, many experts warn, has outpaced existing laws.