Judge calls AI-generated child pornography ‘chilling’
Quebec man sentenced for creating synthetic child porn
Jacob Serebrin, The Canadian Press
A Quebec man has been sentenced to more than three years in prison for using artificial intelligence to produce synthetic videos of child pornography.
Steven Larouche, 61, of Sherbrooke, Que., pleaded guilty to creating at least seven videos with so-called deepfake technology, which is used to superimpose the face of an individual onto the body of another person. He also pleaded guilty to possessing hundreds of thousands of computer files of child pornography, for which he was sentenced to an additional four and a half years.
Provincial court judge Benoit Gagnon wrote in his ruling, issued earlier this month, that he believes this was the first case in the country involving deepfakes of child porn. He said he worries what will happen as criminals use the technology to put the faces of children whose images they find on social media onto videos of other children being sexually assaulted.
“The use of deepfake technology in criminal hands is chilling. The type of software allows crimes to be committed that could involve virtually every child in our communities,” Gagnon wrote in the April 14 decision. “A simple video excerpt of a child available on social media, or a video of children taken in a public place, could turn them into potential victims of child pornography.”
Gagnon wrote that the creation of the new images of sexual abuse encourages the market for child pornography, which craves novelty, and puts children at risk by “fuelling fantasies that incite sexual offences against children.”
While Larouche’s lawyers argued for a lighter sentence because children were not assaulted when he produced the videos, the judge wrote that the children whose bodies appeared in the videos had their sexual integrity violated again.
Many images of child pornography have a digital fingerprint, allowing police to identify them, the judge wrote. By creating the synthetic images, Larouche made it more difficult for police to stop the spread of the illicit material.
Larouche also admitted to possessing more than 545,000 computer files containing images or videos of child sexual abuse, some of which he made available to others.
Among those images was a series of one girl being abused over a seven-year period, between the ages of seven and 14. On Larouche’s computer, police also found photos of her from her social media accounts and personal information about the child, including her real name, the town where she lives and the name of her school.
In total, Gagnon sentenced Larouche to eight years in prison. With credit for time served, Larouche will be required to serve another five years and 11 months.
Canadian law bans any visual representation of someone depicted as being under the age of 18 engaged in explicit sexual activity.
Stephen Sauer, the director of Cybertip.ca, said that while there have been earlier cases of people creating synthetic images of child sexual abuse in more rudimentary ways, such as with Photoshop, this may be the first time someone in Canada has been sentenced for creating material using deepfake technology.
“We’re almost in this unprecedented time where any child can be harmed as a result of this type of material. So you can grab any image online and create this type of imagery and harm them in any number of ways,” he said in an interview Wednesday.
The synthetic images not only humiliate the victims by violating their dignity and privacy; people could also seek those children out to sexually assault them. “It creates a safety risk for that child, because there are individuals out there who will seek to identify individuals in sexual abuse imagery,” he said.
Vasileia Karasavva, a master’s student in clinical psychology at the University of British Columbia who co-wrote a 2021 paper on the threat of deepfake pornography, said the technological bar for creating videos with the technology keeps getting lower.
The technology is getting easier to use and the number of images required to put someone’s face into a video is also declining, she said in an interview Wednesday, adding that apps can be used to scrape images of an individual from their social media accounts.
She said deepfake pornography has two victims: the person whose body appears in the video, as well as the person whose face appears.
While there’s little research on the specific impact of deepfake pornography on victims, research on other forms of online sexual violence shows it often affects victims in similar ways to in-person sexual assaults.
“This victimization continues, as well. It’s very public. Very often, it’s very permanent, and the victim feels very helpless and powerless in that situation,” she said.