A Toronto woman says stolen images of herself altered using deepfake technology were posted on social media without her permission, and it was an uphill battle to have them removed.
As Canadian law works to catch up with artificial intelligence (AI), her story is far from an isolated one.
She spoke to us on condition of anonymity.
“I chose not to show my face obviously because that would put a target on your back,” she said.
And she says she’s already been targeted enough. It began three months ago, on a Thursday morning, when she received a follow request on her TikTok account.
“I noticed that this person who had requested to follow me was using my photo as their profile picture, so naturally I wanted to see what it was all about. I clicked on it and discovered that the entire account was filled with AI videos of me wearing lingerie and performing sexual acts,” she said.
“They were taking my face, putting it on a body that clearly wasn’t mine, putting me in lingerie, showing me grabbing my breasts, opening my legs, all that kind of stuff.”
She immediately contacted the user but received no response. She then contacted TikTok, which said it would look into the matter — but that was three months ago. The profile was still active when CityNews spoke to her last week.
The woman, who is studying law, fought back tears as she spoke about the emotional toll the ordeal has taken on her.
“I ended up skipping class. Every time I went out, I was scared that people would recognize me and think it was me. I felt like no one would believe it wasn’t me,” she said.
The woman also contacted Toronto police, who connected her to detectives but said there was little they could do.
“They said Canadian law does not currently criminalize what happened,” she said.
According to the Toronto Police Service (TPS), the rapid growth of deepfake technology is creating challenges for law enforcement, especially in areas where laws have not yet fully caught up with how the technology is being used.
The Canadian Criminal Code, specifically section 163.1, contains provisions that address offences related to child sexual abuse material (CSAM), including content generated by AI, ensuring that the exploitation of children through emerging technologies is captured within existing legal frameworks.
“However, cases involving non-consensual deepfake images of adults highlight areas where current law was not designed with this technology in mind,” said TPS spokeswoman Stephanie Thayer.
Thayer said the law must evolve to ensure the legal system adequately supports law enforcement agencies as they deal with the complexities of modern technology-driven crime.
Last month, Canada’s Minister of Justice and Attorney General announced the introduction of Bill C-16, the Victim Protection Act, which aims to criminalize sexual deepfakes that depict subjects in the nude, with exposed genitals, or engaged in sexually explicit acts.
“The government looks forward to working with Parliament to pass Bill C-16 as soon as possible,” a Department of Justice spokesperson told CityNews. Thayer added that, if passed, these changes could help law enforcement pursue crimes related to the creation and distribution of harmful deepfake material.
“As with other proposed legislation, the practical impact on police enforcement will depend on the final language adopted by Parliament, but it reflects a broader effort to ensure the law is responsive to rapidly evolving technology,” Thayer said.
Canada has been down this road before. In 2024, the Liberal Party introduced the Online Harms Act, but it was not passed.
The woman who spoke to CityNews said the bill is long overdue.
“I couldn’t understand how something like this, something that completely tarnishes and ruins your reputation, could not be illegal,” she said.
When Speakers Corner contacted TikTok, a spokesperson said the company could not comment on individual cases or on the steps it was taking to address the woman’s complaint, citing privacy reasons.
Two days after CityNews contacted the social media company, the woman said the profile featuring her image had been removed.
If you have an issue, story or question you’d like us to investigate, please contact Speakers Corner.
