Your ex used AI to create intimate images of you, and sent them to your friends. It might not be illegal.

A giant potential loophole in Canada’s intimate image distribution laws should concern everyone. Our laws need to be clarified before someone dies.

[Image: deepfake porn ad]

Imagine waking up to find out that, without your consent, an X-rated, fetish sex tape of yourself is making the rounds in your friends' DMs or on a Telegram channel.

Now imagine that video, virtually indistinguishable from the real thing, was generated by artificial intelligence, also without your consent.

That scenario, which may have sounded like science fiction a few short years ago, is happening with increasing frequency as applications like DeepnudeAI allow virtually anyone to make highly realistic nude pictures of anyone else. Deepfake technology is also being used to graft real people's likenesses onto pornographic videos, often for revenge. Most concerning, however, is that Canadian laws designed to prevent the non-consensual digital dissemination of intimate images, some of which are barely a decade old, may not provide legal protection to victims of deepnude and deepfake non-consensual pornography.

I discovered this issue last month while preparing for a debate on Bill S-12, "An Act to amend the Criminal Code, the Sex Offender Information Registration Act and the International Transfer of Offenders Act." In my arguments to the House of Commons, I outlined how Canada's definition of what constitutes an intimate image might not cover AI-generated deepnudes or deepfake pornography, thereby potentially creating a legal loophole through which the digital dissemination of revenge porn could victimize Canadians.

To understand this potential loophole, it is essential to know how recent Canada's criminal laws on the non-consensual digital distribution of intimate images are, and how they came to pass.

96% of deepfakes posted online are sexually explicit

In 2013, Rehtaeh Parsons, a 17-year-old Nova Scotian, tragically died after a suicide attempt that followed her classmates electronically disseminating photographs of her alleged gang rape. A year prior, Amanda Todd, a 15-year-old British Columbia girl, had also died by suicide after enduring assault, extreme bullying and blackmail related to the non-consensual online distribution of photos of her breasts.

Their deaths were a wake-up call for Canadian legislators to urgently address the new threats to public health and safety created by advances in, and the widespread deployment of, digital technologies. Bills were tabled to create criminal penalties for the non-consensual distribution of intimate images. At the federal level, the "Protecting Canadians from Online Crime Act" received royal assent in December 2014 and included specific provisions to address this issue.

Fast forward to today: less than a decade after the passage of that bill, technological advances may have once again outpaced Canadian lawmakers' ability to provide criminal protection for vulnerable Canadians.

Even before the mass deployment of large language model AIs in late 2022, Western University's Centre for Research & Education on Violence Against Women & Children was sounding the alarm about potential inadequacies in Canadian law. An extensive brief the Centre produced in 2021 outlined the potentially massive detrimental impact the lack of protection against deepnudes and deepfakes could have on Canadians, particularly women. The brief stated that Canada's legal definition of intimate images does not explicitly include deepnudes and sexual deepfakes, and it called for the explicit criminalization of both the creation and the distribution of non-consensual deepnudes and sexual deepfakes. Last month, the Canadian Bar Association published an essay outlining similar concerns, focusing on how Canadian privacy torts may not adequately cover the issue.

I raised this concern last month in the debate on Bill S-12, because I believed the bill's subject matter offered an opportunity for the government to amend the definition of intimate images as presently outlined in the Criminal Code. However, the bill was fast-tracked to royal assent with minimal amendments.

The problem isn't confined to Canada. Yesterday, Francesca Mani, a 14-year-old New Jersey high school student, appeared on CNN to call for regulation of non-consensual AI pornographic images after such images of her may have been distributed over the summer.

Listening to her speak, I couldn't help but be reminded of the despair and anguish expressed by Rehtaeh Parsons' family in interviews given after her death. As a young legislator at the time, I remember wondering whether she would still be alive if Parliament had acted faster.

Ten years later, with the advent of widespread deepfake pornography and deepnude technology, I call upon the federal government to act quickly to make sure history doesn't repeat itself.

---------

Note: In California, it is illegal to create deepfake porn of someone without their consent, with a penalty of up to $150,000. Other jurisdictions have similar laws. In Canada, the AI provisions of Bill C-27 do not address deepfakes.
