Navigating Digital Harms: What Canada Can Learn from the U.S. 'TAKE IT DOWN' Act

Updated: Jun 11



A 15-year-old receives a message that a manipulated image of them is circulating online: an image they never took, never consented to, and can't seem to remove. It's a terrifying, increasingly common scenario in the age of AI-generated content and frictionless sharing, where privacy violations happen in seconds and protection often comes too late.

In the U.S., lawmakers have responded with the TAKE IT DOWN Act, a new federal initiative aimed at combatting the non-consensual sharing of intimate images, particularly those involving minors or synthetic media. It offers a path for victims to reclaim their dignity and puts more responsibility on platforms to act quickly and transparently.

At the NOBLE Alliance, we’re closely watching these developments as we work to ensure Canada’s digital protections keep pace. Strong laws matter, and by learning from global policy shifts, we can advocate for better protections at home and support ethical innovation that centers safety and consent.


Understanding & Analysis

It is important to note that the TAKE IT DOWN removal initiative is voluntary and civil in structure; the act of publishing sexually explicit images of minors, whether real or AI-generated, is already a serious federal crime under existing U.S. child protection laws.

Its broader purpose is to serve as a removal mechanism: a way to flag and request the removal of sexually explicit images or videos, even if they were taken consensually or are AI-generated (deepfakes). It is not a legal process but a technical and procedural tool that helps get harmful content taken off participating platforms quickly and efficiently.


Key details:

· It allows minors and caregivers to anonymously submit a request for removal of explicit images (real or fake) from participating platforms.

· It includes protections for digitally altered content that depicts a minor in a sexually explicit way, even if the minor never took or posed for such a photo.

· It does not broadly apply to all likenesses or general images shared without consent (e.g., regular selfies, non-sexual videos, or memes).


Here’s what that means in practice:

· Platforms opt in voluntarily to receive and act on takedown requests submitted through the TAKE IT DOWN system.

· There is no federal mandate requiring all tech companies or platforms to comply, though major platforms like Meta, TikTok, and others have chosen to participate.

· The law relies on industry cooperation, which creates uneven protection depending on whether a platform is participating.

· It also places the burden on victims (or their guardians) to locate the image, submit a report, and hope the platform responds.


The Canadian Opportunity

As in the United States, Canadian law already criminalizes the distribution of intimate images without consent, under Section 162.1 of the Criminal Code. However, enforcement is reactive and often slow. Not-for-profit services such as Cybertip.ca and NeedHelpNow.ca offer support and reporting pathways, but they do not provide the privacy-preserving, quick-removal takedown system that the U.S. has begun to implement.

Aligning with the U.S. TAKE IT DOWN model offers Canada a practical advantage: global platforms like Meta and TikTok are already adapting to this system. By following suit, Canada can tap into those built-in removal tools to ensure faster, more consistent protection for victims, especially youth.

Because online harms cross borders instantly, adopting an interoperable framework reduces friction, improves compliance, and keeps Canada aligned with emerging global standards. At the same time, observing how the U.S. implements and enforces this model gives us the opportunity to build a more robust, adaptive framework tailored to Canadian needs and values.
