FTC Wants to Ban AI-Powered Impersonation

By: Vejay Lalla, Paul Famiglietti, Zach Harned, Kristen Rovai

What You Need To Know

  • To combat the rising issue of AI-generated deepfakes, the Federal Trade Commission (FTC) is proposing a ban on their use for impersonating individuals and defrauding consumers.
  • Importantly, the proposed rule would also prohibit knowingly providing goods or services that enable such impersonation.
  • On the heels of a November executive order on AI, the FTC could be moving toward requiring that AI-generated content be labeled as such when it could otherwise cause confusion.
  • The FTC’s proposed rule may also have broader implications for protecting against the misuse of one’s brand identity and logos, as it imposes monetary fines against deceptive practices that could harm brand reputation or mislead consumers.

On February 15, 2024, the FTC published the Rule on Impersonation of Government and Businesses, which permits the FTC to file federal court cases directly against scammers to recover money obtained by impersonating government or business entities. The FTC received “surging complaints” from commenters who pointed to the growing number of scams stemming from the impersonation of friends, family members, and romantic interests, which the Rule on Impersonation of Government and Businesses left unaddressed. In response, the FTC issued a supplemental notice of proposed rulemaking (SNPRM) that addresses the impersonation of individuals and extends liability to parties who know or have reason to know that the goods or services they provide will be used in such impersonations. In a statement, the FTC said that “emerging technology—including AI-generated deepfakes—threatens to turbocharge [impersonation fraud], and the FTC is committed to using all of its tools to detect, deter, and halt impersonation fraud.” The breadth of the SNPRM, how it will apply to technology companies, and the risk of increased claims against innocent companies remain unknown.

The SNPRM “proposes to prohibit the deceptive impersonation of individuals and would address conduct that is prevalent and harmful.” Commenters on the Rule on Impersonation of Government and Businesses provided statistics showing a steadily growing number of incidents in which scammers impersonate a victim’s friend, family member, or romantic interest to extract money or sensitive information. The FTC reports that, from 2019 through 2023, family and friend impersonation defrauded victims of approximately $339 million, and romantic impersonation defrauded victims of approximately $4.978 billion. The FBI reported that the number of elderly impersonation victims has increased by 30% since 2019.

The FTC’s objective in proposing the SNPRM is to “more effectively and efficiently redress consumers harmed by impersonation schemes and to more effectively address the types of unlawful impersonation affecting consumers.” Further, the FTC believes that the SNPRM “would not impose new burdens on honest individuals or businesses.” The SNPRM takes a two-pronged approach to stopping individual impersonation scams.

  • The first prong, provided for in Section 461.4 (Impersonation of Individuals Prohibited), declares that either (a) falsely posing as an individual, or (b) misrepresenting (directly or indirectly) affiliation with an individual, in a way that affects commerce, is an unfair and deceptive practice, subject to the FTC’s Section 5 enforcement.
  • The second prong, provided for in Section 461.5 (Provision of Goods or Services for Unlawful Impersonation Prohibited), makes it “unlawful to provide goods or services with knowledge or reason to know that those goods or services will be used in impersonations of the kind that are themselves unlawful under the Rule” (emphasis added).

The knowledge requirement in Section 461.5 was included in response to public comments on the initially drafted proposal, but some commenters believe that the knowledge requirement, as written, remains too broad and could have a chilling effect on developers and users of AI technology for fear of FTC enforcement. The FTC believes that the “vast majority” of small entities do not “knowingly provide goods and services used in impersonating government, businesses, or individuals in a manner that would be unlawful under the provisions set out in this SNPRM.”

The Federal Trade Commission is currently accepting public comments on the SNPRM.