
Davis+Gilbert LLP



Alert - September 29, 2025

FTC Probes AI Companion Chatbots for Risks to Minors

The Bottom Line

  • While the FTC recognizes AI’s potential for innovation and economic growth, it is concerned about the technology’s impact on vulnerable populations, such as children.
  • Because AI companion chatbots are designed to communicate like a friend, protecting the children and teens who use this rapidly evolving technology is a high priority for the FTC.
  • Providers of AI companion chatbots should adequately measure, test, and monitor the potential negative impacts of this technology on children and teens, and be cognizant of the potential effects on their social relationships, mental health, and well-being.

The Federal Trade Commission (FTC) announced an inquiry into seven major providers of consumer-facing AI companion chatbots that simulate human-like communication and interpersonal relationships. As part of its inquiry, the FTC exercised its authority under Section 6(b) of the FTC Act to send orders to these companies seeking information on how they measure, test, and monitor the potentially negative impacts of this technology on children and teens.

The FTC’s Concerns

These recent efforts by the FTC come amid the rapid growth in AI capabilities and the growing number of children using these bots for everyday decision-making. Today’s AI chatbots not only simulate human-like communication and relationships; they are also generally designed to communicate like a friend and companion.

Because AI chatbots can imitate human characteristics, emotions, and intentions, the FTC is concerned about the risk they create for children and teens, who may be inclined to build trust and form relationships with them. According to media reports, some companies have deployed these AI companions without adequately evaluating, monitoring, and mitigating their potential negative impacts on minors. For example, AI companions may generate outputs that instruct children on how to commit violent, illegal, or otherwise physically harmful acts, or entice them into sharing sensitive personal information.

In the FTC’s announcement of this inquiry, Chairman Andrew N. Ferguson stated, “[a]s AI technologies evolve, it is important to consider the effects chatbots can have on children, while also ensuring that the United States maintains its role as a global leader in this new and exciting industry.”

The Information Request Orders

The FTC sent orders to the following AI-chatbot providers: Alphabet, Character Technologies, Instagram, Meta Platforms, OpenAI, Snap, Inc., and x.AI. Under these orders, the FTC is requesting information on how these companies:

  1. Monetize user engagement;
  2. Process user inputs and generate outputs;
  3. Develop and approve characters;
  4. Measure, test, and monitor for negative impacts before and after deployment;
  5. Employ disclosures, advertising, and other representations to inform users about features, capabilities, the intended audience, potential negative impacts, and data collection and handling practices; and
  6. Monitor and enforce compliance with the company’s rules and terms of service.

The FTC intends to collect this information to study what steps companies have taken to evaluate the safety of their chatbots for children and teens, as well as the potential negative effects these chatbots can have on them.

What AI Chatbot Providers Can Do Now

Providers of AI companion chatbots should carefully assess, test, and monitor the potential negative impacts of this technology on children and teenagers. They must be aware of how these chatbots could affect young people’s social relationships, mental health, and overall well-being.

In addition to addressing the FTC’s inquiries, AI providers should monitor state laws related to AI, which continue to be proposed and enacted across the country.


Related People

  • Allison Fitzpatrick, Partner
    Advertising + Marketing; Privacy, Technology + Data Security
    212 468 4866 | afitzpatrick@dglaw.com
  • Gary Kibel, Partner
    Privacy, Technology + Data Security; Advertising + Marketing
    212 468 4918 | gkibel@dglaw.com
  • Robert J. Chappell Jr., Associate
    Advertising + Marketing; Privacy, Technology + Data Security
    212 974 6954 | rchappell@dglaw.com
