FTC launches inquiry into AI chatbots' effects on children and teens
Investing.com -- The Federal Trade Commission (FTC) has issued orders to seven major technology companies seeking information about how their AI-powered chatbots might affect children and teenagers.
The inquiry targets Alphabet, Character Technologies, Instagram, Meta Platforms, OpenAI, Snap, and X.AI Corp. The FTC wants to understand what steps these companies have taken to evaluate the safety of their chatbots when they function as companions, particularly for young users.
"Protecting kids online is a top priority for the Trump-Vance FTC, and so is fostering innovation in critical sectors of our economy," said FTC Chairman Andrew N. Ferguson. "As AI technologies evolve, it is important to consider the effects chatbots can have on children, while also ensuring that the United States maintains its role as a global leader in this new and exciting industry."
The Commission is using its 6(b) authority to conduct this wide-ranging study, which does not have a specific law enforcement purpose. The FTC voted 3-0 to issue the orders, with Commissioners Melissa Holyoak and Mark R. Meador issuing separate statements.
The investigation focuses on how AI chatbots simulate human-like communication and relationships with users. The FTC noted that these technologies can effectively mimic human characteristics, emotions, and intentions, potentially leading children and teens to trust and form relationships with them.
The Commission is requesting information about how these companies monetize user engagement, process user inputs, develop characters, measure and monitor negative impacts, mitigate risks to children, inform users about potential risks, enforce age restrictions, and handle personal information obtained through user conversations.
The inquiry also seeks to understand how companies comply with the Children’s Online Privacy Protection Act Rule regarding their AI chatbot products.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.