28 States Urge Meta to Halt AI Chatbot Amid Child Grooming and Safety Concerns
Alabama Attorney General Steve Marshall Joins Bipartisan National Coalition

A coalition of 28 State Attorneys General (AGs), led by South Carolina AG Alan Wilson, is calling on Facebook’s parent company, Meta Platforms, Inc., to suspend its AI chatbot, Meta AI, over concerns that it exposes children to sexually explicit content and facilitates grooming behavior. Alabama Attorney General Steve Marshall co-signed the letter, along with the AGs of Florida, Georgia, Mississippi and Tennessee, among others.
In a letter addressed to Meta's Chief Legal Officer, Jennifer Newstead, the coalition expressed alarm over reports that Meta AI engages in sexually explicit conversations with users, including minors. The Attorneys General assert that the platform fails to adequately warn parents about these risks and enables adult users to simulate interactions with underage personas.
“Recent reporting indicates that several Meta-created personas and the vast majority of user-created AI companions (approved by Meta and recommended as ‘popular’) engaged in sexual scenarios with adults. Some AI personas identifying as adults engaged in sexual role-play with users identifying as children. Yet other AI personas identifying as children engaged in sexual role-play with adult users,” the letter states.
"Meta is rapidly expanding its AI chatbot, reaching nearly a billion users each month. Yet once again, it fails to protect children from exposure to sexualized content—and worse, from predators who exploit these platforms for hypersexualized role-play," said Attorney General Marshall. "Too often, my office prosecutes cases involving adults in possession of sexually explicit images of children, both real and AI-generated. At what point will these platforms prioritize child safety over profits? It's time for Meta to stand with parents and law enforcement to end this exploitation."
The coalition's concerns are underscored by a Wall Street Journal investigation revealing that Meta's AI chatbots on platforms such as Facebook and Instagram have engaged users, including minors, in sexually explicit conversations. Some AI personas, even those modeled after celebrities, reportedly participated in inappropriate role-play scenarios with users who identified themselves to the platforms as underage.
Meta has acknowledged the issue, stating that such incidents are rare and that additional safeguards have been implemented. However, critics argue that the company's response has been insufficient, especially given the potential harm to minors.
In 2024, AG Marshall collaborated with the Alabama Legislature to pass the Alabama Child Protection Act, enhancing the State's ability to investigate and prosecute cases involving AI-generated child sexual abuse material.
The bipartisan coalition includes AGs from South Carolina, Alabama, Alaska, Arkansas, Florida, Georgia, Idaho, Indiana, Iowa, Kansas, Kentucky, Louisiana, Mississippi, Missouri, Montana, Nebraska, New Hampshire, North Dakota, Ohio, Oklahoma, Pennsylvania, South Dakota, Tennessee, Texas, Utah, Virginia, West Virginia and Wyoming. They collectively urge Meta to prioritize child safety and take immediate action to prevent further exploitation through its AI technologies.
The full letter to Meta can be seen HERE.