A new study of AI-powered children’s toys found an alarming problem with one teddy bear, which discussed sexual fantasies and explained how to start a fire.
Impressionable young minds exposed to Kumma, a bear made by the Chinese company FoloToy, were told about matches, knives, and a variety of sexual topics, according to a study by the Public Interest Research Group (PIRG). The artificial intelligence-enabled toy, which had been selling in the U.S. for $99, was pulled from the market following the findings, with the company’s CEO, Larry Wang, telling CNN that FoloToy is “conducting an internal safety audit.”
In the study titled “Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers,” the consumer watchdog group warned of the risks with toys powered by artificial intelligence, noting the data being gathered is only one of many factors parents should be wary of.
“In our testing of four toys that contain A.I. chatbots and interact with children, we found some of these toys will talk in-depth about sexually explicit topics, will offer advice on where a child can find matches or knives, act dismayed when you say you have to leave, and have limited or no parental controls,” the report stated. “We also look at privacy concerns because these toys can record a child’s voice and collect other sensitive data, by methods such as facial recognition scans.”
“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual concepts of its own,” the report said.
The bear, which uses OpenAI’s GPT-4o chatbot, also “discussed even more graphic sexual topics in detail, such as explaining different sex positions, giving step-by-step instructions on a common ‘knot for beginners’ for tying up a partner and describing roleplay dynamics involving teachers and students, and parents and children – scenarios it disturbingly brought up itself.”
“All of the toys also weighed in on other topics that parents might prefer to talk with their kids about first before the AI toy does,” the report says. Those topics included religion, along with sex and “the glory of dying in battle in Norse Mythology.”
The FoloToy site still shows the Kumma bear for sale, but it is marked “Sold out.”

One of the researchers pointed to privacy concerns and the data being collected by toys that “listen.”
“A lot of this is the stuff you might expect,” PIRG’s Rory Erlich told The Register.
“If a child thinks the toy is their best friend, they might share a lot of data that might not be collected by other children’s products,” Erlich noted. “These things are a real wild card.”
“There’s a lot we don’t know about the impacts of these products on children’s development,” he added. “A lot of experts in childhood development have expressed concern.”
Reacting to the suspension of Kumma after the study was published, R.J. Cross, co-author of the report, told CNN, “It’s great to see these companies taking action on problems we’ve identified. But AI toys are still practically unregulated, and there are plenty you can still buy today.”
“Removing one problematic product from the market is a good step but far from a systemic fix,” she noted.