
The American non-profit organization Common Sense Media has warned of the potential dangers of dolls and toys that rely on artificial intelligence, saying they may threaten children's safety and privacy because their use is not adequately supervised.
The organization, which monitors consumer electronic products, said that some of these toys produce language inappropriate for children and collect large amounts of information inside homes, raising growing concerns about privacy and security.
Robbie Turney, head of digital assessments at Common Sense, said the risk assessment revealed "fundamental problems" that make AI dolls unsuitable for young children, noting that more than a quarter of the products examined contained inappropriate content, including references to self-harm, drugs, and dangerous behavior.
Turney added that these toys rely on "intensive data collection," including audio recordings, written text, and behavioral information, and use subscription models that exploit the emotional attachment that forms between the child and the toy.
The organization noted that some of these toys use interactive mechanisms designed to build friendship-like relationships with children while simultaneously collecting sensitive personal data from within the home.
Common Sense stressed that children under the age of five should not be exposed to AI toys and called on parents to exercise great caution when allowing children between the ages of 6 and 12 to use these products.
For his part, James Steyer, founder and president of Common Sense, said the world "still lacks effective safeguards to protect children from the dangers of artificial intelligence," and pointed out that, unlike traditional toys, these products are not subject to careful safety and quality checks before they go on sale.