As parents race to find the perfect holiday gifts, child development specialists across Canada are issuing a strong warning: think twice before buying AI-powered toys. These smart devices, marketed as fun, interactive and educational, may actually pose risks to children’s creativity, emotional development and privacy.
Ottawa child psychologist Dr. Nicole Racine says early childhood is a highly sensitive stage, and the type of interaction children receive matters deeply. “The developing brain is like a sponge,” she said. “And to be honest, I don’t want an AI algorithm as the main input for my own kids.”
Her caution follows a recent advisory from Fairplay, a U.S.-based group focused on protecting children from potential tech harms. The advisory was endorsed by dozens of experts, including pediatricians, psychologists and educators, who argued that AI-enabled toys—which often act like friendly chatbots—may interfere with key developmental processes.
These toys—plush dolls, robots and interactive figures equipped with AI—are designed to converse with children in a human-like way. The technology is easy to spot: toy boxes often advertise AI features, and many require a Wi-Fi connection. The toys promise creativity and companionship, but experts say they may actually suppress imaginative play. With traditional toys, children invent scenarios and speak for both sides of pretend conversations; AI toys, by responding automatically, can take over that role.
Another concern is social development. AI companions tend to agree with everything a child says, offering little challenge or conflict. Toronto psychiatrist Dr. Daniela Lobo says that could impact a child’s ability to manage disagreements or negotiate with real peers. “How will kids learn to handle conflict if their toy always agrees with them?” she said, adding that AI development has outpaced research and regulation.
Fairplay’s advisory points to examples such as Curio Interactive’s characters—Gabbo, Grem and Grok—along with Roybi’s educational robot. Both companies say they take privacy seriously and comply with U.S. child protection laws. Roybi said it uses anonymous IDs, stores no audio or video, and uses teacher-approved content to ensure safe interactions. Curio encourages parents to monitor their children’s conversations through its app.
But experts argue that expecting parents to supervise every interaction is unrealistic. Racine says such recommendations don’t reflect what happens in real homes.
The Canadian Paediatric Society also expressed concern. While it has no formal policy on AI toys, it notes rising cases of developmental and social delays in young children and warns that AI devices could worsen the trend by confusing children’s early understanding of relationships.
Privacy is another major issue. AI toys may collect sensitive data, especially since children often confide in toys. Fairplay warns that companies could gain access to deeply personal details about a child’s emotions, fears and thoughts. In many cases, experts say parents are left struggling to understand complicated privacy policies with little regulatory protection.
Elizabeth Cawley, chief clinical officer at PlaySpace—an online therapy platform—believes AI can support learning if proper safeguards exist, but stresses that adult oversight is essential. PlaySpace’s own AI storybook tool is reviewed by licensed clinicians to ensure safe use.
The Canadian Toy Association is urging parents to buy only from reputable brands that put safety first. Meanwhile, the federal minister for artificial intelligence, Evan Solomon, says the government is monitoring how AI is being built into consumer products, including children’s toys. His office noted that Health Canada is responsible for regulating toy safety, though the department has not yet commented.
As holiday shopping hits its peak, experts say parents shouldn’t panic—but they should be cautious. AI may offer exciting possibilities, but without clear safety standards, children could be exposed to risks hidden beneath the fun, glowing packaging.