Police across Ontario are warning of a renewed surge in so-called “grandparent scams,” with investigators and cybersecurity experts saying artificial intelligence is making the long-running fraud far more convincing and dangerous. The Ontario Provincial Police say recent incidents show scammers increasingly using urgency, emotional pressure and, in some cases, AI voice cloning to trick seniors into handing over money.
Earlier this month, the OPP’s Upper Ottawa Valley detachment reported two incidents in the Pembroke and Petawawa areas. On Jan. 7, a senior lost $800 after receiving a call claiming a grandchild was in legal trouble. Two days later, another resident nearly lost $20,000 after being told a grandchild was in jail and needed bail money, a scam that was stopped only when an acquaintance intervened.
Police say the scheme typically begins with a frantic phone call or message from someone posing as a grandchild or close relative, often claiming an emergency such as an arrest, accident or injury. Victims are pressured to act immediately and urged not to verify the story or tell anyone, with scammers requesting money through wire transfers, cryptocurrency or even arranging couriers to collect cash in person.
While the scam itself is not new, experts say AI has dramatically increased its effectiveness. Cybersecurity specialists note that publicly available videos and audio on social media can be harvested to clone a person’s voice with only a few seconds of sample material. Thomas Curutchet, a managing director at Toronto-based tech firm Sopra Steria, said criminals can now reproduce voices with “alarming accuracy,” allowing them to convincingly impersonate loved ones during distress calls.
Similar AI-assisted scams have been reported across Canada, including cases in Alberta, Newfoundland and Labrador, and Saskatchewan, where seniors have lost thousands of dollars after hearing what they believed to be their grandchildren pleading for help. Experts say scammers often use short bursts of the cloned voice before handing the call to an accomplice posing as a police officer or lawyer, a tactic designed to hide imperfections in the fake audio.
AI and cybersecurity researcher Abbas Yazdinejad says warning signs can include unnatural pauses, odd background noise or conversations that feel tightly controlled and one-sided. He advises people to slow down, hang up and independently contact family members or authorities using verified phone numbers rather than those provided during the call.
Police and experts stress that pressure to act quickly is a major red flag. They urge seniors and families to talk openly about these scams, set up verification plans and remind loved ones that legitimate police or courts will never demand immediate payment over the phone.