‘The voice was almost identical’: AI suspected in new grandparent scams in Montreal area

The age-old grandparent phone scam has been around for years, but now it appears it’s taking on a modern twist.

Jessica Di Palma’s 73-year-old mother received a phone call last Tuesday that she will never forget: a caller claiming to be an RCMP officer said her grandson, Milan, had been arrested on drug charges and needed $10,000 in bail money.

The supposed officer then passed the phone to “Milan,” who urged his grandmother to post bail, saying he was in serious trouble.

Although the voice sounded like Milan’s, it was not actually him on the phone.

The alleged scammer managed to duplicate Milan’s voice, right down to switching between English and Italian, as Milan does when speaking to his grandmother, even calling her “nonna” — the Italian word for grandmother. It’s believed the caller used artificial intelligence (AI) to mimic the grandson’s voice.

In a panic, Di Palma’s mother hung up and immediately withdrew $10,000 to save her grandson. Thankfully, Jessica noticed the withdrawal and phoned her mother to ask what had happened.

The scammer had instructed Di Palma’s mother to tell no one what was happening; believing her grandson’s life was in danger, she was reluctant to tell Jessica about the situation.

Jessica Di Palma says her 73-year-old mother was a victim of an artificial intelligence-based phone scam that made her think she was speaking to her grandson who urgently needed money. (CTV News)

Although Di Palma’s mother spoke with both Jessica and Milan — both of whom told her explicitly that the caller was not Milan — she remained shaken, unable to understand how the voice could sound exactly like Milan’s.

“She’s been fairly autonomous her whole life. I mean, on top of her game, but something like this just threw her off her axis completely,” said Di Palma.

Jessica Di Palma says her 73-year-old mother received a troubling phone call from someone claiming to be her grandson, using artificial intelligence to imitate his voice. (Submitted photo)

When the scammer did not receive the money, they called Di Palma’s mother again, this time threatening her life. Police are now involved and have opened an investigation.


This case is not an isolated one.

Bruno Aiezza’s family experienced a strikingly similar incident recently. Aiezza’s mother received a phone call from her so-called grandson asking her to go to the bank and take out $7,000 in bail money because he and a friend had been arrested.

“It was the mannerism in which this voice spoke,” said Aiezza.

“And he says, ‘Yeah, it’s me, nonna. Giordano.’ Even the way he says Giordano, it was identical. It had to have been AI.”

Bruno Aiezza said his family was also on the receiving end of a sophisticated grandparent scam that used artificial intelligence to mimic his son’s voice and mannerisms in order to fool a family member into sending money. (CTV News)

Aiezza said the voice used nicknames his son would use and, as in Di Palma’s story, also switched between English and Italian, as his son would do.

The key detail in this story is that Aiezza’s mother has limited mobility and lives in a residence. That is what tipped her off that something was wrong: she knew her grandson would never ask her to go to a bank because she physically could not.

In both cases, the families lost no money and were able to stop the scam from going further, but not all families have had the same luck.

“Grandparent scams have been around for years. We know how they work, we know what to watch for, we know what to warn our parents and grandparents about. But the addition of voice cloning to the scammers’ tool kit has now raised it to a terrifying new level,” said Carmi Levy, a technology analyst.

Experts say AI software makes this possible. 

“This technology is called voice cloning. And what it does is it goes out onto the open internet. And it listens for examples of the victim’s voice. And then it takes that voice and it trains itself such that the scammer can then create any kind of audio that they want,” Levy said.

The technology has become increasingly sophisticated and unfortunately has shown no signs of slowing down.

“It can replicate their tone, it can replicate their accents, their mannerisms, the ability to essentially be the individual is shocking and frightening,” Levy added.

“Unfortunately, it seems so real that victims believe what they’re hearing.”


Experts caution that families should inform themselves about the tell-tale signs of grandparent scams, such as urgency to send money, out-of-character behaviour or any threats.

They also recommend that families consider using secret code words to authenticate a person or to warn of possible danger. If you are questioning whether the person you are talking to is really your family member, simply hang up and call them back at their known phone number.

If you are a victim of this type of scam or any voice duplication scam, experts say there are steps to take.

“If you think you got caught in that voice scam, first of all, report it to the police services and check your financial records,” said Steve Waterhouse, a cybersecurity expert.

“Make sure you know all your passwords — where they are, what they are. And for whatever resources they asked you about, whether that’s a site or a bank account, get in touch with your bank and change the password you use to access your bank portal,” he recommended.

“Hopefully, you do have an anti-virus on your system that will make sure that there is no wrongful activity that’s being done on your system.”

As scams continue to evolve, experts warn that people need to take cyber threats more seriously — something Di Palma now realizes.

“If there’s one word of advice I can also share with everyone, it’s just follow that intuition,” she said. “If there’s that gut feeling that’s telling you something is wrong, just follow it. Don’t doubt it.”
