Calgarypuck Forums - The Unofficial Calgary Flames Fan Community

Old 03-30-2023, 03:54 PM   #1
chemgear
Franchise Player
 
Join Date: Feb 2010
Police believe AI voice cloning used to scam seniors

New tool, even more crime.

The Following 2 Users Say Thank You to chemgear For This Useful Post:
Old 03-30-2023, 06:28 PM   #2
Northendzone
Franchise Player
 
Join Date: Aug 2009

Good grief - why would you not call your own son back to verify this story?

I'd like to think that my generation will be smarter thanks to lots of ongoing training at work, but I'd imagine the scams and technology will get more complex.

The ingenuity of criminals never ceases to amaze me
__________________
If I do not come back avenge my death
Old 03-30-2023, 06:53 PM   #3
WhiteTiger
Franchise Player
 
Join Date: Nov 2010

Quote:
Originally Posted by Northendzone View Post
Good grief - why would you not call your own son back to verify this story?
I've always wondered this, too. When asked, most would say that the person on the other end of the phone fast-talked them, not giving them a chance to process and bamboozling them with the urgency of the situation. A lot of folks are starting to wise up to it being a scam, or are asking/calling someone else for verification.

Quote:
I'd like to think that my generation will be smarter thanks to lots of ongoing training at work, but I'd imagine the scams and technology will get more complex.

The ingenuity of criminals never ceases to amaze me
I'm not sure which generation you are, but the scams I'm seeing that are getting the 'younger' crowd (let's say, 18-35-ish) are of a 'pay your bills online' sort, in which the victim is offered a chance to set up online/direct bill payment through a 3rd-party service that gives them a discount. Another really popular one is paying your bills/taxes via Bitcoin (or some other online currency). I see a LOT of debt-forgiveness-type scams (pay us $1000 and we'll negotiate the expunging of your student loan, etc.). Oh, and falling for scam/cheap/fake/not-as-promised/too-good-to-be-true type 'deals'.

While most are tech-based, few are what I'd consider complex. They tend to rely on the gullible/trusting nature of most folks rather than on complexity.
The Following User Says Thank You to WhiteTiger For This Useful Post:
Old 03-30-2023, 08:13 PM   #4
Cali Panthers Fan
Franchise Player
 
Join Date: Feb 2013
Location: Boca Raton, FL

this belongs in the "I don't want to live on this planet anymore" thread.
__________________
Quote:
Originally Posted by ResAlien View Post
If we can't fall in love with replaceable bottom 6 players then the terrorists have won.
Old 03-30-2023, 11:22 PM   #5
BlackArcher101
Such a pretty girl!
 
Join Date: Jan 2004
Location: Calgary

Interesting. I had an older family member get a call early in the morning on a weekday a few weeks back from someone pretending to be me. I was apparently at the police station and needed bail money after hitting a woman with my car while texting and driving. They were going to send someone by to pick up the money and then they would drop me off. She was adamant that it sounded just like me, and I thought it was maybe just because she was stressed and pressured. But now... I'm questioning whether she was right: did they somehow mimic my voice? I don't have any video clips of me speaking online or anything, so how would AI get a digital print of my voice to fake it? Is it maybe one of those numerous scam calls we get every day, and just a few words here and there are enough?
__________________

Last edited by BlackArcher101; 03-30-2023 at 11:37 PM.
The Following User Says Thank You to BlackArcher101 For This Useful Post:
Old 03-30-2023, 11:29 PM   #6
Playfair
Scoring Winger
 
Join Date: Aug 2005

Quote:
Originally Posted by BlackArcher101 View Post
Interesting. I had an older family member get a call early in the morning on a weekday a few weeks back from someone pretending to be me. I was apparently at the police station and needed bail money after hitting a woman with my car while texting and driving. They were going to send someone by to pick up the money and then they would drop me off. She was adamant that it sounded just like me, and I thought it was maybe just because she was stressed and pressured. But now... I'm questioning whether she was right: did they somehow mimic my voice? I don't have any video clips of me speaking online or anything, so how would AI get a digital print of my voice to fake it? Is it maybe one of those numerous scam calls we get every day, and just a few words here and there are enough?
I had the exact same thing happen as well
Old 03-30-2023, 11:37 PM   #7
BlackArcher101
Such a pretty girl!
 
Join Date: Jan 2004
Location: Calgary

Quote:
Originally Posted by Playfair View Post
I had the exact same thing happen as well
It was a bit of a wake-up call, as they had obviously researched what my relationship was to them (i.e., nephew) and knew some other details that essentially made them not question things as fully as they should have. Time to pull family trees and anything with your voice and photo off the internet?

The other odd thing that morning was that I woke up to a non-responsive cell phone and couldn't even restart it. Basically just frozen. Never had that happen, and a few hours later it just started working again, at which point I got a phone call asking if I was alright. They had tried to get hold of me during that scam call and obviously couldn't. Now... was that just a coincidence?
__________________

Last edited by BlackArcher101; 03-30-2023 at 11:40 PM.
Old 03-30-2023, 11:41 PM   #8
Mathgod
Franchise Player
 
Join Date: Feb 2009

Just wait until deepfaking becomes so perfect that it's impossible to distinguish real audio/video evidence from fake, AI-generated audio/video...

When that day comes (and it's close)... will that be the end of courts and law enforcement as we know them?
__________________
Old 03-31-2023, 12:32 AM   #9
Playfair
Scoring Winger
 
Join Date: Aug 2005

Quote:
Originally Posted by BlackArcher101 View Post
It was a bit of a wake-up call, as they had obviously researched what my relationship was to them (i.e., nephew) and knew some other details that essentially made them not question things as fully as they should have. Time to pull family trees and anything with your voice and photo off the internet?

The other odd thing that morning was that I woke up to a non-responsive cell phone and couldn't even restart it. Basically just frozen. Never had that happen, and a few hours later it just started working again, at which point I got a phone call asking if I was alright. They had tried to get hold of me during that scam call and obviously couldn't. Now... was that just a coincidence?
They asked phishing questions which led to more information being shared by the elders. It really freaked me out, as it could have ended really badly had they not finally gotten hold of me. The elders being questioned were highly intelligent and highly versed in the ways of the world; that is what shocked me the most. The phishing questions made them offer answers that fed more phishing. Once they realized what was going on they stopped, but it could have ended so badly.
Old 03-31-2023, 12:42 AM   #10
zamler
Lifetime Suspension
 
Join Date: Feb 2008

Quote:
Originally Posted by Mathgod View Post
Just wait until deepfaking becomes so perfect that it's impossible to distinguish real audio/video evidence from fake, AI-generated audio/video...

When that day comes (and it's close)... will that be the end of courts and law enforcement as we know them?
Sure seems like it. When fake is the same as real by any measure, then all video evidence is useless.
Old 03-31-2023, 03:28 AM   #11
getbak
Franchise Player
 
Join Date: Feb 2006
Location: Calgary, AB

It would have been nice if the story had explained how this voice cloning was done. On the surface, it just seems like "AI" is the buzzword of the day, but this was likely just the same scam that has been used for years. Using a cloned voice to do this seems unnecessarily complicated.

In order to use a cloned voice, it would need to be extremely targeted, and these sorts of scams are more about casting a wide net and taking advantage of the few who take the bait.

You could go to the trouble of targeting someone's 80-year-old grandmother, building a cloned voice of the grandson, then the day before you're going to put the plan in motion, the grandmother drops dead, and you've wasted all the time and research it would have taken to prepare everything.


It's much easier to do what con artists, psychics, and mediums have been doing for centuries to scam people out of money... be vague and non-committal and let the other person fill in their own blanks with their information (AKA cold reading). Then, after the fact, the victim will swear that the scammer knew all sorts of information they couldn't possibly have known.




Where it would make sense to use an AI voice would be to create a voice that sounds like someone's 25-year-old grandson rather than the 40-year-old man who's actually running the scam. With a decent chat bot, you could have the AI create the whole conversation on the fly without needing any human input (other than responding to the information provided by the victim) or an actual 25-year-old.

My guess is that's the actual story here. They're not creating a clone of Mary in Toronto's 25-year-old grandson Billy who was born and raised in Vancouver. They're calling a random number in Toronto and using a voice bot of a generic mid-20s Canadian male to run the scam, and Mary is the one who convinced herself that it was Billy on the other end.
__________________
Turn up the good, turn down the suck!
The Following 4 Users Say Thank You to getbak For This Useful Post:
Old 03-31-2023, 09:15 AM   #12
TheIronMaiden
Franchise Player
 
Join Date: May 2016
Location: ATCO Field, Section 201

Quote:
Originally Posted by Cali Panthers Fan View Post
this belongs in the "I don't want to live on this planet anymore" thread.
AI puts an enormous strain on the right to intellectual property and the right to own your own likeness.
Old 03-31-2023, 09:17 AM   #13
TheIronMaiden
Franchise Player
 
Join Date: May 2016
Location: ATCO Field, Section 201

Quote:
Originally Posted by getbak View Post
It would have been nice if the story had explained how this voice cloning was done. On the surface, it just seems like "AI" is the buzzword of the day, but this was likely just the same scam that has been used for years. Using a cloned voice to do this seems unnecessarily complicated.

In order to use a cloned voice, it would need to be extremely targeted, and these sorts of scams are more about casting a wide net and taking advantage of the few who take the bait.

You could go to the trouble of targeting someone's 80-year-old grandmother, building a cloned voice of the grandson, then the day before you're going to put the plan in motion, the grandmother drops dead, and you've wasted all the time and research it would have taken to prepare everything.


It's much easier to do what con artists, psychics, and mediums have been doing for centuries to scam people out of money... be vague and non-committal and let the other person fill in their own blanks with their information (AKA cold reading). Then, after the fact, the victim will swear that the scammer knew all sorts of information they couldn't possibly have known.




Where it would make sense to use an AI voice would be to create a voice that sounds like someone's 25-year-old grandson rather than the 40-year-old man who's actually running the scam. With a decent chat bot, you could have the AI create the whole conversation on the fly without needing any human input (other than responding to the information provided by the victim) or an actual 25-year-old.

My guess is that's the actual story here. They're not creating a clone of Mary in Toronto's 25-year-old grandson Billy who was born and raised in Vancouver. They're calling a random number in Toronto and using a voice bot of a generic mid-20s Canadian male to run the scam, and Mary is the one who convinced herself that it was Billy on the other end.
It is not unreasonable to believe that they could get AI to learn a person's voice by scraping their TikTok.
Old 03-31-2023, 09:18 AM   #14
Scornfire
First Line Centre
 
Join Date: Jan 2014
Location: Kelowna



It's quickly approaching a point that I don't think we as a society are prepared for whatsoever
Old 03-31-2023, 09:28 AM   #15
GirlySports
NOT breaking news
 
Join Date: Jan 2007
Location: Calgary

Scammers make things seem so real now. Anyone get a call from VISA/your bank claiming there are fraudulent charges on your VISA from Amazon, eBay, etc.?

I fell for it and called them back, and was strung along for about 15 minutes until they told me to physically go to a store and buy pre-paid gift cards so they could track/mirror what really happened. That's when I hung up.

It was this one: https://www.thunderbaypolice.ca/news...rgeting-locals
__________________
Watching the Oilers defend is like watching fire engines frantically rushing to the wrong fire

Old 03-31-2023, 10:19 AM   #16
psyang
Powerplay Quarterback
 
Join Date: Jan 2010

Quote:
Originally Posted by GirlySports View Post
Scammers make things seem so real now. Anyone get a call from VISA/your bank claiming there are fraudulent charges on your VISA from Amazon, eBay, etc.?

I fell for it and called them back, and was strung along for about 15 minutes until they told me to physically go to a store and buy pre-paid gift cards so they could track/mirror what really happened. That's when I hung up.

It was this one: https://www.thunderbaypolice.ca/news...rgeting-locals
And that whole time, I'm sure they were recording the call, and are now able to clone your voice.

It takes a surprisingly small amount of audio to be able to clone someone's voice, and the more audio, the more accurate the clone. I've started thinking about how much freely available voice data I have out there. I have no video/audio of myself on my social media, but there is still my voicemail greeting, any phone conversations I may have had with unsolicited callers, and videos that other people may have taken and posted on their social media with me in them.

It can be frightening if you start thinking about the possibilities.
Old 03-31-2023, 10:24 AM   #17
Azure
Had an idea!
 
Join Date: Oct 2005

I got spam calls from a 'supposed' collection agency for months.

They kept calling from the same number, and even if I blocked the # I would just constantly get voicemails on my phone.

I found it frustrating that there wasn't an easy way to report it.

Stuff like this is going to keep happening; the government doesn't seem willing to be aggressive in dealing with it.
Old 03-31-2023, 10:56 AM   #18
getbak
Franchise Player
 
Join Date: Feb 2006
Location: Calgary, AB

Quote:
Originally Posted by TheIronMaiden View Post
It is not unreasonable to believe that they could get AI to learn a person's voice by scraping their TikTok.
There are many ways to clone a voice. That's not really the issue.

The issue is that a clone of a specific person's voice is only useful against a specific limited number of targets. If I have a clone of Billy's voice, that might increase my chances of scamming Billy's Grandma, but it's not going to help me scam Bobby's Grandpa.

It's much easier to just use a generic voicebot that can be used against thousands of potential targets, rather than going to the trouble of creating something so heavily targeted against a handful of people.
__________________
Turn up the good, turn down the suck!
Old 03-31-2023, 11:34 AM   #19
Russic
Dances with Wolves
 
Join Date: Jun 2006
Location: Section 304

I once had a scammer grilling me about GST money that I owed. The fun part is that I _did_ owe GST at the time, and conveniently had been chastising myself for forgetting to send my payment in that morning, so it was top of mind.

While I did figure out reasonably quickly that it was bull, I will admit it scrambled me for a second and I wasn't mentally clear. I can imagine that for an older person who doesn't know that deepfaking voices is a thing and gets scared, it will get very confusing very fast. This will be a serious problem.
Old 03-31-2023, 12:08 PM   #20
fotze2
Powerplay Quarterback
 
Join Date: Mar 2023

My dad seems to be a magnet for these. He's 93.

The first one was some computer virus expert that my folks had employed years earlier. I shudder to think how much they had paid these C%^ts over the years to prevent "computer viruses". Smoke like a chimney your whole life and succumb to it, but holy crap, viruses are what we really need to be scared of. He couldn't wrap his head around how mind-bogglingly stupid it was to pay a guy to stop viruses by letting the guy into his online bank account.

I came into the house and he was on the phone with the guy, with his bank account open, and the guy had remote control of his computer. So I actually talked to the piece of crap. I said some evil things to the guy. We eventually had to change his phone number.

We went to the bank (BMO) so we could put some controls on his account (with his approval) and they couldn't have been more useless. He kept getting suckered.

The 'pretending to be a relative in jail' one got him too, but the cashier at the Walmart where he was buying gift cards actually caught it and questioned him, not the useless tit at BMO we had talked to two weeks earlier.