AI (artificial intelligence) has received a lot of publicity in the last few months with Microsoft's integration of ChatGPT technology into its Bing search engine and Edge browser. Observers have been quick to point out instances where AI has given incorrect information, appeared to encourage racism, or opened the door to harm by bad actors. All of this talk may overlook the benefits of newer AI technology, but the concerns about harm are real.
One area of concern for fraud watchers is the potential for scammers to use AI to deliver scams that are more convincing and efficient. AARP cites early reports of an apparent enhancement of the grandparents scam in a warning to its members.
The grandparents scam is a classic fraud that targets older Americans. You receive a call from someone who claims to be your grandchild, niece or nephew, or another younger relative. The caller claims to be in some sort of trouble, such as being jailed after a car accident, and says they need money for bail and legal services. They often implore the target not to tell their parents.
A scammer can fake the identity of a relative at the start of the call without ever naming one. The target may blurt out a name in their initial reaction to the news, handing the scammer instant credibility. A skeptical target who is aware of this scam may quickly recognize the ploy and refuse to supply a name.
With new AI voice-cloning technology, scammers can create audio that sounds just like a relative, while text generators such as ChatGPT can turn out polished scam messages. Scammers can easily research potential targets on social media such as Facebook, identifying the relatives of anyone who lists them. They can then pull voice samples of a relative from a TikTok or Instagram video and use those samples to create a convincing impersonation.
KIRO TV has reported a potential use of AI in a scam in Pierce County. A mother and father received a phone call from someone they thought was their 16-year-old daughter. To mom and dad, it was unmistakably her voice. She said that she had been in an accident at Walmart. Then a man got on the phone and said, "If you ever want to see your daughter safe again, you will do what I tell you." Dad headed to Walmart while mom kept the scammer talking. She was getting ready to go to the bank and withdraw $10,000 in cash, but she also texted her daughter to check whether the call was genuine. When the scammer demanded a wire transfer instead, the mother hung up. Meanwhile, the daughter returned the text to say that she was at school.
The difference in this incident, of course, was that the targets were parents rather than grandparents. And the fact that the initial part of the call was so convincing suggests that AI was used to simulate the daughter's voice.
An example of how effective AI-generated audio can be is shown by blogger Leo Notenboom in this posting on his "Ask Leo" blog: https://askleo.com/dont-trust-your-ears/. The samples he provides are not perfect, but they may be good enough to fool someone, at least initially.
The Pierce County incident demonstrates that scammers will use the latest technology to gain an edge in taking your money or your personal information. The development of AI has alarmed fraud watchers because it greatly increases a scammer's ability to fake an identity. A multi-pronged campaign can combine voice cloning, AI-generated email, and deepfake videos.
How can you protect yourself from AI-enhanced scams? AARP makes the following recommendations:
· Don't trust your caller ID. Scammers can easily spoof any number. If you receive an unsolicited call from a business, hang up, look up its number, and call the number that you found.
· Pause before you click. If you receive an email or text, never click on any links unless you verify that they go to the right website. Phishing is a scheme in which a scammer sends emails or texts whose links download malware onto your computer or phone or trick you into revealing information about yourself.
· Consider choosing a safe word for your family. With a safe or secret word that only members of your family know, if you receive a call from a supposed family member and suspect it could be a scammer, you can ask for the secret word.
· Contact your family member separately. This is what the mother did in the incident above: she texted her daughter to find out whether she was truly in distress.
· Guard your personal information. Don't give out personal information such as your full name, home address, Social Security number, or credit card or banking information to strangers who have contacted you out of the blue.
· Report scams. If you have been contacted by a scammer or have been victimized by one, report it to your local police and to the Federal Trade Commission (FTC) at www.reportfraud.ftc.gov.
Sources: KIRO TV, AARP, Federal Trade Commission