This is what a deepfake voice clone used in a failed fraud attempt sounds like
One of the stranger applications of deepfakes — AI technology used to manipulate audiovisual content — is the audio deepfake scam. Hackers use machine learning to clone someone’s voice, then pair that clone with social engineering techniques to trick targets into transferring money to accounts they control. Such scams have succeeded in the past, but how good are the voice clones being used in these attacks? We’ve never actually heard the audio from a deepfake scam — until now.
Security consulting firm NISOS has released a report analyzing one such attempted fraud, and shared the audio with Motherboard. The clip below is part of a voicemail sent to an employee at an unnamed tech firm, in which a voice that sounds like the...
from The Verge - All Posts https://ift.tt/30UDwG2