It was a day like any other when Philadelphia attorney Gary Schildhorn received a phone call that would jolt his world. An urgent voice, bearing an uncanny resemblance to his son's, pleaded for $9,000 to post bail following a car accident. The call, however, was not from his son but from a scammer harnessing artificial intelligence (AI) to mimic his son's voice, a chilling reminder of the pernicious capabilities of AI voice cloning.
Unmasking the AI Voice Scam
Schildhorn was directed to reach out to an alleged public defender and transfer the money via a Bitcoin kiosk. This unusual payment method raised an alarm, and a swift FaceTime call to his son confirmed his worst fear: he was the target of a scam. The incident fits a sinister pattern identified by the Federal Trade Commission (FTC), known as the 'family emergency scam', in which scammers exploit voice cloning technology to generate realistic audio of a person's voice, often from a brief sample retrieved from social media.
Law Enforcement's Battle Against AI Scams
The FBI acknowledges the prevalence of such scams but admits that its hands are often tied unless money is transferred overseas. The FTC, meanwhile, has publicly warned how difficult these scams are to trace and how rarely money is recovered once it is sent through channels like cryptocurrency or gift cards. To combat the scourge of AI voice cloning scams, the FTC has launched a challenge offering a $25,000 cash prize for effective, consumer-friendly solutions.
The Legal Grey Area
Despite these laudable efforts, current legislation falls short of protecting victims. Copyright law does not treat a person's voice as protectable property, and privacy laws apply only to the individual whose voice is cloned, leaving the scam victim legally defenseless. Now fully aware of the magnitude of this scam, Schildhorn has taken it upon himself to alert others to the con artists' modus operandi, hoping to prevent further instances of fraud.