Washington, DC — February 6, 2024 — The FCC’s Enforcement Bureau today issued a cease-and-desist letter to Texas-based Lingo Telecom, the entity alleged to have originated robocall traffic that used AI-generated voice cloning to spread misinformation to voters prior to New Hampshire’s primary election. The letter demands that the company immediately stop supporting unlawful robocall traffic on its networks. In addition, the Bureau issued a “K4 Order,” which strongly encourages other providers to refrain from carrying suspicious traffic from Lingo. The FCC may proceed to require other network providers affiliated with Lingo to block its traffic should the company continue this behavior.
The FCC is taking this enforcement action in partnership with the New Hampshire Attorney General’s office, which announced today that it is issuing a cease-and-desist order to Life Corporation, the company on whose behalf Lingo Telecom facilitated the robocalls during the state primary, requiring it to cease conduct that violates voter suppression laws.
“Consumers deserve to know that the person on the other end of the line is exactly who they claim to be. That’s why we’re working closely with State Attorneys General across the country to combat the use of voice cloning technology in robocalls being used to misinform voters and target unwitting victims of fraud,” said FCC Chairwoman Jessica Rosenworcel. “I want to thank all of our partners for their cooperation in this investigation, and I look forward to our ongoing efforts to crack down on these robocallers.”
“Ensuring public confidence in the electoral process is vital. In the age of emerging Artificial Intelligence (AI) technologies, bad actors may try to use the technology to spread misinformation, manipulate public opinion, and interfere with the electoral process. AI-generated recordings used to deceive voters have the potential to have devastating effects on the democratic election process. All voters should be on the lookout for suspicious messages and misinformation and report it as soon as they see it,” said New Hampshire Attorney General John M. Formella. “The FCC’s partnership and fast action in this matter sends a clear message that law enforcement and regulatory agencies are staying vigilant and are working closely together to monitor and investigate any signs of AI being used maliciously to threaten our democratic process. We will continue to work with Attorneys General and law enforcement partners across the nation to thoroughly investigate and prosecute this case, as well as to share best practices and bolster election integrity safeguards across the country.”
“The increasing reliance on AI-generated voices to deceive the public, including as part of election disinformation campaigns, is a rapidly growing problem,” said Loyaan A. Egal, Chief of the Enforcement Bureau. “We will utilize every tool available to ensure that U.S. communications networks are not used to facilitate the harmful misuse of AI technologies. I thank our partners for their cooperation in this investigation and their ongoing efforts to stop and punish these illegal robocallers.”
Last month, the agency’s Enforcement Bureau—in coordination with the New Hampshire Attorney General, the bipartisan Anti-Robocall Multistate Litigation Task Force, and USTelecom’s Industry Traceback Group—launched an investigation into illegal robocalls made to New Hampshire voters that used AI-generated deepfake voice technology designed to appear as though President Biden was telling voters not to vote in the New Hampshire primary election.
To further the scheme, the illegal robocalls used a spoofed telephone number to obscure the calls’ origin and deceive voters. The investigation found that Lingo Telecom facilitated the robocalls on behalf of a company named Life Corporation. As detailed in today’s FCC letter, each of these parties has been warned about apparent illegal robocall violations in the past. The Anti-Robocall Multistate Litigation Task Force is also expected to issue a similar letter to Life Corporation.
Last November, the FCC launched a Notice of Inquiry to build a record on how the agency can combat illegal robocalls and how AI might be involved. The agency asked how AI might be used in scams that arise out of junk calls, for example by mimicking the voices of people consumers know, and whether this technology should be subject to oversight under the Telephone Consumer Protection Act (TCPA), the primary law the FCC uses to limit junk calls.
The TCPA restricts the making of telemarketing calls and the use of automatic telephone dialing systems and artificial or prerecorded voice messages. Under FCC rules, it also requires telemarketers to obtain prior express written consent from consumers before robocalling them. Last week, the Commission announced it is considering a proposal that would ensure AI-generated voices in robocalls are held to those same standards.
Prior cease-and-desist letters resulted in providers taking steps to mitigate robocall traffic. These companies have reported their robocall mitigation efforts to both the FCC and the Industry Traceback Group. In some cases, the providers reported to the Commission that they would shut down unlawful operations. The FCC continues to monitor these efforts and is prepared to take further action should a company backslide. The FCC Enforcement Bureau’s robocall cease-and-desist letters are archived at: https://www.fcc.gov/robocall-facilitators-must-cease-and-desist.