DHS warns of threat to elections from artificial intelligence



The analysis, compiled by the Department of Homeland Security and obtained by ABC News, shows that with less than six months to go until Election Day, the next generation of technologies meant to drive progress also presents opportunities for abuse that threaten the elections at the foundation of the democratic system.

The May 17 document states that as the 2024 election cycle progresses, generative AI tools are likely to increase opportunities for interference by threat actors by aggravating emergencies, disrupting election processes, and enabling attacks on election infrastructure. The bulletin said actors who view the upcoming U.S. elections as "attractive" and "priority" targets could use these tools to "influence and sow discord."

“This is not a problem of the future. This is a problem of today,” said John Cohen, former director of intelligence at the Department of Homeland Security and current ABC News contributor. “Domestic and international threat actors have taken full advantage of the Internet and are increasingly using advanced computing capabilities, such as artificial intelligence, to carry out their illicit activities.”

According to the bulletin, those seeking to target the election are already engaged in activities including "cyber-based hack-and-leak campaigns, voice spoofing, online disinformation campaigns, and threats and planned attacks against U.S. election symbols."

Now, the analysis warns that generative AI's capabilities could be exploited ahead of the election. The tools could be misused to "confuse or overwhelm voters and election staff to disrupt their work" through the creation or sharing of altered or deepfaked photos, videos, or audio clips that spread false information about Election Day details, such as claims that polling places are closed or that voting times have changed, or other tailored disinformation.

On the eve of the New Hampshire primary in January, a robocall that appeared to impersonate President Joe Biden's voice went viral, urging recipients to skip the state's primary and instead "save your vote" for the general election in November, according to audio obtained by ABC News at the time.

The DHS analysis specifically flagged that AI-generated voice message, noting that the timing of election-specific AI-generated media can be just as important as the content itself, because it may take time to counter-message or debunk false content once it has spread online.

Elizabeth Neumann, who served as an assistant secretary of homeland security during the Trump administration and is now an ABC News contributor, said this may be one of the most difficult elections the country has faced when it comes to Americans' ability to find the truth. Voters, she said, can no longer simply trust whether politicians are telling the truth, or the images they see in their social media feeds, emails and, in some cases, traditional media.

The 2024 campaign has been marked by increasingly toxic rhetoric, inflammatory campaign hyperbole and courtroom theatrics as Trump faces four criminal cases in which he maintains his innocence. Hate speech, misinformation and disinformation are rampant on social media and in real life, experts said, and rapidly evolving technology leaves the information environment vulnerable. Meanwhile, overseas, wars in the Middle East and Ukraine continue, Americans are divided in their views on foreign policy, and the conflicts have spilled over into protests on major American university campuses.

“Threat actors may seek to exploit deepfake video, audio content, and other generative AI media to amplify discontent,” the DHS analysis said. “Well-timed deepfakes and AI-generated media aimed at targeted audiences could prompt individuals to take actions that could lead to violence or physical disruption against elections and candidates.”

Securing the integrity of U.S. elections faces greater challenges than ever before as the threat landscape becomes "more diverse and complex" and the sophistication of artificial intelligence accelerates, top intelligence officials told lawmakers last Wednesday.

Director of National Intelligence Avril Haines told a Senate committee holding a hearing focused on threats to the 2024 election that it is important to leverage every available tool as the challenges grow. An increasing number of foreign actors, including non-state actors, are seeking to engage in election influence activities, she said, adding that emerging technologies, particularly generative AI and big data analytics, are expanding the pool of actors capable of running targeted influence campaigns.

"Innovations in AI have enabled foreign influencers to generate seemingly authentic and customized messages more efficiently and at scale," Haines added. She also pointed to lessons learned since the 2016 presidential election, saying that while the threat landscape is becoming increasingly complex, in her view the U.S. government has never been better prepared to meet the challenge.

Experts said that at this sensitive time, authorities at all levels need to be prepared to prevent artificial intelligence from being used to spread false information.

"One of the most important things we have to do now is educate and prepare the general public, because they are the people who will be targeted by this content," Cohen said, adding that the purpose of such content is to influence how people think and act.

"State and local officials should have a plan in place to use trusted sources to counter and correct inaccuracies when this content is detected, because it will spread rapidly through the online media ecosystem and must be countered immediately," Cohen added. "Law enforcement and the security community have been slow to adapt to this rapidly evolving threat environment. We are still using yesterday's strategies to deal with today's threats." He likened it to going into battle armed with knives.


