How State’s Disinformation-Fighting Arm Uses Artificial Intelligence – Nextgov

  • Lauren
  • April 16, 2021

Emerging technologies—and partnerships promoting their use—have proven instrumental for the State Department’s Global Engagement Center, a hub that steers federal efforts to counter state-sponsored propaganda and disinformation campaigns aimed at undermining the U.S. 
“Artificial intelligence and the tools that it offers are really helping us to understand what’s happening in the environment, and to identify coordinated activity,” the GEC’s Acting Coordinator Daniel Kimmage said Thursday. “There’s obviously a much broader range of activity across the State Department, but for us it’s a powerful way to better understand what’s happening in the environment, and identify coordinated activity.”
The center was mandated by Congress several years ago to help tackle challenges around diplomacy in the digital age. At an event hosted by Foreign Policy, Kimmage offered a glimpse into how technology is impacting and enabling GEC’s work, particularly as online disinformation campaigns led by U.S. adversaries grow in sophistication.
“We’ve got what you might describe as our traditional sources of information—we have the cables, our diplomatic colleagues out in the field. We have an analysis from our colleagues in the intelligence community, and we have a huge and growing ocean of open-source information,” he explained. “We can take that and use statistical methods, artificial intelligence and data processing to arrive at a better understanding.”
Among an array of resources in the center’s “AI Toolbox,” Kimmage said natural language processing, or essentially how computers understand and analyze human language, and topic modeling, which he described as mathematical models for determining what a text is about, are particularly useful. “We take this and we generally produce open-source, unclassified reports for our colleagues and partners to use—once again—to better understand the environment and identify coordinated activity by malicious actors,” he explained.
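The article does not detail which models the GEC uses, but the core idea of scoring what a text is “about” can be illustrated with a toy TF-IDF sketch. Everything below is hypothetical: the sample snippets, the stopword list, and the scoring are illustrative only, and a real topic-modeling pipeline would use richer methods such as latent Dirichlet allocation.

```python
from collections import Counter
import math

# Hypothetical text snippets, purely for illustration.
docs = [
    "vaccine health virus outbreak vaccine clinic",
    "election ballot vote fraud ballot recount",
    "virus vaccine outbreak hospital health",
]

# Minimal illustrative stopword list.
STOPWORDS = {"the", "a", "of", "and", "to"}

def tokenize(text):
    """Lowercase, split on whitespace, and drop stopwords."""
    return [w for w in text.lower().split() if w not in STOPWORDS]

def tf_idf(docs):
    """Score each term in each document by term frequency times
    inverse document frequency, so terms distinctive to one
    document score higher than terms common to all of them."""
    tokenized = [tokenize(d) for d in docs]
    doc_freq = Counter()
    for toks in tokenized:
        doc_freq.update(set(toks))          # count each term once per doc
    n = len(docs)
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        scores.append({
            w: (c / len(toks)) * math.log(n / doc_freq[w])
            for w, c in tf.items()
        })
    return scores

# The top-scoring terms serve as a crude "topic" label per document.
for i, s in enumerate(tf_idf(docs)):
    top_terms = sorted(s, key=s.get, reverse=True)[:2]
    print(f"doc {i}: {top_terms}")
```

The second document, for example, surfaces election-related terms as its most distinctive vocabulary, which is the intuition behind using such models to cluster and summarize large volumes of open-source text.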
At the same time, AI can also underpin operational mechanisms nefarious players can deploy in orchestrated campaigns or ad-hoc web-based moves against the U.S. “So, what are those tools? Those are better ways to generate fake online personas, generate images that can buttress those personas—even synthetic text generation tools,” Kimmage noted. Pointing to some of America’s specific competitors in this realm, he added that from Russia, the GEC sees a lot of what he would describe as disruptive activities, such as troll farms and organized information leaking and hacking operations.
“On the Chinese side, it’s more of what I would say is a classic propaganda push—advancing broad narratives, often deceptive narratives about China’s role in the world, et cetera,” he said.
That doesn’t mean the U.S. is defenseless against these efforts, though. Kimmage said the GEC is pursuing research and initiatives to advance its detection capabilities, as well as “inoculation”—boosting U.S. communities’ media literacy and awareness to combat the threats.
“If I could really use one word to characterize the whole philosophical approach of the Global Engagement Center, it’s partnerships,” he added. “We realize that the U.S. government is, in most cases, not the most effective communicator with most audiences, and the U.S. government is not always going to be the most innovative, because there are all kinds of constraints in a big bureaucracy. So, what we try to do is identify and support partners who are at the cutting edge of innovation, who are thinking creatively about what’s going to happen next.”
Much of the activity the GEC tracks plays out on digital platforms and is monitored with technical tools. The center established a technology engagement team that works with officials in the tech sector to identify innovative tools that could be put to use against the threats. The team maintains a testbed and holds biweekly demonstrations.
“We’ve reviewed over 200 tools,” Kimmage said. “We’ve done more in-depth testing on 25 tools and some of those are now being put to work by some of our partners.” 
GEC also maintains an online platform and community of interest—disinfocloud.com—to connect with relevant stakeholders. 
“We’re very eager to engage,” Kimmage said.

Source: https://www.nextgov.com/emerging-tech/2021/04/how-states-disinformation-fighting-arm-uses-artificial-intelligence/173401/