Modern Warfare: Uncovering Disinformation with the Help of Location Intelligence
Science | Monday, May 1, 2023, 01:57 UTC

Location intelligence at ORNL is being used to identify and counteract modern disinformation threats, such as those seen during the COVID-19 pandemic and recent global elections. This research combines open data to understand places and the factors that influence human activity in them. Using a newly developed automated method to measure the intent of social media users, researchers discovered distinctive spatial patterns in how information spreads and the methods users employ to intensify that spread.
Using disinformation to create political instability and battlefield confusion dates back millennia. However, today's disinformation actors use social media to amplify disinformation that users knowingly or, more often, unknowingly perpetuate. Such disinformation spreads quickly, threatening public health and safety. Indeed, the COVID-19 pandemic and recent global elections have given the world a front-row seat to this form of modern warfare.
A group at ORNL now studies such threats thanks to the evolution of location intelligence at the lab, a field of research that uses open data to understand places and the factors that influence human activity in them. In the past, location intelligence has informed emergency response, urban planning, transportation planning, energy conservation and policy decisions. Now, location intelligence at ORNL also helps identify disinformation, or shared information that is intentionally misleading, and its impacts.
"Up until now, we knew disinformation campaigns existed online, but we did not know how the disinformation flowed," said Gautam Thakur, leader of ORNL's Location Intelligence Group. "By bridging a gap between the virtual world and the physical world, we can now help provide insights that agencies and organizations can use to counteract such threats." .
Today's disinformation campaigns spread fast and deep. Disinformation actors design carefully crafted messages targeting specific audiences. To spread disinformation, these actors often use bots, or computer algorithms that emulate human behavior online. Only a few narratives need to catch on to create vulnerability, build cohesion among extremist groups or erode civil trust.
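The article does not describe how such bots are identified, but one weak signal commonly used in this area of research is posting regularity: automated accounts tend to post at machine-like intervals, while humans are irregular. The sketch below is a toy illustration of that idea only; the function names and heuristic are assumptions, not ORNL's method.

```python
# Toy heuristic (not ORNL's method): score accounts by how clockwork-regular
# their posting cadence is. A very low coefficient of variation in the gaps
# between posts is one weak hint of automation.
from statistics import pstdev, mean

def cadence_score(post_timestamps: list[float]) -> float:
    """Return a score in [0, 1]; higher means more machine-like regularity."""
    if len(post_timestamps) < 3:
        return 0.0  # too few posts to judge
    gaps = [b - a for a, b in zip(post_timestamps, post_timestamps[1:])]
    avg = mean(gaps)
    if avg == 0:
        return 1.0  # every post at the same instant: maximally suspicious
    cv = pstdev(gaps) / avg   # coefficient of variation of inter-post gaps
    return 1.0 / (1.0 + cv)  # cv near 0 (clockwork posting) -> score near 1

# Posting exactly every 60 seconds scores ~1.0; irregular timing scores lower.
print(cadence_score([0, 60, 120, 180, 240]))   # ~1.0, suspiciously regular
print(cadence_score([0, 45, 300, 320, 900]))   # ~0.5, more human-like
```

In practice a single signal like this would be combined with many others, since sophisticated bots deliberately randomize their behavior.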
Some of the group's latest work involves understanding how to measure the intent of social media users based on tweets sent during the COVID-19 pandemic.
"We discovered we needed an automated method to quantify the intent of all social media users to help keep the public safe from disinformation actors," said Chathika Gunaratne, a postdoctoral researcher in ORNL's Computing and Computational Sciences Directorate.
With help from computational data engineer Varisara Tansakul and data science researcher Debraj De, the multidisciplinary team tested a new approach on 4.7 million COVID-19-related tweets from more than 14,000 users. The team combined the results of this study with its other studies on intent and disinformation. As a result, it can now correlate breaking news notifications with disinformation actors' online responses, incorporating information such as the unique spatial patterns of information spread and methods used by social media users to intensify this spread. Some of this work is captured in the group's latest conference paper published in September 2022 in Social, Cultural, and Behavioral Modeling.
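The paper itself is not reproduced here, but the workflow described, labeling the intent behind each tweet and correlating disinformation-labeled activity with breaking-news pushes, can be sketched in a few lines. Everything in this sketch (the keyword cues, the 15-minute window, the function names) is a simplifying assumption for illustration; the team's published method quantifies intent with far more rigor than keyword matching.

```python
# Minimal sketch of the described pipeline: assign a coarse intent label to
# each tweet, then count how many disinformation-labeled posts land shortly
# after a breaking-news alert. Cues, window, and names are assumptions.
from datetime import datetime, timedelta

DISINFO_CUES = {"hoax", "plandemic", "cover-up", "they don't want you to know"}

def label_intent(text: str) -> str:
    """Crude keyword-based intent label (stand-in for a trained model)."""
    lowered = text.lower()
    return "disinform" if any(cue in lowered for cue in DISINFO_CUES) else "other"

def responses_after_alert(tweets: list[tuple[datetime, str]],
                          alert_time: datetime,
                          window: timedelta = timedelta(minutes=15)) -> int:
    """Count disinformation-labeled tweets posted within `window` of an alert."""
    return sum(
        1 for ts, text in tweets
        if alert_time <= ts <= alert_time + window
        and label_intent(text) == "disinform"
    )

alert = datetime(2020, 3, 11, 17, 0)  # e.g., a pandemic-related news push
tweets = [
    (datetime(2020, 3, 11, 17, 5), "This is a hoax, wake up"),
    (datetime(2020, 3, 11, 18, 0), "Stay safe and wash your hands"),
]
print(responses_after_alert(tweets, alert))  # -> 1
```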
Thakur began collecting data on human activities when he came to ORNL in 2013, but he soon realized the lab needed the ability to characterize, in real time, the human behaviors that were driving the data. Then, in 2015, ORNL released PlanetSense, a digital platform used to analyze online crowd-sourced data in real time. PlanetSense enables researchers to study human activity through the lenses of economy, culture and social ties, and to illustrate how and why things happen in different places. PlanetSense's impact can be seen in various applications, such as the detection of wildfires in the Great Smoky Mountains.
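PlanetSense's internals are not described in the article, but the core idea of turning crowd-sourced, geotagged data into place-level activity signals can be illustrated with a toy sketch. The grid size, field names, and sample posts below are assumptions for illustration only.

```python
# Illustrative only: bin a stream of geotagged posts into coarse grid cells
# so that spikes of activity in one place (e.g., wildfire chatter) stand out.
from collections import Counter

GRID_DEG = 0.1  # ~11 km cells at the equator; an arbitrary choice for this sketch

def grid_cell(lat: float, lon: float) -> tuple[int, int]:
    """Snap a coordinate onto a coarse grid so posts can be binned by place."""
    return (int(lat // GRID_DEG), int(lon // GRID_DEG))

def activity_by_place(posts):
    """Count posts per grid cell; a sudden spike in one cell hints at a local event."""
    return Counter(grid_cell(p["lat"], p["lon"]) for p in posts)

posts = [
    {"lat": 35.93, "lon": -84.35, "text": "smoke over the ridge"},    # East Tennessee
    {"lat": 35.96, "lon": -84.32, "text": "fire crews heading east"},
    {"lat": 40.71, "lon": -74.01, "text": "unrelated chatter"},       # New York
]
for cell, n in activity_by_place(posts).most_common():
    print(cell, n)  # the East Tennessee cell counts 2 posts, New York counts 1
```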