Biases in AI Lead to Unequal Climate Action

Category Machine Learning

tldr #

Researchers warn of data gaps and biases in the collection of climate data for AI-driven climate modeling tools, leading to unreliable climate predictions and unequal global climate action. Particular attention should be paid to data justice to guide equitable mitigation and adaptation efforts.

content #

Developing countries risk missing out on important global climate action because of unreliable weather predictions resulting from biased data fed into artificial intelligence (AI) tools, researchers warn.

Biases in the collection of climate data, which AI-driven climate modeling tools rely on, can limit the usefulness of such emerging technologies for climate scientists trying to predict future scenarios and guide global action, according to the paper published 17 August in npj Climate Action.

AI computer programs used in climate science are tailored to trawl through complex datasets in search of patterns. But a lack of information from certain locations, time periods or societal groups creates "holes" in the data that can lead to inaccurate climate predictions and misleading conclusions, the researchers say.
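The effect of such "holes" can be illustrated with a minimal sketch. The data and regions below are entirely hypothetical, not from the study: a simple model is fitted only on stations from a well-monitored region, then evaluated on a region absent from the training data, where its error is much larger.

```python
# Minimal hypothetical sketch: a model trained only on well-observed
# "North" stations generalises poorly to unobserved "South" stations.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def mae(model, xs, ys):
    """Mean absolute error of the fitted line on (xs, ys)."""
    a, b = model
    return sum(abs((a * x + b) - y) for x, y in zip(xs, ys)) / len(xs)

# Synthetic stations: predictor vs. temperature anomaly, with a
# different local relationship in each region (invented numbers).
north = [(x, 2.0 * x + 1.0) for x in range(10)]  # densely observed
south = [(x, 0.5 * x + 5.0) for x in range(10)]  # missing from training

model = fit_line(*zip(*north))          # trained only where data exists
err_north = mae(model, *zip(*north))
err_south = mae(model, *zip(*south))
print(err_north, err_south)             # error is far larger for the unseen region
```

The point is not the toy model but the asymmetry: wherever observations are missing, the fitted relationship is extrapolated from somewhere else, and the resulting predictions inherit that region's patterns.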

Lead author Ramit Debnath, assistant professor of computational social science at the University of Cambridge, U.K., says these data gaps are more pronounced in the global South due to challenges accessing datasets for all types of modeling and analytical purposes.

Research shows developing countries are more vulnerable to climate change impacts because they lack resources for adaptation and mitigation.

"The data gap is hard to quantify accurately, but the general trend is most AI-led climate companies are based in the global North," Debnath told SciDev.Net. It means there are "high chances the models are well-calibrated and built for global North scenarios as they already have appropriate data and weather monitoring infrastructure," according to Debnath.

Biases can be broadly classified into biased programming, biased datasets and biased algorithms. The study says programming and algorithm biases are easier to correct because they are mathematical. Biased datasets are the hardest to fix, however, because the missing data simply does not exist, and building it from the ground up takes significant capital and human investment, the researchers say.

To solve the data gap issue, new AI platforms need data governance schemes to audit data inputs.
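One simple form such an audit could take is a coverage check on the input data before training. The function, threshold and records below are hypothetical illustrations, not part of the study: the sketch counts records per region and flags any region whose share of the dataset falls below a chosen minimum.

```python
# Hypothetical data-governance audit: flag regions that are
# under-represented in the training inputs before a model is built.
from collections import Counter

def audit_coverage(records, min_share=0.15):
    """Return {region: share} for regions below the min_share threshold."""
    counts = Counter(r["region"] for r in records)
    total = sum(counts.values())
    return {region: n / total
            for region, n in counts.items()
            if n / total < min_share}

# Invented example input: far more observations from the global North.
records = (
    [{"region": "Europe"} for _ in range(60)]
    + [{"region": "North America"} for _ in range(30)]
    + [{"region": "Sub-Saharan Africa"} for _ in range(10)]
)

flagged = audit_coverage(records)
print(flagged)  # {'Sub-Saharan Africa': 0.1}
```

A real governance scheme would audit far more than record counts (time coverage, sensor quality, societal groups), but even this minimal check makes a data gap visible before it is baked into a model.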

For example, generative AI—technology that can produce content such as text and images—often struggles with topics related to the global South, especially when asked about local contexts in Sub-Saharan Africa, India, or other non-Western nations, Debnath explained.

"Similarly, many reports suggest present-generation AI technologies often show biases in job application results based on names, skin color, etc.," he added. "Such biases, although not directly related to climate action, show that a lack of representative datasets is a challenge in using AI tools for any form of decision-making."

The consequence of biased climate data is inaccurate predictions of extreme weather events, the researchers say. Poorer global South nations are already more vulnerable to climate change impacts. Inaccurate predictions due to biased or missing datasets will further slow mitigation and adaptation efforts—leading to more climate-induced damage to property, human and social capital, they argue.

AI-driven climate modeling tools use machine learning algorithms to detect patterns from complex datasets.

"The most important finding of our study is that data justice can be key to climate justice," said Debnath, who is also a fellow of the university's Cambridge Zero climate change initiative. "Our recommendation is that whoever is building a climate action AI must be aware of the data gaps and embedded biases."
