The Impacts of Black Box Artificial Intelligence
Category: Machine Learning | Monday, December 18, 2023, 05:29 UTC

Mike Capps promotes transparency in artificial intelligence decision-making through his company Howso. Howso has open-sourced its AI engine, which major organizations such as Mastercard and Virginia's Department of Behavioral Health and Developmental Services use to build explainable AI. Capps explains why black box AI is a problem and why people should care about the concept of attributable AI.
People should demand transparency in artificial intelligence like they do in their breakfast food, says Mike Capps, whose Raleigh company Howso allows users to see how AI arrives at its conclusions. "You'll want the same thing for what's deciding whether you get health care or if your kid gets into college," he said. "You want to have the same nutrition label on the side."
The former president of Epic Games, Capps cofounded Howso (originally named Diveplane) in 2018. Since then, artificial intelligence has taken off. Today, organizations use it for major decisions surrounding medical procedures, credit approvals and even paroles.
Capps contends that too many current AI engines offer variations of "black box AI" that obscure how final judgments are made.
"We want people to stop using that crap and use ours instead," he said. "It's a big uphill battle, but we will achieve that." .
In September, Howso open sourced its AI engine, which allows users to design explainable artificial intelligence-driven platforms. And late last month, Capps traveled to Washington, D.C., to address the 7th Annual U.S. Senate AI Insight Forum.
The next day, he spoke to The News & Observer about what big tech companies get wrong about AI, what he believes Howso gets right, and why everyone should care. This conversation has been edited for clarity.
Q: Can you give an example of 'black box' artificial intelligence in use today?
A: Spotify is putting AI in, and they're growing their business with great AI that's black box. Is the DJ choosing music for you based on your history, or because Spotify would like you to see more artists like this? We don't know.
Q: How is Howso's engine different?
A: We do attributable AI, which means you can say, "If Brian gets the surgery, here are the 17 data points that were most important in that decision." I can attribute this decision back to these data points right here.
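To make the idea concrete: one simple way a prediction can be attributed back to specific data points is instance-based learning, where the model's output is a direct function of a handful of nearby training records. This is only a minimal illustrative sketch of that general idea, not Howso's actual engine; the function name and toy loan-style data are invented for the example.

```python
import math

def knn_predict_with_attribution(train, query, k=3):
    """Predict a label for `query` and report exactly which
    training records drove the decision (the attribution)."""
    # train: list of (feature_vector, label) pairs
    nearest = sorted(train, key=lambda rec: math.dist(rec[0], query))[:k]
    # Majority vote among the k nearest records.
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    prediction = max(votes, key=votes.get)
    # Return the prediction plus the records it is attributable to.
    return prediction, nearest

# Toy example: two features per applicant, labels are decisions.
history = [
    ([1.0, 1.0], "approve"),
    ([1.0, 2.0], "approve"),
    ([5.0, 5.0], "deny"),
    ([6.0, 5.0], "deny"),
]
decision, influential = knn_predict_with_attribution(history, [1.5, 1.5])
print(decision)      # which decision was made
print(influential)   # the specific data points behind it
```

Unlike a black box model, a reviewer here can inspect `influential` and see the precise records that produced the outcome.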
We have a client, Scanbuy, that works with all the big retailers to do customer intelligence. They have our tool built in, so they can make predictions about what customers will buy, but do it in a way that's explainable.
N.C. State and UNC are both using it for some projects. There are a few other universities that aren't public yet that are working with it. Part of going open source is (anyone can) use it. So it's pretty recent.
Other Howso clients include Mastercard, the Spanish insurance company Mutua de Madrileña, and the Virginia Department of Behavioral Health and Developmental Services.
Q: How is black box AI a problem overall?

A: The way I explain it is: Why do we care what's in cereal?
We put the ingredients on the side because, if they weren't there, would we really trust every food manufacturer to be doing it the right way? Now, legally, you have to do it. And we know not everybody even reads it, but the fact that it's there is part of how we know.
(Doing black box) is also bad software development.
We're talking about decisions that have serious human impact. If there is a bug in software, you can go fix it. But if you use a big black box AI system, you don't know if it has a bug or not. It's too complicated to figure it out. And once you deploy it, the only way to fix it is effectively to start over.