Online Algorithms: A Simple-Sounding Problem That No Algorithm Can Fully Solve
Science · Monday, November 27, 2023, 06:05 UTC

Online algorithms are problem-solving strategies used to cope with uncertainty in the real world. One famous example is the k-server problem, which deals with dispatching agents to fulfill requests that arrive one by one while minimizing the total distance traveled. Researchers had long believed that a certain guaranteed level of performance is always achievable for this problem, but three computer scientists have now shown that in some cases it is not.
In life, we sometimes have to make decisions without all the information we want; that's true in computer science, too. This is the realm of online algorithms — which, despite their name, don't necessarily involve the internet. Instead, these are problem-solving strategies that respond to data as it arrives, without any knowledge of what might come next. That ability to cope with uncertainty makes these algorithms useful for real-world conundrums, like managing memory on a laptop or choosing which ads to display to people who browse the web.
Researchers study generalized versions of these problems to investigate new methods for tackling them. Among the most famous is the "k-server problem," which describes the thorny task of dispatching a fleet of agents to fulfill requests coming in one by one. They could be repair technicians or firefighters or even roving ice cream salespeople.
"It's really simple to define this problem," said Marcin Bieńkowski, an algorithms researcher at the University of Wrocław in Poland. But it "turns out to be bizarrely difficult." Since researchers began attacking the k-server problem in the late 1980s, they have wondered exactly how well online algorithms can handle the task.

Over the decades, researchers came to believe that there's a certain level of algorithmic performance you can always achieve for the k-server problem: no matter what version of the problem you're dealing with, there'll be an algorithm that reaches this goal. But in a paper first published online last November, three computer scientists showed that this isn't always achievable. In some cases, every algorithm falls short.
"I am happy to say that it was a big surprise to me," said Anupam Gupta, who studies algorithms at Carnegie Mellon University and was not involved in the paper. The work offers "deeper insight into the central problem in this area."
Computer scientists first outlined this problem in 1988. To get a sense of it, let's imagine a company that employs plumbers. As calls come in, the company needs to decide which plumber goes to which location. The company's goal — and the goal of an algorithm for the k-server problem — is to minimize the total distance traveled by all the plumbers.
The tricky part is that the company doesn't know in advance where the calls will come from. If it did, then it could rely on an "offline algorithm" that knows the future. In particular, it could use an ideal dispatching strategy that finds a solution with the least total travel for any string of calls. No online algorithm can ever beat it, or even reliably match it.
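To make the offline ideal concrete, here is a minimal sketch (the function name and numbers are mine, for illustration, not from the paper) for the special case of two plumbers on a single street. With the whole sequence of calls known in advance, the cheapest dispatch plan can be computed by dynamic programming: after each call, one plumber stands at the last request, so the only thing to track is where the other plumber is.

```python
def offline_cost(servers, requests):
    """Exact minimum total travel for 2 servers on a line, computed
    with full hindsight (the whole request sequence is known).
    DP state: position of the server NOT at the most recent request."""
    a, b = servers
    first = requests[0]
    # Map: idle-server position -> cheapest total cost so far.
    states = {}
    for moved, idle in ((a, b), (b, a)):
        cost = abs(moved - first)
        if idle not in states or cost < states[idle]:
            states[idle] = cost
    last = first
    for r in requests[1:]:
        new = {}
        for idle, cost in states.items():
            # Option 1: the server at `last` takes this call too.
            c = cost + abs(last - r)
            if idle not in new or c < new[idle]:
                new[idle] = c
            # Option 2: the idle server takes it; `last` becomes idle.
            c = cost + abs(idle - r)
            if last not in new or c < new[last]:
                new[last] = c
        states, last = new, r
    return min(states.values())

# Plumbers start at 0 and 4; calls alternate between addresses 2 and 1.
print(offline_cost([0, 4], [2, 1, 2, 1, 2, 1]))  # 3
```

With hindsight, the best plan sends one plumber from 4 to 2 and the other from 0 to 1 up front (cost 3), after which every call is already covered. An online dispatcher, seeing one call at a time, has no way to know this in advance.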
But researchers want to get as close as possible. They measure an online algorithm's performance by comparing its total travel distance with the offline ideal's on the worst possible sequence of requests; the ratio of the two is known as the competitive ratio. Algorithm designers try to craft online strategies that approach the offline ideal, whittling this ratio down toward 1.
Let's imagine our plumbing company has just two plumbers, and it only serves a single, long street. The calls come one at a time. A reasonable first approach, known as the greedy algorithm, would be to dispatch whichever plumber is currently closest to each new call. This is simple, but it can go badly wrong: if the calls keep alternating between two nearby addresses, greedy may shuttle a single plumber back and forth indefinitely while the other stands idle far away, racking up travel that a dispatcher with hindsight would avoid.
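The weakness of greedy is easy to see in code. In this hypothetical simulation (positions and addresses are mine, chosen to make the failure visible), two plumbers start at 0 and 4 and the calls alternate between addresses 2 and 1. Greedy pays for every single call, while an offline plan pays a one-time cost of 3:

```python
def greedy_cost(servers, requests):
    """Total distance traveled when each call is served by the
    nearest plumber (ties broken by lower index)."""
    pos = list(servers)
    total = 0
    for r in requests:
        i = min(range(len(pos)), key=lambda j: abs(pos[j] - r))
        total += abs(pos[i] - r)
        pos[i] = r
    return total

# 20 calls alternating between addresses 2 and 1.
print(greedy_cost([0, 4], [2, 1] * 10))  # 21
# Greedy shuttles the first plumber back and forth and never uses
# the one at 4. An offline plan moves 0 -> 1 and 4 -> 2 up front,
# paying 3 total, so greedy's cost keeps growing as calls continue.
```

The longer the alternating sequence runs, the worse greedy looks next to the offline ideal, so its competitive ratio is unbounded in this scenario.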
The competitive ratio captures the distance traveled by the online algorithm, compared with the offline ideal. Over the years, researchers came to believe that a certain competitive ratio, depending only on the number of agents, can always be guaranteed, and they searched for algorithms that achieve it in every version of the k-server problem.
The new paper shows that this belief was too optimistic. Rather than testing algorithms one by one, the researchers constructed particular versions of the problem and proved that in those settings, every possible online algorithm must fall short of the long-conjectured level of performance. No amount of algorithmic cleverness can close the gap.
That's a surprise for computer scientists, who depend on algorithms to sort out complex problems. In some versions of the k-server problem, it now appears that no online algorithm can come close to matching the offline ideal. Still, the best known strategies can be useful in practice, Gupta said, depending on the specific version of the problem you're trying to solve.
And there's still more work to do in this field, he said. Computer scientists can probe deeper, developing new algorithms or trying different definitions of the problem, and they can seek out other settings where the ideal remains within reach.