The Evolution of Sorting Algorithms from Dinosaurs to AI

Category Science

tldr #

Sorting has been an integral part of computing since the field's earliest days, and it has recently returned to the forefront of computing research. In June, Google's DeepMind team announced a breakthrough in sorting technology that is up to 70% faster than current methods, achieved by using reinforcement learning to discover improved computer science algorithms. These algorithms are used trillions of times every day in software applications of all kinds.


content #

Sorting, or the structured ordering of data, has been a core principle of computing since the first computers were developed. The ordering and processing of numbers was demonstrated by the Babylonians around 2500 BC, the Egyptians followed suit around 1550 BC, and around 300 BC the Greek mathematician Euclid devised a method for quickly finding the greatest common divisor of two integers.
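Euclid's method survives essentially unchanged in modern code. The C++ sketch below is only an illustration of the classic algorithm; the function name and the sample values are invented for the example.

```cpp
#include <iostream>

// Euclid's algorithm: repeatedly replace the pair (a, b) with (b, a mod b)
// until the remainder is zero; the last nonzero value is the GCD.
long long gcd(long long a, long long b) {
    while (b != 0) {
        long long remainder = a % b;
        a = b;
        b = remainder;
    }
    return a;
}

int main() {
    std::cout << gcd(48, 36) << "\n";  // prints 12
    return 0;
}
```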

In the mid-1800s, ordering was a primary focus of mathematician Augusta Ada King, better known as Ada Lovelace, the daughter of the poet Lord Byron. She created the first algorithm, intended for use on what was then a theoretical machine imagined by her mentor, the mathematician Charles Babbage, known as the father of computers. For that achievement, she earned the title of first computer programmer.

In 1951, another woman, Frances Elizabeth Holberton, designed the first generative programming system, a rudimentary sort/merge procedure for the U.S. Army. She also helped program ballistics trajectories during World War II.

Actually, according to computer design and cryptology expert Frank Rubin, sorting can be traced to life forms that predate humans by some 65 million years. Dinosaurs, he said, performed simple sorting: they classified all living things into two categories, "food" and "not food."

The pace of computing algorithm development has quickened from the mid-20th century to the present. We now have computers capable of a quintillion calculations per second.

The Babylonians—maybe even the dinosaurs—would be quite impressed.

Also impressive is a breakthrough announced on June 7 by the team at Google's DeepMind in an online blog post.

The team devised an approach to sorting numbers that is up to 70% faster than current methods. The algorithms have been in use for a year, having been added to the standard C++ library, and the open-source code is now used by millions of developers and companies globally, according to DeepMind.

The AI project, called AlphaDev, is "an artificial intelligence system that uses reinforcement learning to discover enhanced computer science algorithms—surpassing those honed by scientists and engineers over decades," DeepMind reported on its blog. The accompanying paper was published in the journal Nature.

AlphaDev builds upon the success of its predecessor, AlphaZero, which mastered strategies behind Go and chess.

AlphaDev's training on sorting was conducted using what the researchers described as a "single player assembly [language] game."

Sort algorithms were built one instruction at a time, with AlphaDev continually exploring candidate instruction sequences to find one that worked better than the last. The process uses neural networks to guide the search over instructions that compare and move values, with the goal of producing correct results in the shortest possible time.
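The routines in question sort a small, fixed number of values using short sequences of compare-and-move operations. The C++ sketch below is not the sequence AlphaDev discovered; it is a generic illustration of that style of fixed-length, branch-free sorting, with the name sort3 invented for the example.

```cpp
#include <algorithm>  // std::min, std::max
#include <iostream>

// Fixed-length sort of three values built from compare-and-move style
// operations, with no loops or data-dependent branches. This mirrors the
// kind of short instruction sequence AlphaDev searched over, but it is
// not the specific sequence the system found.
inline void sort3(int &a, int &b, int &c) {
    int lo = std::min(a, b);
    int hi = std::max(a, b);
    a = std::min(lo, c);     // smallest of the three
    int rest = std::max(lo, c);
    c = std::max(hi, rest);  // largest of the three
    b = std::min(hi, rest);  // the remaining middle value
}

int main() {
    int x = 7, y = 2, z = 5;
    sort3(x, y, z);
    std::cout << x << " " << y << " " << z << "\n";  // prints 2 5 7
    return 0;
}
```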

"Moore's Law is coming to an end, where chips are approaching their fundamental physical limits," said DeepMind scientist Daniel Mankowitz. "We need to find new and innovative ways of optimizing computing." .

The research focused on short lists of up to five elements. The researchers said algorithms for sorting three to five elements are the ones most frequently used by programmers, and such algorithms are called "trillions of times" daily in various software applications, according to the paper.
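Part of why those short sorts matter so much is that larger sorting routines commonly hand small sub-ranges off to them. The C++ sketch below shows that pattern only in a generic way; the function names, the size threshold, and the insertion-sort stand-in are invented for illustration and are not DeepMind's code or the standard library's implementation.

```cpp
#include <algorithm>
#include <iostream>
#include <utility>
#include <vector>

// Stand-in for a specialized short sort (e.g. a 3- to 5-element routine).
void sort_small(std::vector<int>& v, std::size_t lo, std::size_t hi) {
    for (std::size_t i = lo + 1; i < hi; ++i)                  // insertion sort
        for (std::size_t j = i; j > lo && v[j] < v[j - 1]; --j)
            std::swap(v[j], v[j - 1]);
}

// A merge sort that dispatches tiny ranges to the short routine, the way
// real library sorts lean on fixed-length sorts as building blocks.
void my_sort(std::vector<int>& v, std::size_t lo, std::size_t hi) {
    if (hi - lo <= 5) {                                         // tiny range
        sort_small(v, lo, hi);
        return;
    }
    std::size_t mid = lo + (hi - lo) / 2;
    my_sort(v, lo, mid);
    my_sort(v, mid, hi);
    std::inplace_merge(v.begin() + lo, v.begin() + mid, v.begin() + hi);
}

int main() {
    std::vector<int> data{9, 4, 7, 1, 8, 2, 6, 3, 5, 0};
    my_sort(data, 0, data.size());
    for (int x : data) std::cout << x << ' ';                   // 0 1 2 ... 9
    std::cout << '\n';
    return 0;
}
```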

