Probabilistic Computing - A Promising Alternative to Quantum Computing
Category Science Saturday - May 13 2023, 02:42 UTC

Using existing fabrication methods, a group of researchers have now demonstrated the capabilities of probabilistic computing, employing stochastic nanodevices to solve NP-complete problems efficiently and reliably. This potential alternative to quantum computing is easier to implement and scale, making it feasible and cost-effective.
According to computational complexity theory, mathematical problems have different levels of difficulty in the context of their solvability. While a classical computer can solve some problems (P) in polynomial time—i.e., the time required for solving P is a polynomial function of the input size—it often fails to solve NP problems that scale exponentially with the problem size and thus cannot be solved in polynomial time. Classical computers based on semiconductor devices are, therefore, inadequate for solving sufficiently large NP problems.
In this regard, quantum computers are considered promising as they can perform a large number of operations in parallel. This, in turn, speeds up the NP problem-solving process. However, many physical implementations of quantum computers are highly sensitive to thermal fluctuations. As a result, quantum computers often demand stringent experimental conditions, such as extremely low temperatures, making their fabrication complicated and expensive.
Fortunately, there is a lesser-known and as-yet underexplored alternative to quantum computing, known as probabilistic computing. Probabilistic computing utilizes what are called "stochastic nanodevices," whose operations rely on thermal fluctuations, to solve NP problems efficiently. Unlike in the case of quantum computers, thermal fluctuations facilitate problem solving in probabilistic computing. As a result, probabilistic computing is, in fact, easier to implement in real life.
Shedding much-needed light on this potential alternative, a group of researchers have now demonstrated the capabilities of probabilistic computing by simulating stochastic nanodevice networks to solve specific NP problems. The study, led by Professor Peter Bermel from Purdue University, is published in the Journal of Photonics for Energy (JPE).
The researchers used the "Ising model," a canonical model for simulating a wide variety of physical as well as mathematical problems. Originally devised to model the interactions of magnetic dipole moments of atomic spins, its energy operator, namely the "Hamiltonian," can also represent NP problems. Essentially, solving an NP problem amounts to solving the corresponding Ising Hamiltonian. Probabilistic computing devices made of networks of optical parametric oscillators (OPOs) and stochastic circular nanomagnets with low thermal barriers have been used to solve such problems.
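To make the connection concrete, the Ising Hamiltonian assigns an energy to each configuration of spins, H = -Σ J_ij s_i s_j - Σ h_i s_i, and solving a problem means finding the spin configuration that minimizes this energy. A minimal sketch (the coupling matrix and fields here are illustrative, not taken from the study):

```python
import numpy as np

def ising_energy(spins, J, h):
    """Energy of an Ising configuration: H = -sum_ij J_ij s_i s_j - sum_i h_i s_i."""
    spins = np.asarray(spins)
    # Count each coupled pair once via the upper triangle of J.
    return -np.sum(np.triu(J, k=1) * np.outer(spins, spins)) - np.dot(h, spins)

# Two ferromagnetically coupled spins: aligned states have lower energy.
J = np.array([[0.0, 1.0],
              [1.0, 0.0]])
h = np.zeros(2)
print(ising_energy([+1, +1], J, h))  # -1.0 (aligned, favored)
print(ising_energy([+1, -1], J, h))  # 1.0 (anti-aligned, penalized)
```

Encoding an NP problem then amounts to choosing J and h so that the Hamiltonian's ground state corresponds to the problem's solution.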
The researchers implemented one such nanomagnet network using existing fabrication methods. They then used it to solve the Ising Hamiltonians of four NP-complete problems (problems for which no efficient solution algorithm is known) from number theory associated with combinatorial optimization: number partitioning, exact cover, binary integer linear programming, and integer linear programming.
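Number partitioning illustrates the encoding well: asking whether a set of numbers can be split into two subsets of equal sum maps to the Hamiltonian H = (Σ n_i s_i)², whose ground-state energy is zero exactly when a perfect split exists. A hedged sketch using brute-force search in place of the hardware (a probabilistic computer would instead sample toward this minimum):

```python
import itertools

def partition_hamiltonian(numbers, spins):
    """Ising energy for number partitioning: H = (sum_i n_i * s_i)^2.
    H = 0 exactly when the +1/-1 spins split the numbers into two equal-sum subsets."""
    return sum(n * s for n, s in zip(numbers, spins)) ** 2

def solve_partition(numbers):
    """Brute-force the ground state over all 2^N spin assignments (illustration only)."""
    best = min(itertools.product([-1, +1], repeat=len(numbers)),
               key=lambda s: partition_hamiltonian(numbers, s))
    return best, partition_hamiltonian(numbers, best)

# Total is 10, and a perfect 5/5 split exists, so the ground-state energy is 0.
spins, energy = solve_partition([3, 1, 1, 2, 2, 1])
print(spins, energy)
```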
The simulation results of the first three problems with 3, 3, and 6 probabilistic bits (p-bits) strongly agreed with the theoretical solution (Boltzmann law) of the Ising model. The researchers observed a similar agreement between modeling and theory in the simulations of five different exact cover problems with 3, 6, 9, 12, and 15 p-bits. This indicated the potential for scaling up probabilistic computing frameworks.
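The Boltzmann-law benchmark means that a network of p-bits, each flipping randomly under thermal fluctuations, should visit spin configurations with probability proportional to exp(-E/kT). A minimal sketch of this behavior using Metropolis sampling (a standard stand-in for p-bit dynamics, not the authors' device model), where the two lowest-energy aligned states dominate as the Boltzmann law predicts:

```python
import math
import random
from collections import Counter

def energy(spins, J):
    """Ising energy with zero field: H = -sum_{i<j} J_ij s_i s_j."""
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(len(spins)) for j in range(i + 1, len(spins)))

def metropolis_sample(J, n_spins, beta=1.0, steps=20000, seed=0):
    """Flip one p-bit at a time, accepting each move with the Metropolis rule."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n_spins)]
    counts = Counter()
    for _ in range(steps):
        i = rng.randrange(n_spins)
        old = energy(spins, J)
        spins[i] *= -1
        delta = energy(spins, J) - old
        if rng.random() >= math.exp(-beta * max(0.0, delta)):
            spins[i] *= -1  # reject the flip and restore the old state
        counts[tuple(spins)] += 1
    return counts

# Three p-bits with uniform ferromagnetic coupling (upper-triangular J).
J = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
counts = metropolis_sample(J, 3)
# The two fully aligned states (E = -3) should dominate the samples.
print(counts.most_common(2))
```

Comparing the observed visit frequencies against exp(-beta * E) for each state is exactly the kind of agreement with the Boltzmann law the study reports.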
According to Bermel, "In probabilistic computing, efficient scaling with problem size is more reliable than in quantum computing."