Simplifying the Data Requirements for Machine Learning on Quantum Computers

Category Machine Learning

tldr #

New quantum computing research has proven that machine learning on quantum computers requires simpler data than previously believed, taking another step toward making quantum machine learning easier, more accessible and more near-term. This opens the door to exploiting the capabilities of noisy, intermediate-scale quantum computers for practical tasks much faster than with conventional processing.


content #

New theoretical research proves that machine learning on quantum computers requires far simpler data than previously believed. The finding paves a path to maximizing the usability of today's noisy, intermediate-scale quantum computers for simulating quantum systems and other tasks better than classical digital computers, while also offering promise for optimizing quantum sensors.

"We show that surprisingly simple data in a small amount is sufficient to train a quantum neural network," said Lukasz Cincio, a quantum theorist at Los Alamos National Laboratory and co-author of the paper containing the proof, published in the journal Nature Communications. "This work takes another step in the direction of making quantum machine learning easier, more accessible and more near-term."

Quantum devices are expected to revolutionize computing by providing speed-ups in solving certain kinds of problems, compared to what is achievable today with conventional semiconductor-based computers.

The new paper emerged from a collaboration between a Los Alamos team, lead author Matthias Caro of Freie Universität Berlin, and other researchers from the United States, United Kingdom and Switzerland. The group has been developing the theoretical basis for more efficient algorithms, particularly for quantum machine learning, to exploit the capabilities of today's noisy machines while the industry works on improving the quality and increasing the size of quantum computers.


The new research paper builds on previous work by Los Alamos National Laboratory and its collaborators demonstrating that training a quantum neural network requires only a small amount of data. Taken together, these recent theoretical breakthroughs prove that organizing training with very few and very simple states offers a specific approach to performing practical work on today's limited quantum computers faster than on conventional, classical-physics-based computers.
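To make the idea concrete, here is a minimal sketch in plain NumPy of training a tiny "quantum neural network" on just two very simple states. The setup is entirely illustrative and is not the paper's construction: the network is a single trainable Y-rotation, the unknown dynamics is another fixed rotation, the training data are the two computational basis states, and the optimizer is plain finite-difference gradient descent.

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

# "Unknown" dynamics the network should learn (illustrative choice).
target = ry(1.234)

# Two very simple training states: the computational basis states.
train_states = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
labels = [target @ s for s in train_states]

def loss(theta):
    """1 - mean fidelity between the model's output and the target's."""
    model = ry(theta)
    fidelities = [abs(np.vdot(y, model @ s)) ** 2
                  for s, y in zip(train_states, labels)]
    return 1.0 - float(np.mean(fidelities))

theta, lr, eps = 5.0, 0.5, 1e-6
for _ in range(300):
    # Plain finite-difference gradient descent on the fidelity loss.
    grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(f"trained loss: {loss(theta):.2e}")  # close to 0
```

Even with only two simple training pairs, the fidelity loss drives the trainable angle to match the target rotation, which is the flavor of result the theory makes rigorous for much larger circuits.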


"While prior work considered the amount of training data in quantum machine learning, here we focus on the type of training data," Caro said. "We prove that few training data points suffice even if we restrict ourselves to a simple type of data." .

"In practical terms, it means you can train a neural network on not only just a few pictures of cats, for example, but also on very simple pictures," Cincio said. "For quantum simulations, it means you can train on quantumly simple states." .


"Those states are easy to prepare, which makes the entire learning algorithm much easier to run on near-term quantum computers," said co-author Zoe Holmes, professor of physics at École Polytechnique Fédérale de Lausanne and former Los Alamos postdoc.

A near-term application for quantum computers

Noise in the form of interactions between quantum bits, or qubits, and the surrounding environment causes errors that limit the processing capabilities of current quantum computer technology. Despite the noise, quantum computers excel at certain tasks, such as simulating quantum systems in materials science and classifying quantum states with machine learning.
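A toy picture of this noise tolerance, under the simplifying assumption of a single qubit and a depolarizing noise model (our choice, not the paper's analysis): a classifier that labels a state by the sign of its Pauli-Z expectation keeps returning the correct label even as noise shrinks the signal.

```python
import numpy as np

Z = np.diag([1.0, -1.0])

# A state with <Z> > 0, i.e. "class A" for a sign-based classifier.
psi = np.array([np.cos(0.4), np.sin(0.4)])
rho = np.outer(psi, psi.conj())

def depolarize(rho, p):
    """Depolarizing channel: keep rho with prob 1-p, else maximally mixed."""
    return (1 - p) * rho + p * np.eye(2) / 2

for p in (0.0, 0.3, 0.6, 0.9):
    expz = np.trace(depolarize(rho, p) @ Z).real
    # Noise scales <Z> by (1 - p) but never flips its sign,
    # so the classification survives even strong noise.
    print(f"p={p:.1f}  <Z>={expz:+.3f}  label={'A' if expz > 0 else 'B'}")
```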


"If you are classifying quantum data, then there's a certain amount of noise you can tolerate and still get the answer right," Cincio said. "That's why quantum machine learning may be a good near-term application." .

Quantum machine learning tolerates comparatively high levels of noise, making it a viable option for near-term applications of quantum computing. Furthermore, the qubits in a quantum computer can process information simultaneously, or in parallel, allowing them to work through a large amount of data quickly.
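The parallelism claim can be made concrete with a small classical simulation (which, unlike real hardware, must pay the exponential memory cost explicitly): one layer of Hadamard gates places n qubits into a superposition over all 2**n basis states, so any subsequent gate acts on all of those amplitudes at once.

```python
import numpy as np

n = 8
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

# Build H on every qubit as one operator: H (x) H (x) ... (x) H.
layer = H
for _ in range(n - 1):
    layer = np.kron(layer, H)

state = np.zeros(2 ** n)
state[0] = 1.0                  # start in |00...0>
state = layer @ state           # one layer touches all 2**n amplitudes

print(len(state))                                # 256 amplitudes
print(np.allclose(state, 1 / np.sqrt(2 ** n)))   # all updated at once
```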


"Our work shows the potential of quantum machine learning in exploiting the best of the quantum world to be applied to real-world problems more quickly than with classical computing," Cincio said. "This points toward the interesting possibility of using the intermediate-scale quantum devices of today to train models in the near future." .

