Is Quantum Computing the Future of AI?


(metamorworks/Shutterstock)

Quantum computing has grabbed the imagination of computer scientists as one possible future of the discipline after we’ve reached the limits of digital binary computers. Thanks to its capability to hold many different possible outcomes in the “quantum state,” quantum computing could potentially deliver a big computational upgrade for machine learning and AI problems. However, there are still a lot of unanswered questions around quantum computing, and it’s unclear whether the devices will help sustain the building wave of investment in enterprise AI.

We’ve done quite well with the line of binary computers that first appeared in the 1950s and have evolved into the basis of today’s multi-trillion-dollar IT sector. With just two values (0 and 1) and three Boolean operators (AND, OR, and NOT), we created tremendous data-crunching machines that have automated many manual tasks and had a large impact on the world around us. From basic accounting and supply chain routing to flight control computers and understanding the genome, it’s tough to overstate the impact that computers have had on our modern lives.

But as we approach the limits of what classical binary computers can do, quantum computers have emerged with the (as yet unfulfilled) promise of a tremendous upgrade in computational power. Instead of being restricted to Boolean functions on 1s and 0s, quantum computing lets us apply linear algebra to quantum bits, or qubits, whose states are vectors of complex numbers manipulated by matrices, and which exhibit quantum phenomena such as superposition, entanglement, and interference.
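That linear algebra can be sketched with nothing more than NumPy. The toy simulation below (plain matrix math on a classical machine, not quantum hardware) shows a Hadamard gate putting a qubit into superposition and a CNOT gate entangling two qubits:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                   # the |0> basis state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

# Superposition: H maps |0> to an equal mix of |0> and |1>.
plus = H @ ket0
print(np.abs(plus) ** 2)      # Born rule: [0.5 0.5] measurement probabilities

# Entanglement: CNOT on (H|0>) ⊗ |0> yields the Bell state (|00> + |11>)/sqrt(2).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)
print(np.abs(bell) ** 2)      # [0.5 0. 0. 0.5]: outcomes 00 and 11 only, perfectly correlated
```

Real quantum processors don’t store this state vector explicitly; its exponential size (2^n entries for n qubits) is precisely what makes large quantum systems hard to simulate classically.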

Quantum computing opens the door to potentially solving very large and complex computational problems that are effectively intractable on traditional computers. Consider brute-forcing the key used to encrypt a piece of data with a 256-bit algorithm: data encrypted with AES-256 is considered secure precisely because a brute-force attack, while possible in principle, would take an astronomical number of years with current technology. Quantum computers’ ability to work with many possible states at once changes that calculus, though not always as dramatically as the hype suggests: for unstructured search of this kind, Grover’s algorithm offers “only” a quadratic speedup over classical brute force.
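A back-of-the-envelope sketch in Python shows why even that quadratic speedup leaves AES-256 out of practical reach (the iteration rate below is a deliberately generous assumption, not a measured figure):

```python
import math

# Grover's algorithm searches N possibilities in roughly sqrt(N) quantum steps,
# versus ~N/2 classical guesses on average.
key_bits = 256
grover_iterations = math.isqrt(2 ** key_bits)   # exactly 2**128 oracle calls

# Even at a wildly optimistic 10^12 Grover iterations per second:
seconds_per_year = 3.156e7
years = grover_iterations / 1e12 / seconds_per_year
print(f"~1e{math.floor(math.log10(years))} years")   # on the order of 1e19 years
```

This is why the sharper quantum threat to cryptography is usually framed around structured problems where exponential speedups are known, rather than raw symmetric-key search.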

The Google Sycamore quantum processor (Image source: Google)

Another example is the traveling salesman problem. Given a number of geographic locations, figuring out the most efficient path among them is an extremely compute-intensive problem, since the number of possible routes grows factorially with the number of stops. UPS, which spends billions on fuel for its delivery trucks, has gone so far as to limit the number of left turns its drivers make in an attempt to minimize delivery time and fuel use, an interesting twist on the old traveling salesman problem.
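A brute-force solver makes the difficulty concrete. This minimal sketch (with made-up coordinates, not real routing data) finds the shortest open route through four stops, but the factorial growth noted in the comments shows why the approach collapses at delivery scale:

```python
import itertools
import math

# Toy stops as (x, y) coordinates -- illustrative only.
stops = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 4)}

def route_length(order):
    """Total Euclidean distance of visiting the stops in the given order."""
    pts = [stops[s] for s in order]
    return sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

# Brute force: evaluate every possible ordering and keep the shortest.
best = min(itertools.permutations(stops), key=route_length)
print(best, round(route_length(best), 2))

# The catch is growth, not the distance math: n stops mean (n-1)! one-way
# routes. 10 stops -> 362,880 routes; 20 stops -> about 1.2e17.
print(math.factorial(19))
```

Practical routing systems rely on heuristics rather than exhaustive search; the hope is that quantum optimization algorithms could eventually explore such spaces more efficiently.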

Which brings us to machine learning and AI. The latest incarnation of machine learning, deep learning, is pushing the limits of what traditional computers can handle. Large transformer models, such as OpenAI’s GPT-3, which has 175 billion parameters, take months to train on classical computers. As future models grow into the trillions of parameters, they will take even longer to train. That is one reason why users are adopting novel microprocessor architectures that outperform traditional CPUs and even GPUs.

But at the end of the day, CPUs and GPUs are tied to classical binary computers, and the limitations they entail. Quantum computers offer the possibility of a quantum leap in performance and capability for a range of use cases, and AI is definitely one of them.

Cem Dilmegani, an industry analyst at AIMultiple, defines quantum AI as the use of quantum computing for running machine learning algorithms. “Thanks to computational advantages of quantum computing, quantum AI can help achieve results that are not possible to achieve with classical computers,” Dilmegani writes.

A quantum computer from Oxford Quantum Circuits (Image courtesy of the company)

One of the early quantum computer manufacturers making moves in this area is Google. In March 2020, Google launched TensorFlow Quantum, which brings the TensorFlow machine learning development library to the world of quantum computers. With TensorFlow Quantum, developers can build quantum neural network models that run on quantum computers.

While running AI applications on quantum computers is still in its very earliest stages, there are many organizations working to develop it. NASA has been working with Google for some time, and there is also work going on in the national labs.

For instance, last month, researchers at Los Alamos National Laboratory published a paper called “Absence of Barren Plateaus in Quantum Convolutional Neural Networks,” which shows that quantum convolutional neural networks (a quantum analog of the architecture commonly used for computer vision problems) can be trained effectively on quantum computers.

“We proved the absence of barren plateaus for a special type of quantum neural network,” Marco Cerezo, a LANL researcher who co-authored the paper, said in a LANL press release. “Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters.”

LANL researchers are bullish on the potential for quantum AI algorithms to provide the next breakthrough in computational capability. Patrick Coles, a quantum physicist at LANL and a co-author of the paper, said this approach will yield new approaches for crunching large amounts of data.

“The field of quantum machine learning is still young,” Coles said in the LANL press release. “There’s a famous quote about lasers, when they were first discovered, that said they were a solution in search of a problem. Now lasers are used everywhere. Similarly, a number of us suspect that quantum data will become highly available, and then quantum machine learning will take off.”

Earlier this year, IBM Research announced that it had found “mathematical proof” of a quantum advantage for quantum machine learning. The proof came in the form of a classification algorithm that, given access to “classical data,” provided a “provable exponential speedup” over classical ML methods. While there are plenty of caveats attached to that statement, it offers a glimpse of one potential future in which quantum AI is feasible.

IBM quantum computer (Source: IBM)

To be sure, there is plenty of doubt whenever two highly hyped technologies, AI and quantum computing, come together. In a July 2021 blog post, IBM stated: “Few concepts in computer science cause as much excitement—and perhaps as much potential for hype and misinformation—as quantum machine learning.”

While there appears to be potential with quantum AI, that potential is, as yet, unrealized. On the bright side, there appears to be at least cause for some optimism that a real breakthrough could be in our future.

“Sceptics are correct in that quantum computing is still a field of research and it is a long way from being applied to neural networks,” Dilmegani writes. “However, in a decade, AI could run into another plateau due to insufficient computing power and quantum computing could rise to help the advance of AI.”

It’s still too soon to tell whether the field of quantum computing will have a major impact on the development of AI. We’re still in the midst of what those in the quantum computing field call the “Noisy Intermediate-Scale Quantum,” or NISQ, era. There are many promising developments, but still too many unanswered questions.

Related Items:

Machine Learning Cuts Through the Noise of Quantum Computing

Google Launches TensorFlow Quantum

IBM Pairs Data Science Experience with Quantum Computer

