Monday, September 12, 2022

Poor Man's Qubit


Physics-inspired graph neural networks to solve combinatorial optimization problems
May 2022, phys.org

Uses graph neural networks (GNNs) to tackle combinatorial optimization problems.

"Given their inherent scalability, physics-inspired GNNs can be used today to approximately solve (large-scale) combinatorial optimization problems with quantum-native models, while helping our customers get quantum-ready by using the mathematical representation that quantum devices understand," Brubaker said.

Solves optimization problems without the need for training labels.
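The label-free trick is to use the problem's own Hamiltonian as the loss: relax the binary variables to probabilities, minimize a differentiable QUBO cost, then round back to 0/1. Here's a minimal sketch of that idea for MaxCut on a toy 4-node cycle, with plain gradient descent standing in for the authors' GNN (the graph, step size, and iteration count are illustrative assumptions, not their code):

```python
import numpy as np

# Toy MaxCut instance: a 4-node cycle graph.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

# QUBO loss for MaxCut: each edge term 2*p_i*p_j - p_i - p_j
# is minimized when the endpoints land on opposite sides of the cut.
# No training labels needed -- the Hamiltonian itself is the loss.
def qubo_loss(p):
    return sum(2 * p[i] * p[j] - p[i] - p[j] for i, j in edges)

def grad(p):
    g = np.zeros(n)
    for i, j in edges:
        g[i] += 2 * p[j] - 1
        g[j] += 2 * p[i] - 1
    return g

# Gradient descent on relaxed (soft) assignments, then round to {0, 1}.
rng = np.random.default_rng(0)
p = np.full(n, 0.5) + 0.01 * rng.standard_normal(n)  # noise breaks symmetry
for _ in range(200):
    p = np.clip(p - 0.1 * grad(p), 0.0, 1.0)

cut = np.round(p).astype(int)
cut_size = sum(cut[i] != cut[j] for i, j in edges)
print(cut, cut_size)  # a 4-cycle is bipartite, so the max cut is 4
```

In the paper a GNN outputs the soft assignments p from node embeddings, which is what makes the approach scale to large graphs; the loss and the relax-then-round step are the same idea as above.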

Caveat: Brought to you by Amazon - "Our work was very much inspired by customer needs"

On the topic of optimization problems and quantum computing, it's getting easier to understand (since every other article is on this topic) that quantum computers will be good at optimization, but the key word is "will". So for now, we're figuring out how to do optimization problems on regular computers, used in a way they sort of weren't meant to be used, but which became really useful with the advent of big data. That halfway point between regular computers and quantum ones is running graphics processors in parallel to create neural nets.

via Amazon Quantum Solutions Lab: Martin J. A. Schuetz et al, Combinatorial optimization with physics-inspired graph neural networks, Nature Machine Intelligence (2022). DOI: 10.1038/s42256-022-00468-6

Image credit: Gyroid for manipulating light into topological states, Nik Spencer for Nature, 2017 [link]


The potential of p-computers
Jun 2022, phys.org

Probabilistic computers (p-computers) are powered by probabilistic bits (p-bits), which interact with other p-bits in the same system. Unlike the bits in classical computers, which are in a 0 or a 1 state, or qubits, which can be in more than one state at a time, p-bits fluctuate between positions and operate at room temperature.

Camsari describes the sparse Ising machine (sIm) as a collection of probabilistic bits which can be thought of as people. "The people can make decisions quickly because they each have a small set of trusted friends and they do not have to hear from everyone in an entire network," he explained.

The researchers showed that their sparse architecture on field-programmable gate arrays was up to six orders of magnitude faster, with sampling speeds five to eighteen times higher than those achieved by optimized algorithms running on classical computers.
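The "trusted friends" picture maps onto a simple sampling loop: each p-bit flips randomly, biased only by the local field from its few neighbors. Here's a toy three-p-bit sparse Ising chain illustrating that idea; the couplings and the sequential Gibbs-style update are my illustrative assumptions, not the authors' FPGA implementation:

```python
import math
import random

random.seed(0)

# Sparse ferromagnetic chain: each p-bit lists only its (neighbor, weight) pairs.
neighbors = {0: [(1, 1.0)], 1: [(0, 1.0), (2, 1.0)], 2: [(1, 1.0)]}
m = [random.choice([-1, 1]) for _ in range(3)]  # p-bit states, +1 or -1

def step(beta=2.0):
    # Each p-bit samples its next state from the local field of its
    # "trusted friends" only -- no communication with the whole network.
    for i in range(3):
        field = sum(w * m[j] for j, w in neighbors[i])
        prob_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
        m[i] = 1 if random.random() < prob_up else -1

# Run the machine and tally which states it visits most.
counts = {}
for _ in range(2000):
    step()
    state = tuple(m)
    counts[state] = counts.get(state, 0) + 1

# Ferromagnetic couplings favor the two fully aligned states.
print(max(counts, key=counts.get))
```

Because each update touches only a bit's sparse neighborhood, many p-bits can be updated in parallel, which is what the FPGA architecture exploits.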

via University of California Santa Barbara Institute for Energy Efficiency: Navid Anjum Aadit et al, Massively parallel probabilistic computing with sparse Ising machines, Nature Electronics (2022). DOI: 10.1038/s41928-022-00774-2


Also this:
'Poor man's qubit' can solve quantum problems without going quantum
