Our digital age is all about bits, those precise ones and zeros that are the stuff of modern computer code. But a powerful new type of computer that is about to be commercially deployed by a major United States military contractor is taking computing into the strange, subatomic realm of quantum mechanics. In that infinitesimal neighborhood, common sense logic no longer seems to apply. A one can be a one, or it can be a one and a zero and everything in between all at the same time. It sounds preposterous, particularly to those familiar with the yes/no world of conventional computing. But academic researchers and scientists at companies like Microsoft, IBM and Hewlett-Packard have been working to develop quantum computers.
Now, Lockheed Martin — which bought an early version of such a computer from the Canadian company D-Wave Systems two years ago — is confident enough in the technology to upgrade it to commercial scale, becoming the first company to use quantum computing as part of its business. Sceptics say that D-Wave has yet to prove to outside scientists that it has solved the myriad challenges involved in quantum computation. But if it performs as Lockheed and D-Wave expect, the design could be used to supercharge even the most powerful systems, solving some science and business problems millions of times faster than is possible today.
Ray Johnson, Lockheed’s chief technical officer, said his company would use the quantum computer to create and test complex radar, space and aircraft systems. It could be possible, for example, to tell instantly how the millions of lines of software running a network of satellites would react to a solar burst or a pulse from a nuclear explosion — something that now takes weeks to determine, if it can be done at all.
“This is a revolution not unlike the early days of computing,” he said. “It is a transformation in the way computers are thought about.” Many others could find applications for D-Wave’s computers.
Cancer researchers see a potential to move rapidly through vast amounts of genetic data. The technology could also be used to determine the behaviour of proteins in the human genome, a bigger and tougher problem than sequencing the genome. Researchers at Google have worked with D-Wave on using quantum computers to recognise cars and landmarks, a critical step in managing self-driving vehicles.
QUANTUM READY FOR MAINSTREAM ENTERPRISE APPLICATION
While the technology has yet to break cryptography, quantum computing is ready for mainstream adoption and is already being tapped to address real-world enterprise challenges. Pointing specifically to D-Wave’s proprietary annealing technology, D-Wave CEO Alan Baratz said it allowed quantum computing to scale more easily and made it less sensitive to the noise and computational errors to which gate-based systems were prone. Now in their fifth generation, D-Wave’s quantum computers clock more than 5,000 qubits and are capable of supporting rollout “at commercial scale”, he said. This, he added, was a stage no other market player had yet achieved with the gate-based model. Although commonly adopted in the industry today, the gate model made quantum computers tough to build and sensitive to error. The most stable gate-based systems currently ran about 30 qubits, which was sufficient to power mostly research work and unlikely to be used to solve business problems at scale for another seven to 10 years, he said.
“Error rates on [gate-based systems] are so high you can’t really do anything with them, even with small problems,” he added, noting that a competitor last year said it was able to solve a specific optimisation problem on its quantum computer. However, this succeeded only once in every 100,000 attempts, he said. Quantum computing runs on principles of quantum mechanics that include probabilistic computation: each run yields the right answer only with some probability.
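The arithmetic behind that one-in-100,000 figure is easy to sketch. The snippet below is a rough illustration of probabilistic computation in general, not a simulation of any quantum hardware; the function name and numbers are ours, with the success rate set to the figure cited above:

```python
# Illustrative probability arithmetic only -- not a model of quantum hardware.

def p_at_least_one_success(p_single: float, runs: int) -> float:
    """Chance of at least one success in `runs` independent attempts,
    each succeeding with probability `p_single`."""
    return 1.0 - (1.0 - p_single) ** runs

p = 1e-5  # one success per 100,000 attempts, the rate cited above

print(p_at_least_one_success(p, 1))        # a single run almost always fails
print(p_at_least_one_success(p, 100_000))  # ~0.63 even after 100,000 runs
```

Even after 100,000 repeated runs, the odds of seeing a single success are only about 63% — which is the substance of Baratz's argument that such error rates make gate-based machines impractical for business problems today.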
Baratz said annealing technology, designed specifically for optimisation purposes, exerted greater control over the probability of outcomes and, hence, was less sensitive to errors. It also learnt from where the previous computation ended, using that result to fine-tune future ones.
“When you lose coherence, you end up with garbage. With annealing, when you lose coherence, you settle into a [potential] solution and restart the computation to try and improve the solution,” he said. The gate-based model, in comparison, could not do that, since it lost coherence after every computation rather than picking up from the previous run. A grocery chain using D-Wave to enhance part of its logistics system was able to solve an optimisation problem in two minutes per week per location, where previously it took 25 hours per week per location, he noted. More than 20,000 developers worldwide have signed up to access Leap, D-Wave’s quantum cloud service, with some 1,000 regularly using it each month. Paying customers fork out an estimated $2,000 an hour to run computations on D-Wave computers.
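For intuition about the settle-into-a-solution-and-restart behaviour Baratz describes, classical simulated annealing offers a rough analogue. The sketch below is our own toy example, not D-Wave's quantum process: it cools a random search toward a low-cost answer, then restarts from the best answer found so far to try to improve it.

```python
import math
import random

# Classical simulated annealing on a toy objective -- a loose analogue of the
# annealing approach described above, NOT D-Wave's actual quantum hardware.

def cost(x: float) -> float:
    # Toy objective with many local minima.
    return x * x + 10 * math.sin(3 * x)

def anneal(start: float, steps: int = 5000, temp: float = 10.0) -> float:
    x = start
    best = x
    for i in range(steps):
        t = temp * (1 - i / steps) + 1e-9       # cooling schedule
        candidate = x + random.gauss(0, 0.5)     # propose a nearby move
        delta = cost(candidate) - cost(x)
        # Always accept downhill moves; accept uphill ones with
        # Boltzmann probability, which shrinks as the system cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
        if cost(x) < cost(best):
            best = x
    return best

random.seed(0)
solution = anneal(start=8.0)
# Each restart begins from the previous answer and tries to improve on it,
# mirroring the "settle, then restart" behaviour described above.
for _ in range(3):
    solution = anneal(start=solution)
print(solution, cost(solution))
```

The restart loop is the point of the sketch: unlike a gate-based computation that loses everything when coherence is lost, each annealing pass here ends in a usable candidate solution that seeds the next pass.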
Baratz noted, though, that its systems could not address every quantum computing workload, because annealing was designed specifically to solve optimisation problems, which were common challenges for businesses. Gate-based systems, on the other hand, would be able to solve any computational problem once error rates were reduced — something he said likely would not materialise for at least another seven years. So while D-Wave’s annealing-powered quantum computers were limited to solving optimisation problems, they were capable of addressing real-world business challenges today, he said. Its systems also were on a path towards universal error correction, building on the technology it already had, he added. To date, more than 250 applications had been built on D-Wave systems, most of which used Leap and spanned use cases including financial modelling, scheduling, protein folding, and manufacturing optimisation, the vendor said.
High-performance cloud computing, artificial intelligence, and a couple of quantum computers: IBM is going all-in with a freshly signed, decade-long partnership that will see Big Blue provide the technology infrastructure for a new research center dedicated to public health threats such as the COVID-19 pandemic. The Ohio-based Cleveland Clinic, a non-profit institution that combines clinical and hospital care with medical research and education, will use state-of-the-art IBM technology to support its latest project: a global center for pathogen research and human health. Supported by a $500 million investment, the new center will be dedicated to the study of viral pathogens, virus-induced diseases, genomics, immunology and immunotherapies.
To assist researchers’ work preparing for and protecting against emerging pathogens, IBM has designed a “Discovery Accelerator” – contributing the company’s latest capabilities to better support data-based scientific work and fast-track the discovery of new treatments.
“Quantum has been advancing rapidly in the last few years, and it goes without saying how much progress AI has made as well as cloud and especially hybrid cloud,” Anthony Annunziata, director of the IBM Quantum Network, tells ZDNet. “This partnership is bringing them together – the emerging versions of the latest AI, quantum and next-generation capabilities in high-performance computing.”