For the past few years, the buzz around quantum computing has been growing steadily louder, but it has mostly been coming from university laboratories and advanced corporate R&D facilities. Now household names are getting in on the action, and a number of technical milestones have either been reached or are imminent.
But what will the effects on mainstream computing be, and how will it affect those of us who design systems based on computer technology and/or use them in our everyday lives?
It would take a thesis to properly define the difference between how a quantum computer operates versus a traditional one, but as a very simple explanation, imagine a coin being flipped. Once it lands and comes to rest, we know it will either be a head or a tail, but while it’s in the air, we can’t know which outcome will transpire. In the world of quantum computing, that coin represents a qubit, or quantum bit, and, incredibly, it exists in both states at the same time – until we observe it, at which point it becomes fixed in one particular state. Confused? Most people would be, which is why it takes some of the best minds to come to terms with the concept, and big budgets to put it into practice.
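If it helps to see the idea in code, the minimal Python sketch below is purely illustrative (real quantum hardware is not programmed this way): it models a single qubit in an equal superposition and ‘observes’ it repeatedly, with each observation collapsing it to a 0 or a 1, just as the coin shows heads or tails only once it has landed.

```python
import random

# A qubit's state is described by two amplitudes (alpha, beta);
# the probability of observing 0 is |alpha|^2, of observing 1 is |beta|^2.
alpha, beta = 2 ** -0.5, 2 ** -0.5  # equal superposition: a 50/50 coin in the air

def measure(alpha, beta):
    """Observe the qubit: it collapses to 0 or 1 with the corresponding probability."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Each observation fixes the state; repeat it and the 50/50 statistics emerge.
results = [measure(alpha, beta) for _ in range(10)]
print(results)  # e.g. [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```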
In November 2017, IBM announced it had developed a 50-qubit quantum computer, and was making a 20-qubit system available through its cloud computing platform. These qubits are highly unstable, though: their quantum state is preserved for only 90 microseconds, which was nevertheless a record at the time.
Intel managed to fabricate a 17-qubit superconducting test chip for a laboratory in the Netherlands, and Microsoft has created a quantum programming language designed to work with its Visual Studio IDE. The IDE allows users to simulate problems that require up to 30 logical qubits using a PC, and Microsoft is also building a quantum computer of its own. Google, though, is widely considered to be at the forefront of the race to achieve ‘quantum supremacy’, the point at which a quantum computer solves a problem demonstrably faster than any classical computer can.
So what are all these qubits good for? At this point, not much. They have been used to model simple molecules and simulate their behaviour at a quantum mechanical level, a task classical computers can also perform, but not nearly as fast. Even so, it is estimated that up to 160 qubits would be needed to model something as seemingly simple as a caffeine molecule, which remains beyond reach at this time.
Other potential applications include optimisation algorithms and artificial intelligence, but the most practical application is in data encryption – particularly topical in the current era of global cyberattacks. Since qubits have indeterminate states until they are observed, they can be used to encrypt data in such a way that it is almost impossible to crack the code. Unless, of course, a hacker has a quantum computer of their own, in which case it becomes an arms race all over again.
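To illustrate the principle, the sketch below simulates quantum key distribution, specifically the well-known BB84 protocol (the article does not name a particular scheme). The qubits are faked with random numbers, but the logic shows why an eavesdropper who observes them in transit inevitably disturbs their states and gives herself away:

```python
import random

def bb84_key(n_bits, eavesdrop=False):
    """Toy simulation of the BB84 quantum key distribution protocol."""
    # Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.randint(0, 1) for _ in range(n_bits)]

    # An eavesdropper who measures in the wrong basis randomises the qubit's state
    # and re-sends it prepared in her own basis.
    in_transit = []
    for bit, basis in zip(alice_bits, alice_bases):
        if eavesdrop:
            eve_basis = random.randint(0, 1)
            bit = bit if eve_basis == basis else random.randint(0, 1)
            in_transit.append((bit, eve_basis))
        else:
            in_transit.append((bit, basis))

    # Bob measures each qubit in a randomly chosen basis; a wrong basis gives a random result.
    bob_bases = [random.randint(0, 1) for _ in range(n_bits)]
    bob_bits = [bit if bob_basis == basis else random.randint(0, 1)
                for (bit, basis), bob_basis in zip(in_transit, bob_bases)]

    # Alice and Bob publicly compare bases and keep only the matching positions.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

# Without an eavesdropper the sifted keys match exactly; with one, errors betray the intrusion.
a, b = bb84_key(1000)
print("errors without Eve:", sum(x != y for x, y in zip(a, b)))
a, b = bb84_key(1000, eavesdrop=True)
print("errors with Eve:   ", sum(x != y for x, y in zip(a, b)))
```

Run with an eavesdropper present, roughly a quarter of the sifted key bits disagree, so Alice and Bob can detect the interception simply by comparing a sample of their keys before using them.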
It is estimated that 2020 or 2021 will mark the first time quantum computing demonstrates real advantages for users, corporations and scientific research. Even then, the first commercial quantum computers will need a decent-sized room to house them and will cost several million dollars, so don’t expect to have one sitting on your desk anytime soon.
Brett van den Bosch
Editor