WeeklyWorker

22.11.2018

Quantum computing has arrived

Yassamine Mather explains why the next generation of computers offers both huge opportunities and huge dangers

You know a scientific subject is coming of age once business papers such as the Financial Times and Wall Street Journal start following its progress. Although we are still far from seeing the general use of quantum computers, regular reports on this technology remind us that the advent of this powerful next generation of machines is not as distant as we might once have thought.

The following are examples of the Financial Times reporting on quantum computing: “As attacks on infrastructure become more likely, scientists hope that quantum computing will offer the best chance of defence; in the nearer term, quantum navigation could relieve militaries of reliance on GPS satellites and space.”1 And then there is the US professor of radiology telling the paper that, after experimenting with quantum technology, he has “seen a computer come up with MRI (magnetic resonance imaging) results that have exceeded those of his own expert ‘feel’”.2

There is no doubt that quantum computing is already reshaping many scientific areas - in particular computational chemistry, where it is bringing dramatic improvements in our understanding of existing materials. The predictions are that future scientific advances will pave the way for the discovery of new materials, revolutionising every aspect of our lives from medicine to travel, from solving the mystery of black holes to re-imagining Schrödinger’s cat. This is a reference to the physicist, Erwin Schrödinger, who in 1935 described how a cat in a box could be in an uncertain predicament - the rules of quantum theory meant that it could be both dead and alive!3 Quantum objects, such as electrons, live in a cloud of uncertainty, mathematically encoded in a ‘wave function’ that changes shape smoothly, much like ordinary waves in the sea.

Traditional computers would not have been able to deal with such problems. Going back to computational chemistry, the study of simulated chemical bonds and reactions is probably one of the most exciting areas where quantum computers will make a difference. Here physicists have used trapped-ion qubits (quantum bits, or units of quantum information) to demonstrate the first ever simulation of a chemical bond. This has opened the way for simulating other phenomena in nature, and will eventually lead to a completely new understanding, providing humans with the ability to solve what are currently considered impossible problems in materials science, medicine and industrial chemistry, all using simulation.

In addition, while current computers can deal with problems that proceed through a logical sequence of steps, quantum machines can run probabilistic algorithms - calculations that do not follow such predictable logical steps. However, qubits are unstable: they hold their quantum state for only a tiny fraction of a second and, as they are linked together in increasingly large numbers, their unpredictable interactions lead to high error rates. Learning how to coax reliable results out of these complicated and ‘noisy’ systems (systems full of disturbance) is already a huge challenge, and scaling them up into full-scale computers is far beyond today’s capabilities.

None of this has diminished enthusiasm and funding for quantum computers and, of course, beyond science there are hundreds of other uses, including finance. We are told in the Wall Street Journal that future ‘quants’ on Wall Street will “map financial problems to similar problems in physics and then map them back”.4

Major banks are already experimenting with this, creating and simulating models of financial markets, improving operational efficiencies in areas such as clearing and trade reconciliation, which are considered difficult to simulate.

And, as always with science, the military has better funding and access to new research, and we already know of plans for using quantum computing in ‘defence’ and ‘security’. Scientists at Darpa (the Defence Advanced Research Projects Agency) in the USA are working on quantum science as part of plans to secure the cryptography breakthroughs of the future.

Darpa researchers have also built the world’s first perceptron implemented on a quantum computer and have used it for simple image-processing tasks. The perceptron takes what is called a vector input - basically a set of numbers - and multiplies it by a weighting vector to produce a single-number output. If this number is above a certain threshold, the output is 1; if it is below the threshold, the output is 0.
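In classical terms this decision rule is simple enough to sketch in a few lines of code. Below is a minimal illustration in Python - the input, weights and threshold are arbitrary values chosen purely for demonstration:

# A minimal classical perceptron: weighted sum, then threshold.
# The input, weights and threshold are arbitrary illustrative values.

def perceptron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation > threshold else 0

pixels = [0.2, 0.9, 0.1, 0.7]      # a four-element input vector (eg, pixel values)
weights = [0.5, 1.0, -0.3, 0.8]    # the weighting vector
print(perceptron(pixels, weights, threshold=1.0))  # prints 1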

The work was originally started by Frank Rosenblatt (1928-71), an artificial intelligence pioneer, who devised an electronic device built in accordance with biological principles which showed an ability to learn - the perceptron. This device was capable of picking up the image of a triangle held before it and sending it along a random succession of lines to the response units, where the image was reproduced. His work was summarised in a book published in 1961: Principles of neurodynamics: perceptrons and the theory of brain mechanisms. That work has now been repeated on a quantum computer, using IBM’s Q-5 ‘Tenerife’ superconducting quantum processor. This is capable of processing five qubits and is programmable over the web by anyone who can write a quantum algorithm; it takes a classical vector (such as an image) as an input, combines it with a quantum weighting vector, and then produces a 0 or 1 output.

The big advantage of any quantum computer is that it allows an exponential increase in the number of dimensions it can process.
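A rough way to see this: a register of n qubits is described by 2 to the power n complex amplitudes, so the space a quantum machine works in doubles with every qubit added. A few lines of Python make the scaling vivid:

# The state of an n-qubit register is described by 2**n complex amplitudes.
for n in (5, 20, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes")
# 5 qubits (the scale of IBM's Tenerife chip) -> 32
# 20 qubits -> 1,048,576
# 50 qubits -> 1,125,899,906,842,624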

In very basic terms, quantum physics deals with very small particles - atoms and electrons - where the laws of classical mechanics (also known as Newtonian mechanics) are no longer valid. Since the early 1900s mathematical advances have allowed physicists to study relativistic mechanics, which applies at very high speeds, while quantum mechanics describes nature at the smallest scales - the energy levels of atoms and subatomic particles - and their wave-particle duality. The dual nature of light was explained by Albert Einstein in 1905, when he described it in terms of photons, with the properties of particles. This was followed by his paper on special relativity, in which light acted as a field of waves.

Quantum computing uses this strange ability of subatomic particles to exist in more than one state at any time. Due to the way in which the tiniest of particles behave, operations can be done much more quickly and with less energy than on classical computers. Another quantum property these machines draw on is entanglement: two quantum bits can influence each other, even when they are not physically connected - something Einstein called “spooky action at a distance”.
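Both properties can be illustrated with a toy state-vector simulation on an ordinary computer - which is precisely what becomes infeasible as qubit numbers grow. The sketch below, using Python and numpy, puts two qubits into a superposition and then entangles them into a so-called Bell state; the measurement probabilities show the ‘spooky’ correlation:

import numpy as np

# Toy two-qubit state-vector simulation. Basis order: |00>, |01>, |10>, |11>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips the second qubit if the first is 1

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # put the first qubit in superposition
state = CNOT @ state                           # entangle the pair (a Bell state)

print(np.abs(state) ** 2)  # [0.5, 0, 0, 0.5]: the two qubits always agree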

Traditional computers

For more than 50 years, scientists have worked to reduce the size of transistors. Moore’s law - which states that the number of transistors in a dense integrated circuit doubles roughly every two years, allowing a steady decline in both the size and cost of computers - has so far been upheld.
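The compounding involved is easy to check with a back-of-envelope calculation in Python. Doubling every two years multiplies the transistor count roughly a thousandfold every two decades - projecting forward from the roughly 2,300 transistors of Intel’s 1971 chip, the 4004, lands in the right ballpark for today’s processors:

# Moore's law as compound growth: the transistor count doubles every two years.
def transistors(start_count, years):
    return start_count * 2 ** (years / 2)

# Intel's 4004 (1971) had roughly 2,300 transistors; projecting to 2018
# gives around 27 billion - the right order of magnitude for current chips.
print(f"{transistors(2_300, 2018 - 1971):,.0f}")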

Today, however, transistors and other electronic components are so small (nanometre-scale) that the standard laws of classical physics will soon no longer apply to them. There is no doubt that the limits of semiconductor physics mean that CPU (central processing unit) performance now grows by only around 10% a year. As transistor sizes shrink to such a degree, the laws of quantum physics come into play, essentially dictating that ever-smaller transistors will be of little use.

To understand this we have to look at how the most basic component of a computer, the transistor, works. This component functions as a voltage switch, existing in one of two states - 1 or 0. In state 1 electrons flow through it (high voltage); in state 0 the flow is blocked, with the transistor acting as a barrier or gate (low voltage).

When the size of the transistor is reduced to below 5nm, quantum mechanics dictates that it is no longer capable of working as a gate, valve or barrier to electrons. This is because of a phenomenon known as quantum tunnelling, whereby electrons can essentially bypass a previously impenetrable barrier, as if tunnelling or warping straight through it.
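The scale of the effect can be estimated with the standard formula for a rectangular barrier, in which the transmission probability falls off exponentially with barrier width. The Python sketch below assumes an illustrative barrier height of one electron-volt - not a real device parameter:

import math

hbar = 1.054_571_8e-34   # reduced Planck constant, J*s
m_e = 9.109_383_7e-31    # electron mass, kg
eV = 1.602_176_6e-19     # one electron-volt in joules

def transmission(width_nm, barrier_eV=1.0):
    """Rough tunnelling probability T ~ exp(-2*kappa*d) through a rectangular barrier."""
    kappa = math.sqrt(2 * m_e * barrier_eV * eV) / hbar
    return math.exp(-2 * kappa * width_nm * 1e-9)

for d in (5, 2, 1):
    print(f"{d} nm barrier: T ~ {transmission(d):.1e}")
# The leakage grows by many orders of magnitude as the barrier thins.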

Traditional computers process binary information - this means that something is either on or off or, in shorthand, 1 or 0. The mathematician Alan Turing proved that you can perform any calculation if you have enough switches that can be turned on or off. A dimmer switch, by contrast, offers many more options - ‘on’, ‘off’ and a range of states in between, experienced as degrees of brightness and darkness. With a dimmer switch, in other words, a light bulb can be on, off - or a combination of both.
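Turing’s point - that enough on/off switches suffice for any calculation - can be made concrete: every logic function can be composed from a single primitive such as the NAND gate. A minimal Python sketch, building XOR out of nothing but NAND:

# Any computation can be composed from simple on/off switches: here every
# logic function is built from a single primitive, the NAND gate.

def nand(a, b):
    return 0 if (a and b) else 1

def xor(a, b):
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', xor(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0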

Quantum computers are heralded as the supercomputers of the future, because they allow us to go beyond the binary and are capable of dealing with far more complicated calculations. They rely on components that can be both ‘on’ and ‘off’ at the same time - a so-called ‘superposition’ of states.

However, the standard CPU is facing other challenges - notably from the Nvidia Corporation’s GPU (graphics processing unit), which provides a parallel path, with a projected 1,000-fold speed-up by 2025. Nvidia hardware is behind some of the world’s fastest machines, including the US-based Summit, currently ranked the world’s most powerful supercomputer. The CPU has often been called the brains of the PC, but increasingly that brain is being enhanced by another part of the PC - the GPU.

So, as far as personal computing is concerned, apart from transistor count and size, there are other factors - GPUs, RAM, multi-core processors - that can be developed to keep increasing the power of the computers we use daily.

All PCs have chips that render display images to monitors, but not all these chips are the same. The GPU can do far more than perform basic graphics-controller functions: it can accelerate computational workloads in areas ranging from cutting-edge scientific research to rather less beneficial financial modelling.

The limitations of GPUs come into play when you consider high-performance supercomputing. This is where quantum computers will make a difference: they are expected to overcome the size limitation because they process information in a fundamentally different way from transistor-based machines. This matters because of the way capitalism expects ‘deep learning’ to pave the way for the increasing dominance of artificial intelligence.

In higher education the buzzword in social sciences, humanities, economics and political science - even in classics and philosophy - is ‘machine learning’, and currently GPUs play an important role in this area. Machine learning (ML) and its offshoot deep learning (DL) rest primarily on the availability of immense stores of data, the invention of deep learning algorithms and the raw performance of GPU computing.

Current usage is diverse. In healthcare, for example, millions of scanned medical images and their associated records are used to train neural networks to find clues in MRIs that until recently were detectable only through invasive surgical biopsies. Computers are taught through repetition and correction to acquire skills such as reading handwriting, recognising objects and responding to human or machine-generated audio commands. In the case of self-driving cars, deep learning is used to teach the vehicle to recognise the space in which it is operating, as well as the obstacles it should avoid.
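‘Repetition and correction’ is, at bottom, the perceptron learning rule from earlier run in a loop: predict, compare with the known answer, nudge the weights. The Python sketch below uses a tiny invented dataset (it learns the logical ‘and’ function) purely to show the mechanics:

# 'Repetition and correction' in miniature: the perceptron learning rule.
# The four-example dataset is invented purely for illustration.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias, rate = [0.0, 0.0], 0.0, 0.1

for epoch in range(20):                          # repetition
    for inputs, target in data:
        summed = sum(x * w for x, w in zip(inputs, weights)) + bias
        prediction = 1 if summed > 0 else 0
        error = target - prediction              # correction
        weights = [w + rate * error * x for w, x in zip(weights, inputs)]
        bias += rate * error

print(weights, bias)  # the learned weights now implement logical 'and'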

Of course, there are many limitations to machine learning. It relies on correlation and prediction, but in real life human behaviour is unpredictable in many circumstances, and so predicted scenarios fail - even when it comes to simple automated devices, such as supermarket tills. Quantitative social science research, however, deals with causality, and this is where much current ML/DL effort is concentrated: we encounter attempts to reframe machine learning in terms of causal inference, or to detect groups of unobserved heterogeneity using unsupervised machine learning.
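One such unsupervised technique is clustering, in which the algorithm is asked to find latent groups in unlabelled data. A minimal sketch using scikit-learn’s k-means on invented two-dimensional data:

import numpy as np
from sklearn.cluster import KMeans

# Detecting unobserved groups in unlabelled data with k-means clustering.
# The sample below is invented purely for illustration.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
group_b = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))
observations = np.vstack([group_a, group_b])

labels = KMeans(n_clusters=2, n_init=10).fit_predict(observations)
print(labels)  # each observation is assigned to one of two latent groups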

Social impact

There can be no doubt that progress in AI has already impacted on white-collar jobs and, as machine learning progresses, we will see its impact on many professions, from journalism to accountancy, law and banking, leading to rising unemployment in advanced capitalist countries. Under capitalism we are not going to witness a future of leisure and creativity, while machines and robots take care of boring routine tasks. Instead we will witness fierce competition for low-paid jobs, in particular in countries where the service sector is the main employer. We are still a long way from that situation, but it is important to note the fundamental differences between the current deployment of artificial intelligence and previous industrial and technical advances.

According to Bernard Marr:

Previous large-scale changes to the way we work (past industrial revolutions) may have been disruptive in the short term. However, in the long term what happened was a transfer of labour from countryside to cities, and no lasting downfall of society.

Previous industrial revolutions involved replacing human mechanical skills with tools and machinery. This time it is our mental functions which are being replaced - particularly our ability to make predictions and decisions. This is something which has never happened before in human history, and no-one exactly knows what to expect.5

Of course, while business and capital have become aware of what our future could hold, Karl Marx predicted much of this more than 150 years ago. Scientists are paying attention to a section of the Grundrisse known as the ‘Fragment on machines’. Those familiar with the labour theory of value will not be surprised that in this section Marx poses one of the most fundamental questions that AI raises: how do we define value, when the human labour required to create commodities rapidly approaches zero?

In a passage remarkably prophetic for its time, Marx predicted that automation would change the relation between labour, capital and the means of labour/production:

Once adopted into the production process of capital, the means of labour passes through different metamorphoses, whose culmination is the … automatic system of machinery … set in motion by an automaton - a moving power that moves itself; this automaton consisting of numerous mechanical and intellectual organs, so that the workers themselves are cast merely as its conscious linkages.6

There are different interpretations of this. For Marx, the automation of production would not automatically lead to the end of capital accumulation, but he could foresee it acting as a driving force for alienation. In such circumstances, class emancipation could only be achieved when the working class gained control of production. This is in complete contrast to the wild, autonomist predictions of the ‘inevitable’ collapse of the capitalist system as advanced automation progressed.

So does all this foresee the end of current computers? Of course not. One reason quantum computers are not a replacement for classical computers is the physical conditions required for their operation. Quantum computers need extremely cold temperatures - close to absolute zero. For example, much of D-Wave’s input/output system must function at 15 millikelvin (-273.135˚ Celsius)! The classical computers most individuals own have built-in fans and may include heat sinks to dissipate heat, while supercomputers tend to be cooled with circulated water and housed in air-conditioned rooms. In other words, the operating environment required by quantum computers makes them very expensive and difficult to build. And, of course, prices currently start at around $12 million - although scientists and researchers can buy time on such computers from the likes of IBM, which offers quantum computing as a cloud service.

yassamine.mather@weeklyworker.co.uk

Notes

1. www.ft.com/content/442de9aa-e7a0-11e8-8a85-04b8afea6ea3.

2. www.ft.com/content/154a1cf4-ad07-11e8-94bd-cba20d67390c.

3. See https://en.wikipedia.org/wiki/Schr%C3%B6dinger%27s_cat.

4. www.wsj.com/articles/the-quants-run-wall-street-now-1495389108.

5. www.forbes.com/sites/bernardmarr/2018/07/02/how-artificial-intelligence-could-kill-capitalism/#569686154222.

6. www.marxists.org/archive/marx/works/1857/grundrisse/ch13.htm.