I have a computer science degree. I work in IT, and have done so for many years. In that period “classical” computers have advanced by leaps and bounds. I remember teletypes and paper tape, and punched cards too. I also remember when a top-notch disk drive was the size of a washing machine and the cost of a car. It provided a miserly 10 megabytes of storage. My disk drive today is the size of my wallet and cost £46.99. It provides a terabyte of storage. It’s currently somewhere in my bedroom drawer amongst my socks. We have a computer in every bedroom, plus tablets, consoles, and much, much more. My phone has more processing power than NASA had when they put a man on the moon. It’s pretty much the same for everybody I know. We have phenomenal computing power at our fingertips. We have the internet. Computers have revolutionized our lives. They have changed our lives for the better.
Quantum computing hasn’t changed our lives for the better
But quantum computing hasn’t changed our lives for the better. Moreover, it looks like it’s going to stay that way. Quantum computing has been around now for nearly forty years, during which time real computing has left it in the dust. See the timeline section of the Wikipedia quantum computing article, and ask yourself: where’s the parallel adder? Where’s the equivalent of Atlas, or the MU5? I went to Manchester University; see the history in the Manchester Computers article on Wikipedia. Quantum computers don’t show similar progress. Au contraire, they haven’t even got off the ground. You won’t be buying one in PC World any time soon. Yes, you hear about things like the D-wave quantum computer, which costs fifteen million dollars. It’s a black ten-foot cube, like something out of a science fiction movie:
D-wave computer and co-founder and CTO Dr Geordie Rose, image credit D-wave
But you also hear about the skepticism. People ask if it’s really a quantum computer. See for example Will Bourne’s January 2014 article D-Wave’s Dream Machine. It includes sentences like this: “He believes scientific rigor and transparency are being clouded by Rose’s commercial ambitions and “hype”, and he fears that Rose’s overreach could tarnish the entire field of quantum computing research, setting it back years”. Also see James Vincent’s 2016 article Biggest ever quantum chip announced, but scientists aren’t buying it. He says this: “A study published in Science in 2014 found that tasks performed on the company’s machines were no faster than conventional computers”.
Report cools down quantum computing hype
Michael Biercuk wrote an interesting article in August 2017 called Hype and cash are muddying public understanding of quantum computing. Another interesting article was The Argument Against Quantum Computers dating from February 2018 where Katia Moskvitch interviewed Gil Kalai. He said qubits in superposition will inevitably be corrupted by noise, and getting the noise down is a showstopper of a fundamental issue rather than just a mere matter of engineering. Yet another interesting article was Report cools down quantum computing hype written by Katherine Bourzac in December 2018. She referred to a National Academies report and said this: “A report released Dec. 4 by the National Academies of Sciences, Engineering, and Medicine throws some cold water on the hype smouldering around quantum computing”.
We’re not any closer
There’s scepticism about quantum computers even when they come from a company like IBM. See the January 2019 Wired article by Amit Katwala, who said IBM’s quantum computer is important, but it’s far from ready. The physical structure of the IBM Q System One was designed by Map Project Office, an industrial design consultancy. It’s been deliberately designed to look impressive. That’s why it’s a cross between a supercomputer and a brain in a jar, all in a nine-foot crystal cube:
IBM Q System One image from IBM, see reportage at Serve the home
Yes, it’s great presentation, and a captivating aesthetic. But note this: “A 20-qubit system is unlikely to be practically useful, says Robert Young, director of the Lancaster Quantum Technology Centre”. Also see Why Experts Are Skeptical of IBM’s New Commercial Quantum Computer by Ryan F Mandelbaum. He quotes Andrew Childs from the University of Maryland saying this: “figuring out how to make a lot of low-noise qubits is a lot more important than figuring out how to put them in a beautiful package”. Mandelbaum ends up saying that despite IBM’s flashy announcement, we’re not any closer to having a broadly useful, error-corrected quantum computer.
A 64-qubit quantum computer is not like a 64-bit computer
I empathize with the sentiment because there’s something they don’t usually tell you about in the popular press. They’ll talk about a 64-qubit quantum computer as if it were something like a 64-bit classical computer. But see the Wikipedia article on 64-bit computing. It’s “the use of processors that have datapath widths, integer size, and memory address widths of 64 bits (eight octets). Also, 64-bit computer architectures for central processing units (CPUs) and arithmetic logic units (ALUs) are those that are based on processor registers, address buses, or data buses of that size”. A 64-qubit quantum computer is nothing like that. It merely has 64 qubits at its heart. It takes 4 bits to make up one hexadecimal character, so we need 64 bits for 16 hex characters, like this: FFFF FFFF FFFF FFFF. The IBM Q System One has only 20 qubits. In hex terms that’s FFFFF. It really isn’t much.
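The arithmetic above can be checked with a throwaway sketch, plain Python and nothing more, comparing the two register widths in hex characters and in distinct values:

```python
# Quick sketch of the register arithmetic above: 4 bits make one hex
# character, so a register of n bits spans n / 4 hex characters.
def hex_width(bits):
    return bits // 4

for bits in (20, 64):
    width = hex_width(bits)
    print(f"{bits} bits -> {width} hex chars "
          f"({'F' * width}), {2 ** bits:,} distinct values")
```

A 20-bit register spans about a million values (FFFFF); a 64-bit one spans about 1.8 × 10¹⁹, which is the sense in which FFFFF really isn’t much.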
That’s not to say all ongoing work in the field of quantum computing must be useless. People have complained that D-wave are only making analogue computers rather than real quantum computers. But as a person who takes a “realist” approach to physics, I don’t have a problem with that. Particularly since analogue computers can be very powerful, and very useful.
Analogue computer and D-wave lattice, see pinky on SDIY and Quantum annealing with more than one hundred qubits
A slide rule is an analogue computer; we used slide rules to design atom bombs before electronic calculators were around. They did the job. An abacus did the job in China for a couple of thousand years. Other analogue computers used water and did the job. For example, if you wanted to find a path through a maze, you could pump water into the entrance and throw in some coloured dye. As the water escaped from the exit, the dye would trace out the path through the maze. Other analogue computers used electronics to do the job, such as for fire control in a battleship. So I wouldn’t rule out all of the engineering and manufacturing work that’s been ongoing. Especially since the nitty-gritty stuff can deliver a serendipitous spinoff, such as the transistor.
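The water-and-dye trick is, in digital terms, a breadth-first flood fill: the water spreading is the fill, and tracing the dye back from the exit recovers the path. A minimal sketch (the maze layout is my own made-up example):

```python
from collections import deque

# '#' is a wall, spaces are open; S is the entrance, E is the exit.
MAZE = [
    "#########",
    "S   #   E",
    "### # # #",
    "#       #",
    "#########",
]

def solve(maze):
    grid = [list(row) for row in maze]
    h, w = len(grid), len(grid[0])
    start = next((r, c) for r in range(h) for c in range(w) if grid[r][c] == "S")
    exit_ = next((r, c) for r in range(h) for c in range(w) if grid[r][c] == "E")
    came_from = {start: None}
    queue = deque([start])
    while queue:                               # the "water" spreading out
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < h and 0 <= nc < w and grid[nr][nc] != "#" \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    path, cell = [], exit_                     # the "dye" tracing back
    while cell is not None:
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]                          # entrance-to-exit order

print(solve(MAZE))
```

The analogue version does all of this “in parallel” for free, which is the point: the physics does the computing.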
Fundamental physics raises fundamental issues
I’d say the real issue is the fundamental physics. Take a look at the Wikipedia qubit article. It says this: “A qubit is a two-state (or two-level) quantum-mechanical system, one of the simplest quantum systems displaying the peculiarity of quantum mechanics. Examples include: the spin of the electron in which the two levels can be taken as spin up and spin down”. The article also says quantum mechanics allows the qubit to be in a coherent superposition of both states simultaneously, a property “which is fundamental to quantum mechanics and quantum computing”. However you could make the same claim about a vector pointing North East. It points North, and it points East:
Qubit image from The Future of Computing – Quantum & Qubits by Sam Sattel autodesk
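The north-east analogy can be put in numbers. A single-qubit state is a unit 2-vector of amplitudes over the basis states, just as a NE vector has N and E components; a minimal sketch with my own toy numbers, nothing vendor-specific:

```python
import math

# A qubit state alpha|0> + beta|1> is a unit vector; "north-east"
# corresponds to equal components, i.e. theta = 45 degrees.
theta = math.pi / 4
alpha, beta = math.cos(theta), math.sin(theta)

# The squared amplitudes are the measurement probabilities, and they
# sum to 1, just as N^2 + E^2 = 1 for a unit vector pointing NE.
p0, p1 = alpha ** 2, beta ** 2
print(round(p0, 6), round(p1, 6), round(p0 + p1, 6))  # 0.5 0.5 1.0
```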
Then if you’ve read up on the history of physics and know about the electron and the wave nature of matter, you know that superposition is just a wave phenomenon. We can make electrons and positrons out of photons in gamma-gamma pair production, and we can diffract electrons. See the Wikipedia article on the superposition principle, which has a section on wave superposition. Also see the article on the Heisenberg uncertainty principle, which is “inherent in the properties of all wave-like systems”. An electron is a wave-like system. How are you going to leverage a supercomputer out of a modest number of wave-like systems with an indeterminate state? I’ve read about the history of quantum mechanics, and I do not believe in magic. So it feels to me as if the fundamental physics raises fundamental issues here. I just can’t see how quantum computing is ever going to work.
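Superposition in the plain wave sense is just pointwise addition. A sketch using two nearby tones (my own illustrative frequencies): where the waves overlap their amplitudes add, so the sum swells and fades as they drift in and out of phase.

```python
import math

def sample(freq, t):
    return math.sin(2 * math.pi * freq * t)

# Add a 5 Hz wave and a 6 Hz wave over one second, sampled at 1 kHz.
ts = [i / 1000 for i in range(1000)]
combined = [sample(5, t) + sample(6, t) for t in ts]

# Superposition is pointwise addition, so the sum beats once per second:
# around t = 0.5 s the two waves are in antiphase and nearly cancel,
# while elsewhere they reinforce to nearly double amplitude.
mid = max(abs(x) for x in combined[480:520])
peak = max(abs(x) for x in combined)
print(mid < 0.2, peak > 1.5)
```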
Is spookiness under threat?
My attitude is coloured by a 2007 New Scientist article written by Mark Buchanan called Quantum Entanglement: Is Spookiness Under Threat? He referred to a paper by Joy Christian entitled Disproof of Bell’s Theorem by Clifford Algebra Valued Local Variables. In essence Christian argued that John Bell got his famous theorem wrong because he assumed that hidden variables commute. That’s because experimentally observed values +1 and -1 look like they commute. However Bell didn’t consider rotations, which do not commute. For rotations, the result depends on the sequence:
Rotations image from New Scientist
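The order-dependence in the picture is easy to verify with Hamilton’s quaternion product. A minimal hand-rolled sketch, no libraries: two 90° rotations about different axes compose to different quaternions depending on which comes first.

```python
import math

# Hamilton's product for quaternions written as (w, x, y, z).
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

# A rotation by an angle about a unit axis is cos(angle/2) + sin(angle/2)*axis.
half = math.pi / 4                        # half of 90 degrees
rot_x = (math.cos(half), math.sin(half), 0.0, 0.0)   # 90 deg about x
rot_y = (math.cos(half), 0.0, math.sin(half), 0.0)   # 90 deg about y

# The two composition orders give different quaternions, hence different
# final orientations: rotations do not commute.
print(qmul(rot_x, rot_y))
print(qmul(rot_y, rot_x))
```

The two results differ only in the sign of the final component, but that sign difference is a genuinely different rotation, which is the whole point.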
The article says this: “In 1843, the Irish mathematician William Rowan Hamilton found a way to capture this non-commuting property in a set of number-like quantities called quaternions. Later, the English mathematician William Clifford generalised Hamilton’s quaternions into what modern mathematicians call Clifford algebra, widely considered the best mathematics for representing rotations. So convenient are quaternions that they are commonly used in computer graphics and aviation”. This is William Kingdon Clifford, the man who came up with the space theory of matter. The man who said “I hold in fact: (1) That small portions of space are in fact of a nature analogous to little hills on a surface which is on the average flat; namely, that the ordinary laws of geometry are not valid in them”. I’m confident enough about that because of Schrödinger’s 1926 paper, where on page 26 he talked about light rays showing remarkable curvature and getting into a small closed path. He was talking about the electron, and I know that electron spin is real. See Hans Ohanian’s 1984 paper What is spin? He said “the means for filling the gap have been at hand since 1939, when Belinfante established that the spin could be regarded as due to a circulating flow of energy”.
Gif courtesy of Adrian Rossiter’s torus animations, S-orbital image from the 2010 Encyclopaedia Britannica
An electron goes around and around in a uniform magnetic field because it’s a dynamical “spinor”. The spin is a compound spin ½ rotation, but it’s hidden because the electron looks like a standing wave. So we have a hidden variable which is a rotation. Which makes it clear to me that Christian was essentially correct. What’s not to like?
Burn the heretic
Buchanan’s New Scientist article said “twenty years ago, it was heretical even to raise such an idea”. It seems it still is, because Christian received some awful opprobrium. From the likes of Scott Aaronson, a vocal advocate of quantum computing. See Aaronson’s 2012 blog post entitled I was wrong about Joy Christian. It is offensive, it is insulting, and it makes for deeply unpleasant reading. There are other similar posts. Christian’s refutation of Aaronson’s claims makes for interesting reading. See section D where he talks about Aaronson: “His campaign involved mockery, defamation, incitement, name-calling, cyber-bullying, cyber-mobbing, and various other forms of intimidation tactics and ad hominem attacks, rationalized by reiteration of some incorrect criticisms of my argument previously advanced by others”. It would seem that Aaronson thought Christian’s physics was some kind of threat to his quantum computing livelihood. Christian also says “The purpose of his shaming campaign was not just public humiliation and discrediting of my research, but, in his own words, also to starve me off by cutting off my financial and academic supports, thereby thwarting my ability to continue my work”. Apparently this is fine by the University of Texas at Austin. It would seem that academics think it’s perfectly acceptable to hurl insults and impose censorship. It would seem that that’s the way they are, and have been for years. It’s the modern equivalent of burn the heretic. Interesting reading indeed.
How Space and Time Could Be a Quantum Error-Correcting Code
Something else that’s interesting reading is How Space and Time Could Be a Quantum Error-Correcting Code. It was written by Natalie Wolchover, and appeared in Quanta magazine in January 2019. It makes interesting reading for a very different reason: it’s extremely hypothetical. It starts by saying this: “In 1994, a mathematician at AT&T Research named Peter Shor brought instant fame to “quantum computers” when he discovered that these hypothetical devices could quickly factor large numbers – and thus break much of modern cryptography”. How do you “discover” what a hypothetical device can do? Especially when “qubits are maddeningly error-prone” such that “the feeblest magnetic field or stray microwave pulse” causes them to undergo bit flips or phase flips? The article gets even more hypothetical, because it says in 1995 Shor came up with a proof that “quantum error-correcting codes” exist. It quotes Scott Aaronson saying “This was the central discovery in the ’90s that convinced people that scalable quantum computing should be possible at all”. The problem, of course, is that it isn’t a real discovery. It’s not like discovering America, or discovering penicillin.
A deep connection between quantum error correction and the nature of space, time and gravity
The article then gets yet more hypothetical: “But in the dogged pursuit of these codes over the past quarter-century, a funny thing happened in 2014, when physicists found evidence of a deep connection between quantum error correction and the nature of space, time and gravity”. Surely that’s so hypothetical it’s hyperbole? Whatever next? This: “In Albert Einstein’s general theory of relativity, gravity is defined as the fabric of space and time – or “space-time” – bending around massive objects. (A ball tossed into the air travels along a straight line through space-time, which itself bends back toward Earth)”. That’s wrong. That isn’t how gravity works. A ball tossed into the air doesn’t travel along a straight line through spacetime. Spacetime is a mathematical abstraction that models space at all times, so there’s no motion through it. Moreover spacetime curvature relates to the tidal force, not the force of gravity. Take a look at the room you’re in. The force of gravity is 9.8 m/s² at the floor and at the ceiling. So there’s no detectable tidal force, and so no detectable spacetime curvature. But your ball still falls down.
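The floor-to-ceiling point can be put in rough numbers using the Newtonian g(r) = GM/r² with standard textbook constants and a nominal 3 m room height:

```python
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24     # Earth's mass, kg
R = 6.371e6      # Earth's mean radius, m

def g(r):
    """Newtonian gravitational acceleration at radius r from Earth's centre."""
    return G * M / r ** 2

# Difference in g between the floor and a ceiling 3 m higher.
delta = g(R) - g(R + 3.0)
print(round(g(R), 2), delta)
```

g comes out at about 9.8 m/s² at both heights; the tidal difference is of order 10⁻⁵ m/s², roughly a part per million, while the ball still falls at 9.8 m/s².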
Quantum gravity is a castle in the air
Then comes some more hype, this time about quantum gravity: “But powerful as Einstein’s theory is, physicists believe gravity must have a deeper, quantum origin from which the semblance of a space-time fabric somehow emerges”. Who believes that? I don’t. I believe quantum gravity is a castle in the air. I also believe quantum electrodynamics works after a fashion because when the electron and the proton attract one another, they “exchange field” such that the resultant hydrogen atom has very little in the way of an electromagnetic field. You can mentally chop up this exchanged field into chunks or quanta and say each is a virtual photon. But note that the electron and proton move towards each other because of the screw nature of electromagnetism. Because they’re dynamical “spinors”. It’s something like the way counter-rotating vortices move towards one another. The motion does not occur because they’re exchanging particles. See the peculiar notion of exchange forces part I and part II by Cathryn Carson. Hydrogen atoms don’t twinkle, and magnets don’t shine. In similar vein Einstein made it clear that light curves downwards because the speed of light is spatially variable. Not because gravitons are flying around.
Lies-to-children about anti-de Sitter space
It gets worse, because next comes some lies-to-children about anti-de Sitter space. Wolchover says “three young quantum gravity researchers came to an astonishing realization”. She tells us “they were working in physicists’ theoretical playground of choice: a toy universe called ‘anti-de Sitter space’ that works like a hologram”. And that “the bendy fabric of space-time in the interior of the universe is a projection that emerges from entangled quantum particles living on its outer boundary”. Oh boy, it’s the holographic universe. And guess what? The holographic emergence of space-time works just like a quantum error-correcting code, and space-time itself is a code!
Absurd claims about spacetime
It gets even worse, because then we get absurd claims like this: “John Preskill, a theoretical physicist at the California Institute of Technology, says quantum error correction explains how space-time achieves its “intrinsic robustness,” despite being woven out of fragile quantum stuff”. Again, spacetime is a mathematical abstraction that models space at all times. So there is no motion in it, so these guys don’t understand general relativity. They don’t understand black holes either. Why? Because the article says this: “The language of quantum error correction is also starting to enable researchers to probe the mysteries of black holes: spherical regions in which space-time curves so steeply inward toward the center that not even light can escape”. That’s wrong. Light doesn’t follow the curvature of spacetime. It curves wherever there’s a gradient in the speed of light, which is also a gradient in gravitational potential. Like Don Koks the physicsFAQ editor said, the ascending light beam speeds up. In a black hole the ascending light beam doesn’t curve back round. Instead it doesn’t get off first base. That’s because black holes are places where the “coordinate” speed of light is zero. Not “paradox-ridden places where gravity reaches its zenith and Einstein’s general relativity theory fails”.
HaPPY code and Tinkertoys
Things go seriously downhill after that. Because we then get Juan Maldacena “discovering” that the bendy space-time fabric in the interior of anti-de Sitter space is “holographically dual” to a quantum theory of particles living on the lower-dimensional, gravity-free boundary. Again, this isn’t a real discovery. It’s a conjecture claiming that there is some miraculous equivalence between two other conjectures. Wolchover’s article then talks about the HaPPY code and Tinkertoys tiling space, wherein a black hole is defined by “the breakdown of correctability” and is “like a sink for your ignorance”. Oh, the irony. Then we get the black hole information paradox, “which asks what happens to all the information that black holes swallow. Physicists need a quantum theory of gravity to understand how things that fall in black holes also get out”. Yes, it’s like a sink for your ignorance, because these guys don’t know how gravity works, or why black holes are black. So they don’t understand the problems with Hawking radiation, or the information paradox, or firewalls. They don’t understand that there are gamma ray bursters, and that matter falls faster and faster because of the reducing speed of light. But what does Wolchover’s article say? Quantum error correction stops firewalls from forming, and instead leads to a two-mouthed black hole called a wormhole! Doubtless they’re holographic too.
Grabbing the limelight
It’s all way too speculative. That’s not good. Especially since quantum computing is grabbing the limelight and leaving other things in the shade. Take a look at the 2010 physicsworld article taming light at the nanoscale. That’s where Nader Engheta talks about optical computing:
Fair use excerpt from taming light at the nanoscale
Optical computing is great science because displacement current is more fundamental than conduction current. Light is displacement current. In gamma-gamma pair production we convert light into an electron and a positron. Then when you move the electron, that’s conduction current. It’s much better to cut out the middleman and use the original light, which moves at c all on its own. Unfortunately when you look on the arXiv, you can see that Nader Engheta isn’t working on optical computing any more. I’m not happy about that.
The case against quantum computing
I’m not the only one who isn’t happy. See The Case Against Quantum Computing by Mikhail Dyakonov. He says things like this: “a useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe”. Along with “I’m skeptical that these efforts will ever result in a practical quantum computer”. Also see The U.S. National Academies Reports on the Prospects for Quantum Computing by David Schneider. He says things like this: “as I read through various parts of the report, I repeatedly found justification for Dyakonov’s skepticism about the prospects for quantum computing”. Along with “This stands in contrast to the rosy picture of the field you’ll find in much of the popular press”. Also see the review of Scott Aaronson’s book by Guy Wilson. He says things like this: “the book is simply a propaganda tool for the author’s fantasy”. Along with “he is dismissive, or even derisive of these arguments, for a very simple reason: His bread is deliciously buttered by the lucrative investments in the idea of quantum computers”. Last but not least, see Quantum computing as a field is obvious bullshit by Scott Locklin. He says things like this: “In 2010, I laid out an argument against quantum computing as a field based on the fact that no observable progress has taken place. That argument still stands. No observable progress has taken place”. Along with “Hundreds of quantum computing charlatans achieved tenure in that period of time”. Along with “How many millions have been flushed down the toilet by these turds?” Along with “I do feel sad for the number of young people taken in by this quackery”. I’m afraid to say I think he’s right.
There may be trouble ahead
I say that because I know about computer science, and about physics. I know enough to know that quantum computing isn’t science, it’s pseudoscience. It isn’t founded on physics, it’s founded on fantasy. It’s hype and hot air. It’s pie in the sky and jam tomorrow. It’s a tottering tower of unsupported conjecture atop a layer of specious speculation riding a raft of foundation-free hypothesis. Quantum computing isn’t just a waste of time and money. It’s like counterfeit money, the bad money which chases out the good. It’s been chasing out optical computing, and now it’s chasing out physics. Because now “quantum technology” is getting more and more funding. Even though it’s just Emperor’s New Clothes nonsense from a bunch of charlatans. Even though it’s just smoke-and-mirrors claptrap from a cabal of quantum quacks. Quantum quacks who have censored their critics with venom and bile. Quantum quacks who, with the help of their symbiotic hacks, have peddled enough snake oil and moonshine to persuade gullible politicians to give them the money. There may be trouble ahead.
Note 26/10/2019: there’s been stuff in the press this week about Google saying they’ve achieved quantum supremacy. I recommend you Google on quantum supremacy hype. I also recommend you read the Nature article and pay attention to this: “The team challenged its computer, known as Sycamore, to describe the likelihood of different outcomes from a quantum version of a random-number generator. They do this by running a circuit that passes 53 qubits through a series of random operations. This generates a 53-digit string of 1s and 0s — with a total of 2⁵³ possible combinations (only 53 qubits were used because one of Sycamore’s 54 was broken). The process is so complex that the outcome is impossible to calculate from first principles, and is therefore effectively random. But owing to interference between qubits, some strings of numbers are more likely to occur than others. This is similar to rolling a loaded die — it still produces a random number, even though some outcomes are more likely than others”. That’s no calculation. That’s rolling 53 loaded dice, then saying a real computer can’t calculate how they’ll turn up. When Google realise they’ve been conned, there will be hell to pay.
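For scale, the 2⁵³ figure works out like this (plain arithmetic, nothing Sycamore-specific):

```python
# 53 qubits read out as 53 classical bits give 2**53 possible strings.
n_qubits = 53
n_strings = 2 ** n_qubits
print(n_strings)   # 9007199254740992, i.e. about 9 * 10**15
```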