Philosophy and the quantum computer
The events of the past few weeks have provided much fodder to this columnist. First, there was the American grandstanding on immigration, and then the events at Cognizant Technology Solutions Corp., Infosys Ltd and Tata Consultancy Services Ltd. More recently, Cloudflare Inc., which hosts information for close to two million websites, including Uber Technologies Inc. and OKCupid, had an Internet security disaster that saw the leak of passwords, cookies, and private messages from adult dating sites. This "Cloudbleed" bug was discovered on 17 February 2017, but had evidently been present for many months before that.
Decisions, decisions—what does one chew on first? It’s a bit like being at an all-you-can-eat buffet and not knowing where to start. Unfortunately, I am on a diet and for this week at least, I will have to skip the buffet and ask for the à la carte menu instead. I will settle for something exotic, followed by a palate cleanser as a counterpoint to the small main meal.
Akin to the above analogy of fasting or feasting, all of today’s computing has its roots in the world of “bits”, where a transistor bit, which lies at the heart of any computing chip, can only be in one of two states: on or off. When on, the bit takes on a value of “1” and when off, it takes on a value of “0”, constraining the bit to only one of two values. This is what makes the digital world binary. All tasks performed by a computer-like device, whether a simple calculator or the sophisticated mobiles, tablets and computers we own, are constrained by this binary rule.
Eight bits make up what is called a byte. Today, our computing is based on increasing the number of bytes into kilobytes, megabytes, gigabytes and so on.
Computer scientists extend this “on/off” axiom into Boolean algebra, which pairs naturally with the idea of 8 bits to a byte: by assigning different meanings to the varying strings of 1s and 0s in byte-based computer chips, they give us the ability both to feed information into a computer and to consume its output in natural language.
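The mapping from bit strings to natural language can be seen in a few lines of Python. This is a minimal sketch, not how any particular chip works: it shows that an ordinary character is stored as an 8-bit pattern, and that the same pattern decodes back to the character.

```python
# Each character of text is stored as an 8-bit pattern (one byte).
ch = "A"
bits = format(ord(ch), "08b")   # the 8-bit pattern behind 'A'
print(bits)                     # prints 01000001

# Reading the string of 1s and 0s back recovers the character.
decoded = chr(int(bits, 2))
print(decoded)                  # prints A
```

Eight such patterns per byte give 256 possible values, which is why early character sets fit comfortably into one byte each.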
As a result, all computing advances we have had thus far, including artificially intelligent programmes, driverless cars, and big data prediction models, are ultimately reduced to the binary world of the bit. This is not very difficult to understand when we realize that for centuries, western philosophy has followed the principles of Aristotelian logic. Aristotle’s logic, immediately apparent to any secondary school student, is based on the law of identity (A is A), the law of contradiction (A cannot be both A and non-A at the same time), and the law of the excluded middle (everything must be either A or non-A; there is no third possibility).
This Aristotelian axiom is so deeply embedded in our thinking and educational systems that to us a statement that something is both A and non-A at the same time seems absurd. Paradoxically, however, the idea that something can be both A and non-A at the same time is the holy grail of a new idea called quantum computing.
This idea was first proposed in 1985 by British physicist David Deutsch, but has gained currency only recently. A quantum computer is a machine that operates according to the principles of quantum mechanics, which is the physics of very small particles like electrons and photons.
With quantum computing, information is held in “qubits” that can exist in two states at the same time. Incredibly, and against the rules of basic electronics, but in line with the esoteric mathematics of quantum mechanics, a qubit can store a “0” and a “1” simultaneously. Two qubits can hold four values at once: 11, 10, 01 and 00. Each additional qubit doubles the number of states the machine can represent, so scaling up to kilo-qubits, mega-qubits and giga-qubits would increase such a machine’s computing capability exponentially.
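The doubling can be sketched in plain Python. This is only an illustration of the counting argument, not a simulation of real quantum hardware: a classical n-bit register holds one of 2^n values at a time, while the state of an n-qubit register is described by 2^n amplitudes at once.

```python
import itertools

n = 2  # two qubits, as in the example above

# The basis states an n-qubit register spans: 2**n of them.
basis_states = ["".join(b) for b in itertools.product("01", repeat=n)]
print(basis_states)        # prints ['00', '01', '10', '11']
print(len(basis_states))   # prints 4, i.e. 2**n

# In an equal superposition, every basis state carries the same
# amplitude, and the squared amplitudes (probabilities) sum to 1.
amplitude = (2 ** n) ** -0.5
probabilities = [amplitude ** 2 for _ in basis_states]
print(round(sum(probabilities), 10))  # prints 1.0
```

Growing n from 2 to 10 takes the state description from 4 numbers to 1,024; this exponential blow-up is exactly what makes simulating quantum machines on classical hardware so hard, and what makes the machines themselves so tantalizing.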
D-Wave Systems Inc. currently manufactures a 512-qubit machine, a large step up from the 16-qubit processor it debuted in 2007.
Many of the world’s scientists argue that the D-Wave machine is something other than the computing holy grail Deutsch posited in 1985. The debate will continue, but Google Inc. and other technology companies have nonetheless procured the machine, in the hope that it processes information in the true sense of quantum mechanics.
And now for the palate cleanser: quantum physics and eastern paradoxical logic have many similarities.
Paradoxical logic stands in contrast to Aristotelian logic, and is the system on which many eastern spiritual philosophies such as Buddhism, Taoism, and Vedantic Hinduism are based. These logical formulations rest on positions such as the positive “it is and it is not” of Vedanta or, conversely, the nothingness views of both Buddhism and Vedanta: “it is neither this, nor that”.
I shall not go into a detailed exposition of philosophy here, since I run the risk of losing you completely while simultaneously exposing that I know nothing, but I will tarry long enough to point out that early quantum physicists like James Jeans saw these similarities. They prompted him to pronounce: “We have already considered with disfavour the possibility of the universe having been planned by a biologist or an engineer; from the intrinsic evidence of his creation, the Great Architect of the Universe now begins to appear as a pure mathematician.”
If quantum computing is indeed here already, then Jeans’s view is in line with my suggestion in an earlier column that the new advances in computing are best left to pure mathematicians and quantum physicists. And, just maybe, eastern philosophers.
Siddharth Pai is a world-renowned technology consultant who has led over $20 billion in complex, first-of-a-kind outsourcing transactions.