This month's X-Ray is unapologetically really small and cold. We haven't dragged the magazine off to the cult of the really-small-cold-people living under the bed of iamthemaxx, however.
We're looking at quantum computing: what it means, how it works, and the implications it has for how we'll be computing in the future. Put a lab coat on, load up on energy drinks and find yourself a sweet new graphing calculator - we're hitting the lab.
Bad 90s sci-fi television
Never mind the computing bit. The quantum part of things is complex enough. A quantum is the minimum unit or measurement involved in an energy interaction. In a sense, a quantum is the idea that the physical properties of anything can be quantised. This means that the value of a physical property can only take on discrete values, rather than varying continuously. It's the very smallest amount of a physical property that a system or environment can possess.
To add computing to this complex concoction, a quantum computer is one that takes advantage of quantum phenomena to perform operations on data.
To understand a quantum computer, we need to lay down the basics of a normal, binary computer first. In conventional computing, we've got 0s and 1s as representations of bits. These 0s and 1s form patterns within memory and registers, making up data that can be stored, calculated on and processed. In a quantum computer, we use qubits (quantum bits) instead.
The difference between a bit and a qubit is the states each can occupy. A bit is akin to a car ignition key: it's either 0, for off, or 1, for on. A qubit can still be 0 or 1, but critically, it can also be in a combination (a superposition) of both at once. Which value you find when you look is a matter of probability, and the limits on what you can know at once are tied up with Heisenberg's uncertainty principle. The power and raw capability of a quantum computer is buried within this ability to exist in many states - to effectively calculate in multiple states at once. Think of it as the difference between a single core, single threaded CPU processing one instruction and a dual core CPU processing two instructions simultaneously, then scale that up to the atomic level, where every qubit can take part in one of these multi-state computations in parallel. No need for extra cores - you get all that parallel state processing for free, thanks to the power of the universe!
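To make the bit-versus-qubit distinction concrete, here's a minimal sketch in plain Python. It's a toy classical simulation (real quantum hardware doesn't work like this under the hood), and the helper name `probabilities` is ours: a qubit's state is a pair of complex amplitudes, and the squared magnitude of each amplitude gives the probability of measuring 0 or 1.

```python
import math

# Toy model: a qubit is a pair of complex amplitudes (a, b)
# with |a|^2 + |b|^2 = 1. |a|^2 is the chance of reading 0,
# |b|^2 the chance of reading 1. (Helper name is ours.)

def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

# A classical bit is always definitely one or the other:
off = (1 + 0j, 0 + 0j)   # certainly 0
on = (0 + 0j, 1 + 0j)    # certainly 1

# A qubit can also sit in an equal superposition of both:
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

print(probabilities(off))   # (1.0, 0.0)
print(probabilities(plus))  # roughly (0.5, 0.5)
```

The ignition-key bit only ever has the two definite states; the superposed qubit carries both possibilities at once until it's read out.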
There's a cat involved!
There isn't much better than lounging around the house on a Saturday morning, taking snapshots of your cat doing stupid things, then posting them on the Internet. Caturday, as it has been dubbed by the masses. As it turns out, cats are pretty good at explaining quantum theory as well.
To better explain the quantum phenomena (entanglement and superposition) that make all this work, we're going to use a famous old thought experiment known as Schrödinger's Cat.
Let's say you've got a cat in a sealed box. You can't see inside the box. Inside said box, with the cat, is a poisonous gas that is somehow contained. Unfortunately, kitteh got curious, and may have released the poisonous gas. Now we're in a state where the cat is either dead (0) or alive (1). Until the point where we open the box to find out exactly what happened, the cat itself is notionally both dead (0) and alive (1), in that it exists in both states at once. This is known as a superposition. The superposition is destroyed when we open that box, however, because the 'measurement' forces a definite answer: either 0 or 1, but never both (that would be impossible, outside of quantum theory).
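The box-opening step can be sketched in code too. This is a classical simulation of measurement, with names of our own choosing: reading a superposed qubit picks 0 or 1 at random according to the amplitude probabilities, and afterwards the superposition is gone - the state is definite, and measuring again just repeats the first answer.

```python
import math
import random

# Toy simulation of 'opening the box': measuring a qubit picks
# 0 or 1 with probability |amplitude|^2, then the superposition
# collapses to a definite state. (Function name is ours.)

def measure(state):
    a, b = state
    outcome = 0 if random.random() < abs(a) ** 2 else 1
    # After measurement the mixture is destroyed: the state is definite.
    collapsed = (1 + 0j, 0 + 0j) if outcome == 0 else (0 + 0j, 1 + 0j)
    return outcome, collapsed

cat = (1 / math.sqrt(2), 1 / math.sqrt(2))  # 'dead and alive' until we look
outcome, cat = measure(cat)
print(outcome)               # 0 or 1, each with probability 1/2
outcome2, cat = measure(cat)
print(outcome2 == outcome)   # True: the box, once opened, stays answered
```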
The theory extends a bit further to explain entanglement. If we've got two boxes containing two entangled cats, and we open one of them to find a cat alive, we can state that the other cat is by definition alive too, even if that second box is never opened.
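The two-box version can be sketched as well. In this toy simulation (again, plain Python, names ours), a pair of entangled qubits is described by four amplitudes over the joint outcomes 00, 01, 10 and 11. In the state below, only 00 and 11 have any probability, so the two 'cats' always agree: opening one box settles both.

```python
import math
import random

# Two entangled qubits: four amplitudes over joint outcomes.
# Only '00' and '11' are possible here, so the two qubits
# always match - a toy stand-in for an entangled pair.

bell = {'00': 1 / math.sqrt(2), '01': 0.0,
        '10': 0.0, '11': 1 / math.sqrt(2)}

def measure_pair(state):
    r = random.random()
    total = 0.0
    for outcome, amp in state.items():
        total += abs(amp) ** 2
        if r < total:
            return outcome
    return outcome  # guard against floating-point rounding

first, second = measure_pair(bell)
print(first == second)  # True: opening one box tells you the other
```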
Herein lies some of the computing power unleashed by quantum theory. Because we have the ability to influence a system or algorithm dynamically, without changing an entire environment, complex calculations can be computed far more quickly than with a conventional binary methodology. You might call it 'true' parallelism in a sense. It's not stuffing things into pipelines and farming threads off to separate cores. It's really doing things (where a 'thing' is a discrete workload) at a physically simultaneous level.
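One way to glimpse where that parallelism comes from is to count states. This is a sketch under our own naming, not real quantum hardware: with n qubits, the joint state carries 2^n amplitudes, so a single quantum operation acts, in the mathematics, on every one of those 2^n basis states at once - something a classical machine would need 2^n separate slots to even write down.

```python
import math

# With n qubits, the joint state holds 2**n amplitudes.
# Putting each qubit into equal superposition spreads the
# state evenly over every n-bit pattern. (Names are ours.)

def equal_superposition(n):
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = equal_superposition(3)
print(len(state))  # 8 basis states tracked for just 3 qubits
```

Adding one more qubit doubles that count, which is why even a modest handful of qubits represents a state space no classical register can match bit-for-bit.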