Here’s a fun, easy topic to start off with.

I would like to stress that I do *intend* to try to make this as accessible as I possibly can. I am by no means a great science communicator, nor one of those absolute genius researchers I keep running into, who seem to understand topics in ways that I don’t and who wouldn’t make a dumb mistake if they tried.

I am liable to make mistakes here, but I’ll give it my best shot anyway. This is meant to be a very gentle, pop-science introduction to the idea rather than a full, detailed and painstakingly researched article. It will also meander a lot into niches I find interesting, especially in this first bit.

This is part 1 of possibly 3 in total and I promise I’ll get there somehow. For now I’m stuck meandering the gorgeously lush hills of Information Theory so I hope you’ll join me on this ill-conceived, hedonistic adventure. I’ll start with the basics.

## Yeah, so, what is a*n* ~~quantum~~ algorithm anyway?

Here’s a fun fact: technical scary-looking words are genuinely useful for describing concepts. Scientists don’t just throw them around to look smart (although, er, honestly sometimes maybe we do. Then again sometimes we do the opposite and make science sound absolutely brilliant). The point of a lot of the jargon we use is that the stuff we study *is* in fact very complicated and nuanced and requires very complicated and nuanced words to be conveyed without any misunderstanding or misinformation being spread throughout the community.

Now: the word **quantum** is a great mess in its own right. I direct you to the thousands of examples of media misusing and abusing it simply to make a thing sound ‘intelligent’ and somehow otherworldly. The most recent example, as far as I’ve seen, is a new cosmic-horror game for the PS5 called ‘Quantum Error’. I can only hope it involves some Quantum Error Correction (I’ll get to that in a later post) as a means of plot resolution too.

**Quantum** comes from the idea of (big surprise here) quantity – at least its etymology does. Physicists appropriated it beautifully (as they are prone to do: see quarks) in order to describe the idea of units or “quanta” of physical quantities. Something particularly interesting in Quantum Mechanics is the notion that we require something inherently continuous, like waves, to deal with things that are unexpectedly discrete, like energy levels of electrons and such (in reality it’s even more interesting than that – I like to think of wavefunctions as a special kind of probability function and there is a question as to what probabilities really are before the events they describe occur – i.e. before the wavefunction is measured). Quantum mechanics deals with things on a scale on which things stop intuitively making sense and even mathematically tend to boggle the mind sometimes. Central to this is often the idea of randomness and uncertainty.

The word **algorithm** has just as long and interesting a history. At its heart, it is a *recipe* – with a few important caveats. Like a recipe, it is a set of instructions, indeed a *finite* set (sequence, more like, meaning that it is also ordered in time). It must also be *well-defined*: there shouldn’t be room left for interpretation on what each step is or what it should end up doing. An algorithm will take an input and process it in a specific way to produce a very particular output.
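To make that concrete, here’s a classic example (my pick, purely for illustration): Euclid’s algorithm for the greatest common divisor. It’s a finite, ordered, well-defined sequence of steps that takes an input and produces a very particular output.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, ordered, well-defined recipe.

    Input: two non-negative integers (not both zero).
    Output: their greatest common divisor.
    """
    while b != 0:          # repeat until the remainder is zero
        a, b = b, a % b    # replace (a, b) with (b, a mod b)
    return a               # the last non-zero remainder is the gcd

print(gcd(48, 18))  # → 6
```

Every step is unambiguous, the loop is guaranteed to terminate, and the same input always yields the same output – exactly the caveats that separate an algorithm from a mere recipe.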

So what happens when we try to mash up these two words from two different backgrounds? First off, I should say that their backgrounds aren’t so different after all! Central to the idea of both algorithms (as well as computing in general) and Quantum Mechanics (and a lot of Physics) is *information*. An algorithm is a recipe for processing information same as a recipe in a cookbook is for food. Quantum Mechanics has an intimate, turbulent and often passionate relationship with information too. In fact, it is *central* to the whole thing. It’s no surprise that the idea of Quantum Computers took so little time to appear in the minds of scientists after either of the two fields comprising it were established. There is an obvious link.

## Information?

Alright so honestly I could spend days talking about this so I’ll try and keep it concise.

The most fundamental building block of reality is information. This can be contested quite easily, as can many claims about the fundamental nature of the universe, but it is a fairly benign and beautiful idea which I personally enjoy and maybe even subscribe to.

It all started with a guy called Shannon, who decided to not only precisely quantify what information *is*, but also how to deal with it mathematically and apply it. He reduced it to a pragmatic and tangible entity which – well, *you* try and define information in a formal setting. I *dare* you. *Without looking it up.* The point is that his view of information was amazing for uniting the fields of Logic, Statistical and Quantum Physics as well as Computational Complexity (and more) and giving them something concrete to work with.

In essence, Shannon took the binary digit (bit) and turned it into a fundamental unit of this thing called information. One funny thing that jumps out about this first is that, like many physical quantities in quantum mechanics (spin, energy levels and all that schtick), information is *quantized*. Not only that, but from its simplicity we can achieve something called *universality*, a topic favoured by my personal hero Alan Turing (in essence it means that a computer of some sort can do *any* mathematical manipulation in existence). Using something as simple as 1s and 0s or ‘yes’ and ‘no’ or even ‘know’ and ‘don’t know’, we can create computers which manipulate these wonderful binary beasts into the most complicated mathematical forms. Complexity from simplicity.
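A tiny sketch of that “complexity from simplicity” idea (my own illustrative example): the NAND gate alone is functionally universal, meaning any Boolean function – and, by extension, the logic of any classical computer – can be built from it. Here’s one-bit addition (a half-adder) made of nothing but NANDs.

```python
def nand(a: int, b: int) -> int:
    """The single building block: NAND is functionally universal."""
    return 1 - (a & b)

def half_adder(a: int, b: int) -> tuple[int, int]:
    """One-bit addition, returning (sum, carry), built entirely from NANDs."""
    x = nand(a, b)
    s = nand(nand(a, x), nand(b, x))  # XOR built from four NANDs
    c = nand(x, x)                    # AND = NOT(NAND)
    return s, c

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> sum={half_adder(a, b)[0]}, carry={half_adder(a, b)[1]}")
```

Chain enough of these and you get full adders, then arithmetic units, then – eventually – Cuphead. Complexity from simplicity indeed.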

So yes, information is quantized and binary at its base level and algorithms which run on computers generally use bits to represent the information they’re working with. How does this relate to the fundamental structure of the universe or Quantum Mechanics? I’ve mentioned that algorithms process information so the link there is fairly obvious. Now we’ve established that you can reduce the idea of information to fundamental ‘bits’ and that their manipulation can lead to complicated processes and mathematical manipulation. You can use a computer to add two numbers together or to play Cuphead and all of that reduces to 1s and 0s and how we choose to manipulate them via algorithms.

Here’s the first link: Turing’s work on universal machines was extended gorgeously to the **Church**–**Turing**–**Deutsch** principle:

> A universal computing device can simulate every physical process.

This introduces a sort of link between the physical and this intangible thing known as information. Is information really so nebulous though?

Shannon decided to link this new, formalised idea of information to **entropy**. In statistical mechanics, entropy measures the uncertainty about the state of a physical system; more generally, it expresses the *disorder*, or *randomness*, of a system. Shannon reinterpreted physical entropy as a measure of the uncertainty about a message – the lack of information about it. Both information entropy and physical entropy share the same mathematical description! In fact, work by Szilárd (note that the original is in German) and Landauer showed that erasing one bit of information increases entropy by at least *k* ln(2), which at temperature *T* corresponds to dissipating *kT* ln(2) of heat, where *k* is Boltzmann’s constant and *T* the environment’s temperature.
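That shared mathematical description is Shannon’s entropy formula, H = −Σ pᵢ log₂ pᵢ, measured in bits. A quick sketch (mine, purely for illustration): a fair coin flip carries one full bit of uncertainty, a certain outcome carries none, and a biased coin sits somewhere in between.

```python
import math

def shannon_entropy(probs: list[float]) -> float:
    """H = -sum(p * log2(p)) in bits; terms with p == 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin → 1.0 bit of uncertainty
print(shannon_entropy([1.0]))       # certain outcome → 0.0 bits
print(shannon_entropy([0.9, 0.1]))  # biased coin → ~0.47 bits
```

The same expression (up to Boltzmann’s constant) appears as the Gibbs entropy in statistical mechanics, which is exactly the coincidence Shannon’s work exposed.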

What does any of this mean? Well, you can link entropy to computation. A computation that can be reversed – like flipping a bit from 0 to 1 and then back to 0 – does not increase entropy. However, an *irreversible* computation, like the erasure of information, will always increase entropy. Information is physical: by deleting its physical manifestation as strings of bits, the universe reacts! The process of erasing a bit in one place transfers information to another place, in the form of *heat*. In other words:
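A back-of-the-envelope sketch of Landauer’s bound (the numbers are mine; the constant is the standard SI value): at room temperature, erasing one bit must dissipate at least *kT* ln(2) of heat. And erasure is irreversible precisely because it isn’t invertible – both 0 and 1 map to 0, so there is no way to run it backwards.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K (exact, by SI definition)
T = 300.0         # roughly room temperature, K

# Landauer's bound: minimum heat dissipated per erased bit
landauer_joules = k * T * math.log(2)
print(f"Erasing one bit at {T} K costs at least {landauer_joules:.3e} J")

# Reversible: flipping a bit is its own inverse -- no information lost
flip = lambda b: 1 - b
assert flip(flip(0)) == 0 and flip(flip(1)) == 1

# Irreversible: erasure maps both inputs to 0 -- no inverse exists
erase = lambda b: 0
assert erase(0) == erase(1)  # the lost bit must leave as heat
```

A few zeptojoules per bit sounds negligible, but it sets a hard thermodynamic floor under every irreversible computation.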

> Information cannot be destroyed.

This is an incredible notion and it links the idea of information directly to the physical world. I recommend reading Landauer’s seminal paper if you find this sort of thing interesting.

This notion of information being physical is important to the idea of Quantum Mechanics too (ah, see, we’re starting to get somewhere). If a universal computer can simulate any physical process as per the Church–Turing–Deutsch principle and Quantum Mechanics describes the fundamental small-scale physics of our reality, then how does the physical aspect of information manifest in such a weird world? No one even claims to understand Quantum Mechanics (oh and I’ll get to that).

Here’s a quote I absolutely adore by Seth Lloyd to get us started:

> Merely by existing, all physical systems register information. And by evolving dynamically in time, they transform and process that information. The laws of physics determine the amount of information that a physical system can register (number of bits) and the number of elementary logic operations that a system can perform (number of ops).

Really what he means above is that the universe runs on information, like a giant computer. What that implies is that Quantum Mechanics should also run on information. There should be a processing limit (which he defined, by the way) and a way to access and understand the algorithms running on this hardware. Quantum systems should allow for computation in the same way that classical mechanics does, or a classical computer of some sort.

The only problem is, Quantum Information works a little differently to what we’ve been talking about up until now. Often things get a little strange and un-intuitive when it comes to Quantum Mechanics. That doesn’t mean it’s not related to any classical notion of information though!

## Quantum Information?!?!

I feel like I’ve reached the end of my Coronavirus-induced lockdown brain capacity today. I understand your frustrations, yes. This is definitely the bit where we’re getting somewhere, but that’s why I’ll dedicate myself to this properly in another long-winded post later on. One has to approach Quantum Mechanics with the trained caution of a bull-fighter – and a good night’s sleep.

I appreciate that if there is just one person who reads this, they might ask questions or give feedback. If not, this is a wonderful writing exercise. I genuinely cannot wait to dig into some qubit-quandaries as soon as life allows me to.

FWIW, I’m not entirely sure I buy the notion that information cannot be destroyed. For one thing, it creates the infamous black hole information paradox, which would be easily resolved if information *can* be destroyed. For another, as far as I know (and I could be wrong about this), unlike other conservation laws, conservation of information is not based on any natural symmetry. Mostly, it seems to be a premise of QM and unitarity, but it’s not entirely clear to me that QM has to be unitary.
