Thursday, December 29, 2011

Untangling Entanglement



What interference was to Feynman (see the previous post), entanglement was to Erwin Schrödinger (he of the cat): the 'essence' of quantum mechanics. Entanglement is often portrayed as one of the most outlandish features of quantum mechanics: the seemingly preposterous notion that the outcome of a measurement conducted over here can instantaneously influence the outcome of a measurement carried out way over there.
Indeed, Albert Einstein himself was so taken aback by this consequence of quantum mechanics (a theory which, after all, he helped to create) that he derided it as 'spooky' action at a distance, and he never fully accepted it in his lifetime.
However, if we view quantum mechanics as a simple generalization of probability theory -- one we adopt in order to deal with the complementary propositions that arise when not all possible properties of a system are simultaneously decidable -- then quantum entanglement is unmasked as not really that strange after all, but in fact a natural consequence of the limited information content of quantum systems. In brief, quantum entanglement does not qualitatively differ from classical correlation; however, the amount of information carried by the correlation exceeds the bounds imposed by classical probability theory.
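To put a number on that last claim, here's a minimal sketch in Python (using numpy). It evaluates the CHSH combination of correlations for the spin singlet state, assuming the standard quantum prediction E(a, b) = -cos(a - b) for measurements along directions a and b, with the usual textbook choice of angles; any classical, locally causal assignment of outcomes is bounded by |S| ≤ 2.

```python
import numpy as np

# For the spin singlet state, the correlation between measurements along
# in-plane directions a and b is E(a, b) = -cos(a - b).
def E(a, b):
    return -np.cos(a - b)

# CHSH combination: every local hidden-variable model obeys |S| <= 2.
a1, a2 = 0.0, np.pi / 2            # Alice's two measurement angles
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement angles

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2) > 2
```

The excess over the classical bound of 2 is precisely the sense in which the correlation carries more information than classical probability theory allows -- while still permitting no signalling from one side to the other.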

Saturday, December 17, 2011

What is Quantum Mechanics?



So far, I've told you a little about where I believe quantum theory comes from. To briefly recap: information-theoretic incompleteness, a feature of every universal system (where 'universal' is to be understood in the sense of 'computationally universal'), introduces the notion of complementarity. This can be interpreted as the impossibility for any physical system to answer more than finitely many questions about its state -- i.e. it furnishes an absolute restriction on the amount of information contained within any given system. From this, one gets to quantum theory via either a deformation of statistical mechanics (more accurately, Liouville mechanics, i.e. statistical mechanics in phase space), or, more abstractly, via introducing the possibility of complementary propositions into logic. In both cases, quantum mechanics emerges as a generalization of ordinary probability theory. Both points of view have their advantages: the former is more intuitive, relying on little more than an understanding of the notions of position and momentum, while the abstractness of the latter, and especially its independence from the concepts of classical mechanics, highlights the fundamental nature of the theory -- it is not merely an empirically adequate description of nature, but a necessary consequence of dealing with arbitrary systems of limited information content. For a third way of telling the story of quantum mechanics as a generalized probability theory, see this lecture by Scott Aaronson, author of the always-interesting Shtetl-Optimized.
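The core of Aaronson's telling can be stated very concretely: classical probability theory describes states by vectors whose entries sum to one (the 1-norm), while quantum theory describes them by vectors whose squared magnitudes sum to one (the 2-norm). Here's a minimal numerical sketch of that contrast in Python with numpy; the particular matrices are just illustrative choices:

```python
import numpy as np

# Classical probability: a state is a vector of nonnegative reals summing
# to 1, evolved by a stochastic matrix (columns sum to 1), which preserves
# that sum.
p = np.array([0.75, 0.25])
stochastic = np.array([[0.9, 0.2],
                       [0.1, 0.8]])
print(np.sum(stochastic @ p))             # 1.0 -- the 1-norm is conserved

# Quantum theory: a state is a vector of complex amplitudes of unit 2-norm,
# evolved by a unitary matrix, which preserves the sum of squared magnitudes.
psi = np.array([1, 1j]) / np.sqrt(2)
hadamard = np.array([[1, 1],
                     [1, -1]]) / np.sqrt(2)
print(np.sum(np.abs(hadamard @ psi)**2))  # 1.0 -- the 2-norm is conserved
```

Stochastic matrices are the most general linear maps preserving the first kind of state, unitaries the most general ones preserving the second; on this view, that single change of norm is where phases, interference, and complementarity enter.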
But now, it's high time I tell you a little something about what this generalized theory of probability actually is, how it works, and what it tells us about the world we're living in. First, however, we'll need some background on the mathematics of waves, the concept of phase, and the phenomenon of interference.

Monday, December 5, 2011

The Emergence of Law



For many scientists, the notion of a lawful, physical universe is a very attractive one -- it implies that in principle, everything is explicable through appeal to notions (more or less) directly accessible to us via scientific investigation. If the universe were not lawful, then it seems that any attempt at explanation would be futile; if it were not (just) physical, then elements necessary to its explanation might lie in a 'supernatural' realm that is not accessible to us by reliable means. Of course, the universe may be physical and lawful, but just too damn complicated for us to explain -- this is a possibility, but it's not something we can really do anything about.
(I have previously given a plausibility argument that if the universe is computable, then it is in principle also understandable, human minds being capable of universal computation at least in the limit; however, the feasibility of this understanding, of undertaking the necessary computations, is an entirely different question. There are arguments one can make that if the universe is computable, one should expect it to be relatively simple, see for instance this paper by Jürgen Schmidhuber, but a detailed discussion would take us too far afield.)
But first, I want to take a moment to address a concern (in my opinion, a misplaced one) that some may have about proposing 'explanations' for the universe, or perhaps about the desirability thereof: isn't such a thing terribly reductionist? Is it desirable to reduce the universe, and moreover, human experience within the universe, to some cold scientific theory? Doesn't such an explanation miss everything that makes life worth living?
I have already said some words about the apparent divide between those who want to find an explanation for the world, and those who prefer, for lack of a better word, some mystery and magic to sterile facts, in this previous post. Suffice it to say that I believe both groups' wishes can be granted: the world may be fully explicable, and yet full of mystery. The reason is that even if some fundamental law is known, it does not fix all facts about the world -- or, more accurately, not all facts can be deduced from it: for any sufficiently complex system, there exist undecidable questions about its evolution. Thus, there will always be novelty, always be mystery, and always be a need for creativity. That an underlying explanation for a system's behaviour is known does not cheapen the phenomena it gives rise to; in particular, the value of human experiences lies in the experiences themselves, not in the question of whether they are generated by some algorithmic rule, or are the result of an irreducible mystery.
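That claim about undecidable questions is not hand-waving; it's inherited from the halting problem. For illustration, here's a schematic Python rendering of Turing's diagonal argument -- the names halts and contrarian are hypothetical, and the whole point is that halts cannot actually be implemented:

```python
# Suppose, for contradiction, an oracle existed that always correctly
# reports whether f(x) eventually terminates.
def halts(f, x):
    raise NotImplementedError("no total, correct version of this can exist")

def contrarian(f):
    # Do the opposite of whatever the oracle predicts f does when fed itself.
    if halts(f, f):
        while True:      # loop forever if f(f) was predicted to halt
            pass
    return None          # halt if f(f) was predicted to run forever

# Now consider contrarian(contrarian): it halts if and only if the oracle
# says it doesn't. The contradiction shows no such oracle exists, so some
# questions about a system's future behaviour are undecidable in principle.
```

Any physical system rich enough to implement universal computation inherits this limitation.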

Saturday, November 19, 2011

The Origin of the Quantum, Part III: Deviant Logic and Exotic Probability



Classical logic is a system concerned with certain objects that can attain either of two values (usually interpreted as propositions that may be either true or false, commonly denoted 1 or 0 for short), and with ways to connect them. Though its origins can be traced back to antiquity, and to the Stoic philosopher Chrysippus in particular, its modern form was essentially introduced by the English mathematician and philosopher George Boole (and is thus also known under the name Boolean algebra) in his 1854 book An Investigation of the Laws of Thought, which he intended as a formalization of how humans carry out mental operations. To this end, Boole introduced certain connectives and operations, intended to capture the ways a human mind connects and operates on propositions in the process of reasoning.
An elementary operation is that of negation. As the name implies, it turns a proposition into its negative, i.e. from 'it is raining today' to 'it is not raining today'. If we write 'it is raining today' for short as p, then 'it is not raining today' is represented as ¬p, '¬' thus being the symbol of negation.
Two propositions, p and q, can be connected to form a third, composite proposition r in various ways. The most elementary and intuitive connectives are the logical and, denoted by ∧, and the logical or, denoted ∨.
These are intended to capture the intuitive notions of 'and' and 'or': a composite proposition r, formed by the 'and' (the conjunction) of two propositions p and q, i.e. r = p ∧ q, is true if both of its constituent propositions are true -- i.e. if p is true and q is true. Similarly, a composite proposition s, formed by the 'or' (the disjunction) of two propositions p and q, i.e. s = p ∨ q, is true if at least one of its constituent propositions is true, i.e. if p is true or q is true. So 'it is raining and I am getting wet' is true if it is both true that it is raining and that you are getting wet, while 'I am wearing a brown shirt or I am wearing black pants' is true if I am wearing either a brown shirt or black pants -- but also if I am wearing both! This is a subtle difference from the way we usually use the word 'or': typically, we understand 'or' in the so-called exclusive sense, where we distinguish between two alternatives, either of which may be true, but not both; the logical 'or', however, is used in the inclusive sense, where a composite proposition is true also if both of its constituent propositions are true.
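All of these connectives can be surveyed at a glance in a truth table. A quick Python sketch, with inequality standing in for the exclusive 'or':

```python
from itertools import product

# Truth tables for negation, conjunction, disjunction, and, for contrast
# with the everyday usage, the exclusive or.
for p, q in product([False, True], repeat=2):
    print(f"p={p!s:<5} q={q!s:<5} | "
          f"not p: {(not p)!s:<5} "
          f"p and q: {(p and q)!s:<5} "
          f"p or q: {(p or q)!s:<5} "
          f"p xor q: {(p != q)!s:<5}")
```

The last row (p and q both true) is where the two readings of 'or' part ways: the inclusive disjunction is true, the exclusive one false -- exactly the brown-shirt-and-black-pants case above.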

Saturday, November 12, 2011

Maxwell's Demon, Physical Information, and Hypercomputation



The second law of thermodynamics is one of the cornerstones of physics. Indeed, even among the most well-tested fundamental scientific principles, it enjoys a somewhat special status, famously prompting Arthur Eddington to write in his 1928 book The Nature of the Physical World:
The Law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations—then so much the worse for Maxwell's equations. If it is found to be contradicted by observation—well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
But what, exactly, is the second law? And what about it justifies Eddington's belief that it holds 'the supreme position among the laws of nature'?
In order to answer these questions, we need to re-examine the concept of entropy. Unfortunately, one often encounters, at least in the popular literature, quite muddled accounts of this elementary (and actually quite simple) notion. Sometimes, one sees entropy equated with disorder; other times, a more technical route is taken, and entropy is described as a measure of some thermodynamic system's ability to do useful work. It is wholly unclear, at least at first, how the one is supposed to relate to the other.
I have tackled this issue in some detail in a previous post; nevertheless, it is an important enough concept to briefly go over again.
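As a warm-up, here is the counting view of entropy in miniature: Boltzmann's S = k log W (with k set to 1), applied to a toy 'gas' of 100 coins whose macrostate is simply the number of heads, and where W counts the microstates (exact head/tail arrangements) compatible with that macrostate. A sketch for orientation, not part of this post's main argument:

```python
from math import comb, log

N = 100
for heads in [0, 10, 25, 50]:
    W = comb(N, heads)  # microstates compatible with the macrostate
    S = log(W)          # Boltzmann entropy, in units where k = 1
    print(f"{heads:3d} heads of {N}: W = {W:.3e}, S = {S:.2f}")
```

The all-tails macrostate (a single microstate, S = 0) is maximally 'ordered'; the 50/50 macrostate is realized by overwhelmingly many microstates and so has maximal entropy. On this view, 'disorder' is just a count of macroscopically indistinguishable arrangements -- and it is this count that, in turn, limits a system's capacity for useful work.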

Monday, October 31, 2011

The Origin of the Quantum, Part II: Incomplete Evidence



In the previous post, we had a first look at the connections between incompleteness, or logical independence -- roughly, the fact that for any sufficiently powerful mathematical system, there exist propositions that the system can neither prove nor refute -- and quantumness. In particular, we saw how quantum mechanics emerges if we consider a quantum system as a system only able to answer finitely many questions about its own state; i.e., as a system that contains a finite amount of information. The state of such a system can be mapped to a special random number, an Ω-number or halting probability, which has the property that any formal system can derive only finitely many bits of its binary expansion; this is a statement of incompleteness, known as Chaitin's incompleteness theorem, equivalent to the more familiar Gödelian version.
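For reference, the halting probability has a compact definition: for a universal prefix-free machine U (one where no valid program is a prefix of another), it reads

```latex
\Omega_U = \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

where |p| denotes the length of program p in bits. Prefix-freeness guarantees, via Kraft's inequality, that the sum converges to a real number strictly between 0 and 1. Chaitin's theorem then says, roughly, that a formal system whose axioms amount to n bits of information can determine at most n + c bits of Ω's binary expansion, for a constant c depending only on the system.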
In this post, we will exhibit this analogy between incompleteness and quantumness in a more concrete way, explicitly showcasing two remarkable results connecting both notions.
The first example is taken from the paper 'Logical Independence and Quantum Randomness' by Tomasz Paterek et al. Discussing the results obtained therein will comprise the greater part of this post.
The second example can be found in the paper 'Measurement-Based Quantum Computation and Undecidable Logic' by M. Van den Nest and H. J. Briegel; the paper is very interesting and deep, but unfortunately somewhat more abstract, so I will content myself with just presenting the result, without attempting to explain it in depth.

Tuesday, October 11, 2011

The Origin Of The Quantum, Part I: An Incomplete Phase Space Picture


In the last post, we familiarized ourselves with some basic notions of algorithmic information theory. Most notably, we saw how randomness emerges when formal systems or computers are pushed to the edges of incompleteness and uncomputability.
In this post, we'll take a look at what happens if we apply these results to the idea that, like computers or formal systems, the physical world is just another example of a universal system -- i.e. a system in which universal computation can be implemented (at least in the limit).
First, recall the idea that information enters the description of the physical world through viewing it as a question-answering process: any physical object can be uniquely identified by the properties it has (and those it doesn't have); any two physical objects that have all the same, and only the same, properties are indistinguishable, and thus identified. We can thus imagine any object as being described by the string of bits giving the answers to the set of questions 'Does the object have property x?' for all properties x; note that absent an enumeration of all possible properties an object may have, this is a rather ill-defined set, but it'll serve as a conceptual guide.
In particular, this means that we can view any 'large' object as being composed of a certain number of 'microscopic', elementary objects, which are those systems that are completely described by the presence or absence of one single property, that may be in either of two states -- having or not having that particular property. Such a system might, for instance, be a ball that may be either red or green, or, perhaps more to the point, either red or not-red. These are the systems that can be used to represent exactly one bit of information, say red = 1, not-red = 0. Call such a system a two-level system or, for short, and putting up with a little ontological inaccuracy, simply a bit.
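To make this picture tangible, here's a toy rendering in Python. The finite property list is exactly the idealization flagged above as ill-defined in general, and all the names are merely illustrative:

```python
# Objects as answer strings: fix a (toy) list of yes/no properties, describe
# each object by its answers, and identify objects that agree on all of them.
PROPERTIES = ["red", "round", "heavy"]

def describe(obj):
    # The tuple of answers to 'does the object have property x?'
    return tuple(prop in obj for prop in PROPERTIES)

ball_a = {"red", "round"}
ball_b = {"round", "red"}
brick = {"heavy"}

print(describe(ball_a))                      # (True, True, False)
print(describe(ball_a) == describe(ball_b))  # True: indistinguishable, hence identified
print(describe(brick))                       # (False, False, True)
```

The smallest such object is described by a single answer, one bit -- 'red or not-red' -- which is precisely the elementary two-level system just introduced.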