tom thinks

A thousand miles
A thousand years
A thousand hopes
A thousand fears
Distance, time, emotion
Equations with solution
Real, imaginary, complex
No inversions, all convex
date 2000-11-05:17:12
Physics To get further into the phenomena of unknowability, I've got to talk more about states. How to do this is tricky, and one of the places I've fallen down in trying to explain this in the past. So here's another try.

Previously, I described the way we need two numbers to define the state of a system consisting of a bead moving on a wire--the initial position and the velocity. For various reasons, physicists typically use the velocity times the mass of the bead as the measure of motion, which is called the momentum and given the symbol p. The position is given the symbol x, so any state of the system can be defined by giving a pair of numbers, (x,p). This defines a point on the x-p plane--which is called "phase space" for reasons that are obscure, at least to me.
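For the programmatically minded, here's a tiny sketch in Python (the function and its argument names are just illustrative, not anything standard) of writing such a state down: compute p as the mass times the velocity, and pair it with x to get one point in the x-p plane.

def state(x, velocity, mass):
    # Momentum p is the mass times the velocity; the state is the pair (x, p).
    p = mass * velocity
    return (x, p)

# A 2 kg bead at x = 0.5 m moving at 3 m/s occupies a single point in phase space.
print(state(0.5, 3.0, 2.0))   # (0.5, 6.0)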

Nomenclature is a funny business, and the meanings of names are often lost in antiquity. x and p are also called "canonical co-ordinates", for reasons that Goldstein, the modern guru of classical physics, says, "remained obscure even to contemporaries" of Jacobi, who introduced the term in 1837 (Classical Mechanics, 2nd ed., footnote on p. 342).

In any case, the state of our bead on a wire is defined by a point in this two-dimensional phase space, the x-p plane. There are two ways we'd like to generalize this idea of phase space. One is to consider more degrees of freedom--we need our particles to be able to move in more than one dimension. The other is to consider systems of more than one particle. Unless you're an elementary particle physicist, most of the interesting bits of the world are made up of many-particle systems, and understanding them is the point of the enterprise.

Adding more degrees of freedom is no big deal, and it won't play any role in the rest of what I'm going to say, so feel free to skip the rest of this paragraph. For a particle that moves in three dimensions, (x,y,z), we have a six-dimensional phase space with co-ordinates (x,y,z,px,py,pz) where px is the momentum in the x-direction, py is the momentum in the y-direction and pz is the momentum in the z-direction. Because these directions are all orthogonal to each other (that is, a particle can move in the x-direction, for example, without changing its position in y or z) the various momenta are independent too.

Adding more particles is where things get interesting. Suppose we consider two particles, labelled A and B. Let's consider the two of them on a wire, so to specify the state of the system we need to specify (x,p) for both particles: (xA,pA,xB,pB).

Now here's an interesting question: if we swap the particles, so if they start with (xA,pA,xB,pB) they wind up with (xB,pB,xA,pA), have we described a different state of the system, or not?

The answer is: it depends. It depends on whether or not the particles are distinguishable, or, to put it another way, on whether the difference between the particles is knowable. If it is possible to know which particle is which, then exchanging them produces a new state. If it is not possible to know which particle is which, then exchanging them does not produce a new state.

This is, in my view, the central feature of the quantum world: we can have two particles that are not the same but which are nevertheless not distinguishable. Furthermore, as we shall see, it is possible to determine the number of states available to a system, so the issue here is not whether or not we can know which particle is which, but whether or not the identity of the two particles is knowable as such.

I've introduced all this stuff about position and momentum because it will be necessary to understand how we can count the number of states available to a system, but there's a simple non-physical example of state-counting that might help illustrate things. Suppose we have a system that consists of two coins. The states of this system are labeled by specifying which face of each coin is showing. If the coins are distinguishable, we have four states:

HH
HT
TH
TT

where H stands for "head" and T stands for "tail". But if the two coins are indistinguishable then HT and TH are identical, and there are only three states available:

HH
H/T
TT

where H/T is understood to mean "one coin heads, one coin tails, but no way to tell which is which."
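If it helps, the counting can be made completely mechanical. Here's a small Python sketch (the variable names are just mine) that enumerates the states both ways: as ordered pairs when the coins are distinguishable, and as unordered pairs when they are not.

from itertools import product

faces = ["H", "T"]

# Distinguishable coins: (coin A, coin B) ordered pairs, so HT and TH differ.
distinguishable = set(product(faces, repeat=2))
print(len(distinguishable))       # 4

# Indistinguishable coins: order carries no information, so collapse each
# pair to a sorted tuple. HT and TH become the single state H/T.
indistinguishable = {tuple(sorted(pair)) for pair in distinguishable}
print(len(indistinguishable))     # 3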

It should be clear from this example that if we have a way of counting the states a system could possibly be in, we will have a way of telling if the particles that make it up could be distinguished by any means whatsoever. The number of states does not depend on what is known, but on what can be known.

Over the next few days I'll introduce various cases where we can determine the number of states available to a system, and show how the experimentally measured numerical values demonstrate again and again that it is possible to have particles that are not identical and that are not distinguishable.

The identity of indiscernibles is sometimes called Leibniz's Law. An incidental consequence of the facts of reality is that Leibniz's Law is not generally true, and we can prove this as a matter of experiment.
Metaphysics Richard Taylor, in chapter 5 of Metaphysics (p. 33), has the following to say about determinism:

The sea, at any exact time and place, has exactly a certain salinity and temperature.... The wind at any point in space has at any moment a certain direction and force, not more nor less.

Now this is, to put it plainly, false.

Consider temperature in particular: it is an emergent property of matter. A single atom does not have a temperature; it has a velocity. At an "exact time and place" there is, presumably, at most a single atom, so, far from having a certain temperature, there is no temperature at all.

If this is the kind of thinking that is used to justify determinism, then there's a problem. And there is a problem, having to do with emergent properties generally.

Adam Reed gave the following example of an emergent property. There are various basic logic circuits whose behavior is fully specified by their truth table. A NOR gate, in particular, has the following truth table:

Input A   Input B   Output
T         T         F
F         T         F
T         F         F
F         F         T

The output is true if and only if neither input is true, and the output (for our purposes) can be considered to change instantaneously when one of the inputs changes. The output immediately reflects any changes to the inputs, and it is this fact that allows us to write a truth table for it: the truth table just reflects how the inputs instantaneously determine the outputs.

We can connect NOR gates together in various ways. In particular, we can take two of them and hook the output of each to one of the inputs of the other, and get a behavior that is not defined by a truth table at all! In this case, when both the remaining inputs are false, the output encodes which input was on last--the inputs no longer instantaneously determine the outputs, but rather the system exhibits a new, emergent property called "memory."
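Here's a rough Python sketch of the cross-coupled pair (the names and the lock-step update scheme are mine, chosen for simplicity, not a claim about how real hardware settles). Pulse one input, drop both inputs back to false, and the output still records which input was on last.

def nor(a, b):
    # NOR gate: the output is true only when neither input is true.
    return not (a or b)

def latch_step(s, r, q, q_bar):
    # One update of two cross-coupled NOR gates: each gate takes one external
    # input (s or r) and the other gate's current output.
    return nor(r, q_bar), nor(s, q)

def settle(s, r, q=False, q_bar=True, steps=10):
    # Iterate until the pair of gates stops changing.
    for _ in range(steps):
        new_q, new_q_bar = latch_step(s, r, q, q_bar)
        if (new_q, new_q_bar) == (q, q_bar):
            break
        q, q_bar = new_q, new_q_bar
    return q, q_bar

# Pulse S, then drop both inputs to false: the output remembers S was on last.
q, q_bar = settle(s=True, r=False)
q, q_bar = settle(s=False, r=False, q=q, q_bar=q_bar)
print(q)   # True

# Pulse R, then drop both inputs to false: the memory flips.
q, q_bar = settle(s=False, r=True, q=q, q_bar=q_bar)
q, q_bar = settle(s=False, r=False, q=q, q_bar=q_bar)
print(q)   # False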

This is a beautiful example (I'll flesh it out with a figure when I write this up in more detail) of the sort of thing that gets determinists and eliminative reductionists of all stripes in trouble.

The problem is just this: the law of causality states "What a thing is now and only what it is now causes what it does now." It couples identity to action and action to identity--it means amongst other things that we can know what a thing is by knowing what it does. But this means that if we grant the existence of emergent properties, which on the basis of the above example would be hard not to do, then we are also granting the existence of emergent causality. That is, if we believe that the way things are causes what they do, and their emergent properties are part of the way things are, then their emergent properties are part of the cause of the actions of things.

Looked at another way, it is impossible to account for the behavior of a system with memory using truth tables of the kind given above. Any account must be couched in part in terms of memory -- the language we must use to describe the system has to include this concept. But because causal power and explanatory power are one and the same, we are again committed to the causal power of emergent properties (which, from the law of causality, we were committed to anyway--this is just a bit of reinforcement).

The law of causality as I've presented it here is the strongest form I've given, but I think this form is necessary to capture all the work we want it to do, and to avoid the risk of miracles.
Poem Sometimes my life seems like a grand minimization problem, subject to all the difficulties of finite precision, numerical instability and false minima. But I know a solution is out there--that the surface is convex and eventually will get close enough to a quadratic form that I can jump to it in a single step. In the meantime, I surf joyfully along the real axis, riding the wave of time.
Play Spent a good part of today playing with the kids--rollerblading and playing soccer--which is pretty cool for the first week of November. I love rollerblading; the sense of freedom and speed is amazing, as is the sense that if I screw up, I'll get hurt. This definitely helps focus my attention.

When I'm out with the kids I take it easy. They're both improving by leaps and bounds, and by this time next year will probably be turning circles 'round me. It's an enormous delight to watch them learn, to see them try and fall and pick themselves up and try harder, and never stop having fun.
