The Big Bang

May 30, 2012

One of the theoretical digital physics endeavors that has received the most attention in recent years is Stephen Wolfram's program to find an ultimately simple universal algorithm: the rule that defines all of nature. Despite the fact that he's a brilliant man with extraordinary resources at his disposal, and though I'd love for him to succeed, I don't think he's going to. In fact, I'm pretty certain of it. In this post I'm going to tell you why.

But first, let me tell you my understanding of what Wolfram is doing, as I may be missing something. In his book, A New Kind of Science, Wolfram makes the point that the only way to tell what kind of output algorithms are going to produce is by running them. Given that simple algorithms produce lots of interesting patterns that crop up in nature, he recommends exploring as many as we can, and looking for ones that produce useful effects. And for various reasons similar to those I’ve outlined on this blog, he suggests that we might be able to find the rule for the universe somewhere in that stack of programs, and that we’re fools if we don’t at least try to look. His plan, then, is to sift through the possibilities looking for those that produce a network structure that shows the properties of an expanding spacetime. In other words, a Big Bang.
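Wolfram's usual illustration of this point is the elementary cellular automata. Here's a minimal sketch (my own, not code from A New Kind of Science) that runs Rule 30 from a single live cell. The larger point survives the simplicity of the example: nothing short of actually executing the rule tells you that this particular one produces a chaotic triangle rather than, say, a repeating stripe.

```python
def step(cells, rule=30):
    """Apply one update of an elementary CA with wraparound edges.

    Each cell's new value is the bit of `rule` indexed by its
    three-cell neighborhood read as a binary number.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=63, steps=20, rule=30):
    """Evolve the CA from a single live cell and collect every row."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rows

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Swap in `rule=110` or any other value from 0 to 255 and the character of the output changes completely, which is exactly why Wolfram advocates exhaustive search over armchair prediction.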

My main concern with this springs from the importance of conservation laws in physics. Though the universe has changed size, it contains a fixed amount of stuff. So far as we can tell, there is essentially the same amount of stuff now as there was at the beginning of the universe, because properties like charge and lepton number are conserved. Certainly you can do things like create charged particle pairs spontaneously from photons, and so forth, but this doesn't get around the fact that everything we know about cosmology suggests that the early universe was dense.

If Wolfram finds an algorithm that produces spacetime from scratch, where does all the stuff come from? The only solution, it would seem, is to have particles spontaneously generated as the network increases in size. But this isn't what we see. If it were true, there'd be a lot more going on in the gaps between the galaxies than we witness. So, while finding an algorithm that produces spacetime geometry would certainly be an interesting result, in my opinion, it'd be highly unlikely to be physically relevant. Hence, so long as he's looking for spacetime, my guess is that he'll be out of luck.

So is Wolfram's approach doomed? Far from it, I would propose, so long as we change the kind of network that we're looking for. After all, just because we need an algorithm that eventually features conservation laws doesn't mean we can't have one that builds up a large amount of stuff before it builds the space to scatter it in. In other words, just because the Big Bang is where we measure the start of the universe from, there's nothing to say that there wasn't a prior era in which it was the stuff that was expanding instead of the space. If this is true, we should look for an algorithm that experiences a phase transition.

We already know of some algorithms that do this. Langton’s Ant experiences something of this sort. So does the Ice Nine cellular automaton as studied by the inspiring Else Nygren. Sadly, neither of these algorithms operates on networks, but they make it clear that this kind of behavior is not hard to find.

My personal guess is that if Wolfram’s explorations pay off, he will find a class of algorithms that produce a knotted tangle of nodes which, after some large number of iterations, suddenly start unfolding like a flower. We have to hope that there is an entire family of algorithms that do this. Otherwise, if we need to accumulate a full universe’s worth of stuff prior to seeing any spatial expansion, we could be waiting a very long time indeed for the algorithm to do anything recognizable.