## Causal Sets and Leaning Towers

Last year I had the incredible good fortune to spend a couple of months collaborating with Tommaso Bolognesi at CNR-ISTI, in Pisa, Italy. Tommaso runs his own research program into the interface between computation and physics and is a champion of the Digital Physics cause. He hired me to see if together we could answer a very specific question:

*Is it possible to build networks that have the same properties as spacetime using simple algorithms, and if so, how?*

I’ve had plenty to say on the subject of modeling space before this. However, what Tommaso was looking for was very specific. He wanted us to find ways to build *causal sets*. Causal set theory is probably the point of closest approach between digital physics and more mainstream quantum gravity research and it’s a fascinating subject. In a nutshell, causal set theorists believe that spacetime is most usefully thought of as a discrete structure and that the way to model it is to try to mimic the kinds of relationships between events that we see in relativity. To achieve this, they connect nodes using something called a partial order—a set of relationships that define which nodes must come before others, but which falls short of providing an exact numbering for all nodes.
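In code, a partial order like this is just a set of ordered pairs. A toy sketch (the event labels and relations here are mine, purely for illustration): events `b` and `c` both lie after `a` and before `d`, but neither precedes the other, so no total numbering is forced on them.

```python
# A toy causal set: four events with a partial order given by "precedes".
# The relation is transitive and acyclic, but b and c are incomparable --
# the order falls short of numbering every node.
precedes = {
    ("a", "b"), ("a", "c"),   # a is in the causal past of both b and c
    ("b", "d"), ("c", "d"),   # both b and c precede d
    ("a", "d"),               # required by transitivity
}

def related(x, y):
    """True if x and y are causally comparable (one precedes the other)."""
    return (x, y) in precedes or (y, x) in precedes

print(related("a", "d"))  # True: a precedes d
print(related("b", "c"))  # False: b and c are causally unrelated
```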

Broadly speaking, causal set theorists use two methods to build their sets. The first, called *sprinkling*, deposits nodes at random onto a surface and hooks them together based on the geometry of that surface. The second, called *percolation dynamics*, adds nodes one by one to a set, randomly adding links from existing members of that set to each new node.
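A minimal sketch of sprinkling into a flat 1+1-dimensional spacetime (the function name and the light-cone-coordinate trick are my own choices, not standard machinery): in coordinates u = t + x and v = t − x, a diamond-shaped region becomes the unit square, and one point causally precedes another exactly when both of its coordinates are smaller.

```python
import random

def sprinkle(n, seed=0):
    """Sprinkle n points uniformly into a diamond-shaped region of
    1+1-dimensional Minkowski space.  In light-cone coordinates
    u = t + x, v = t - x, the region is the unit square, and point i
    causally precedes point j exactly when u_i < u_j and v_i < v_j."""
    rng = random.Random(seed)
    points = [(rng.random(), rng.random()) for _ in range(n)]
    relations = {(i, j)
                 for i, (ui, vi) in enumerate(points)
                 for j, (uj, vj) in enumerate(points)
                 if ui < uj and vi < vj}
    return points, relations

points, relations = sprinkle(500)
# In two dimensions roughly half of all pairs end up causally related.
print(len(relations) / (500 * 499 / 2))
```

Note how the geometry of the background does all the work here: the links are read off from coordinates that had to exist before the set did, which is exactly the problem discussed below.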

Sprinkling is useful for exploring how causal sets behave but it has a huge problem: in order to construct the discrete structure of spacetime, you have to deposit your points onto a smooth spacetime first! Clearly, if we want to come up with a background-independent theory of physics, we need to build the sets some other way. On the other hand, percolation dynamics has all the nice statistical properties that physicists would like to see and doesn’t need a background, but sadly doesn’t actually produce graphs that look like spacetime (though people are working on that).

The right solution would seem to be to come up with a third way: a process that produces the right structures without needing a background surface. However, this comes with problems. The key features that differentiate spacetime-like causal sets from others are *dimensionality* and *Lorentz invariance*.

Dimensionality essentially says that we should expect the graph that we build to have some consistent number of dimensions, rather than just being a tangled mess. Lorentz invariance is a little trickier. What it implies is that if you build your network first and then lay the nodes onto a flat surface afterward, the positions of the nodes should appear random. There should be no way you can stretch or squish the network to make it look otherwise. This is vitally important because in order to treat every relativistic reference frame the same way, as special relativity says we must, we need about the same number of links between nodes in each frame.

Another way to say this is that, thanks to Einstein, we know that no matter how fast we’re moving, space will always feel the same to us. The way a causal set works is that each link corresponds to a step through time and space taken at a certain speed. So if, for some speed of travel, our network doesn’t have enough links, it’s definitely not going to feel the same to someone traveling through it. If this happens, our model has failed. The only way that people have ever found to make Lorentz-invariant causal sets is to have the network be random.
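This frame-independence can be checked directly for a sprinkled set. In 1+1 dimensions a Lorentz boost just rescales the light-cone coordinates u = t + x and v = t − x by e^φ and e^−φ; both factors are positive, so no comparison flips and the causal order survives untouched (a sketch; the setup and function names are my own):

```python
import math
import random

def causal_order(points):
    """Causal relations of points in light-cone coordinates (u, v):
    i precedes j exactly when u_i < u_j and v_i < v_j."""
    return {(i, j)
            for i, (ui, vi) in enumerate(points)
            for j, (uj, vj) in enumerate(points)
            if ui < uj and vi < vj}

def boost(points, rapidity):
    """A Lorentz boost in 1+1 dimensions multiplies u by e^rapidity and
    divides v by it; both factors are positive, so the order of every
    coordinate comparison is preserved."""
    s = math.exp(rapidity)
    return [(u * s, v / s) for u, v in points]

rng = random.Random(7)
points = [(rng.random(), rng.random()) for _ in range(200)]
print(causal_order(points) == causal_order(boost(points, 0.8)))  # True
```

The causal order is identical in every boosted frame; what Lorentz invariance demands on top of this is that the *statistics* of the links look the same in every frame too, and randomness is the only known way to get that.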

My collaboration with Tommaso was founded on a neat way around this problem that works like this:

- Because any causal set we can build is finite, it can only ever *approximate* perfect randomness.
- Furthermore, for a finite network of a given size, we can always find some algorithm that can approximate that level of randomness through a deterministic process.
- Thus, no matter how big our network needs to be, we should still always be able to find an algorithm that could give rise to it.
- This will always be true so long as we believe that spacetime is discrete, that the universe has finite size, and that it has existed for finite time.
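The argument above can be illustrated with an entirely deterministic "random" stream. Nothing below is standard causal-set machinery: the constants are those of a classic textbook linear congruential generator, and `grow` is a hypothetical stand-in for any growth process that consumes a stream of numbers.

```python
def lcg(seed):
    """A fully deterministic pseudo-random stream (a classic linear
    congruential generator).  Nothing here is random, yet for a finite
    run its output approximates randomness well enough to drive a
    percolation-style growth process."""
    state = seed
    while True:
        state = (1103515245 * state + 12345) % 2**31
        yield state / 2**31

def grow(n, p, stream):
    """Percolation-style growth driven by any number stream: add element
    `new`, linking each earlier element whenever the stream says so."""
    relations = set()
    draws = iter(stream)
    for new in range(n):
        for old in range(new):
            if next(draws) < p:
                relations.add((old, new))
    return relations

causet = grow(100, 0.5, lcg(42))
# With p = 0.5, roughly half of the 100*99/2 = 4950 pairs get linked,
# and rerunning with the same seed rebuilds exactly the same network.
print(len(causet))
```

The point is not that an LCG builds spacetime; it is that "looks random" and "was built by a deterministic rule" are perfectly compatible for any finite structure.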

In essence, what this tells us is that just because the network we want to build needs to look random, that doesn’t mean that we can’t use a completely non-random method for building it. This is all great as far as it goes, but it leaves us with an enormous problem: how to find an algorithm that can build spacetime.

In the two months we had, Tommaso and I didn’t manage to crack this problem (otherwise you would have heard about it on the news by now) but we learned some fascinating things along the way. I hope to share some of them with you in my later posts.

However, in the meantime, there are plenty of really excellent introductory papers on causal sets that are very approachable for those who are interested. While my favorite approach to discrete physics is a little different from the causal set methodology, I can recommend this field very highly to anyone interested in learning more about quantum gravity without taking on a full-time career as a string theorist.

Simple finite causal sets can be used to model the electron and its cloud formations, yielding Bohr’s formula and QM. The causal link serves as the quantum, and everything is built of quanta. The reduction of physics to causal sets is somewhat to be expected from the foundational work of Russell and Whitehead, with their event ontology. The definition of energy ratios as frequency ratios is the key to the reduction.

To see the construction, search “Causal Set Theory and the Origin of Mass-ratio.”

Another approach is to just model the basic particles as simple causal sets, which works out so well that it cannot be an accident. The key to this approach is to notice that relative frequency ratios are inherent in causal sets, and these may serve physics as energy ratios, in accord with Planck’s E=hf. This implies that the causal link is the quantum of action, so that the diagrams constitute quantum schematics of the particles.

Hi Carey,

Thank you loads for drawing my attention to your ideas. I haven’t fully digested them yet but have started reading. I have two questions for you right off the bat.

The structures that you’re using here to describe particles are regular. How are you reconciling this with the need for Lorentz invariance? Are you envisaging that such structures are superimposed onto a Lorentz-invariant background causal set of some sort? If so, do you have a model for how particle interactions would then happen?

Also, your paper doesn’t appear to include a dynamical process for evolving such structures. Do you have a picture of how that would work?

Alex

Hi Alex,

There is a premise of “causally connected universe,” which means that all the neutrino formations and electron formations (free electrons plus clouds) connect to one another, forming what there is of our 4-D manifold (since the neutrino/electron formations are 4-D structures themselves). There are holes or gaps in the resulting “connecticum,” which delineate the locally separate “particles” (oscillating neutrinos and atomic electron clouds) and compromise the uniformity of the manifold. There is also the disparate inflation of the time metric for locally separate sequences of disparate extent, which further compromises the overall uniformity.

I don’t believe that a background set of greater uniformity is needed or wanted, but I haven’t had an eye on Lorentz invariance. In spite of the degraded uniformity of the 4-D manifold of this theory, there is a strict uniformity in the conjugate relation between the energy of any bounded region and the bounding time interval in which that region elapses. This may be the only invariance that is crucial.

The only particle interaction that I’ve diagrammed is that between photons and electron clouds in the modeling of Bohr’s formula. An isolated stable particle, persisting through time, is modeled by the chained repetition of a finite cycle-structure that is characteristic of the type of particle, accounting for its mass. So non-perturbative behavior, without interaction, is the simplest to model. When particles meet, they share nodes. The time-energy rule governs the structure of the activity to follow, which may be a single product particle of combined energy values from the colliding incident particles, or dissolution into free photons with determinate frequencies, or the acquisition, or loss, of charge, or color-charge, in the transition from inputs to outputs of the collision. The first alternative, a single product particle, generates a more extensive structure from less extensive inputs, which is how structures can evolve on this theory.

I wish I could think intelligently about Lorentz invariance, but when I came upon the structural definition of energy and its quantum, I spun up the theory from the simplest structures that are logically possible, finding “low-hanging fruit” at every turn. The combinatorics of collision and interaction are messy by comparison, and require the patience and ambition of more difficult mathematics. I am hoping that others, like yourself maybe, will see the simple concept of particles and mass at the formative stage of this theory, and find the incentive to make the next developments. — Carey