"Science is the belief in the ignorance of experts!" - Richard Feynman
Is it enough to say that mass and energy codify information, or is there a more fundamental relationship? Does the apparent existence of physical limits such as the speed of light and the indivisibility of quarks suggest value in an alternative description of physical phenomena? Is information really a form of mass and energy, perhaps the most fundamental form? Is the Holographic Paradigm more than just a philosophical exercise?
The Twentieth Century will likely go down in the history of science as the century of the discovery of limits. Our universe may be infinite in extent, yet everything we can observe lies within a finite diameter. There appears to be a family of "smallest particles," indivisible and perhaps even unobservable: the quarks. Heisenberg tells us that we can specify either the position or the momentum of a particle to arbitrary precision, but not both. On a very small scale, various properties are not continuous but take on only certain "quantum" values, strictly avoiding all others. There are many other examples, including what is perhaps the most important: the universal limit on relative velocity, the speed of light.
Even information processing has important limits. Indeed, one formulation of Relativity is based on the limits of our ability to transfer information from one place to another. This leads to a limit on computational power. There are only two ways to make a computer go faster: build the same thing out of faster parts, or do more than one thing at a time. Both methods are ultimately limited by our ability to signal between two points. The fastest known way to do that is with light. When light is too slow, we must move the two points closer together, leading to the limit on computations per second, and to the interesting fact that "bigger" computers must be physically smaller. In fact, my interest in faster-than-light (FTL) travel has always centered on computation: you give me FTL and I'll give you arbitrarily large computing power!
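The signaling limit can be made concrete with a rough, back-of-the-envelope calculation. This is only a sketch: the "one light-crossing per cycle" model and the machine sizes are illustrative assumptions, not engineering figures.

```python
# Rough illustration of the light-speed signaling limit on computers.
# If one clock cycle must allow a signal to cross the whole machine,
# then the machine's diameter bounds its synchronous clock rate.

C = 299_792_458.0  # speed of light in vacuum, m/s

def max_clock_hz(diameter_m: float) -> float:
    """Upper bound on clock rate for a machine whose farthest parts
    are diameter_m apart, assuming one light-crossing per cycle
    (an idealization; real machines fall far short of this)."""
    return C / diameter_m

# Shrinking the machine raises the bound: "bigger" computers
# must be physically smaller.
print(f"{max_clock_hz(10.0):.2e} Hz for a 10 m machine")   # 3.00e+07 Hz
print(f"{max_clock_hz(0.01):.2e} Hz for a 1 cm machine")   # 3.00e+10 Hz
```

Halving the diameter doubles the attainable rate, which is the sense in which computational power and physical size trade off against each other.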
The Nature of Computation
What do computers do? This question is easy only at a superficial level: they compute. But what is computation? At any given instant, a computer maintains what is known as a "state." Computation is a systematic progression from state to state. But this notion is highly abstract. Ask any filmmaker who has ever tried to show a computer in operation: mostly we are shown irrelevancies, such as blinking lights or spinning tape drives, because the actual work the computer is doing is very hard to observe.
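The "systematic progression from state to state" can be sketched with a toy machine. The particular state (a single integer) and transition rule (the Collatz rule) are arbitrary choices for illustration; any deterministic rule would do.

```python
# A toy illustration of computation as state-to-state progression:
# the machine's entire "state" is one integer, and each tick applies
# a fixed rule to produce the next state.

def step(state: int) -> int:
    """One tick of the machine: map the current state to the next."""
    return state // 2 if state % 2 == 0 else 3 * state + 1

def run(state: int, ticks: int) -> list[int]:
    """Record the trajectory through state space -- the 'computation'."""
    trajectory = [state]
    for _ in range(ticks):
        state = step(state)
        trajectory.append(state)
    return trajectory

print(run(6, 8))  # [6, 3, 10, 5, 16, 8, 4, 2, 1]
```

Nothing observable "happens" except that the state changes, which is exactly why filming it is hopeless.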
Computers manipulate information. We can describe in detail the data given to a computer, the programs it executes, the algorithms and mathematics that these programs implement, the various hardware representations of programs and data, and the results or output which the computer returns to us. In some sense this does indeed describe what a computer is doing, but at the simplest level, a computer is merely a large mass that consumes power.
It is not easy to understand what computers do, at least not in the sense that we normally understand machines. A car consumes power and produces motion. A computer consumes power and produces...what? Perhaps consuming power and producing motion is an apt analogy: can we envision a computer as a machine which translates energy into movement in some abstract state space? Can we usefully think of a computer "latching onto" some information and "motoring" through a transformational highway?
A computer, then, is a machine which manipulates something called information, requiring intricate mass arrangements and consuming energy as it does so. Perhaps we should look more closely at the relationship between mass, energy and information.
Mass and Energy as Data Structures
What is an electron? Physicists tell us that it is an elementary particle with a certain mass, charge, spin, and so forth. There are currently eight properties associated with an electron. Anything which has these eight properties, and no (known) others, is an electron. These "observable" properties all require interactions with other things in order to be quantified. In other words, these properties are "measured" by relating them to properties of other objects, which are "measured" relative to the electron or still other objects, and so forth. These properties have no intrinsic values, only relative values.
In other words, an electron sounds a lot like a data structure. If one thinks of these eight properties as eight "fields" in a data record, and defines the values of each field relative to values in other records, one has captured the nature of an electron. If we write a program for this and run it in some computer, have we created an electron? Of course not, because the resulting program execution does not correctly interact with "real" electrons. So maybe there is some kind of ninth property needed which specifies a distinction between "real" electrons and "simulated" electrons.
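The record analogy can be put directly into code. This is purely illustrative: the class name, the fields shown (the remaining five are elided, as in the text), and the conjectured ninth "real" flag are stand-ins, not a physical model.

```python
# A sketch of the "electron as data structure" idea: eight observable
# properties as fields of a record, with values meaningful only
# relative to other records.

from dataclasses import dataclass

@dataclass(frozen=True)
class Electron:
    mass: float    # relative to some reference mass record
    charge: float  # relative to some reference charge record
    spin: float
    # ... five further fields for the remaining observable properties
    real: bool = False  # the conjectured "ninth property": distinguishes
                        # "real" electrons from simulated ones

simulated = Electron(mass=1.0, charge=-1.0, spin=0.5)
print(simulated.real)  # False -- a data record, not a "real" electron
```

Two records with equal fields are indistinguishable, which is the point: anything with these properties, and no others, is an electron.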
But the idea of an electron as a data structure is seductive. Is it possible that the state of an electron is the same kind of slippery concept as the state of a computer? Is it possible that an even more fundamental relationship exists between information and mass/energy than the simple one of mass/energy "coding" informational states?
There are many examples in nature which suggest an intimate relationship between mass/energy and information. All electrons apparently have the same charge, at least to the limits of our ability to measure, which is quite good. This is easily explained if the charge of an electron is some kind of universal constant, perhaps accessed as needed by the computer running the simulation of the universe. For that matter, quantization is easy to understand if that simulator is using integer arithmetic. Even the limit of the speed of light can be understood as a limit on the ability to change state.
How's that last part again? Sorry, in order to understand this state change limitation, we need a small digression into the world of holograms and transform mathematics.
The Holographic Paradigm (WILB85)
Think of a holographic transform as a relationship between two representations of a dataset. We can never have access to THE data, only to representations of data. In holography there is a spatial representation, commonly called a "photograph," and a phase representation, commonly called a "hologram." It is not the commonly understood three-dimensional aspects of holographic transforms that we focus on here, but the relationships between the two representations. Each pixel of the hologram is a unique function of the entire photograph. That is, in order to compute the value of a single pixel of a hologram, we must take into account each and every pixel in the photographic representation. Thus, each pixel of a hologram is a kind of "point of view," capturing something unique about the entire photograph. Surprisingly, each pixel of the photograph is a unique function of the entire hologram, and captures its own "point of view" about the hologram. Each representation therefore contains "all the data," and you can go back and forth between them.
Because of this local "point of view" aspect of holography, every subset of either representation can be used to form the other representation. Any subset smaller than the full set will necessarily form the other representation degraded in some way, but the degradation in the other representation will be global, not local! In simple terms, you can tear a hologram in half, and each half can generate the ENTIRE photograph uniformly degraded by some kind of noise. Same for the photograph. Every part of each representation contains information about the entire alternate representation.
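Both properties can be demonstrated with a discrete Fourier transform, which I use here as a stand-in for the holographic transform (an analogy, not a claim about optics): every coefficient depends on every sample, and a torn-off subset of coefficients still reconstructs all samples, with the loss smeared globally rather than deleting any one region.

```python
# The photograph/hologram pair behaves much like a Fourier pair.
import cmath

def dft(xs):
    """Discrete Fourier transform: every output depends on every input."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(xs)) for k in range(n)]

def idft(cs):
    """Inverse transform, back to the 'spatial' representation."""
    n = len(cs)
    return [sum(c * cmath.exp(2j * cmath.pi * k * i / n)
                for k, c in enumerate(cs)).real / n for i in range(n)]

spatial = [0.0] * 16
spatial[5] = 1.0                      # the "photograph": one bright pixel

phase = dft(spatial)                  # the "hologram"
torn = [c if k < 8 else 0.0           # tear the hologram in half
        for k, c in enumerate(phase)]

degraded = idft(torn)
print(round(degraded[5], 4))          # 0.5 -- the pixel survives, dimmed
faint = sum(1 for i, d in enumerate(degraded) if i != 5 and abs(d) > 1e-9)
print(faint)                          # 8 -- the loss shows up as faint
                                      # noise spread over other pixels
```

With the full set of coefficients the inverse transform reproduces the photograph exactly; with half of them, every pixel is still recovered approximately, and no region is simply missing.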
Karl Pribram and David Bohm (PRIB76) have suggested that the brain functions as a holographic transformer between "reality" and our mental representation of the world. They argue that if our mental representation involves spatial dimensions, objects, motions and so forth, then "reality," that is, the world on the "other side" of the brain's holographic transform, must be a phase space. If this is true, then we have been describing nature under the hallucination that three dimensions and time exist "out there," when in fact they exist only in our minds.
Suppose the universe is more like a hologram than a photograph, and it is our internal model which is more like a photograph. Let us call the hologram the "phase domain," and the photograph the "spatial domain." Consider the speed of light in this context. The transmission of information from one point to another in the spatial domain corresponds to the simultaneous modification of all points in the phase domain. Each point in the phase domain is "running" a grossly simplified representation of all of the spatial domain, from a unique perspective. To represent any change in the spatial domain, all points in the phase domain must be modified.
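The "all points must be modified" claim can be checked directly, again using a discrete Fourier transform as an illustrative stand-in for the holographic transform: a change at one spatial point changes every phase coefficient.

```python
# Changing ONE point in the spatial domain changes EVERY point in
# the phase domain.
import cmath

def dft(xs):
    """Each coefficient is a function of the entire input."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(xs)) for k in range(n)]

before = [0.0] * 8
after = list(before)
after[3] = 1.0                    # one local change in "space"

changed = [abs(a - b) > 1e-12
           for a, b in zip(dft(after), dft(before))]
print(all(changed))               # True: all eight phase points moved
```

A "local" event in the spatial domain is thus a global update in the phase domain, which is the picture behind the state-change reading of the speed-of-light limit.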
It is in this context that the statement was made that the limit of the speed of light can be understood as a limit on the ability to change state. If the universe is "really" a phase domain, then the speed of light limitation is simply our perception of limitations on the ability of the universe to change. Going faster than the speed of light would amount to increasing the local computational speed of the universe's phase domain.
It is not my purpose here to argue the merits of this perspective, but to suggest value in re-examining (and possibly reformulating) physics in this light, perhaps leading to an alternative and potentially insightful understanding of some of the limits we have encountered in physics, and to make one additional provocative conjecture.
Is it possible that e=mc**2 is too limiting a conservation law? Energy has been shown to be a "tenuous" form of matter; conversely, matter is "dense" energy. Is it possible that the same type of relationship exists between information and energy? Specifically, is there some sense in which information or data structures can be shown to be "tenuous" energy, while energy is "dense" information? Is there a corresponding equation relating information to energy? (i=ec**2 comes to mind, satisfying a human need for symmetry, and leading to: i=ec**2=(mc**2)c**2=mc**4). If so, then information must be considered in any formulation of the laws of conservation of mass and energy.
Summary and Conclusions
Could it be that the relationships between information, energy and mass we find everywhere are telling us something fundamental about the universe? Are mass and energy really "programs" running in some phase domain informational computer? Do we have it backwards, and computers are simply a clumsy attempt to "latch onto" the most tenuous form of matter, information?
I think it is likely that our discoveries of physical limits in the universe this century are the harbingers of yet another convulsive reorganization of science. What form it will take remains to be seen, but it will undoubtedly require the incorporation of information at a much more fundamental level than is found today.
(PRIB76) K. Pribram and D. Bohm, in Consciousness and the Brain, G. Globus, editor, 1976.
(WILB85) K. Wilber, editor, The Holographic Paradigm, New Science Library, Shambhala, 1985.