Chapter 1: Cosmological Constant, constant?

John D. Barrow is the new professor of mathematical sciences at the Department of Applied Mathematics and Theoretical Physics and director of the Millennium Mathematics Project at the University of Cambridge, UK. He says in the scientific journal New Scientist, 24 July 1999:29-32, under the heading, "Is nothing sacred?":

"Ever since 1905, when Albert Einstein revealed his special theory of relativity to the world, the speed of light has had a special status in the mind of physicists. In a vacuum, light travels at 299 792 458 metres per second, regardless of the speed of its source. There is no faster way of transmitting information. It is the cosmic speed limit. Our trust in its constancy is reflected by the pivotal role it plays in our standards of measurement. We can measure the speed of light with such accuracy that the standard unit of length is no longer a sacred metre bar kept in Paris but the distance travelled by light in a vacuum during one 299 792 458th of a second.

"It took cosmologists half a century to the full cosmological importance of a finite speed of light. It divides the Universe into two parts: visible and invisible. At any time there is a spherical 'horizon' around us, defined by the distance light has been able to travel since the Universe began. As time passes, this horizon expands. Today, it is about fifteen billion light years away.

"This horizon creates a host of problems for cosmologists. Because no signals can travel faster than light, it is only within the horizon that light has had time to establish some degree of uniformity from place to place in terms of density and temperature. However, the Universe seems more coordinated than it has any right to be. There are other ways, too, in which the Universe seems to have adopted special characteristics for no apparent reason. Over the years, cosmologists have proposed many different explanations for these characteristics - all with their attendant difficulties. In the past year, though, a new explanation has come to light. All you have to do is break one sacred rule - the rule that says the speed of light is invariable - and everything else may well fall into place.

"The first of the problems cosmologists need to explain is a consequence of the way the cosmological horizon stretches as the Universe expands. Think about a patch of space which today reaches right to the horizon. If you run the expansion of the Universe backwards, so that the distances between objects are squeezed smaller, you find that at some early time T after the big bang that same patch of space would lie beyond the horizon that existed then. In other words, by time T there would not have been enough time for light to have travelled from one edge of the sphere bounded by our present horizon to the opposite side.

"Because of this, there would have been no time to smooth out the temperature and density irregularities between these two patches of space at opposite extremes of our present horizon. They should have remained uncoordinated and irregular. But this is not what we see. On the largest cosmic scales the temperature and density in the Universe differ by no more than a few parts in one hundred thousand. Why? This is the horizon problem.

"Another closely related cosmological problem arises because the distribution of mass and energy in our Universe appears to have been close to the critical divide that separates universes destined to expand for ever from those that will eventually collapse back to a 'big crunch'. This is problematic because in a universe that contains only the forms of matter and radiation that we know about, and any deviation from the critical divide grows larger and larger as time passes. Our Universe has apparently been expanding for nearly 15 billion years. During which time its size has increased by a factor of at least 1032. To have remained so close to the critical divide today the Universe must have been incredibly close to this distribution of mass and energy when it started expanding - an initial state for which there is no known justification. This is the flatness problem, so called because the critically expanding state requires the geometry of space to be flat rather than curved.

"The third major problem with expansion of the Universe is that Einstein's theory of gravitation - general relativity - allows the force of gravity to have two components. The better known one is just a refinement of Newton's famous inverse-square force laws. The other component behaves quite differently. If it exists, it increases in direct proportion to the distance between objects. Lambda was the Greek symbol used by Einstein to denote the strength of this force in his theory. Unfortunately, his theory of gravitation does not tell us how strong this long-range force should be or even whether it should push masses apart rather than pull them together.

"Particle physicists have for many years argued that this extra component of the gravitational force should appear naturally as a residue of quantum effects in the early Universe and its direction should be opposite to that of Newton's law of gravity: it should make all masses repel one another. Unfortunately, they also tell us that it should be about 10120 times larger than astronomical observations permit it to be. This is called the lambda problem.

Expanding fast

"Since 1981, the most popular solution to the flatness and horizon problems has been a phenomenon called inflation that is said to have occurred very soon after the big bang, accelerating the Universe's expansion dramatically for a brief interval of time. This allows the region of the Universe seen within our horizon today to have expanded from a much smaller region than if inflation had not occurred. Thus it could have been small enough for light signals to smooth it from place to place. Moreover, by the end of this bout of acceleration the expansion would be driven very close to the critical divide for flatness. This is because making a curved surface very large ensures that any local curvature becomes less noticeable, just as we have no sense of the Earth's curved surface when we move a short distance.

"Why should the Universe have suddenly inflated like this? One possibility is that strange, unfamiliar forms of matter existed in the very high temperatures of the early Universe. These could reverse the usual attractive force of gravity into repulsion and cause the Universe to inflate briefly, before decaying into ordinary radiation and particles, while the Universe adopted its familiar state of deceleration expansion.

"Compelling as inflation appears, it cannot solve the lambda problem. It has also had to confront some new observations of the rates at which distant supernovae are receding from us. They imply that the lambda force is influencing the expansion of the Universe today ('The fifth element', New Scientist, 3 April 1999). Even though the density of matter might be just 10 per cent of the critical value, the influence of the lambda force means the geometry of space might still be very close to flatness." - J. D. Barrow (1999:29-30):

"What are the cosmological consequences if the speed of light changed in the early life of the Universe? This could happen either suddenly, as Albrecht, Magueijo and Moffat first proposed, or steadily at a rate proportional to the Universe's expansion rate, as I suggested in a subsequent paper. The idea is simple to state but not so easy to formulate in a rigorous theory, because the constancy of the speed of light is woven into the warp and weft of physics in so many ways. However, when this is done in the simplest possible way, so that the standard theory of cosmology with the constant light speed is recovered if the variation in light speed is turned off, some remarkable consequences follow.

"If light initially moved much faster than it does today and then decelerated sufficiently rapidly early in the history of the Universe, then all three cosmological problems - the horizon, flatness and lambda problems - can be solved at once. Moreover, Maguijo and I then found that there are also a range of light-slowing rates, which allow the quasi-flatness and quasi-lambda problems to be solved too.

"So how can a faster speed of light in the far distant past help to solve the horizon problem? Recall that the problem arises because regions of the Universe now bounded by our horizon appear to have similar, coordinated temperatures and densities even though light had not had time to travel between them at the moment, when these attributes were fixed. However, if the speed of light were higher early on, then light could have travelled a greater distance in the same time. If it were sufficiently greater than it is today it could have allowed light signals to traverse a region larger than would expand to fill our horizon today.

"As regards the flatness problem, we need to explain why the energy density in the Universe has remained at the critical divide that yields a flat, Euclidean space, even though its expansion should have taken it farther and farther from this divide. And as for the lambda problem, we need to explain why the lambda force is so small - instead of the huge value that particle physicists calculate.

"The key point here is that the magnitude of the expansion force that drives the Universe away from the critical divide, and the magnitude of the lambda force, are both partially determined by the speed of light. The magnitude of each is proportional to the square of the speed of light, so a sufficiently rapid fall in its value compared with the rate of expansion of the Universe will render both these forces negligible in the long run. The lambda force is harder to beat than the drive away from flatness. Consequently, a slightly faster rate of fall in light speed is needed to solve the flatness, horizon, and lambda problems than is required just to solve the flatness and horizon problems.

"Remarkably, a more modest slowing of light allows the quasi-flatness problem to be solved: it leads to a Universe in which the forces that drive the Universe away from a critical state ultimately keep pace with one another, neither overwhelming the other. In the same way, a suitable rate of change of light speed can result in an approach to a critical rate of expansion in which the lambda force keeps pace with the gravitational influence of matter.

"One advantage that the varying light speed hypothesis has over inflation is that it does not require unknown gravitationally repulsive forms of matter. It works with forms of matter and radiation that are known to be present in the Universe today. Another advantage is that it offers a possible explanation for the lambda problem - something inflation has yet to solve."

"New telescopes open up the exciting possibility of measuring physical constants far more stringently than is possible in laboratory experiments. The stimulus provided by superstring-inspired theories of high-energy physics, together with the theory that a change in the speed of light in the early Universe may have propelled it into the peculiar state of near smoothness and flatness that we see today, should provoke us to take a wide-ranging look at the constancy of nature's 'constants'. Tiny variations in their values may provide us with the window we are searching for into the next level of physical reality." - J. D. Barrow (1999:32).

Comment: Even if the speed of light changed, suddenly or slowly, to, let's say, twice its present speed, it still would not work. If our universe is about 15 billion years old, it then has a radius of 15 billion light years. This means: A signal, transmitted at the speed of light from the center of the visible universe to its cosmic horizon, will need 15 billion years to get there. If the information were moving at twice the present speed of light, it would still need 7.5 billion years to reach the cosmic horizon. This is far too slow. Also, when transmitting information at twice the present speed of light, the spatial and temporal order of the universe could not have arisen and could not have been preserved. It is like a fire fighter, sitting in his office, who finds out only a month later that a fire has broken out somewhere in town.
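The arithmetic of this comment can be checked in a few lines of Python (a sketch assuming the 15-billion-light-year radius quoted in the text; by definition, a distance of d light years takes d years to cross at the speed of light):

```python
# Travel time across the visible universe, using the figure from the text above.
# Assumption: radius of the visible universe = 15 billion light years.
radius_ly = 15e9  # light years

def travel_time_years(distance_ly, speed_in_c):
    """Years needed to cover distance_ly at speed_in_c times the speed of light."""
    return distance_ly / speed_in_c

print(travel_time_years(radius_ly, 1))  # at c:  15 billion years
print(travel_time_years(radius_ly, 2))  # at 2c: 7.5 billion years
```

Doubling the speed of light merely halves the crossing time, which is the quantitative point the comment makes.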

The electron near our planet Earth, for example, is just as large and as heavy as the electron 12 billion light years away, near the cosmic horizon of our universe. It is exactly as large and as heavy. And it contains exactly the same amount of physical information and mathematics (cosmic software). This is also true of the proton, the neutron, and the many other particles. The physical constants contain information and the highest mathematics. This cosmic information and mathematics exist independently of mankind. Man has only found and understood them a little. He has not made them. Information and mathematics always come only from a spiritual, non-material source. They come from an intelligent Person, the Creator.

 

Fundamental Constants of Physics: Constant?

The fundamental constants of physics: are they constant, or have they changed with time?

Physicist Stratis Karamanolis concludes in his book Rätsel der Materie (Riddle of Matter): "It has often been assumed that the fundamental constants, too - only a few of them, or all of them - have changed with time. Thus, it used to be assumed that the elementary charge, that is, the electric charge, has increased with time. Similar ideas have been voiced about the gravitational constant G. ... Hypotheses like these have been given up again, since the resulting far-reaching consequences have not been verified. Natural phenomena, like electricity or magnetism, are found in the form of characteristics within matter. Like the natural laws, the fundamental constants and the natural forces must be viewed as having been given." (1988:84, 100).

"Matter, and thus particles, are, as we have seen, condensed electromagnetic fields... Heavier particles, like photons, neutrons, and so on, are condensations of the electromagnetic field, which are denser than those of lighter particles, like electrons. ... Matter is nothing else, but energy-knots, producing a certain non-homogeneity within the originally homogene electromagnetic field. Without any matter, the universe, filled with photons, would be such a homogeneous electromagnetic field..."

Electromagnetic Field

We have learned, when looking at the gravitational field, that each particle of mass in the universe is connected with every other particle through its gravitational field - at least a little bit. Thus, the whole spherical universe is one gravitational field. And it also has its center of gravity. Each particle in the universe also constantly gives off and takes in electromagnetic radiation. In other words: Each particle of mass in the universe also has its own electromagnetic field. And each one of these electromagnetic fields has an infinite reach. They are able to roam across the whole universe. To this electromagnetic radiation, sent off by each particle of matter in the universe, we still must add the photons of the microwave background radiation, with their 2.7 K radiation. There are about 400 of these primordial photons in each cubic centimeter of cosmic space. Thus, the whole universe is also one huge electromagnetic field.

This means: Within each cubic centimeter of "empty" space (as far as we know), there are about 400 primordial photons of the microwave background radiation. And within this ocean of cosmic space - within a sphere some 24 billion light years across - the galaxies with their stars and planets are floating.
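The figures quoted above - roughly 400 primordial photons per cubic centimeter, inside a sphere some 24 billion light years across - can be combined into a rough order-of-magnitude estimate (a hedged sketch; the light-year-to-centimeter conversion is standard, and everything else comes from the text):

```python
import math

# Rough count of microwave background photons in the visible universe,
# using the figures from the text: ~400 photons per cm^3 and a sphere
# about 24 billion light years across (radius 12 billion light years).
LY_IN_CM = 9.461e17                      # one light year in centimetres
radius_cm = 12e9 * LY_IN_CM              # radius of the sphere in cm
volume_cm3 = (4 / 3) * math.pi * radius_cm ** 3
photons = 400 * volume_cm3
print(f"{photons:.1e}")                  # on the order of 10^87 photons
```

Even as a crude estimate, this shows the scale of the "ocean" of primordial radiation the text describes: the count comes out on the order of 10^87 photons.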

How fast is information transmitted in the universe? - The physicist will answer: Information can only move at the speed of light. Nothing in the universe is able to move any faster than the speed of light, about 300,000 km/second. This is one of the basic doctrines of modern physics. The graviton, carrier of the gravitational force, moves at the speed of light. The photon, carrying the electromagnetic force, also moves in a vacuum at the speed of light. Nothing is able to move any faster. No serious physicist would even dare to doubt this. At least, almost none would: some have thought further about this.

Faster than Light

Physicist John Gribbin reports in the journal New Scientist of 26 February 1994 on page 16 under the heading "Atomic telepathy is faster than light": "In the past decades, physicists have carried out experiments which show that under special circumstances subatomic particles can communicate with each other instantaneously. This finding has been tough for them to swallow but now they are faced with yet another puzzle. A German scientist has shown that in theory any pair of atoms can communicate with each other faster than light.

"The possibility of detecting instantaneous communication, known as a ‘nonlocal’ interaction, was raised in the 1960s by John Bell of CERN, the European laboratory for particle physics. The effect was finally observed by Alain Aspect in Paris in the 1980s. He demonstrated that two photons ejected in opposite directions from an atom remain ‘entangled’, as if they were one particle. So when the state of one photon is measured, this instantaneously affects the state of the other, wherever it may be in space. Now it seems that even atoms which have never come into contact (from the perspective of classical Newtonian physics) are entangled in a similar way.

"Gerhard Hegerfeldt of the University of Göttingen discovered this when he corrected a mistake made by Enrico Fermi in 1932. The calculation Fermi carried out, in the early days of quantum mechanics, concerned how one atom responds to radiation emitted by another atom of the same kind some distance away. If the first atom is in an excited state, sooner or later it will emit radiation, falling back to the ground state. This radiation will have exactly the right frequency to excite the second atom (this is one of the principles of the laser). Common sense tells us that the first atom cannot be excited until there has been time for radiation travelling with the speed of light to cross the gap. This is indeed the result Fermi found when he carried out the calculation. But it now turns out that the great man made a mistake.

"But Hegerfeldt’s correct version of the calculation now makes it clear that there is a small chance that the first atom will be excited the instant the second atom decays... Now the experts have to explain what this mathematical result means. The best interpretation of the evidence so far seems to be that we should not think of any object, not even a single atom, as an ‘isolated system’.

"Because particles must also be considered as waves (one of the basic tenets of quantum mechanics), the individual particles in the atom are spread out, and there is a finite (though small) chance of finding them anywhere in the Universe. So the wave functions of the electrons in the first atom overlap with those of the electrons in the second atom. They are entangled, like the two photons produced in Aspect’s experiment, and when an electron in one atom jumps down an energy level that can instantaneously make its counterpart in the other atom jump up by the same amount." - Gribbin, J. (1994:16).