HIGH ENERGY PHYSICS AND THE RECIPROCAL SYSTEM

“...during times of crisis new theories arise. Meanwhile, adherents of the old paradigm in crisis fight to retain it against the revolutionaries who are outrageously explaining anomalies by treating nature as if she were a rabbit or squirrel instead of what every self-respecting scientist knows she is: a duck.”

J. P. Briggs and F. D. Peat, Looking Glass Universe, p. 28

Great advances in technology in the recent decades of this century have made it possible to amass a wealth of experimental data of unprecedented scope and variety. Theory in the areas of Particle Physics and Astrophysics has been subjected to repeated revisions to cope with the observed facts. Especially in the field of High Energy Physics (HEP), exciting things have been happening. The Orthodoxy is becoming more tolerant of wild, if not crazy, ideas and inventions of thought. Against this backdrop, it might be desirable to survey the vicissitudes of physical theory, in the hope that we might learn something from history.

Little Fleas on Little Fleas on Little Fleas on...

Physicists recognize two revolutionary experiments in the 20th century that resulted in significant revision of the previous ideas about the fundamental particles. One was the Rutherford scattering experiment of 1911, which revealed that the atom was not the uniform solid object it was thought to be, but is largely hollow, with a compact nucleus nearly five orders of magnitude smaller than the atom itself. Subsequent theory conjectured that the nucleus is made up of particles even more fundamental, namely, the protons and the neutrons. The second experiment was the electron-proton scattering experiment of 1968 at Stanford. With the probing energies scaled up to the GeV range, the scattering pattern revealed that the proton and the neutron were not the solid compact objects they were thought to be, but are largely hollow, with extremely compact, point-like objects inside. The theoreticians named these point-like particles the quarks.
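As a rough check of the size comparison (the radii below are typical textbook values, not figures quoted in the original):

$$
\frac{r_{\text{atom}}}{r_{\text{nucleus}}} \approx \frac{10^{-10}\ \text{m}}{10^{-15}\ \text{m}} = 10^{5}
$$

that is, roughly five orders of magnitude.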

Originally only three quarks (‘u,’ ‘d’ and ‘s’) were invented to explain protons, neutrons and pions. But soon a theoretical inconsistency cropped up when the unstable hadron resonance known as the delta++ was experimentally discovered. According to the existing quark scheme this resonance had to be composed of three u-quarks in a configuration that is symmetric under interchange of any two quarks. This, however, was not in accordance with the well-established Pauli Exclusion Principle, which states that no two fermions can be in the same quantum state. Rather than abandon the quark model, the inconsistency was evaded by inventing, purely ad hoc, a new quantum attribute—fancifully called the ‘color’ charge—which serves to distinguish the three u-quarks.

That we now have u, d and s quarks, each in three color states, is, of course, not the end of the story. The discovery in 1974 of the J or psi particle required the positing of a fourth quark (the ‘c’), and the discovery in 1977 of the Upsilon particle necessitated yet another quark with a brand new quantum attribute (the ‘b’). At the present time we have as the fundamental particles six types of quarks, each in three different color states, along with an equal number of antiquarks. In addition, the Standard Model (SM) propounds the existence of six leptons—particles which do not experience the ‘strong’ force. These are the electron, the muon and the tau particle, and their corresponding neutrinos νe, νμ and ντ, along with, of course, the antiparticles of all of these.

Problems in the Current Theory

Though the SM is a highly successful theory of HEP and covers the ‘weak,’ the electromagnetic and the ‘strong’ interactions, its most flagrant shortcoming is the omission of gravitation. Physicists estimate the characteristic length at which ‘quantum gravity’ is expected to manifest itself to be nearly 10⁻³⁵ m. This is seventeen orders of magnitude smaller than the characteristic length of the ‘weak’ interaction, namely, about 10⁻¹⁸ m. Such a stupendous scale difference is quite baffling to them.
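For reference, the 10⁻³⁵ m figure is the Planck length; a minimal sketch of the conventional estimate and of the scale gap (the formula below is the standard one, not given in the text):

$$
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times10^{-35}\ \text{m}, \qquad
\frac{10^{-18}\ \text{m}}{10^{-35}\ \text{m}} = 10^{17}
$$

i.e., a gap of seventeen orders of magnitude.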

It is an embarrassing fact that free quarks have never been observed. Consequently it is theorized that interactions between quarks must be extraordinarily strong and perhaps irrevocably confining. The theorists do not know whether quarks are truly fundamental entities or have further structure. Nor do they know whether quarks are everlastingly stable or decay spontaneously. Further, the SM contains many parameters, such as the masses of the quarks and leptons, the values of the fundamental charges, etc., which cannot be derived from the theory but have to be taken as given. Then there is the generation problem: even though only two quarks (u and d) and two leptons (e⁻ and νe) occur preponderantly in nature, nature possesses two more copies (four more quarks and four more leptons) of this basic structure, which are assumed to be relevant, if at all, only in the first few seconds after the so-called Big Bang.

The occurrence of infinities plagues the mathematics of the theory at the various energy ranges. Solving one problem introduces new problems at new levels. For instance, solving the mass problem of the ‘weak’ bosons, W and Z⁰, by the Higgs mechanism involves the prediction of a new particle—the Higgs boson—the experimental discovery of which is an outstanding problem. The concept of supersymmetry—wherein all bosons have fermionic superpartners and vice versa—is invented to circumvent the infinities. In the bargain, however, a host of new particles are predicted, generating new ignorances at the same rate as new understanding.

Finally, the theorists are investing great hopes in the superstring theories, in which one-dimensional singularities, instead of point-like particles, are envisaged as the ultimate constituents of the universe. Supersymmetry is an essential ingredient of the theory. One of the problems besetting the superstring theory is the occurrence of several versions of it, without a clear hint as to which is the actual one. The theory also requires the superstrings to exist in a large number of space-time dimensions (such as ten), which requires figuring out ways of reducing the superabundance of dimensions.

Vindication of these ideas can come only from experimental confirmation, and here the future of HEP is threatened by a serious crisis. The range of energies that would be needed to test the new theories is 10⁵ to 10¹⁹ GeV. The known acceleration technologies can take us only up to the 10⁴ GeV level in the coming decade. Beyond that, the veterans in the field fear that HEP is near its end. The deepening crisis is making the physicists look for unconventional ideas, no matter how weird they might appear. Unfortunately, they are looking for these new ideas still within the ambit of the old paradigm only. They seem to be committing the mistake of the proverbial drunkard, who was found searching in the middle of the night, right under the street light, for something he lost in the darkness beyond! Recognition of the truth of the Reciprocal System of theory, which is based on a totally new basic paradigm, is being delayed because it upsets some of our most cherished notions. But this is what a paradigm change at the most basic level is bound to do. Planck’s discovery of the quantum nature of energy is a good example: it was greeted with indifference and disbelief, if not open hostility.

The Deepening Crisis

It is now apparent that applying iteratively the program that ‘particles are built out of more fundamental particles’ has resulted in the proliferation of ‘fundamental’ particles and led us from complex theory to more complex theory. The situation is reminiscent of the accumulation of epicycles in the Ptolemaic system. Once again it might be pointing out to us, if we are able to take the hint, that the basic paradigm underlying the whole edifice of the HEP has been wrong.

Particle physicists have transformed the concept of force, which was originally defined simply as mass times acceleration. The idea of action-at-a-distance was repugnant to the modern scientist, who thought it spooky and belonging to the dark era of scientific ignorance. He believed rather in the locality of interaction: a force could be passed on from A to B only if A is physically touching (contiguous in space with) B, or through some other thing touching both. This belief logically led him to the idea of the ‘exchange force’: when two entities are separated in space, a force can be transmitted between them only through the intermediary of a particle—the field quantum—propagating in space. This is part of the paradigm on which the superstructure of modern physics has been erected. The physicists have even disregarded factual information from their own field and subscribed unstintingly to this paradigm. For example, there is no empirical evidence that gravitation is propagated at a finite speed, or that it is propagated at all. Yet current Orthodoxy presumes that gravitation has a field quantum, the graviton, and that it propagates at the speed of light.

Meanwhile a new factor has entered the situation. Carefully conducted experiments in recent decades have established beyond doubt that quantum non-locality is a fact—particles widely separated in space are able to influence each other, without the need for any medium or intermediary and without any attenuation by distance, even when they are outside each other’s light cone. Since this is a factual finding, it must be incorporated into whichever theory of physics comes into ascendancy, if that theory is to be true.

Notwithstanding these developments, HEP has continued on its program of building particles out of more fundamental particles, postulating at each structural level the existence of ‘carriers of interaction’—the mesons, the ‘intermediate vector bosons,’ the gluons and the like. Now the question arises: is there a way to build physical theory on established facts, including non-locality, without having to re-introduce the unacceptable, spooky action-at-a-distance? Well, this is exactly what Larson has accomplished!

The New Paradigm

Larson has laid out, in his published works,¹⁻⁶ the general outline of his theory, covering all the physical fields. All of the phenomena whose origin is a mystery in the current theory—like that of the high-energy cosmic rays—come out as logical deductions from his fundamental Postulates about the characteristics of motion. He has carried the development far enough to establish a prima facie case for a general theory. However, a considerable amount of theoretical work still needs to be done to extend the application of the Reciprocal System in greater detail.

Following the lead given by observational facts rather than speculation, Larson has endeavored to review the entire physical situation and come up with a new structure of physical theory, which has come to be called the Reciprocal System of theory. Larson’s principal finding is that the physical universe is composed entirely of discrete units of motion. Space and time occur only as the two reciprocal aspects of motion and are quantized. In the new paradigm, space-time plays the role of the