Towards a Theory of
While there is some evidence at this time to suggest that a theory of Hyper-Spatial Mechanics would have experimental validity, the author does not presume to describe the following as a Theory.
For the purposes of this document, this should be considered a Speculation. It is suggested that some experimental evidence for the accuracy of this Speculation has existed since the 1950s.
The problem to date is that we are the victims of history: the quantum hypothesis actually predates the "Special" Theory of Relativity by several years. This gap has caused us to develop a concept of the universe which results in the misuse of probability in dealing with physical phenomena. Thus we have such anomalies as "Schrödinger's Cat", the "Uncertainty" Principle, and like propositions.
While the reality of these facts is not contested, the interpretation deserves scrutiny. In fact, we will attempt to solve the paradox of the "2 Slit Experiment" and demonstrate that this phenomenon opens onto other realms of research.
In the final chapter, a proposed experiment will be presented, to be conducted within the confines of our own Solar System. This experiment will, if successful, prove the viability of a Hyper-Spatial Theory.
It will be my pleasure to leave the completion of such a theory to the interested community.
The purpose of this section is to identify, in very general terms, the progress of physics in recent times. The importance of this lies in identifying how it has been possible for experimental evidence to be misinterpreted for over a century. Progress in the physical sciences may be accelerated by an understanding of this phenomenon.
The most important event of recent history is the advent of the "Experimental Method", attributed to Galileo some 400 years ago. Shortly thereafter, Newton presented some speculative comments on a theory of light which he admitted were apparently premature. By the 18th Century, the Industrial Revolution had commenced, which would lead to the extreme hierarchical form of system development that marked the 20th Century. In the 19th Century, Maxwell provided a mathematical definition for the theory of light and other electro-magnetic phenomena as "waves". In the late 1890s to early 1900s, Max Planck provided the evidence for a "corpuscular" or "quantum" nature to energy. In the period 1905 through 1915, Albert Einstein developed first the Special and then the General Theory of Relativity. From the 1930s to date, theoreticians have pursued a reconciliation of these two theories.
The complex hierarchical systems engineering model of the Industrial era created a number of "castes" within the scientific community, corresponding to the hierarchy of industrial production methods. This included the well-known "compartmentalization" phenomenon, sometimes humorously known as the "not my job" syndrome. The problem for science is that this created a rift between those who experimented and those who theorized. The unfortunate impact is the mess that passes for current physical theory.
As can be seen, this is a very general overview of the historical trends, for the purpose of highlighting how it is possible to be misled in the search for truth. It would be appropriate to discuss the general state of Quantum Mechanics and the Theory of Relativity, and describe the unusual consequences of same.
It is assumed that the basic tenets of Newtonian Mechanics are well understood.
Relativistic Mechanics proceeds from the basic tenet that the speed of light is a constant for all observers, regardless of their state of motion "relative" to the source. This may not be an intuitively recognizable position; however, the results are easily verified by basic experiments.
This theory introduced the Lorentz contraction and the influence of time into traditional mechanics. Note that we do not use the term "dimension" in describing time here... more on this, and on the "normal 3 dimensions", later.
The application of this contraction is exhaustively covered by other popular sources. The notions of "time contraction", "space contraction" and the "curvature of space" all flow from this theory. The theory also introduced the important concept of the "Frame of Reference", along with the tenet that all frames of reference are valid in relativistic mechanics.
Further, the duality of matter and energy is enshrined in the ultimate construct of relativity... E = mc^2.
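As a minimal numeric sketch of this construct ( the one-gram mass is an arbitrary illustrative choice, not drawn from the text ):

```python
# Illustrative calculation of mass-energy equivalence, E = m c^2.
c = 2.998e8          # speed of light in vacuum, m/s
m = 1.0e-3           # an arbitrary illustrative mass: one gram, in kg

E = m * c ** 2       # energy in joules
print(f"E = {E:.3e} J")
```

A single gram of matter corresponds to roughly 9 x 10^13 joules, which is why so little mass accounts for so much energy in this duality.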
Most important in all of this, is that experimental evidence exists to support most of this extensive theory.
Quantum Mechanics is about the corpuscular nature of matter and energy. The term "corpuscular" is deliberate here, for there are historic precedents for this model dating to near pre-history.
There are extensive experimental "proofs" for this theory. The theory itself involves an extraordinary amount of stochastic modelling that borders on casino gambling theory. There is, nonetheless, no doubt about the applicability of this theory.
Perhaps the most important aspect of this theory is that, while it deals in matter and energy as individual "quanta", it recognizes both a wave and a particle nature to matter and energy! This is truly a historic paradox, and it stands to date!
Consider the famous two slit experiment which many a student has conducted. This experiment has been conducted at the photonic level ( i.e. at the individual quantum level ). Theoretically there is no electro-magnetic phenomenon smaller than this. This experiment showed that "individual quanta" act as waves! Given the distinction which we apply to waves and particles, this paradox approaches the mystical in its nature, as does the cosmology that has arisen around it.
Perhaps the most controversial aspect of early 20th century physics was the application of probability theory to physical processes. Traditional mechanics conceives of the universe as a precisely predictable structure based on mathematical models. While this may yet prove to be true, observation has provided other results.
Consider atomic theory, which describes the "orbit" of the electron as a "probability cloud" identifying the "probable" location of an electron around a nucleus! Experimental evidence certainly appears to support this position, and reinforces the uncertainty principle in the process. The famous "wave equation" is completely ensconced in probability operators in every term.
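The "probability cloud" can be sketched numerically with the textbook radial distribution for the hydrogen ground state ( a standard result, offered here only as an illustration of what a probabilistic orbit means ):

```python
import math

a0 = 5.292e-11  # Bohr radius, in metres

def radial_probability(r):
    """Radial probability density of the hydrogen 1s 'probability cloud':
    P(r) = (4 / a0^3) * r^2 * exp(-2 r / a0)."""
    return (4.0 / a0 ** 3) * r ** 2 * math.exp(-2.0 * r / a0)

# Scan for the most probable electron-nucleus distance.
rs = [i * a0 / 1000.0 for i in range(1, 5001)]
r_peak = max(rs, key=radial_probability)
print(f"most probable radius = {r_peak / a0:.3f} * a0")
```

The density peaks at exactly one Bohr radius: the theory names no orbit, only the radius at which the electron is most likely to be found.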
Most observers still follow the precept that the universe is in some way precisely predictable in its operation. This is entirely bound up in our systems engineering paradigms... from theory to experiment to feasibility to engineering and finally to product. However, we must learn here to differentiate between "reality" and "observation". Much of what is observed in physics today is basically unobservable! One cannot, for example, observe a light wave changing speed in space ( or whatever it is that actually happens ) to correspond with the frame of reference of different observers. Therefore there is no "observable" physical evidence to describe what is being measured! When observation is not available, a basic element of the experimental method is in jeopardy. Into this gap we insert probability: since we cannot observe what happens, we observe results, apply a little statistics, and produce a stochastic model of "observations". These observations mean nothing to the "reality" of the process, but they let us overcome the barrier and do the systems engineering needed to get results. The point is to recognize probability as a tool to arrive at a definition. It is not self-definitive.
Admittedly, the previous discourse is highly general and glosses over much. A familiarity with the experimental results to date is assumed, if not with the theory surrounding them.
First let us remember that, in any physical definition to date, the preface "In Inertial Systems" must be, and has been, applied. This in essence dictates that in all of physics to date, the fundamental foundation has been the study of matter and energy. Space itself is essentially a triviality that provides a lattice for matter and energy. The General Theory of Relativity provided perhaps the first definitive "structure" to space, as a gravity-supporting entity... but only because no mechanism for gravity is definitively known... what that means for the General Theory is uncertain. There has been much mathematical speculation surrounding "hyperspace"... that is to say, a theory of space as a multi-dimensional structure which includes physical dimensions beyond the three "spatial" and one "temporal" dimension. These have included gravity and other phenomena as spatial dimensions. Electrical engineers constantly deal with mathematical structures using the square root of minus 1 ( (-1)^(1/2) ), which has a mathematical identity defining a 90 degree deviation from "normal space". Of course, there is no actual deviation; it is intrinsic to the mechanics of electro-magnetic phenomena which occur in normal space ( or at least we presume so ).
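The electrical engineer's 90 degree "deviation" can be shown in a few lines: multiplying a phasor by the square root of minus 1 rotates it a quarter turn in the complex plane, with no motion out of normal space.

```python
import cmath

j = complex(0.0, 1.0)          # the engineer's j, i.e. sqrt(-1)
v = complex(1.0, 0.0)          # a unit phasor along the real axis

rotated = j * v                # one multiplication by j
print(cmath.phase(rotated))    # phase advances by pi/2, a 90 degree shift

# Four successive rotations return the phasor to its starting point.
print(j ** 4)
```

The "rotation" is purely an identity of the mathematics, exactly as the text observes: nothing physically leaves the plane of normal space.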
This document speculates on a "hyper-space"... which is to say something beyond the space which has been speculated on historically. To deal with this matter, some observations on light and other electro-magnetic phenomena will be discussed.
At the outset, let us presume that the speed of light in "free space" is NOT a constant, nor is the defined universal velocity a constant. This position will allow us to obtain a greater understanding of spatial media. Ask any electrical engineer what defines the speed of electro-magnetic phenomena in any medium, and the stock response is an equation relating the capacitive ( electric ) and inductive ( magnetic ) reactance of the medium to the velocity. In summary, the speed of light, or of any Transverse Electro-Magnetic Wave ( TEM Wave ), is governed by the ability of the associated electric and magnetic fluxes to charge and discharge the capacitive and inductive reactance of the medium. This relationship is used throughout electro-magnetic engineering. What it tells us is that if there is a "natural" constant velocity, it is dependent on at least two properties of the medium. Therefore free space, which is presumed to have these properties, conducts itself as a medium, and there is a definite mathematical definition for the speed of light. Since matter and energy are manifestations of the same phenomena, we must view energy-matter and space as the correct subjects for study. What remains obscure is the characteristic of this spatial medium, as well as a paraphrase of the famous question... what lies beyond space.
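The stock engineering answer alluded to above is c = 1 / sqrt(mu_0 * epsilon_0): the velocity falls out of the two electromagnetic properties of the medium. A quick check with the free-space values:

```python
import math

# Permittivity (the "capacitive" property) and permeability (the
# "inductive" property) of free space, in SI units.
epsilon_0 = 8.8541878128e-12   # F/m
mu_0 = 1.25663706212e-6        # H/m

# The propagation velocity defined entirely by the medium's properties:
c = 1.0 / math.sqrt(mu_0 * epsilon_0)
print(f"c = {c:.6e} m/s")
```

The result reproduces the measured speed of light, which is precisely the point being made: the "constant" is derived from at least two properties of the medium, not postulated independently of it.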
Consider the phenomenon known as light. It has the characteristic of having an electric flux, a magnetic flux and a direction of travel that are mutually orthogonal. The velocity of travel is the universal velocity, both for the beam and for the arms of the flux lines in all three dimensions. When a "photon" is launched, at time zero it registers specific dimensions and a position relative to an observer. After several nano-seconds the beam has travelled a certain distance, and the electric and magnetic flux lines have extended outward. All of these phenomena occur at the speed of light, and the observer is effectively observing a "wave" of electro-magnetic phenomena travelling a certain distance in a certain time. From the observer's point of view, this is a wave-like phenomenon, and is subject to the usual experimentally observed wave-like behaviour along all axes, for each observation point.
Based on relativistic mechanics, the question is: what would the light wave define "itself" as?
In practical terms, any phenomenon moving at the universal velocity will internally record a zero time span relative to any observation point. Internally it will also record no change in the length of its electric and magnetic flux lines; indeed it would, if propagated from an electron quantum transition ( as most light waves are ), record (dx,dy,dz,dt) = 0 !!! This is essentially the definition of a dimensionless point... or the ultimate prototype for a particle! Given that this is presumed to hold true for the individual quantum or photon, we have what appears to be a particle phenomenon.
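The zero internal time span can be illustrated as the limit of ordinary Lorentz time dilation as the velocity approaches the universal velocity ( a sketch of the standard formula only; the photon case is the limiting value, not a point the formula is defined at ):

```python
import math

C = 2.99792458e8  # universal velocity, m/s

def proper_time(coordinate_time, v):
    """Proper time recorded internally by a traveller at speed v,
    per the Lorentz factor: d(tau) = dt * sqrt(1 - v^2 / c^2)."""
    return coordinate_time * math.sqrt(1.0 - (v / C) ** 2)

# One second of lab time, as recorded internally at increasing speeds:
for fraction in (0.5, 0.9, 0.99, 0.999999):
    print(fraction, proper_time(1.0, fraction * C))
# As v -> c the internal clock reading tends to zero, which is the
# limit the (dx,dy,dz,dt) = 0 claim above appeals to.
```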
Given these two separate observations in the extreme opposite frames of reference, it appears that the paradox of wave-particle duality is resolved as purely a matter of frame of reference in relativistic mechanics!
At this point we have the following elements of "knowledge". The structure of the electro-magnetic phenomena is generally established as an orthogonally positioned electric and magnetic flux, which has a natural velocity defined in "the medium" by the inductive and capacitive reactance of that medium. If we "compare notes" between the photonic phenomena and our own frame of reference, we find that the photon displays wavelike structure in our frame of reference, but would define itself as a point particle in its own frame of reference.
Let us consider the implications of variable inductive and capacitive reactance.
Experience has shown that the "speed of light" varies under these circumstances. At present there is a mathematically defined inductive and capacitive reactance for "free space" which is presumed to be absolute. Therefore there is a presumed absolute constant universal velocity, defined by the structure of free space itself. Let us consider the cases where this is not true. In any medium, ranging from air through glass and metals, there is a definite change in the propagation velocity of light, of Transverse Electro-Magnetic Waves ( hereafter TEM ), and of electric and magnetic fluxes. This fact has been used over the centuries in a number of commonly engineered products, ranging from lenses to transmission lines to electric transformers. A common phenomenon in these materials is reflection. When a TEM ( any of the entire spectrum of waves, e.g. light, radio or X-rays ) encounters a change in capacitive or inductive reactance caused by a change in the medium, some of the energy is reflected at the boundary of the media. It is possible for complete reflection to occur, as in a prism. The mechanics of this reflective process are clarified in the operation of a device known as the saturable core transformer.
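The boundary reflection described above is quantified in transmission-line engineering by the reflection coefficient ( a standard result, sketched here for a single normal-incidence boundary ):

```python
def reflection_coefficient(z1, z2):
    """Amplitude reflected at a boundary between media of wave impedance
    z1 and z2: gamma = (z2 - z1) / (z2 + z1)."""
    return (z2 - z1) / (z2 + z1)

# For normal-incidence light the wave impedance scales as 1/n, where n is
# the refractive index, so an air (n=1.0) to glass (n=1.5) boundary gives:
gamma = reflection_coefficient(1.0 / 1.0, 1.0 / 1.5)
print(f"reflected power fraction = {gamma ** 2:.3f}")
```

The computed 4% is the familiar faint reflection from a pane of glass; when the impedance mismatch is total, the reflection is complete, as in the prism mentioned above.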
The saturable core transformer has ordinary AC primary and secondary coils wound around a metallic alloy core material, which has the property of being readily saturated magnetically. Around this core, a DC-wound coil is placed. When the DC current is increased, the core carries a higher than normal magnetic flux, and the magnetic flux from the AC primary is attenuated by core saturation, reducing the output on the AC secondary. If the DC-induced magnetic flux is high enough, the core is completely magnetically saturated and no magnetically induced output arrives at the secondary.
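The saturation behaviour can be sketched with a toy model. The tanh saturation curve below is an assumption made purely for illustration ( real B-H curves are material-specific ); the point is only that the AC coupling follows the remaining slope of the curve at the DC operating point.

```python
import math

def secondary_coupling(dc_bias, b_sat=1.0, h_scale=1.0):
    """Toy model of a saturable core transformer. The core flux is assumed
    to follow B = b_sat * tanh(H / h_scale) (an illustrative shape only),
    and the AC coupling to the secondary is proportional to the remaining
    slope dB/dH at the DC operating point."""
    return (b_sat / h_scale) / math.cosh(dc_bias / h_scale) ** 2

for bias in (0.0, 1.0, 2.0, 4.0):
    print(bias, secondary_coupling(bias))
# Raising the DC bias drives the core toward saturation, and the
# AC secondary output falls toward zero, as described above.
```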
In effect, the inductive capacity of the medium is fully charged and nothing can pass through it. Other media can be "fully charged", perhaps including space itself. This would result in an essentially impenetrable shielding effect in the affected region of space. It is suggested here that this is a very common, although not absolute, effect, commonly observed in matter itself. Matter, except under extraordinary circumstances, cannot share the same region of space.
Consider the following definition from quantum mechanics;
ν = mc^2/h ... which implies that matter can be defined as a TEM with a specific frequency. This would actually be common sense, if one considers that higher-strength electric and magnetic fields and higher total energy are implied by higher frequencies. At the "frequencies" of matter, the magnetic and electric flux may in fact saturate the space medium, resulting in the inability of space to contain two objects in the same location.
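To give a sense of scale for these "frequencies of matter", the definition can be evaluated for a single electron:

```python
# The "matter frequency" nu = m c^2 / h, evaluated for an electron.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
m_e = 9.1093837015e-31  # electron rest mass, kg

nu = m_e * c ** 2 / h
print(f"nu = {nu:.4e} Hz")
```

Even the lightest stable particle comes out near 10^20 Hz, orders of magnitude above any radio, optical or X-ray band, which is consistent with the suggestion that matter operates in a frequency regime unlike ordinary TEM radiation.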
Consider the effect of a TEM at elevated frequencies on the spatial medium. At higher frequencies the TEM is more energetic. The relative strength of electric and magnetic flux increases at higher frequencies. As well, all of these effects occurring at the "universal velocity" carry certain implications.
First consider the Cerenkov effect. When an inertial particle travelling at near light speed enters a medium where the light speed is lower, the inertial particle discharges TEM photons to bring its total energy and speed in line with the new medium. This interesting phenomenon with particles may have a corollary in free space with non-inertial, TEM phenomena.
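The Cerenkov condition itself is simple: radiation occurs once the particle outruns the local light speed c / n. A minimal sketch of the threshold:

```python
def cherenkov_threshold_beta(n):
    """Minimum particle speed, as a fraction of c, for Cerenkov emission
    in a medium of refractive index n: the particle must exceed the
    local light speed c / n, so beta_threshold = 1 / n."""
    return 1.0 / n

# In water (n ~ 1.33) a particle radiates once it exceeds ~75% of c:
print(f"{cherenkov_threshold_beta(1.33):.3f}")
```

In free space n = 1 and the threshold is the universal velocity itself, which is why the corollary proposed above must involve something attempting to exceed that velocity.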
We "speculate" here that at higher frequencies, the radial component ( or some other part of the wave phenomenon ) of the TEM's sinusoidal wave may attempt to exceed the speed of light ( i.e. the plus-to-minus charge state transition attempts to exceed the speed of light ). Of course, this does not seem practicable. However, the energy of the wave being higher than is needed to saturate the electric and magnetic reactance of free space could have an interesting effect. Since these excess flux energies ( and the implied accelerative effect ) cannot be accommodated by free space itself, these energies may display themselves as a third phenomenon, which we speculate is gravity! In effect, the higher the excess energy of the particle, the higher the gravitational effect.
This entire process may have a further implication. Since the spatial reactances are fully saturated, and the associated fluxes proceed at the universal velocity of free space, any attempt to move ( specifically, accelerate ) these objects necessarily requires that some component of the object attempt to accelerate beyond the speed of light.
We speculate that both the inertial effect and gravity result from this attempt to cause the relative displacement of an object, or a component of that object, which is already moving at light speed.
As well, when two objects "collide", the saturated reactance of the regions of space they individually occupy reflects them away from each other. Weapons physicists regularly deal with energies at which particles pass through each other. These energies are so high as to force the charging and discharging of reactance, resulting in the familiar additive/subtractive effects of wave mechanics.
In summary, gravity is the resulting "flux" effect of the radial velocity of a sinusoidal TEM wave that would otherwise exceed the universal velocity of free space. The radial velocity of the TEM wave is "decelerated" as gravity, somewhat as TEM is discharged by a "decelerating" particle in the Cerenkov effect. Inertia is the result of attempting to displace this object and its light-speed TEM components through a space which cannot accommodate the high dynamic flux states of the individual electric and magnetic fields.
A result of these speculations is that we conclude that all of modern physics is involved in the study of the various phenomena which result from the effects of the fundamental TEM phenomenon. Thus the electric, magnetic, gravitic and inertial effects are all inter-related by their actions at opposite extremes of relativistic phenomena, either at rest state or at the universal velocity. What determines which state an object resides in is its relative inertia. Obviously TEMs, quanta, photons, etc. ( whatever you want to call them ) are massless, and therefore are capable of proceeding at the universal velocity of free space.
Inertial objects are subject to the Mach Principle, and are therefore in some sense governed by their massive and inertial natures. Remember that the Mach Principle implies that all inertial objects are defined relative to the position and state of all other inertial objects. That means that our yardstick on the universe is matter... which, as we have seen, has a highly variable relationship with the cosmos. Perhaps non-inertial phenomena, being insubstantial, aren't governed by our perceptions of the normal.
To this point we have addressed various phenomena, and perhaps our speculations will illuminate some of the more obscure corners of current thinking. Mathematicians may deal with most of the above as multidimensional cosmology, or hyperspace, by assigning most of the phenomena to physical dimensions which are mutually orthogonal to each other. In other words: electric, magnetic and gravitic flux as dimensional manifestations! However, this is not our purpose here.
Hyper-Space is a different proposition. Here we are looking for something that belongs purely to the non-inertial cosmos. This means that most of our current benchmarks are immaterial to the process. The Hyper-Spatial phenomenon cannot be isolated without returning to some of current-day Quantum Mechanics.
We postulate here that the reason Transverse Electro-Magnetic Waves display wave-like and particle-like behaviour is a relativistic dependency on "frame of reference". However, experimental evidence implies that both the wave-like and the particle-like behaviour occur in the laboratory frame of reference! A situation that seems to imply that our speculations and postulate are not supportable. To resolve this situation, we must return to Galileo's experimental method.
Recall that the core of the experimental method is Observation. Observations of phenomena at the atomic and sub-atomic scale are difficult, and made more so by the Uncertainty Principle. Please refer to the available texts on this subject for the phenomena involved. The problem with observation at the quantum level is to realize what is being observed. Measurement at the quantum level involves understanding the differences between how measurements occur, what is being measured, and what is being observed. We observe measurements second-hand, because it is the instruments, not us, that measure. Measurement at the quantum level has a specific characteristic, which is perhaps more clearly understood in more domestic circumstances.
Consider the process of seeing. The human eye has an interesting characteristic: it is, under ideal circumstances, capable of detecting individual quanta! When an individual quantum arrives at the ocular detectors, it is "absorbed" by the eye. In other words, it ceases to exist. This process of "detection by destruction" is The Only Way That Any Observation Is Made At The Quantum Level! The significance of detection-by-destruction has been completely overlooked in quantum mechanics.
We speculate here, that Relativistic mechanics must be extended to consider the issue of Self-Reference. In the simplest terms, this means that if a unique entity is dead, then it is dead in all frames of reference. If the light of a super-nova some 1000 light years from earth is detected, it does NOT mean that the star just died at this moment. The star has been dead for a thousand years. We merely learned of it now, rather like the ancient mariners, who sent word of an event by courier.
The significance of self-reference to quantum mechanics is extraordinary. When a "photon" ( recognize that we are describing the Transverse Electro-Magnetic Wave in ITS frame of reference by calling it a photon ) is detected in our frame of reference, its energy is completely absorbed by the detector ( usually as a quantum jump in the shell energy level of an electron ). The impact of this is that the "photon" completely ceases to exist in its frame of reference. By Self-Reference, that photon has ceased to exist in ALL frames of reference.
In our frame of reference, the "detection" of that photon is somewhat more complex. We see a TEM which we detect in the method commonly known to electrical engineering. An "antenna" detects the wave front, and the energy of the wave, if it is strong enough, is registered by our instruments. If the TEM wave-front is not strong enough to be detected by our instruments ( i.e. its apparent energy is below the quantum threshold ), we create "mirrors" that "concentrate" the wave energy to the point that it is above the quantum threshold. TEM waves are detected when, and only when, the associated spatial impedance of the TEM wave and the detection apparatus match exactly. TEMs affect the reactance of the region they occupy, as does the detection apparatus, and exact matches are the stuff of... probability theory... in most cases ( building mirrors is one way we alter the probability ). The unique aspect of this position is that the TEM wave is linear, whereas the "detection process" is quantised! In effect, the quantum effect is an anomaly of the detection process, and not exactly a description of reality.
Let us return to the ultimate paradox of the 2 Slit Experiment. If a photon ( a TEM of singular quantum energy ) passes through the 2 slits, it interferes with itself on the screen. If one slit is blocked off ( i.e. it reflects the incoming TEM ) there is, as would be expected in wave mechanics, no interference pattern on the back screen. And now for the true paradox... when a quantum-sensitive detector is placed "in" one of the slits, the TEM wave is detected ( i.e. destroyed ) and nothing passes through the other slit... completely at odds with conventional wave mechanics, and all human understanding to date! These are the experimental results.
By now the reader may suspect the explanation. In our frame of reference, we detect what to us is a wave front. Since TEM waves proceed at the universal velocity, we deal with an object which by self-reference describes itself as a particle in its frame of reference. If we detect this quantum ( i.e. destroy it ) in our frame of reference, then this object ceases to exist simultaneously in its frame of reference and ours. By self-reference, an object "exists" in its frame of reference AND NO OTHER. All other frames of reference merely observe a manifestation in their own frame of reference.
The most important aspect of this speculation on Hyper-Spatial Mechanics is the outcome in our universe. Consider the light of a distant galaxy... the TEM wave is well below the quantum threshold required for detection. Therefore, this TEM is passing through your region at this time completely undetected, and undetectable. This TEM can therefore pass through any region where it is not reflected by altered reactance in the spatial medium.
This TEM wave front, originating from what appears to be a point radiator... the distant galaxy... describes a spherical wave front extending out across the cosmos. The surface of this wave spans (literally) billions of light-years. If there are any irregularities in the surface of this wave ( and there very likely are ), these are associated with reflections of the wave front caused by irregularities in the reactance of the regions of the cosmos the wave has passed through. Assume now that an alien species in another galaxy has built a huge mirror, sensitive enough to focus and "detect" this wave. When they detect this wave, the photon in question will be destroyed in its frame of reference, and if we have a similar telescope pointed in the same direction, we will never know about that galaxy, because at the same instant that that photon was destroyed in its frame of reference, it was destroyed in all other frames of reference, including the aliens' and ours! Depending on the quality of the alien instrument, and how long they point it at this imaginary galaxy, we may be able to detect other "photons" from this galaxy... or perhaps not. On earth, the atmosphere acts like the imaginary aliens, and stars twinkle.
It is unlikely that Hyper-Spatial Mechanics, in its very primitive current state, can provide some means of transporting objects instantaneously across the cosmos. However, some simple experiments can be conducted to determine the accuracy of the theory.
First, the famous 2 Slit Experiment should be conducted with an eye to verifying that self-reference is a correct postulate, by determining whether the detection process results in the instantaneous disappearance of the photon at the other slit. This can be done by sending a very sparse stream of "photons" at widely separated slits and "selectively" detecting them at one slit. If the selective detection is periodic, then the light from the other slit should "twinkle" periodically, and perhaps instantaneously.
Second, of interest to the cosmonautic community, is the construction of a point-to-point hyper relay. Place a reference beam satellite in a very high... beyond-Pluto distance... solar polar orbit. This satellite would be a precision instrument transmitting a precisely defined radio signal. In orbit around Mars and Earth, place high-precision detector satellites. If the detection process of the orbiting satellite closest to the reference beam is modulated on and off in an organized fashion, then the other satellite should be able to detect this modulation as a "twinkle" in the beam from the reference beam satellite. Thus information which can take on the order of 20 minutes to pass from Mars orbit to Earth orbit could be communicated nearly instantaneously between worlds!
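The classical delay the relay would bypass is easy to compute from the light travel time over the Earth-Mars separation ( a standard calculation; the 0.5 and 2.5 AU figures are the approximate extremes of that separation ):

```python
AU = 1.495978707e11   # astronomical unit, in metres
C = 2.99792458e8      # speed of light, m/s

def one_way_delay_minutes(separation_au):
    """Classical light-speed one-way signal delay over a given separation."""
    return separation_au * AU / C / 60.0

# Earth-Mars separation swings between roughly 0.5 and 2.5 AU:
for d in (0.5, 2.5):
    print(f"{d} AU -> {one_way_delay_minutes(d):.1f} minutes")
```

The ~20 minute figure cited above corresponds to the wider separations; near opposition the delay drops to a few minutes, but never to the near-instantaneous signalling the proposed relay would provide.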
Third, of interest to those who seek to intercept the communications of other worlds, is the following observation. The SETI-type projects to date are doomed to failure, for the simple reason that they seek out the wrong types of signals. In the mid-1970s, the use of spread spectrum technology demonstrated methods of spectrum re-use that allowed more communications than would otherwise be possible in the available spectrum, with lower-power signals. These signals look like pure noise to any interceptor not familiar with the encoding and modulation methods used. As well, current technology prefers "piped" communications on a planetary scale ( i.e. fibre-optic cables ). Interplanetary and interstellar communications are likely conducted against reference beams. Therefore, one must detect a reference beam, extrapolate its original characteristics, determine what part of its noise spectrum is possibly modulation via a point-to-point hyper relay, and then determine the spread-spectrum encoding technique and modulation mode of the relay. Looking directly at stars for modulated radiators is useless, since they aren't likely to be transmitting!