Saturday 26 March 2011

What's the Matrix? Relative locality.

Physics is governed by a novel principle, which the authors call the Principle of Relative Locality. It states that physics takes place in phase space and there is no invariant global projection that gives a description of processes in spacetime. From their measurements local observers can construct descriptions of particles moving and interacting in a spacetime, but different observers construct different spacetimes, which are observer-dependent slices of phase space.

Is this something new? No, only to the mainstream, maybe. Alternative ideas have been around for a long time, talking of subjective causal diamonds or light-cones. The "new" (?) thing is the observer effect set by the Planck scale, where consciousness is also born. But Matti Pitkänen has also talked of this for a long time. A 'living' Universe.
Abstract: We propose a deepening of the relativity principle according to which the invariant arena for non-quantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them.
This framework, in which absolute locality is replaced by relative locality, results from deforming momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of momentum space geometry, such as its curvature, torsion and non-metricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of momentum space with a metric compatible connection and constant curvature. The principle of relative locality. Lee Smolin's show.

Non-locality in theories with deformed Lorentz invariance. Deformed symmetry breaking = asymmetry.
"Look around - do you see space?"
- no, just things
- no, spacetime, or in fact momentum space; we detect photons arriving (their angles and energies, i.e. momenta), and spacetime is inferred, derived from momentum-space measurements

"Do we all infer the same spacetime, - also at different energies?"
- transformations (conservation laws) and translations (total momentum)
- between observers
- if translation is independent of momenta and energy, we all construct the same spacetime; local for one is local for everybody
- descriptions of events are different at different energies
- for every interaction, observers local to it will infer it as local, and a distant observer will not infer it as local

We call this the principle of relative locality!, says Smolin. And there is mathematics for it, based on the geometry of momentum space (4-D; 0,1,2,3): the classical, Planck-mass (mp) regime, in which the constants hbar and GNewton are varied. Both are taken to zero, so that quantum effects and gravity can be neglected, while their ratio, which defines mp (with c=1), is held fixed. New phenomena in the scaling of momenta and energy might show up. lp = Planck length.
See Hossenfelder & Smolin, Phenomenological Quantum Gravity. See also the problem of the Planck mass, for instance Nima Arkani-Hamed's The hierarchy problem and new dimensions at a millimeter from 1998, at arXiv. "There are two seemingly fundamental scales, the electroweak scale (~10^3 GeV) and the Planck scale (~10^18 GeV). The Planck scale is not a fundamental scale; its enormity is simply a consequence of the large size of the new dimensions. The Planck scale is where gravity becomes as strong as the gauge interactions. The ratio mEW/mPl (~10^-17) has been one of the greatest driving forces behind the construction of theories beyond the Standard Model (SM). The physics responsible for making a sensible quantum theory of gravity is revealed only at the Planck scale. A desert between these scales: "The weak scale is extremely interesting, but will never give a direct experimental handle on strong gravitational physics."

lp = sqrt(hbar × GNewton) → 0 (quantum spacetime eliminated?)
mp = sqrt(hbar / GNewton) → constant

So we have only an energy scale left. The ratio hbar/lp = mp (in units where c=1) stays constant. The ratio tells us the effect of energy on structure, frequency, amplitude. These are then phenomena governed by two parameters, c and the Planck energy (EPl). The velocity of light is assumed to be constant.
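
As a quick numerical illustration of this limit (a minimal sketch of my own; the scaling parameter eps is just a device to show which quantity survives):

```python
import math

# SI values
hbar = 1.054571817e-34   # J s
G    = 6.67430e-11       # m^3 kg^-1 s^-2
c    = 2.99792458e8      # m/s

def planck_units(hbar, G, c):
    """Planck length, mass and energy built from hbar, G, c."""
    lp = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
    mp = math.sqrt(hbar * c / G)      # ~2.2e-8 kg
    Ep = mp * c**2                    # ~1.22e19 GeV
    return lp, mp, Ep

lp, mp, Ep = planck_units(hbar, G, c)
print(f"lp = {lp:.3e} m, mp = {mp:.3e} kg, Ep = {Ep/1.602176634e-10:.3e} GeV")

# The relative-locality regime: send hbar -> eps*hbar and G -> eps*G.
# lp ~ sqrt(hbar*G) -> 0 while mp ~ sqrt(hbar/G) stays put.
for eps in (1.0, 1e-10, 1e-20):
    lp_e, mp_e, _ = planck_units(eps * hbar, eps * G, c)
    print(f"eps = {eps:.0e}: lp = {lp_e:.3e} m, mp = {mp_e:.3e} kg")
```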

Assumption 1: Momentum space is more fundamental than spacetime.
How can it get deformed so that it can be measured by the scale mp? This is the first measurement that is possible. The dynamics of spacetime will be formed from mp and momentum space.

Assumption 2: The observer is local, and can only do measurements locally. In this way mp is the first observer, measuring energy and momenta, and also time. With these three we can construct the geometry of momentum space. A modified Minkowski space is deformed and curved. As mp goes to infinity we recover a flat geometry.

Assumption 3: There is a preferred coordinate on momentum space, the ground state, and measurements are only made above this state. Ground state = 0 (a homeostasis mechanism). mp can measure only the metric (rest energy and kinetic energy) and connections/relations. Connections can be defined by an algebra that determines how momenta combine when particles interact (the p, q variables). The product rule (iterations, commutativity, linearity, associativity...) plus an output → input feedback mechanism. In this way we get an oscillation of p (and p × q), torsion and curvature away from the original momentum space.
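
A toy version of such a combination algebra, to make the torsion/curvature talk concrete (my own sketch, not the paper's: the Γ coefficients below are hypothetical numbers, and only the leading 1/mp deformation is kept):

```python
import numpy as np

M_P = 1.0  # Planck scale in natural units: the deformation parameter

# Hypothetical connection coefficients Gamma[mu, a, b]. Their part
# antisymmetric in (a, b) plays the role of torsion; non-associativity
# of the composition signals curvature of momentum space.
GAMMA = np.zeros((2, 2, 2))
GAMMA[0, 0, 1] = 1.0     # deliberately not symmetric in the last two slots
GAMMA[1, 1, 1] = 0.5

def combine(p, q):
    """(p [+] q)_mu = p_mu + q_mu - (1/M_P) Gamma_mu^{ab} p_a q_b:
    a non-linear composition of momenta, i.e. linear addition plus a
    leading-order Planck-scale correction."""
    return p + q - np.einsum('mab,a,b->m', GAMMA, p, q) / M_P

p = np.array([0.10, 0.05])   # (energy, momentum) well below M_P
q = np.array([0.20, 0.08])
r = np.array([0.15, 0.02])

print("p[+]q - q[+]p:", combine(p, q) - combine(q, p))   # torsion: nonzero
print("associator   :", combine(combine(p, q), r)
                       - combine(p, combine(q, r)))       # curvature-like
```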

This is primitive consciousness, as in artificial intelligence? The most basic measurement + feedback + ground state. This is also homeostasis, self-regulation?

Translations/deformations due to the observer are generated by the laws of conservation of energy and momentum. If you look at things at a very small scale (the Planck scale), you will create a black hole (a collapse) with the energy of your measurement.

Since momentum space is curved, and the combination rule is non-linear, it follows that the "spacetime coordinates" of a particle translate in a way that depends on the energies and momenta of the particles it interacts with. This is a manifestation of the relativity of locality, i.e. local spacetime coordinates for one observer mix with energy and momenta on translation to the coordinates of a distant observer.
We will see that the meaning of the curvature of momentum space is that it implies a limitation on the usefulness of the notion that processes happen in an invariant spacetime, rather than in phase space. - Non-linear deformation of the conservation laws gives the relative locality. Contraction gives a phase space. Four worldlines emerge from the interaction.
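
Schematically, and in my own notation rather than the paper's exact one: translations are generated by the conservation law K(p) = 0 at the vertex, so each worldline endpoint shifts as

```latex
\delta x^{\mu}_{J} \;=\; b_{\nu}\,\frac{\partial \mathcal{K}^{\nu}}{\partial p^{J}_{\mu}}\,.
```

If K is ordinary linear addition of momenta, the derivative is the identity and every endpoint shifts by the same b: locality is absolute. If K is non-linear, the shift depends on the momenta of the other particles at the vertex, so a translated (distant) observer no longer sees the endpoints coincide.
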
Particles are removed from the origin/center by the observer mp. "The cotangent space based at pI and the cotangent space based at 0 are different spaces in the general curved case. This expresses mathematically the relativity of locality." - This is nothing other than ZEO.

The geometry of the spacetime.
A Zero Energy Ontology (part II), where geometry arises from vacuum extremals giving discretization.
"In fact, practically all solutions of Einstein's equations have this property very naturally. The explicit formulation emerged with the progress in the formulation of quantum TGD. In zero energy ontology physical states are creatable from vacuum and have vanishing net quantum numbers, in particular energy. Zero energy states can be decomposed to positive and negative energy parts with de nfiite geometro-temporal separation, call it T, and having interpretation in terms of initial and final states of particle reactions. Zero energy ontology is consistent with ordinary positive energy ontology at the limit when the time scale of the perception of observer is much shorter than T. One of the implications is a new view about fermions and bosons allowing to understand Higgs mechanism among other things. S-matrix as a characterizer of time-like entanglement associated with the zero energy state and a generalization of S-matrix to what might be called M-matrix emerges. M-matrix is complex square root of density matrix expressible as a product of real valued "modulus" and unitary matrix representing phase and can be seen as a matrix valued generalization of Schrödinger amplitude. Also thermodynamics becomes an inherent element of quantum theory in this approach."
"The fusion of p-adic physics and real physics to single coherent whole requires generalization of the number concept obtained by gluing reals and various p-adic number fields along common algebraic numbers. This leads to a completely new vision about how cognition and intentionality make themselves visible in real physics via long range correlations realized via the e ffective p-adicity of real physics. The success of p-adic length scale hypothesis and p-adic mass calculations suggest that cognition and intentionality are present already at elementary particle level."
Indeed, very much the same. Which one came first, the hen or the egg? So this cannot be anything new. Of some reason Smolin et co doesn't refer to Pitkänen. Why?
p-Adic length scale hypothesis follows if one assumes that the temporal distance between the tips of CD comes as an octave of fundamental time scale defined by the size of CP2. The "world of classical worlds" (WCW) is union of sub-WCWs associated with spaces CD x CP2 with different locations in M4 x CP2. ZEO is replaced with a book like structure obtained by gluing together infinite number of singular coverings and factor spaces of CD resp. CP2 together. The copies are glued together along a common "back" M2 x M2 of the book in the case of CD. Color rotations in CP2 produce different choices of this pair.
The same picture as Smolin et al. Superpositions. In TGD:
(hbar/hbar_0)^2 appears as a quantum scaling factor of the M4 covariant metric. Anyonic and charge fractionization effects allow one to "measure" hbar(CD) and hbar(CP2) rather than only their ratio. hbar(CD) = hbar(CP2) = hbar_0 corresponds to what might be called standard physics without any anyonic effects, and visible matter is identified as this phase. Quantum TGD is reduced to parton-level (flat) objects in light-like 3-surfaces (Wilson loops?) of arbitrary size (infinite?), which give rise to an infinite-dimensional symplectic algebra. The light-likeness of partonic 3-surfaces is respected by conformal transformations of H made local with respect to the partonic 3-surface.

Units of action (E, t, c), (a)symmetry in the ground state.

Planck's constant has units of energy multiplied by time, which are the units of action.
These units may also be written as momentum times distance (N·m·s), which are the units of angular momentum. In classical physics, the ground state is Lorentz-invariant and the principles of special relativity (SR) are satisfied. So, does SR break down at the Planck scale? Those who suggest this point to the existence of a preferred cosmological rest frame. Experiments (Fermi, Kepler) are currently probing whether Lorentz symmetry is preserved when effects of the order of the ratio of the energies in the experiment to EPl are taken into account.

Keas blog:
Ulla, no, they don't seem to understand the ZEO, although they are getting closer. At present they are 'fixing scales' and looking at infrared QG effects by letting hbar go to zero. So they might eventually appreciate Louise's law, but they will probably make up a big confusing story about how it was all their idea. As I see it, without carefully reading the paper, but assuming they are making vague sense, they are essentially fixing c (separately) in the local physics at the two locations of the GRB thought experiment. I think this is more or less OK, for the situation they are discussing, but of course the missing explanation needs to be provided eventually. Morally, they should perhaps be taking c→∞ in their scheme, since hbar→0. But Riofrio's law can still be enforced by assuming that t→0 instead, for fixed M and c, which is correct when hbar→0.

TGD: (Only one cone is broken?) - also photons and gluons become massive and eat their Higgs partners to get the longitudinal polarization they need.
WCW is union over all possible choices of CD and pairs of geodesic spheres, so that at this level no symmetry breaking takes place. The points of M2 and S2 have a physical interpretation in terms of quantum criticality with respect to the phase transition changing Planck constant (leakage to another page of the book through the back of the book). This gives invisibility, non-commutativity and dark matter for the other cone/diamond, p. 7.

Then we are on the arena for another famous physicist, Nima Arkani-Hamed. For the previous discussion, see 'Duality for the S-matrix'.
The integrand has also been beautifully interpreted as a supersymmetric generalization of the null-polygonal Wilson-Loop, making dual superconformal invariance manifest and providing a general proof of the Wilson-Loop/Amplitude duality. Again no ref. to the alternative group.

Own words again for the same thing. Do they not understand each other?

Spacetime is doomed, says Nima Arkani-Hamed. Space-Time, Quantum Mechanics and Scattering Amplitudes.
"The most interesting thing ihave seen in my lifetime" Non-deterministic? Feynmans diagrams are about manifest locality and uncertainty. A field- locality are opposites. Gauge redundancy is all our troubles. For 60 years-where is the emergent spacetime? String theory, twistors, algebraic geometry, integrals have not given the answer. Trees, loops, groups, ? N-4 super Yang Mills simplest gauge theory of all. Unitary theory through spacetime. Emergent spacetime-emergent QM. Tree amplitudes central loops and 6 treads anchoring every second is positive, every second negative. Gravity is cyclicity parity, no spinning poles. Infinitely many hidden symmetries. Massless particles enjoy symmetry (conformal) invariance in spacetime theory. Twistor space has a point only, no line but a momentum space, (one invisible) and generate Yangian algebra. This solved the problem of determining anom dimensions in N-4 SYM, without Feynmans, and gives extensions to amplitudes, that are more physical.
Momentum conservation, parity invariant; the k-plane and the (n−k)-plane are similar.
This is the ZEO, but symmetric, 1-D, identical structures. Look how excited he is to discover the TGD world.
Entangled removal or loop corrections. The superposition of loops is QM. A supersymmetric Wilson loop with perfect symmetry...
Wilson loops are problematic in TGD; they must be generalized. Knots and TGD.
Time-like braidings induce space-like braidings, and one can speak of time-like or dynamical braiding and even a duality of time-like and space-like braiding. What happens can be understood in terms of a dance metaphor.
The interpretation of string world sheets in terms of Wilson loops in 4-dimensional space-time is very natural. This raises the question whether Witten's original identification of the Jones polynomial as a vacuum expectation for a Wilson loop in 2+1-D space might be replaced with a vacuum expectation for a collection of Wilson loops in 3+1-D space-time, and would characterize in the general case a (multi-)braid cobordism rather than a braid.

History: By late 1900, Planck had built his new radiation law from the ground up, having made the extraordinary assumption that energy comes in tiny, indivisible lumps. In the paper he wrote, presented to the German Physical Society on December 14, he talked about energy "as made up of a completely determinate number of finite parts" and introduced a new constant of nature, h, with the fantastically small value of about 6.6 × 10^-27 erg·second. This constant, now known as Planck's constant, connects the size of a particular energy element to the frequency of the oscillators associated with that element. Something new and extraordinary had happened in physics, even if nobody immediately caught on to the fact. For the first time, someone had hinted that energy isn't continuous.


Dark matter and invisibility.
TGD is maybe most criticized for its dark matter physics. But dark matter is nothing peculiar at all in this model. Neutrino research, which has shown asymmetry, tells us the same. The majority of leptons are invisible and noncommutative.

In 'Manyfold Universe' Nima Arkani-Hamed et al. suggested in 1999 a new DM particle and a new framework for the evolution of structure in our universe. (LHC bounds on large extra dimensions by A. Strumia and collaborators pose very strong constraints on large extra dimensions and on the mass and effective coupling constant parameters of the massive graviton.)
We propose that our world is a brane folded many times inside the sub-millimeter extra dimensions [with massive graviton]. The folding produces many connected parallel branes or folds with identical microphysics - a Manyfold. Nearby matter on other folds can be detected gravitationally as dark matter since the light it emits takes a long time to reach us traveling around the fold. Hence dark matter is microphysically identical to ordinary matter; it can dissipate and clump possibly forming dark replicas of ordinary stars which are good MACHO candidates. Its dissipation may lead to far more frequent occurrence of gravitational collapse and consequently to a significant enhancement in gravitational wave signals detectable by LIGO and LISA. Sterile neutrinos find a natural home on the other folds. Since the folded brane is not a BPS state, it gives a new geometric means for supersymmetry breaking in our world. It may also offer novel approach for the resolution of the cosmological horizon problem, although it still requires additional dynamics to solve the flatness problem.
This is in essence the Big Book of TGD. Interactions happen through the back of the Book. U-matrix: regions where the brane bends and its curvature is very large, in Nima's words. They can probe objects on adjacent folds gravitationally (wormholes in TGD), since they are nearby in the bulk, at sub-millimeter distances. Hence matter on other folds will appear as dark matter to us. A low-energy observer on the braneworld can therefore experience two different minimal distances: gravitational, defined by minimizing the distances traveled by bulk particles, and electromagnetic, which corresponds to minimizing the distances along the brane, and around the tips of folds.

This U-matrix is what Matti now has solved in TGD. "I must say that for me the idea of large dimensions is so ugly that the results are not astonishing. Aether hypothesis was a beauty compared with this beast. Should we accept anthropic principle? No, electroweak symmetry breaking is left."
Dark matter is an anyonic phase of matter in TGD. And the redshift or blueshift is the interaction change, showing the light-cone direction. Does this interaction de facto change the speed of light a little??? Is blue or red not illusion only? In the GRT framework the speed of light is by definition a constant in local Minkowski coordinates. It seems very difficult to make sense of a varying speed of light, since c is a purely locally defined notion. I quote from Matti's paper:
"Particle states belonging to di fferent pages of the book can interact via classical fields and by exchanging particles, such as photons, which leak between the pages of the book. This leakage means a scaling of frequency and wavelength in such a manner that energy and momentum of photon are conserved. Direct interactions in which particles from di fferent pages appear in the same vertex of generalized Feynman diagram are impossible. This seems to be enough to explain what is known about dark matter. This picture diff ers in many respects from more conventional models of dark matter making much stronger assumptions and has far reaching implications for quantum biology, which also provides support for this view about dark matter."

The basic implication of the dark matter hierarchy is a hierarchy of macroscopic quantum coherent systems covering all length scales. The presence of this hierarchy is visible as exact discrete symmetries of field bodies, reflected at the level of visible matter as broken symmetries. In the case of the gravitational interaction these symmetries are highest and also the scale of quantum coherence is astrophysical. Together with the ruler-and-compass hypothesis and the p-adic length scale hypothesis this leads to very powerful predictions, and the p-adic length scale hypothesis might reduce to the ruler-and-compass hypothesis. High-Tc superconductivity, the nuclear string model, and the 5- and 6-fold symmetries of the sugar backbone of DNA suggest that the corresponding cyclic groups, or cyclic groups having these groups as factors, are symmetries of the dark matter part of DNA, presumably consisting of what are called free electron pairs (quasiparticles) assignable to 5- and 6-cycles, giving the vision of living matter as a quantum critical system. The notion of the (magnetic) field body plays a key role in the TGD inspired model of living matter, serving as an intentional agent controlling the behavior of the biological body. For instance, the models of EEG and of bio-control rely on this notion. The large value of the Planck constant is absolutely essential, since for a given low frequency it allows gauge boson energies above the thermal threshold. A large value of Planck constant is essential for the time mirror mechanism which is behind the models of metabolism, long term memory, and intentional action. The huge values of the gravitational Planck constant support the vision of Penrose about the special role of quantum gravitation in living matter.
In a blog post of Aug. 09 he writes: The TGD view leads to the explanation of standard model symmetries, elementary particle quantum numbers and geometrization of classical fields, the dream of Einstein. The presence of the imbedding space M4×CP2 brings in the light-like geodesics of M4, for which c is maximal and by a suitable choice of units could be taken as c=1. The time taken for light to propagate from point A to B along a space-time sheet can in the TGD framework occur via several routes (each sheet gives its own value for c), which gives a reduced speed of light for sub-CDs. The light-like geodesics of M4 serve as universal comparison standards when one measures the speed of light - something which GRT does not provide. c measured in this manner increases in cosmological scales, just the opposite of what Louise Riofrio claims. The reason is that strong gravitation makes the space-surface strongly curved, and it takes more time to travel from A to B during the early cosmology. More here.

Smolin used c=1 above. But if gravity is 'turned on' the curvature changes. "Momentum space is represented by covector fields and the metric induced on each fiber depends on the spacetime point. It can be said that relative locality is dual to gravity." And "at zero momentum local measurements are important, coming from the uncertainty principle and QM, where a particle of energy p0 can only be localized with a precision of hbar/p0."

If time isn't absolute, then the velocity of light isn't absolute either, says Smolin. We need an absolute scale (the invariant velocity c) to measure velocity, and a scale for the non-linearities; then there is also an absolute time (objective time in TGD). The additivity of momenta and energy implies the existence of an absolute spacetime too.
This then allows us to interchange distances and times, which makes possible the existence of an absolute spacetime, which replaces the notion of absolute space. Space itself remains, but as an observer dependent concept, because of the relativity of the simultaneity of distant events. When we contemplate weakening that to a non-linear combination rule for momenta in physical interactions, we need an invariant momentum scale. We have taken this scale to be mp but of course from a phenomenological point of view it should be taken as having a free value to be constrained by experiment. This, together with hbar makes it possible to interchange distances and momenta, which makes possible the mixing of spacetime coordinates with energy and momenta, so that the only invariant structure is the phase space. We saw above explicitly how non-linearity in conservation of energy and momentum directly forces translations of spacetime coordinates to depend on momenta and energies. Local spacetime remains, but as an observer dependent concept.
Smolin described a 2+1-D world. In 3+1 D the 'no gravity' field is governed by a topological field theory, he guesses. And if time bends, then light too? Both in a U-matrix = time travel IS possible? But that is highly controversial. Can these fields be constructed?

Nor is energy absolute. Does the first law of thermodynamics break down at the Planck scale? And the second law, concerning systems out of balance, too - complicated by gravity, while quantum gravity is unknown? Smolin & Magueijo 2002:
We propose a modification of special relativity in which a physical energy, which may be the Planck energy, joins the speed of light as an invariant, in spite of a complete relativity of inertial frames and agreement with Einstein’s theory at low energies. This is accomplished by a nonlinear modification of the action of the Lorentz group on momentum space, generated by adding a dilatation to each boost in such a way that the Planck energy remains invariant. The associated algebra has unmodified structure constants. We also discuss the resulting modifications of field theory and suggest a modification of the equivalence principle which determines how the new theory is embedded in general relativity.
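
Concretely, in the Magueijo-Smolin construction each boost is composed with an energy-dependent dilatation via the nonlinear map (a standard presentation of their model, written here in c = 1 units):

```latex
U(p_\mu) \;=\; \frac{p_\mu}{1 - p_0/E_{Pl}}\,,
\qquad
\frac{E^2 - \vec{p}^{\,2}}{\bigl(1 - E/E_{Pl}\bigr)^2} \;=\; m^2\,.
```

Boosts act linearly on U(p), so the deformed dispersion relation on the right is the invariant; E = EPl is a fixed point of the action, which is exactly what makes the Planck energy observer-independent.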

Fermi:
We detect for the first time a GRB prompt spectrum with a significant deviation from the Band function. This can be interpreted as two distinct spectral components, which challenge the prevailing gamma-ray emission mechanism: synchrotron - synchrotron self-Compton. The detection of a 31 GeV photon during the first second sets the highest lower limit on a GRB outflow Lorentz factor, of >1200, suggesting that the outflows powering short GRBs are at least as highly relativistic as those powering long GRBs. Even more importantly, this photon sets limits on a possible linear energy dependence of the propagation speed of photons (Lorentz-invariance violation) requiring for the first time a quantum-gravity mass scale significantly above the Planck mass.
A pregeometric phase of matter giving rise to space and time, and a geometric phase giving rise to gravity, says Hossenfelder. "High-Temperature Superconductor Spills Secret: A New Phase of Matter?" (also this). For more details see the article in Science. This phase would be present also in the superconducting phase. In TGD this phase would consist of Cooper pairs of electrons with a large value of Planck constant but associated with magnetic flux tubes of short length, so that no macroscopic supra currents would be possible.

Sanejouand: The empirical evidences in favor of the hypothesis that the speed of light decreases by a few centimeters per second each year are examined. Lunar laser ranging data are found to be consistent with this hypothesis, which also provides a straightforward explanation for the so-called Pioneer anomaly, that is, a time-dependent blue-shift observed...


Variable speed of light? Absolute speed? Absolute time?

Kea: GRB 090510 eliminates Lorentz violating theories that conclude that the speed of light depends on its energy. Observable naive Lorentz violation via frequency dependent photon speeds has been ruled out - we find that the bulk of the photons above 30 MeV arrive 258±34 ms later than those below 1 MeV, indicating some energy dependent time delay.

Riofrio: Supernova redshifts are the only evidence of cosmic acceleration. Low redshifts increase linearly with distance, showing that Space/Time expands. High redshifts increase non-linearly, leading to speculation about repulsive energies. Riofrio is the one arguing for a variable speed. "If constants like α really vary, the cosmos loses its homogeneity, and dark matter and dark energy could be different in various places." This is what is seen too.

Close coincidence in time between the onset of the "dimming of the universe" and the onset of cosmic acceleration. - Like a growing Planck constant?

One can look for a variation in the speed of light proportional to E/EPl, where E is the energy of a photon. That is, one looks for an energy-dependent speed of light (in far-away gamma-ray bursts, for instance) of the form v = c(1 ± aE/EPl), where a is a dimensionless parameter to be determined. This has not been seen, so we know that the parameter a must be less than around one (at least for the minus sign). In fact a slowing down of the photon speed has been shown by Fermi; only the reason for the slowing-down is unknown.
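
To get a feel for the numbers: a minimal estimate of the arrival-time delay this dispersion would produce (my own sketch; it ignores cosmological expansion, which a real analysis must integrate over):

```python
# Toy estimate of the time delay for v = c*(1 - a*E/E_Pl).
# Over a distance D the delay relative to a low-energy photon is
# roughly dt ~ a * (E/E_Pl) * D / c.

C    = 2.998e8        # m/s
E_PL = 1.22e19        # Planck energy in GeV
LY   = 9.461e15       # metres per light year

def grb_delay(E_GeV, D_ly, a=1.0):
    """Delay (s) of a photon of energy E_GeV over D_ly light years."""
    return a * (E_GeV / E_PL) * (D_ly * LY) / C

# A 31 GeV photon from a burst roughly 7 billion light years away
# (GRB 090510 was at about this light-travel distance):
print(grb_delay(31, 7e9))   # ~0.56 s for a = 1
```
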
Several bursts have now been documented in which the higher-energy photons (>GeV) arrive with a delay of more than 10 seconds after the onset of the burst has been observed in the low energy range (keV-MeV). While it is still unclear whether this delay of the high-energy photons is caused at emission or during propagation, more statistics and a better analysis - in particular of the delay's dependence on the distance to the source - will eventually make it possible to narrow down the possible causes and constrain models that give rise to such features.

Lubos Motl, in Fermi kills all Lorentz-violating theories, doesn't want to believe this - he calls Hossenfelder and the 'light-varying believers' less pleasant things. He doesn't like loop quantum gravity either, nor TGD.
Objections to loop quantum gravity
Lorentz violation and deformed special relativity
MAGIC: dispersion of gamma rays?
MAGIC: rational arguments vs propaganda
Aether compactification
Relativistic phobia
Testing E=mc2 for centuries
Lorentz violation makes perpetuum mobile possible
Well, for one thing, we expect the quantized spacetime to manifest itself as minute differences in the speed of light for different colors... Leslie Winkle. A serious thing :) And Hossenfelder's answer.
On Keas blog:
The May 2009 burst was shorter and the photon(s) had higher energies, which allowed the coefficient of the Lorentz violation to be measured with much better accuracy, and it's zero - a zero 100 times more accurate than needed to show that Lorentz invariance holds at the Planck scale. The hypothesis that the delay arises on the journey would predict about 100 times bigger delay than the upper bound of the delay recently seen by Fermi.
Every theory that violates Lorentz symmetry implies that the speed of light is not universal. Special relativity, including Lorentz symmetry, was derived just from two postulates - the equivalence of all inertial frames and the constancy of the speed of light. A constant, energy-independent speed of light in a Lorentz-violating theory would require an infinite amount of fine-tuning (made with the assumption of constant speed of light).

Kea: All the photons we look at here are travelling at speed c. But we didn't observe all those GRB photons on their way here. And like in any simple 2 slit experiment, maybe they took many different paths. Maybe the path weighting for the higher energy ones gives a greater probability that these photons are slowed down more (by gravity?).
You see, Lubos argues for the need of an infinite hierarchy :)

In the frame of an absolute speed of light versus a subjective speed of light, what she says maybe makes sense? But the transparency of the Universe then indicates there must be some transformations of the photons (to invisibility?). And biophotons, photoassimilation, electron excitation - all these small events must affect the speed a little. Also the massless photon itself is problematic. It must be a quasiparticle for the equations to work. Also in TGD it must have a minute mass, for the massivation from the Higgs mechanism.

In what sense c could be changing in solar system

In General Relativity, the ideas of Lorentz invariance and absolute time are in conflict, Matti points out. He says:
For Lorentz invariance to hold in this scenario, it has to be exactly the same for all observers. Thus, this idea either breaks Lorentz invariance by 1) assuming absolute time for all observers or 2) resulting in the speed of light being different for observers in different frames. Many-sheeted space-time allows variation of light-velocity in this sense without breaking of Lorentz invariance. In cosmic scales the light-velocity measured in this manner increases slowly (the time taken to travel along the curved space-time surface decreases, since it becomes less curved as it flattens, so that c approaches its maximal value). The solar system space-time sheet is predicted not to participate in the expansion except possibly by rapid phase transitions, and this is known to be true. This implies an apparent reduction of c, since the standard to which one compares increases slowly.

The Planck ratio.
The factor by which the Planck mass is bigger than particle masses is about the same factor by which atoms are bigger than the Planck length (wouldn't it be weird if the Bohr radius were about as small as the Planck length?), and if the Planck mass weren't a hell of a lot bigger, then interparticle gravitation would be an issue in the physics of the atom (wouldn't that be weird?). Is this where both quantum and classical physics are born? TGD:
Non-commutative physics would be interpreted in terms of a finite measurement resolution rather than as something emerging below the Planck length scale. An important implication is that a finite measurement sequence can never completely reduce quantum entanglement, so that the entire universe would necessarily be an organic whole. Topologically condensed space-time sheets could be seen as correlates for sub-factors which correspond to degrees of freedom below measurement resolution.

Since the gravitational Planck constant is proportional to the product of the gravitational masses of the interacting systems, it must be assigned to the field body of the two systems and characterizes the interaction between the systems rather than the systems themselves. This observation applies quite generally, and each field body of the system (em, weak, color, gravitational) is characterized by its own Planck constant. A further fascinating possibility is that the observed indications for Bohr-orbit quantization of planetary orbits could have an interpretation in terms of a gigantic Planck constant for the underlying dark matter, so that macroscopic and macro-temporal quantum coherence would be possible in astrophysical length scales, manifesting itself in many manners.
The magnetic body too is a field body.
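
The gravitational Planck constant in Pitkänen's work is hbar_gr = GMm/v0, where v0 is a velocity parameter (the value v0 ≈ 2^-11 in units of c is the one he commonly quotes; treat the numbers below as an illustrative sketch of my own):

```python
# hbar_gr = G*M*m / v0, assigned to the field body connecting masses M, m.

G       = 6.674e-11      # m^3 kg^-1 s^-2
C       = 2.998e8        # m/s
HBAR    = 1.055e-34      # J s
M_SUN   = 1.989e30       # kg
M_EARTH = 5.972e24       # kg

v0 = C / 2**11           # ~146 km/s, the commonly quoted TGD value

hbar_gr = G * M_SUN * M_EARTH / v0
print(f"hbar_gr = {hbar_gr:.3e} J s, hbar_gr/hbar = {hbar_gr/HBAR:.3e}")
# The ratio is of order 1e74: truly 'gigantic', which is what makes quantum
# coherence in astrophysical length scales even conceivable in this picture.
```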

Smolin:
Last but not least, momentum spaces of constant curvature find their natural application as a model of Doubly (or Deformed) Special Relativity (DSR), whose second observer-independent scale is naturally associated with the curvature of the momentum space. These formulations were closely related to non-commutative geometry. A second formulation of DSR expressed the same idea as an energy dependence of spacetime. For a long time it has been suspected that these were different ways of formulating the same theories; the developments of this paper show how an energy-dependent metric and non-commutative spacetime coordinates are different ways of expressing a deeper idea, which is relative locality.

And the quintessence, wormholes, black holes...
The Planck mass divided by the Planck volume (mp/lp^3) would be the greatest possible meaningful density of mass/energy, which might be the same as the density of the universe one Planck time after the Big Bang. The Planck mass is the largest possible mass that can fit in the smallest meaningful volume of space, which would be about equal to the Planck length cubed (superposition?); the Planck scale is not just about a limit on smallness, but largeness as well.

A Planck-mass black hole is not a tree-level classical particle such as an electron or a quark, but a quantum entity resulting from the many-worlds quantum sum over histories at a single point in spacetime. The Planck mass is the mass required to form a black hole with an event horizon of a Planck length - which basically means nothing smaller than this can collapse into a black hole. This is a good thing if you consider the incredible density of an atomic nucleus. Why is there a mass limit in some natural way? If you imagine gravity as the curvature of spacetime, then the Planck mass is where spacetime gets so warped it gets closed off. But then QM uncertainty/Compton wavelength has to be invoked as to why there is not then a complete collapse of spacetime to a singularity. Linked to gravitons?

For any given length, there is only one possible mass that will give a black hole whose radius is exactly that length (see the formula for Schwarzschild radius as a function of mass here). The Planck mass represents the minimum for the size of a black hole, since anything smaller would be a black hole smaller than a Planck length, which probably wouldn't even make sense according to quantum gravity. But the Planck mass is probably also the maximum mass that can be packed into a physically meaningful unit of space - if you try to add mass, you'll just get a black hole with a radius larger than the Planck length, whose density will be lower.
Talk of the Planck mass is misleading here. It is the Planck mass density that is large and maximal. The Planck mass density equates to about 10^96 kg per cubic metre.
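
Checking these numbers (a minimal sketch of my own, using the standard Schwarzschild formula r_s = 2Gm/c^2):

```python
import math

G, C, HBAR = 6.674e-11, 2.998e8, 1.055e-34   # SI units

l_p = math.sqrt(HBAR * G / C**3)   # Planck length, ~1.6e-35 m
m_p = math.sqrt(HBAR * C / G)      # Planck mass,   ~2.2e-8 kg

def schwarzschild_radius(m):
    """r_s = 2*G*m/c^2: the unique mass giving a horizon of that radius."""
    return 2 * G * m / C**2

# A Planck-mass black hole has a horizon of order the Planck length:
print(schwarzschild_radius(m_p) / l_p)    # exactly 2.0

# The Planck density: one Planck mass per Planck volume.
print(m_p / l_p**3)                       # ~5e96 kg/m^3
```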

The value for mp given in the Particle Data Group's 1998 review is 1.221 × 10^19 GeV. mp is enormous compared with ordinary particle masses.

And the Planck temperature [the highest possible temperature] is that of radiation with a wavelength of a Planck length. Not coincidentally, this happens to be the same as the temperature of the universe one Planck time after the big bang. A time-temperature ratio?
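
A back-of-envelope check of that statement with Wien's displacement law (my own arithmetic): a blackbody whose peak wavelength is the Planck length has temperature

```latex
T \;\approx\; \frac{b}{l_p} \;=\; \frac{2.9\times10^{-3}\ \mathrm{m\,K}}{1.6\times10^{-35}\ \mathrm{m}} \;\approx\; 1.8\times10^{32}\ \mathrm{K},
\qquad
T_{Pl} \;=\; \frac{1}{k_B}\sqrt{\frac{\hbar c^5}{G}} \;\approx\; 1.4\times10^{32}\ \mathrm{K}.
```

The two agree to within a factor of order one, which is as good as such dimensional estimates get.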

To put this in terms of an ancient dichotomy, the Planck scale is the smallest scale for "form" - it is the smallest coherent unit of spacetime. And also the largest scale for "substance" - the density of energy/mass/temperature that can fit into a unit of spacetime.
Gravity = spacetime = form. Mass/energy/temperature as such do not feature in this explanatory chain. The alternative formulation yields "substance" - mass = wavelength = substance.

Physics Forum: Think of the Planck scale as a complex package that gives you both upper and lower limits. Then as the Universe expands, both the form and the substance "fall" towards their other extreme. So expansion increases spacetime - the form is heading towards maximum largeness. Yet at the same time the substance, the energy density of the Universe, is falling towards its lowest possible level, its lowest possible temperature. The "unpacking" of the Planck scale is thus a move from a hot point to a cold void. What was small at the beginning grows large, and what was large grows small - so there is a conservation of scale going on.

You get the Planck energy out of lightspeed considerations. The shorter the wavelength, the higher the energy. So if you have a world as small as the Planck scale, there is only room for a single crisp oscillation of about the Planck length. This would be the reason for a crisp upper bound on energy/mass/temperature. If anything "substantial" is happening at the Planck scale, it would have to start at the shortest possible wavelength, and thus the highest possible energy level. Stress-tension, zero-point or vacuum energy? Consciousness?
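
In formulas (my worked line, with the reduced wavelength of that single oscillation set to the Planck length):

```latex
E \;=\; \frac{\hbar c}{l_p} \;=\; \hbar c\sqrt{\frac{c^3}{\hbar G}} \;=\; \sqrt{\frac{\hbar c^5}{G}} \;=\; E_{Pl} \;\approx\; 1.22\times10^{19}\ \mathrm{GeV}.
```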

TGD:
The wormhole contact connects two space-time sheets with induced metric having Minkowski signature. The wormhole contact itself has an Euclidian metric signature, so that there are two wormhole throats which are light-like 3-surfaces and would carry fermion and anti-fermion number. In this case a delicate question is whether the space-time sheets connected by wormhole contacts have opposite time orientations or not. If this is the case, the two fermions would correspond to positive and negative energy particles. First only Higgs was considered as a wormhole contact, but there is no reason why this identification should not apply also to gauge bosons (certainly not to gravitons). p. 16
Graviton-graviton pairs might emerge in higher orders. Gravitons are stable if the throats of wormhole contacts carry non-vanishing gauge fluxes, so that the throats of wormhole contacts are connected by flux tubes carrying the gauge flux. The mechanism producing gravitons would be the splitting of partonic 2-surfaces via the basic vertex. A connection with the string picture emerges, with the counterpart of the string identified as the flux tube connecting the wormhole throats. The gravitational constant would relate directly to the value of the string tension. p. 17

Winterberg writes in Planck Mass Rotons as Cold Dark Matter and Quintessence, 2002 (under the label Analog Models of General Relativity):
According to the Planck aether hypothesis, the vacuum of space is a superfluid made up of Planck mass particles, with the particles of the standard model explained as quasiparticle excitations of this superfluid. Astrophysical data suggests that ≈70% of the vacuum energy, called quintessence, is a negative pressure medium, with ≈26% cold dark matter and the remaining ≈4% baryonic matter and radiation. This division in parts is about the same as for rotons in superfluid helium, in terms of the Debye energy with a ≈70% energy gap and ≈25% kinetic energy. Having the structure of small vortices, the rotons act like a caviton fluid with a negative pressure. Replacing the Debye energy with the Planck energy, it is conjectured that cold dark matter and quintessence are Planck mass rotons with an energy below the Planck energy.

And an article 'Superconductivity from nowhere' in Physicsworld:
Unlike previously known superconductivity, it would survive at very high temperatures, perhaps billions of degrees. It would also exist alongside strong magnetic fields and, perhaps strangest of all, it wouldn't need a material to exist – just a vacuum. An up quark and a down antiquark can bind to form a positively charged rho meson, but the meson is normally so unstable that it decays. Now they think that in a strong magnetic field the quarks would be forced to move only along the field lines – and this would make the rho mesons far more stable. In addition, the rho meson's own spin would interact with the external magnetic field, lowering the particle's effective mass to zero so that it can move freely, as in a superconductor. The external magnetic field required for this superconductivity must be at least 10^16 T. Such field strengths are reached only in colliders.
Vacuum superconductivity might not always need particle accelerators, however. The early universe might have had sufficiently strong magnetic fields, and the subsequent super-currents might have seeded the mysterious large-scale magnetic fields seen across the universe today. "It sounds like a crazy idea, but what if it is true?"
Zero-point energy is adiabatic and expanding, according to the SM. Maybe this is an explanation? It would fit nicely into this theory. Frank Wilczek also talks of a 'grid' (Wilczek's grid), a vibrant energy field, in "The Lightness of Being".
"The Grid fills space, and is full of spontaneous activity. In some ways it resembles the old idea of “ether”. But the Grid is highly evolved ether, ether on steroids if you like, with many new features. We live inside a medium and that we're all connected. A low-energy supersymmetry, the color superconducting phases of quark matter. In modern physics, energy and space are much more fundamental than mass. That's part of the message. The traditional idea that there are stable, static bodies that are massive and hard to push around has been replaced by a much more fluid concept. Fields are more basic than particles."
There's much more to the world than what our sensory apparatus has evolved to react to. We have more than five senses?
Interactions in virtually empty space give rise to the substance of subatomic particles and complex molecules, and mass. The Grid is permeated with a not-yet-understood property that "slows down" some of the interactions in the field, just as electrons are slowed down in a superconducting medium. In the medium known as the Grid, we perceive that slowed-down quality as mass. It's usually called the Higgs field. We don't know what it's made of. We know it's not made of any of the known forms of matter; they don't have the right properties. So the simplest possibility, logically, is that it's made out of one new thing, and those would be Higgs particles. But I think you get a nicer theory by embedding it in a larger framework, where it's made out of several things. The whole barrier between light and matter has fallen. The underlying reality is much closer to the traditional concept of light than the traditional concept of matter.
This revelation about matter is not only satisfying, but it also opens new doors, says Wilczek.
Wilczek has helped to reveal and develop axions, anyons, asymptotic freedom, the color superconducting phases of quark matter, and other aspects of quantum field theory. He has worked on an unusually wide range of topics, ranging across condensed matter physics, astrophysics, and particle physics.
Axions are very light, very weakly interacting particles, and should contribute much of the dark matter. Axions have been lurking unrecognized on surfaces of bismuth-tin alloys and other materials. To be more precise: the equations that arise in axion physics are the same as those that describe the electromagnetic behaviour of a recently discovered class of materials known, collectively, as topological insulators. The axion field inside topological insulators is an emergent — and subtle — property of collections of electrons that is connected to their spin–orbit coupling.
And “topologically ordered phases” do not exhibit any symmetry breaking. Ordered phases of matter such as a superfluid or a ferromagnet are usually associated with the breaking of a symmetry and are characterized by order parameters. The classic experimental probe of topological quantum numbers is magneto-transport, where measurement of the quantized Hall conductivity σxy = ne²/h (where e is the electric charge and h is Planck's constant) reveals the value of the topological number n that characterizes the quantum Hall effect state.

At last:
To Lubos: remember that I am no physicist. I just cite and recombine. With over 100 theories this is a mess, I suppose not only to me.
Are these people on the right track? At least they cannot be dismissed.


References.
Nima Arkani-Hamed, 2010: Space-Time, Quantum Mechanics and Scattering Amplitudes.
- 'Duality for the S-matrix'.
Nima Arkani-Hamed et al. 1999: 'Manyfold Universe'. JHEP 0012 (2000) 010. DOI 10.1088/1126-6708/2000/12/010
Nima Arkani-Hamed & co. 1998: The hierarchy problem and new dimensions at a millimeter, at arxive.
Fermi GBM/LAT Collaborations, 2009: Testing Einstein's special relativity with Fermi's short hard gamma-ray burst GRB090510.
Shou-Cheng Zhang and colleagues (X.-L. Qi et al. Phys. Rev. B78, 195424; 2008)
Lee Smolin 2011: The principle of relative locality.
Frank Wilczek 2005: In search of symmetry lost. Nature 433, 239-247 (20 January 2005). Nature Publishing Group.
- Wilczek on anyons and superconductivity, 1991.
Winterberg 2002: Planck Mass Rotons as Cold Dark Matter and Quintessence.
Superconductivity from nowhere. Physicsworld, Mar 29, 2011.
M.Pitkänen, 2010: General View About Physics in Many-Sheeted Space-Time: Part II. 73 p.
- Macroscopic quantum phenomena and CP2 geometry
- TGD and Astrophysics
- Knots and TGD. 2011
S. Hossenfelder, Lee Smolin, 2009: Phenomenological quantum gravity.
D. Hsieh et al. 2009: Observation of topologically protected Dirac spin-textures and π Berry's phase in pure antimony (Sb) and topological insulator BiSb. Science 323, 919 (2009). http://dx.doi.org/10.1126/science.1167733
LHC bounds on large extra dimensions by A. Strumia and collaborators 2011

and:
Einstein 1905: “Does the Inertia of a Body Depend on Its Energy Content?” m = E/c^2
S. Hossenfelder, Bounds on an energy-dependent and observer-independent speed of light from violations of locality, Phys. Rev. Lett. 104 (2010) 140402. [arXiv:1004.0418 [hep-ph]].
S. Majid, Meaning of noncommutative geometry and the Planck-scale quantum group, Lect. Notes Phys. 541 (2000) 227 [arXiv:hep-th/0006166].

11 comments:

  1. http://theoreticalatlas.wordpress.com/2011/03/31/relativity-of-localization/
    Various additional physical assumptions – like the momentum-space “duals” of the equivalence principle (that the combination of momenta works the same way for all kinds of matter regardless of charge), or the strong equivalence principle (that inertial mass and rest mass energy per the relation E = mc^2 are the same) and so forth can narrow down the geometry of this metric and connection. Typically we’ll find that it needs to be Lorentzian. With strong enough symmetry assumptions, it must be flat, so that momentum space is a vector space after all – but even with fairly strong assumptions, as with general relativity, there’s still room for this “empty space” to have some intrinsic curvature, in the form of a momentum-space “dual cosmological constant”, which can be positive (so momentum space is closed like a sphere), zero (the vector space case we usually assume) or negative (so momentum space is hyperbolic).

    This geometrization of what had been algebraic is somewhat analogous to what happened with velocities (i.e. vectors in spacetime) when the theory of special relativity came along. Insisting that the “invariant” scale c be the same in every reference system meant that the addition of velocities ceased to be linear. At least, it did if you assume that adding velocities has an interpretation along the lines of: “first, from rest, add velocity v to your motion; then, from that reference frame, add velocity w”. While adding spacetime vectors still worked the same way, one had to rephrase this rule if we think of adding velocities as observed within a given reference frame – this became v \oplus w = (v + w)/(1 + vw) (scaling so c = 1 and assuming the velocities are in the same direction). When velocities are small relative to c, this looks roughly like linear addition. Geometrizing the algebra of momentum space is thought of a little differently, but similar things can be said: we think operationally in terms of combining momenta by some process. First transfer (group-valued) momentum p to a particle, then momentum q – the connection on momentum space tells us how to translate these momenta into the “reference frame” of a new observer with momentum shifted relative to the starting point. Here again, the special momentum scale m_p (which is also a mass scale since a momentum has a corresponding kinetic energy) is a “deformation” parameter – for momenta that are small compared to this scale, things seem to work linearly as usual.

    There’s some discussion in the paper which relates this to DSR (either “doubly” or “deformed” special relativity), which is another postulated limit of quantum gravity, a variation of SR with both a special velocity and a special mass/momentum scale, to consider “what SR looks like near the Planck scale”, which treats spacetime as a noncommutative space, and generalizes the Lorentz group to a Hopf algebra which is a deformation of it. In DSR, the noncommutativity of “position space” is directly related to curvature of momentum space. In the “relative locality” view, we accept a classical phase space, but not a classical spacetime within it.

    We should understand this scale as telling us where “quantum gravity effects” should start to become visible in particle interactions. This is a fairly large scale for subatomic particles. The Planck mass as usually given is about 21 micrograms: small for normal purposes, about the size of a small sand grain, but very large for subatomic particles. Converting to momentum units with c, this is about 6 kg m/s: on the order of the momentum of a kicked soccer ball or so. For a subatomic particle this is a lot.

  2. This scale does raise a question for many people who first hear this argument, though – that quantum gravity effects should become apparent around the Planck mass/momentum scale, since macro-objects like the aforementioned soccer ball still seem to have linearly-additive momenta. Laurent explained the problem with this intuition. For interactions of big, extended, but composite objects like soccer balls, one has to calculate not just one interaction, but all the various interactions of their parts, so the “effective” mass scale where the deformation would be seen becomes N m_p where N is the number of particles in the soccer ball. Roughly, the point is that a soccer ball is not a large “thing” for these purposes, but a large conglomeration of small “things”, whose interactions are “fundamental”. The “effective” mass scale tells us how we would have to alter the physical constants to be able to treat it as a “thing”.

    In “spacetime”, a spaceship travelling a large loop at high velocity will arrive where it started having experienced less time than an observer who remained there (because of the Lorentzian metric) – and a dual phenomenon in momentum space says that particles travelling through loops (also in momentum space) should arrive displaced in space because of the relativity of localization. This could be observed in particle accelerators where particles make several transits of a loop, since the effect is cumulative. Another effect could be seen in astronomical observations: if an observer is observing some distant object via photons of different wavelengths (hence momenta), she might “localize” the object differently – that is, the two photons travel at “the same speed” the whole way, but arrive at different times because the observer will interpret the object as being at two different distances for the two photons.

    This last one is rather weird, and I had to ask how one would distinguish this effect from a variable speed of light (predicted by certain other ideas about quantum gravity).

    Again, this is no more bizarre (mathematically) than the fact that distant, relatively moving, observers in special relativity might disagree about simultaneity, whether two events happened at the same time. They have their own coordinates on spacetime, and transferring between them mixes space coordinates and time coordinates, so they’ll disagree whether the time-coordinate values of two events are the same. Similarly, in this phase-space picture, two different observers each have a coordinate system for splitting phase space into “spacetime” and “energy-momentum” coordinates, but switching between them may mix these two pieces. Thus, the two observers will disagree about whether the spacetime-coordinate values for the different interacting particles are the same. And so, one observer says the interaction is “local in spacetime”, and the other says it’s not. The point is that it’s local for the particles themselves (thinking of them as observers). All that’s going on here is the not-very-astonishing fact that in the conventional picture, we have no problem with interactions being nonlocal in momentum space (particles with very different momenta can interact as long as they collide with each other)… combined with the inability to globally and invariantly distinguish position and momentum coordinates.

  3. That was only an excerpt, but it clearly did not dismiss this idea.

  4. http://www.sciencedaily.com/releases/2011/04/110405161345.htm
    Time-Delayed Jets Around Young Star

    This would confirm the varying c, AND ZEO.

    "Now we know that in at least one case, there appears to be a delay, which tells us that some sort of communication may be going on between the jets that takes time to occur."remained hidden behind a dark cloud. Spitzer's sensitive infrared vision was able to pierce this cloud, revealing the obscured jet in greater detail.

    the newfound jet is perfectly symmetrical to its twin, with identical knots of ejected material.

    This symmetry turned out to be key to the discovery of the jets' time delay. By measuring the exact distances from the knots to the star, the astronomy team was able to figure out that, for every knot of material punched out by one jet, a similar knot is shot out in the opposite direction 4.5 years later. For more information about Spitzer, visit http://spitzer.caltech.edu/ and http://www.nasa.gov/spitzer . materials provided by NASA/Jet Propulsion Laboratory. http://www.jpl.nasa.gov/

    Jets are an active phase in a young star's life. A star begins as a collapsing, roundish cloud of gas and dust. By ejecting supersonic jets of gas, the cloud slows down its spinning. As material falls onto the growing star, it develops a surrounding disk of swirling material and twin jets that shoot off from above and below the disk, like a spinning top.

    Once the star ignites and shines with starlight, the jets will die off and the disk will thin out.

  5. http://www.sciencedaily.com/releases/2011/04/110405084252.htm
    Jiří Tomkovič, Michael Schreiber, Joachim Welte, Martin Kiffner, Jörg Schmiedmayer, Markus K. Oberthaler. Single spontaneous photon as a coherent beamsplitter for an atomic matter-wave. Nature Physics, 2011; DOI: 10.1038/nphys1961

    Standing in front of a mirror, we can easily tell ourselves apart from our mirror image. The mirror does not affect our motion in any way. For quantum particles, this is much more complicated. In a spectacular experiment at the University of Heidelberg, a group of physicists, together with colleagues at TU Munich and TU Vienna, extended a 'thought experiment' by Einstein and managed to blur the distinction between a particle and its mirror image.

    When an atom emits light (i.e. a photon) into a particular direction, it recoils in the opposite direction. If the photon is measured, the motion of the atom is known too. The scientists placed atoms very closely to a mirror. In this case, there are two possible paths for any photon travelling to the observer: it could have been emitted directly into the direction of the observer, or it could have travelled into the opposite direction and then been reflected in the mirror. If there is no way of distinguishing between these two scenarios, the motion of the atom is not determined, the atom moves in a superposition of both paths.

  6. http://riofriospacetime.blogspot.com/2011/04/end-of-beyond-einstein.html

    The local conditions of Special Relativity, which do not allow for gravity, can be linked with the curved Space/Time of General Relativity. The solution can explain much about the Universe--its size, expansion rate and whether that expansion will stop or reverse. A bit of math can explain the non-linear increase of supernova redshifts, the cosmic horizon problem, the "flatness" problem, even the 4.507034% proportion of baryons. Long before the expensive Space missions would have launched, we may already be Beyond Einstein.

  7. http://online.kitp.ucsb.edu/online/qcdscat11/arkanihamed/

    Gravity + QM = spacetime is doomed!
    Locality+Unitarity!

    Look at this!

  8. Part II of Nima's talk.
    http://online.kitp.ucsb.edu/online/qcdscat11/arkanihamed2/

  9. Matti has written two posts relating to this today.

    http://matpitka.blogspot.com/2011/04/how-arrow-of-geometric-time-is-selected.html

    http://matpitka.blogspot.com/2011/04/objection-against-zero-energy-ontology.html

    See also http://en.wikipedia.org/wiki/Flipped_SU%285%29

  10. http://www.kurzweilai.net/big-bang-or-big-chill-the-quantum-graphity-theory

  11. http://www.newscientist.com/article/mg21128241.700-beyond-spacetime-welcome-to-phase-space.html

    Smolin's relevant manuscripts are:

    G. Amelino-Camelia, L. Freidel, J. Kowalski-Glikman, and L Smolin, The principle of relative locality. http://arxiv.org/abs/1101.0931

    L. Freidel and L. Smolin, Gamma ray burst delay times probe the geometry of momentum space. http://arxiv.org/abs/1103.5626

    ... So what is phase space? It is a curious eight-dimensional world that merges our familiar four dimensions of space and time and a four-dimensional world called momentum space.
