Wednesday, 24 November 2010

Grandma


I must send some wonderful pictures of me and my grandchildren, Cina and Mathias, 2 years and 5 months old respectively. Even the little boy understands important things, like what he would wish for Christmas from Father Christmas :) A knee full of warm children, wonderful. I remember my own motherhood with four children. I was also reminded of it when I was babysitting on Saturday evening and the little one said I did not cook his soup as his mother did, nor the milk as mom does. He was really angry at me :) Phew, at last he slept, and then mom and dad came home, just as Cina and I had sat down on the floor with LEGO, constructing fantastic castles. NO, no, she said, go away....

It is not so easy to be a mom or dad. Or big sister. They came home so fast... They thought that little Mathias was not so pleased...

He is a very intelligent boy for 5 months. He smiles as if he wants to make a comment when we talk, and his eyes look so intelligent.


And he laughs at his sister's troublesome behaviour, then he tries the same, but his body doesn't do what he wants it to do. He is so frustrated. And the speech doesn't succeed either, ouch.

Monday, 22 November 2010

Evolution is a process.

On the Physics arXiv Blog they talked about biology and evolution, the possibility that life as it has evolved on Earth is but a local minimum in a vast landscape of evolutionary possibilities. If that's the case, biologists are studying a pitifully small fraction of something bigger. Much bigger. Then the character of Life could also look very different from what we see on Earth? Already here we have such a diversity of life forms. Take nanobacteria, for instance. And yet many are undiscovered, also in the human body.

Usually biological evolution is taken to stop at the emergence of Life, where cosmic evolution takes over, but there is no essential difference between the two concepts. There was a vast evolution also before Life was created on Earth. The RNA World is much about that pre-biological evolution.

As I see it, Life is a phase of intermediate matter, where the quantum world very much rules. The Schrödinger cat is both dead and alive. It is not about chemistry, because chemistry too is ruled by the same quantum world. Rather, you can see ordinary matter and living matter as two different paths of realisation. For living matter the governing rule is the capacity to react to the surroundings and change the surroundings in its own favor. Ordinary matter is bad at that. Essentially Life is about consciousness and qualia, which require self-awareness and memory.
Today, we get an important insight into this state of affairs thanks to a fascinating paper by Nigel Goldenfeld and Carl Woese at the University of Illinois. Life Is Physics: Evolution As A Collective Phenomenon Far From Equilibrium. Goldenfeld is a physicist by training while Woese, also a physicist, is one of the great revolutionary figures in biology. In the 1970s, he defined a new kingdom of life, the Archae, and developed a theory of the origin of life called the RNA world hypothesis, which has gained much fame or notoriety depending on your viewpoint.
They suggest that biologists need to think about their field in a radically new way: as a branch of condensed matter physics. Their basic conjecture is that life is an emergent phenomenon that occurs in systems that are far out of equilibrium. Goldenfeld and Woese say that biologists' closed way of thinking on this topic is embodied by the phrase: all life is chemistry. Nothing could be further from the truth.

And they take the example of superconductivity. No wonder.

The real explanation is much more interesting and profound. It turns out that many of the problems of superconductivity are explained by a theory which describes the relationship between electromagnetic fields and long range order. When the symmetry in this relationship breaks down, the result is superconductivity. And it doesn't just happen in materials on Earth. This kind of symmetry breaking emerges in other exotic places such as the cores of quark stars. Superconductivity is an emergent phenomenon and has little to do with the behaviour of atoms.

Life is like superconductivity. It is an emergent phenomenon and we need to understand the fundamental laws of physics that govern its behaviour. Consequently, only a discipline akin to physics can reveal such laws and biology as it is practised today does not fall into this category.

In the paper they point to some crucial facts:
- we have too few genes to encode all that is needed; the brain alone would need them all. The collapse of the doctrine of one gene for one protein, with one direction of causal flow from basic codes. The key to complexity is not more genes, but more combinations and interactions generated by fewer units of code, and many of these interactions (as emergent properties) must be explained at the level of their appearance, for they cannot be predicted from the separate underlying parts alone. Genes need to be treated collectively.
- the majority of signaling is microbial, also in the body, with coordinated division of labor, cellular differentiation and cooperative defense
- evolution was equated with 'natural selection' and relegated to a peripheral role during the development of molecular biology and high-tech biophysics. Collective phenomena are important also in microbiology etc.
- biology may extend the frontier of non-equilibrium physics, revealing principles of self-organization that seem absent in purely physical processes such as pattern formation
- sexual reproduction, fitness, growth rates

I could add to that list a lot.

What about death then? What brings the flow to cease?

The main problem with Life is its coherence and synchrony, and that is not "far from equilibrium" - on the contrary, it is very much about equilibrium, but as a "bubble in bubbles", a phase transition. Energetically life is an open system, though, and requires an energy flow going through the ecosystem. Maybe, if you take entropy as the leading star; but Life is about negentropy. A maximization of negentropy (or a minimal decoherence) that is collected from the environment by the superconductive nerves, the perception organs, and at the far end the carbon networks. It is no random chance that Life is based on carbon. This year the Nobel prize in physics went to graphene, a very interesting material in the science of what Life is, although that connection is incidental.
To the biologist interested in practical issues, we ask that you do not dismiss the seemingly useless and naïve issues that we necessarily raise. On the one hand, a fundamental understanding of evolution may not seem to offer immediate benefits in terms of finding the next wonder drug; on the other hand, the lack of appreciation for the rapidity and pervasiveness of evolution has, within a lifetime, destroyed the effectiveness of numerous antibiotics, and probably is responsible for the limited success of the treatment of cancer. The biomedical-industrial complex cannot afford to ignore the need to create a fundamental science of biology.
That is true. Physics texts about biology often seem so naive, but they are looking for the similarities, the first principles.
The lack of structure in the way that biology is traditionally presented reflects the field’s unavoidable focus on a single sample path; however, the underlying evolutionary process itself is surely one with deep mathematical structure, capable of expression and elucidation in unifying terms based on emergent physical laws. This is a true frontier of physics, but one that will require a great deal of what has been termed (in another context of non-equilibrium physics) “open-minded spadework” to unearth.
Superconductivity is best understood as arising from the breaking of the global U(1) gauge symmetry in the effective field theory that describes the interaction between off-diagonal long-range order and the electromagnetic field. It is not about chemistry. There is nothing fundamental about the atoms or molecules. Examples include topological insulators; others of major significance include the Aharonov-Bohm effect, the quantum spin Hall effect, localization, and the renaissance in atomic, molecular and optical physics provided by the experimental realization of atomic Bose-Einstein condensates.

It certainly is not a microscopic theory, because it lacks a biochemical level of description, and it is certainly an effective theory, valid only when there is a separation of scales between ecosystem dynamics and gene mutation dynamics.
First, the very existence of the phenomenon of life needs to be understood.
Second, the realization or instantiation of it, on Earth, for example, needs to be understood. For the most part, it is fair to say that the discipline of biology has neglected the first condition, and in pursuit of the second, has confused understanding of the realization with understanding of the phenomenon. This has had a number of unfortunate consequences, which arguably have hindered both the conceptual development of biology and the proper application of foundational understanding to societal applications.
Perhaps the primary shortcoming of the biological enterprise is the manifest failure to account for the phenomenon of the existence of life. Without doubt, this failure reflects not only on biologists, but also on physicists. "Biology is a big anomaly", echoes in my brain. Said by Matti Pitkänen.

At last something begins to happen.

Friday, 19 November 2010

Not from dark matter?

The Galactic centre is one of the most dynamic places in our Galaxy. It is thought to be home to a gigantic black hole, called Sagittarius A*.
January 12th, 2008: Antimatter Cloud Traced To Binary Stars. Four years of observations from the European Space Agency's Integral (INTErnational Gamma-Ray Astrophysics Laboratory) satellite may have cleared up one of the most vexing mysteries in our Milky Way: the origin of a giant cloud of antimatter surrounding the galactic center. What? I look at NASA, 01.09.08.


Integral mapped the glow of 511 keV gamma rays from electron-positron annihilation. The map shows the whole sky, with the galactic center in the middle. The emission extends to the right.
Look at the picture of M87 in the earlier post. They show the same kind of nucleus. The cloud shows up because of the gamma rays it emits when individual particles of antimatter, in this case positrons, encounter electrons, their normal matter counterpart, and annihilate one another.

Integral found that the cloud extends farther on the western side of the galactic center than it does on the eastern side. This imbalance matches the distribution of a population of binary star systems that contain black holes or neutron stars, strongly suggesting that these binaries are churning out at least half of the antimatter, and perhaps all of it. The detection of an asymmetry: the cloud is lopsided, with twice as much emission on one side of the galactic centre as on the other. Such a distribution is highly unusual because gas in the inner region of the galaxy is relatively evenly distributed.

The cloud itself is roughly 10,000 light-years across, and generates the energy of about 10,000 Suns. The cloud shines brightly in gamma rays due to a reaction governed by Einstein’s famous equation E=mc^2. Negatively charged subatomic particles known as electrons collide with their antimatter counterparts, positively charged positrons. When electrons and positrons meet, they can annihilate one another and convert all of their mass into gamma rays with energies of 511,000 electron-volts (511 keV). Scientists don’t understand how low-mass X-ray binaries could produce enough positrons to explain the cloud, and they also don’t know how they escape from these systems. "We expected something unexpected, but we did not expect this," says Skinner. The antimatter is probably produced in a region near the neutron stars and black holes, where powerful magnetic fields launch jets of particles that rip through space at near-light speed.
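A quick check of that number (my own arithmetic, not from the article): 511 keV is just the rest-mass energy of one electron, so each electron-positron pair annihilating at rest yields two 511 keV photons:

E = m_e c^2 = 9.11×10^-31 kg × (3.00×10^8 m/s)^2 ≈ 8.2×10^-14 J ≈ 511 keV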

The antimatter cloud was discovered in the 1970s. What? Why, then, do they not talk about it, but say there is no dark matter? Incredible.

Scientists have proposed a wide range of explanations for the origin of the antimatter, which is exceedingly rare in the cosmos. For years, many theories centered around radioactive elements produced in supernovae, prodigious stellar explosions. Others suggested that the positrons come from neutron stars, novae, or colliding stellar winds. Still others proposed that particles of dark matter were annihilating one another, or with atomic matter, producing electrons and positrons that annihilate into 511-keV gamma rays. But other scientists remained skeptical, noting that the dark matter particles had to be significantly lighter than most theories predicted.

Well, we have just seen that the Higgs is probably much lighter than expected. If it even exists at all?

Integral found that certain types of binary systems near the galactic center are also skewed to the west. These systems are known as hard low-mass X-ray binaries, since they light up in high-energy (hard) X-rays as gas from a low-mass star spirals into a companion black hole or neutron star. The fact that the two "pictures" of antimatter and hard low-mass X-ray binaries line up strongly suggests the binaries are producing significant amounts of positrons. NASA's Gamma-ray Large Area Space Telescope (GLAST), scheduled to launch in 2008, may help clarify how objects such as black holes launch particle jets. That paper we have just seen.

The unexpectedly lopsided shape is a new clue to the origin of the antimatter. The observations have significantly decreased the chances that the antimatter is coming from the annihilation or decay of astronomical dark matter. Some astronomers have suggested that exploding stars could produce the positrons. This is because radioactive nuclear elements are formed in the giant outbursts of energy, and some of these decay by releasing positrons. However, it is unclear whether these positrons can escape from the stellar debris in sufficient quantity to explain the size of the observed cloud.

Not from dark matter?
Dark matter is thought to exist throughout the Universe – undetectable matter that differs from the normal material that makes up stars, planets and us. It is also believed to be present within and around the Milky Way, in the form of a halo.

The recent study has found that the ‘positrons’ fuelling the radiation are not produced from dark matter but from an entirely different, and much less mysterious, source: massive stars explode and leave behind radioactive elements that decay into lighter particles, including positrons.

The reasoning behind the original hypothesis was that positrons, being electrically charged, would be affected by magnetic fields and thus would not be able to travel far. As the radiation was observed in places that did not match the known distribution of stars, dark matter was invoked as an alternative for the origin of the positrons.

But the recent finding by a team of astronomers led by Richard Lingenfelter at the University of California at San Diego, proves otherwise. The astronomers show that the positrons formed by radioactive decay of elements left behind after explosions of massive stars are, in fact, able to travel great distances, with many leaving the thin Galactic disc.


We can convert pure energy into equal amounts of matter and antimatter. Matter and antimatter can also convert back to energy. In a universe with equal amounts of matter and antimatter, we’d expect most of it to annihilate. We know that particles called neutral B mesons can fluctuate from matter to antimatter and back again, like a seesaw going up and down.

New physics?
ScienceDaily, Aug 2010: The Tevatron experiment DZero found that colliding protons in their experiment produced short-lived B meson particles that almost immediately broke down into debris that included slightly more matter than antimatter. The two types of matter annihilate each other, so most of the material coming from these sorts of decays would disappear, leaving an excess of regular matter behind. But the truly exciting implication is that the experiment implies that there is new physics, beyond the widely accepted Standard Model, at work. If that's the case, major scientific developments lie ahead.

The explanation for the dominance of matter in the present-day universe is that CP violation treated matter and antimatter differently. The new result provides evidence of deviations from the present theory in the decays of B mesons, in agreement with earlier hints; the probability that the measurement is consistent with any known effect is below 0.1 percent (3.2 standard deviations). When matter and antimatter particles collide in high-energy collisions, they turn into energy and produce new particles and antiparticles. At the Fermilab proton-antiproton collider, scientists observe hundreds of millions of such collisions every day.
Both CDF (top quark) and DZero (bottom quark) therefore continue to collect data.

DZero physicist wins prestigious UK prize.

Borissov, from Lancaster University in England, was recognized by the UK society for his work on B physics. He was also part of a team whose work received media attention, including a front page article in the New York Times, in May of 2010 when DZero found evidence of a bias toward matter over antimatter. This observation, called dimuon charge asymmetry, could help to explain why the universe as we know it is composed of matter. “The Standard Model is not able to explain why we have the world around us.”

But there is also a parity asymmetry?

Dark matter and antimatter relations.
Here I will explain the difference between matter, antimatter, dark matter, and negative matter in a concise and understandable way, says Hontas Farmer. Dark matter only interacts by way of gravity and the weak nuclear force. It does not interact via electromagnetism, hence dark matter cannot be seen, nor via the strong nuclear force that binds neutrons and protons together inside the nucleus of atoms. That is why experiments to detect dark matter directly rely on a particle of dark matter bumping into a particle of ordinary matter.

But that is difficult without some strong force in the surroundings, or if all the forces vanish?

Antimatter as dark matter?
It is conceivable that the dark matter (or at least part of it) could be antimatter, but there are very strong experimental reasons to doubt this. For example, if the dark matter out there were antimatter, we would expect it to annihilate with matter whenever it meets up with it, releasing bursts of energy primarily in the form of light. We see no evidence in careful observations for that, which leads most scientists to believe that whatever the dark matter is, it is not antimatter.
"It would take longer than the age of the universe to make one gram of antimatter."

Dark matter is not necessarily antimatter. Dark matter is matter that for some reason doesn't emit or reflect light. It could be purely transparent, never interacting with light. Is invisibility a change of the Planck constant? It is also said that antimatter goes to the past. What would that mean? That we cannot see the past? "The largest part of dark matter, which does not interact with electromagnetic radiation, is not only 'dark' but also by definition utterly transparent; in recognition of this, it has been referred to as transparent matter by some astronomers."
Water is also transparent.

Could antimatter be anyons? These are seen in superconductivity and condensed matter physics. "Some exotic particle-like structures known as anyons appear to entwine in ways that could lead to robust quantum computing schemes," according to Kirill Shtengel. The anyons the researchers believe they have created are not true particles that can exist on their own, like electrons or protons. Instead anyons are quasiparticles that exist only inside a material, but move in ways that resemble free particles. Although braids in three dimensions unravel easily, braids trapped in two dimensions can't pull apart, which means they're able to withstand disturbances that would scramble the data and calculations in other quantum computers.

Wilczek: Quantum Computers and Anyons, on the Edge Bio page.
I am very fond of anyons because I worked at the beginning on the fundamental physics involved. It was thought, until the late 70s and early 80s, that all fundamental particles, or all quantum mechanical objects that you could regard as discrete entities fell into two classes: so-called bosons after the Indian physicist Bose, and fermions, after Enrico Fermi. Bosons are particles such that if you take one around another, the quantum mechanical wave function doesn't change. Fermions are particles such that if you take one around another the quantum mechanical wave function is multiplied by a minus sign. It was thought for a long time that those were the only consistent possibilities for behavior of quantum mechanical entities. In the late 70s and early 80s, we realized that in two plus one dimensions, not in our everyday three dimensional space (plus one dimension for time), but in planar systems, there are other possibilities. In such systems, if you take one particle around another, you might get not a factor of one or minus one, but multiplication by a complex number—there are more general possibilities.
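In symbols (my summary of the standard statement, not a quote from Wilczek): carrying one identical particle around another multiplies the wave function by a phase factor,

ψ → e^{iθ} ψ

with θ = 0 for bosons (factor +1) and θ = π for fermions (factor e^{iπ} = -1). In two space dimensions any intermediate θ is allowed, hence the name "anyon"; in the non-Abelian case described next, the phase factor is replaced by a unitary matrix acting on a set of degenerate states.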

More recently, the idea that when you move one particle around another, it's possible not only that the wave function gets multiplied by a number, but that it actually gets distorted and moves around in a bigger space, has generated a lot of excitement. Then you have this fantastic mapping from motion in real space as you wind things around each other, to motion of the wave function in Hilbert space—in quantum mechanical space. It's that ability to navigate your way through Hilbert space—that connects to quantum computing and gives you access to a gigantic space with potentially huge bandwidth that you can play around with in highly parallel ways, if you're clever about the things you do in real space.

But in anyons we're really at the primitive stage. There's very little doubt that the theory is correct, but the experiments are at a fairly primitive stage—they're just breaking now.

Back to the blackboard?

Reference: Georg Weidenspointner et al. 'An asymmetric distribution of positrons in the Galactic disk revealed by gamma rays.' Nature, 10 January 2008.

Lingenfelter, R.E., Higdon, J.C., Rothschild, R.E. Is There a Dark Matter Signal in the Galactic Positron Annihilation Radiation? Physical Review Letters, 2009; 103 (3): 031301. DOI: 10.1103/PhysRevLett.103.031301

The D0 Collaboration: V.M. Abazov, et al. Evidence for an anomalous like-sign dimuon charge asymmetry. Physical Review D, 2010. http://arxiv.org/abs/1005.2757
R. L. Willett, L. N. Pfeiffer, K. W. West. Alternation and interchange of e/4 and e/2 period interference oscillations consistent with filling factor 5/2 non-Abelian quasiparticles. Physical Review B, 2010; 82: 205301 DOI: 10.1103/PhysRevB.82.205301

Kirill Shtengel. Non-Abelian anyons: New particles for less than a billion? APS Physics, 2010; 3 (93) DOI: 10.1103/Physics.3.93

The Fermilab Tevatron experiments: the APS journals Physical Review Letters and Physical Review D, Aug. 17, 2010, and May 14, 2010.

The DZero result is based on data collected over the last eight years by the DZero experiment at Fermilab. Besides Ellison, the UC Riverside co-authors of the paper, submitted for publication in Physical Review D, are Ann Heinson, Liang Li, Mark Padilla, and Stephen Wimpenny.

http://www-d0.fnal.gov/Run2Physics/WWW/results/final/B/B10A/

Cern, antimatter, Mirror of the universe, links.

Wednesday, 17 November 2010

Antimatter - antihydrogen trapped at Cern.

Seconds after the Big Bang, the Universe was a soup of matter and antimatter. But today, everything around us is matter. So what and where is the antimatter? Could antimatter be the missing dark matter?

Physicists at CERN in Geneva are the first to capture and store atoms of antimatter for long enough to study their properties in detail. Working at the lab's ALPHA experiment, the team managed to trap 38 antihydrogen atoms for about 170 ms. Setting the antihydrogen free to annihilate with surrounding matter created several charged particles including pions. Tremendous? A door is opened to explore the symmetry question and models of neutron stars (-> supernovas) and black hole collapse. The antiprotons could be used to study how antimatter interacts with gravity and to make the best ever measurement of the magnetic moment of the antiproton.
During those early moments, matter was an ultrahot, superdense brew of particles called quarks and gluons rushing hither and thither and crashing willy-nilly into one another. A sprinkling of electrons, photons and other light elementary particles seasoned the soup. This mixture had a temperature in the trillions of degrees, more than 100,000 times hotter than the sun's core. Video here.
Antimatter was first predicted in 1931 by Dirac; the positron was detected already in 1932, but the first anti-atoms, antihydrogen, were made only in 1995. An anti-atom is not easy to create. There are many degrees of freedom. Look at the proton alone.
The new antinucleus . . . is a negatively charged state of antimatter containing an antiproton, an antineutron, and an anti-Lambda particle. It is also the first antinucleus containing an anti-strange quark. "This experimental discovery may have unprecedented consequences for our view of the world," commented theoretical physicist Horst Stoecker. "This antimatter pushes open the door to new dimensions in the nuclear chart - an idea that just a few years ago, would have been viewed as impossible."
Anti-electrons are used regularly in the medical technique of positron emission tomography scanning. Antihydrogen has been produced at Cern since 2002, and is of interest for use in a precision test of nature's fundamental symmetries. The charge conjugation/parity/time reversal (CPT) theorem, a crucial part of the foundation of the standard model of elementary particles and interactions, demands that hydrogen and antihydrogen have the same spectrum. Any deviations from this could help physicists identify new physics – and explain why there is much more matter than antimatter in the universe. Hints of the asymmetry have come from neutrino-antineutrino scattering. Can it be seen also here?

In Nature, 17.11.2010:
Given the current experimental precision of measurements on the hydrogen atom (about two parts in 10^14 for the frequency of the 1s-to-2s transition), subjecting antihydrogen to rigorous spectroscopic examination would constitute a compelling, model-independent test of CPT. Antihydrogen could also be used to study the gravitational behaviour of antimatter. However, so far experiments have produced antihydrogen that is not confined, precluding detailed study of its structure. Here we demonstrate trapping of antihydrogen atoms. From the interaction of about 10^7 antiprotons and 7×10^8 positrons, we observed 38 annihilation events consistent with the controlled release of trapped antihydrogen from our magnetic trap; the measured background is 1.4±1.4 events. This result opens the door to precision measurements on anti-atoms, which can soon be subjected to the same techniques as developed for hydrogen.


Fig. a, Measured t-z distribution for annihilations obtained with no bias (green circles), left bias (blue triangles), right bias (red triangles) and heated positrons (violet star). The grey dots are from a numerical simulation of antihydrogen atoms released from the trap during the quench. The simulated atoms were initially in the ground state, with a maximum kinetic energy of 0.1 meV. The typical kinetic energy is larger than the depth of the neutral trap, ensuring that all trappable atoms are considered. The 30-ms observation window includes 99% of the 20,000 simulated points. b, Experimental t-z distribution, as above, shown along with results of a numerical simulation of mirror-trapped antiprotons being released from the trap. The colour codes are as above and there are 3,000 points in each of the three simulation plots. In both a and b, the simulated z distributions were convolved with the detector spatial resolution of ~5 mm.

Antimatter unstable.
Jun 23, 2005: Consider particles interacting with each other through electrical (Coulomb) forces. Starting with a proton, an antiproton, an electron and a positron, for instance, it is possible to form two stable atoms: hydrogen, which contains a proton and an electron, and antihydrogen, which contains an antiproton and a positron. However, Gridnev and Greiner show that these two atoms cannot form a molecule because there is no molecular state with an energy that is less than the combined energy of the individual atoms. Phys. Rev. Lett. 94 223402
Molecules can only form in such systems if a certain function of the four masses is greater than a particular value. The hydrogen-antihydrogen molecule is not stable, and replacing the hydrogen atom with heavier isotopes (deuterium and tritium) does not make it stable either. Moreover, other exotic systems, such as muonium-antimuonium, are also unstable. "The hydrogen-antihydrogen molecule is unstable because the proton and antiproton get too close together and are therefore seen as a neutral combination by the other particles," Gridnev said. When hydrogen and antihydrogen meet, the result is protonium (a bound state of a proton and an antiproton) and positronium (an electron-positron bound state).

March 4, 2010: The heaviest antimatter detected so far. Scientists at RHIC analyzed about a hundred million collisions to spot the new antinuclei, identified via their characteristic decay into a light isotope of antihelium and a positive pi-meson. Altogether, 70 examples of the new antinucleus were found. It should be possible to discover even heavier antinuclei.

The elements that make up basic matter contain protons and neutrons (up and down quarks). Now if we were to organize these elements by the amount of protons they contain, we would end up with the standard two-dimensional Periodic Table of Elements. Physicists take it a step further and organize the table by not only the number of protons, but the number of neutrons, while adding an entirely new third dimension that measures a nucleus's "strangeness" (S on the table below; Z represents the number of protons and N, neutrons). Nuclei containing one or more strange quarks are called hypernuclei.
For all ordinary matter, with no strange quarks, the strangeness value is zero and the chart is flat. Hypernuclei appear above the plane of the chart. The new discovery of strange antimatter with an antistrange quark (an antihypernucleus) marks the first entry below the plane.



The diagram above is known as the 3-D chart of the nuclides. The familiar Periodic Table arranges the elements according to their atomic number, Z, which determines the chemical properties of each element. Physicists are also concerned with the N axis, which gives the number of neutrons in the nucleus. The third axis represents strangeness, S, which is zero for all naturally occurring matter, but could be non-zero in the core of collapsed stars. Antinuclei lie at negative Z and N in the above chart, and the newly discovered antinucleus (magenta) now extends the 3-D chart into the new region of strange antimatter.

“The strangeness value could be non-zero in the core of collapsed stars,” said Jinhui Chen. In both nucleus-nucleus collisions at RHIC and in the Big Bang, quarks and antiquarks emerge with equal abundance. At RHIC, among the collision fragments that survive to the final state, matter and antimatter are still close to equally abundant, even in the case of the relatively complex antinucleus and its normal-matter partner featured in the present study. In contrast, antimatter appears to be largely absent from the present-day universe.

Bubbles of broken symmetry.
February 15, 2010: Data suggest symmetry may ‘melt’ along with protons and neutrons.
There are hints of profound symmetry transformations in the hot soup of quarks, antiquarks, and gluons produced in RHIC’s most energetic collisions. In particular, the new results, reported in the journal Physical Review Letters, suggest that “bubbles” formed within this hot soup may internally disobey the so-called “mirror symmetry” that normally characterizes the interactions of quarks and gluons.
These are some of the crucial features of symmetry-altering bubbles speculated to have played important roles in the evolution of the infant universe. Bubbles, or local regions of “broken” symmetry, may occur at extreme temperatures near transitions from one phase of matter to another; that is, transformations of space, time, and particle types may be different (oppositely oriented charge separation in bubbles at different locations?). Analogous symmetry-altering bubbles created at an even earlier time in the universe may have helped to establish the preference for matter over antimatter in our world.
The RHIC experiment is about 250,000 times hotter than the center of the Sun, and produces a transition to a new phase of nuclear matter known as quark-gluon plasma. Furthermore, as the colliding nuclei pass near each other, they produce an ultra-strong magnetic field that facilitates detecting effects of the altered symmetry.
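A rough check of that figure (my arithmetic): the Sun's core is about 1.5×10^7 K, so

250,000 × 1.5×10^7 K ≈ 4×10^12 K,

which is indeed above the roughly 2×10^12 K (about 170 MeV) at which nuclear matter is expected to melt into quark-gluon plasma.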
The data hint at a violation in what is known as mirror symmetry, or parity. This rule of symmetry suggests that events should occur in exactly the same way whether seen directly or in a mirror, with no directional dependence. But STAR has observed an asymmetric charge separation in particles emerging from all but the most head-on collisions at RHIC: the observations suggest that positively charged quarks may prefer to emerge parallel to the magnetic field in a given collision event, while negatively charged quarks prefer to emerge in the opposite direction. Because this preference would appear reversed if the situation were reflected through a mirror, it appears to violate mirror symmetry.
The theory suggests that particles with the same sign of electric charge should tend to be emitted from such local parity-violating regions in the same direction, either both parallel, or both anti-parallel, to the magnetic field arising in the collision, whereas unlike-sign particles should be emitted in opposite directions.

Data also suggest the local breaking of another form of symmetry, known as charge-parity, or CP invariance. According to this fundamental physics principle, when energy is converted to mass or vice versa according to Einstein’s famous E=mc^2 equation, equal numbers of particles and oppositely charged antiparticles must be created or annihilated.

Matter-antimatter oscillations.
The oscillations show up as odd behavior in a particle called the Bs meson, which flips back and forth between its matter and antimatter forms three trillion times per second (mixing). As the exotic, short-lived particles produced in the collisions progressively decayed to more stable particles such as electrons, a collision product known as a neutral B meson appeared to decay more often into muons—unstable particles that exist for roughly two millionths of a second before decaying further—than into antimuons. "We observe an asymmetry that is close to 1 percent."
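For reference, the quantity behind that 1 percent is the like-sign dimuon charge asymmetry (my paraphrase of the definition in the D0 paper cited earlier, arXiv:1005.2757):

A = (N^{++} - N^{--}) / (N^{++} + N^{--})

where N^{++} and N^{--} count events with two positive or two negative muons. The Standard Model predicts A very nearly zero; DZero measures it to be negative at about the one percent level, i.e. an excess of like-sign pairs of (matter) muons.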

"We find that the phase of the Bs mixing amplitude deviates more than 3 sigma from the Standard Model prediction. While no single measurement has a 3 sigma significance yet, all the constraints show a remarkable agreement with the combined result."

They are known as neutral mesons because they carry no net electric charge. In their brief lifetimes, they can oscillate between two forms, each the antiparticle of the other, Denisov explains. The difference is that Bs mesons oscillate much faster, giving them more flexibility to change from a matter progenitor to an antimatter progenitor, or vice versa. "Neutral B mesons are really interesting because they can basically go back and forth between matter and antimatter, to simplify things a bit, and we would have thought that they would spend an equal time as each," Denisov says. "What we're measuring now, it looks like they prefer matter."

Researchers believe that such a breakdown, known as CP violation, is required to explain why matter is so abundant. But the generally observed CP breaking is far too small to explain the difference, given that about 2-4% of the universe is ordinary matter. DZero and its sister experiment CDF focus on the Bs, which consists of a bottom quark and a strange antiquark. The data make it 99.7 percent likely that the discrepancy is real, but that is not yet enough to claim a discovery. The newer result is well compatible with these results.

Researchers strongly suspect that the key to this riddle lies in the weak nuclear force, which governs radioactive decay, along with more exotic reactions. CP symmetry states that a particle ought to behave identically to the mirror image of its antiparticle; the weak nuclear force is the only known interaction observed to break this rule.

See also blogs Physics Buzz, symmetry breaking, and uncertain principles for discussion.

Sunday, 14 November 2010

More about galactic tornadoes.

Picture: NASA/JPL-Caltech/UCLA
UCLA Newsroom: Astronomers report an unprecedented elongated double helix nebula near the center of our Milky Way galaxy. The part of the nebula the astronomers observed stretches 80 light years in length. "We see two intertwining strands wrapped around each other as in a DNA molecule," said Mark Morris. "Nobody has ever seen anything like that before in the cosmic realm. Most nebulae are either spiral galaxies full of stars or formless amorphous conglomerations of dust and gas — space weather. What we see indicates a high degree of order." The double helix nebula is approximately 300 light years from the enormous black hole at the center of the Milky Way. (The Earth is more than 25,000 light years from the black hole at the galactic center.)

The newly found gamma-ray structure extends 25,000 light-years to each side of the galactic plane.

"We know the galactic center has a strong magnetic field that is highly ordered and that the magnetic field lines are oriented perpendicular to the plane of the galaxy," Morris said. "If you take these magnetic field lines and twist them at their base, that sends what is called a torsional wave up the magnetic field lines.
"You can regard these magnetic field lines as akin to a taut rubber band," Morris added. "If you twist one end, the twist will travel up the rubber band."
"We see this twisting torsional wave propagating out. We don't see it move because it takes 100,000 years to move from where we think it was launched to where we now see it, but it's moving fast — about 1,000 kilometers per second — because the magnetic field is so strong at the galactic center — about 1,000 times stronger than where we are in the galaxy's suburbs." A strong, large-scale magnetic field can affect the galactic orbits of molecular clouds by exerting a drag on them. It can inhibit star formation, and can guide a wind of cosmic rays away from the central region; understanding this strong magnetic field is important for understanding quasars and violent phenomena in a galactic nucleus. This magnetic field is strong enough to cause activity that does not occur elsewhere in the galaxy; the magnetic energy near the galactic center is capable of altering the activity of our galactic nucleus. All galaxies that have a well-concentrated galactic center may also have a strong magnetic field at their center.
The magnetic field at the galactic center, though 1,000 times weaker than the magnetic field on the sun, occupies such a large volume that it has vastly more energy than the magnetic field on the sun. It has the energy equivalent of 1,000 supernovae.
What launches the wave, twisting the magnetic field lines near the center of the Milky Way? Morris thinks the answer is not the monstrous black hole at the galactic center, at least not directly.
Orbiting the black hole like the rings of Saturn, several light years away, is a massive disk of gas called the circumnuclear disk; Morris hypothesizes that the magnetic field lines are anchored in this disk. The disk orbits the black hole approximately once every 10,000 years.
"Once every 10,000 years is exactly what we need to explain the twisting of the magnetic field lines that we see in the double helix nebula," Morris said.
Published in Nature.
UCLA astronomers present the first evidence that tens of thousands of black holes are orbiting the monstrous black hole at the center of the Milky Way. The supermassive black hole, with a mass more than 3 million times that of our sun, is in the constellation of Sagittarius.

How were millions of young stars able to form at the center of our Milky Way galaxy in the presence of an enormous black hole with a mass 4 million times that of the sun? Millions of young stars packed closely together in this region are obscured by enormous quantities of dust but are easier to observe in the infrared because infrared light can penetrate the dust. More star formation is occurring in this region than anywhere else in the galaxy.

The Milky Way has "a wimpy galactic center," according to Becklin, who noted that while black holes in the center of other galaxies can be up to billions of times the mass of our sun, ours is only some 4 million times as massive. "Our previous assumption was that the black hole would make that star formation next to impossible; the tidal forces would not allow the collapse of a cloud of gas and dust to form a star. But it's happening, within just a light year of the black hole," said Mark Morris. "We are trying to understand, through observations using both short and long infrared wavelengths, what happens to the dust and gas that allows stars to form. We have some ideas." "We can study the dust and see what it is made of, and by knowing what it is made of and how big the dust grains are, we can model the evolutionary history of the dust and determine its fate. Most of the energy coming from the galactic center comes from the dust. The dust absorbs starlight and reemits it as infrared; that's why we are observing it in the infrared. We study the energy pouring out of the galactic center by analyzing the dust."
The amount of dust in the galactic center, which is approximately 500 light years across, is approximately 1 million times the mass of our sun, Morris said. Star formation is strongly affected by the presence of a magnetic field. A strong enough magnetic field can prevent a cloud from collapsing to form a star, Morris said.

UCLA astronomers report that three stars have accelerated by more than 250 thousand miles per hour per year as they orbit the monstrous black hole at the center of our Milky Way galaxy. "We are actually seeing stars begin to curve in their orbits," Ghez said. "One of these stars may complete its orbit around the supermassive black hole in as little as 15 years." The two closest stars are only 10 light days from the black hole, but Ghez predicts they will orbit the enormous black hole and not be swallowed by it. In 1995 the three stars were moving at two million miles per hour, and by 1999 they had changed their velocities by more than one million miles per hour.
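Those figures are consistent (my arithmetic): a velocity change of one million miles per hour over the four years from 1995 to 1999 is

1×10^6 mph / 4 yr = 250,000 mph per year,

which matches the quoted acceleration.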

UCLA astronomers report they have detected remarkably stormy conditions in the hot plasma being pulled into the monstrous black hole residing at the center of our Milky Way galaxy. "The black hole is dining on a calm stream of plasma that experiences glitches only 2 percent of the time," said Andrea Ghez, professor of physics and astronomy at UCLA, who headed the research team. "Our infrared detection shows for the first time that the black hole's meal is more like the Grand Rapids, in which energetic glitches from shocked gas are occurring almost continually."


Animation of the stellar orbits in the central parsec. Images taken from the years 1995 through 2008 are used to track specific stars orbiting the proposed black hole at the center of the Galaxy.

Physicists at UCLA and in Japan have discovered evidence of "natural nuclear accelerators" at work in our Milky Way galaxy. Cosmic rays of the highest energies were believed by physicists to come from remote galaxies containing enormous black holes capable of consuming stars and accelerating protons to energies comparable to that of a bullet shot from a rifle. These protons — referred to individually as "cosmic rays" — travel through space and eventually enter our galaxy. But earlier this year came a surprising discovery: many of the energetic cosmic rays found in the Milky Way are not actually protons but nuclei — and the higher the energy, the greater the nuclei-to-proton ratio. "This finding was totally unexpected because the nuclei, more fragile than protons, tend to disintegrate into protons on their long journey through space," said Alexander Kusenko. Stellar explosions in our own galaxy can accelerate both protons and nuclei. But while the protons promptly leave the galaxy, the heavier and less mobile nuclei become trapped in the turbulent magnetic field and linger longer; the local density of nuclei is increased, and they bombard Earth in greater numbers. These ultra-high-energy nuclei have been trapped in the web of galactic magnetic fields for millions of years, and their arrival directions as they enter the Earth's atmosphere have been "completely randomized by numerous twists and turns in the tangled field," he said.
Published Aug. 20 in the journal Physical Review Letters.


The fascinating radio image of Messier 87 (above) was taken with the Very Large Array (VLA) at a frequency of 327 MHz (millions of cycles per second), corresponding to a wavelength of about 92 cm. The brightest radio emission is shown in red, and shows a central peak at the same position as M87. There are also jets of emission extending to the west and to the east, producing large radio lobes (yellow). Note that the western lobe (to the right) takes a sudden turn to the south (bottom), suggesting that it is ramming into a denser and unseen intracluster medium.

A large halo of radio emission (green) surrounds the entire galaxy and jets. The halo stretches across a distance of about 80 kiloparsecs, more than twice the diameter of our own Milky Way Galaxy. The central galaxy and its radio halo, meanwhile, are immersed in an even larger cocoon of hot and thin gas which can be seen at x-ray wavelengths.

The VLA radio telescope team in New Mexico plotted the electrical system at the centre of the Milky Way. One scientist likened the giant plasma loops found there to a dynamo. With the more recent discovery of plasma jets from the same vicinity there is good reason to connect the two, since electricity is known to be able to bring about nuclear fusion, if not the creation of matter. (BBC News website 17 April 2000)

3 Aug 2010: We find two red clump (RC) populations co-existing in the same fields toward the Galactic bulge. We can only understand the data if these RC peaks simply reflect two stellar populations separated in distance. Most of our fields show the two RCs at roughly constant distance with longitude, which is also inconsistent with a tilted bar, although an underlying bar may be present. The stellar densities in the two RCs change dramatically with longitude: on the positive longitude side the foreground RC is dominant, while the background RC dominates negative longitudes. A line connecting the maxima of the foreground and background populations is tilted to the line of sight by ~20 +/-4 deg., similar to claims for the tilt of a Galactic bar. The distance between the two RCs decreases towards the Galactic plane; seen edge-on, the bulge is X-shaped, resembling some extra-galactic bulges and the results of N-body simulations. The center of this X is consistent with the distance to the Galactic center. Our observations may be understood if the two RC populations emanate, nearly tangentially, from the ends of a Galactic bar, each side shaped like a funnel or horn. Alternatively, the X, or double funnel shape, may continue to the Galactic center. This would appear peanut/box shaped from the Solar direction, but X-shaped when viewed tangentially.

Figure: blueshifted stars are concentrated around the Galactic center. The coordinate system has its origin at the Galactic center and includes all stars (left-hand and right-hand panels). The upper right-hand panel shows the line-of-sight velocity distribution as a function of distance from the Sun, d, while the lower right-hand panel is the cumulative distribution of BHB stars with distance from the Galactic center.

Results from the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) experiment and from the Fermi space telescope suggest a local excess positron fraction e+/(e+ + e−) at energies above 10 GeV, as well as an excess of e+ + e− peaking around 500 GeV. An attractive explanation is that a DM WIMP (weakly interacting massive particle) is present in our galaxy at large enough concentrations to self-annihilate into standard model leptons.

Is antimatter blueshifted and with R-parity? Is antimatter also dark matter? Has a collision between matter and antimatter happened? Is that what we see as the enormous gamma radiation burst?




An unusual distribution of gamma-ray energy, shown in this figure as the red glow above the Milky Way plane. It has been interpreted as a region in which antimatter (positrons, the positively charged counterparts of electrons, and negatively charged antiprotons) has interacted with conventional matter, releasing a huge amount of energy.

"Nothing we tried besides dark matter came anywhere close to being able to accommodate the features of the observation," Dan Hooper, of the Fermi National Accelerator Laboratory in Batavia, Ill., and the University of Chicago, told SPACE.com. "It's always hard to be sure there isn't something you just haven't thought of. But I've talked to a lot of experts and so far I haven't heard anything that was a plausible alternative."
Hooper conducted the analysis with Lisa Goodenough. They think the Milky Way's gamma-ray glow is caused by dark matter annihilations.
By studying the data on this radiation, Hooper and Goodenough calculated that dark matter must be made of particles called WIMPs (weakly interacting massive particles) with masses between 7.3 and 9.2 GeV (giga electron volts) — almost nine times the mass of a proton. They also calculated a property known as the cross-section, which describes how likely the particle is to interact with others.
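For scale (my arithmetic): the proton mass is about 0.938 GeV, so the quoted range spans

7.3 / 0.938 ≈ 7.8 to 9.2 / 0.938 ≈ 9.8

proton masses, roughly eight to ten; "almost nine times" is a fair middle value.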
Knowing these two properties would represent a huge leap forward in our understanding of dark matter.
"It's the biggest thing that's happened in dark matter since we learned it existed," Hooper said. "So long as no unexpected alternative explanations come forward, I think yes, we've finally found it."

Saturday, 13 November 2010

Detailed Dark Matter map and galactic tornadoes.

Is our Universe fractal? A new picture has been released from NASA that shows a space with many bubbles, like champagne :) The same pattern comes again and again, repeating itself in a topological way. Take a look at the beauty. Here are more fantastic pictures.

One of the main, still open, problems of today's cosmology is that of the formation and evolution of gravitational structures. To this picture belong the dark matter formations and the disappearance of antimatter. If matter-antimatter formation has a base of four, or two pairs, then only one of those 'ends' = elemental particles would be visible, as discussed by Kea and Graham at Galaxy Zoo. Antimatter would then also be part of dark matter.



Quote:
News Release Number: STScI-2010-37

Detailed Dark Matter Map Yields Clues to Galaxy Cluster Growth

November 11, 2010: Astronomers using NASA's Hubble Space Telescope received a boost from a cosmic magnifying glass to construct one of the sharpest maps of dark matter in the universe. They used Hubble's Advanced Camera for Surveys to chart the invisible matter in the massive galaxy cluster Abell 1689, located 2.2 billion light-years away. The cluster contains about 1,000 galaxies and trillions of stars. Dark matter is an invisible form of matter that accounts for most of the universe's mass. Hubble cannot see the dark matter directly. Astronomers inferred its location by analyzing the effect of gravitational lensing, where light from galaxies behind Abell 1689 is distorted by intervening matter within the cluster.

Researchers used the observed positions of 135 lensed images of 42 background galaxies to calculate the location and amount of dark matter in the cluster. They superimposed a map of these inferred dark matter concentrations, tinted blue, on a Hubble image of the cluster. The new dark matter observations may yield new insights into the role of dark energy in the universe's early formative years.

The cluster's gravity, the majority of which comes from dark matter, acts like a cosmic magnifying glass, bending and amplifying the light from distant galaxies behind it. This effect, called gravitational lensing, produces multiple, warped, and greatly magnified images of those galaxies, like the view in a funhouse mirror. By studying the distorted images, astronomers estimated the amount of dark matter within the cluster. If the cluster's gravity only came from the visible galaxies, the lensing distortions would be much weaker.

Based on their higher-resolution mass map, Coe and his collaborators confirm previous results showing that the core of Abell 1689 is much denser in dark matter than expected for a cluster of its size, based on computer simulations of structure growth. Abell 1689 joins a handful of other well-studied clusters found to have similarly dense cores. The finding is surprising, because the push of dark energy early in the universe's history would have stunted the growth of all galaxy clusters.

"Galaxy clusters, therefore, would had to have started forming billions of years earlier in order to build up to the numbers we see today," Coe explains. "At earlier times, the universe was smaller and more densely packed with dark matter.

Abell 1689 is among the most powerful gravitational lensing clusters ever observed. "The lensed images are like a big puzzle," Coe says. "Here we have figured out, for the first time, a way to arrange the mass of Abell 1689 such that it lenses all of these background galaxies to their observed positions."

We have finally 'cracked the code' of gravitational lensing.

End of quote. Discussed by Sean Carroll here.


A fractal Universe?
Gravitational structure formation in scale relativity
In the framework of the theory of scale relativity, we suggest a solution to the cosmological problem of the formation and evolution of gravitational structures on many scales. This approach is based on the giving up of the hypothesis of differentiability of space-time coordinates. As a consequence of this generalization, space-time is not only curved, but also fractal. In analogy with Einstein's general relativistic methods, we describe the effects of space fractality on motion by the construction of a covariant derivative. The principle of equivalence allows us to write the equation of dynamics as a geodesics equation that takes the form of the equation of free Galilean motion. Then, after a change of variables, this equation can be integrated in terms of a gravitational Schrödinger equation that involves a new fundamental gravitational coupling constant, alpha_g = w0/c. Its solutions give probability densities that quantitatively describe precise morphologies in the position space and in the velocity space. Finally the theoretical predictions are successfully checked by a comparison with observational data: we find that matter is self-organized in accordance with the solutions of the gravitational Schrödinger equation on the basis of the universal constant w0 = 144.7±0.7 km/s (and its multiples and sub-multiples), from the scale of our Earth and the Solar System to large scale structures of the Universe.
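As a sketch of what such a gravitational Schrödinger equation looks like (my reconstruction from Nottale's papers; take the exact form as an assumption rather than a quotation): a macroscopic constant D takes over the role of ħ/2m, and the geodesic equation integrates to

D^2 Δψ + i D ∂ψ/∂t - (φ/2) ψ = 0

where φ is the gravitational potential energy per unit mass. For a Kepler potential the probability peaks fall at quantized velocities v_n = w0/n; with w0 = 144.7 km/s this gives for instance v_5 ≈ 29 km/s, close to Earth's orbital velocity of 29.8 km/s, which is the kind of agreement Nottale reports.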

A fractal universe, or a universe with many sub-manifolds, and light-cones. What would make it fractal? It must be the scaling mechanism.

As we shall see, this approach provides a solution for both the formation problem (this paper) and the anomalous effects (joint paper) without needing any additional unseen matter. Moreover, it allows one to understand the morphogenesis of several structures at all scales and to theoretically predict the existence of new relations and constraints, that are now successfully checked from an analysis of the astrophysical data.

And they write - this problem becomes formally equivalent to a scattering process during elastic collisions. Indeed, recall that the collision of particles is described in quantum mechanics in terms of an incoming free particle plane wave and of outcoming free plane and spherical waves.

So Universe scattering can be studied in a lab? In fact, the double slit experiment can also be seen as a fractalization of the Universe in a holistic manner. Every scattering pattern is like the others in a topological way.

Nottale had a new preprint, Scale relativity and fractal space-time: theory and applications, in 2008, and in Foundations of Science 2010, where he also took up implications for evolution and the origin of Life. He writes in conclusion:
The theory of scale relativity relies on the postulate that the fundamental laws that govern the various physical, biological and other phenomena find their origin in first principles. In continuity with previous theories of relativity, it considers that the most fundamental of these principles is the principle of relativity itself. The extraordinary success due to the application of this principle, for four centuries now, to position, orientation, motion (and therefore to gravitation) is well known. But, during the last decades, the various sciences have been faced with an ever increasing number of new unsolved problems, of which many are linked to questions of scales. It therefore seemed natural, in order to deal with these problems at a fundamental and first principle level, to extend theories of relativity by including the scale in the very definition of the coordinate system, then to account for these scale transformations in a relativistic way.

We have attempted to give in this article a summarized discussion of the various developments of the theory and of its applications. The aim of this theory is to describe space-time as a continuous manifold without making the hypothesis of differentiability, and to physically constrain its possible geometry by the principle of relativity, both of motion and of scale. This is effectively made by using the physical principles that directly derive from it, namely, the covariance, equivalence and geodesic principles. These principles lead in their turn to the construction of covariant derivatives, and finally to the writing, in terms of these covariant derivatives, of the motion equations under the form of free-like geodesic equations. Such an attempt is therefore a natural extension of general relativity, since the two-times differentiable continuous manifolds of Einstein’s theory, that are constrained by the principle of relativity of motion, are particular sub-cases of the new geometry in construction.
Now, giving up the differentiability hypothesis involves an extremely large number of new possible structures to be investigated and described.

Such an approach is rendered possible by the result according to which the small-scale structures which manifest the nondifferentiability are smoothed out beyond some relative transitions toward the large scales. One therefore recovers the standard classical differentiable theory as a large scale approximation of this generalized approach. But one also obtains a new geometric theory which allows one to understand quantum mechanics as a manifestation of an underlying nondifferentiable and fractal geometry, and finally to suggest generalizations of it and new domains of application for these generalizations. Now the difficulty with theories of relativity is that they are meta-theories rather than theories of some particular systems.
A Google search on 'star cluster fractal' shows mainly two names, Nottale and Pitkänen. A search on arXiv gives 5 more articles: Generalized quantum potentials in scale relativity (J. Phys. A: Math. Theor. 42, 2009); Quantum-like gravity waves and vortices in a classical fluid; Motion equations for relativistic particles in an external electromagnetic field in scale relativity (proceedings of the Rencontres TRANS-ERI-COD 2009); The Evolution and Development of the Universe (Special Issue of the First International Conference on the Evolution and Development of the Universe, EDU 2008); and Electromagnetic Klein-Gordon and Dirac equations in scale relativity (International Journal of Modern Physics A 25 (2010) 4239-4253).
We present a new step in the foundation of quantum field theory with the tools of scale relativity. However, if one first applies the quantum covariance, then implements the scale covariance through the scale-covariant derivative, one obtains the electromagnetic Dirac equation in its usual form. This method can also be applied successfully to the derivation of the electromagnetic Klein-Gordon equation. This suggests it rests on more profound roots of the theory, since it encompasses naturally the spin-charge coupling.
Even this mysterious observation may get its explanation: GIANT GAMMA-RAY BUBBLES: AGN ACTIVITY OR BIPOLAR GALACTIC WIND? We saw the picture in Nottale's works. Magnetic perturbation? Or a giant hurricane in the center of the Milky Way, the HAARP of our galaxy? NASA's Fermi Telescope Finds Giant Structure in our Galaxy.


NASA's Fermi Gamma-ray Space Telescope has unveiled a previously unseen structure centered in the Milky Way. The feature spans 50,000 light-years and may be the remnant of an eruption from a supersized black hole at the center of our galaxy.
"What we see are two gamma-ray-emitting bubbles that extend 25,000 light-years north and south of the galactic center," said Doug Finkbeiner, an astronomer at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., who first recognized the feature. "We don't fully understand their nature or origin." They are not dark matter, anyway.



Also, at the LHC, four-muon events (two dimuon pairs) have today been seen for the first time, but they show no complete symmetry. The CMS collaboration and its discussion.

torsdag 4 november 2010

The 'magic number'.

Usually constants are constants: never-changing natural units. But it becomes more and more evident that this holds only within a certain measurement interval; constants can sometimes be variables too. A varying fine structure constant, the 'magic number' alpha (α), has been proposed as a way of solving problems in cosmology and astrophysics (such as the cosmological constant problem). More recently, theoretical interest in varying constants (not just α) has been motivated by different proposals for going beyond the Standard Model of particle physics. The first detection of a variation came in 1999, with a slight increase in α over time.

Physical constants can take many dimensional forms: dimension refers to the constituent structure of all space (cf. volume) and its position in time. A dimensionless quantity, by contrast, is a quantity without an associated physical dimension: no association to time or length; it always has a dimension of 1.

A. Dimensionless ratios:
The fine structure constant α, is the coupling constant characterizing the strength of the electromagnetic interaction (the parameter that describes coupling between light and relativistic electrons and is traditionally associated with quantum electrodynamics) that keeps the atom and the whole universe together. The name comes from the fact that it determines the size of the splitting or fine-structure of the hydrogenic spectral lines. The numerical value of α is the same in all systems of units.

Nobody knows where the alpha number for a coupling comes from; that's why it is 'magic'. Is it related to pi (as is the reduced Planck constant hbar), or perhaps to the base of natural logarithms? Attempts to find a mathematical basis for this dimensionless constant have continued. It is one of the greatest mysteries of physics: a magic number; we don't know what kind of dance to do on the computer to make this number come out. (29 and 137 are the 10th and 33rd prime numbers.) The difference between the 2007 CODATA value for α and this theoretical value is about 3 x 10^-11, about 6 standard errors for the measured value. Today the most accurate value is 1/α = 137.035999084, the last three digits uncertain. It is measured within QED, using the quantum Hall effect, the anomalous magnetic moment of the electron (the Landé g-factor, g), or a "quantum cyclotron" apparatus.
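A minimal numerical check of that value, using the standard defining relation α = e²/(4πε₀ħc) and rounded CODATA-style constants:

```python
import math

# Fine-structure constant from alpha = e^2 / (4*pi*eps0*hbar*c), SI values.
e    = 1.602176634e-19   # C, elementary charge
eps0 = 8.8541878128e-12  # F/m, vacuum permittivity
hbar = 1.054571817e-34   # J*s, reduced Planck constant
c    = 299792458.0       # m/s, speed of light

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha = {alpha:.9e}, 1/alpha = {1/alpha:.6f}")
# -> 1/alpha ≈ 137.035999
```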

Note how ħc enters the Planck charge: q_P = √(4πε₀ħc), so the elementary charge is e = √α · q_P. The Planck charge is thus α^(-1/2), or approximately 11.706, times greater than the elementary charge e carried by an electron.
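A small check of that ratio (a sketch, same constants as in the previous snippet; the 11.706 is just 1/√α):

```python
import math

# Planck charge q_P = sqrt(4*pi*eps0*hbar*c); then e = sqrt(alpha) * q_P.
eps0 = 8.8541878128e-12  # F/m
hbar = 1.054571817e-34   # J*s
c    = 299792458.0       # m/s
e    = 1.602176634e-19   # C

q_P = math.sqrt(4 * math.pi * eps0 * hbar * c)
print(f"q_P = {q_P:.4e} C, q_P/e = {q_P / e:.3f}")
# -> q_P ≈ 1.876e-18 C, q_P/e ≈ 11.706
```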

Stable matter, and therefore life and intelligent beings, could not exist if its value were much different. If alpha were bigger than it really is, we should not be able to distinguish matter from 'ether' (the vacuum, nothingness, grid; Einstein showed there was no ether). The fact that alpha has just its value of about 1/137 is certainly no chance, but itself a law of nature.
Arnold Sommerfeld introduced the fine-structure constant in 1916.


B. Dimensional ratios: "God's units", units of measurement. Planck units are fundamental physical units of measurement defined exclusively in terms of the five universal physical constants listed below, in such a manner that these constants take the value one when expressed in terms of these units. Planck units elegantly simplify particular algebraic expressions appearing in physical law. They are only one system of natural units among others, but are considered unique in that they are not based on properties of any prototype object or particle (which would be arbitrarily chosen), but only on properties of free space (see the sketch after this list). The constants that Planck units, by definition, normalize to 1 are the:
• Gravitational constant, G;
• Reduced Planck constant, ħ;
• Speed of light in a vacuum, c;
• Coulomb constant, 1/(4πε₀) (sometimes ke or k);
• Boltzmann's constant, kB (sometimes k).
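A minimal sketch, assuming standard SI values, of how the mechanical Planck units fall out of G, ħ and c alone (the Coulomb and Boltzmann constants extend this to charge and temperature in the same way):

```python
import math

# Planck units from G, hbar and c.
G    = 6.67430e-11       # m^3 kg^-1 s^-2, gravitational constant
hbar = 1.054571817e-34   # J*s, reduced Planck constant
c    = 299792458.0       # m/s, speed of light

l_P = math.sqrt(hbar * G / c**3)   # Planck length
t_P = l_P / c                      # Planck time
m_P = math.sqrt(hbar * c / G)      # Planck mass

print(f"l_P = {l_P:.3e} m, t_P = {t_P:.3e} s, m_P = {m_P:.3e} kg")
# -> l_P ≈ 1.616e-35 m, t_P ≈ 5.391e-44 s, m_P ≈ 2.176e-8 kg
```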

Lev Okun in "Trialogue on the number of fundamental constants", Part I, "Fundamental constants: parameters and units" (2002):
There are two kinds of fundamental constants of Nature: dimensionless (like α about 1/137) and dimensionful (c — velocity of light, hbar — quantum of action and angular momentum, and G — Newton’s gravitational constant). To clarify the discussion I suggest to refer to the former as fundamental parameters and the latter as fundamental (or basic) units. It is necessary and sufficient to have three basic units in order to reproduce in an experimentally meaningful way the dimensions of all physical quantities. Theoretical equations describing the physical world deal with dimensionless quantities and their solutions depend on dimensionless fundamental parameters. But experiments, from which these theories are extracted and by which they could be tested, involve measurements, i.e. comparisons with standard dimensionful scales. Without standard dimensionful units and hence without certain conventions physics is unthinkable.
A third kind of fundamental unit could be the fundamental particles themselves.

It is clear that the number of constants or units depends on the theoretical model or framework and hence depends on personal preferences and it changes of course with the evolution of physics. At each stage of this evolution it includes those constants which cannot be expressed in terms of more fundamental, hitherto unknown, ones. At present this number is a few dozens, if one includes neutrino mixing angles. It blows up with the inclusion of hypothetical new particles.

These are usually combined with QED and perturbation theory, but also with the gauge and Yukawa couplings of the elementary-particle framework and its extensions.

Universal fundamental units of Nature.
The number three corresponds to the three basic entities (notions): space, time and matter. It does not depend on the dimensionality of space, being the same in spaces of any dimension. It does not depend on the number and nature of fundamental interactions. For instance, in a world without gravity it still would be three.

The three basic physical dimensions are L, T, M. The corresponding units ([ ] = dimension):

Planck 1899: l = ħ/mc, t = ħ/mc², m = √(ħc/G)
Stoney 1870: l = e√G/c², t = e√G/c³, m = e/√G

velocity [v] = [L/T]
angular momentum [J] = [MvL] = [ML²/T]
action [S] = [ET] = [Mv²T] = [ML²/T]

Stoney's (without ħ) and Planck's units are numerically close to each other, their ratios being √α. Originally proposed in 1899 by Max Planck, these units are also known as natural units because the origin of their definition comes only from properties of nature and not from any human construct. They are generally believed to be both universal in nature and constant in time. Now this picture is challenged.
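That √α ratio can be checked directly (a sketch with the same SI values as above; the Stoney length is written in its SI form):

```python
import math

# Stoney length l_S = sqrt(G*e^2 / (4*pi*eps0*c^4)) versus Planck length
# l_P = sqrt(hbar*G/c^3); their ratio should equal sqrt(alpha).
G, hbar, c = 6.67430e-11, 1.054571817e-34, 299792458.0
e, eps0 = 1.602176634e-19, 8.8541878128e-12

l_S = math.sqrt(G * e**2 / (4 * math.pi * eps0 * c**4))
l_P = math.sqrt(hbar * G / c**3)
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(f"l_S/l_P = {l_S / l_P:.6f}, sqrt(alpha) = {math.sqrt(alpha):.6f}")
# -> both ≈ 0.085424
```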

Gauge couplings.
A coupling constant (g) is a number that determines the strength of an interaction. Usually the Lagrangian or the Hamiltonian of a system can be separated into a kinetic part and an interaction part. The coupling constant determines the strength of the interaction part with respect to the kinetic part, or between two sectors of the interaction part (similarity with oscillations?). For example, the electric charge of a particle is a coupling constant, as is mass.
If g << 1, the theory is weakly coupled and perturbation theory applies (QED, the CKM framework).
If g ≥ 1, it is strongly coupled; an example is the hadronic theory of strong interactions, which needs non-perturbative methods (QCD).
This picture changes a bit with the asymptotic degrees of freedom.

The reason this can happen at short times, seemingly violating the conservation of energy, is the uncertainty relation ΔE·Δt ≥ ħ; the result is quantization in the interaction picture, with "virtual" particles going off the mass shell. This is the Planck scale, where particles vanish 'out of sight' into the virtual Dirac Sea. Gravity is usually ignored (non-renormalisability).
(Toms, with asymptotic degrees of freedom: if we let g denote a generic coupling constant, then the value of g at energy scale E, the running coupling constant g(E), is determined by E (dg(E)/dE) = β(E, g). And it can be fused with gravity in Einstein's theory.)

Such processes renormalize the coupling and make it dependent on the energy scale μ at which one observes the coupling. The dependence of a coupling g(μ) on the energy scale is known as running of the coupling, and the theory is known as the renormalization group. A beta-function β(g) encodes the running of a coupling parameter, g. If the beta-functions of a quantum field theory vanish, then the theory is scale-invariant. The coupling parameters of a quantum field theory can flow even if the corresponding classical field theory is scale-invariant.

If a beta-function is positive, the corresponding coupling increases with increasing energy. An example is quantum electrodynamics (QED): at low energies α ≈ 1/137, whereas at the scale of the Z boson, about 90 GeV, one measures α ≈ 1/127. Moreover, the perturbative beta-function tells us that the coupling continues to increase, and QED becomes strongly coupled at high energy; in fact, theoretically the coupling apparently becomes infinite at some finite energy. This phenomenon was first noted by Landau, and is called the Landau pole. In QCD, by contrast, the coupling decreases logarithmically, a phenomenon known as asymptotic freedom (see the Nobel Prize in Physics 2004). That gravity itself can drive couplings this way was first properly suggested in 2006 by Robinson and Wilczek, and confirmed this year by Toms.
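A minimal sketch of that positive running (one loop, electron loop only, so the rise is slower than the measured 1/137 → 1/127; all charged fermions would be needed to reproduce the Z-scale value):

```python
import math

# One-loop QED running with only the electron in the loop:
#   1/alpha(Q) = 1/alpha(m_e) - (2/(3*pi)) * ln(Q/m_e)
# alpha grows with Q and formally diverges at the Landau pole.
alpha0 = 1 / 137.035999   # coupling at the electron-mass scale
m_e = 0.000511            # GeV, electron mass

def alpha(Q):
    return 1.0 / (1.0 / alpha0 - (2.0 / (3.0 * math.pi)) * math.log(Q / m_e))

print(f"1/alpha at M_Z (electron loop only): {1 / alpha(91.2):.1f}")
Q_pole = m_e * math.exp(3 * math.pi / (2 * alpha0))   # Landau pole location
print(f"Landau pole at roughly {Q_pole:.1e} GeV")      # an absurdly high scale
```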
David Gross, David Politzer and Frank Wilczek have through their theoretical contributions made it possible to complete the Standard Model of Particle Physics, the model that describes the smallest objects in Nature and how they interact. At the same time it constitutes an important step in the endeavour to provide a unified description of all the forces of Nature, regardless of the spatial scale – from the tiniest distances within the atomic nucleus to the vast distances of the universe.
Conversely, the QCD coupling increases with decreasing energy; the coupling becomes large at low energies, and one can no longer rely on perturbation theory. (The quoted value of αs is to be used at a scale above the bottom quark mass of about 5 GeV.) The strong coupling acts between the quarks, the constituents that build protons, neutrons and the nuclei. Each perturbative description depends on a (string) coupling constant.

However, in the case of string theory, these coupling constants are not pre-determined, adjustable, or universal parameters; rather they are dynamical scalar fields that can depend on the position in space and time and whose values are determined dynamically.

The Standard Model and the four forces of Nature
You can see how much weaker the other forces' coupling constants are from their relative strengths: http://www.physicsmasterclasses.org/downloads/FinalAlphaS_kateshaw.pdf. A coupling constant determines how strong a fundamental force is.
Alpha_s, the coupling constant of the strong nuclear force, is felt between quarks and gluons (g). The strong force is the strongest of the four fundamental forces: about 1/8.
The weak nuclear force, carried by the W and Z bosons, acts between quarks and all leptons: αw is about 1/30.
The electromagnetic force, carried by photons and felt by quarks and leptons (except neutrinos), is about 1/130.
Gravity depends on the graviton (not found): αG is about 1/10^39.

Unification of forces at 10^19 GeV?
This is the natural scale of quantum gravity, the Planck scale of about 10^19 GeV (i.e. distances of about 10^-35 m). At such high energies the QED perturbation expansion breaks down. (The conventional GUT scale, about 10^16 GeV, may not be where unification really happens: with quantised Yang-Mills, i.e. non-Abelian, gauge fields and gravity included, unification lands near the Planck scale instead, as seen by Toms.)

The four forces, from hyperphysics.


Robinson and Wilczek came up with the idea of gravity-driven asymptotic freedom. Perhaps the most tantalizing effect of QCD and asymptotic freedom is that it opens up the possibility of a unified description of Nature’s forces. When examining the energy dependence of the coupling constants for the electromagnetic, the weak and the strong interaction, it is evident that they almost, but not entirely, meet at one point and have the same value at a very high energy.


Running coupling constants in the Standard Model (left) and with the introduction of supersymmetry (right). In the Standard Model the three lines, which show the inverse value of the coupling constant for the three fundamental forces, do not meet at one point, but with the introduction of supersymmetry, and assuming that the supersymmetric particles are not heavier than about 1 TeV/c², they do meet at one point. Is this an indication that supersymmetry will be discovered, or a coincidence? (He et al. showed this relation was not quite true.)

Gravitational constant G: there is no known way of measuring αG directly, so αG is known to only four significant digits; its precision depends only on that of G, since ħ, c and the mass of the electron are known much more precisely. A charged elementary particle has approximately one Planck charge, but a mass many orders of magnitude smaller than the Planck unit of mass: the field is much bigger than the particle. The proton is therefore a physical constant. The gravitational attraction among elementary particles, charged or not, can hence be ignored. That gravitation is relevant for macroscopic objects proves that they are electrostatically neutral to a very high degree.
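For scale, a minimal sketch with standard values: the gravitational coupling between two electrons, αG = Gm_e²/(ħc), comes out around 10^-45, which is why it can be ignored next to α:

```python
# Gravitational coupling between two electrons: alpha_G = G * m_e^2 / (hbar * c).
G    = 6.67430e-11       # m^3 kg^-1 s^-2
hbar = 1.054571817e-34   # J*s
c    = 299792458.0       # m/s
m_e  = 9.1093837015e-31  # kg, electron mass

alpha_G = G * m_e**2 / (hbar * c)
alpha   = 1 / 137.035999
print(f"alpha_G = {alpha_G:.3e}, alpha/alpha_G ≈ {alpha / alpha_G:.1e}")
# -> alpha_G ≈ 1.752e-45: some 42 orders of magnitude weaker than electromagnetism.
```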

At critical density (the density limit, 0 to 4) matter starts to collapse (into black holes?) as a consequence of the gravitational force; condensation and energy release also happen. Gravity is converted into kinetic energy of mass; this is how we feel weight. In order to condense, energy has to be dissipated from an ever faster rotating cloud, and baryonic matter accomplishes this by the electromagnetic effect of emitting photons. Positive pressure needs collisions before it is created, because of interaction. Gravity diminishes towards the center of mass and grows outward towards the border/shell: at the center of the Earth there is no, or weak, gravity. In universal units, or Planck units, G is given the value 1. Photons travel at the speed of light c, an invariant velocity whose Planck unit value is also 1.

Dark matter/antimatter emits no photons but does have gravity, seen for instance in gravitational lensing.
We know a lot about the familiar photons, quarks, electrons and other particles that make up ordinary matter because they interact strongly with each other. Scientists can even observe the ghostly neutrino. But there remains the possibility that other particles exist that interact much more weakly. Ideas that explore this hypothesis are called “hidden valley” models, a particle Shangri La that has remained thus far undetected. There are several hidden valley models, including one specific model that incorporates the principle of supersymmetry (MSSM). In this model, a complicated decay signal could reveal signs of a dark photon.
There are no dark photons(?), say DZero researchers. Dark photons would have explained the found asymmetry; their existence cannot be completely ruled out if the supersymmetric chargino is heavy. In May 2009 they said:
We search for a new light gauge boson, a dark photon. In the model we consider, supersymmetric partners are pair produced and cascade to lightest neutralinos that can decay into the hidden sector state plus either a photon or a dark photon. The dark photon decays through its mixing with a photon into fermion pairs. We therefore investigate a previously unexplored final state that contains a photon, two spatially close leptons, and large missing transverse energy. We do not observe any evidence for dark photons and set a limit on their production.
In the case of R-parity conservation (antimatter has R-parity), superpartners are produced in pairs and decay to the SM particles and the lightest superpartner.
Including results from Fermi/LAT, there is evidence of an excess of high energy positrons and no excessive production of anti-protons or photons. The excess can be attributed to the dark matter particles annihilating into pairs of new light gauge bosons, dark photons, which are force carriers in the hidden sector. The dark photon mass cannot be much larger than 1 GeV.

In this particular hidden valley model, six particles are created, all indicated by color. The dark photons are shown in yellow. For the other particles, those indicated in light pink will be observed in the detector, while the ones marked with dark pink will escape entirely undetected. If observed, these darkinos might be the dark matter.

The Standard Model needs some modification if the dream of the unification of the forces of Nature is to be realised. One possibility is to introduce a new set of particles, and usually supersymmetric particles are considered, but recent research has hinted at an asymmetry instead (no clear evidence yet).

Axion oscillating pairs (photon pairs) have been predicted by Wilczek et al. But they are not found to be part of the dark matter.

From the Nobel lecture: "The Standard Model also needs modification to incorporate the recently discovered properties of neutrinos - that they have a mass different from zero. In addition, perhaps this will lead to an explanation of a number of other cosmological enigmas such as the dark matter that seems to dominate space."

This is what Kea does.

Asymptotic degrees of freedom.
"The fantastic and unexpected discovery of asymptotic freedom in QCD has profoundly changed our understanding of the ways in which the basic forces of Nature in our world work, says Wilczek et al.. Toms 2010:
However, it is possible instead to view Einstein gravity as an effective theory that is only valid below some high energy scale. The cut-off scale is usually associated with the Planck scale, EP approx. 10^19GeV, the natural quantum scale for gravity. Above this energy scale some theory of gravity other than Einstein’s theory applies (string theory for example); below this energy scale we can deal with quantised Einstein gravity and obtain reliable predictions. Adopting this effective field theory viewpoint it is perfectly reasonable to include Einstein gravity with the standard model of particle physics and to examine its possible consequences using quantum
field theory methods. Quantum gravity leads to a correction to the renormalisation group β-function (not present in the absence of gravity) that tends to result in asymptotic freedom; this holds even for theories (like quantum electrodynamics) that are not asymptotically free when gravity is neglected.
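A toy numerical sketch of that statement, not Toms's actual calculation: take a QED-like one-loop beta function, which alone grows without bound, and add a gravitational power-law term of the schematic form discussed by Robinson, Wilczek and Toms; the coefficient a0 below is an illustrative placeholder, not a computed value.

```python
import math

# Toy RG flow with a gravitational power-law correction (illustrative only):
#   dg/dlnE = b*g^3/(16*pi^2) - (a0/pi)*(E/M_P)^2 * g
# With b > 0 alone, g keeps growing (QED-like, Landau-pole behaviour);
# the gravity term takes over near the Planck scale and pulls g back to zero.
b   = 41 / 10         # QED-like positive one-loop coefficient (illustrative)
a0  = 3.0             # placeholder strength of the gravitational term
M_P = 1.22e19         # GeV, Planck scale

def run(g, E_start, E_end, steps=200_000):
    lnE = math.log(E_start)
    dlnE = (math.log(E_end) - lnE) / steps
    for _ in range(steps):   # simple Euler integration in ln E
        E = math.exp(lnE)
        g += dlnE * (b * g**3 / (16 * math.pi**2) - (a0 / math.pi) * (E / M_P)**2 * g)
        lnE += dlnE
    return g

for E_end in (1e16, 1e18, 1e19, 1e20):
    print(f"g at {E_end:.0e} GeV: {run(0.36, 91.2, E_end):.4f}")
# g creeps upward for many decades of energy, then plunges once E nears M_P.
```

Sliding a0 up or down only moves the turnover scale; the qualitative plunge near the Planck scale is the point.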

The value of the "running" coupling constant αs as a function of the energy scale E. The curve that slopes downwards (negative beta function) is a prediction of asymptotic freedom in QCD.

According to their theories, the force carriers, the gluons, have a unique and highly unexpected property: they interact not only with quarks but also with each other. This property means that the closer quarks come to each other, the weaker the quark colour charge and the weaker the interaction. Quarks come closer to each other when the energy increases, so the interaction strength decreases with energy. This property, called asymptotic freedom, means that the beta function is negative. On the other hand, the interaction strength increases with increasing distance, which means that a quark cannot be removed from an atomic nucleus. The theory confirmed the experiments: quarks are confined, in groups of three, inside the proton and the neutron, but can be visualized as "grains" in suitable experiments.

Asymptotic freedom makes it possible to calculate the small distance interaction for quarks and gluons, assuming that they are free particles. By colliding the particles at very high energies it is possible to bring them close enough together.
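A minimal one-loop sketch of this decrease (standard textbook formula; αs(M_Z) ≈ 0.118 and five quark flavours assumed):

```python
import math

# One-loop running of the strong coupling:
#   alpha_s(Q) = alpha_s(M_Z) / (1 + alpha_s(M_Z) * b0/(2*pi) * ln(Q/M_Z)),
# with b0 = 11 - 2*n_f/3 > 0, so alpha_s shrinks at short distances.
alpha_s_MZ, M_Z, n_f = 0.118, 91.19, 5
b0 = 11 - 2 * n_f / 3

def alpha_s(Q):
    return alpha_s_MZ / (1 + alpha_s_MZ * b0 / (2 * math.pi) * math.log(Q / M_Z))

for Q in (10, 91.19, 1000, 10000):
    print(f"alpha_s({Q} GeV) = {alpha_s(Q):.3f}")
# Grows toward low energies (confinement) and falls at high energies
# (asymptotic freedom): roughly 0.173 at 10 GeV down to 0.070 at 10 TeV.
```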

Unification at Planck scale, Planck constant?
Two reports coherently describe the unification all the way up to energies below the Planck scale: first Toms in Nature, and He et al. on arXiv. They demonstrate that the graviton-induced universal power-law runnings always assist the three SM gauge forces to reach unification around the Planck scale, irrespective of the detail of logarithmic corrections, with implications for the Higgs boson mass. "We are interested in all the bilinear terms of the background photon field, which correspond to the one-loop self-energy diagrams."

Graviton-induced radiative corrections to the photon self-energy: the wavy external (internal) lines stand for background (fluctuating) photon fields, the double lines for gravitons, the dotted lines for photon ghosts, and the circle-dotted line for graviton ghosts. He et al.
More precise numerical analyses including two-loop RG running reveal that even in the MSSM the strong gauge coupling α3 does not exactly meet the other two at the GUT scale, as its value is smaller than (α1, α2) by about 3%. With the gauge-invariant gravitational power-law corrections, they say, we can resolve the running gauge couplings αi(μ) = gi²(μ)/4π. Hence, the universal gravitational power-law corrections will always drive all gauge couplings to rapidly converge to the UV fixed point at high scales and reach unification around the Planck scale, irrespective of the detail of their logarithmic corrections and initial values. It is useful to note that there are clear evidences supporting Einstein gravity to be asymptotically safe via the existence of a nontrivial UV fixed point in its RG flow, so it may be UV complete and perturbatively sensible even beyond the Planck scale.
Toms 2008: Quantum gravity is shown to lead to a contribution to the running charge not present when the cosmological constant vanishes. We show the possibility of an ultraviolet fixed point that is linked directly to the cosmological constant.


Fundamental units in old-fashioned quantum string theory (QST): the quantity ħ is also fundamental in the "positive" sense: it is the quantum of the angular momentum J and a natural unit of the action S. When J or S are close to ħ, the whole realm of quantum mechanical phenomena appears.
The speed of light c has already been implicitly used in order to talk about the area of a surface embedded in space-time. This fact allows us to replace ħ by a well defined length, λs, which turns out to be fundamental both in an intuitive sense and in the sense of S. Weinberg. Indeed, we should be able, in principle, to compute any observable in terms of c and λs. String theory (QST) only needs two fundamental dimensionful constants, c and λs, i.e. one fundamental unit of speed and one of length.
The apparent puzzle is clear: where has our loved ħ disappeared? My answer was then (and still is): it changed its dress! Having adopted new units of energy (energy being replaced by energy divided by tension, i.e. by length), the units of action (hence of ħ) have also changed. And indeed the most amazing outcome of this reasoning is that the new Planck constant, λs², is the UV cutoff.
Remember this UV point; we will come back to it.

He et al. also show that
...for the SM gauge coupling evolutions, despite their familiar non-convergence in the region of 10^13-17 GeV, the three couplings do unify around the Planck scale. For the MSSM, one need not worry about the model-dependent threshold effects or the two-loop-induced non-convergence around the scale of 10^16 GeV; it is quite possible that the GUT does not happen around the scale of 10^16 GeV, as in the SM case. Instead, the real GUT would be naturally realized around the Planck scale, and thus is expected to simultaneously unify with the gravity force as well. This also removes the old puzzle of why the conventional GUT scale is about three orders of magnitude lower than the fundamental Planck scale. Furthermore, the Planck scale unification helps to sufficiently postpone nucleon decays, which explains why all the experimental data so far support proton stability. In addition, this is also good news for various approaches of dynamical electroweak symmetry breaking, such as the technicolor type of theories...
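That "familiar non-convergence" is easy to see at one loop (a sketch with rough M_Z inputs and standard SM coefficients; no gravity, no two-loop terms):

```python
import math

# One-loop SM running: 1/alpha_i(mu) = 1/alpha_i(M_Z) - b_i/(2*pi) * ln(mu/M_Z).
# Pairwise meeting scales of the three gauge couplings (GUT-normalized alpha_1).
M_Z = 91.19                              # GeV
inv_alpha = {1: 59.0, 2: 29.6, 3: 8.5}   # rough 1/alpha_i at M_Z
b = {1: 41/10, 2: -19/6, 3: -7}          # one-loop SM beta coefficients

def meet(i, j):
    """Scale where couplings i and j become equal at one loop."""
    ln = (inv_alpha[i] - inv_alpha[j]) / ((b[i] - b[j]) / (2 * math.pi))
    return M_Z * math.exp(ln)

for i, j in [(1, 2), (2, 3), (1, 3)]:
    print(f"alpha_{i} meets alpha_{j} at ~{meet(i, j):.1e} GeV")
# The three pairwise scales span roughly 10^13 to 10^17 GeV: no single GUT point.
```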

Extrapolation of experimental values, measured at relatively large distances (not shown), for the strength of the strong force (αS), weak force (αW) and hypercharge force (αY) indicates that these forces might merge with gravity for processes characterized by distance scales that are at least 13 orders of magnitude smaller than those currently being explored with the Large Hadron Collider (LHC). The strength of gravity is characterized by αG, and the strength of the electromagnetic force is not shown because at the distance scales displayed this force is replaced by the combination of the weak and hypercharge forces. Toms's analysis investigates a piece of the puzzle that theorists must resolve to correctly describe this unification: the behaviour of gravity in the range between 10^-32 and 10^-35 metres.

References:
S. Robinson and F. Wilczek, Phys. Rev. Lett. 96, 231601 (2006) [arXiv:hep-th/0509050].

To be continued.