Thursday 28 April 2011

The Standard Model, -/+.

Wilczek in a talk at the awards session, EPS-HEPP meeting, Krakow, July 2009. I was looking at his Higgs mechanism and aimed at comparing it to Mattis when I found this very clear résumé.

What are the shortcomings of the standard model?

  • In the standard model as it comes, the pattern of groups, representations, and (especially) hypercharges is a patchwork.
  • In the standard model as it comes, a great discovery of recent years, the existence of small non-zero neutrino masses, appears gratuitous. One can accommodate those masses using non-renormalizable, mass dimension 5 operators, but even then the fact that the masses of neutrinos are far, far smaller than the masses of their charged lepton cousins lacks context.
  • In the standard model as it comes there is no evident link between the standard model and the remaining major force of Nature, i.e. gravity. - That is, GR is not here? (my comment)
  • In the standard model as it comes there is no evident accounting for the bulk of the mass in the universe, i.e. the dark matter and dark energy.
  • In the standard model as it comes there is no explanation for the empirical smallness of the P- and T-violating angle (theta) term.
  • Masses and mixing angles among quarks and leptons are not constrained by any deep principles, and this sector of the standard model brings in a plethora of unworthy “fundamental” constants.
  • The triplication of families is gratuitous. -The fourth generation?
What are the great lessons of the standard model?
  • The gauge principle translates mathematical symmetry into physical law.
  • The observed interaction symmetries and fermion representations fit into a larger, unifying symmetry and unifying representation very economically.
  • Gauge symmetry can be spontaneously broken.
  • Coupling constants evolve with distance, or energy.
Beyond the Standard Litany: LOSP and Higgs Portal; Lattice Gauge Theory (March 2010), PoS EPS-HEP 2009, arXiv:1003.4672 [hep-ph].

Tuesday 26 April 2011

Photons are superconducting.

Think of the Cosmos as a superconductor, says Frank Wilczek. And he wants to see more chemistry and biology linked to physics. There can be lots of benefits in looking at things from different perspectives. The video, about 10 min, is from Physics Buzz.

"we have come to suspect that the entity we call empty space is an
exotic kind of superconductor ... (quark/antiquark)QQ* "

A color superconducting phase is a state in which the quarks near the Fermi surface become correlated in Cooper pairs, which condense. In phenomenological terms, a color superconducting phase breaks some of the symmetries of the underlying theory, and has a very different spectrum of excitations and very different transport properties from the normal phase.
It is very hard to predict which pairing patterns will be favored in nature. In principle this question could be decided by a QCD calculation, since QCD is the theory that fully describes the strong interaction. In the limit of infinite density, where the strong interaction becomes weak because of asymptotic freedom, controlled calculations can be performed, and it is known that the favored phase in three-flavor quark matter is the color-flavor-locked phase. But at the densities that exist in nature these calculations are unreliable, and the only known alternative is the brute-force computational approach of lattice QCD, which unfortunately has a technical difficulty (the "sign problem") that renders it useless for calculations at high quark density and low temperature.

Quantum chromodynamics suggests that most of what we call "matter" is not all that material, because a body is made of elementary "particles" that are almost massless (for example, the proton is made of three quarks, whose combined masses are about 1% of the mass of the proton, and of gluons, which are massless).

Nature wants a quark and an anti-quark to be as near as possible to minimize the energy (the strong/color force increases with distance), but pinpointing an anti-quark's position next to its quark would require an infinite amount of energy (as per Heisenberg's uncertainty principle); and vice versa (the localization energy is minimal when the two particles are let loose, but then the strong-force energy between them would become infinite). The compromise between these two extremes is the mass of the proton.
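The "about 1%" figure quoted above can be checked with rough current-quark masses. The numbers below are approximate PDG-style values, used only for illustration:

```python
# Valence current-quark masses account for only ~1% of the proton's
# mass; the rest is QCD binding energy of quarks and gluons.
# All values approximate, in MeV/c^2.
m_up = 2.2        # up-quark current mass (approximate)
m_down = 4.7      # down-quark current mass (approximate)
m_proton = 938.3  # proton mass

m_valence = 2 * m_up + m_down  # proton = uud
fraction = m_valence / m_proton
print(f"valence quark masses: {m_valence:.1f} MeV/c^2 "
      f"({100 * fraction:.1f}% of the proton mass)")
```

The remaining ~99% is the energy of the confined gluon and quark fields, via E = mc².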

The laws by which the states of physical systems alter are independent of the alternative, to which of two systems of coordinates, in uniform motion of parallel translation relatively to each other, these alterations of state are referred (principle of relativity).

If a body gives off the energy L in the form of radiation, its mass diminishes by L/c². The fact that the energy withdrawn from the body becomes energy of radiation evidently makes no difference, so that we are led to the more general conclusion that

The mass of a body is a measure of its energy-content; if the energy changes by L, the mass changes in the same sense by L/(9 × 10^20), the energy being measured in ergs, and the mass in grammes.

It is not impossible that with bodies whose energy-content is variable to a high degree (e.g. with radium salts) the theory may be successfully put to the test.

If the theory corresponds to the facts, radiation conveys inertia between the emitting and absorbing bodies.

Einstein, 1905.
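The factor 9 × 10^20 in Einstein's statement is just c² in CGS units. A minimal check, assuming c ≈ 3 × 10^10 cm/s:

```python
# Einstein's 1905 statement in CGS units: if a body emits energy L (erg),
# its mass diminishes by L / c^2 = L / (9 x 10^20) grams.
c = 3e10          # speed of light, cm/s (approximate)
c_squared = c**2  # 9e20 cm^2/s^2

def mass_loss_grams(L_erg):
    """Mass equivalent (grams) of radiated energy L (erg)."""
    return L_erg / c_squared

# Example: one kilowatt-hour = 3.6e13 erg radiated away
print(mass_loss_grams(3.6e13))  # a few times 1e-8 grams
```

The tiny result for everyday energies is why the effect was only testable with bodies "whose energy-content is variable to a high degree", such as radium salts.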

Noting that photons become heavy inside (electric) superconductors (or, better, that an observer inside a superconductor would perceive a photon as a massive particle), Wilczek draws the analogy that we live inside a (non-electric) superconductor in which particles (and then objects) acquire mass. That "superconductor" is made of the Higgs condensate, which is made from the Higgs particle. The rest of the review of 'The Lightness of Being', here.

The product structure SU(3)xSU(2)xU(1), the reducibility of the fermion representation (that is, the fact that the symmetry does not make connections linking all the fermions), and the peculiar values of the quantum number hypercharge assigned to the known particles all suggest a larger symmetry. The devil is in the details and it is not at all automatic that the observed, complex pattern of matter will fit neatly into a simple mathematical structure.
From 'In search of symmetry lost'.

Superconductivity, short.
When the pixel is hit by a photon, it disrupts Cooper pairs, releasing free electrons - referred to as "quasiparticles" in this context - which can pass through the insulating junction and are immediately swept out as a measurable current.

The superconducting photon sensors have been used in radiation counters whose spectroscopic capabilities allow them to identify particular controlled radioactive materials and not be confounded by background radiation. Arrays are also useful in astronomy, particularly in the millimeter and submillimeter regions of the electromagnetic spectrum. A wide range of other applications is also being investigated.

Reading The Lightness of Being by Frank Wilczek

I borrow and quote this lovely review, posted on Thursday, March 24th, 2011

An electron is a physical condensation of the electromagnetic field that permeates all of spacetime. Thus every electron has the same source. The electron is like the morning dew, appearing from and returning to “the thin air.”

Photons are not massless in a superconductor. They are heavy. Electromagnetic radiation does not penetrate superconductors. If the universe were a type of superconductor it could explain mass very well. Mass could be the result of the type of superconductor we call empty space. This superconductor would require a new kind of particle… the Higgs condensate. Normal superconductors conduct electrons. This new superconductor would conduct Higgs particles.

Put it this way: if we lived in a Higgs Superconductor we would have trouble finding the Higgs particle that causes Mass.

Dark energy is the discovery that space has an intrinsic density. The grid, the field, weighs. There is a constant pressure everywhere in space and for all time. This pressure is caused by the density of the grid, or field (the ether?). It is the weight of the universe.

Could the universe be like a virtual photon that has condensed out of some more fundamental material? The explosion that we call virtual particle creation, or quark-antiquark creation, might well be described as a "small bang." Could the big bang be a small bang of some greater world's particles?

It from bit. In each 10^-24 seconds a proton computes data equivalent to what our fastest supercomputers compute in months. That is, if you took a supercomputer doing a million million, or trillion (10^12), floating point operations (flops) per second for several months, this would approximate how much the proton calculates. A quantum computer, an analogue of the proton, could calculate in mere seconds what supercomputers could never finish within the lifetime of the universe.
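Taken at face value, the passage's numbers imply the following back-of-the-envelope comparison (all figures are the text's own; "several months" is assumed here to be three):

```python
# Back-of-the-envelope check of the proton-vs-supercomputer comparison.
teraflop_rate = 1e12             # supercomputer: 1e12 flops (text's figure)
several_months = 3 * 30 * 86400  # assume "several months" ~ 3 months, in seconds
supercomputer_ops = teraflop_rate * several_months  # ~8e18 operations total

proton_step = 1e-24              # seconds per proton "computation" (text's figure)
# If the proton does that much work in each 1e-24 s, its implied rate is:
implied_proton_rate = supercomputer_ops / proton_step
print(f"{implied_proton_rate:.1e} ops/s")
```

The implied rate, around 10^43 operations per second, is some thirty orders of magnitude beyond the teraflop machines the passage mentions.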

We have already made the simplest of these quantum computers.

Know this: no one alive can possibly dream of the changes that such a computer would bring to the world.

In physics, a grad student is one who knows everything about nothing and a professor is one who knows nothing about everything. That is to say, grad students study equations; professors understand them. If, after solving an equation, you are surprised at the result, then you are far away from any understanding. Now, with regard to current science, which equations are still surprising us? Well, the solutions of the equations that tell us what mass is output mass from an input of non-mass. It seems we get something from nothing. The search for the Higgs particle is a search to understand why this is. It is a quest to understand mass and inertia, the most fundamental property we are aware of in the universe.

This is to acknowledge that here we have reached the very bottom of what is explainable in the universe, for everything is explained in terms of something else but we still have no deeper explanation of mass and inertia; we simply observe their effects. To this day, their explanation – their cause – remains a complete and total mystery, surpassed only perhaps by the mystery of the nature of God.

Mass is the sound of space vibrating. Our fingers can hear this vibration and send the data to our brain, which creates a map of this data. This map we call mass. We have other maps made through sight, sound, taste, smell. Each of these senses feels certain vibrations of the universe and sends the data to our brain. These maps converge in the mind to create what we call reality.

When you have an apparent contradiction, this means appearances are wrong. The question then becomes, why? This is the lesson of all paradox. We are faced with discovering some flaw in appearances.


electron >------------------< quark
    |                          |
    |                          |
    |                          |
    |                          |
    |                          |
    |                          |
photon   >------------------< gluon

Spin in quantum terms can be thought of as layers of dimensions such that as a particle moves from one layer, one dimension, to another, it changes spin. This at first looked like different particles. They had different masses. But supersymmetry sees these particles of different spin and different mass as the same particle as it moves from one quantum dimension to another.

According to physics, gravity is fundamental. It can’t be explained in terms of anything simpler. It is what it is. It could be that the explanation of gravity is due to a phenomenon beyond the perimeter of our universe.

Mass is derived from the energy of quarks and gluons, moving at the speed of light, huddling together to shield one another from the force of the grid from which they precipitated.

Publications of Frank Wilczek
K. Rajagopal and F. Wilczek, 2000: "The condensed matter physics of QCD", arXiv:hep-ph/0011333.
F. Wilczek, 2008: The Lightness of Being: Mass, Ether, and the Unification of Forces. Google Books.
F. Wilczek, 2005: "In search of symmetry lost", Nature 433.
A. Einstein, 1905: "Does the inertia of a body depend upon its energy-content?" [English translation], Ann. Phys. 18, 639–641, and some of its citations.

Monday 25 April 2011

Easter witches.

Sweet Easter witches visited us, as is the tradition here. They want chocolate eggs and sweets. This one is particularly dear to me :)

Then later in the evening come the giant fires, aimed at scaring away all witches and trolls. In ancient times they could even steal babies, so they are not nice although they look it. Today I have heard no such stories, only small jokes and some less nice things, like destroyed post boxes, but nothing serious anymore. Also firecrackers and fireworks are forbidden nowadays. So Easter has been tamed quite a bit.

BTW her clothes were one of my first sewing projects - when I was a kid :) I had to sew by hand. So long ago.

Wednesday 20 April 2011

Complexity and evolution.

To evaluate the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. An information-theoretic definition identifies genomic complexity with the amount of information a sequence stores about its environment [= epigenetics, introns, histones included?]. The simulated evolution of genomic complexity in populations of digital organisms is investigated and the evolutionary transitions that increase complexity are monitored in detail. Because natural selection forces genomes to behave as a natural “Maxwell Demon”, within a fixed environment genomic complexity is forced to increase (Adami et al., Evolution of biological complexity). The demon "shows that the Second Law of Thermodynamics has only a statistical certainty." The hypothetical demon opens the door to allow only the "hot" molecules of gas to flow through to a favored side of the chamber, causing that side to gradually heat up while the other side cools down.
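The demon's sorting action can be sketched in a toy simulation (entirely illustrative; the molecule speeds, chamber sizes and speed threshold are invented):

```python
import random
random.seed(1)

# Toy Maxwell demon: molecules start mixed in two chambers; the demon
# lets only fast molecules pass into the right chamber and only slow
# ones into the left. The mean squared speed ("temperature") of the
# two chambers then diverges - at the price of the demon's sorting work.
left = [random.gauss(0, 1) for _ in range(500)]
right = [random.gauss(0, 1) for _ in range(500)]
threshold = 1.0  # assumed speed cutoff separating "hot" from "cold"

fast = [v for v in left if abs(v) > threshold]    # demon passes these right
slow = [v for v in right if abs(v) <= threshold]  # demon passes these left
left = [v for v in left if abs(v) <= threshold] + slow
right = [v for v in right if abs(v) > threshold] + fast

temperature = lambda vs: sum(v * v for v in vs) / len(vs)
print(temperature(left), temperature(right))  # right chamber ends up "hotter"
```

The information the demon gathers about each molecule is what pays the entropy bill, which is the analogy the Adami paper draws for selection within a fixed environment.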

This means that Life always fights against (dissipative structures?) natural selection, and that there is also another, bigger force (basic energy state = coherence?, thermodynamics?) creating evolution? The environment is always part of evolution, and the individual is only a small part. In fact, the individual IS made of environment (Nature and Nurture with Hebbian plasticity, IQ, personality, cloning), only with a dissipative time-lag there also. A dissipative structure is characterized by the spontaneous appearance of symmetry breaking (anisotropy) and the formation of complex, sometimes chaotic, structures where interacting particles exhibit long-range correlations. Evolution happens also without living structures, but they may work as enzymes and boost evolution through their dissipative structures. Thus complexity is forced to grow? Topology and fractality tie the structures together? See Harris and Pinker.

Then came the concept of junk DNA (introns) and the recognition that genome size was not a reliable indicator of the number of genes. That, plus the growing collection of genome size data, soon called into question simplistic diagrams like the one shown here from an article by John Mattick in Scientific American. (There are many things wrong with the diagram. See What's wrong with this figure? at Genomicron.) From Sandwalk.

Genetic diversity in nature, i.e., molecular genetic hereditary differences within and between populations and species, is the basis of evolutionary change, said Darwin, 1859. What is 'genetic diversity'?
Nevertheless, despite its cardinal role in evolution and domestication, the origin, nature, dynamics, function, significance, and maintenance of genetic diversity in nature remain controversial. Likewise, questions have emerged concerning which evolutionary processes influence natural genetic variation for phenotypic, quantitative-trait (QTL) variation, or what the genetic basis is of ecologically important morphological variation such as the diverse color patterns of mammals. Controversies included allozyme and DNA markers, and currently include the basic units of genetic diversity, i.e., single nucleotide polymorphisms (SNPs) and copy number variation (CNV), which are both abundant in nature and largely enigmatic as to their functional significance. The basic pending problems are how much of the molecular diversity in nature, in both coding and noncoding genomic regions, is adaptive, and how many of the mutations occurring in natural populations are nonrandom (adaptive, directed) rather than random, neutral, nearly neutral, non-adaptive or deleterious. Scholarpedia
Newtonian view.
Charles Darwin's ideas about evolution were first announced in public - at a meeting of the Linnean Society in London. Oddly, almost nobody noticed! It might not have helped that the joint paper (with Alfred Russel Wallace) was given the ripping title 'On the Tendency of Species to Form Varieties; and on the Perpetuation of Varieties and Species by Natural Means of Selection'.

To see evolution only as natural selection is wrong and simplistic? This was realized already by Wallace, in his concern about social communities. Alfred Russel Wallace thought that humans shared a common ancestor with apes, but that the higher mental faculties could not have evolved through a purely material process. He was much concerned about domestic animals' loss of consciousness. Darwin too had this aspect in his theory, rejecting the crude, direct application of natural selection to social policies, but it has widely been ignored and transformed. In The Descent of Man he insisted that humans are a deeply social species whose values cannot be derived from selfish calculation. Yet, as a man of his age, he still shared Spencer's obsessive belief in the creative force of competition. He ruled that natural selection was indeed the main cause of evolutionary changes. Sexual reproduction alone, and care for the offspring, speak against natural selection. See the 6 Darwin lectures. Darwin was sure, however, that natural selection could not be the sole cause. Ironically, today Darwinism is much associated with social science, often with great success. What does that tell us about our communities? A degenerated human race?
The Darwinian vision of adaptive evolution may include adaptive nonrandom mutations, as Darwin himself emphasized when discussing directed variation in addition to natural selection (chapter IV of the 6th edition of The Origin of Species): "there can also be little doubt that the tendency to vary in the same manner has often been so strong that all the individuals of the same species have been similarly modified without the aid of any form of selection". As Templeton (2009) emphasized, "we are just beginning to embrace again Darwin's broad vision that adaptive evolution involves more than natural selection or random variation... the 21st century will undoubtedly witness a more comprehensive synthesis based on Darwin's enduring legacy". Scholarpedia.
This brings to mind the morphogenetic fields of habit, à la Sheldrake. A sort of memory field. Very difficult to prove. Evolution is also said to happen in steps, sometimes very fast, sometimes very slow. The memory field would enhance the learning capacity (of animals), as some kind of 'knowledge for free'. There is a 'critical mass' needed to bring about a change.

Lamarckism, transmutationism and orthogenesis were all non-Darwinian theories.
Lamarck had a model of a 'ladder of complexity'; evolution struggles toward increased complexity. He used alchemy as a scientific method, and because of that he has been ridiculed, but alchemy has certain traits in common with quantum physics, and it deserves a thorough examination. After all, we don't yet know exactly what an atom is. A 'ladder of evolution' has also been used earlier.

In fact, is Lamarck's view more exact? Genotype and phenotype are not perfectly correlated. And both depend on heredity. The important one is the phenotype?

Assumption: All living organisms alive today have descended from a common ancestor (or ancestral gene pool). This is the same kind of statement as seeking a theory of everything. It is inherent in the statement that a development or change happens. Is it random or not? Biological systems are entropically open - and the thermodynamic force is directed?

Canalization as an evolutionary force in biological systems? It is a useful exercise to think about where biological complexity comes from as a way to facilitate the selection of data mining methods.
A common challenge in identifying meaningful patterns in high-dimensional biological data is the complexity of the relationship between genotype and phenotype. Complexity arises as a result of many environmental, genetic, genomic, metabolic and proteomic factors interacting in a nonlinear manner through time and space to influence variability in biological traits and processes. The assumptions about this complexity greatly influence the analytical methods we choose for data mining and, in turn, our results and inferences. For example, linear discriminant analysis assumes a linear additive relationship among the variables or attributes, while a support vector machine or neural network can model nonlinear relationships.
Tools for non-local, non-linear, quantal effects are very few. Maybe vectors (bosonic forces)? This is not a Newtonian, reductionistic view anymore, but a quantum physical one.
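The point about linear-additive versus nonlinear models can be illustrated with a minimal XOR-style interaction. The data are synthetic, and plain least squares stands in for the classifiers named above:

```python
import numpy as np

# XOR: the phenotype depends on a pure interaction between two factors.
# A linear additive model cannot capture it; adding one nonlinear
# (interaction) feature resolves it exactly.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

# Linear additive fit: y ~ a*x1 + b*x2 + c
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
linear_pred = A @ coef            # all predictions 0.5: no separation at all

# With the interaction term x1*x2: y ~ a*x1 + b*x2 + c*x1*x2 + d
A2 = np.column_stack([X, X[:, 0] * X[:, 1], np.ones(len(X))])
coef2, *_ = np.linalg.lstsq(A2, y, rcond=None)
nonlinear_pred = A2 @ coef2       # exact fit: 0, 1, 1, 0
```

The choice of model class thus silently encodes an assumption about how genotype maps to phenotype, which is exactly the warning in the passage above.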
This observation that complex organisms can be produced from simpler ones has led to the common misperception of evolution being progressive and having a direction that leads towards what are viewed as "higher organisms". Nowadays, this idea of "progression" in evolution is regarded as misleading, with natural selection having no intrinsic direction and organisms selected for either increased or decreased complexity in response to local environmental conditions. Although there has been an increase in the maximum level of complexity over the history of life, there has always been a large majority of small and simple organisms and the most common level of complexity (the mode) has remained constant. (The mode of a discrete probability distribution is the value x at which its probability mass function takes its maximum value. In other words, it is the value that is most likely to be sampled.) Wikipedia
You see, "in response to environment" is the important factor; that is, the environment is determining, not individuals. A poorer environment automatically gets poorer individuals. But does this modal value tell anything essential about evolution? Or diversity? Only about silent assumptions?
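A minimal illustration of the mode used in the Wikipedia passage (the complexity scores below are invented for the example):

```python
from collections import Counter

# Hypothetical "complexity levels" for a population of organisms.
# Most organisms stay simple, a few reach high complexity, so the
# maximum rises while the mode stays put.
complexities = [1, 1, 1, 2, 1, 3, 2, 1, 5, 1, 2, 9]

mode, count = Counter(complexities).most_common(1)[0]
print(mode, count)  # mode 1, occurring 6 times
```

Here the maximum complexity (9) can grow without the mode (1) moving at all, which is the "passive trend" the quoted text describes.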
The growth of complexity may be driven by the co-evolution between an organism and the ecosystem of predators, prey and parasites to which it tries to stay adapted: as any of these become more complex in order to cope better with the diversity of threats offered by the ecosystem formed by the others, the others too will have to adapt by becoming more complex, thus triggering an on-going evolutionary arms race towards more complexity. This trend may be reinforced by the fact that ecosystems themselves tend to become more complex over time, as species diversity increases, together with the linkages or dependencies between species.

If evolution possessed an active trend toward complexity, then we would expect to see an increase over time in the most common value (the mode) of complexity among organisms, as shown to the right. Indeed, some computer models have suggested that the generation of complex organisms is an inescapable feature of evolution. This is sometimes referred to as evolutionary self-organization. Self-organization is the spontaneous internal organization of a system. This process is accompanied by an increase in systemic complexity, resulting in an emergent property that is distinctly different from any of the constituent parts.

However, the idea of increasing production of complexity in evolution can also be explained through a passive process. This involves an increase in variance but the mode does not change. Thus, the maximum level of complexity increases over time, but only as an indirect product. Wikipedia
Does it really matter for the result whether the process is direct or indirect? After all, the phenotype is about epigenetic/environmental regulation of the genotype. Non-coding RNA is mostly about introns. See the list of RNAs, histone code. Horizontal gene transfer bypasses the individual genome, at least for bacteria and unicellular organisms (which also act as superorganisms). In fact, evolution has always been linked to changes in environment. No change, no need for evolution. Evolution is about neutralization of stress, thermodynamics, finding the basic, lowest energy level in equilibrium; homeostasis? Selection overrules gene flow. I would NOT trust Wikipedia. Evolution also has traits of quantum mechanics that overrule the individual, such as horizontal gene transfer, synchronized behaviours (EEG waves) etc. Biological selection can also occur at multiple levels simultaneously, like some 'sum over histories' in quantum tunneling, or to find/compute the path, or in synchrony? This is called superorganisms. See also the Human Microbiome Project, collective intelligence.

Evolution is also the successive alterations that led from the earliest proto-organism to more complex organisms. These simple organisms should then also grow their diversity (sum of paths)? It has been shown many times that micro-organisms behave (as phenotype) as superorganisms.

Current tree of life showing horizontal gene transfers. The prevalence and importance of HGT in the evolution of multicellular eukaryotes remain unclear. Wikipedia.

The quantum physical non-linear entanglement is the first level of coherence, or the basic homeostasis?

Scaling and thermodynamics.
TGD in Scaling Law and Evolution:
Cultural evolution could perhaps be seen as evolution of memes with memetic code playing the role of genetic code. There are good reasons to believe that the intronic portion of DNA codes for memes represented dynamically as field patterns associated with massless extremals.
(MEs = bosons, candidates for correlates of dark photons at various levels of the dark matter hierarchy = the quantum aspect = first level of coherence). See also Macroscopic Quantum Coherence and Quantum Metabolism as Different Sides of the Same Coin.
Thermodynamics is important for TGD, but there is also a condition when the second law doesn't hold. Negentropy Maximization Principle states that "The sequence of quantum jumps is expected to lead to a 4-D counterpart of a thermodynamical ensemble, and thermodynamics results when one labels the states by the quantum numbers assignable to their positive energy part. Entropy is assigned with the entire 4-D CD rather than to its 3-dimensional time=constant snapshots." The hierarchy of CDs forces one to introduce a fractal version of the second law taking into account the p-adic length scale hypothesis and the dark matter hierarchy. The second law should always be applied only at a given level of the p-adic and dark matter hierarchy, and one must always take into account the two time scales involved, one subjective CD and one objective (= the system), and only if the system is bigger does the second law hold.
Maxwell Demon is at work?
This means that the second law could be broken in the geometric time scale (= system) considered, but not for personal CDs. In the case of subjective time the negentropic situation could continue forever unless the CD disappears in some quantum jump :) The huge value of the gravitational Planck constant associated with space-time sheets mediating gravitational interaction and making possible a perturbative quantum treatment of gravitational interaction would indeed suggest the breaking of the second law in cosmological time scales. Modification of thermodynamics to take into account negentropic entanglement.
This is combined with broken Lorentz symmetry (?) and gravitational waves?

"Quantum interference of large organic molecules" pretty well says it all. A group in Austria, headed by Markus Arndt, with Stefan Gerlich as the lead author, has demonstrated interference effects using large organic molecules. When they pass a beam of these molecules through an interferometer, they behave exactly like waves, producing an interference pattern analogous to the bright and dark spots you get by sending light into a diffraction grating. The ultimate goal of this is to try to push the quantum-classical boundary as far as possible. Macroscopic bodies show no signs of wave-like behavior. Microscopic objects like electrons, atoms, and now molecules, on the other hand, clearly show wave behavior.

BBC News has a piece about the continued work of the ILL to make neutrons show quantum effects in the gravity of Earth.

Astrobiology, UCLA, stepwise evolution:
Progress in the evolution of life on Earth is measured by an increase in biological diversity (biodiversity), by an increase in morphological disparity, and by progress towards increasing complexity. These are presumably general principles which apply to any living system, and so studies of the processes that result in these advances should have universal application. Of the three metrics for evolutionary progress, complexity is the most difficult to define, measure, and understand. In this section, biologists and paleontologists attempt to grapple with this difficult issue by focusing on two outstanding biological events: the origin of eukaryotes and the radiation of animals during the "Cambrian Explosion".
See also Earth's Early Environment and Life, and Professor Rainey's picture of life's evolution from the perspective of major evolutionary transitions, including that from solitary organisms to societies.

Hawking and Mlodinow, The Grand Design, Phys.World, 2010: A second premise that the reader is expected to accept as The Grand Design moves along is that we can, and should, apply quantum physics to the macroscopic world. To support this premise, Hawking and Mlodinow cite Feynman's probabilistic interpretation of quantum mechanics, which is based on his "sum over histories" of particles. Basic to this interpretation is the idea that a particle can take every possible path connecting two points. Extrapolating hugely, the authors then apply Feynman's formulation of quantum mechanics to the whole universe: they announce that the universe does not have a single history, but every possible history, each one with its own probability.

Note also the periods of great extinction of animals. We are in such a period now. A clear quantum state?
Rates of change are increasingly fast and nonlinear with sudden phase shifts to novel alternative community states. We can only guess at the kinds of organisms that will benefit from this mayhem that is radically altering the selective seascape far beyond the consequences of fishing or warming alone. The prospects are especially bleak for animals and plants compared with metabolically flexible microbes and algae.
First the individual or populational level, after that the quantum aspect of evolution.

The individual and population.
Bits of information collected in qubits or selves?
Qubits and information are classical concepts. We don't yet know what their quantal counterpart is, other than that it is non-local. It can be random, but more probably it is entangled (one 'bit' of 'holism').

In order to have evolution working there must be some ‘bits of information’ entangled into a (topological and fractal?) Self. Otherwise evolution has no tool to work with. The self has to be saved and protected with boundaries; membranes, shells, competition, war and selfishness. Self and Ego and Personality express a tool in different degrees of stability. Richard Dawkins' book The Selfish Gene.

This stability is the secondary or tertiary coherence or entanglement, seen as synchronized patterns, discrete information? If homeostasis is the first, lowest, most basic energy level in temporal equilibrium, then there are secondary, tertiary etc. energy levels in temporal equilibrium too - called allostasis? Also seen in protein folding? What is natural selection if not a striving for energetic equilibrium? Also malfunctions, such as disease, can be such a tool; built-in stress, or created stress, in order to lower the total energy need? Structure is a result of these energy levels of maximal function. To find these levels is then the most basic task for determining evolution. What can structure tell us about the energy levels? They are revealed through maximal function, or use of energy.

Is competitive evolution the only possible context for explaining the biological facts? The trouble with metaphors is that they don't just mirror scientific beliefs, they also shape them: a metaphor expresses, advertises and strengthens our preferred interpretations, and carries unconscious bias. Competition cannot be the ultimate human reality, because of the social aspects. Does evolution strive towards socialism?
Unresolved conflicts between communal and separatist views of human nature, inherited from Western history, are the reason for their enormous influence on thought still today. The world picture shifted from a feudal, communal pattern towards the more individualistic, pluralistic model we officially follow today. Ideals of personal allegiance, heroic warfare and the divine right of kings began to yield to civilian visions based on democracy, technology and commerce. The cooperative and communal idea is poorly supported today. The individualistic, atomistic (Newtonian), classical-physical and reductionistic view has largely been seen as the only scientific way. The Universe is seen as a giant clock, and biology as a machine with molecular motors. The selfish metaphor: Conceits of evolution.
Evolution as a metaphor.
History: Evolution has been the most glaring example of the thoughtless use of metaphor over the past 30 years, with the selfish/war metaphors dominating and defining the landscape so completely that it becomes hard to admit there are other ways of conceiving it. Descartes went beyond faith (the Bible) in attempting to rationally explain the world and our place in it. It was revolutionary, and allowed a scientific view, but only if the soul was left to the church. The soul- and mindless science was born. Today the soul is still almost completely ignored.

Very different kinds of metaphors and images have been used. Rousseau, Voltaire and Locke devised a kind of social atomism (social atomism and the old world order: in the eighteenth century it was the basis for two violent revolutions and a good deal of social unrest). One's value would no longer be determined by one's place in the social order, which is to say by heredity, nor was it to be determined by divine right. It was, henceforth, to be a matter of one's intrinsic worth as a human being and a citizen working together in voluntary association with others for the common good. American society inherited and encoded in its founding documents the principles of equality and individualism, transforming them into a new tradition with its own hierarchy of values and preferences. Philosophical theories can play, and often have played, an influential role in social change, as cultural evolution. By critically examining the genealogy of the dominant philosophical concepts of the self and its relation to social and cultural forms, we can come to a clearer understanding of what shapes contemporary thinking about our own self-identities and our role in the struggle for a just and ethical society.

Thomas Hobbes's claim (made with the analytic tools of the geometrician) that the natural state of humans was "a war of all against all" (put forward in a bid to stop people supporting misguided governments) accidentally launched a wider revolt against the notion of citizenship. "All human beings have a natural right to self-preservation, and adding an additional premise regarding human psychology, that all human beings are motivated by a kind of enlightened self-interest, Hobbes concluded that the only rational way to live reasonably peaceful and productive lives would be for individuals, who are equally capable of inflicting harm on one another, to agree to refrain from interfering in one another's lives. But upholding this agreement would entail a compromise, that is, an exchange of one's total independence and a relinquishing of one's rights for a greater measure of security and dependence on another, the duly appointed sovereign, who was granted absolute authority over society provided he fulfilled his duty to keep the peace." The slogan made it possible to argue later that there is no such thing as society, that we owe one another nothing. (Moral atomism followed. "Ethical atomism combined with Hobbes' and Locke's social atomism supplies some of the most important and characteristic features of American political theory, and the imprint of these ideas is evident and fixed in the Constitution.") This thought also inspired campaigns for admirable things like easier divorce and universal suffrage, and it is still strong today, even though physicists themselves no longer see their particles as radically disconnected; economics, meanwhile, argues that free competition always serves the general good. Its champions could thus believe they were being scientific while still acting as good citizens. And its emphasis on conflict reassured them they were still heroes, that bourgeois life had not corrupted their machismo.
So atomistic thinking, originally drawn from physics, acquired a social meaning in economics and was then returned to science as ideas of competition began to dominate 19th-century biology. The resulting jumble of atomistic ontology, market economics and warlike noises came together fully in the theories of 19th-century "social Darwinists" like Herbert Spencer.

In How The Leopard Changed Its Spots: The Evolution of Complexity (1997), biologist and complexity theorist Brian Goodwin suggested the kind of correction needed, remarking mildly that humans are "every bit as co-operative as we are competitive; as altruistic as we are selfish... These are not romantic yearnings and Utopian ideals, they arise from a rethinking of our nature that is emerging from the sciences of complexity". He represents the structuralist approach, which resonates with D'Arcy Thompson's idea that evolutionary variation is constrained by structural laws; not all forms are possible. These ideas are now connected with new principles of dynamic emergence from complex systems, as developed within the sciences of complexity. Goodwin is strongly opposed to the reductionist view of the ultra-Darwinians, and much more comfortable with the complexity ideas of Stuart Kauffman and with Francisco Varela's holistic approach to biology. A dialog here. Even Dawkins pointed out in The Selfish Gene that genes are actually cooperative rather than egoistic.

D'Arcy Thompson pointed this out in On Growth And Form in 1917, noting the striking natural tendencies which contribute to evolution - the subtle, natural patterns such as Fibonacci spirals that shape all manner of organic forms, and the logic underlying patterns such as the arrangement of a creature's limbs. Biologists such as Brian Goodwin, Steven Rose and Simon Conway Morris have developed his work, showing how natural selection is supplemented by a kind of self-organisation within each species, which has its own logic. The selfish metaphor: Conceits of evolution.

Competition and discreteness require entanglement and systems.
Entities complex enough to compete cannot exist at all without much internal cooperation. There cannot even be ordinary matter without entanglement (Vedral). Entanglement is the degree of correlation between observables pertaining to different modes that exceeds any correlation allowed by the laws of classical physics. Systems can be entangled in their external degrees of freedom (positions and momenta) or internal degrees of freedom, as in resonances, Bose–Einstein condensates, superconductivity... The type of entanglement also determines the implementation of various gates, interpreted as hierarchies, cut-offs or boundaries. Lasers and Bose–Einstein condensates are macroscopic effects of boson statistics, and require highly specialized environments (as in membranes and nerves; see Heimburg). Carbon, protons and photons can be composite bosons, and electrons as composite fermions give the essential bricks of life. To create cells, organelles must combine; to create armies, soldiers must organise.
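The idea that entanglement is correlation exceeding any classical limit can be made concrete in a toy calculation: the von Neumann entropy of a subsystem measures how entangled a two-qubit pure state is. This is only a minimal sketch (numpy assumed; the helper names and example states are illustrative, not from any source cited here):

```python
import numpy as np

def von_neumann_entropy(rho):
    """Entanglement entropy in bits of a reduced density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # drop zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

def reduced_density_matrix(psi):
    """Trace out the second qubit of a two-qubit pure state vector."""
    m = psi.reshape(2, 2)                  # amplitudes indexed (qubit A, qubit B)
    return m @ m.conj().T                  # partial trace over qubit B

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
product = np.array([1, 0, 0, 0], dtype=complex)             # |00>, unentangled

print(von_neumann_entropy(reduced_density_matrix(bell)))     # maximal: ~1 bit
print(von_neumann_entropy(reduced_density_matrix(product)))  # none: ~0 bits
```

A classically correlated pair can never push the subsystem entropy of a pure state above zero; only entanglement does that, which is why this quantity is used as an entanglement measure.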

But there is also entanglement outside space or time, the EPR paradox, demonstrated many times, for instance by Tiller.

What is this entanglement, new organisation, self-assembly, if not information? What are the roots of the information? What is the construction, or logic?

The work of evolution can be seen as intelligent and constructive, not as a gamble driven randomly by the forces of competition. And if non-competitive imagery is needed, Denis Noble found that there was not a single oscillator which controlled heartbeat; rather, this was an emergent property of the feedback loops in the various channels. In The Music Of Life (2006) he points out how natural development, not being a car, needs no single "driver" to direct it. Symphonies are not caused only by a single dominant instrument nor, indeed, solely by their composer. And developing organisms do not even need a composer: they grow, as wholes, out of vast and ancient systems which are themselves parts of nature. He earlier wrote The Logic of Life and The Ethics of Life. He is one of the pioneers of Systems Biology and now works in Computational Physiology. He played a major role in launching the Physiome Project, an international project to use computer simulations to create the quantitative physiological models necessary to interpret the genome. He is critical of the ideas of genetic determinism and genetic reductionism. Gene ontology will fail without higher-level insight. The theory of biological relativity: there is no privileged level of causality, and the Self is no object. There are no programs in the brain, nor anywhere else; a genuine ‘theory of biology’ does not yet exist. See Genes and causation, How Central Is the Genome?, All systems go, The Music of Life - A view on nature and nurture, Molecules to Organisms: What directs the music of life, and his homepage.
He contrasts Dawkins's famous statement in The Selfish Gene ("Now they [genes] swarm ... safe inside gigantic lumbering robots ... they created us, body and mind; and their preservation is the ultimate rationale for our existence") with his own view in The Music of Life: "Now they [genes] are trapped in huge colonies, locked inside highly intelligent beings, moulded by the outside world, communicating with it by complex processes, through which, blindly, as if by magic, function emerges. They are in you and me; we are the system that allows their code to be read; and their preservation is totally dependent on the joy we experience in reproducing ourselves. We are the ultimate rationale for their existence". He further suggests that there is no empirical difference between these statements, and claims that they differ only in "metaphor" and "sociological or polemical viewpoint".
The old metaphors of evolution need to give way to new ones founded on integrative thinking - reasoning based on systems thinking.

Matti says too that biology is a Big Anomaly :)

The Well. What is complexity?
Darwinian evolution is a simple yet powerful process that requires a population of reproducing organisms in which each offspring has the potential for a heritable variation. This creates the p-q variational/oscillational well (with dark matter?). A population is not an individual, and potentials are not realizations. Both follow a distribution curve, be it Gaussian or something else. What does the mode tell us? What would be a better measurement? Variability, diversity? That is an increase in potentials, or degrees of freedom?

Defining complexity as “that which increases when self-organizing systems organize themselves” amounts to an increase of "qubits", entangled bits of freedom. Complex complexity? Of course, in order to address this issue, complexity needs to be both defined and measurable.

Structural and functional complexity by examining genomic complexity.
1. Structural complexity is mirrored in functional complexity, and vice versa?
- the ambiguous definition of complexity
- the obvious difficulty of matching genes with function
2. Complexity defined in a consistent information-theoretic manner (the “physical” complexity), which appears to encompass intuitive notions of complexity used in the analysis of genomic structure and organization.
3. Evolution can be observed in an artificial medium: universal aspects of the evolutionary process in a computational world, where the symbolic sequences subject to evolution are computer programs that have the ability to self-replicate. See also experiments with RNAs.
4. Superpositions and entanglements create situations of "knowing more than everything" in informational energies, creating a "negative negentropy" of enlarged or enlightened consciousness/information, the aha experience.
5. Noise and inhibition are very important constraints, creating synchrony, or secondary stable equilibrium levels.

In populations of such sequences that adapt to their world (inside of a computer’s memory), noisy self-replication coupled with finite resources and an information-rich environment leads to a growth in sequence length as the digital organisms incorporate more and more information about their environment into their genome [= epigenetic factors grow?]. These populations allow us to observe the growth of physical complexity explicitly, and also to distinguish distinct evolutionary pressures acting on the genome and analyze them in a mathematical framework. Evolution of biological complexity
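The scenario above (noisy self-replication, finite resources, selection in an information-rich environment) can be caricatured in a few lines. This is only a hedged toy sketch, not Adami's actual digital-organism system: genomes are bitstrings, the "environment" is a fixed target string, fitness counts matching bits, and all names and parameters are illustrative.

```python
import random

def evolve(target, pop_size=50, generations=200, mu=0.02, seed=1):
    """Toy digital evolution: bitstring genomes replicate noisily,
    and fitness counts the bits matching a fixed 'environment'."""
    rng = random.Random(seed)
    n = len(target)
    fitness = lambda g: sum(a == b for a, b in zip(g, target))
    # random initial population
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        # selection: fitter genomes are chosen more often as parents
        weights = [1 + fitness(g) for g in pop]
        parents = rng.choices(pop, weights=weights, k=pop_size)
        # noisy replication: each bit flips with probability mu
        pop = [[b ^ (rng.random() < mu) for b in g] for g in parents]
    return max(fitness(g) for g in pop)

target = [1, 0] * 16   # a 32-bit 'environment'
best = evolve(target)
print(f"best genome matches {best}/{len(target)} environment bits")
```

Starting from random genomes (about half the bits correct), selection plus mutation steadily pushes the population toward genomes that "know" more about the environment, which is the sense in which physical complexity grows in such experiments.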

Here we must remember that genomes are 2-D constructions, RNAs and proteins are 3-D, and functions are 4-D (time included). Time is seen only in function, in sequences, as a result of energy differences and polarizations (bosons). Time is the 'dance' metaphor.

We can measure energy differences as entropy, but living systems are open. Can entropy be measured in structure, as complexity or decoherence? That is, as the degree of stress versus relaxation?

C-value paradox in genomes.
Back in the olden days, when everyone was sure that humans were at the top of the complexity tree, the lack of correlation between genome size and complexity was called the C-value paradox, where "C" stands for the haploid genome size.
The C-value paradox takes its name from our inability to account for the content of the genome in terms of known function. One puzzling feature is the existence of huge variations in C values between species whose apparent complexity does not vary correspondingly. An extraordinary range of C values is found in amphibians, where the smallest genomes are just below 10^9 bp while the largest are almost 10^11 bp. It is hard to believe that this could reflect a 100-fold variation in the number of genes needed to specify different amphibians. Benjamin Lewin, Genes II (1983)
There are so many examples of very similar species that have huge differences in the size of their genomes. Onions are another example; they are the reason why Ryan Gregory made up the Onion Test: a simple reality check for anyone who thinks they have come up with a universal function for non-coding DNA. Whatever your proposed function, ask yourself this question: Can I explain why an onion needs about five times more non-coding DNA for this function than a human?

If an organism’s complexity is a reflection of the physical complexity of its genome (assumed here), the latter is of prime importance in evolutionary theory. Physical complexity, roughly speaking, reflects the number of base pairs in a sequence that are functional. As is well known, equating genomic complexity with genome length in base pairs gives rise to a conundrum (known as the C-value paradox), because large variations in genomic complexity (in particular in eukaryotes) seem to bear little relation to the differences in organismic complexity. The C-value paradox is partly resolved by recognizing that not all DNA is functional (non-coding DNA); there is a neutral fraction that can vary from species to species. Human DNA is mostly made of ancient viruses, for instance, as an immunological response, just as non-coding RNAs are a primitive immune system. Humans, Arabidopsis, and nematodes all have about the same number of genes.

The C-value enigma, unlike the older C-value paradox, is explicitly defined as a series of independent but equally important component questions, including:

  • What types of non-coding DNA are found in different eukaryotic genomes, and in what proportions?
  • From where does this non-coding DNA come, and how is it spread and/or lost from genomes over time?
  • What effects, or perhaps even functions, does this non-coding DNA have for chromosomes, nuclei, cells, and organisms?
  • Why do some species exhibit remarkably streamlined chromosomes, while others possess massive amounts of non-coding DNA?
Now we have a G-value paradox, where "G" is the number of genes. The only way out of this box—without abandoning your assumption about humans being the most complex animals—is to make up some stories about the function of so-called junk DNA. Maybe humans really aren't all that much more complex, in terms of number of genes, than wall cress. Maybe they should have the same number of genes. Maybe the other differences in genome size really are due to variable amounts of non-functional junk DNA.

It is likely that a significant increase in this non-neutral fraction could be observed throughout at least the early course of evolution. For the later period, in particular the later Phanerozoic Era, it is unlikely that the growth in complexity of genomes is due solely to innovations in which genes with novel functions arise de novo. Indeed, most of the enzyme activity classes in mammals, for example, are already present in prokaryotes. Rather, gene duplication events leading to repetitive DNA and subsequent diversification as well as the evolution of gene regulation patterns appears to be a more likely scenario for this stage. Still, we believe that the Maxwell Demon mechanism described below is at work during all phases of evolution and provides the driving force toward ever increasing complexity in the natural world. Evolution of biological complexity

A modification of the second law?

The notion of information.
The difficulty with defining complexity is that there is no single measure that can capture all structural characteristics. The different meanings of complexity and their measures restrict their eventual usefulness, which so far seems mainly descriptive. Maybe decoherence is a better term?

Perhaps a key aspect of information theory is that information cannot exist in a vacuum; that is, information is physical. This statement implies that information must have an instantiation (be it ink on paper, bits in a computer’s memory, or even the neurons in a brain). Furthermore, it also implies that information must be about something. Information is about entropy. What is entropy like in the quantum world? Quantum gravity or anti-gravity?

Then there is the problem of meaning.

At one end, complexity has a funky relationship with information and descriptions. Kolmogorov complexity (algorithmic information) is a measure of the resources needed to specify an object; it is also used for fractals. A random string takes the most information to specify. Kolmogorov's complexity K(x) of a bitstring x is the length of the shortest program that computes x and halts. See Jürgen Schmidhuber's homepage. From Jan Poland:
Schmidhuber proposed a new concept of generalized algorithmic complexity. It allows for the description of both finite and infinite sequences. The resulting distributions are true probabilities rather than semimeasures. He proves a strong coding theorem (without logarithmic correction terms), which was left as an open problem. To this purpose, a more natural definition of generalized complexity is introduced.
That sounds very much like relativity.

Another measure with descriptive power for biological systems is mutual information, which is linked to Shannon information (requires entanglement?) and used among others by TGD. It is claimed to have been used to characterize RNA secondary structure; in any case, mutual information can be used to define neural complexity. This complexity is lowest for regular or random systems of elements, but highest for networks with order on all scales, such as brains (relationships, wells).
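Mutual information can be estimated directly from samples via the identity I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal plug-in sketch (the example data and names are illustrative):

```python
import math
from collections import Counter

def entropy(samples):
    """Empirical Shannon entropy in bits of a sample list."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

xs = [0, 0, 1, 1] * 25
unrelated = [0, 1] * 50        # varies independently of xs
copied = list(xs)              # fully determined by xs

print(mutual_information(xs, unrelated))  # 0.0 bits: nothing shared
print(mutual_information(xs, copied))     # 1.0 bit: all of H(X) is shared
```

The same estimator underlies neural-complexity measures: one sums mutual information between subsets of a network, which is why both fully regular and fully random systems score low while multi-scale order scores high.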

The correspondence between algorithmic constructions and physics is interesting. Computer scientist Scott Aaronson has a lot to say here. The impression is that he claims that physical processes are algorithmic at their base (even if quantum algorithmic). He is a founder of the Complexity Zoo, which catalogs all classes of computational complexity. He has also thought about the limits of computability, together with quantum computation. Not everything can be computed.

Aaronson: There were two occasions when researchers stepped back, identified some property of almost all the techniques that had been tried up to that point, then proved that no technique with that property could possibly work. These "meta-discoveries" constitute an important part of what we understand about the P versus NP problem. The first meta-discovery was relativization: any solution to the P versus NP problem will require non-relativizing techniques, techniques that exploit properties of computation that are specific to the real world. The second meta-discovery was natural proofs. A new technique called arithmetization was introduced to circumvent the problems. The idea was that, instead of treating a Boolean formula φ as just a black box mapping inputs to outputs, one can take advantage of the structure of φ by "promoting" its AND, OR, or NOT gates to arithmetic operations over some larger field F. We now have circuit lower bounds that evade both barriers, by cleverly combining arithmetization (which is non-relativizing) with diagonalization (which is non-naturalizing). Is there a third barrier, beyond relativization and natural proofs, to which even the most recent results are subject? There is: a third barrier to solving P versus NP and the other central problems of complexity theory, algebrization (short for "algebraic relativization"). Notice that algebrization is defined differently for inclusions and separations, and that in both cases only one complexity class gets the algebraic oracle Ã, while the other gets the Boolean version A. These subtle asymmetries are essential for this new notion to capture what we want. Arithmetization is one of the most powerful ideas in complexity theory, but it simply fails to "open the black box wide enough".
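Arithmetization as described above can be sketched concretely: NOT, AND and OR are promoted to the polynomials 1-x, xy and x+y-xy over a larger field, so the resulting function agrees with the Boolean formula on {0,1} but is also defined on other field elements. A toy sketch (the formula and the field F_101 are illustrative choices):

```python
# Arithmetization: Boolean gates promoted to polynomials over a field F
P = 101  # illustrative small prime playing the role of "the larger field F"

def NOT(x):    return (1 - x) % P
def AND(x, y): return (x * y) % P
def OR(x, y):  return (x + y - x * y) % P

def phi(x, y, z):
    """Arithmetized version of the Boolean formula (x OR y) AND (NOT z)."""
    return AND(OR(x, y), NOT(z))

# On {0,1} inputs it reproduces the Boolean truth table...
table = [phi(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
print(table)         # [0, 0, 1, 0, 1, 0, 1, 0]

# ...but, unlike a black box, it is also defined on non-Boolean field elements:
print(phi(5, 7, 3))  # 46
```

This extension beyond {0,1} is exactly what makes arithmetization non-relativizing: the proof technique uses the polynomial structure of φ, not just its input-output behaviour.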
The major focus in studying genetic diversity in nature should be on population functional ecological genomics across life coupled with proteomics, metabolomics, and phenomics. A major future perspective should try to analyze the effects of stresses not only through individual genes but through genomic-biochemical networks related to individual and collective environmental stresses (solar radiation, temperature, global warming, drought, mineral and photosynthetic deprivations, biotic stresses, etc). Epigenomics should be coupled with genomics, including DNA methylation, histone modification, and the regulatory effects of small RNA, and noncoding repeat elements, including transposon and retrotransposon dynamics in both adaptation and speciation to evaluate genome-wide adaptive divergence. Comparisons should be made between local, regional, and global "genetic laboratories" and the study of genomic diversity and divergence across life. The next-generation technologies will open dramatic new horizons for the study of genetic diversity in nature, unraveling its nature and dynamics at micro- and macroscales. These future studies will highlight theoretical mysteries of evolutionary biology as to the genetic diversity in nature and the evolutionary driving forces shaping its fate in adaptation and speciation.

S. Aaronson and A. Wigderson. 2008: Algebrization: A New Barrier in Complexity Theory [PS] [PDF], ACM Transactions on Computing Theory 1(1), 2009. Conference version [PS] [PDF] in Proceedings of ACM STOC'2008, pp. 731-740.

Adami C, Ofria C, Collier TC 2000: "Evolution of biological complexity". Proc. Natl. Acad. Sci. U.S.A. 97 (9): 4463–8. doi:10.1073/pnas.97.9.4463. PMC 18257. PMID 10781045

Adami C 2002: "What is complexity?". Bioessays 24 (12): 1085–94. doi:10.1002/bies.10192. PMID 12447974

Darwin, Charles 1859: On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life (Full image view 1st ed.), London: John Murray, pp. 502, retrieved 2011-03-01

- The Complete Works of Charles Darwin Online

Davnah Urbach & Jason Moore, 2011: Data mining and the evolution of biological complexity. BioData Mining 2011, 4:7. doi:10.1186/1756-0381-4-7

Richard Dawkins' The Selfish Gene.

- Complexity Explained: 13. Evolution of Biological Complexity

Gerlich, S., Eibenberger, S., Tomandl, M., Nimmrichter, S., Hornberger, K., Fagan, P., Tüxen, J., Mayor, M., & Arndt, M. 2011: Quantum interference of large organic molecules. Nature Communications, 2 DOI: 10.1038/ncomms1263

Hahn, M.W. and Wray, G.A. 2002: The G-value paradox. Evol. Dev. 4:73-75.

Stephen Hawking and Leonard Mlodinow 2010: The Grand Design. Bantam Press

Eviatar Nevo & Avigdor Beiles, 2010: Genetic variation in Nature. Scholarpedia. Modified and approved on 10 November 2010. Extensive.

Mary Midgley, 2011: The selfish metaphor: Conceits of evolution. New Scientist.

Mattick, J.S. 2004: The hidden genetic program of complex organisms. Sci Am. 291:60-67.

Denis Noble 2006: The Music of Life, ISBN 0-19-929573-5

Matti Pitkänen, 2010: Macroscopic Quantum Coherence and Quantum Metabolism as Different Sides of the Same Coin.
- 2011: General theory of qualia, Homeopathy in many-sheeted space-time, and Negentropy Maximization Principle.

Jan Poland 2004: A Coding Theorem for Enumerable Output Machines. IDSIA-05-04.

John Hagelin, Ph.D., on Consciousness

John Hagelin, Ph.D., on Consciousness & Superstring Unified Field Theory, How is knowledge lost, and The Observer. His homepage. Video library. On YouTube: Discovery of The Unified Field.

He is an American particle physicist, three-time candidate of the Natural Law Party for President of the United States (1992, 1996, and 2000), and the director of the Transcendental Meditation movement for the US.
In 1970, while at Taft, he was involved in a motorcycle crash that led to hospitalization and a full body cast. During this time, one of his teachers introduced him to quantum mechanics, and he also learned the Transcendental Meditation technique, both of which had major impacts on his life.

His work on the "flipped SU(5), heterotic superstring theory" is considered one of the more successful unified field theories or "theories of everything" and was highlighted in a cover story in Discover magazine.

"Weak symmetry breaking by radiative corrections in broken supergravity" 1983
"Supersymmetric relics from the big bang" 1984

In 1987 and 1989, Hagelin published two papers in the Maharishi University of Management's Journal of Modern Science and Vedic Science on the relationship between physics and consciousness. These papers discuss the Vedic understanding of consciousness as a field and compare it with theories of the unified field derived by modern physics. Hagelin argues that these two fields have almost identical properties and quantitative structure, and he presents other theoretical and empirical arguments that the two fields are actually one and the same; specifically, that the experience of unity at the basis of the mind achieved during the meditative state is the subjective experience of the very same fundamental unity of existence revealed by unified field theories.

As evidence for this explanation, Hagelin points to the body of research supporting the positive effects created by Transcendental Meditation and the more advanced TM-Sidhi program (which includes a practice called "Yogic Flying") which are said to have measurable effects on social trend parameters. This phenomenon is called the "Maharishi Effect". Hagelin cites numerous studies of such effects.

- it seems the first 2 seconds of video 2 with the words 'and effort' got cut out somehow... here is the full sentence that he is speaking at the end of video 1 and into video 2. "(video 1) try and reproduce that experience, you'll never succeed because trying involves effort (video 2) and effort keeps the awareness active and the comprehension from expanding."

Interesting guy. Compare to TGD, a living Universe :) E8 is maybe not so far from TGD, especially if E8 is 'flipped'.