Wednesday, 20 April 2011

Complexity and evolution.

To evaluate the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. An information-theoretic definition identifies genomic complexity with the amount of information a sequence stores about its environment [= epigenetics, introns, histones included?]. The simulated evolution of genomic complexity in populations of digital organisms has been investigated, and the evolutionary transitions that increase complexity are monitored in detail. Because natural selection forces genomes to behave as a natural "Maxwell Demon", within a fixed environment genomic complexity is forced to increase (Adami et al., Evolution of biological complexity). The demon "shows that the Second Law of Thermodynamics has only a statistical certainty": the hypothetical demon opens the door to allow only the "hot" molecules of gas to flow through to a favored side of the chamber, causing that side to gradually heat up while the other side cools down.
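As a concrete illustration of the Adami-style definition, the sketch below (my own toy code, not the Avida software used in the paper) estimates "physical complexity" as sequence length minus the per-site entropy measured across an adapted population: sites fixed by selection contribute nearly a full unit of information, freely drifting sites contribute almost none.

```python
# A toy estimate of Adami-style "physical complexity" (not the actual Avida code):
# complexity is roughly sequence length minus the summed per-site entropy of an
# adapted population, with entropies in log-base-4 units ("mers") for a DNA alphabet.
import math
from collections import Counter

ALPHABET = "ACGT"  # assumed 4-letter alphabet

def per_site_entropy(column):
    """Shannon entropy of one aligned site, in log-base-|ALPHABET| units."""
    counts = Counter(column)
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total, len(ALPHABET)) for c in counts.values())

def physical_complexity(population):
    """Approximate C = L - sum_i H_i for an aligned population of equal-length sequences."""
    length = len(population[0])
    return length - sum(per_site_entropy([seq[i] for seq in population]) for i in range(length))

# Toy usage: the first three sites are "fixed by selection", the fourth drifts freely,
# so the population stores roughly 3 mers of information about its environment.
pop = ["ACGA", "ACGT", "ACGC", "ACGG", "ACGT", "ACGA"]
print(physical_complexity(pop))  # about 3.0
```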

This means that Life always fights against (dissipative structures?) natural selection, and that there is also another, bigger force (basic energy state = coherence?, thermodynamics?) creating evolution? The environment is always part of evolution, and the individual is only a small part. In fact, the individual IS made of environment (Nature and Nurture with Hebbian plasticity, IQ, personality, cloning), only with a dissipative time-lag there too. A dissipative structure is characterized by the spontaneous appearance of symmetry breaking (anisotropy) and the formation of complex, sometimes chaotic, structures where interacting particles exhibit long-range correlations. Evolution happens also without living structures, but they may work as enzymes and boost evolution through their dissipative structures. Thus complexity is forced to grow? Topology and fractality tie the structures together? See Harris and Pinker.




The concept of junk DNA (introns) led to the recognition that genome size was not a reliable indicator of the number of genes. That, plus the growing collection of genome size data, soon called into question simplistic diagrams like the one shown here from an article by John Mattick in Scientific American. (There are many things wrong with the diagram. See What's wrong with this figure? at Genomicron). From Sandwalk.

Genetic diversity in nature, i.e., molecular genetic hereditary differences within and between populations and species, is the basis of evolutionary change, said Darwin (1859). What is 'genetic diversity'?
Nevertheless, despite its cardinal role in evolution and domestication, the origin, nature, dynamics, function, significance, and maintenance of genetic diversity in nature remain controversial. Likewise, questions have emerged concerning which evolutionary processes influence natural genetic variation for phenotypic, quantitative-trait (QTL) variation, or what the genetic basis of ecologically important morphological variation, such as the diverse color patterns of mammals, might be. Controversies included allozyme and DNA markers, and currently include the basic units of genetic diversity, i.e., single nucleotide polymorphisms (SNPs) and copy number variation (CNV), which are both abundant in nature and largely enigmatic as to their functional significance. The basic pending problems are how much of molecular diversity in nature, at both the coding and noncoding genomic regions, is adaptive, and how many of the mutations occurring in natural populations are nonrandom (adaptive, directed) rather than random, neutral, nearly neutral, non-adaptive or deleterious. Scholarpedia
Newtonian view.
Charles Darwin's ideas about evolution were first announced in public - to a meeting of the Linnean Society in London. Oddly, almost nobody noticed! It might not have helped that the joint paper (with Alfred Russel Wallace) was given the ripping title 'On the Tendency of Species to Form Varieties; and on the Perpetuation of Varieties and Species by Natural Means of Selection'.

To see evolution only as natural selection is wrong and simplistic? This was realized already by Wallace, in his concern about social communities. Alfred Russel Wallace thought that humans shared a common ancestor with apes, but that higher mental faculties could not have evolved through a purely material process. He was much concerned about domestic animals' loss of consciousness. Darwin also had this aspect in his theory, rejecting the crude, direct application of natural selection to social policies, but it has widely been ignored and transformed. In The Descent of Man he insisted that humans are a deeply social species whose values cannot be derived from selfish calculation. Yet, as a man of his age, he still shared Spencer's obsessive belief in the creative force of competition. He ruled that natural selection was indeed the main cause of evolutionary changes. Sexual reproduction alone, and care for the offspring, argue against natural selection. See the 6 Darwin lectures. Darwin was sure, however, that natural selection could not be the sole cause. Ironically, today Darwinism is much associated with social science, often with great success. What does that tell us about our communities? A degenerated human race?
The Darwinian vision of adaptive evolution may include adaptive nonrandom mutations as Darwin himself emphasized when discussing directed variation in addition to natural selection (chapter IV of the 6th edition of The Origin of Species): "there can also be little doubt that the tendency to vary in the same manner has often been so strong that all the individuals of the same species have been similarly modified without the aid of any form of selection". As Templeton (2009) emphasized, "we are just beginning to embrace again Darwin's broad vision that adaptive evolution involves more than natural selection or random variation... the 21st century will undoubtedly witness a more comprehensive synthesis based on Darwin's enduring legacy". Scholarpedia.
This brings to mind the morphogenic fields of habit, à la Sheldrake. A sort of memory field. Very difficult to prove. Evolution is also said to happen in steps, sometimes very fast, sometimes very slow. The memory field would enhance the learning capacity (of animals), as some kind of 'knowledge for free'. There is a 'critical mass' needed to bring about a change.

Lamarckism, transmutationism and orthogenesis were all non-Darwinian theories.
Lamarck had a model of a 'ladder of complexity': evolution struggles towards increased complexity. He used alchemy as a scientific method, and because of that he has been ridiculed, but alchemy has certain traits in common with quantum physics, and it deserves a thorough examination. After all, we don't yet know exactly even what an atom is. A 'ladder of evolution' has also been used earlier.

In fact, Lamarck's view is more exact? Genotype and phenotype do not correlate. And both depend on heredity. The important one is the phenotype?

Assumption: All living organisms alive today have descended from a common ancestor (or ancestral gene pool). This is the same kind of statement as a theory of everything. Inherent in the statement is that development or change happens. Is it random or not? Biological systems are entropically open - and the thermodynamic force is directed?

Canalization as an evolutionary force in biological systems? It is a useful exercise to think about where biological complexity comes from as a way to facilitate the selection of data mining methods.
A common challenge in identifying meaningful patterns in high-dimensional biological data is the complexity of the relationship between genotype and phenotype. Complexity arises as a result of many environmental, genetic, genomic, metabolic and proteomic factors interacting in a nonlinear manner through time and space to influence variability in biological traits and processes. The assumptions about this complexity greatly influence the analytical methods we choose for data mining and, in turn, our results and inferences. For example, linear discriminant analysis assumes a linear additive relationship among the variables or attributes, while support vector machines or neural networks can model nonlinear relationships.
Tools for non-local, non-linear, quantal effects are very few. Maybe vectors (bosonic forces)? This is no longer a Newtonian, reductionistic view, but a quantum-physical one.
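A minimal sketch of the point about linear versus nonlinear methods, assuming scikit-learn and NumPy are available (the dataset and parameters are mine, purely illustrative): a linear-additive classifier such as LDA cannot capture a purely interactive, XOR-like genotype-phenotype relationship, while a nonlinear kernel method can.

```python
# Illustrative only: two synthetic binary "loci" whose trait depends solely on their
# interaction (XOR), so neither locus has a marginal (additive) effect.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(400, 2)).astype(float)
y = np.logical_xor(X[:, 0] > 0.5, X[:, 1] > 0.5).astype(int)
X += rng.normal(scale=0.1, size=X.shape)  # a little measurement noise

# The linear-additive model is stuck at chance level; the RBF kernel separates the classes.
print("LDA accuracy    :", cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean())  # ~0.5
print("RBF-SVM accuracy:", cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())             # ~1.0
```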
This observation that complex organisms can be produced from simpler ones has led to the common misperception of evolution being progressive and having a direction that leads towards what are viewed as "higher organisms". Nowadays, this idea of "progression" in evolution is regarded as misleading, with natural selection having no intrinsic direction and organisms selected for either increased or decreased complexity in response to local environmental conditions. Although there has been an increase in the maximum level of complexity over the history of life, there has always been a large majority of small and simple organisms and the most common level of complexity (the mode) has remained constant. (The mode of a discrete probability distribution is the value x at which its probability mass function takes its maximum value. In other words, it is the value that is most likely to be sampled.) Wikipedia
You see, "in response to environment" is the important factor; that is, the environment is determining, not the individuals. A poorer environment automatically gets poorer individuals. But does this modal value tell us anything essential about evolution? Or about diversity? Only about silent assumptions?
The growth of complexity may be driven by the co-evolution between an organism and the ecosystem of predators, prey and parasites to which it tries to stay adapted: as any of these become more complex in order to cope better with the diversity of threats offered by the ecosystem formed by the others, the others too will have to adapt by becoming more complex, thus triggering an on-going evolutionary arms race towards more complexity. This trend may be reinforced by the fact that ecosystems themselves tend to become more complex over time, as species diversity increases, together with the linkages or dependencies between species.

If evolution possessed an active trend toward complexity, then we would expect to see an increase over time in the most common value (the mode) of complexity among organisms, as shown to the right. Indeed, some computer models have suggested that the generation of complex organisms is an inescapable feature of evolution. This is sometimes referred to as evolutionary self-organization. Self-organization is the spontaneous internal organization of a system. This process is accompanied by an increase in systemic complexity, resulting in an emergent property that is distinctly different from any of the constituent parts.

However, the idea of increasing production of complexity in evolution can also be explained through a passive process. This involves an increase in variance but the mode does not change. Thus, the maximum level of complexity increases over time, but only as an indirect product. Wikipedia
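A toy simulation of this "passive" trend (my own illustration, not from the Wikipedia article): each lineage performs an unbiased random walk in complexity with a hard lower bound. No drive toward complexity is built in, yet the maximum increases over time while the mode stays pinned near the minimum.

```python
# Passive trend: unbiased random walk with a reflecting lower boundary on complexity.
import random
from collections import Counter

MIN_COMPLEXITY = 1
lineages = [MIN_COMPLEXITY] * 1000

def step(c):
    c += random.choice([-1, +1])      # no intrinsic bias toward complexity
    return max(c, MIN_COMPLEXITY)     # reflecting boundary: nothing simpler than this exists

for generation in range(200):
    lineages = [step(c) for c in lineages]

counts = Counter(lineages)
print("mode:", counts.most_common(1)[0][0])  # typically stays near the minimum
print("max :", max(lineages))                # keeps drifting upward with time
```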
Does it really matter for the result whether the process is direct or indirect? After all, the phenotype is about epigenetic/environmental regulation of the genotype. Non-coding RNA is mostly about introns. See the list of RNAs, histone code. Horizontal gene transfer ignores the individual genome, at least for bacteria and unicellular organisms (which also act as superorganisms). In fact, evolution has always been linked to changes in environment. No change, no need for evolution. Evolution is about neutralization of stress, thermodynamics, to find the basic, lowest energy level in equilibrium; homeostasis? Selection overrules gene flow. I would NOT trust Wikipedia. Evolution also has traits of quantum mechanics that overrule the individual, such as horizontal gene transfer, synchronized behaviours (EEG waves) etc. Biological selection can also occur at multiple levels simultaneously, like some 'sum over histories' in quantum tunneling, or to find/compute the path, or in synchrony? This is called superorganisms. See also the Human Microbiome Project, collective intelligence.

Evolution is also the successive alterations that led from the earliest proto-organism to more complex organisms. These simple organisms should then also grow in diversity (sum of paths)? It has been shown many times that micro-organisms behave (as phenotypes) as superorganisms.

Current tree of life showing horizontal gene transfers. The prevalence and importance of HGT in the evolution of multicellular eukaryotes remain unclear. Wikipedia.

The quantum physical non-linear entanglement is the first level of coherence, or the basic homeostasis?

Scaling and thermodynamics.
TGD in Scaling Law and Evolution:
Cultural evolution could perhaps be seen as evolution of memes with memetic code playing the role of genetic code. There are good reasons to believe that the intronic portion of DNA codes for memes represented dynamically as field patterns associated with massless extremals.
(MEs = bosons, candidates for correlates of dark photons at various levels of dark matter hierarchy = the quantum aspect = first level of coherence). See also Macroscopic Quantum Coherence and Quantum Metabolism as Different Sides of the Same Coin.
Thermodynamics is important for TGD, but there is also a condition under which the second law doesn't hold. The Negentropy Maximization Principle states that "The sequence of quantum jumps is expected to lead to a 4-D counterpart of thermodynamical ensemble and thermodynamics results when one labels the states by the quantum numbers assignable to their positive energy part. Entropy is assigned with entire 4-D CD rather than to its 3-dimensional time=constant snapshots." The hierarchy of CDs forces one to introduce a fractal version of the second law, taking into account the p-adic length scale hypothesis and the dark matter hierarchy. The second law should always be applied only at a given level of the p-adic and dark matter hierarchy, and one must always take into account the two time scales involved, one subjective (the CD) and one objective (= the system); the second law holds only if the system is the bigger one.
Maxwell Demon is at work?
This means that the second law could be broken in the geometric time scale (= the system) considered, but not for personal CDs. In the case of subjective time the negentropic situation could continue forever, unless the CD disappears in some quantum jump :) The huge value of the gravitational Planck constant associated with space-time sheets mediating the gravitational interaction, making possible a perturbative quantum treatment of gravitational interaction, would indeed suggest the breaking of the second law in cosmological time scales. Modification of thermodynamics to take into account negentropic entanglement.
This is combined with broken Lorentz symmetry (?) and gravitational waves?

"Quantum interference of large organic molecules" pretty well says it all. A group in Austria, headed by Markus Arndt, with Stefan Gerlich as the lead author, has demonstrated interference effects using large organic molecules. When they pass a beam of these molecules through an interferometer, they behave exactly like waves, producing an interference pattern analogous to the bright and dark spots you get by sending light into a diffraction grating. The ultimate goal of this is to try to push the quantum-classical boundary as far as possible. Macroscopic bodies show no signs of wave-like behavior. Microscopic objects like electrons, atoms, and now molecules, on the other hand, clearly show wave behavior.

BBC News has a piece about the continued work of the ILL to make neutrons show quantum effects in the gravity of Earth.

Astrobiology, UCLA, stepwise evolution:
Progress in the evolution of life on Earth is measured by an increase in biological diversity (biodiversity), by an increase in morphological disparity, and by progress towards increasing complexity. These are presumably general principles which apply to any living system, and so studies of the processes that result in these advances should have universal application. Of the three metrics for evolutionary progress, complexity is the most difficult to define, measure, and understand. In this section, biologists and paleontologists attempt to grapple with this difficult issue by focusing on two outstanding biological events: the origin of eukaryotes and the radiation of animals during the "Cambrian Explosion".
See also Earth's Early Environment and Life, and Professor Rainey's picture of life's evolution from the perspective of major evolutionary transitions, including that from solitary organisms to societies.

Hawking and Mlodinow, The Grand Design, Phys.World, 2010: A second premise that the reader is expected to accept as The Grand Design moves along is that we can, and should, apply quantum physics to the macroscopic world. To support this premise, Hawking and Mlodinow cite Feynman's probabilistic interpretation of quantum mechanics, which is based on his "sum over histories" of particles. Basic to this interpretation is the idea that a particle can take every possible path connecting two points. Extrapolating hugely, the authors then apply Feynman's formulation of quantum mechanics to the whole universe: they announce that the universe does not have a single history, but every possible history, each one with its own probability.

Note also the periods of great extinction of animals. We are in such a period now. A clear quantum state?
Rates of change are increasingly fast and nonlinear with sudden phase shifts to novel alternative community states. We can only guess at the kinds of organisms that will benefit from this mayhem that is radically altering the selective seascape far beyond the consequences of fishing or warming alone. The prospects are especially bleak for animals and plants compared with metabolically flexible microbes and algae.
First the individual or population level, then the quantum aspect of evolution.


The individual and the population.
Bits of information collected in qubits or selves?
Reductionism.
Qubits and information are classical concepts. We don't yet know what their quantal counterpart is, other than that it is non-local. It can be random, but more probably it is entangled (one 'bit' of 'holism').

In order for evolution to work, there must be some 'bits of information' entangled into a (topological and fractal?) Self. Otherwise evolution has no tool to work with. The self has to be saved and protected with boundaries: membranes, shells, competition, war and selfishness. Self and Ego and Personality express such a tool in different degrees of stability. Richard Dawkins' book The Selfish Gene

This stability is the secondary or tertiary coherence or entanglement, seen as synchronized patterns, discrete information? If homeostasis is the first, lowest, most basic energy level in temporal equilibrium, then there are secondary, tertiary etc. energy levels in temporal equilibrium too - called allostasis? Also seen in protein folding? What is natural selection if not a striving for energetic equilibrium? Also malfunctions, such as disease, can be such a tool; built-in stress, or created stress, in order to lower the total energy need? Structure is a result of these energy levels of maximal function. To find these levels is then the most basic task for determining evolution. What can structure tell us about the energy levels? They are revealed through maximal function, or use of energy.

Is competitive evolution the only possible context for explaining the biological facts? The trouble with metaphors is that they don't just mirror scientific beliefs, they also shape them: a metaphor expresses, advertises and strengthens our preferred interpretations and carries unconscious bias. Competition cannot be the ultimate human reality, because of the social aspects. Does evolution strive towards socialism?
Unresolved conflicts between communal and separatist views of human nature in Western history are the reason for their enormous influence on thought still today. The world picture shifted from a feudal, communal pattern towards the more individualistic, pluralistic model we officially follow today. Ideals of personal allegiance, heroic warfare and the divine right of kings began to yield to civilian visions based on democracy, technology and commerce. The cooperative, communal idea is poorly supported today. The individualistic, atomistic (Newtonian), classical-physical and reductionistic view has largely been seen as the only scientific way. The Universe is seen as a giant clock, and biology as a machine with molecular motors. The selfish metaphor: Conceits of evolution.
Evolution as a metaphor.
History: Evolution has been the most glaring example of the thoughtless use of metaphor over the past 30 years, with the selfish/war metaphors dominating and defining the landscape so completely that it becomes hard to admit there are other ways of conceiving it. Descartes went beyond faith (the Bible) in attempting to rationally explain the world and our place in it. It was revolutionary, and it allowed a scientific view, but only if the soul was left to the church. A soul-less and mindless science was born. Today the soul is still almost completely ignored.

Very different kinds of metaphors and images have been used. Rousseau, Voltaire and Locke devised a kind of social atomism (Social atomism and the old world order - in the eighteenth century it was the basis for two violent revolutions and a good deal of social unrest.). One's value would no longer be determined by one's place in the social order, which is to say by heredity, nor was it to be determined by divine right. It was, henceforth, to be a matter of one's intrinsic worth as a human being and a citizen working together in voluntary association with others for the common good. American society inherited and encoded in its founding documents the principles of equality and individualism, transforming them into a new tradition with its own hierarchy of values and preferences. Philosophical theories can and often have played an influential role in social change, as cultural evolution. By critically examining the genealogy of the dominant philosophical concepts of the self and its relation to social and cultural forms, we can come to a clearer understanding of what shapes contemporary thinking about our own self-identities and our role in the struggle for a just and ethical society.

Thomas Hobbes's claim (with the analytic tools of the geometrician) that the natural state of humans was "a war of all against all" (put forward in a bid to stop people supporting misguided governments) accidentally launched a wider revolt against the notion of citizenship. "All human beings have a natural right to self-preservation, and adding an additional premise regarding human psychology, that all human beings are motivated by a kind of enlightened self-interest, Hobbes concluded that the only rational way to live reasonably peaceful and productive lives would be for individuals, who are equally capable of inflicting harm on one another, to agree to refrain from interfering in one another's lives. But upholding this agreement would entail a compromise, that is, an exchange of one's total independence and a relinquishing of one's rights for a greater measure of security and dependence on another-the duly appointed sovereign-who was granted absolute authority over society provided he fulfilled his duty to keep the peace." The slogan made it possible to argue later that there is no such thing as society, that we owe one another nothing. (Moral atomism was a follower. "Ethical atomism combined with Hobbes' and Locke's social atomism supplies some of the most important and characteristic features of American political theory, and the imprint of these ideas is evident and fixed in the Constitution.") This thought also inspired campaigns for admirable things like easier divorce and universal suffrage, and it is still strong today, even though physicists themselves no longer see their particles as radically disconnected. Economics argued that free competition always serves the general good; its champions could thus believe they were being scientific while still acting as good citizens. And its emphasis on conflict reassured them they were still heroes, that bourgeois life had not corrupted their machismo. So atomistic thinking, originally drawn from physics, acquired a social meaning in economics and was then returned to science as ideas of competition began to dominate 19th-century biology. The resulting jumble of atomistic ontology, market economics and warlike noises came together fully in the theories of 19th-century "social Darwinists" like Herbert Spencer.

In How the Leopard Changed Its Spots: The Evolution of Complexity (1997), biologist and complexity theorist Brian Goodwin suggested the kind of correction needed, remarking mildly that humans are "every bit as co-operative as we are competitive; as altruistic as we are selfish... These are not romantic yearnings and Utopian ideals, they arise from a rethinking of our nature that is emerging from the sciences of complexity". He represents the structuralist approach, which resonates with D'Arcy Thompson's idea that evolutionary variation is constrained by structural laws; not all forms are possible. These ideas are now connected with new principles of dynamic emergence from complex systems, as developed within the sciences of complexity. Goodwin is strongly opposed to the reductionist view of the ultra-Darwinians, and much more comfortable with the complexity ideas of Stuart Kauffman and with Francisco Varela's holistic approach to biology. A dialog here. Even Dawkins pointed out in The Selfish Gene that genes are actually cooperative rather than egoistic.

D'Arcy Thompson pointed this out in On Growth And Form in 1917, noting the striking natural tendencies which contribute to evolution - the subtle, natural patterns such as Fibonacci spirals that shape all manner of organic forms, and the logic underlying patterns such as the arrangement of a creature's limbs. Biologists such as Brian Goodwin, Steven Rose and Simon Conway Morris have developed his work, showing how natural selection is supplemented by a kind of self-organisation within each species, which has its own logic. The selfish metaphor: Conceits of evolution.

Competition and discreteness require entanglement and systems.
Entities complex enough to compete cannot exist at all without much internal cooperation. (There cannot even be ordinary matter without entanglement (Vedral). Entanglement is the degree of correlation between observables pertaining to different modes that exceeds any correlation allowed by the laws of classical physics. Systems can be entangled in their external degrees of freedom (positions and momenta), or internal degrees of freedom, as resonances, B-E condensates, superconductivity... The type of entanglement also gives the implementation of various gates, interpreted as hierarchies, cut-offs or boundaries. Lasers and B-E condensates are macroscopic effects of boson statistics, and require highly specialized environments (as in membranes, nerves; see Heimburg). Carbon, protons and photons can be composite bosons, and electrons as composite fermions give the essential bricks of life.) To create cells, organelles must combine; to create armies, soldiers must organise.

But there is also entanglement outside space or time, the EPR paradox, demonstrated many times, for instance by Tiller.

What is this entanglement, new organisation, self-assembly, if not information? What are the roots of the information? What is the construction, or logic?

The work of evolution can be seen as intelligent and constructive, not as a gamble driven randomly by the forces of competition. And if non-competitive imagery is needed, Denis Noble found that there was not a single oscillator which controlled heartbeat; rather, this was an emergent property of the feedback loops in the various channels. In The Music of Life (2006) he points out how natural development, not being a car, needs no single "driver" to direct it. Symphonies are not caused only by a single dominant instrument nor, indeed, solely by their composer. And developing organisms do not even need a composer: they grow, as wholes, out of vast and ancient systems which are themselves parts of nature. He has also earlier written The Logic of Life and The Ethics of Life. He is one of the pioneers of Systems Biology and now works in Computational Physiology. He played a major role in launching the Physiome Project, an international project to use computer simulations to create the quantitative physiological models necessary to interpret the genome. He is critical of the ideas of genetic determinism and genetic reductionism. Gene ontology will fail without higher-level insight. The theory of biological relativity: there is no privileged level of causality, and the Self is no object. There are no programs in the brain nor in any other place; a genuine 'theory of biology' does not yet exist. See Genes and causation, How Central Is the Genome?, All systems go, The Music of Life - A view on nature and nurture, Molecules to Organisms: What directs the music of life, and his homepage.
He contrasts Dawkins's famous statement in The Selfish Gene ("Now they [genes] swarm ... safe inside gigantic lumbering robots ... they created us, body and mind; and their preservation is the ultimate rationale for our existence") with his own view: "Now they [genes] are trapped in huge colonies, locked inside highly intelligent beings, moulded by the outside world, communicating with it by complex processes, through which, blindly, as if by magic, function emerges. They are in you and me; we are the system that allows their code to be read; and their preservation is totally dependent on the joy we experience in reproducing ourselves. We are the ultimate rationale for their existence". He further suggests that there is no empirical difference between these statements, and claims that they differ in "metaphor" and "sociological or polemical viewpoint" from The Music of Life.
The old metaphors of evolution need to give way to new ones founded on integrative thinking - reasoning based on systems thinking.

Matti says too that biology is a Big Anomaly :)

The Well. What is complexity?
Darwinian evolution is a simple yet powerful process that requires a population of reproducing organisms in which each offspring has the potential for a heritable variation. This creates the p-q variational/oscillational well (with dark matter?). A population is not an individual, and potentials are not realizations. Both follow a distribution curve, be it Gaussian or something else. What does the mode tell us? What could be a better measurement? Variability, diversity? That is, an increase in potentials, or degrees of freedom?

Defining complexity as "that which increases when self-organizing systems organize themselves" amounts to an increase in "qubits", entangled bits of freedom. Complex complexity? Of course, in order to address this issue, complexity needs to be both defined and measurable.

Structural and functional complexity by examining genomic complexity.
1. structural complexity is mirrored in functional complexity and vice versa?
- ambiguous definition of complexity
- the obvious difficulty of matching genes with function
2. defined in a consistent information-theoretic manner (the “physical” complexity), which appears to encompass intuitive notions of complexity used in the analysis of genomic structure and organization.
3. evolution can be observed in an artificial medium: universal aspects of the evolutionary process appear in a computational world where the symbolic sequences subject to evolution are computer programs that have the ability to self-replicate. See also experiments with RNAs.
4. superpositions and entanglements create situations of "to know more than everything" in informational energies, creating a "negative negentropy" of enlarged or enlightened consciousness/information, the ahaa-experience.
5. Noise and inhibition are very important constraints, creating synchrony, or secondary equilibrium stable levels.

In populations of such sequences that adapt to their world (inside of a computer’s memory), noisy self-replication coupled with finite resources and an information-rich environment leads to a growth in sequence length as the digital organisms incorporate more and more information about their environment into their genome [= epigenetic factors grow?]. These populations allow us to observe the growth of physical complexity explicitly, and also to distinguish distinct evolutionary pressures acting on the genome and analyze them in a mathematical framework. Evolution of biological complexity
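A heavily simplified toy model in that spirit (nothing like the real Avida system of Adami et al.; every name and parameter here is invented for illustration): genomes are bit strings, the "environment" is a fixed target string, fitness counts matched bits, and replication is noisy with occasional insertions. Under selection the population tends to accumulate more and more bits of "environmental information".

```python
# Toy digital organisms: bit-string genomes adapting to a fixed "environment" string.
# Noisy replication includes point mutations and rare insertions (no deletion operator
# here, so genome length can only grow or stay the same).
import random

random.seed(1)
ENVIRONMENT = [random.randint(0, 1) for _ in range(64)]   # an information-rich environment
POP_SIZE = 200                                            # finite resources: fixed population size

def fitness(genome):
    """Bits of the environment the genome 'knows about' (position-wise matches)."""
    return sum(1 for g, e in zip(genome, ENVIRONMENT) if g == e)

def replicate(genome):
    child = [1 - b if random.random() < 0.01 else b for b in genome]         # point mutations
    if random.random() < 0.05:                                               # rare insertion
        child.insert(random.randrange(len(child) + 1), random.randint(0, 1))
    return child

population = [[random.randint(0, 1) for _ in range(8)] for _ in range(POP_SIZE)]
for generation in range(300):
    parents = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]  # truncation selection
    population = [replicate(random.choice(parents)) for _ in range(POP_SIZE)]

print("mean genome length:", sum(len(g) for g in population) / POP_SIZE)      # grows beyond 8
print("mean matched bits :", sum(fitness(g) for g in population) / POP_SIZE)  # rises under selection
```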

Here we must remember that genomes are 2-D constructions, RNAs and proteins are 3-D, and functions are 4-D (time also included). Time is seen only in function, in sequences, as a result of energy differences and polarizations (bosons). Time is the 'dance' metaphor.

We can measure energy differences as entropy, but living systems are open. Can entropy be measured in structure, as complexity or decoherence? That is, as the degree of stress versus relaxation?

C-value paradox in genomes.
Back in the olden days, when everyone was sure that humans were at the top of the complexity tree, the lack of correlation between genome size and complexity was called the C-value paradox where "C" stands for the haploid genome size.
The C-value paradox takes its name from our inability to account for the content of the genome in terms of known function. One puzzling feature is the existence of huge variations in C values between species whose apparent complexity does not vary correspondingly. An extraordinary range of C values is found in amphibians, where the smallest genomes are just below 10^9 bp while the largest are almost 10^11 bp. It is hard to believe that this could reflect a 100-fold variation in the number of genes needed to specify different amphibians. Benjamin Lewin, Genes II (1983)
There are so many examples of very similar species that have huge differences in the size of their genome. Onions are another example—they are the reason why Ryan Gregory made up the Onion Test. The onion test is a simple reality check for anyone who thinks they have come up with a universal function for non-coding DNA. Whatever your proposed function, ask yourself this question: Can I explain why an onion needs about five times more non-coding DNA for this function than a human?

If an organism's complexity is a reflection of the physical complexity of its genome (as assumed here), the latter is of prime importance in evolutionary theory. Physical complexity, roughly speaking, reflects the number of base pairs in a sequence that are functional. As is well known, equating genomic complexity with genome length in base pairs gives rise to a conundrum (known as the C-value paradox), because large variations in genomic complexity (in particular in eukaryotes) seem to bear little relation to the differences in organismic complexity. The C-value paradox is partly resolved by recognizing that not all of DNA is functional (non-coding DNA); there is a neutral fraction that can vary from species to species. Human DNA is made mostly of ancient viruses, for instance, as an immunological response, just as non-coding RNAs are a primitive immune system. Humans, Arabidopsis, and nematodes all have about the same number of genes.

The C-value enigma, unlike the older C-value paradox, is explicitly defined as a series of independent but equally important component questions, including:

  • What types of non-coding DNA are found in different eukaryotic genomes, and in what proportions?
  • From where does this non-coding DNA come, and how is it spread and/or lost from genomes over time?
  • What effects, or perhaps even functions, does this non-coding DNA have for chromosomes, nuclei, cells, and organisms?
  • Why do some species exhibit remarkably streamlined chromosomes, while others possess massive amounts of non-coding DNA?
Now we have a G-value paradox, where "G" is the number of genes. The only way out of this box—without abandoning your assumption about humans being the most complex animals—is to make up some stories about the function of so-called junk DNA. Maybe humans really aren't all that much more complex, in terms of number of genes, than wall cress. Maybe they should have the same number of genes. Maybe the other differences in genome size really are due to variable amounts of non-functional junk DNA.

It is likely that a significant increase in this non-neutral fraction could be observed throughout at least the early course of evolution. For the later period, in particular the later Phanerozoic Era, it is unlikely that the growth in complexity of genomes is due solely to innovations in which genes with novel functions arise de novo. Indeed, most of the enzyme activity classes in mammals, for example, are already present in prokaryotes. Rather, gene duplication events leading to repetitive DNA and subsequent diversification as well as the evolution of gene regulation patterns appears to be a more likely scenario for this stage. Still, we believe that the Maxwell Demon mechanism described below is at work during all phases of evolution and provides the driving force toward ever increasing complexity in the natural world. Evolution of biological complexity

A modification of the second law?

The notion of information.
The difficulty with defining complexity is that there is no single measure that can capture all structural characteristics. The different meanings of complexity and their measures restrict their eventual usefulness, which so far seems mainly descriptive. Maybe decoherence is a better term?

Perhaps a key aspect of information theory is that information cannot exist in a vacuum, that is, information is physical. This statement implies that information must have an instantiation (be it ink on paper, bits in a computer's memory, or even the neurons in a brain). Furthermore, it also implies that information must be about something. Information is about entropy. What is entropy like in the quantum world? Quantum gravity or anti-gravity?

Then there is the problem of meaning.

At one end, complexity has a funky relationship with information and descriptions. Kolmogorov complexity (algorithmic information) is a measure of the resources needed to specify an object. It is also used for fractals. A random string takes the most information to specify. Kolmogorov's complexity K(x) of a bitstring x is the length of the shortest program that computes x and halts. Jürgen Schmidhuber's homepage. Poland:
Schmidhuber proposed a new concept of generalized algorithmic complexity. It allows for the description of both finite and infinite sequences. The resulting distributions are true probabilities rather than semimeasures. Poland proves a strong coding theorem (without logarithmic correction terms), which had been left as an open problem; to this purpose, a more natural definition of generalized complexity is introduced.
That sounds very much like relativity.
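Since K(x) is uncomputable, a common practical stand-in is the length of a compressed encoding, which upper-bounds it up to an additive constant. A minimal sketch using Python's zlib (my own example, not from the cited papers):

```python
# Compression length as a crude upper-bound proxy for Kolmogorov complexity K(x):
# a random string compresses poorly (high estimate), a regular one very well (low estimate).
import os
import zlib

def complexity_proxy(data: bytes) -> int:
    """Length of the zlib-compressed encoding of the data."""
    return len(zlib.compress(data, 9))

random_string = os.urandom(10_000)   # incompressible with overwhelming probability
regular_string = b"AB" * 5_000       # same length, highly regular

print(complexity_proxy(random_string))   # close to 10,000 bytes
print(complexity_proxy(regular_string))  # a few dozen bytes
```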

Another measure with descriptive power for biological systems is mutual information, which is linked to Shannon information (requires entanglement?) and is used, among others, by TGD. It is claimed to have been used to characterize RNA secondary structure, but in any case mutual information can be used to define neural complexity. This complexity is lowest for regular or random systems of elements, and highest for networks with order on all scales, such as brains (relationships, wells).
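A minimal sketch of the mutual-information measure itself, assuming NumPy (the data are synthetic, for illustration only): I(X;Y) = H(X) + H(Y) - H(X,Y) is near zero for independent variables and grows as one variable carries information about the other.

```python
# Mutual information between two discrete variables, estimated from a joint histogram.
import numpy as np

def entropy(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mutual_information(x, y):
    joint, _, _ = np.histogram2d(x, y, bins=(np.unique(x).size, np.unique(y).size))
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint.ravel())

rng = np.random.default_rng(0)
x = rng.integers(0, 4, 10_000)
print(mutual_information(x, rng.integers(0, 4, 10_000)))            # ~0 bits: independent
print(mutual_information(x, (x + rng.integers(0, 2, 10_000)) % 4))  # ~1 bit: partially coupled
```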

The correspondence between algorithmic constructions and physics is interesting. Computer scientist Scott Aaronson has a lot to say here. The impression is that he claims that physical processes are algorithmic at their base (even if that is quantum-algorithmic). He is a founder of the Complexity Zoo, which catalogs all classes of computational complexity. He has also thought about the limits of computability, together with quantum computation. Not everything can be computed.

Aaronson: There were two occasions when researchers stepped back, identified some property of almost all the techniques that had been tried up to that point, then proved that no technique with that property could possibly work. These "meta-discoveries" constitute an important part of what we understand about the P versus NP problem. The first meta-discovery was relativization: any solution to the P versus NP problem will require non-relativizing techniques, i.e. techniques that exploit properties of computation that are specific to the real world. The second meta-discovery was natural proofs. A new technique called arithmetization was introduced to circumvent these problems. The idea was that, instead of treating a Boolean formula φ as just a black box mapping inputs to outputs, one can take advantage of the structure of φ by "promoting" its AND, OR, or NOT gates to arithmetic operations over some larger field F. We now have circuit lower bounds that evade both barriers, by cleverly combining arithmetization (which is non-relativizing) with diagonalization (which is non-naturalizing). Is there a third barrier, beyond relativization and natural proofs, to which even the most recent results are subject? There is: algebrization (short for "algebraic relativization") is a third barrier to solving P versus NP and the other central problems of complexity theory. Notice that algebrization is defined differently for inclusions and separations, and that in both cases only one complexity class gets the algebraic oracle Ã, while the other gets the Boolean version A. These subtle asymmetries are essential for this new notion to capture what we want. Arithmetization is one of the most powerful ideas in complexity theory, but it simply fails to "open the black box wide enough".
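A small illustration of arithmetization (my own toy, not taken from Aaronson's paper): the Boolean gates of a formula are promoted to polynomial operations over a field, so that the arithmetic extension agrees with the formula on 0/1 inputs but can also be evaluated at arbitrary field elements.

```python
# Arithmetization toy: Boolean gates become polynomials over a small prime field F_p.
P = 101  # a small prime, chosen arbitrarily for the example

def NOT(x):    return (1 - x) % P
def AND(x, y): return (x * y) % P
def OR(x, y):  return (x + y - x * y) % P

def formula(x, y, z):
    # arithmetized version of (x OR y) AND (NOT z)
    return AND(OR(x, y), NOT(z))

# On Boolean inputs it reproduces the truth table exactly...
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            assert formula(x, y, z) == int((x or y) and not z)

# ...but it is also defined on non-Boolean field points, which is what
# interactive-proof arguments built on arithmetization exploit.
print(formula(5, 17, 42))
```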
The major focus in studying genetic diversity in nature should be on population functional ecological genomics across life coupled with proteomics, metabolomics, and phenomics. A major future perspective should try to analyze the effects of stresses not only through individual genes but through genomic-biochemical networks related to individual and collective environmental stresses (solar radiation, temperature, global warming, drought, mineral and photosynthetic deprivations, biotic stresses, etc). Epigenomics should be coupled with genomics, including DNA methylation, histone modification, and the regulatory effects of small RNA, and noncoding repeat elements, including transposon and retrotransposon dynamics in both adaptation and speciation to evaluate genome-wide adaptive divergence. Comparisons should be made between local, regional, and global "genetic laboratories" and the study of genomic diversity and divergence across life. The next-generation technologies will open dramatic new horizons for the study of genetic diversity in nature, unraveling its nature and dynamics at micro- and macroscales. These future studies will highlight theoretical mysteries of evolutionary biology as to the genetic diversity in nature and the evolutionary driving forces shaping its fate in adaptation and speciation.


References.
S. Aaronson and A. Wigderson. 2008: Algebrization: A New Barrier in Complexity Theory [PS] [PDF], ACM Transactions on Computing Theory 1(1), 2009. Conference version [PS] [PDF] in Proceedings of ACM STOC'2008, pp. 731-740.

Adami C, Ofria C, Collier TC 2000: "Evolution of biological complexity". Proc. Natl. Acad. Sci. U.S.A. 97 (9): 4463–8. doi:10.1073/pnas.97.9.4463. PMC 18257. PMID 10781045

Adami C 2002: "What is complexity?". Bioessays 24 (12): 1085–94. doi:10.1002/bies.10192. PMID 12447974


Darwin, Charles 1859: On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life (Full image view 1st ed.), London: John Murray, pp. 502, retrieved 2011-03-01


- The Complete Works of Charles Darwin Online

Davnah Urbach & Jason Moore, 2011: Data mining and the evolution of biological complexity. BioData Mining 2011, 4:7. doi:10.1186/1756-0381-4-7

Richard Dawkins' The Selfish Gene.

- Complexity Explained: 13. Evolution of Biological Complexity

Gerlich, S., Eibenberger, S., Tomandl, M., Nimmrichter, S., Hornberger, K., Fagan, P., Tüxen, J., Mayor, M., & Arndt, M. 2011: Quantum interference of large organic molecules. Nature Communications, 2 DOI: 10.1038/ncomms1263


Hahn, M.W. and Wray, G.A. 2002: The G-value paradox. Evol. Dev. 4:73-75.

Stephen Hawking and Leonard Mlodinow 2010: The Grand Design. Bantam Press

Eviatar Nevo & Avigdor Beiles, 2010: Genetic variation in Nature. Scholarpedia. Modified and approved on 10 November 2010. Extensive.

Mary Midgley, 2011: The selfish metaphor: Conceits of evolution. New Scientist.

Mattick, J.S. 2004: The hidden genetic program of complex organisms. Sci Am. 291:60-67.

Denis Noble 2006: The Music of Life, ISBN 0-19-929573-5

Matti Pitkänen, 2010: Macroscopic Quantum Coherence and Quantum Metabolism as Different Sides of the Same Coin.
- 2011: General theory of qualia, Homeopathy in many-sheeted space-time, and Negentropy Maximization Principle.

Jan Poland 2004: A Coding Theorem for Enumerable Output Machines. IDSIA-05-04. http://www.idsia.ch/~jan

http://scholar.google.fi/scholar?cites=17473276152719269686&as_sdt=5&sciodt=0&hl=sv

5 comments:

  1. Scientists find way to map brain's complexity

    Posted 2011/04/11 http://www.newsdaily.com/stories/tre7392ku-us-brain-model/

    Quote: "We are beginning to untangle the complexity of the brain," said Tom Mrsic-Flogel, who led the study.

    "Once we understand the function and connectivity of nerve cells spanning different layers of the brain, we can begin to develop a computer simulation of how this remarkable organ works."

    But he said it would take many years of work among scientists and huge computer processing power before that could be done.

    In a report of his research, Mrsic-Flogel explained how mapping the brain's connections is no small feat: There are an estimated one hundred billion nerve cells, or neurons, in the brain, each connected to thousands of other nerve cells, he said, making an estimated 150 trillion synapses.

    "How do we figure out how the brain's neural circuitry works? We first need to understand the function of each neuron and find out to which other brain cells it connects," he said.

    In this study, Mrsic-Flogel's team focused on vision and looked into the visual cortex of the mouse brain, which contains thousands of neurons and millions of different connections.

    Using high resolution imaging, they were able to detect which of these neurons responded to a particular stimulus.

    Taking a slice of the same tissue, the scientists then applied small currents to subsets of neurons to see which other neurons responded and which of them were synaptically connected.

    Using this method, the team hopes to begin generating a wiring diagram of a brain area with a particular function, such as the visual cortex. The technique should also help them map the wiring of regions that underpin touch, hearing and movement.

  2. Complex Systems in Biomedicine, ISBN 978-88-470-0394-1

    Quarteroni, A.; Formaggia, L.; Veneziani, A. (Eds.)

    1st ed. Softcover of orig. ed. 2006, 2006, XIV, 292 p. 88 illus.

    Mathematical modeling of human physiopathology is a tremendously ambitious task. It encompasses the modeling of most diverse compartments such as the cardiovascular, respiratory, skeletal and nervous systems, as well as the mechanical and biochemical interaction between blood flow and arterial walls, or electrocardiac processes and the electric conduction into biological tissues. Mathematical models can be set up to simulate both vasculogenesis (the aggregation and organisation of endothelial cells dispersed in a given environment) and angiogenesis (the formation of new vessels sprouting from an existing vessel) that are relevant to the formation of vascular networks, and in particular to the description of tumor growth.

    The integration of models aimed at simulating the cooperation and interrelation of different systems is an even more difficult task. It calls for the set up of, for instance, interaction models for the integrated cardio-vascular system and the interplay between central circulation and peripheral compartments, models for the mid-long range cardiovascular adjustments to pathological conditions (e.g. to account for surgical interventions, congenital malformations, or tumor growth), models for the integration among circulation, tissue perfusion, biochemical and thermal regulation, models for parameter identification and sensitivity analysis to parameter changes or data uncertainty – and many others.

    The heart is a complex system in itself, where electrical phenomena are functionally related with the wall deformation. In its turn, electrical activity is related with heart physiology. It involves nonlinear reaction-diffusion processes and provides the activation stimulus to the heart dynamics and eventually the blood ventricular flow that drives the haemodynamics of the whole circulatory system. In fact, the influence is reciprocal, since the circulatory system in turn affects the heart dynamics and may induce an overload depending upon the individual physiopathologies (for instance the presence of a stenotic artery or a vascular prosthesis).

    Virtually all the fields of mathematics have a role to play in this context. Geometry and approximation theory provide the tools for handling clinical data acquired by tomography or magnetic resonance, identifying meaningful geometrical patterns and producing three-dimensional geometrical models stemming from the original patient data. Mathematical analysis, flow and solid dynamics, and stochastic analysis are used to set up the differential models and predict uncertainty. Numerical analysis and high performance computing are needed to numerically solve the complex differential models. Finally, methods from stochastic and statistical analysis are exploited for the modeling and interpretation of space-time patterns. Indeed, the complexity of the problems at hand often stimulates the use of innovative mathematical techniques that are able, for instance, to accurately catch those processes that occur at multiple scales in time and space (like cellular and systemic effects), and that are governed by heterogeneous physical laws.

    In this book we have collected the contributions from several Italian research groups that are successfully working in this fascinating and challenging field. Every chapter deals with a specific subfield, with the aim of providing an overview of the subject and an account of the most recent research results.

  3. http://www.wired.com/wiredscience/2011/04/fish-brain-diversity/

    Better brains make one fish, two fish, into lots and lots of fish.
    After upgrading their ability to communicate using electrical signals, a group of African fish exploded into dozens of species. This may be the first study to show a link between central brain evolution and increasing species diversity, researchers report in the April 29 Science.

    “The brain structure triggered an explosion of signals and an explosion of species as a result,” says Carl Hopkins,

  4. http://matpitka.blogspot.com/2011/07/quantum-model-for-remote-replication-of.html

    Summary

    The basic assumptions of the model of remote replication deserve a short summary.

    Bio-molecules would serve as receiving and sending quantum antennas, forming populations with communications between members just like higher organisms. The molecules participating in the same reaction would naturally have the same antenna frequencies. Quarks and antiquarks at the ends of the flux tubes would code for different nucleotides, and the frequencies associated with the nucleotides would be identical. The character of the classical electromagnetic field would code for a particular nucleotide.

    Remote replication and other remote polymerization processes would differ from the ordinary one only in that the phase transition reducing the value of Planck constant for the flux tube would not take place and bring the molecules near each other.

    The immediate product of remote replication would be the conjugate of the original DNA sequence, and DNA polymerase would amplify it to a copy of the original DNA sequence. This prediction could be tested by using very simple DNA sequences - say, sequences consisting of two nucleotides which are not conjugates. For instance, one could check what happens if conjugate nucleotides are absent from the target (neither the conjugate nor the original DNA sequence should be produced). If the target contains conjugate nucleotides but no originals, only conjugate DNA sequences would be produced - one might hope in sufficiently large amounts to be detectable.

    Frequency coding would be natural for quantum antenna interactions between ordinary DNA and its dark variant and also between dark variants of DNA, RNA, tRNA, and amino-acids. The reason is that dark nucleons represent the genetic code by entanglement and it is not possible to reduce the codon to a sequence of letters.

  5. http://www.sciencedaily.com/releases/2011/07/110706113450.htm

    "Chromosome Size in Diploid Eukaryotic Species Centers on the Average Length with a Conserved Boundary," was recently published in the journal Molecular Biology and Evolution 28:1901-11. It details a project that compared 886 chromosomes in 68 random species of eukaryotes - organisms whose cells contain a nucleus and are enclosed by cellular membranes. The researchers found that the chromosome sizes within each eukaryotic species are actually similar rather than drastically different as previously believed. They also found that the chromosomes of these different organisms share a similar distribution pattern.

    http://images.sciencedaily.com/2011/07/110706113450.jpg

    The team found that nearly every genome perfectly formed an S-curve of ascending chromosomal lengths when placed on a standardized X-Y axis. That meant the genome from a species of rice expressed the same pattern as the genome from a species of maize, sorghum, fruit fly, dog, chimpanzee, etc. "It eliminated a scale effect and made it possible to compare a species with several dozen chromosomes to a species with much fewer chromosomes," Birds are unique because in addition to their usual chromosome sequences, they contain one additional set of minichromosome sequences,

    DOI: 10.1093/molbev/msr011

    Mitochondria also contain additional maternal DNA. And the Nurture effect can also be hereditary.

    The genome isn't structure itself. Only its products are. One gene can produce many structures.

    Dark information? Action at a distance?
