More Recent Comments

Saturday, October 01, 2016

Extending evolutionary theory? - James Shapiro

I will be attending the Royal Society Meeting on New trends in evolutionary biology: biological, philosophical and social science perspectives. I'll post each of the abstracts and ask for your help in deciding what question to pose to the speakers. Here's the abstract for James Shapiro's talk on Biological action in Read-Write genome evolution.

Many of the most important evolutionary variations that generated phenotypic adaptations and originated novel taxa resulted from complex cellular activities affecting genome content and expression. These activities included: (i) the symbiogenetic cell merger that produced the mitochondrion-bearing ancestor of all extant eukaryotes; (ii) symbiogenetic cell mergers that produced chloroplast-bearing ancestors of photosynthetic eukaryotes; and (iii) interspecific hybridisations and genome doublings that have generated adaptive radiations and new species of higher plants and animals. Adaptive variations have also arisen by horizontal DNA transfers (frequently involving infectious agents), by natural genetic engineering of coding sequence DNA in protein evolution (e.g. exon shuffling), and by mobile DNA rewiring of transcriptional regulatory networks, such as those essential to viviparous reproduction in mammals. In the most highly evolved multicellular organisms, we now know that biological complexity scales with ‘non-coding’ DNA content rather than with protein-coding capacity in the genome. Coincidentally, we have come to recognise that ‘non-coding’ RNAs rich in repetitive mobile DNA sequences function as key regulators of complex adaptive phenotypes, such as stem cell pluripotency. The intersections of cell activities and Read-Write genome modifications provide a rich molecular and biological foundation for understanding how ecological disruptions can stimulate productive, often abrupt, evolutionary transformations.
I have dozens of questions for Jim Shapiro but here are two possibilities.
Most of the events you describe are one-off events in the history of life. They are mostly accidents. They were unpredictable. How does the occurrence of unique events such as endosymbiosis or genome doubling fit into evolutionary theory, as opposed to being just historical facts in the history of life?

OR

Michael Lynch and others say that the amount of junk DNA in a genome correlates with the population size of the species. This view is perfectly consistent with modern population genetics. There is plenty of evidence that 90% of our genome is junk. You seem to be implying that this extra DNA is not junk but serves some adaptive purpose. What evidence do you have that supports this claim and why do you disagree with Michael Lynch?

131 comments:

Graham Jones said...

"How does the occurrence of unique events such as endosymbiosis or genome doubling fit into evolutionary theory"

Genome doubling is not unique. It happens all the time in plants. Autopolyploids fit into evolutionary theory easily. Allopolyploids require phylogenetic trees to be replaced with networks, or multi-labelled trees.

I like your second question much more.

Larry Moran said...

What do you mean when you say it happens "all the time?"

Is that once every year, every century, every million years? Can you come up with a theory (hypothesis) that predicts genome doubling, or a theory that accounts for genome doubling in the history of extant lineages?

I'm guessing you can't. We can observe, or deduce, when genome doubling has occurred in the past but those are one-off events. They are not mechanisms of evolution. These events fall into the same category as asteroid impacts, and nobody thinks asteroid impacts should be incorporated into an extended evolutionary theory.

Mikkel Rumraket Rasmussen said...

Shapiro posits an additional source of mutation/genome alteration called "natural genetic engineering" that gives organisms an additional method of responding and adapting to environmental change besides just random mutations filtered by natural selection. He seems to suggest these alterations are in some sense "intelligent", that they cause adaptive change over and above what a random sampling process like the stochastic accumulation of mutations would produce.

How does he propose this works, in the sense of, how can such a mechanism possess any "knowledge" about what genetic alteration will be adaptive and, much more importantly, what kind of evidence is there that such a mechanism even exists?

Ask him that.

Graham Jones said...

A new plant species emerges roughly once a century due to polyploidization. There are hypotheses around which attempt to explain why ferns do it more than angiosperms, and angiosperms more than animals, etc. There are attempts to work out whether polyploidization is 'good' for a species, that is, does it lead to higher or lower subsequent diversification rates.

John Harshman said...

I'm at a loss to understand your point here, Larry. Point mutations are one-off events too, right? Is it your claim that some events are too rare to be significant in evolution? How rare, then, do they have to be? Autopolyploidy and allopolyploidy seem to be fairly common over evolutionary time. It's been estimated that the latter is implicated in around 5% of speciations in plants, which seems significant to me. And of course the events themselves must be considerably more common than the ones that end up leaving evidence in extant genomes, just as mutations are vastly more common than fixations.

Anonymous said...

"In the most highly evolved multicellular organisms, we now know that biological complexity scales with ‘non-coding’ DNA content rather than with protein-coding capacity in the genome." -- Is this true? And what are the most highly evolved multicelluar organisms? What are the most highly evolved plants?

John Harshman said...

The most highly evolved multicellular organisms are salamanders and ferns, of course. Isn't it obvious?

Joe Felsenstein said...

In animals, lungfish are even more highly evolved than salamanders.

Unknown said...

Here is one example of what is being looked at-

http://www.pagepress.org/journals/index.php/eb/article/view/eb.2011.e3/6022

From the Conclusion-
"... In this review, we summarize some of the early work on directed mutation as well as recent evidence suggesting that mutations are directed to the glpFK control region, thereby allowing growth of E. coli crp mutants on glycerol. These mutations are specifically induced by the presence of glycerol under starvation conditions. The glp regulon-specific protein that regulates mutation frequency was identified, and the control mechanism was in large measure elucidated..."

Faizal Ali said...

If I understand that paper correctly, it suggests that a protein has been identified which can increase the mutation rate under stress conditions. It does not "direct" the type of mutations that occur, however. They also do not suggest that this protein arose through anything other than standard processes as understood by current evolutionary theory. So, continuing the ongoing theme of this series of posts, how does this "extend" evolutionary theory?

Anonymous said...

The panicoid grasses are amazingly highly evolved plants, with their modified spikelets and diverse methods of photosynthesis. One could make a good argument for the grasses of the Andropogoneae, though.

John Harshman said...

You mention irrelevant characters. The only thing that decides evolutionary highness is the size of the non-coding genome.

Anonymous said...

Yes, of course! So among the most highly evolved grasses are the higher polyploid individuals of Poa pratensis (Kentucky Bluegrass), e.g. those with 2n = 147. They seem pretty much the same in morphology and function as their relatives with 2n = 4x = 28. These high polyploids have a LOT more genome than they need, obviously.

Yes, 147 is an odd number, but that they can cope with that is all the more evidence of their highly-evolvedness. (Not to mention apomictic seed set.)

Eric said...

I have read some of Shapiro's papers, and I keep coming away with the same question. How did you test the null hypothesis? From what I read, what he calls "genetic engineering" is random mutations. They are changes in DNA sequence that are not guided by fitness. Exon shuffling, horizontal genetic transfer, and the rest can result in beneficial, neutral, or deleterious changes. There is nothing inherent in these mechanisms that allows the organism to change a specific base in response to a specific environmental challenge. Instead, these random changes (random with respect to fitness) are always ticking away in the background, and the beneficial ones are amplified through natural selection.

The trick that both Shapiro and Barbara Wright use is to ignore all of the neutral and deleterious mutations that these mechanisms create in hopes that no one will recognize them as random mutations.

John Harshman said...

Well, at least we can laugh at fugu. Stupid fugu, ha ha.

Arlin said...

John, the issue here is that Shapiro is strong on token cases but weak on general causes. A general-cause claim is that smoking causes cancer, and a token-cause claim is that Bob's tumor was caused by smoking. Shapiro is very good at pointing to changes and saying that there was a specific mutation underlying that. Almost everyone would agree that, underlying a token event of evolutionary change, there is a token event of mutation. But where is the theory of general causes? Where are the predictions?

ElShamah777 said...

Mikkel wrote:

"How does he propose this works, in the sense of, how can such a mechanism possess any "knowledge" about what genetic alteration will be adaptive and, much more importantly, what kind of evidence is there that such a mechanism even exists? "

Shapiro writes:

"Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA. This cell-active view of genome change applies to all scales of DNA sequence variation, from point mutations to large-scale genome rearrangements and whole genome duplications (WGDs). This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences."

Remarkably, he is not the only one.

Evidence of non-random mutation rates suggests an evolutionary risk management strategy

http://www.nature.com/nature/journal/v485/n7396/full/nature10995.html

"Upon comparing 34 Escherichia coli genomes, we observe that the neutral mutation rate varies by more than an order of magnitude across 2,659 genes, with mutational hot and cold spots spanning several kilobases. Importantly, the variation is not random: we detect a lower rate in highly expressed genes and in those undergoing stronger purifying selection. Our observations suggest that the mutation rate has been evolutionarily optimized to reduce the risk of deleterious mutations. Current knowledge of factors influencing the mutation rate—including transcription-coupled repair and context-dependent mutagenesis—do not explain these observations, indicating that additional mechanisms must be involved. The findings have important implications for our understanding of evolution and the control of mutations."

Cool. Didn't know that evolution has the remarkable ability of an "evolutionary risk management strategy".

Risk management is something that I usually attribute to highly intelligent bankers, stock portfolio managers, etc... LOL

But that's not the first time that we see proponents of evolution attributing intelligence to a mechanism that should not have this ability, as with Professor Richard Watson, for example, who said:

"new research shows that evolution is ABLE TO LEARN from previous experience, which could provide a better explanation of how evolution by natural selection produces such apparently intelligent designs."

haha.

John Harshman said...

Predictions of future evolution are notoriously difficult, and I don't see how they make the difference between good and bad science. Evolutionary biology certainly has a lot of individual events that can't easily be predicted. How is polyploidization different from point mutations in this respect, other than that the rate of the latter is higher than the rate of the former? Perhaps we can't come up with a general theory of polyploidization. But it's certainly been important in the history of life, and it certainly fits the definition of evolution, unlike asteroid impacts.

I'm not sure this particular argument has anything to do with Shapiro or the hypothetical extended synthesis.

judmarc said...

Shapiro had co-authors on some of his earlier "directed evolution" papers, such as Barry Hall, but the co-authors later backed off the idea. From what I remember, the early papers were full of excitement about simple mutations that indeed increased in microorganisms under stress, as is well known. However, the kind of work done by Lenski and others showed this stopped well short of anything that could be truly called "directed." So perhaps among the questions might be what Shapiro can point to as evidence that mutation is not only speeded up in stressed organisms but given directionality, and what he thinks Lenski's work says about this.

Larry Moran said...

@John Harshman

Arlin made the point better than I did but let me try again.

What we're discussing is evolutionary theory. If you are promoting a revision of modern evolutionary theory then you have to come up with a generalizable principle that explains a number of observations. In this case, you have the observation that, in a small percentage of lineages, polyploidization has occurred.

In order to convert this to a general hypothesis (theory) you need to explain why polyploidization is almost nonexistent in bacteria and protozoa and why it is much more common in plants than in fungi or insects. You also need to come up with ideas about why polyploidization might have been selected in plants in order to promote evolution and why genome doubling in an individual becomes fixed in the population.

In the absence of such explanations, polyploidization is just another example of a mutation, and specific mutations, such as the one causing sickle cell disease, are not part of evolutionary THEORY.

The Lorax said...

I'm no fan of Shapiro's thoughts on evolution, but I disagree with the emphasis here on polyploidization. There are numerous examples of aneuploidies and/or chromosomal level changes that separate lineages. Gene duplications, or better yet large regions of duplication, provide a mechanism for speciation, which is an important aspect of evolutionary theory in my opinion.

Eric said...

This is part of Shapiro's problem. When we say that mutations are random we don't mean that the distribution of mutations in the genome is random or that the rate of mutation is the same through time. What we mean when we say that mutations are random is that they are random WITH RESPECT TO FITNESS. Like others in the EES movement, he tries to redefine what others are saying, and then attacks that strawman.

The mutations caused by cellular mechanisms are random with respect to fitness. They produce beneficial, neutral, and deleterious mutations. There is no meaningful connection between the mutations that these mechanisms create and the mutations that the organism needs. It is equivalent to a poor person buying more lottery tickets. It increases their chances of winning, but it is still a random drawing.
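
[To make Eric's lottery point concrete, here is a minimal Python sketch of what "random with respect to fitness" means. The fitness classes and their proportions are illustrative assumptions, not measured values; the only point is that raising the mutation rate under stress multiplies every class equally, leaving the beneficial:deleterious ratio untouched.]

```python
import random
from collections import Counter

# Illustrative fitness classes and proportions (assumptions, not data).
CLASSES = ["beneficial", "neutral", "deleterious"]
WEIGHTS = [0.001, 0.300, 0.699]

def mutations_by_class(n_cells, rate):
    """Each cell mutates with probability `rate`; the fitness class of each
    mutation is drawn independently of the rate."""
    n_mut = sum(random.random() < rate for _ in range(n_cells))
    return Counter(random.choices(CLASSES, weights=WEIGHTS, k=n_mut))

print("normal  :", mutations_by_class(1_000_000, 0.001))
print("stressed:", mutations_by_class(1_000_000, 0.010))  # 10x the rate
# Stress yields ~10x the mutations in every class: more lottery tickets,
# same random drawing.
```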

John Harshman said...

How does gene duplication provide a mechanism for speciation? Not seeing it. I can see how the differential loss of individual gene copies can contribute to speciation, but the copying itself?

It seems clear why polyploidy is absent in bacteria. A single circular chromosome and complete absence of meiosis should be explanation enough. I have no idea of the distribution of polyploidy in protists. And I would suggest a hypothesis that a capacity for selfing and vegetative reproduction makes polyploidy more likely to become established in a population. There may also be something about a sensitive dosage-dependence of some essential processes that would discourage it. This seems fully amenable to incorporation into evolutionary theory, though without any sort of revolution.

Unknown said...

From the link I provided before : http://www.pagepress.org/journals/index.php/eb/article/view/eb.2011.e3/6022

The researchers claim: “This insertional event appears to represent a genuine example of directed mutation… This is the first example of transposon-mediated directed mutation where the molecular explanation, involving a DNA-binding protein, has been provided.”

Are they wrong about that?
If they are wrong, then I would appreciate a rundown of the error(s) because it seems they have done a good job of making the case.

If they are correct and they have shown an example of directed mutation in action, then it seems to me any theory which states ’no directed mutation possible’ would have to be altered to the degree required to accommodate the new data.

Isn't that how it works?

Faizal Ali said...

@ Jack Johnson:

You are not understanding the different uses of the term "directed mutation". The competent scientists use the term to describe situations in which the frequency of mutations can be increased, but the specific form of the mutations remains random. Examples of this exist.

The crackpots are claiming that organisms have the ability to cause specific beneficial mutations to occur when the conditions in which these mutations will be beneficial arise. There is no evidence of this actually occurring.

Diogenes said...

If you have James Shapiro in your sights, Larry, it would be a waste of a great opportunity to argue over whether polyploidization is or isn't part of evolutionary theory.

Diogenes said...

What Lutesuite said. To put it another way. Suppose that a particular protein causes a higher mutation rate in one region of the genome compared to a different region. That is "directed" in the sense that some regions have higher RANDOM mutation rates than others... but it isn't news, we've known for a long time that mutation rates are lower e.g. in metabolic genes.

Suppose alternatively that RANDOM mutation rates increase when the cells are stressed. This is "directed" in the sense that a starving cell is in a tough environment and "wants" its offspring to mutate out... through RANDOM mutations. This also is not news. The work of BG Hall, Cairns etc.-- the mutational mechanism sometimes called "Cairnsian" and once called "Fred"-- has been known for decades to increase the rate of RANDOM mutations. But it's "directed" in the sense that the cell knows it's in bad straits and "wants" more RANDOM mutations in its offspring.

But no observed process is ever "directed" and creative, in the sense that

1. the cell knows ahead of time that some final mutation states are more fit than others-- e.g. the cell knows that mutating T to A is good, but G or C would be bad-- the cell never knows which final state is "better", and

2. the mutated final state is new, complex, and non-repetitive. I have to add the second condition because telomerase does create new DNA sequence, but it's a repetition of a short ancient sequence.

Diogenes said...

Bwilson295, if a plant just doubles or quadruples its genome, the ratio of coding to non-coding remains the same, yes? Assuming that nothing happens to snip out the coding bits. So your polyploid grass example appears to be irrelevant to the "non-coding ratio = complexity" claims of the Dog's Ass Plot.

Diogenes said...

ElShamah77, your quote is precisely the reason why Shapiro is dishonest, as he is claiming things have been "observed" since the 1930's which have in fact never been observed. I have read Shapiro's papers and his claims. Shapiro has NEVER cited a single observed example of his alleged "Natural Genetic Engineering." On the contrary, Shapiro always backs up his ridiculous claims by casting them as a deduction:

1. Cells have complex biochemical control mechanisms, which look like they "know what's good for them",
2. Cells mutate and evolve, therefore
3. Cell mutation and evolution are directed because the cell "knows" what kind of mutations are good for it.

This is total bullshit. It's a non sequitur. 1 and 2 are correct, but 3 does not logically follow. Just because cells have complex biochemical responses (e.g. if you put them in sugar, they start expressing proteins to digest sugar) this does not prove Shapiro's claim-- that cells create new, complex, non-repetitive genetic structures that the cell "knows" is good for it, before the mutation is filtered through natural selection. This bolded claim of Shapiro's is bullshit, he never cited any observed example of that.

Thus your Shapiro quote, "Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA. This cell-active view of genome change applies to all scales of DNA sequence variation, from point mutations to large-scale genome rearrangements and whole genome duplications (WGDs). This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences", is simply bullshit. It's Shapiro's hypothesis, not an observation.

Shapiro has every right to hypothesize whatever he wants. This "Natural Genetic Engineering" might be real, who knows? But he's a crackpot because he writes as if it's been observed, and because he has never suggested a way for his hypothesis to be falsified experimentally, even when I demanded repeatedly that he suggest an experiment to falsify it.

Some of you may remember that a couple years back I pointed out Shapiro's bullshit, and on his blog he attacked me personally. I repeatedly demanded that he suggest a means to falsify his hypothesis, and he never did, he just banned me from his blog. Then he denied banning me and his sycophants called me a liar.

Diogenes said...

ElShamah, do not respond to my criticism by citing more authorities' opinions (which you quote but don't understand). From now on, I demand you cite observations, experiments which allegedly show Shapiro's Natural Genetic Engineering being observed. Every time I demand this of Shapiro's creationist sycophants, they never cite an observation. They just quote authorities that I have no respect for. I demolished Shapiro in a fair debate before he banned me.

Diogenes said...

And ElShamah's second authority quote, while accurate, does not agree with Shapiro. It makes a different point that is NOT Shapiro's hypothetical Natural Genetic Engineering.

Your quote: "Upon comparing 34 Escherichia coli genomes, we observe that the neutral mutation rate varies by more than an order of magnitude across 2,659 genes, with mutational hot and cold spots spanning several kilobases.

This is not Shapiro's NGE, it is not news to us and not controversial.

As I stated above, to repeat myself: Suppose that a particular protein causes a higher mutation rate in one region of the genome compared to a different region. That is "directed" in the sense that some regions have higher RANDOM mutation rates than others... but it isn't news, we've known for a long time that mutation rates are lower e.g. in metabolic genes.

But no observed process is ever "directed" and creative, in the sense that

1. the cell knows ahead of time that some final mutation states are more fit than others-- e.g. the cell knows that mutating T to A is good, but G or C would be bad-- the cell never knows which final state is "better", and

2. the mutated final state is new, complex, and non-repetitive. I have to add the second condition because telomerase does create new DNA sequence, but it's a repetition of a short ancient sequence.

So Shapiro is full of it. There are no observed examples of NGE, and when I demand observations from Shapiro's sycophants, they never cite observations. They bullshit with some paper about sugar digestion-- duh, we know proteins are expressed in response to environmental changes, that's not a creative mutation-- or they cite authorities they don't understand.

Enough. I demand you describe experiments, not quote authorities.

Diogenes said...

Larry Moran, I hope you are still monitoring this thread, because I do have some suggested questions which I would love for you to direct to Prof. Shapiro. In my dreams. They're long and technical, you can edit.

Question One. Prof. Shapiro, you have frequently written about alleged mechanisms of "Natural Genetic Engineering" as if they've already been observed, and are not merely hypothetical. But to count as real "Genetic Engineering", a mutation or genetic change in a cell would have to satisfy three conditions:

1. The cell would have to somehow "know" ahead of time (before the mutation and before natural selection filters mutations) that some mutations are better than others, e.g. that T mutating to A will increase fitness but T to G or C will decrease fitness.

2. The mutation/genetic change must produce genuinely novel DNA, e.g. not just gene duplication, or transposon copying, which all create duplicates of sequences that already exist-- and which are far more likely to produce disease states than anything beneficial to the organism, thus showing only randomness, no foresight or your claimed "cell intelligence".

3. The genuinely novel DNA created must be complex and not simply repetitive, like the output of telomerase.

What one observation, or one mechanism, has all of properties 1, 2 and 3? If just one feature is missing, you can't honestly call it Natural Genetic Engineering.


(Larry, I can anticipate Shapiro's response. He will attempt to pass off NGE as an ineluctable deduction, which follows from his syllogism:

1. Cells have complex biochemical control mechanisms, which look like they "know what's good for them",
2. Cells mutate and evolve, therefore
3. Cell mutation and evolution are directed because the cell "knows" what kind of mutations are good for it.

But while we agree with 1 and 2, Shapiro's deduction 3 does not follow, and he is dishonest to pass off his deduction as if it's already been observed. It never was.)

Diogenes said...

Question Two. Prof. Shapiro, regarding the mechanisms which you claim are Natural Genetic Engineering. Certainly, if a cell is intelligent as you claim, and the cell "knows" what mutations are good for it before the mutation is filtered through natural selection, then that mechanism would produce the same results in a population under NS and in a mutation accumulation (MA) experiment, when NS is turned off. Have any of your proposed mechanisms of NGE ever been tested in an MA experiment to prove that the cell "knows" certain mutations are good for it before, or without, any filtration by Natural Selection? How do the results compare with NS on, vs. NS off?

Diogenes said...

Question Three. Prof. Shapiro, regarding the mechanisms which you claim are Natural Genetic Engineering, if a cell is "intelligent" as you claim, and the cell "knows" what mutations are good for it before the mutation is filtered through natural selection, then how do you explain the results of the many experiments showing mutation is random?

E.g. how do you explain the results of the experiment of Luria and Delbruck, or the Lederberg experiment (replica plating), which show that advantageous mutations occur and exist in small numbers before the environment changes, and do not occur after the environment changes, but rather, are amplified by NS after the environment changes?

If NGE were real, and cells were "intelligent" and know ahead of time what mutations are good for them, before filtration by NS, wouldn't well-studied novel mutations, like development of antibiotic resistance, be responses to changes in the environment? Because they're not, Prof. Shapiro.

And how do you explain the results of chromosomal crossover experiments, which show that the location of the crossover is apparently random within the chromosome, because the probability of genes being crossed over is proportional to the physical distance between them on the chromosome? Surely, if cells were "intelligent" and knew which mutations were good for them, chromosomes would cross over in places that are "good" for the cell and never in places "bad" for the cell, and the distribution would be non-random. Why aren't they?
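
[For readers who have not seen the Luria-Delbrück logic spelled out, here is a minimal simulation sketch; the growth and mutation parameters are assumptions for illustration, not the historical values. If mutations arise at random during growth, rare early "jackpot" mutants found large resistant clones and the culture-to-culture variance in resistant counts vastly exceeds the mean; if the selective agent induced the mutations at plating, the counts would be Poisson-like, with variance close to the mean.]

```python
import numpy as np

rng = np.random.default_rng(0)
MU = 1e-7       # mutation probability per cell division (assumed)
GENS = 20       # each culture grows from 1 cell to 2**20 cells
CULTURES = 200

def resistant_random():
    """Mutations arise at random DURING growth; an early mutant founds a
    large resistant clone, so counts vary wildly between cultures."""
    sensitive, resistant = 1, 0
    for _ in range(GENS):
        resistant *= 2                     # existing mutant clones expand
        new = rng.binomial(sensitive, MU)  # new mutants among the divisions
        sensitive = 2 * sensitive - new
        resistant += new
    return resistant

def resistant_induced():
    """If the selective agent itself induced resistance, each final cell
    would convert independently: Poisson-like counts, variance ~ mean."""
    return rng.binomial(2**GENS, 2e-6)     # induction probability (assumed)

for label, f in (("random", resistant_random), ("induced", resistant_induced)):
    counts = np.array([f() for _ in range(CULTURES)])
    print(f"{label:7s} mean={counts.mean():7.2f} variance={counts.var():10.2f}")
# Luria and Delbruck observed the first pattern: variance far above the mean.
```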

Diogenes said...

Question Four. Prof. Shapiro, we know from phylogenetic analysis and genomic analysis that one of the main drivers of novel evolutionary changes is gene duplication. If cells were "intelligent" as you claim and NGE were real, cells would know ahead of time, before filtration by NS, which gene duplications were good for them, would avoid gene duplications that are bad for them, and would know where a duplication should start and where it should end. However, before filtration by natural selection, gene duplication is more likely to cause a disease state than a gain-of-fitness mutation. So how can any hypothetical mechanism of NGE depend on any gene duplications in any way, when you're calling cells "intelligent" but gene duplication appears to be dumb as a box of rocks, and will more likely kill you?

Diogenes said...

And finally, Question Five. Prof. Shapiro, how the $%^& could we experimentally test or falsify your $%^&ing hypothesis?

It's possible, Larry, you may need to edit this last question.

John Harshman said...

Diogenes, you're right initially. But polyploids go through a process of "diploidization", in which a lot of the extra coding sequences turn into pseudogenes and/or just go away.

Eric said...

That's entirely correct. To use an analogy, Shapiro is claiming that John Smith must have ESP because John Smith won the lottery. What he ignores is all of the lottery tickets that weren't winners.

What Shapiro ignores is the null hypothesis, which is random mutations with respect to fitness. If these are directed mutations then they shouldn't produce neutral or deleterious changes to the genome, but he fails (or refuses) to test this null hypothesis.

I think it was mentioned earlier, but it is worth mentioning the genesis of the whole directed mutation idea. It started when people saw specific mutations occurring at higher rates than were expected from known mutation rates. It was worth hypothesizing that there could be mechanisms that guided mutations, and those hypotheses were tested. What they found was that the bacteria were increasing the random mutation rate in response to stress (i.e. DNA damage). It would be no different than certain people having an increased chance to win the lottery because they bought more tickets.

ElShamah777 said...

Diogenes

without error detection and error correction, there would be no life. These mechanisms had to be present from day one, when life began, otherwise the mutation rates would be too high, and cells could never replicate with sufficient accuracy, and they would die.

In the article published in Nature:

Genetic recombination and adaptation to fluctuating environments: selection for geotaxis in Drosophila melanogaster

we read:
Heritable variation in fitness is the fuel of adaptive evolution, and sex can generate new adaptive combinations of alleles. If the generation of beneficial combinations drives the evolution of recombination, then the level of recombination should result in changes in the response to selection.

So, genetic recombination was discovered in 1944 by Barbara McClintock, and she won the Nobel Prize for her work in 1983.

Shapiro based his findings on hers. So what happens is PROGRAMMED MUTATIONS, not random mutations.

We know that random mutations degenerate the genome.

Diogenes said...

Everything you just wrote is factually incorrect, and you didn't answer my question. I demanded an observed example of NGE, not NGE presented as some kind of deduction. You presented no examples of observed NGE. Your reference to recombination is not NGE, not remotely, and is not programmed as you absurdly claim. I already pointed out that crossover experiments show that prob of crossover is proportional to physical distance between genes. Meaning the location of crossover is random. It's not programmed, and your authority quote says nothing about programmed anything.

Shapiro's hypothesis (not "findings", he has no findings) is not based on McClintock's. He's just speculating, and he dishonestly passes off his deductions as observed processes.

You presented no examples of observation of NGE.

Your statement "random mutations degenerate the genome" is ridiculous. Modern research keeps adjusting up the rate of beneficial mutations. A common result is that about 3% of all mutations in E. coli are beneficial. It will depend on species, changes in environment, etc. Lenski's long term E. coli experiment showed that beneficial mutations slow down in a constant environment, but never stop. Genome degeneration never happens unless you turn off natural selection.

I asked for programmed mutations. You cited random mutations. It is ever thus.

Diogenes said...

OK.

The whole truth said...

ElShamah/Otangelo said:

"without error detection and error correction, there would be no life. These mechanisms had to be present from day one, when life began, otherwise the mutation rates would be too high, and cells could never replicate with sufficient accuracy, and they would die."

Huh? What? The allegedly omnipotent, omniscient, omnipresent, omnibenevolent, perfect sky daddy yahoo-yeshoo-holy-spook designed-created (from day one) all living things (including 'man') with errors that need error detection and error correction? Your sky daddy is apparently incompetent, and errors from day one can't be blamed on 'the fall'.

Oh, and living things do die. Obviously that error detection and error correction have their own 'errors'.

"So what happens, is PROGRAMMED MUTATIONS, not random mutations."

So all of the disabling, disease causing, disfiguring, and deadly mutations that have ever occurred "from day one" were/are programmed (designed-created) by yahoo-yeshoo-holy-spook (the imaginary, so-called ' Christian God' that you believe in and worship)?

"We know that random mutations degenerate the genome."

But didn't you say that mutations are "PROGRAMMED", "not random"? How can random mutations degenerate the genome when random mutations don't exist and have never occurred?

Unknown said...

Diogenes

In the peer reviewed paper:

Natural genetic engineering: intelligence & design in evolution?

David W Ussery writes:
Organisms seem 'designed'. When one examines the data from sequenced genomes, the changes appear NOT to be random or accidental, but one observes that whole chunks of the genome come and go. These 'chunks' often contain functional units, encoding sets of genes that together can perform some specific function.

Eukaryotes carry extremely complex splicing mechanisms, by which genes are rearranged to make novel proteins and organisms. That IS NGE, since the information to carry out the correct splicing, at the right place in the genome, and the right rearranging MUST BE an engineered, arranged, preprogrammed process. So the mere fact that it has been observed to occur permits us to make the logical and rational inference that the process is not random.

As I have more than once challenged Larry, with obviously no answer, the origin of the spliceosome cannot be rationally explained by arguing that evolution would be capable of the task. As once more, the whole thing is interdependent. There would be no need, use, or reason for evolution to produce a spliceosome if there were not the information on the genome for the spliceosome to find out where to cut, and what to stick together, in order to get new protein products.

As James Barham wrote:

We are finally beginning to realize, on the basis of irrefutable empirical evidence, as well as more careful analysis of Darwinian theory itself, that purposeful action in living things is an objectively real phenomenon that is presupposed, not explained, by the theory of natural selection.

And finally, you are simply wrong in arguing that Shapiro did not back up his claims. He in fact did, as he writes here:

McClintock's idea, she went on to identify the predicted ring chromosomes in the variegating mutants (McClintock 1932). Other mutants induced by X-ray treatment also carried chromosome rearrangements. Sections of chromosomes were deleted, translocated, inverted, and duplicated. All of these rearrangements could result from breaking chromosomes at two sites and then rejoining the broken ends to build brand new chromosome structures. These were quite different from the "gene mutations" imagined by Muller and his colleagues.

Unknown said...

The whole truth wrote:

""We know that random mutations degenerate the genome."

But didn't you say that mutations are "PROGRAMMED", "not random"? How can random mutations degenerate the genome when random mutations don't exist and have never occurred? "

There must be an error detection and repair mechanism right from the beginning.

Natural selection cannot act without accurate replication, yet the protein machinery for the level of accuracy required is itself built by the very genetic code it is designed to protect. That's a catch-22 situation. It would have been challenging enough to explain accurate transcription and translation alone by natural means, but as a consequence of UV radiation, it would have quickly been destroyed through accumulation of errors. So accurate replication and proofreading are required for the origin of life. How on earth could proofreading enzymes emerge, especially with this degree of fidelity, when they depend on the very information that they are designed to protect? Think about it.... This is one more prima facie example of a chicken and egg situation. What is the alternative explanation to design? Proofreading DNA by chance? And a complex suite of translation machinery without a designer?

Furthermore, we read here:

http://nrc58.nas.edu/RAPLab10/Opportunity/Opportunity.aspx?LabCode=50&ROPCD=506451&RONum=B7083

Since the editing machinery itself requires proper proofreading and editing during its manufacturing, how would the information for the machinery be transmitted accurately before the machinery was in place and working properly? Lest it be argued that the accuracy could be achieved stepwise through selection, note that a high degree of accuracy is needed to prevent ‘error catastrophe’ in the first place—from the accumulation of ‘noise’ in the form of junk proteins specified by the damaged DNA.

That does not mean, even with the repair mechanisms in place, that errors and mutations do not occur. They do, but within a life-permitting range. As Pitman puts it:

" Genome stability is continually challenged by a diverse array of mutagenic forces that include errors during DNA replication, environmental factors such as UV radiation, and endogenous mutagens such as oxygen free radicals generated during oxidative metabolism. This damage must also be detected and repaired on a constant basis. Of course, this repair isn't perfect and therefore likely contributes significantly to the actual mutation rate over and above that estimated by the indirect methods"

Faizal Ali said...

@ Otangelo"

In the peer reviewed paper:

Natural genetic engineering: intelligence & design in evolution?

David W Ussery writes...


Here is the first sentence of the abstract for this "peer-reviewed paper":

There are many things that I like about James Shapiro's new book "Evolution: A View from the 21st Century" (FT Press Science, 2011).

That doesn't sound like a scientific paper so much as a book report written by a nine-year-old. Perhaps this explains how this article got published in this (now defunct) "journal":

BioMed Central is retracting 43 papers, following their investigation into 50 papers that raised suspicions of fake peer review, possibly involving third-party companies selling the service.

Hmmm....

judmarc said...

How on earth could proofreading enzymes emerge, especially with this degree of fidelity, when they depend on the very information that they are designed to protect? Think about it.... This is one more prima facie example of chicken and egg situation.

When water freezes, it makes ice! How could something so hard emerge when it depends on the presence of a liquid? Think about it... this is one more prima facie example of a chicken and egg situation.

Thanks to you Otangelo, we now know only God can make ice.

Unknown said...

lutesuite-
From the paper: “Directed mutation has been defined as a genetic change that is specifically induced by the stress condition that the mutation relieves.”

This is what they claim to have found. You seem to be claiming there are no examples of this. Am I misunderstanding the claim of the paper, am I misunderstanding you, or have I missed an error in the paper?

Diogenes-
I think when you say ‘random’ you mean ‘random with respect to survival’.
The paper is about a specific instance where the genetic change induced was specific, caused by the stress condition, and relieved the stress condition.
I don’t think that would fit with ‘random with respect to survival’.

The researchers make no claim about what the organism ‘knows’ or is thinking.

If the paper I linked to is incorrect about having found such an instance, could you point out the error(s) made?

Unknown said...

eric-
I have linked to an example of a research paper where the researchers seem to be claiming they have found a 'directed mutation'.
http://www.pagepress.org/journals/index.php/eb/article/view/eb.2011.e3/6022
Can you tell me what error(s) they made in determining this?

Faizal Ali said...

@ Jack Johnson,

What did you not understand about the two previous posts, from Diogenes and myself? Try reading them again. You'll find your questions have already been addressed there. If you are still confused, maybe find someone who can help you with your dyslexia.

Faizal Ali said...

Why are you asking other people to answer questions that have already been answered? Did you not like the answers you were given? Or are you just not smart enough to understand them?

The Lorax said...

@John Harshman Sure, the duplication itself is not sufficient for speciation, but it provides the material for speciation to happen by loss of copies or diversification, and also through recombination that establishes genetic barriers to sexual reproduction.

Faizal Ali said...

That's entirely correct. To use an analogy, Shapiro is claiming that John Smith must have ESP because John Smith won the lottery. What he ignores is all of the lottery tickets that weren't winners.

To extend the analogy further: There may exist conditions in which more people purchase lottery tickets (e.g. if the jackpot is particularly large). Under those conditions, it is more likely that someone will win, for the simple reason that there are more people playing the game.

So let's say John Smith doesn't usually buy lottery tickets, but there had been no winner for several weeks and the jackpot had now hit $200 million, so he figured he might as well buy a ticket this time. Shapiro would interpret this as showing that John Smith wasn't just lucky, but actually knew the winning number ahead of time, because the chance of someone (anyone) winning the lottery is not completely random, but is more likely the larger the jackpot becomes.

Diogenes said...

Jack Jackson's citation is at least relevant, unlike anything cited by Otangelo or ElShamah. I'm reading it in more detail. I hope that Larry and you all will take a look.

judmarc said...

Jack, I've only read the abstract, but here's the basic problem: One must account not only for the events where a favorable mutation occurs, but also for the non-events, where mutations that would be favorable fail to occur. Or to put it more succinctly: Did Richard Lenski get a batch of especially stupid E. coli?

I seem to recall, in my reading of later articles by/about Shapiro's co-authors on early papers, such as Barry G. Hall, that they eventually concluded the mutations they initially evaluated as directed were just the easiest ones to accomplish. When anything that required multiple mutational steps was evaluated, especially if one or more of the steps was a mutation that occurred only rarely for whatever biochemical or organismal reason, suddenly the ability to "direct" mutation in favorable directions was gone. On this basis Shapiro's co-authors eventually concluded that they had been excited by studying the more likely candidates, i.e., the easier mutations, first, which had biased them in favor of the directed position.

John Harshman said...

Not sure what you mean by "recombination that establishes genetic barriers". Could you elaborate? I already mentioned loss of copies.

Eric said...

@Jack Jackson

Here is a section from that paper:

"But what about other sites on the E. coli chromosome? Examination of three other operons known to be activated by IS5 insertion, the fucAO,41 flhDC42 and bglGFB operons,38,39 revealed that neither the presence of glycerol nor the loss of GlpR influenced the IS5 hopping rates to these sites."

The activity of this transposon produces insertions all over the E. coli chromosome. It isn't as if this transposon only inserts into one position under one set of specific conditions and at a consistently high rate. While there is an increased rate in the mutation they focused on, they didn't do a complete genome sweep of many organisms to find out if there are other hotspots of transposon insertion at positions that are neutral or deleterious. They looked at a few sites, but that is hardly enough to truly test the null hypothesis.

We could also ask what we should see with a truly directed system for mutations. Should we see these mutations occur once in every 10 million bacteria? That isn't what I would expect from a directed system, yet that is what we see in these experiments. When I see those types of odds it looks a lot more like happenstance than direction.

While these results are certainly interesting and worth looking at, they do fall short of a truly directed system.

Eric said...

@lutesuite

To extend the analogy even further . . .

I think most people would agree that the lottery is random because the odds of a given result are not influenced by the ticket that a specific person is holding. However, Shapiro would argue that the lottery is not random. He would argue that some people buy more tickets, as you mentioned, which increases their odds of winning. He would also argue that the drawings occur at a set time on a set day, which makes the lottery non-random.

It is this type of subterfuge that rubs scientists the wrong way. Shapiro is trying to reframe what random mutations are in order to sell an idea.

Unknown said...

Diogenes-
Thank-you.

judmarc-
The researchers claim to have found a specific example of a ‘directed mutation’ defined as-
“… a genetic change that is specifically induced by the stress condition that the mutation relieves.”
They do not claim all mutations are directed or that the organism can direct any mutation— they give a specific example.

The organism doesn’t have to be ‘smart’ to mutate. That a mutation can be ‘specifically induced by the stress condition the mutation relieves’ does not require the organism to be intelligent, knowing, or even aware.

Eric-
You say-
“It isn't as if this transposon only inserts into one position under one set of specific conditions and at a consistently high rate…”
But that is what the paper does say is happening.
I'm not sure how much of the genome they surveyed before drawing the conclusion, so your point is taken. I thought that since they had looked at the areas they did look at, the conclusion was probably OK.

I gave the definition of ‘directed mutation’ being used in the paper- a definition that seems quite good.
It seems what we would see if these mutations are directed is what the researchers present. (Assuming the researchers have not made an error- something I am not sure of yet.)
I’m not sure how else a ‘truly directed system’ would work.

Eric said...

Jack Jackson writes:

"You say-
“It isn't as if this transposon only inserts into one position under one set of specific conditions and at a consistently high rate…”
But that is what the paper does say is happening."

Nowhere does the paper state that the transposon only inserts into one position, nor that it does so at a high rate.

If you look at fig. 3 you will see that the insertion happens 10-30 times per 100 million divisions. If your favorite sports team only won 1 out of 10 million games, would you say that they have a high win rate?

A truly directed system would change all genomes in all bacteria if they are exposed to the same stimuli. As it is, only 1 in 10 million bacteria respond with this mutation. As a comparison, immune cells have directed gene expression in response to antigens. If you take a specific subgroup of white blood cells and expose them to lipopolysaccharide, all of the cells will start producing specific cytokines. All of them. Not 1 in 10 million, but all of them. That is a directed system.

What we have with this experiment looks very much like happenstance. Instead of occurring 1 time out of 100 million divisions, it occurs 10 times out of every 100 million divisions. And that is just for one site out of thousands the transposon probably inserts into.

The basic problem is that the methodology is inherently biased. I am not saying this is a bad thing, just an unavoidable thing. Like I said before, what they discovered is interesting. However, if they want to make the grander claim that these mutations are guided then they need to use a different methodology to get a feel for the entire spectrum of transposon mutagenesis. What I strongly suspect they will find is an increase in both the deleterious and neutral mutation rates elsewhere in the genome under the same conditions.

Eric said...

Diogenes,

All of your questions are great. I would add just one thing to them.

If these organisms know which mutations are advantageous, why do we only see 1 out of millions or billions actually produce this mutation? Are they stupid or lazy?

Diogenes said...

Eric and JJ make some valid points, but let me examine this passage:

"But what about other sites on the E. coli chromosome? Examination of three other operons known to be activated by IS5 insertion, the fucAO,41 flhDC42 and bglGFB operons,38,39 revealed that neither the presence of glycerol nor the loss of GlpR influenced the IS5 hopping rates to these sites."

OK. Here is how I interpret that, and tell me if I'm wrong.

Let us consider four operons known to be activated by IS5 insertion. Call the probability of IS5 activating each of these p1, p2, p3, and p4. Insertions that activate operon 1 are beneficial in the presence of glycerol.

In the presence of glycerol:

the probability of IS5 insertion happening to activate operon 1 changes to a new value p1' > p1.

the probability of IS5 inserting to activate operons 2, 3 or 4 is unchanged, so p2' = p2, etc.

Have I got this right so far? The authors say this is a directed mutation because p1' > p1, but Eric says it's no big deal because p1' is in the range of 1 in 10 million or so. Apparently true. But it's still interesting that p1' > p1.

I'll need some time to comb through this paper in detail.
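
[To put rough numbers on this formalism, here is a quick sketch. The rates are assumptions chosen only to match the magnitudes quoted in this thread (roughly 1 to 10 insertions per 10^8 divisions), not values taken from the paper.]

```python
import numpy as np

rng = np.random.default_rng(1)
N   = 100_000_000  # cell divisions screened per condition (assumed)
p1  = 1e-8         # IS5 -> glpFK insertion rate without glycerol (assumed)
p1p = 1e-7         # the same rate with glycerol, i.e. p1' = 10 * p1 (assumed)

print("expected colonies, -glycerol:", N * p1)   # ~1
print("expected colonies, +glycerol:", N * p1p)  # ~10

# Twenty simulated replicate platings per condition:
minus = rng.binomial(N, p1,  size=20)
plus  = rng.binomial(N, p1p, size=20)
print("-glycerol counts:", minus.tolist())
print("+glycerol counts:", plus.tolist())
# Counts of ~1 vs ~10 separate cleanly even with Poisson-level noise, so a
# tenfold p1' > p1 is a detectable effect; whether it deserves the label
# 'directed' is exactly what is being argued here.
```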

Diogenes said...

They work for the DMV.

Eric said...

I would say that it's no big deal because it still inserts into that same spot when it isn't needed, even if it is at a lower rate. It also inserts into other positions in the genome where it isn't needed. The low rate of conversion is yet another problem, at least in my eyes.

They also need to show that it isn't just happenstance that the mutation rate increases for that specific operon. They did look at three other operons, which is commendable, but that is a drop in the bucket for the E. coli genome as a whole. Are there occasions where an increased rate of transposon mutagenesis would be adaptive, but the rate of mutation stays the same? I think we would all agree that it would not be surprising to find examples of this happening with this very transposon if we looked hard enough. We could possibly even find examples where the mutation rate decreased when it would be advantageous for the rate to increase.

From a larger metaview, bacteria don't want scripted adaptive systems anyway. If you only have a set number of scripted adaptations to a given stimulus then you run the risk of "running out of ideas". Even in the human adaptive immune system there is a large dose of random mutation and selection within antibody production. Imagine if humans only had a set number of antibodies, and all those antibodies were the same across the entire population. All a virus would need to do is get around those set number of antibodies and it would wipe out our species. A system that produces novel solutions is what you want because it allows for a wider array of adaptations.

Diogenes said...

Well, to continue with my formalism: is p2, p3, or p4 > p1'? Or are p2, p3, or p4 > p1? The former would make the mutation look less directed. How much variation is there in these rates? Do they vary by orders of magnitude? Is p1' - p1 within the expected standard deviation of insertion rates?

Diogenes said...

Eric: "We could possibly even find examples where the mutation rate decreased when it would be advantageous for the rate to increase."

Possibly, but that situation (where p1' < p1 in my formalism) would be harder to detect experimentally. This raises the possibility that the results in the paper could be an artifact of our ability to detect beneficial mutations, that is, perhaps transposon insertions with p1' < p1 happen as often or more often than p1' > p1, but the latter is easy to detect, and the former is harder to detect.

Diogenes said...

Eric: "From a larger metaview, bacteria don't want scripted adaptive systems anyway. If you only have a set number of scripted adaptations to a given stimulus then you run the risk of "running out of ideas".

Personally I agree, but we cannot assume this when discussing whether mutations are random or not; we would risk using circular logic to dismiss valid counter-examples.

Eric said...

Diogenes wrote,

"Possibly, but that situation (where p1' < p1 in my formalism) would be harder to detect that experimentally. This raises the possibility that the results in the paper could be an artifact of our ability to detect beneficial mutations, . . . "

That's my conclusion as well. There is an inherent bias in the methodology. The ol' "small sample size" problem, otherwise known as "need to increase n".

I always assume that authors are being honest unless proven otherwise, but there can be cases where they look at 20 or 100 genes and only see the results they want in 1 gene. What do they report? The results for that 1 gene. I don't think that has happened here, but it is a possibility that further research would need to rule out.

Unknown said...

Diogenes- Your formalism is helpful.
I think you are right about p1’ > p1. The paper seems to indicate a factor of 10. Further, p2’ = p2 etc. is also correct.
This indicates the mutation occurs in response to the stressor, and since the mutation relieves the stress, it is a directed mutation— that’s how I read it now.

It would also be an example of a ‘directed mutation’ in the sense that it is not random with respect to fitness.

It seems like a preprogrammed response to a stressor would be advantageous in some situations, while it would be disadvantageous in others. A trait like that can be ‘selected for’, and is therefore evolvable, if I understand the terminology.

It might be that this is an example of where the theory might differ from the history of life on Earth. It seems that evolution could occur if all mutations are random with respect to fitness (theory), and that this particular trait evolved in this particular instance might be seen more as an accident of history. If life has evolved elsewhere, one would expect a different set of accidents to have played out.

Eric-
It seems you have a different definition of ‘directed mutation’ than what the researchers have.

Mikkel Rumraket Rasmussen said...

"Natural selection cannot act without accurate replication, yet the protein machinery for the level of accuracy required is itself built by the very genetic code it is designed to protect."

The natural fidelity of polymerase enzymes is high enough to faithfully replicate tens of thousands of bases without making a single error. So in point of fact, error correction machinery is not strictly required for replication of DNA. All the genes of the replication system can be encoded and replicated by even relatively error-prone DNA-polymerases without them making a single error in these relatively few genes. Enzymes have been deliberately designed to be EIGHTY THOUSAND TIMES more error prone than wild-type variants and they still faithfully replicate almost ONE HUNDRED THOUSAND BASEPAIRS without making a single error.
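
Rough numbers behind that last claim (a back-of-the-envelope sketch; the 80,000x and ~100,000 bp figures are the ones above, and the wild-type rate is back-calculated rather than measured):

# For decent odds of copying ~100,000 bp without an error, the sloppy
# enzyme's error rate can be at most roughly 1 error per 100,000 bases.
bases = 100_000
mutant_rate = 1 / bases               # ~1e-5 errors per base for the degraded enzyme
wildtype_rate = mutant_rate / 80_000  # implies ~1.3e-10 per base for the wild type
print(mutant_rate, wildtype_rate)     # ~1e-05 and ~1.25e-10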

You seem to be under the misapprehension that without error correction machinery, the mutation rate would be instantly catastrophic. That is simply not the case. As usual your understanding is abysmal and your ignorance is astounding.

Unknown said...

Mikkel

"The natural fidelity of polymerase enzymes is high enough to faithfully replicate tens of thousands of bases without making a single error."

Correct. And how does it achieve that feat?!!

Bruce Alberts explains in “The High Fidelity of DNA Replication Requires Several Proofreading Mechanisms”:

If the DNA polymerase did nothing special when a mispairing occurred between an incoming deoxyribonucleoside triphosphate and the DNA template, the wrong nucleotide would often be incorporated into the new DNA chain, producing frequent mutations. The high fidelity of DNA replication, however, depends not only on complementary base-pairing but also on several “proofreading” mechanisms that act sequentially to correct any initial mispairing that might have occurred.

3′-to-5′ proofreading exonuclease clips off any unpaired residues at the primer terminus, continuing until enough nucleotides have been removed to regenerate a base-paired 3′-OH terminus that can prime DNA synthesis. In this way, DNA polymerase functions as a “self-correcting” enzyme that removes its own polymerization errors as it moves along the DNA

And afterwards, postreplicative mismatch repair is required as well.

In summary: no inbuilt 3'-5' exonuclease in the DNA polymerase holoenzyme and no postreplication repair (PRR) mechanisms means no life!!

As usual your understanding is abysmal and your ignorance is astounding. But as an intellectual masochist, you must feel a warm fuzzy feeling right now, Mikkel??!!

judmarc said...

Bruce Alberts explains

Otangelo, you mean the very same Bruce Alberts who says:


"[But] intelligent design is not science, it has no testable hypotheses, no proposed methodologies

Larry Moran said...

Mikkel is perfectly correct, as usual. Even without proofreading, the error rate of DNA polymerase is low enough to copy DNA efficiently in organisms with small genomes such as bacteria. The first cells on Earth had small genomes. Proofreading evolved later. Most modern polymerases still don't have a proofreading mechanism.

As usual, Otangelo's understanding of basic biochemistry is abysmal and his ignorance is astounding. He is incapable of learning even when you take the time to explain things to him.

Ed said...

Otangelo, can you please point to the bit where the author states: "In summary: no inbuilt 3'-5' exonuclease in the DNA polymerase holoenzyme and no postreplication repair (PRR) mechanisms means no life!!"

What the author does write though:
"As a result, a great deal is known about the detailed enzymology of DNA replication in eucaryotes, and it is clear that the fundamental features of DNA replication—including replication fork geometry and the use of a multiprotein replication machine—have been conserved during the long evolutionary process that separates bacteria and eucaryotes."

Ahh, the author is describing how DNA replication is NOW, not how it was or how it evolved.

He continues:
"There are more protein components in eucaryotic replication machines than there are in the bacterial analogs, even though the basic functions are the same. Thus, for example, the eucaryotic single-strand binding (SSB) protein is formed from three subunits, whereas only a single subunit is found in bacteria."

So what he's saying is that there's a simpler version of DNA replication in bacteria compared to the eucaryotic system, but the proteins involved perform the same function.

In fact you've (again!) just quoted a book which refutes your IC fantasy. Well done!!

judmarc said...

judmarc-
The researchers claim to have found a specific example of a ‘directed mutation’ defined as-
“… a genetic change that is specifically induced by the stress condition that the mutation relieves.”
They do not claim all mutations are directed or that the organism can direct any mutation— they give a specific example.

The organism doesn’t have to be ‘smart’ to mutate. That a mutation can be ‘specifically induced by the stress condition the mutation relieves’ does not require the organism to be intelligent, knowing, or even aware.


@JJ - Yes, I agree. But this really raises a couple of points that Eric makes, and one further point I'd like to make.

Eric's two points that I think are especially relevant are how often the "directed" mutation occurs, and how often "non-directed" mutations occur, in response to a given stimulus. If we're truly talking about 1 in 10 million, then does this qualify as non-random? That would have to be viewed against the background of the usual neutral and deleterious mutation rate - is 1 out of 10 million significantly better than usual odds? Or is the *overall* mutation rate simply going up in response to stress, a well known phenomenon, so that instead of 1 mutation that relieves the challenge in 10 million you get 10 in 100 million? That's not providing a direction for mutation, it's simply an elevated mutation rate, and it is limited by the "error catastrophe" condition.

When we look at the series of careful experiments performed by Richard Lenski's group, it seems to me the leading hypothesis is that what we're seeing in at least most of the "directed mutation" experiments may well be an elevated overall mutation rate that captures more "easy" mutations, but really lacks any specific direction. Otherwise, how to explain the stupidity of Lenski's poor old E. coli, sitting there for generation after generation with the equivalent of "water, water everywhere, nor any drop to drink"?

The other point I'd like to raise is about the author's chosen definition and the ambiguity it creates in English. "Directed" to many people implies a "director," in the sense of some level of orchestration by the bacteria (which is how I believe Shapiro and his early co-authors meant it), or even the ID fantasy of God taking time out to help the little E. coli (when He isn't busy implanting little propellers in the rear ends of other microorganisms, or worrying about who we are having sex with). "Directional" or some other word I haven't thought of might have been a better term for what the authors of this particular paper are talking about.

The Lorax said...

I'm thinking intrachromosomal and interchromosomal recombination events that lead to genetic barriers because meiosis would lead to haploids with incomplete genomes. Obviously this only applies to eukaryotes.

Unknown said...

Ho Ho Ho!!

Papa Larry must come to help out his pupil, Mikkel, from the humiliating situation.

As Koonin writes in The Logic of Chance: The Nature and Origin of Biological Evolution:

Eigen’s theory revealed the existence of the fundamental limit on the fidelity of replication (the Eigen threshold): If the product of the error (mutation) rate and the information capacity (genome size) is below the Eigen threshold, there will be stable inheritance and hence evolution; however, if it is above the threshold, the mutational meltdown and extinction become inevitable (Eigen, 1971). The Eigen threshold lies somewhere between 1 and 10 mutations per round of replication (Tejero, et al., 2011); regardless of the exact value, staying above the threshold fidelity is required for sustainable replication and so is a prerequisite for the start of biological evolution. Indeed, the very origin of the first organisms presents at least an appearance of a paradox because a certain minimum level of complexity is required to make self-replication possible at all; high-fidelity replication requires additional functionalities that need even more information to be encoded (Penny, 2005). The crucial question in the study of the origin of life is how the Darwin-Eigen cycle started—how was the minimum complexity that is required to achieve the minimally acceptable replication fidelity attained? In even the simplest modern systems, such as RNA viruses with the replication fidelity of only about 10^-3 and viroids that replicate with the lowest fidelity among the known replicons (about 10^-2; Gago, et al., 2009), replication is catalyzed by complex protein polymerases. The replicase itself is produced by translation of the respective mRNA(s), which is mediated by the immensely complex ribosomal apparatus. Hence, the dramatic paradox of the origin of life is that, to attain the minimum complexity required for a biological system to start on the Darwin-Eigen spiral, a system of a far greater complexity appears to be required. How such a system could evolve is a puzzle that defeats conventional evolutionary thinking, all of which is about biological systems moving along the spiral; the solution is bound to be unusual.

As usual, Larry's understanding as a University Professor of basic biochemistry is abysmal and his ignorance is astounding. He is incapable of learning even from a layman like me, even when you take the time to explain things to him.

Funny, if it were not so sad.....

Faizal Ali said...

Anyone else smell a quote mine? Koonin's whole book can be found at the link below, with Otangelo's quote(mine) on p.355. You be the judge:

http://evolocus.com/Textbooks/Koonin2011.pdf

Larry Moran said...

Otangelo Grasso is an expert at shifting goalposts. He first claimed that proofreading and error correction were essential. He said,

... without error detection and error correction, there would be no life. These mechanisms had to be present from day one, when life began, otherwise the mutation rates would be too high, and cells could never replicate with sufficient accuracy, and they would die.

This is not correct. Error detection and proofreading were not necessary in the beginning. They evolved later when genomes became more complex.

This demonstrates that Otangelo Grasso does not understand anything about biochemistry. However, instead of admitting he was wrong he now shifts the argument to the incorporation accuracy of DNA polymerase itself. This is so typical. It's why I find it impossible to have an intelligent conversation with him.

It's a good thing I turned off my irony meter before reading this ...

As usual, Larry's understanding as a University Professor of basic biochemistry is abysmal and his ignorance is astounding. He is incapable of learning even from a layman like me, even when you take the time to explain things to him.

Funny, if it were not so sad.....

judmarc said...

As Koonin writes in The Logic of Chance: The Nature and Origin of Biological Evolution

Good book. You might try reading it, or even better, understanding it.

By "understanding," I mean understanding why these scientists you quote support the theory of evolution.

Unknown said...

Larry wrote

"he was wrong he now shifts the argument to the incorporation accuracy of DNA polymerase itself. "

That was the crux of the question, when Mikkel wrote :

"The natural fidelity of polymerase enzymes is high enough to faithfully replicate tens of thousands of bases without making a single error. "

Well, yes, it is, precisely because it INCORPORATES ESSENTIAL REPAIR MECHANISMS, as pointed out above.

But maybe I am missing something, and it would be kind on your part to back up your claim:

" This is not correct. Error detection and proofreading were not necessary in the beginning. They evolved later when genomes became more complex. "

Any evidence for this ??

Ed said...

Otangelo reads scientific literature like he reads the Bible: cherry-picking all the way.
He's been caught lying/quote-mining Davies, Bruce Alberts, and Koonin, only picking out the bits which confirm his fantasy (ID), ignoring everything which doesn't fit or refutes his own fantasy.

And he also is under the impression god is knitting nucleotides together when you run a PCR.

Unknown said...

Ed wrote

" ignoring everything which doesn't fit or refutes his own fantasy."

What exactly am I ignoring?!

And if you read my comment above, I am actually eagerly waiting for Larry to back up his claims. Be assured that I WILL NOT IGNORE his evidence, and I will be the first to change my mind if new scientific evidence (which I might not know of yet) refutes what I have learned so far.

" And he also is under the impression god is knitting nucleotides together when you run a PCR ".

No, that's actually scientists doing this. That's why it is an example of INTELLIGENT DESIGN, and invalid as a refutation of what I wrote.

John Harshman said...

I don't see that as being very important in speciation, except perhaps in species with facultative asexual reproduction. If a single recombination event causes a genetic barrier, how is the mutant going to reproduce? Differential loss of copies, on the other hand, requires events in two populations, neither of them causing problems within either population. In obligate outcrossers, a genetic barrier requires at least two events.

Larry Moran said...

Otangelo Grasso says,

That was the crux of the question, when Mikkel wrote :

"The natural fidelity of polymerase enzymes is high enough to faithfully replicate tens of thousands of bases without making a single error. "

Well, yes, it is, precisely because it INCORPORATES ESSENTIAL REPAIR MECHANISMS, as pointed out above.


RNA polymerase is an example of a polymerase that does not have proofreading. The proofreading activity of DNA polymerases can be inactivated by mutation. Their modern nucleotide incorporation error rates are about 10^-6, which means they make a mistake about one time in a million nucleotides incorporated. They don't need additional repair or proofreading mechanisms to accurately copy a genome of about 500,000 bp. (That's enough for hundreds of genes.)

Mikkel said "tens of thousands," which is a lot less than one million. He was perfectly correct.
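
To make the arithmetic explicit, a minimal sketch in Python using only the figures above (a 10^-6 per-base error rate and a 500,000 bp genome):

error_rate = 1e-6   # errors per nucleotide incorporated, without proofreading
genome = 500_000    # genome size in bp
expected_errors = error_rate * genome   # average errors per replication
p_perfect = (1 - error_rate) ** genome  # probability of a completely error-free copy
print(round(expected_errors, 2), round(p_perfect, 2))  # 0.5 and ~0.61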

I think Otangelo should stop digging.

Unknown said...

Larry wrote:

" Their modern nucleotide incorporation error rates are about 10^-6, which means they make a mistake about one time in a million nucleotides incorporated. They don't need additional repair or proofreading mechanisms to accurately copy a genome of about 500,000 bp. (That's enough for hundreds of genes.) "

Bruce Alberts writes :

https://www.ncbi.nlm.nih.gov/books/NBK26850/

Rare tautomeric forms of the four DNA bases occur transiently in ratios of 1 part to 10^4 or 10^5. These forms mispair without a change in helix geometry: the rare tautomeric form of C pairs with A instead of G, for example.

If the DNA polymerase did nothing special when a mispairing occurred between an incoming deoxyribonucleoside triphosphate and the DNA template, the wrong nucleotide would often be incorporated into the new DNA chain, producing frequent mutations. The high fidelity of DNA replication, however, depends not only on complementary base-pairing but also on several “proofreading” mechanisms that act sequentially to correct any initial mispairing that might have occurred.

So, who is right, Larry??!!

"I think Otangelo should stop digging. "

Why? You call yourself a "skeptical biochemist".

I am doing precisely that. I am skeptical and critical. I spell out precisely where the evidence leads.

Are you uncomfortable that the evidence does not lead where you want it to, Larry?!

Unknown said...

judmarc-
Eric is using a different definition of ‘directed mutation’ than what the researchers are using. The researchers are using the definition from Cairns ‘a genetic change that is specifically induced by the stress condition that the mutation relieves’
He uses the phrase ‘truly directed system’, but I don’t know what it means.

How often something occurs has nothing to do with it being random or not.

I understand people can think ‘directed’ implies a ‘director’ and appreciate the concern.
I would think this paper would be an excellent remedy for that.

Faizal Ali said...

And Otangelo just keeps on digging.

Mikkel Rumraket Rasmussen said...

He's still operating on the idea that a DNA polymerase without proofreading would become a catastrophic error machine, rather than the reality that it just ups the error rate from some extremely small rate to one a few orders of magnitude higher yet still entirely feasible for an organism with a small genome.

He quotemines Koonin talking about the origin of life in the context of a fully living entity with a translation system.
Koonin gives error rates of some extremely error-prone RNA polymerases from viruses. Even so, a replication fidelity of 10^-3 would be one error in a thousand basepairs, which is more than enough for a self-replicating enzyme to stay significantly above the Eigen threshold when it replicates itself. Besides, viral RNA polymerase fidelities are due to natural selection for extremely high mutation rates.

Koonin writes, "The Eigen threshold lies somewhere between 1 and 10 mutations per round of replication".

But a typical ribozyme is only up to a few hundred bases (and so are the quasi-self-replicators evolved by Joyce et al.), so with an error rate of 10^-3 a hypothetical self-replicating ribozyme could still replicate itself somewhere around five to ten times before a mutation crept in.
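
Here is the arithmetic for anyone who wants to check it (a sketch; the 10^-3 error rate and the 1-10 mutations-per-round threshold are the figures quoted from Koonin, and 200 bases stands in for "a few hundred"):

error_rate = 1e-3
ribozyme_len = 200
muts_per_copy = error_rate * ribozyme_len   # 0.2, well under the quoted 1-10 threshold
p_clean = (1 - error_rate) ** ribozyme_len  # ~0.82 chance of a perfect copy
clean_runs = p_clean / (1 - p_clean)        # ~4.5 error-free copies before the first mutation
max_genome = round(1 / error_rate)          # ~1,000 nt sustainable at a threshold of ~1 mutation/copy
print(round(muts_per_copy, 2), round(clean_runs, 1), max_genome)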

More pertinently, DNA polymerases are much more accurate, and even with proofreading disabled they will still replicate tens of thousands of basepairs without a single error. Exactly what I wrote to begin with. Notice how not even a single one of Otangelo's references challenges this. That's because it's a concrete real-world fact that the people he quotemines are also aware of.

See Otangelo, if you actually bother to do the math, rather than just copy-paste cherry-picked sections from the literature, it turns out this isn't actually a problem. Not for DNA, and not even for RNA polymerases and a hypothetical minimal living entity without translation or proofreading machinery.

In point of fact: Real ribozymes suggest a relaxed error threshold

"The error threshold for replication, the critical copying fidelity below which the fittest genotype deterministically disappears, limits the length of the genome that can be maintained by selection. Primordial replication must have been error-prone, and so early replicators are thought to have been necessarily short1. The error threshold also depends on the fitness landscape. In an RNA world2, many neutral and compensatory mutations can raise the threshold, below which the functional phenotype3, rather than a particular sequence, is still present4, 5. Here we show, on the basis of comparative analysis of two extensively mutagenized ribozymes, that with a copying fidelity of 0.999 per digit per replication the phenotypic error threshold rises well above 7,000 nucleotides, which permits the selective maintenance of a functionally rich riboorganism6 with a genome of more than 100 different genes, the size of a tRNA. This requires an order of magnitude of improvement in the accuracy of in vitro-generated polymerase ribozymes7, 8. Incidentally, this genome size coincides with that estimated for a minimal cell achieved by top-down analysis9, omitting the genes dealing with translation."

And now over to Otangelo who will bring up another irrelevancy, either by quotemining more irrelevant material or just plain make shit up. Or well, I concede he can continue the PCR route and just respond "pfff... hahaha" when shown to be wrong for the fiftieth time.

Mikkel Rumraket Rasmussen said...

"Rare tautomeric forms of the four DNA bases occur transiently in ratios of 1 part to 104 or 105."

That's between ten thousand and one hundred thousand.
Let's remind ourselves what I wrote to begin with and what I responded to:
Otangelo: ""Natural selection cannot act without accurate replication, yet the protein machinery for the level of accuracy required is itself built by the very genetic code it is designed to protect."

Me: The natural fidelity of polymerase enzymes is high enough to faithfully replicate tens of thousands of bases without making a single error. So in point of fact, error correction machinery is not strictly required for replication of DNA. All the genes of the replication system can be encoded and replicated by even relatively error-prone DNA-polymerases without them making a single error in these relatively few genes. Enzymes have been deliberately designed to be EIGHTY THOUSAND TIMES more error prone than wild-type variants and they still faithfully replicate almost ONE HUNDRED THOUSAND BASEPAIRS without making a single error.

So I'm still correct and you're still wrong. You complete dolt.

Keep digging Otangelo, keep digging.

The Lorax said...

Those species with facultative asexual reproduction probably 'think' it's very important in speciation. I know this basically excludes the macroscopic eukaryotes that receive most of the attention, but microscopic eukaryotes are often, if not generally, asexual in nature.

The Saccharomyces sensu stricto genus is a great example of speciation after genome duplication. Since I'm a big proponent of malt fermentation, I think this is important.

judmarc said...

How often something occurs has nothing to do with it being random or not.

More specifically, would you agree it makes a difference as to whether we have something new and worthy of attention, or the same old accelerated pace of mutation under stress, under two scenarios:

- (1) The normal rate of a particular mutation is 1 in 10 million mutations per period x, and under stress it becomes 10 in 100 million over the same period.

- (2) The normal rate of a particular mutation is 1 in 10 million per period x, and under stress it becomes 10 in 10 million over the same period.

Also, would you have any idea, if the second scenario is occurring, why Lenski's E. coli don't hurry up and mutate in the right ways?
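
In Python, my reading of the two scenarios (a sketch; the numbers are the ones above):

# Scenario 1: the specific mutation stays at the same fraction; only the
# total number of mutations goes up.
s1_normal = 1 / 10_000_000
s1_stress = 10 / 100_000_000
# Scenario 2: the specific mutation itself becomes tenfold more frequent.
s2_normal = 1 / 10_000_000
s2_stress = 10 / 10_000_000
print(round(s1_stress / s1_normal, 1))  # 1.0, just a higher overall mutation rate
print(round(s2_stress / s2_normal, 1))  # 10.0, a candidate "directed" signal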

Unknown said...

Mikkel wrote

" He's still operating on the idea that a DNA polymerase without proofreading would become a catastrophic error machine, rather than the reality that it just ups the error rate from some extremely small rate to one a few orders of magnitude higher yet still entirely feasible for an organism with a small genome. "

If DNA replication were, on its own, sufficiently error-free not to cause cancer, aging, and cell death, why would cells produce an array of repair mechanisms at all, namely:

A proofreading system that catches almost all errors
A mismatch repair system to back up the proofreading system
Photoreactivation (light repair)
Removal of methyl or ethyl groups by O6 – methylguanine methyltransferase
Base excision repair
Nucleotide excision repair
Double-strand DNA break repair
Recombination repair
Error-prone bypass

???

Take a piece of paper, write a sentence, and insert a number of misspelled words. Now ask someone who only understands Chinese to find your errors. Could he, without understanding English? How could errors be detected without a preceding program and convention of what is correct? Furthermore, the whole machinery to produce proteins would have to be in place to produce the error-correction enzymes.

James Shapiro points out that:

all cells from bacteria to man possess a truly astonishing array of repair systems which serve to remove accidental and stochastic sources of mutation. Multiple levels of proofreading mechanisms recognize and remove errors that inevitably occur during DNA replication. … cells protect themselves against precisely the kinds of accidental genetic change that, according to conventional theory, are the sources of evolutionary variability. By virtue of their proofreading and repair systems, living cells are not passive victims of the random forces of chemistry and physics. They devote large resources to suppressing random genetic variation and have the capacity to set the level of background localized mutability by adjusting the activity of their repair systems

judmarc said...

Keep digging Otangelo, keep digging.

If it leads to more interesting stuff from you, Mikkel, I'd encourage it as well. ;-)

Thanks.

Mikkel Rumraket Rasmussen said...

"If DNA replication would be sufficient error prone to not cause cancer , aging, and cell death, why at all would cells produce a array of various repair mechanisms, that is :"

First of all, a single-celled organism can't die of old age or cancer; those are uniquely problems that multicellular organisms suffer from.

So why would higher replication fidelity evolve?
Because, on average, fewer errors mean higher fitness. More errors do not mean instant death; they usually mean more deleterious mutations, not instant lethality. That's also why higher replication fidelity can evolve from non-correcting enzymes. They still work even without error correction and don't cause the death of the carrier organism, so if they mutate and increase their accuracy, the carrier will have children with fewer deleterious mutations. If you could actually think, all this would be immediately apparent to you.

You are stuck in the black-and-white thinking of a fundamentalist. You think it has to be fully the way it is now, or else it could not work at all.

If you read a book with a spelling error in 1 out of every 100 words, YOU CAN STILL READ IT. It's annoying, maybe some of the meaning is ambiguous and in rare cases, lost. But it's readable. But a book with 1 spelling error in every 1000 words is even easier to read. Maybe no meaning is lost there at all, and there are only cases of ambiguity in the details. And a book with 1 error in every 5000 words is even better and so on and so forth.

Notice how it doesn't have to be ENTIRELY GIBBERISH vs COMPLETELY ERROR FREE. There is a spectrum between the two extremes.
At a basic level, the base-pairing of ribonucleotides is itself a level of error correction. The system is intrinsically accurate. It is simply not possible under normal physiological conditions for the error-rate to be on the order of total gibberish.

Larry Moran said...

Otangelo Grasso writes,

So, who is right, Larry ??!!

Both of us are right. That shouldn't come as a big surprise since I did my Ph.D. thesis with Bruce Alberts working on DNA replication.

He and I both know that the error rate of the polymerization reaction, without proofreading, is several orders of magnitude less than the reaction with proofreading. However, we both know that the polymerization reaction, by itself, is still accurate enough to replicate lots of DNA without making any mistakes.

And we both know that a study of modern polymerases strongly suggests that the primitive enzymes did not have a proofreading mechanism.

Unknown said...

judmarc-
Scenario 1 would indicate an increased overall mutation rate. Scenario 2 would indicate the mutation was in response to the stressor, and if the mutation relieved the stress, then scenario 2 would be an example of a directed mutation.

The first scenario is fairly common as I understand it. Mutation rates go up under stress.
What makes the paper I linked to interesting is that it is reporting an example of the second scenario.

Lenski uses different food sources than the experiment in the paper did. The mechanism in the paper is very specific to the stressor; that is what makes the mutation 'directed'. Apparently the mechanism doesn't exist for the stressors Lenski is providing.

Mikkel Rumraket Rasmussen said...

"Both of us are right. That shouldn't come as a big surprise since I did my Ph.D. thesis with Bruce Alberts working on DNA replication."

This is poetic irony. I think I've chuckled at this for a good 5 minutes now.

Eric said...

Jack Jackson,

"I think you are right about p1’> p1. The paper seems to indicate by a factor of 10 times. Further p2’=p2 etc. is also correct.
This indicates the mutation occurs in response to the stressor, and since the mutation relieves the stress, it is a directed mutation— that’s how I read it now."

That is not entirely correct. The mutation also occurs in the absence of the stressor. Mutations from the same mechanism also occur throughout the genome, and they run the gamut from beneficial to neutral to detrimental.

"The researchers are using the definition from Cairns ‘a genetic change that is specifically induced by the stress condition that the mutation relieves’"

It isn't specific. The mutation they are looking at occurs without the stressor, and the same mechanism produces mutations at different sites. On top of that, it only occurs in 10 out of millions of bacteria. A directed system should work at a much higher rate.

If this system is set up to produce a specific mutation in response to a specific stimulus, then why does it only occur once in millions of bacteria? If it is set up to produce a specific mutation, then why is it producing mutations all over the genome?

Faizal Ali said...

A genuine Marshall McLuhan/Annie Hall moment. Priceless! I bet Otangelo doesn't even know what just hit him.

John Harshman said...

the error rate of the polymerization reaction, without proofreading, is several orders of magnitude more than the reaction with proofreading.

Fixed. You know what he meant.

Larry Moran said...

Thanks John. "More" is correct. My bad.

Unknown said...

Larry asserts that

" we both know that the polymerization reaction, by itself, is still accurate enough to replicate lots of DNA without making any mistakes."

In contrast, Alberts writes:

" The high fidelity of DNA replication, however, DEPENDS not only on complementary base-pairing but also on SEVERAL "PROOFREADING" MECHANISMS that act sequentially to correct any initial mispairing that might have occurred. "

So Alberts does NOT agree with you, Larry.

Furthermore, the following paper, DNA Damage, DNA Repair, and Nanomaterial Toxicity, states:

DNA repair is regarded as one of the essential events in ALL life forms.

Unknown said...


Mikkel wrote

" That's also why higher replication fidelity can evolve from non-correcting enzymes. They still work even without error correction and don't cause the death of the carrier organism, so if they mutate and increase their accuracy, the carrier will have children with fewer deleterious mutations. If you could actaully think all this would be immediately apparent to you. "

You just sucked that out of your finger? No, you just had a sudden burst of fanciful imagination, Mikkel? Anyway, you provide a very nice example of what I call PSEUDO-SCIENCE. That is, when someone makes up a superficial scientific story without any evidence whatsoever to back up the claim. But you are not alone, Mikkel. It happens all the time in scientific papers.... LOL.

Let us dig deeper, shall we??!!

Do you have any idea of the complexity of the three steps that give rise to high-fidelity DNA synthesis??

5′ → 3′ polymerization: 1 error in 10^5
3′ → 5′ exonucleolytic proofreading: a further 1 in 10^2
Strand-directed mismatch repair: a further 1 in 10^3
Combined: 1 error in 10^10
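
For what it's worth, the "Combined" figure is simply the product of the three factors; a minimal sketch in Python, using only the numbers in the table above:

polymerization = 1e-5   # 1 error in 10^5 from 5'->3' polymerization alone
proofreading = 1e-2     # 1 in 10^2 of those errors survive 3'->5' proofreading
mismatch_repair = 1e-3  # 1 in 10^3 of the rest survive mismatch repair
print(polymerization * proofreading * mismatch_repair)  # ~1e-10 combined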

Unknown said...
This comment has been removed by a blog administrator.
Faizal Ali said...

ROTFLMAO! Keep it up, Otangelo. You have no idea how amusing it is to watch you keep spinning your wheels after your claim has been demonstrated to be wrong by direct empirical observation. That claim, you seem to have forgotten, is: "Natural selection cannot act without accurate replication, yet the protein machinery for the level of accuracy required is itself built by the very genetic code it is designed to protect." The evidence that demonstrates that to be false has been presented to you, yet you refuse to accept it. And your posts above do not address this.

IDiot.

Unknown said...

Eric,
A tenfold increase in rate is in response to the stressor.
It doesn't matter that the mutation occurs without the stressor.
It doesn't matter that the rate is less than 100%.
It doesn't matter that other mutations occur- it matters that this one occurs at a much higher rate than the others. And according to the text this one occurred 10 times more often while the other sites did not experience similar changes.

You are using a different definition for 'directed mutation' than the researchers.
Based on what I've read thus far I'm going to agree the researchers did not find a 'truly directed system' as you seem to be using the term.

Anonymous said...

Diogenes -- Good point -- for recent polyploids. As John Harshman says, you're right as long as all that DNA stays functional. However, genes tend to mutate away or be lost. Therefore, it is common that, for example, hexaploids have less than 3 times as much DNA as diploids, and plants that are 20X have a lot less than 10 times as much DNA as diploids. As far as isozymes go, ancient "diploidized" polyploids can be scored as diploids except for occasional duplicated genes.

Ed said...

Lutesuite, are you sure? I mean, Otangelo did write this a few posts above:

"And if you read my comment above, i am actually eagerly waiting Larry to back up his claims. Be assured that i WILL NOT IGNORE his evidence, and be the first to change my mind, if the eventually new scientific evidence ( which i might not know of yet ) refutes what i learned so far. "

He can't have lied, I mean he's a devout christian, they can't lie... right??

Mikkel Rumraket Rasmussen said...

"Let us digg deeper, shall we ??!!"

Yes, let's do that. Keep going. XD

Mikkel Rumraket Rasmussen said...

As a committed trooper of God, doing God's work on Earth, not only can Otangelo not lie, it can't even be the case that there is something he doesn't understand; he literally can't be wrong. It's impossible. The chosen one, a prophet sent to us by God Himself! Us lowly nonbelievers just can't keep up with this glorious oracle of scientific facts.

It's been good, boys, but now it's time to buy sandals and prayer mats and PRAISE THE LORD!

Unknown said...

Hahaha.... lick your wounds, Mikkel.... the situation is terrible.

Faizal Ali said...

Again: I don't think anyone is disputing the claims of these researchers. What is being disputed is whether this finding supports the claims of the "3rd Way" creationists.

Anonymous said...

Otangelo,

I don't expect honesty from you, but prove me wrong. Where in your quote does Alberts deny what Larry said? Remember that Larry was very specific in what he said:

"is still accurate enough to replicate lots of DNA without making any mistakes"

Alberts is talking about high fidelity, but he's not saying that the polymerase alone cannot replicate lots of DNA without making any mistakes.

Do you understand that at all? Show us that honesty you were claiming. Show that you understand your mistake.

I won't expect much from you, so I'll call you an illiterate imbecile right now. I know you won't admit that you can neither know what high-fidelity means in that quote, nor whether that quote contradicts Larry's point.

You're an illiterate imbecile.

Anonymous said...

Once again we see a creationist's understanding of evolution lead to thinking about an impossible situation, which is then used to disprove evolution. He imagines a first cell that had to be highly efficient at replication. He forgets the long time before the first cell, and that no doubt the first cell didn't function all that well.

Long before the first living cell, there were eons of biochemical change and evolution. RNA, for example, can catalyze its own replication. It's not perfect, but variants that are better at replicating accurately become more common -- and eventually more efficient. (This has been seen in lab experiments.) RNA can also catalyze protein formation. If an RNA molecule by chance catalyzes formation of a protein that increases the rate or accuracy of its RNA replication, that RNA/protein combination would increase. A great deal of biochemical evolution could happen, must have happened before the first cell that we'd all agree was living.

(I know OG/ES has defined natural selection as impossible before the first living cell, but he's wrong.)

The first cells we'd call living didn't need to be highly efficient at replicating, as long as the probability of producing a viable cell was just slightly higher than the probability of producing an inviable one. After all, there was nobody to compete with, at first. And given how poorly the first cells operated, there may have been more mutational "space" to improve function than there is now.

The editing machinery associated with DNA replication is amazing. Wonderful. But not needed initially.

Anonymous said...

OG/ES wrote: "Nucleotide excision repair (NER) is a particularly important excision mechanism that removes DNA damage induced by ultraviolet light (UV)."

Very true. And no doubt irrelevant to the earliest cells, which probably formed in deep sea vents, mud in estuaries, and other locations effectively hidden from UV radiation.

Eric said...

Eric,

"You are using a different definition for 'directed mutation' than the researchers."

I am using a different definition, and it is the correct one. All of those things you claim don't matter do, in fact, matter.

Part of it is understanding what the theory of evolution really states, which is that there is no meaningful connection between what the organism needs and what mutations occur. Going from 1 mutation to 10 mutations per 10 million divisions is not meaningful. At best, it is happenstance that certain conditions cause different mutations to happen at a slightly different rate. There is no meaningful cellular system that specifically senses a metabolite and then mutates a specific base in response to the presence of that metabolite. If such a system existed, it would act far more often than 10 times every 10 million divisions. If we look at the lac operon, 100% of E. coli with the lac operon start producing more β-gal when lactose is present (and glucose is not). ALL of them. That is a directed system.
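
To put that contrast in numbers (a rough sketch; the 10-in-10-million figure is the one under discussion, and the lac response is idealized as 100%):

regulated_response = 1.0               # fraction of cells that induce the lac operon
mutational_response = 10 / 10_000_000  # fraction of cells acquiring the insertion
print(regulated_response / mutational_response)  # a ~1,000,000-fold difference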

Unknown said...

You seem to be engaged in moving the goalposts.

I have given you the definition being used by the experts in the field and you have denied this definition without supplying an alternative with reference.

Anonymous said...

Medical dictionaries are vague: directed mutation = "useful mutations occurring at specific genomic locations in response to particular conditions related to selection."

Lenski & Mittler’s paper on the directed mutation controversy is carefully specific (1993. The directed muation controversy and neo-Darwinism. Science 259: 188-194: “We define as directed a mutation that occurs at a higher rate specifically when (and even because) it is advantageous to the organism, wereas comparable increases in rate do not occur either (i) in the same environment for similar mutations that are not advantageous or (ii) for the same mutation in similar environoments where it is not advantageous.”

Faizal Ali said...

@ Jack Johnson

I have given you the definition being used by the experts in the field and you have denied this definition without supplying an alternative with reference.

No, what you have given is a definition, not the definition.

Anyway, now that you have an alternative definition with a reference, perhaps you can tell us whether any examples exist of the type defined by Lenski and Miller. The example in the paper you have cited clearly does not, and that example is fully explained by current evolutionary theory. An example meeting the Lenski and Miller definition, OTOH, would require a major revision of evolutionary theory.

Faizal Ali said...

Sorry, that should be "Mittler" not "Miller"

Unknown said...

The paper I linked to is an example of a 'directed mutation' as defined by Lenski and Mittler.

As I pointed out earlier, this trait would affect reproduction rates; therefore it is evolvable.

What would require a revision of the theory would be if the sequence the researchers found was not evolvable.


Faizal Ali said...

The paper I linked to is an example of a 'directed mutation' as defined by Lenski and Mittler.

No, it isn't. Learn how to read.

Faizal Ali said...

Here, let me try to help you, Jack. This is Lenski and Mittler's definition, w/ emphasis added by me:

“We define as directed a mutation that occurs at a higher rate specifically when (and even because) it is advantageous to the organism, whereas comparable increases in rate do not occur either (i) in the same environment for similar mutations that are not advantageous or (ii) for the same mutation in similar environments where it is not advantageous.”

Do you understand the part in bold? Do you think that requirement is met by your example? If you think so, then you need to re-read, because you don't understand what you've read.

Unknown said...

lutesuite-
From the paper-

"But what about other sites on the E. coli chromosome? Examination of three other operons known to be activated by IS5 insertion, the fucAO,41 flhDC42 and bglGFB operons,38,39 revealed that neither the presence of glycerol nor the loss of GlpR influenced the IS5 hopping rates to these sites."

From this we see-
“comparable increases in rate do not occur either (i) in the same environment for similar mutations that are not advantageous”

I’m guessing that because you did not bold the part about ‘comparable increases’ you forgot that clause applied to the phrase you bolded.
Otherwise I’m sensing that you have not read the paper at all, because if you had you would realize it is laid out to make the case that the definition has been met.

Anonymous said...

I've read the article. A slow slog because it's not my field. This example seems to meet the definition of directed mutation, but not the idea behind directed mutation. (This is a "letter of the law" versus "spirit of the law" thing.) It seems to be a very odd mechanism of gene regulation.

I'll try to explain.

Anonymous said...

Bear in mind that this is not my field! Correct me if necessary!

In E. coli, aerobic growth on a substrate of glycerol requires enzymes whose transcription is controlled by the glp regulon. Though the situation is more complicated, CRP is needed for this gene cluster to operate, and GlpR represses it. Mutants that can't make CRP (or aren't sensitive to it?) can't live on glycerol alone.

Meanwhile, the genome includes the IS5 transposon. As transposons do, it hops around. One of the places it can go is in front of the whole glp gene complex. When it goes there, it turns the complex on. Interestingly, it can also hop into the regulatory regions of other genes and turn them on (or off??).

So . . .

Anonymous said...

If E. coli that lack CRP are placed on a medium where glycerol is the only major food source, they can't grow. However, if you wait a few days, colonies will appear. They're making the necessary enzymes.

How could this be? 116 of 116 such mutant colonies had IS5 inserted at the beginning of the glp regulon!

Transposition of IS5 is regulated by the same GlpR that is involved in repressing this gene complex. (At that point, my brain gave up, so if you want to understand more about that, read the paper.)

The researchers checked the regulatory regions of some other genes unrelated to glp but known to have their rates affected by the IS5 transposon. They report that the IS5 insertion rate did not increase when the bacteria were plated with glycerol as the food source.

So . . .

Anonymous said...

Is the insertion of IS5 in front of the glp regulon a mutation? Yes, it is a change in gene sequence.

Does the insertion of IS5 in front of the glp regulon meet the definition of a "directed" mutation? I can't evaluate the evidence well, but apparently yes. It occurs at a higher rate when glycerol is the only food source than in other conditions. It seems to occur in the regulon of this gene and not in others when glycerol is available, due to the action of GlpR.

Is this what Lenski and Mittler, for example, meant by directed mutation? Not really. It appears that this is an example of a "tamed" transposon functioning in the cell as a form of regulation of the glp genes (and probably some other, unrelated genes). This is NOT a case of the DNA mutating through normal mechanisms to provide what the cell needs.

Very interesting example.

Unknown said...

bwilson295-
Thank you for the analysis; I agree, as far as I understand it. (I’m not as expert as you, probably.)

I’m thinking the trait (a rapid response to a specific threat) would be available for selection and therefore evolvable. (Excuse me if I butcher the nomenclature.) From this I get the impression that it is possible to evolve sequences that respond in ‘directed’ and ‘non-random’ ways.

From other comments I see that there is some idea that a ‘directed mutation’ would require foresight or knowledge or perhaps even a supernatural element.
I don’t think any of those things are required to explain the sequence presented in the paper.

I’ve been told that evolution is smarter than evolutionary biologists.
Maybe this is an example of that.

Anonymous said...

I agree that foresight is not needed here. It is an evolvable system. And it's not just free mutations.

"Evolution is smarter than evolutionary biologists" -- maybe, and certainly it's trickier!

Faizal Ali said...

OK, it looks like I have egg on my face. I misread the paper on the first skim through. A second skim indicates that Jack Johnson is right, and this does seem to meet Lenski and Mittler's definition of "directed." My apologies.