Rob Stadler & James Tour: Darwin vs. Design

I just finished watching three interesting discussions between Rob Stadler and James Tour comparing the potential and limits of Darwinian-style naturalistic evolution vs. what one should expect if intelligent design were required to explain the diversity between major kinds of living things (see videos below).

Both of these scientists have a high degree of relevant expertise when it comes to evaluating designed systems. Rob Stadler obtained his Ph.D. in medical engineering from the Harvard/MIT Division of Health Sciences and Technology. James Tour obtained his Ph.D. in synthetic organic and organometallic chemistry from Purdue University and completed postdoctoral training in synthetic organic chemistry at the University of Wisconsin and Stanford University.

The Darwinian Tree of Life:

Their discussions focus on the proposed Darwinian “Tree of Life” where all of the diversity of living things on this planet is thought to have been derived over billions of years via the addition of innumerable evolutionary modifications starting from the same original common ancestor (some single-celled organism in some warm pond some four billion or so years ago).

The Forest of Life:

Or, does the evidence better support a kind of “Forest of Life” (or a Dependency Graph Model) where all of the major kinds of living things were originally designed and front-loaded with all of the genetic information for extensive, but limited, diversity within their respective gene pools?

These fascinating discussions are well worth your time. I’m embedding the first video, starting at a section that kind of summarizes the arguments in favor of the “Tree” vs. the “Forest” of Life.  However, I would recommend watching the entirety of both discussions below.

Examples of Evolution in Action:

Citrate Evolution:

The second discussion, in particular, reviews the evolution of citrate metabolism in the presence of oxygen within E. coli bacteria, demonstrating that nothing truly genetically novel evolved within the gene pool. The change was based on just two gene duplication mutations, without any changes to the duplicated genes themselves and without any new structural enzymes evolving. Note that E. coli could already metabolize citrate in the absence of oxygen. Overall, then, this change is based on an actual loss of genetic control over enzyme production. It is actually a form of devolution, not evolution. A similar loss of genetic information was the basis for lactase evolution in E. coli.

Whole Genome Duplication:

There is also an interesting side discussion with Sy Garte, PhD (biochemist, author, researcher, and former atheist) regarding the argument that whole genome duplication allows for the evolution of more complex features within the gene pool. There is, however, just no evidence for this. The “multicellular yeast” experiments didn’t show any differentiation or specialization among the multiple cells; they were all identical. It turns out that genetic deletions (of three genes in particular) produced the same “multicellular” effect. So, again, this is based on a loss, not a gain, of original genetic information within the gene pool.

Two Coordinated Mutations Require > 100 Million Years:

In fact, as highlighted in the third discussion below, theoretical statistical research on how long it would take to realize a novel function that required just two coordinated genetic point mutations (non-beneficial unless both mutations were realized within the same organism) shows that this would require over 100 million years in a human population of 8 billion (Link – a figure later amended to 216 million years). This is because the odds of two specific coordinated mutations arising together by random chance are about 1 in ten million billion (10¹⁶). This, of course, means that, statistically, humans could not have evolved from apes. The required functional genetic gap distances for the millions of required genetic changes would simply have been too hard to cross in what anyone would consider to be a reasonable amount of time.

Rick Durrett & Deena Schmidt, Waiting for two mutations: with applications to regulatory sequence evolution and the limits of Darwinian evolution. Genetics. 2008 Nov;180(3):1501-9. (Link)
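A naive back-of-envelope version of the two-mutation calculation can be sketched as follows. This is only illustrative: the per-site mutation rate and generation time below are commonly cited human ballpark figures assumed for the arithmetic, not values taken from the paper, and the simple calculation ignores drift, selection, and fixation, which Durrett and Schmidt model far more carefully (hence their much longer estimate).

```python
# Illustrative sketch only: assumed round-number parameters, not values
# from Durrett & Schmidt (2008).
mu = 1e-8                      # assumed per-site, per-generation mutation rate
p_both = mu * mu               # chance of both specific point mutations in one lineage
print(f"P(two specific coordinated mutations) ~ {p_both:.0e}")

# Rough expected wait for the pair to first appear somewhere in a large
# population (ignoring drift, selection, and fixation).
pop = 8e9                      # population size used in the discussion above
gen_years = 25                 # assumed human generation time in years
expected_generations = 1 / (p_both * pop)
print(f"Expected wait: ~{expected_generations:.1e} generations "
      f"(~{expected_generations * gen_years:.1e} years)")
```

Even this crude calculation, which understates the difficulty relative to the paper's full treatment, already yields waits on the order of tens of millions of years.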

This paper was, ironically, written in an effort to discredit Michael Behe and his statistical arguments against the evolution of complex biomachines with a minimum required number of uniquely interacting parts.

They wrote that one of their aims was to “expose flaws in some of Michael Behe’s arguments concerning mathematical limits to Darwinian evolution.”

Behe’s response to Durrett and Schmidt is also interesting: Link, Link

Texas Sharpshooter Fallacy:

The best counterargument to the coordinated mutation problem is known as the Texas Sharpshooter Fallacy (Link). Laurence Moran (2016) presented a particularly interesting problem regarding the relatively rapid evolution of chloroquine resistance in malaria organisms (Plasmodium falciparum) (Link). At that time, it was thought that at least two separate mutations were required to achieve a selectable level of chloroquine resistance, where the initial mutation(s) were neutral with respect to function. Now, it is known that a selectable level of chloroquine resistance can be achieved with just a single point mutation (in different locations).

Chloroquine (CQ) resistance in Plasmodium falciparum malaria can be initiated by a single point mutation, specifically the K76T substitution (lysine to threonine at position 76) in the pfcrt gene. (Link)

The critical mutation conferring the first level of chloroquine resistance is found in aat1, a putative amino acid transporter. (Link)

Chloroquine resistance in Plasmodium falciparum is a complex, polygenic trait primarily driven by multiple mutations (typically 4–10) in the pfcrt (Plasmodium falciparum chloroquine resistance transporter) gene. While a single point mutation can initiate low-level transport activity, full resistance requires a specific, multi-step evolutionary pathway. (Link)


Of course, once even low-level beneficial functional activity is realized, further stepwise improvements based on individual point mutations will happen rapidly under heavy selection pressure.

So, even granting that at least two mutations are required to achieve a selectable level of functionality, in the relatively large population of malarial organisms on the planet (much larger than the number of humans on the planet – likely in the trillions or quadrillions, as a single infected person can host billions of parasites), the first mutation is likely already present in a number of the individual organisms in the population. That means, of course, that the realization of the second necessary mutation could occur in relatively short order, particularly given the rapid generation time (24–72 hours).
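The arithmetic behind this population-size point can be sketched as follows. All of the numbers here (parasite census size, mutation rate, and the standing frequency of the neutral first mutation) are assumed round figures chosen for illustration, not measured values from the discussion.

```python
# Illustrative sketch of the population-size argument (all parameters assumed).
mu = 1e-9                 # assumed per-site, per-generation mutation rate
pop = 1e12                # assumed global parasite population (trillions)
first_mut_freq = 1e-6     # assumed standing frequency of the neutral first mutation

carriers = pop * first_mut_freq     # parasites already one mutational step along
hits_per_gen = carriers * mu        # second mutations arising per generation
gens_to_wait = 1 / hits_per_gen     # expected generations until the pair coexists
years = gens_to_wait * 2 / 365      # assumed ~48-hour generation time

print(f"Carriers of the first mutation: ~{carriers:.0e}")
print(f"Expected wait: ~{gens_to_wait:.0f} generations (~{years:.1f} years)")
```

With a standing pool of a million one-step carriers and a two-day generation time, the expected wait for the second mutation drops to a handful of years rather than millions.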

In short, the argument of Larry Moran is that there are many different possible targets in sequence space that are potentially beneficial. In other words, hitting the same bullet hole isn’t so insurmountable if there are already a great many bullet holes in the target. Therefore, even though a lot of time may be required to hit one particular target, the odds that at least one target, among a great many options, will be hit in a short amount of time are actually very good. And this is true at very low levels of functional complexity. However, with each linear increase in the minimum size and/or specificity required at higher and higher levels of functional complexity, the ratio of non-beneficial to potentially beneficial targets within sequence space increases exponentially – and so does the average time to success (Link, Link). The evolution of chloroquine resistance in malaria is not a good example here, since a selectable level of functionality is based on only a single point mutation.
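The "many bullet holes" point can be put in numbers with a minimal sketch. The per-trial probability and trial count below are arbitrary illustrative values, not figures from Moran's argument; the point is only how the number of acceptable targets changes the odds.

```python
# Minimal sketch of the "many targets" effect (all parameters arbitrary).
def p_at_least_one(p_single, k_targets, n_trials):
    """P(at least one of k equally likely targets is hit in n trials)."""
    p_any = 1 - (1 - p_single) ** k_targets   # per-trial chance of hitting any target
    return 1 - (1 - p_any) ** n_trials

p = 1e-10       # assumed chance of hitting one specific target per trial
trials = 1e8    # assumed number of trials (e.g., mutant offspring)

print(f"One target:        {p_at_least_one(p, 1, trials):.3f}")
print(f"A million targets: {p_at_least_one(p, 1_000_000, trials):.3f}")
```

With a single target, success in this many trials is unlikely; with a million acceptable targets, at least one hit becomes a near certainty – which is exactly why the argument turns on how rapidly the number of viable targets shrinks at higher levels of functional complexity.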

Compare this to the gap distances in sequence space, comprised of numerous specific mutational changes, between each of the proposed stepping stones along the evolutionary pathway for complex multiprotein systems like the flagellar motility system. It is for this very reason that no such evolutionary success has ever been demonstrated in any wild population or under laboratory conditions, nor has an evolutionary pathway even been proposed that could plausibly be traversed in what anyone would consider a reasonable amount of time (Link).


Rice Evolving Cold Weather Resistance:

Another interesting observation is that the resistance to cold weather gained by rice in recent research isn’t based on genetic mutations, but on changes in epigenetic methylation rates. These changes draw on pre-programmed genetic options that already existed within the gene pool (Link).

YouTube Videos:

_________________

Dr. Sean Pitman is a pathologist, with subspecialties in anatomic, clinical, and hematopathology, currently working in N. California.