Monday, 24 December 2012
Abuse, Catholics and Richard Dawkins
Anyway, rather than dignifying his comments with a reply, I thought I would instead make you aware of an online rap he features in. Enjoy, and have a great Christmas!
Friday, 12 October 2012
Lydia McGrew on God, Faith and Evidence
Tuesday, 25 September 2012
The Bacterial Flagellum Revisited: A Paradigm of Design
Going back to my undergraduate days, I have long been struck by the engineering elegance and intrinsic beauty of that familiar icon of intelligent design, the bacterial flagellar nano-motor. In tribute to this masterpiece of design, I have just published a detailed literature review (31 pages, inclusive of references) in which I describe the processes underlying its self-assembly and operation.
My essay also attempts to evaluate the plausibility of such a system having evolved by natural selection. Here’s a short excerpt to whet your appetite.
The bacterial flagellum is a reversible, self-assembling, rotary nano-motor associated with the majority of swimming bacteria. There exist a number of different models of this rotary motor (Pallen and Matzke, 2006; Soutourina and Bertin, 2003). Flagella are produced by a very tightly regulated assembly pathway (Chevance and Hughes, 2008; Macnab, 2003; Aldridge and Hughes, 2002), and the archetypical system for understanding flagellar assembly belongs to Salmonella enterica serovar Typhimurium, a rod-shaped, gram-negative bacterium of the family Enterobacteriaceae.
Flagella receive feedback from the environment by virtue of an elegant signal transduction circuit and can adjust their course in response to external stimuli by a mechanism known as chemotaxis (Baker et al., 2006; Bourret and Stock, 2002; Bren and Eisenbach, 2000). The most extensively studied chemotaxis system belongs to Escherichia coli.
By itself, the rotor is able to turn at a speed between 6,000 and 17,000 rotations per minute (rpm), but it normally only achieves a speed of 200 to 1,000 rpm when the flagellar filament (that is, the propeller) is attached. Its forward and reverse gears allow the motor to reverse direction within a quarter turn.
The bacterial flagellum, which has been described as a “nanotechnological marvel” (Berg, 2003), has long been championed as an icon of the modern intelligent design movement and the flagship example of “irreducible complexity” (Behe, 1996). But even biologists outside of this community have been struck by the motor’s engineering elegance and intrinsic beauty. As one writer put it, “Since the flagellum is so well designed and beautifully constructed by an ordered assembly pathway, even I, who am not a creationist, get an awe-inspiring feeling from its ‘divine’ beauty” (Aizawa, 2009).
The mechanistic basis of flagellar assembly is so breathtakingly elegant and mesmerizing that the sheer engineering brilliance of the flagellar motor — and, indeed, the magnitude of the challenge it poses to Darwinism — cannot be properly appreciated without, at minimum, a cursory knowledge of its underlying operations. The purpose of this essay is to review these intricate processes and to evaluate the plausibility of such a system evolving by natural selection. Click here to continue reading.
Thursday, 13 September 2012
On the Origin of Protein Folds
Click here to continue reading>>>
Thursday, 6 September 2012
Latest ENCODE Research Validates ID Predictions On Non-Coding Repertoire

Readers will likely recall the ENCODE project, published in a series of papers in 2007, in which (among other interesting findings) it was discovered that, even though the vast majority of our DNA does not code for proteins, the human genome is nonetheless pervasively transcribed into RNA. The science media and blogosphere are now abuzz with the latest published research from the ENCODE project, the most recent blow to the “junk DNA” paradigm. Since a largely non-functional genome (as has been claimed by many, including notably Larry Moran, P.Z. Myers, Nick Matzke, Jerry Coyne, Kenneth Miller and Richard Dawkins) would be surprising on the hypothesis of design, ID proponents have long predicted that function will be identified for much of the DNA that was once considered useless. In a spectacular vindication of this prediction, six papers have been released in Nature, in addition to a further 24 papers in Genome Research and Genome Biology, plus six review articles in The Journal of Biological Chemistry.
The lead publication of the findings (“An Integrated Encyclopaedia of DNA Elements in the Human Genome”) was released in Nature. The abstract reports,
“The human genome encodes the blueprint of life, but the function of the vast majority of its nearly three billion bases is unknown. The Encyclopedia of DNA Elements (ENCODE) project has systematically mapped regions of transcription, transcription factor association, chromatin structure and histone modification. These data enabled us to assign biochemical functions for 80% of the genome, in particular outside of the well-studied protein-coding regions. Many discovered candidate regulatory elements are physically associated with one another and with expressed genes, providing new insights into the mechanisms of gene regulation. The newly identified elements also show a statistical correspondence to sequence variants linked to human disease, and can thereby guide interpretation of this variation. Overall, the project provides new insights into the organization and regulation of our genes and genome, and is an expansive resource of functional annotations for biomedical research.” [emphasis added]
They further report that,
“[E]ven using the most conservative estimates, the fraction of bases likely to be involved in direct gene regulation, even though incomplete, is significantly higher than that ascribed to protein-coding exons (1.2%), raising the possibility that more information in the human genome may be important for gene regulation than for biochemical function. Many of the regulatory elements are not constrained across mammalian evolution, which so far has been one of the most reliable indications of an important biochemical event for the organism. Thus, our data provide orthologous indicators for suggesting possible functional elements.”
As this Nature press release states,
“Collectively, the papers describe 1,640 data sets generated across 147 different cell types. Among the many important results there is one that stands out above them all: more than 80% of the human genome’s components have now been assigned at least one biochemical function.” [emphasis added]
The UK Guardian also covered the story, noting that
“For years, the vast stretches of DNA between our 20,000 or so protein-coding genes – more than 98% of the genetic sequence inside each of our cells – was written off as “junk” DNA. Already falling out of favour in recent years, this concept will now, with Encode’s work, be consigned to the history books.” [emphasis added]
This new research places a dagger through the heart of the junk DNA paradigm, and it should give adherents of this outdated assumption further cause for caution before they write off as “junk” any DNA for which a function has yet to be identified. Be sure to also check out Casey Luskin’s coverage of the findings at ENV.
Monday, 27 August 2012
Are these atheists and agnostics really covert creationists?
We’ve all heard it before. Time and time again, the somewhat tiresome and predictable Darwinian propagandists, in a fit of florid indignation, assert that intelligent design will be the death of science and the dawn of theocracy. The vacuous claim that ID is nothing more than warmed-over creationism has been thoroughly addressed by ID proponents, yet it continues to be thrown around.
Over at Evolution News And Views, John West recently highlighted the upcoming release of Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature is Almost Certainly False (Oxford University Press, 2012), a new book by New York University’s atheist philosopher Thomas Nagel. Those immersed in the debate over ID and Darwinism will be familiar with Nagel’s open scepticism towards neo-Darwinian theory and his sympathetic attitude towards ID theory. Though Nagel does not accept ID, he goes as far as to say that it has much merit and that it is science. Good on him! Nagel’s views on this issue can be found in his 2008 Philosophy & Public Affairs article, Public Education and Intelligent Design. It will be good to read more about his views as they are further expressed in his new book Mind and Cosmos. West includes a couple of delicious quotes from chapter 1 of Nagel’s book:
Saturday, 4 August 2012
Thursday, 24 May 2012
Dolphins and Porpoises and...Bats? Oh My! Evolution's Convergence Problem
I have recently been reading George McGhee's Convergent Evolution: Limited Forms Most Beautiful. McGhee's book is a gripping read, and it favorably cites the work of both Michael Denton and Douglas Axe, ID-friendly scientists well known to readers of ENV. The book documents a multitude of cases of convergent evolution (homoplasy), the phenomenon of repeated evolution. When similarity is thought to have arisen by means of common ancestry, the features in question are said to be "homologous." When similarity is thought to have arisen by means other than common ancestry, the features are said to be "analogous."
If you take such similarity as pointing to common descent, then you would expect to see it exhibiting a nested hierarchical distribution, the more seamless the better. In other words, the patterns of distribution of this similarity ought to mutually corroborate a single family tree. Sure, there might be occasional deviations from that tree, the results of phenomena such as incomplete lineage sorting. One would not expect to see the pervasive occurrence of a high degree of similarity -- what would normally be regarded as "homology" -- that decidedly cannot be accounted for within the framework of common descent. Yet that is in fact what we do observe.
Tuesday, 13 March 2012
Saturday, 25 February 2012
The Argument From Cosmic Fine Tuning

Fundamental Constants
The ripples left over from the original ‘Big Bang’ singularity (the temperature variations in the cosmic microwave background radiation, or CMB) are detectable at a level of about one part in 10^5 (100,000). If this factor were even slightly smaller, the cosmos would exist exclusively as a diffuse collection of gas -- stars, planets and galaxies would not exist. Conversely, if this factor were increased slightly, the universe would consist only of large black holes. Either way, the universe would be uninhabitable.
Another finely tuned value is the strong nuclear force that holds atomic nuclei -- and therefore matter -- together. The sun derives its ‘fuel’ from fusing hydrogen nuclei together. When hydrogen is fused into helium, 0.7% of the mass of the hydrogen is converted into energy. If the amount of matter converted were slightly smaller -- say, 0.6% instead of 0.7% -- a proton would not be able to bond to a neutron and the universe would consist only of hydrogen. Without the presence of heavy elements, planets would not form and hence no life would be possible. Conversely, if the amount of matter converted were increased to 0.8% instead of 0.7%, fusion would occur so rapidly that no hydrogen would remain. Again, the result would be no planets, no solar systems and hence no life.
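As a rough back-of-the-envelope illustration of where that 0.7% figure comes from (my own calculation from standard atomic masses, not a figure taken from the sources above): fusing four hydrogen nuclei into one helium-4 nucleus gives, approximately,

$$\Delta m = 4m_{\mathrm{H}} - m_{\mathrm{He}} \approx 4(1.0078\ \mathrm{u}) - 4.0026\ \mathrm{u} \approx 0.029\ \mathrm{u}, \qquad \frac{\Delta m}{4m_{\mathrm{H}}} \approx 0.007 = 0.7\%,$$

$$E = \Delta m\,c^{2} \approx 0.029\ \mathrm{u} \times 931.5\ \mathrm{MeV/u} \approx 27\ \mathrm{MeV}\ \text{per helium nucleus formed.}$$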
The ratio of electrons to protons must be finely balanced to a degree of one part in 10^37. If this ratio deviated by any more than this, electromagnetism would dominate gravity -- preventing the formation of galaxies, stars and planets. Again, life would not be possible.
The ratio of the electromagnetic force to gravity must be finely balanced to a degree of one part in 10^40. If this value were to be increased slightly, all stars would be at least 40% more massive than our Sun. This would mean that stellar burning would be too brief and too uneven to support complex life. If this value were to be decreased slightly, all stars would be at least 20% less massive than the sun. This would render them incapable of producing heavy elements.
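For readers who want to see where a number of this order of magnitude comes from, one common way of expressing the strength of electromagnetism relative to gravity (an illustrative calculation of my own, not drawn from the sources above) is the ratio of the Coulomb attraction to the gravitational attraction between a proton and an electron, which is independent of their separation:

$$\frac{F_{\mathrm{electric}}}{F_{\mathrm{gravity}}} = \frac{e^{2}}{4\pi\varepsilon_{0}\,G\,m_{p}m_{e}} \approx \frac{(1.6\times10^{-19})^{2}}{4\pi(8.85\times10^{-12})(6.67\times10^{-11})(1.67\times10^{-27})(9.11\times10^{-31})} \approx 2\times10^{39},$$

that is, somewhere in the range of 10^39 to 10^40, depending on which pair of particles one compares.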
The rate at which the universe expands must be finely tuned to one part in 10^55. If the universe expanded too fast, matter would disperse too quickly for stars, planets and galaxies to form. If it expanded too slowly, the universe would quickly collapse back on itself -- before any stars could form.
The mass density of the universe is finely balanced to permit life to a degree of one part in 10^59. If the universe were slightly more massive, an overabundance of deuterium from the Big Bang would cause stars to burn too rapidly for the formation of complex life. If the universe were slightly less massive, an insufficiency of helium would result in a shortage of the heavy elements -- again, resulting in no life.
Sunday, 12 February 2012
Tuesday, 7 February 2012
Friday, 3 February 2012
Wednesday, 1 February 2012
Monday, 30 January 2012
Sunday, 29 January 2012
Tuesday, 24 January 2012
The Finely Tuned Genetic Code
Francis Crick regarded the genetic code found in nature as a "frozen accident." Yet it is looking more and more to be the case that this code is exquisitely finely tuned -- with features suggesting it is indeed one in a million. Ought not purposive or intelligent design therefore be regarded as a legitimate inference -- the best explanation for how the code came into existence?
We are all familiar with the genetic code by virtue of which an mRNA transcript is translated into the amino acid residues that form proteins. Triplets of nucleotides -- called "codons" -- serve as "molecular words," each of them specifying a particular amino acid or the stop sites of open reading frames (ORFs). Ribosomes and tRNA-methionine complexes (called "charged" methionyl tRNAs) attach near the 5' end of the mRNA molecule at the initiation codon AUG (which specifies the amino acid methionine) and begin to translate its ribonucleotide sequence into the specific amino acid sequence necessary to form a functional protein. Each amino acid becomes attached at its carboxyl terminus to the 3' end of its own species of tRNA by an enzyme known as an aminoacyl-tRNA synthetase.
Two sites exist on a ribosome for activated tRNAs: the peptidyl site and the aminoacyl site (P site and A site respectively). The initiator tRNA, carrying methionine, enters the P site, where its 3' UAC 5' anticodon pairs with the complementary 5' AUG 3' initiation codon on the mRNA. The second charged tRNA enters the A site. An enzymatic part of the ribosome called peptidyl transferase then creates a peptide bond to link the two amino acids. Upon formation of the peptide bond, the aminoacyl bond that connected the first amino acid to its tRNA is broken, and that tRNA is thus able to leave the P site. This is followed by ribosomal translocation, which positions a new open codon in the empty A site and also moves the second tRNA -- which is now bonded to a dipeptide -- from the A to the P site. And so the cycle repeats until the occurrence of a stop codon, which halts further chain elongation.
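For readers who think more naturally in code, here is a deliberately simplified sketch of the two-site elongation cycle just described. It is my own toy illustration (the function names and the stand-in codon table are invented for the example, and all of the chemistry and kinetics are ignored); it simply tracks which charged tRNA occupies the P site and the A site as the ribosome moves codon by codon until a stop codon is reached.

```python
# Toy illustration of the ribosome's two-site elongation cycle (not a simulation).
STOP_CODONS = {"UAA", "UAG", "UGA"}

def elongation_cycle(codons, charged_trna_for):
    """codons: mRNA codons beginning with the initiation codon 'AUG'.
    charged_trna_for: callable returning the amino acid delivered by the
    charged tRNA whose anticodon pairs with a given codon (a stand-in for
    the aminoacyl-tRNA synthetases and codon-anticodon pairing)."""
    p_site = charged_trna_for(codons[0])    # initiator Met-tRNA occupies the P site
    peptide = [p_site]
    for codon in codons[1:]:
        if codon in STOP_CODONS:            # no tRNA pairs; the chain is released
            break
        a_site = charged_trna_for(codon)    # next charged tRNA enters the A site
        peptide.append(a_site)              # peptidyl transferase links the residues
        p_site = a_site                     # translocation: A-site tRNA shifts to the P site
    return peptide

# Example with a two-entry stand-in codon table:
toy_code = {"AUG": "Met", "GCU": "Ala"}
print(elongation_cycle(["AUG", "GCU", "UAA"], toy_code.get))   # ['Met', 'Ala']
```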
For a visual illustration of how this works in practice, I refer readers to the following short animation:
The total number of possible RNA triplets amounts to 64 different codons. Of those, 61 specify amino acids, with the remaining three (UAG, UAA and UGA) serving as stop codons, which halt the process of protein synthesis. Because there are only twenty different amino acids, some of the codons are redundant, meaning that several codons can code for the same amino acid. The cellular pathways and mechanisms that make this 64-to-20 mapping possible are a marvel of molecular logic -- enough to make any engineer drool. But the signs of design extend well beyond the sheer engineering brilliance of the cellular translation apparatus. In this article, I will show several layers of design ingenuity exhibited by this masterpiece of nanotechnology.
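To make the 64-to-20 mapping concrete, here is a short, self-contained sketch that builds the standard codon table from the usual compact representation, translates a toy mRNA string, and counts how many codons map to each amino acid. The translate function and the example sequence are my own illustrations rather than anything from the article; the table itself is the standard genetic code.

```python
from itertools import product
from collections import Counter

BASES = "UCAG"
# Standard genetic code, codons enumerated with the first base varying slowest
# ('*' marks the three stop codons UAA, UAG and UGA).
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"

CODON_TABLE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AMINO_ACIDS)}

def translate(mrna):
    """Translate an mRNA string from its first AUG to the first stop codon."""
    start = mrna.find("AUG")
    peptide = []
    for i in range(start, len(mrna) - 2, 3):
        aa = CODON_TABLE[mrna[i:i + 3]]
        if aa == "*":                      # stop codon: terminate the chain
            break
        peptide.append(aa)
    return "".join(peptide)

print(translate("GGAUGGCUUUUAGCUGACC"))    # -> 'MAFS' (Met-Ala-Phe-Ser)
# Degeneracy of the code: how many of the 61 sense codons specify each amino acid.
degeneracy = Counter(aa for aa in CODON_TABLE.values() if aa != "*")
print(sorted(degeneracy.items()))          # e.g. ('L', 6), ('M', 1), ('W', 1), ...
```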