Thursday 12 January 2012

Has The New Testament Been Substantially Edited Since It Was First Penned?

This is one of the most frequently asked questions I encounter when explaining to people the substantial evidence that stands in support and corroboration of the Christian worldview. I actually receive this question more often from the Muslim community than from atheists, agnostics and other non-believers. The reason for this is not hard to see. The Qur’an not only claims that the Gospels and the Torah are revealed Scripture, but also claims that the Qur’an and the Bible are consistent with one another (e.g. Surah 2:75; Surah 5:49; Surah 32:23; Surah 17:55). Yet even a cursory reading of these books quickly reveals that this is not the case. And it is not merely on minor details that the Qur’an and the Bible differ, but on many of their central assertions. For example, the Qur’an (Surah 4:157) explicitly denies that Jesus was even crucified, though the crucifixion is the core doctrine of the Christian faith. The Qur’an also explicitly denies the Triune character of God (Surah 4:171), though there is no evidence that Muhammad understood what this doctrine was, and it is substantially misrepresented in the pages of the Qur’an (e.g. Surah 2:116, 5:72-76, 5:119, 6:101, 19:35). The Qur’an also teaches that Muhammad is predicted in the Christian and Jewish Scriptures (Surah 61:6), though Muhammad is nowhere to be found in any of the Christian Scriptures.

Modern Muslims attempt to evade this dilemma by arguing that the Christian Scriptures were consistent with the Qur’an in their original form, but have been substantially edited and rewritten in the centuries since. This argument fails for at least two reasons. First, the Qur’an refers Christians and Jews back to their own Scriptures for confirmation of the message revealed by Islam; there is no suggestion in the Qur’an or the Hadith literature, at least to my knowledge, that the Bible was regarded as having been edited in such a radical fashion. Second, this process of editing cannot have taken place after the Qur’an was written: we possess entire copies of the New Testament, such as Codex Sinaiticus, which predate the Qur’an by hundreds of years. But what about before?

This claim is made not only by the modern Islamic community, but also by liberal New Testament scholars such as Bart Ehrman (as outlined in Misquoting Jesus, among other books and publications). For a thorough critique of Ehrman’s views, I refer readers to the debate between Dr. Bart Ehrman and Dr. James White (link here) and to Timothy Paul Jones’ book, Misquoting Truth – A Guide to the Fallacies of Bart Ehrman.

The curious thing about Bart Ehrman is that the views he articulates in his popular-level work are not the same as those he espouses in his scholarly publications. Indeed, readers may find the following telling quotation, taken from the appendix (p. 252) of Misquoting Jesus, of interest:

“Bruce Metzger is one of the great scholars of modern times, and I dedicated the book to him because he was both my inspiration for going into textual criticism and the person who trained me in the field. I have nothing but respect and admiration for him. And even though we may disagree on important religious questions - he is a firmly committed Christian and I am not - we are in complete agreement on a number of very important historical and textual questions. If he and I were put in a room and asked to hammer out a consensus statement on what we think the original text of the New Testament probably looked like, there would be very few points of disagreement - maybe one or two dozen places out of many thousands. The position I argue for in ‘Misquoting Jesus’ does not actually stand at odds with Prof. Metzger’s position that the essential Christian beliefs are not affected by textual variants in the manuscript tradition of the New Testament.” [Emphasis added]

When compared with other great works of antiquity, the New Testament fares very well indeed as far as the manuscript tradition and textual variation are concerned. We now have more than 5,800 Greek manuscripts of the New Testament for comparison with our present New Testament, and most of the New Testament can be reconstructed from quotations by the early church fathers alone.

It is the application of double standards, specifically the subjection of the New Testament to criteria that would never be applied to such an extent to any other ancient writing, that is the sure sign of a failed argument. The Jewish transmission of sacred traditions was highly developed and reliable. Moreover, there has never been an Uthman-like figure controlling manuscript transmission, as there was in Islam: the Christian church was a persecuted minority until the fourth century, with no power to enforce a uniform textual transmission. Claims that doctrines were added or manipulated, theology altered, or key teachings removed are therefore not viable.

Then we have multiple lines of transmission. The earliest manuscripts we possess demonstrate the existence not of a single line of corrupt transmission, but of multiple lines of transmission with varying levels of accuracy. Multiple lines of transmission defy the possibility of control by any central editing process. Since the extant manuscripts show multiple independent lines of transmission, the burden of proof lies with the skeptic who asserts corruption of the primitive New Testament text: he must explain how these independent lines could each present substantially the same text without any controlling authority to enforce it.

It is not Bart Ehrman’s facts that I take issue with, but the conclusions he draws from his (usually correct) facts. For example, he asserts that there are more textual variants among our various copies of the New Testament than there are words in the New Testament. As a statement of fact, this is true. It is the conclusions drawn from this correct statement that are so misleading; I might even go so far as to say deliberately deceptive.

As noted, there are approximately 5,800 handwritten Greek New Testament manuscripts, and there are approximately 400,000 textual variants among those manuscripts. But the vast majority of these variants are utterly irrelevant to the proper understanding and translation of the text.

Obviously, the more manuscripts you have, the more variants you are going to have. If you have only a small number of manuscripts, you have fewer variants, and correspondingly less certainty about the original text.

Having manuscripts from different areas and different times, from different lines of transmission, yet all testifying to the same text, is solid evidence that you have the document in more or less its original form. The New Testament has far more manuscripts than any other work of antiquity: approximately 1.3 million pages of handwritten text. And when you consider that most of the differences come down to such things as whether the name John is spelt with one nu or two, the number of meaningful textual variants in the New Testament is much, much smaller.

On top of that, when you factor in viability, that is, whether a variant has any realistic chance of being original, the situation changes even more dramatically.

This does not prove that Christianity is true, of course. One can have a perfect copy of a lie. But it does serve to refute a popular objection to the Christian faith, coming from multiple groups, including Islam and the New Atheism.


  1. "...the fact is that the vast majority of these variants are utterly irrelevant to the proper understanding and translation of the text."

    Can you either explain this statement or provide a link that explains it? 400,000 is a large number, it seems like the details of how you minimize it are needed here.

1. This was a short article about a discipline known as "textual criticism" (not "critical" in the negative sense, but in the sense of careful analysis). There are numerous ways that fallible humans can inadvertently corrupt a text in copying. Here are just a few:

(1) Letters and words that sound alike. If a copyist is not a student of the language and is listening to a reader, he may use a "k" for a "q," for instance, or write "here" for "hear" without thinking.

(2) A copyist might not even know the language, and thus make a letter-by-letter copy. In the process, he might form letters wrongly: a "u" becomes a "v" (a "v" being an approximation of the Greek letter nu), or a tau ("t") might replace a pi.

(3) A well-educated copyist might actually copy from memory instead of listening to a reader. He could leave out a word, spell it differently, or commit any number of other mental lapses.

(4) If a copyist has the luxury, or the responsibility, of copying a manuscript on his own, he might make errors due to mental fatigue or poor lighting, or carelessly skip a line, thus leaving out words or repeating a phrase.

I have seen examples of these kinds of errors, and have made some of them myself when copying Greek or Hebrew into a notebook. The errors that abound in the copies do not undermine our ability to recover the original text, because the mistakes are small and obvious.

