The Religion Guy observes that with “fake news” all over the news it’s wise to be aware of fake history.
Consider Dan Brown’s influential pop novel “The Da Vinci Code.” Though the plot is fiction, readers may assume the book provides reliable historical background. Experts say that’s misleading, and one example is Brown’s version of how the New Testament came to be.
That’s a timely topic due to an important new technical work on the subject, “The Biblical Canon Lists from Early Christianity” by Edmon Gallagher and John Meade. “Canon” refers to recognized scriptures. Oxford University Press published this collection of ancient texts and analysis in Britain this month, with U.S. release due in January.
(The following relies on “The Formation of the New Testament” (1965) by Robert M. Grant of the University of Chicago, and “The Canon of the New Testament” (1987) by Bruce M. Metzger of Princeton Theological Seminary, and covers only the New Testament, not the canon of the Hebrew Bible a.k.a. Old Testament.)
Brown is correct that many texts about Jesus were circulating during Christianity’s first few centuries, so decisions had to be made about which were authentic and recognized as scripture. Many Christians don’t realize how complex that process was.
By Brown’s account, the Roman Emperor Constantine was the power-broker who picked only Matthew, Mark, Luke and John out of some 80 Gospels in contention. Actually, what Constantine did in A.D. 331 was commission Bishop Eusebius to have copyists produce 50 new copies of the Greek canon to replace Scriptures that had been destroyed during Rome’s previous anti-Christian purge. The count of 80 is exaggerated, and most rejected writings did not resemble the genre of the favored four, which were chosen not by the emperor but by church authorities.