Friday, November 23, 2007
Why Darwinists Hate Information Theory
The title is a play on John 1:1, "In the beginning was the Word..." As Dr. Gitt points out: a word is information. The two terms are interchangeable in that a "word" strictly
defined is a unit of information. He further points out that modern physics recognizes
three constituents, which make up the Universe. These are Matter, Energy, and Information. Information is the organization of the matter and energy in the Universe. Without organization, the Universe would be totally chaotic, and in a totally chaotic environment, life would not be possible.
The organization in the universe is produced by the Natural Laws, which govern it. It is
agreed by all physicists that natural laws cannot be proven. They are unlike mathematical theorems and must be discovered and tested empirically. The origin of natural laws is unknown. Natural laws are the carriers of the information, which organizes the Universe. Gitt points out that the origin of the Universe cannot be explained in terms of these natural laws because they do not preexist the creation of the Universe. In the original creation event, "The Big Bang," all the information that governs the physics of the Universe today had to be in place already. The Big Bang was like a giant firework; it had to be packed just right before the “Bang” in order to produce all the stars, galaxies, and clusters of galaxies seen today.
This initial information can only have come from one of two sources. Either it is an inherent property of matter, or its origin had to come from an outside intelligent source. For the past forty years, some theorists have tried to find a way to ascribe this information’s origin to matter. Their efforts resulted in such conjectures as a Fourth Law of Thermodynamics, Chaos Theory, emergent intelligence from computational processes, and other proposals. These ideas have all proved utter failures in this arena because information is not a property of matter; it is a basic independent constituent of the Universe.
Dr. Gitt demonstrates that information is also the central characteristic of all
living systems. All biological processes are completely information dependent. Whether it is making a tool, collecting pollen, or digesting a sugar molecule, tremendous amounts of information and information-processing power are required. The human body processes about 3 × 10^24 bits of information every day. All the libraries on the planet hold only about 10^18 bits, roughly one three-millionth of the information the human body processes each day. In terms of information processing, the cells of the human body can legitimately be thought of as supreme supercomputers.
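A quick back-of-the-envelope check of that comparison, using only the figures cited above (both are Gitt's numbers, not independently verified here), can be done in a few lines of Python:

    # Check of the ratio quoted above, using the post's own figures:
    # 3 x 10^24 bits processed per day vs. roughly 10^18 bits in all libraries.
    body_bits_per_day = 3e24   # figure cited from Gitt, not independently verified
    library_bits = 1e18        # figure cited from Gitt, not independently verified

    ratio = body_bits_per_day / library_bits
    print(f"The body processes about {ratio:,.0f} library-loads of bits per day")
    # -> 3,000,000, i.e. the libraries hold about one three-millionth of one day's processing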
This super computational capacity enables the body to manufacture at least 50,000 different proteins plus tens of thousands of RNAs, micro-RNAs, DNAs, enzymes, antibodies, antigens, and other products needed to run the body. The DNA molecule is the primary storage medium for the information needed to manage all of this activity. A DNA molecule is a strand of nucleotides that is only about two millionths of an inch thick, and it has achieved the theoretical limit of miniaturization for information storage capacity. Dr. Gitt finds it no accident that the DNA molecule, with the genetic code it carries, is an optimized information carrier.
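To make the idea of DNA as a storage medium concrete, here is a toy sketch of the commonly cited "two bits per nucleotide" capacity. The bit-to-base mapping below is purely illustrative, an assumption for the example; it is not the genetic code itself, which maps three-base codons to amino acids.

    # Toy illustration only: each of the four bases (A, C, G, T) can stand for
    # two bits, which is where the "2 bits per nucleotide" storage figure comes from.
    BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BITS_FOR_BASE = {b: k for k, b in BASE_FOR_BITS.items()}

    def encode(bits: str) -> str:
        """Pack a bit string (even length) into a strand of bases."""
        return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

    def decode(strand: str) -> str:
        """Recover the original bit string from a strand of bases."""
        return "".join(BITS_FOR_BASE[base] for base in strand)

    message = "0100100001101001"      # 16 bits: the ASCII codes for "Hi"
    strand = encode(message)          # -> "CAGACGGC", 8 bases carrying 16 bits
    assert decode(strand) == message
    print(strand)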
This is another fact that defies Darwinian expectations. If the information stored in the genetic code arose as the result of random processes, why would the system have achieved optimization? This optimization of information storage is so efficient that it allows the DNA molecule, during cell division, to transfer all of the information it carries to another DNA molecule in as little as 20 minutes. Imagine trying to download a large library onto your home computer in 20 minutes. The cells of your body do this all day long.
These, however, are all minor issues compared to the largest informational conundrum: the origin of biological information. DNA produces RNA; RNA in turn makes proteins and enzymes; these proteins and enzymes in turn control the synthesis of DNA. How could such an information-dependent cycle possibly originate naturally? According to Dr. Gitt, it cannot. A cycle where A makes B, B makes C, and C makes A cannot bootstrap itself into existence because of the mutually dependent and self-referential nature of the process. The type of information needed to initiate such an internally dependent cycle can only come from an intelligent source.
Why This Is Important
Darwinists do not like to discuss information theory because they have no adequate answers for the profound mysteries uncovered in this area. This is another dirty little secret that the media diligently ignores. Ever since the 1960s when Darwinists ran smack into the immovable obstacle of Information Theory and Technology, they have been avoiding the subject assiduously in the hope that some new piece of evidence will emerge that allows them to disprove the math. This has not happened. As a defensive measure, books like “In The Beginning Was Information” can only be ignored. It is no accident that Dr. Gitt waited until he had retired before he published this work.
Friday, October 26, 2007
Scientists Meet To Discuss God And Science
Among the exciting topics discussed were those presented in the Keynote address by Alister McGrath. Dr. McGrath is a Molecular Biophysicist and theologian who holds the post of Professor of Historical and Systematic Theology at Oxford University. He cited several areas that represent the “New Frontiers in Science and Faith.” Among these are the renewed interest in Anthropic Phenomenon and the return to Natural Theology. The Anthropic Phenomenon (or Principle) explores the growing set of scientific observations that point to the fact that the Universe has been fine tuned (designed) to accommodate intelligent life. Natural Theology is the historical Christian approach to understanding nature and is the foundation of modern science. The return to Natural Theology has reinvigorated Christian scientists’ interest in cognitive research (where does our consciousness come from?), the origins of life, quantum physics, biology, and other areas of scientific study.
Dr. McGrath also pointed out that this new interest in the theological implications of recent scientific discoveries has contributed to the appearance of a new militant atheism that is trying to use science to evangelize its belief system.
In his lecture “Space, Time and Eternity,” Sir John Polkinghorne, FRS and former professor of Mathematical Physics at Cambridge University, discussed recent attempts by atheists to explain away the nearly undisputed fact that the universe is fine tuned for human life. In this particular instance, the atheists’ denial of God’s creation of the Universe is founded upon the claim of a Multiverse. The Multiverse theory postulates that our Universe is only one of an infinite or nearly infinite number of other Universes. With enough Universes, all with different natural laws operating in them, you will eventually get one like ours, which is perfect for human life. No designer needed! Unfortunately for the Multiverse advocates, these Universes are now and may always be undetectable; however, we are asked to believe on faith alone in their existence as an alternative to believing that the one Universe we know of and live in was created by God as a habitation for life.
Polkinghorne pointed out that this is "an idea of quite incredible ontological prodigality" that would make William of Ockham roll over in his grave.
Another presenter, Dr. Jan Centrella, Chief of NASA's Gravitational Astrophysics Laboratory at the Goddard Space Flight Center, discussed her groundbreaking work on gravitational waves and their implications for testing the General Theory of Relativity. She also discussed her faith journey.
Dr. Conway Morris, FRS, Professor of Paleobiology at Cambridge University, discussed the fact that recent work on "evolutionary convergence" brings into question the Primary Axiom of Darwinism. The Primary Axiom is the idea that organisms are the result of random natural processes. Convergence research is showing that the necessary building blocks for intelligence appeared billions of years before the emergence of intelligence, indicating very advanced and highly sophisticated planning and design at work rather than random chance.
Why This is Important
The misuse and misinterpretation of scientific discoveries have been among the most important tools used to undermine Christian faith. These are still seen today in the intellectually shallow and ham-fisted attempts by Richard Dawkins and other evangelists of atheism to destroy Christian faith through the abuse of science. Unfortunately, many are fooled by this approach, especially those who do not have the time or expertise to investigate the accuracy of these claims.
It is important that first-rate scientists in each field counter these claims. It is important that the public realize that they are not called to a blind faith that denies the facts of the real world but rather to a Faith grounded in reality.
Source: Newsletter of the American Scientific Affiliation, Sep/Oct 2007, Vol. 49, No. 5, pp. 1-7.
Friday, October 12, 2007
Another Icon Of Darwinism Is Falling
The appendix, a little worm-shaped organ that sits at a major bend in the large intestine, seemed to be doing nothing except producing trouble by occasionally getting inflamed and causing appendicitis. However, new research published in the Journal of Theoretical Biology describes the discovery of a vital function that the appendix performs. These researchers found that the appendix provides a backup system, which has been important to the long-term survival of the human race. It seems that the appendix acts as a reservoir and shelter for the digestive bacteria that populate the human gut. These are the bacteria, such as E. coli, that help break down food, including complex polysaccharides, into molecules that can be absorbed by the body's cells.
A reservoir of these "good bacteria" is needed because certain diseases, such as cholera, and many types of food poisoning wipe out these bacteria. It is very difficult for these bacteria to repopulate the gut after such an infection without a healthy colony to act as a starter. This is where the appendix comes in. It preserves a supply of these bacteria which, after the friendly bacteria in the gut have been eradicated, are released back into the intestine, allowing the digestive system to come back online quickly, resume digesting food, and enable the victim to recover. Without an appendix, some common maladies would be fatal more often.
Oh yes, regarding “Junk DNA,” those sections of DNA which do not code for protein production, on August 25th this Blog published an article on new research done by the Encode Project showing that at least 50% of the so-called “Junk DNA” is not “Junk” at all, but indeed has numerous vital functions. As research continues on “Junk” DNA, more functions are being discovered for it. It seems that ultimately all of this “junk” will be found to have a vital purpose.
New research shows that these non-protein-coding DNAs are playing the pivotal role of “conductor” in regulating the cell’s activities. They produce huge numbers of micro-RNAs, which regulate gene expression and thereby “conduct” the activities of the coding sections of the DNA molecules. This is no small thing; these micro-RNAs are like little mid-level managers, accountants, and quality control agents in a factory. They are responsible for making sure that the right proteins get to the right place at the right time and much more.
This new, vast level of cellular complexity has been labeled the “Conductome.”
Why This Is Important
The existence of currently useless organs (if such actually exist) does not prove that God did not design them. It arguably could show that God was slow to remove them or, truth be told, that we have simply failed to understand their purposes. However, the existence of vestigial organs is necessary to the Darwinian claim of evolution by purely natural processes. If an organ is no longer needed, random mutations should begin to destroy it through the accumulation of small changes which do not affect the overall fitness of the organism.
These two icons of Darwinism, the appendix and junk DNA, have been so widely touted in the literature that it will probably take decades before they are expunged. However, the Darwinists will soon advance new examples in their effort to discredit God’s role in creation. After all, in their view, Darwinism is unfalsifiable; so the evidence against it does not shake them.
Sources:
1. “Appendix Isn’t Useless After All” Global Health Vision, Richard Merritt, Duke University Medical Center.
2. “The RNA Conductome” The-Scientist, Vol. 21, Issue 10, p.55.
Friday, October 5, 2007
Cornell University Geneticist Rejects Darwinism
“Genetic Entropy” is a very difficult book to review in a short Blog. Therefore, I strongly recommend that you read it. Despite its title and technical details, the book is understandable and quite interesting.
Dr. Sanford begins by pointing out that the terms "Natural Selection" and "Survival of the Fittest" have become a kind of "magic wand" for Darwinists. The terms are thrown at biological problems like an "Easy Button" that automatically explains everything. For example, why do giraffes have long necks? Easy, the longer the neck, the better to get to high leaves – therefore giraffes with longer necks have a better chance of surviving and reproducing. This ignores the fact that tens of thousands of mutually interdependent changes are required to lengthen the animal's neck. Just for starters: additional vertebrae are needed along with an enlarged body frame, lengthened and strengthened bones and muscles, a greatly enlarged heart and, oh yes, a unique new organ similar to a spleen located next to the brain that acts as a backup blood reservoir to keep the giraffe from passing out when it raises its head to 20 feet above the ground.
This mode of magic-like explanation has even been extended to the persistence of religion – why does every culture have religion? Easy, religion “even though just a fantasy” has survival value.
The power in this book comes from its exploration of how genetics actually works in the real world and of the math behind it. To greatly simplify an enormously complicated process, consider the fact that your genetic code is like a library. The genes are like books in the library and are written on three billion nucleotide pairs; three of these pairs constitute one letter of your genetic code. This equals a library of about 1,700 volumes (which, incidentally, have undergone "data compression so sophisticated that they are contained in an area 100,000th the size of the period at the end of this sentence"). In a lifetime, about 200 point mutations, or misspellings, will be introduced into each person's genetic code. It is from these misspellings, combined with a mate's misspellings, that natural selection is supposed to produce biological evolution.
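The rough arithmetic behind the "about 1,700 volumes" figure can be reconstructed as follows; the per-volume size is my own illustrative assumption (a thick printed book of roughly 600,000 characters), not necessarily Sanford's exact bookkeeping.

    # Rough arithmetic behind the "library of about 1,700 volumes" analogy.
    nucleotide_pairs = 3_000_000_000   # ~3 billion, as cited above
    pairs_per_letter = 3               # the post treats three pairs as one "letter"
    chars_per_volume = 600_000         # assumed size of one printed volume (illustrative)

    letters = nucleotide_pairs // pairs_per_letter   # 1,000,000,000 "letters"
    volumes = letters / chars_per_volume             # ~1,667 volumes
    print(f"{letters:,} letters, or roughly {volumes:,.0f} volumes")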
However, there are some big problems with this theory. First, when you reproduce, you do not select genes one by one; you take your mate's entire genome, and you are forced to take all modifications, good and bad. Additionally, you do not know what mutations your mate has undergone, because these are not expressed in the adult but will only be manifest in the offspring. You don't even know what mutations have occurred in your own genetic code!
Second, most changes to the genetic code are point mutations that affect only one nucleotide out of six billion. These mutations have such tiny effects that they are of no value or harm; their effect is "neutral." Therefore, they are not actually "selectable" unless they are fatal.
Third, it has recently been discovered that many genes are "polyconstrained," meaning that they simultaneously code for more than one protein (usually three and possibly more) depending on how they are read. Consequently, mutations in these nucleotides cannot produce an improvement in the organism, because even if one protein were improved, at least one of the others would be damaged.
Why this is Important
Sanford’s research caused him to lose faith in the Darwinian mechanism’s ability to drive Darwinian Evolution. In the book, he describes the painful intellectual change he underwent when he realized that he would have to reject the “Primary Axiom,” “the most sacred cow in biology.” He also realized that this probably meant the end of his career and possibly his expulsion from academia. However, he felt compelled to tell the truth about what his research really showed. He became convinced that life and its development could only be explained by a purposeful design, which he attributes to God.
He now views Darwinism with its claims that life is meaningless and that man is just a bag of molecules as a false doctrine which “…has been the most insidious and destructive thought system ever devised by man.”
However the recognition that we are the product of divine planning means that we do have purpose, and life does have meaning. The consequences of this truth are truly mind-boggling.
Source: Genetic Entropy & The Mystery of the Genome, John C. Sanford, 2005, Elim Publishing, Lima, NY.
Thursday, September 27, 2007
Atheism Is Killing Mathematics
The new Princeton University Press book How Mathematicians Think, by William Byers, a mathematician at Concordia University, supports the extraordinary claim that atheism is "killing mathematics." In his insightful review of the book, Gregory Chaitin, one of the world's leading mathematicians, asks, "Would Euler, Cantor, and Ramanujan be welcome in the Mathematics Department of a university today?" His conclusion is a resounding "No." These giants of mathematics would not be welcome among academic mathematicians because atheistic materialism has become the dominant paradigm in today's universities.
Euler, who created much of the math used today, was so strongly informed by his Christian beliefs that he is recognized as a Lutheran saint and is commemorated each May 24 on the Church Calendar. Cantor invented, or in his view "discovered," transfinite numbers (a whole hierarchy of infinities) as a way to "better understand God." Ramanujan, recognized as one of the greatest geniuses of the Twentieth Century for his work in Analysis and Number Theory, argued that "…an equation is only of value if it expresses one of God's thoughts…"
Other famous theistic mathematicians, who would now be expelled from academia, include the Sumerian priests who started it all with accounting and calculating the astronomical calendar, the Pythagoreans, who developed geometry and number theory (the foundations of advanced math) as part of their mystical investigations into knowing God, and a vast number of Christian luminaries, including Descartes and Pascal. Towering above all of these are the preeminent mathematicians of all time, Newton and Leibniz, the cofounders of modern calculus, whose mathematical investigations were driven by deeply and explicitly Christian motivations; they most definitely would not be welcome.
Byers and Chaitin believe that math began to die in the twentieth century as freedom of thought and creativity became constrained by an overemphasis on formulae; "…words, ideas, diagrams, examples, explanations, and applications" were all rejected in favor of a "nit-picking avoidance of mistakes." Creativity was abandoned in favor of rigor, and this rigor has resulted in "rigour mortis." As someone who has taught math, I was struck by the truth of these claims. It is very difficult to think of a truly important discovery in math coming after the 1950s.
What caused the creativity, imagination, and leaps of insight characteristic of mathematicians to be replaced with a stultifying and slavish attachment to formulaic rigor? It is Byers and Chaitin’s conclusion that secular humanistic beliefs about the nature of man are at the heart of the problem. Secular humanists contend that man is nothing more than an accident of nature, that consciousness is simply biochemical reactions in the brain, and that life itself is totally without purpose and meaning. “If mathematicians see themselves as machines they will behave like machines; if mathematicians think they are trivial, then they will be trivial.”
Why This Is Important
It is important that Christians know their intellectual heritage. Many surveys have been done on the religious beliefs of scientists. Mathematicians are always at, or near, the top of these studies showing that 70% to 80% of them believe in God. The percentage of believers decreases, as the field of scientific study gets “softer,” with the social sciences having the lowest percentage of believers.
The media consistently try to portray Christians as stupid, ignorant, benighted dupes, or worse. However, in the sciences, the exact opposite is true; intellectual capability tends to be highly correlated with belief in God. Christians need to know that they have been the key players in the cutting edge of mathematics, the world’s most fundamental intellectual endeavor.
Beyond the troubling damage being done to the advancement of mathematics by secular humanism, there is also the practical cost. Technological innovation, economic progress, and health are all tied to the continued advancement of math. Kill math and you kill innovation. Kill innovation and you kill economic growth and people.
Sources:
1. New Scientist, "Review: How Mathematicians Think" by Gregory Chaitin, July 25, 2007.
2. "How Mathematicians Think", by William P. Byers, 2007, Princeton University Press, Princeton, NJ.
Friday, September 21, 2007
Most Research Findings Are False
Recently there have been a number of high profile scandals involving bogus research findings. These have spanned the world and have covered every field from genetics to the disappearance from public view of the UN’s purported “Smoking Gun” of “Global Warming” - the now discredited and vanished “Hockey Stick.” However, these scandals are just the tip of the iceberg.
On September 14, 2007, the Wall Street Journal published an article entitled "Most Science Studies Appear to Be Tainted by Sloppy Analysis." This article was based on the work of Dr. John Ioannidis of Tufts University. In his provocative article, "Why Most Published Research Findings Are False," Ioannidis makes an even stronger case for skepticism of "scientific research."
Dr. Ioannidis's findings show that over half of all scientific research is seriously flawed and therefore not valid. Very often these flaws involve using the wrong mathematical tools, misapplying them, or handling the data inappropriately. While sloppiness is a major factor in false research, there are far more powerful motives for falsifying or misinterpreting research. These include money and, most importantly, "bias."
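Ioannidis's core argument can be restated as a short calculation. The sketch below uses the basic positive-predictive-value formula from his paper, leaving out his corrections for bias and multiple competing teams (which push the numbers lower still); the pre-study odds plugged in are illustrative assumptions, not measurements.

    # Positive predictive value (PPV) of a claimed finding, following the basic
    # formula in Ioannidis (2005): PPV = (1 - beta) * R / (R + alpha - beta * R),
    # where R is the pre-study odds that the probed relationship is true,
    # alpha the significance threshold, and beta the type II error rate.
    # Bias and multiple competing teams (treated in the paper) lower PPV further.
    def ppv(R: float, alpha: float = 0.05, beta: float = 0.2) -> float:
        return (1 - beta) * R / (R + alpha - beta * R)

    # Illustrative pre-study odds (assumed, not measured):
    for label, R in [("well-grounded confirmatory study", 1.0),
                     ("typical exploratory study", 0.1),
                     ("long-shot discovery screen", 0.01)]:
        print(f"{label:32s} R = {R:<5} PPV = {ppv(R):.2f}")

With generous prior odds the claimed finding is probably true (PPV near 0.94), but for long-shot exploratory work the same "statistically significant" result is more likely false than true (PPV well under 0.5), which is the heart of Ioannidis's point.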
Not surprisingly, a large amount of academic funding is dependent upon the results of research. If your funding comes from the National Education Association, your research on the academic success of parochial schools is not likely to show or emphasize the fact that parochial schools consistently outperform public schools at half the price. If you are a socialist, using environmentalism to advance your economic agenda, your research is not likely to point out that the United States (whose Congress has not ratified the Kyoto protocols) has met the CO2 reductions mandated by Kyoto, while most of the nations that have ratified the treaty have not met their mandated reductions.
Very often these “false research findings” are spin. A leading example is the failure of hundreds of genetic studies that aimed to bolster the Darwinian Tree of Life model of descent from a common ancestor. The researchers and media rarely mention that these unexpected and contradictory findings are a huge problem for Darwinism. The results are duplicitously spun and reported as being consistent with the Darwinian paradigm.
Why This Is Important
A few years ago DDT was banned based on research now universally acknowledged to be incorrect. The ban resulted in millions of unnecessary malarial deaths (mostly children) worldwide. Bogus research can also have disastrous social consequences. Margaret Mead’s studies of sexual freedom in South Pacific Islanders, including her book, “Coming of Age in Samoa,” are used to rationalize and to promote sexual promiscuity. Yet, it is widely acknowledged in the current anthropological literature that her findings were total fiction.
Be very skeptical of what you read and believe is supported by “scientific research,” especially if you are going to quote or act on it. Check to see who did the work, who funded it, where it was published, and what the agendas were of all involved. Read the details: was the data cropped, were the statistical tools appropriate, were the truly relevant questions even asked, was the question framed to produce the result desired? Very often you will find that even when the research has been done properly, the analysis may not support the conclusion as claimed.
Consequently, if research flies in the face of your common sense, it probably is false.
Source: Public Library of Science (PLoS) Medicine, Vol. 2(8), August 2005.
Friday, September 14, 2007
Microbes Have Consciousness
When conditions for the colony are not favorable, the plasmodium may produce a stalk with a puffball full of spores. The puffball releases spores into the wind, which are carried off and produce more of the amoeba-like slime molds. After this reproductive action, the plasmodium may again disaggregate into individual cells that return to their solitary lives.
How is this possible? These individual cells appear to act with conscious volition. How do they communicate? How do they know how to differentiate into specialized organs?
As amazing as this cellular-level intelligence is, a new, even more amazing phenomenon has been discovered at the sub-cellular level. This phenomenon, labeled "Natural Genetic Engineering" by Dr. James Shapiro, a University of Chicago geneticist and biochemist, and first brought to light by Dr. Barbara McClintock in her 1983 Nobel Prize acceptance address, has, predictably, been largely ignored by the media.
Natural genetic engineering is the process by which cells modify their own DNA. If you are not familiar with this process, your eyes are not deceiving you: it actually happens. Cells, under challenge from something in their environment, can restructure their own DNA, thereby changing their internal biochemical capabilities. This restructuring process enables the cells to produce new proteins and other molecular products needed for survival.
Why This is Important
Dr. Shapiro has come under criticism and has had difficulty getting at least one paper published because of the implications of his work. The quotation below shows why.
"The idea of natural genetic engineering is controversial to some because it implies the existence of an 'engineer' to decide when restructuring should occur. … The obvious problem is that it is hard to imagine material causes alone producing sentience and consciousness via random interactions. The sentience, together with messages in DNA and extraordinarily sophisticated genetic code and information processing systems, are arguably a large number of 'smoking guns' for an intelligent cause operating in the system itself."
The problem comes down to this: cells have no brains and no nervous systems to direct their activities, so where is the intelligence coming from that directs the cells to perform intricate, delicate surgery on their own genes? It would be easier for a human to perform complex brain surgery on him or herself; so, how is it possible for a cell to modify its own genetic code?
The answer of course is obvious to all except those who refuse to accept the truth glaring starkly in their faces. There are, of course, intervening natural mechanisms, which mediate these biological processes; however, God is the intelligence behind it all.
Source: http://Shapiro.bsd.uchicago.edu/2006.ExeterMeeting.pdf
Friday, September 7, 2007
Human/Animal Hybrids Have Been Created
Until recently, the actual production of a human/animal chimera was not legal in most countries, and scientists refrained from engaging in this type of experimentation. It was generally seen as a moral boundary few wanted to cross. However, with the slippery slope of today’s utilitarian materialism, sooner or later, every boundary is crossed.
On March 26th of this year, mailonsunday.co.uk reported that British researchers had succeeded in producing a sheep chimera with "…15% human cells and 85% animal cells." The human cells were mostly confined to the liver and other internal organs. The creature looks like a sheep with no visible human traits. However, it has a human liver. The object of the experiment was to create organs to be transplanted back into humans.
On September 4, 2007, it was reported by theregister.co.uk that the British government is about to permit the creation of human-chimera embryos. These will be embryos that are about 99% human and 1% animal. These will not be allowed to live beyond 14 days and are seen as only a research tool, at least for now.
It is not difficult to see where this research is heading. In the future, there will be pressure to bring such organisms to full term and to increase the percentage of animal cells in them. Indeed, Nazi researchers already tried this in the 1940s. They unsuccessfully tried to cross humans with apes. Some unfortunate female concentration camp inmates were actually inseminated with sperm from great apes and chimps to try to produce a half human - half ape hybrid. Fortunately, the Nazis lacked the proper technology, and the experiments failed. Unfortunately, humanity no longer lacks the necessary technology, and another of Hitler’s dreams may become reality before long.
Why This Is Important
The moral and legal dilemmas that this line of research will produce are unprecedented. What will a half human – half animal hybrid be regarded as spiritually? Does it have a soul? Is it made in God’s image? Should it be baptized? These questions probably will be a source of great delight to secularists as Christians struggle to deal with such issues.
However, the secular world will also have problems with this. For example, at what percentage of human genes does a chimera gain human rights? 51%? Can a human own a human chimera, or is this slavery? Can they be eaten? Can they own property and bring actions in court? Does the 15% human sheep chimera have 15% of human rights? Can a full human marry a chimera?
These are only a few of the hundreds of questions and morally repugnant possibilities that will soon be upon us.
Friday, August 31, 2007
Without Twilight You'd Be Dead
True, this overture to each day and night adds beauty to human life and has served as a source of inspiration to numerous poets, painters, composers, and romantics of every stripe, but is it really important? If twilight vanished tomorrow, would it really change anything? After all, we could live without it, couldn't we? If you think so, think again.
The October issue of Astronomy, in its "Strange Universe" section, carried a story entitled "The Real Twilight Zone," which inspired this Blog. The author points out that the Earth is the only planet in the Solar System that has twilight. Mars almost has a little, but its thin atmosphere and lack of water vapor fail to produce a palette of colors or a long period of low light that gradually trickles off. The other planets and planet-sized satellites either have no atmosphere, going from bright sunlight to pitch dark immediately, or they have dense gaseous atmospheres that absorb and diffuse the sun's light before it reaches any great depth. The same appears to be true of the 200-plus planets that have been discovered outside of this solar system. Not one of them is a good candidate for producing twilight.
“Okay,” you say. “Not many planets, perhaps no others, have any twilight, so what?” Stop to consider this: we could not exist on this planet without twilight.
Evening twilight starts when the sun touches the horizon and lasts until it has fallen 18 degrees below the horizon. The length of time this takes depends on one's location on the globe and the time of year: it is shortest at the equator and longest at the poles. As the sun sinks lower, its light passes through progressively more of the atmosphere. The gases, water vapor, and dust in the atmosphere change the spectrum of the light as the sun moves lower, producing a gradient in the light that reaches the Earth's surface. This gradient is critical because it triggers certain biochemical cascades in different organisms. Some biochemical processes are triggered in certain organisms only by light of a particular wavelength, and that wavelength is present in sufficient intensity only when the sunlight enters the earth's atmosphere at a certain angle. This light plays a role in many types of animal behavior, from the singing of some birds to the feeding habits of many fish. However, this is another story, which is far too complex to go into here.
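For readers who want a feel for the numbers, the 18-degree definition can be turned into a rough duration estimate with standard spherical astronomy. The sketch below is only an approximation: it assumes a fixed solar declination (an equinox-like value of zero), ignores atmospheric refraction and the Sun's finite disk, and deliberately fails at latitudes where the Sun never reaches the stated altitude.

    import math

    def hour_angle(altitude_deg, latitude_deg, declination_deg):
        """Hour angle (degrees) at which the Sun reaches a given altitude, from the
        standard relation sin(alt) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(H)."""
        alt, lat, dec = (math.radians(x) for x in (altitude_deg, latitude_deg, declination_deg))
        cos_h = (math.sin(alt) - math.sin(lat) * math.sin(dec)) / (math.cos(lat) * math.cos(dec))
        if not -1.0 <= cos_h <= 1.0:
            raise ValueError("Sun never reaches this altitude at this latitude/declination")
        return math.degrees(math.acos(cos_h))

    def twilight_minutes(latitude_deg, declination_deg=0.0):
        """Approximate evening twilight: geometric sunset (altitude 0) to the Sun
        reaching 18 degrees below the horizon, at 15 degrees of hour angle per hour."""
        h_set = hour_angle(0.0, latitude_deg, declination_deg)
        h_dark = hour_angle(-18.0, latitude_deg, declination_deg)
        return (h_dark - h_set) / 15.0 * 60.0

    for lat in (0, 30, 45, 60):
        print(f"latitude {lat:2d} deg: about {twilight_minutes(lat):.0f} minutes at the equinox")

Even this crude estimate reproduces the pattern described above: roughly an hour and a quarter of twilight at the equator, stretching past two and a half hours by 60 degrees of latitude.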
There are two other major phenomena currently known, and there are probably more. First, the thermal equilibrium of the atmosphere is strongly influenced by twilight. It helps moderate the shift in temperature between day and night. On Mars, temperatures can easily swing 200 degrees Fahrenheit from day to night, but on Earth the swing is rarely more than 40 degrees. This narrow temperature range reduces the intensity of storms, is essential to the maintenance of the water cycle, and damps down wind velocities. On Jupiter it is not unusual to see sustained wind speeds of 1,000 miles per hour. Imagine something like that on Earth!
The second major phenomenon is the maintenance of the Oxygen (O2)/Carbon Dioxide (CO2) balance of the atmosphere. While plants take in CO2 and produce O2 during the day, the process is reversed at night, when O2 is taken in and CO2 is given off. In order to maintain the balance needed for higher life forms over geologic time, the additional input of twilight to the photosynthetic process is essential. Without it, O2 levels would slowly decline.
So the next time you see a beautiful sunrise or sunset, take in a deep breath, and remember that it was put there for you.
Saturday, August 25, 2007
A Pillar of Darwinism is Falling
Because Darwinists believe that evolution is a purposeless process, they believe that it should be riddled with useless, unnecessary products. The human appendix and the panda's thumb are the two organs most commonly cited as examples of the undirectedness of this process of random mutation. It is argued that the appendix has no known function, with the possible exception of enabling children to obtain immediate attention from parents by complaining of a pain in the side. The panda's thumb, however, has recently fallen out of favor as an example of evolution's randomness due to studies showing that it is actually a finely tuned adaptation that assists in the efficient stripping of bark from bamboo, the panda's main food source, allowing the panda to get to the inner soft tissue more efficiently (but that is another story).
However, the most powerful evidence for the randomness of the evolutionary process was "Junk DNA." Junk DNA refers to those portions of DNA molecules that were believed to serve no useful purpose. They appeared to be inactive, not coding for the manufacture of proteins, which is the primary function of DNA. It was believed that junk DNA composed about 98% of all human DNA. The fact that so much DNA appeared to do nothing led to the conclusion that junk DNA represented the results of random natural processes mutating the DNA and producing unusable, nonsensical stretches of nucleotide code.
The New Discovery
In June of this year the Encode Project (Encyclopedia of DNA Elements) published a group of 29 papers, produced by hundreds of scientists working in 11 different countries, that have completely overturned the view of "Junk DNA" as useless. This work is still in its early stages, but to sum it up briefly: the scientists found that over 50% of all human DNA actually codes for RNAs which do not go on to produce proteins but seem to be involved in other regulatory functions. These functions include keeping chromosomes from unraveling and controlling cell division.
But the most surprising, indeed shocking, contention of all is that most of these RNAs seem to be standing by in the event that the body is suddenly exposed to an environmental change. In that event, they are immediately available to produce an adaptation that enables the organism to meet the challenge. This raises an enormously difficult question for the Darwinian paradigm: how could these standby RNAs be maintained throughout numerous generations, possibly even thousands of generations, until they are needed? With no utility to the current generation, they should be quickly selected out of the population, but they are not.
Why This is Important
If evolution is an undirected process, as Darwinists claim, it should produce nonsensical and useless outcomes on a prodigious basis as was once believed to be the case. It has been estimated that in humans, about 10,000 harmful or useless mutations should accompany every useful one. Junk DNA was thought to be “smoking gun” evidence for this random process. As it is turning out there is probably no such thing as “Junk DNA.” This is powerful evidence for the observation that evolution is not the result of a random natural process but is rather the result of purposeful design and that purposeful design requires a designer. This designer is, of course, God.
Sources:
1. “Intricate Toiling Found in Nooks of DNA Once Believed to Stand Idle” Washington Post, June 2, 2007, p. A1.
2. http://www.genome.gov/, The Encode Project.
Friday, August 17, 2007
Archaeological Discovery Supports Biblical Accuracy
This new line of attack concedes such facts as the existence of David but then argues that the Bible still cannot be trusted to have accurately reported the details of his life. For example, they take the position that while it is true that King David was a real person there is no extra-Biblical confirmation that he ever fought Goliath and therefore the Bible alone cannot be trusted for this sort of information. The claim is that the stories in the Bible are like stories about the Old West, where it is true that Wyatt Earp really existed, but the numerous novels written about him are largely or completely fictional.
In July of this year that line of attack against the Bible became harder to maintain because of a small but important discovery made by Michael Jursa, an Austrian Assyriologist translating cuneiform tablets at the British Museum in London. Jeremiah 39:3 reports that someone named Nebo-Sarsekim, Nebuchadnezzar II's "Chief Officer," was with him at the Babylonian siege and conquest of Jerusalem in 587 BC.
The tablet that Professor Jursa recently translated is a receipt from the Babylonian temple of Esagila acknowledging a payment of 1.5 minas (1.65 pounds) of gold. The year of the payment is 595 BC, and the payer is King Nebuchadnezzar II's Chief Eunuch, Nabu-sharrussu-ukin (Nebo-Sarsekim in Hebrew). This confirms three separate Biblical facts: Nebo-Sarsekim actually existed, he was Nebuchadnezzar's Chief Officer, and he lived at the time of the siege.
Why This is Important
It is the very minor nature of the details confirmed by this tablet and similar discoveries that makes them so important. In the words of Dr. Irving Finkel, one of the British Museum's specialists, "This is a fantastic discovery, a world-class find. If Nebo-Sarsekim existed, which other lesser figures in the Old Testament existed? A throwaway detail in the Old Testament turns out to be accurate and true. I think that this means that the whole of the narrative takes on a new kind of power."
Dr. Finkel's observation is well taken. If such obscure and minor facts as those verified on this tablet, "throwaway details" as he calls them, are correct, it is likely that other facts reported by Jeremiah and other Biblical authors are also accurately reported.
Source: London Daily Telegraph, telegraph.co.uk, July 11, 2007.
Friday, August 10, 2007
The Continents Appeared In A Sudden Burst
Genesis 1:9 And God said, “Let the water under the sky be gathered to one place and let the dry ground appear.”…. the third day.
First physicists discovered that there was a single Creation Event now called the Big Bang through which the universe emerged. Then biologists discovered their own big bang, labeled the “Cambrian Explosion”. This biological big bang gave rise to almost all the modern phyla of life. Now it appears that the continental crust of the Earth also had a "Big Bang" of its own. New research indicates that the Earth’s crust appeared rather suddenly in a single burst of creation. This is the conclusion of two prominent geological researchers, Samuel Browning of MIT and Ian Williams of the Australian National University in Canberra as reported in several sources, including the February 1998 special issue of "Earth".
Browning and Williams have been studying the oldest rocks known on Earth. These rocks appear to be at least 4 billion years old and are found in West Greenland. It is not the age of the rocks that makes them so important, for it is generally believed that the Earth is about 4.5 billion years old; it is the composition of these rocks that makes them so intriguing. Most geologists believe that the Earth's continents formed gradually over a long period of time. This standard model also postulates that continent formation did not begin until 500 million years after the planet had formed. They believe that the continents then began to grow in a series of slow pulses as the result of plate tectonics. This process is still supposed to be in operation today.
Browning and Williams argue instead that continent formation actually began almost at the planet's birth. The continents then proceeded to grow in one continuous process, so that by 3.8 billion years ago there was actually about a third more continental crust on the planet than there is today.
This conclusion is based on a comparison of the abundances of two of the elements found in the ancient rocks studied by the team. Browning, Williams, and others at the Australian National University examined the ratio of the element neodymium to the element samarium in these rocks. Radioactive samarium decays to form neodymium. The Australian team found that the rocks had an unexpectedly small amount of neodymium compared to the amount of samarium. The big question was: where had all of the neodymium in these early rocks gone? The answer appears to be that it had gone down into the mantle underneath the crust. This indicates that by 3.8 billion years ago the continental crust had already separated from the mantle and was floating on top of the mantle, where it still is today.
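For those curious how a samarium-to-neodymium comparison can say anything about timing at all, here is a generic sketch of the underlying decay arithmetic: radioactive 147Sm decays into 143Nd, so a reservoir that keeps relatively more samarium accumulates measurably more radiogenic neodymium over billions of years. This is only an illustration of the standard ingrowth equation with made-up sample values; it is not a reconstruction of the team's actual model or data.

    import math

    LAMBDA_SM147 = 6.54e-12   # decay constant of 147Sm in 1/year (half-life ~106 billion years)

    def nd143_ratio_today(initial_ratio, sm_nd_today, age_years):
        """Present-day 143Nd/144Nd of a closed reservoir, from the standard
        radiogenic-ingrowth equation: measured = initial + (147Sm/144Nd) * (e^(lambda*t) - 1)."""
        return initial_ratio + sm_nd_today * (math.exp(LAMBDA_SM147 * age_years) - 1.0)

    initial = 0.5070   # assumed starting 143Nd/144Nd (illustrative, not a measured value)
    age = 3.8e9        # years, roughly the age discussed above

    # A reservoir that keeps relatively more samarium (a higher Sm/Nd ratio) ends up
    # with a measurably higher 143Nd/144Nd after the same span of time:
    for label, sm_nd in [("Nd-rich reservoir (low 147Sm/144Nd)", 0.10),
                         ("Nd-depleted reservoir (high 147Sm/144Nd)", 0.22)]:
        print(f"{label:42s} 143Nd/144Nd after 3.8 Gyr = {nd143_ratio_today(initial, sm_nd, age):.5f}")

Because the differences show up only in the fourth or fifth decimal place, measurements of this kind require extremely precise isotope analysis, which is part of why the interpretation is debated.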
It was initially argued that this rather surprising finding could be the result of sampling error because the material tested all came from Western Greenland. Two subsequent tests, however, conducted on rocks from Northern Canada and Labrador also show similar neodymium depletion. Therefore the best current evidence supports the idea that this process was global and not a local phenomenon.
These findings have set off a bitter and acrimonious debate among geologists, particularly between Steven Moorbath of Oxford University and the Australians.
Why This Is Important!
In the mid-nineteenth century Charles Lyell, a British lawyer, put forth the idea of gradualism in geology. Lyell argued that the Earth had been formed slowly over a long period of time by a series of processes that can still be observed going on today. This theory, known as uniformitarianism, is still predominant today. (This idea was later picked up by Charles Darwin and applied to biology.) Up until that time most Christian and Jewish theologians had maintained, based on their study of the Bible, that the Earth had been formed primarily by a series of unique creative steps, many of which occurred in a catastrophic or very rapid process. They argued that these formative processes are not observable in the normal, gradual changes taking place now on the Earth on a day-to-day basis. Many mainline Christian theologians of that time and earlier times had in general believed that the Earth had considerable antiquity, at least on the order of hundreds of thousands of years, so the main debate was not just over the Earth's age, but also over its method of formation.
It is interesting to note that the idea that the Earth is 6,000 years old only gained prominence in the nineteenth century. It had generally not been a major issue in the thinking of church leaders historically. The antiquity of the Earth had been assumed to be at least on the order of hundreds of thousands of years by many theologians as early as the fifth century AD. Consequently, when evidence of great age for the Earth was found in the nineteenth century, it was generally accepted by the majority of theologians, Catholic, Protestant, Orthodox, and Jewish.
It is still too early to see what the long-range outcome of the research by the team from the Australian National University will be. However, numerous geological findings in subsequent years point toward the occurrence of single catastrophic or sudden events that shaped the course of the Earth's development. Such findings include the discovery that many geological eras ended with a major asteroid impact. These discoveries tend to support the claims of earlier Christian theologians that catastrophes did indeed play a major part in the Earth's history. It is also of great interest that the above discoveries about the early and rapid appearance of the Earth's crust tend to support their claim that not all of the processes that formed the Earth's surface are observable today.
There is also a second major mystery reported in this research, which has even more interesting implications. This comes from the argument made by some scientists who believe that the universe has been deliberately fine tuned to support life here on Earth. They believe that the universe was intelligently designed and is not the result of purposeless natural processes. Life on Earth requires that the Earth's crust be continuously recycled. New crust comes out of volcanoes while old crust is pulled down into the mantle along subduction zones located on plate boundaries. The ejecta from volcanoes return minerals to the surface that are needed by biological organisms. Without this recycling life above the bacterial level would gradually die out.
Terry Plank of the University of Kansas studies ocean sediments. She has found that the rate of sediment subduction into the mantle is about the same as the rate at which new crust is built. She noted that "it is pretty remarkable that the two processes are in balance." The geological gradualists believe that the Earth's crustal recycling balance simply "developed" as a natural outgrowth of a continuous slow formative process. If it turns out that the crust appeared all at once, how could such a fine homeostasis between crust and mantle have "developed"? One possible answer is that it was deliberately designed that way.
Monday, August 6, 2007
Stem Cells: Facts and Media Misrepresentation
Please allow me to explain. As you have no doubt noticed, during the last decade the media has been full of stories critical of those "opposed to stem cell research." Such people are usually depicted as ignorant, pro-life Luddites opposed to scientific progress. Their mindless opposition, according to the media, is resulting in masses of suffering victims who would otherwise be cured if only these backward Christian conservatives would end their resistance to scientific progress.
What is the truth? Here are some facts.
A stem cell is a cell that has the ability to differentiate into different types of cells. A stem cell can turn into a liver, kidney, brain, hair, skin, bone, or any other type of cell. Consequently, it is believed that stem cells hold great promise for repairing damaged or diseased tissue. Imagine how wonderful it would be if a victim of liver cancer could be given an injection of stem cells that would go into the body and regenerate a new liver.
There are two main categories of stem cells: Embryonic Stem Cells and Adult Stem Cells. Embryonic stem cells are found in the placenta, placental blood, and embryos. Adult stem cells are found in other tissues, mainly bone marrow, muscle, and the brain. Stem cells are usually obtained from organ donors, fertility clinics, and volunteers.
If you have been following the reporting on the subject, you may be under the impression that stem cell research in the United States is currently not quite legal, but that some scientists, brave enough to defy the "zealots" on the Christian right, have persisted in the face of persecution nonetheless.
Here are some facts. To begin with, there are no legal barriers to stem cell research carried out within HHS guidelines, absolutely none – none as in zero – despite the media misrepresentation. There are no limits on funding stem cell research with one exception and it is this one exception that the media is using as the source of its disinformation campaign. This one exception is using federal funds to harvest new lines of embryonic stem cells taken from human fetuses. This is the one and only exception. This one exception has come as the result of pro-life groups’ opposition to destroying more human embryos in order to extract their stem cells.
Here are a few other facts the media regularly fails to mention. If you want to harvest new lines of human embryonic stem cells there is absolutely nothing stopping you from doing it, as long as the funds used are not federal funds. So any foreign country, any university, any private company, any state, local government or private individual is completely free to do such research as long as they pay for it. As a matter of fact, several states including California have made funds available for this purpose.
Right now federal funds can be used to do research on all lines of adult human stem cells and all animal stem cells. Additionally, in the United States there are currently over sixty lines of existing human embryonic stem cells available for research, plus foreign lines and privately owned lines, for which federal funds can be used. On top of this, to show how truly bogus this entire issue is, new lines of human embryonic stem cells can be harvested from the placenta and placental blood of new births, making it completely unnecessary to use human embryos as a source for new lines of embryonic stem cells.
However, despite these facts, little money is going into establishing new lines of human embryonic stem cells. The reason for this is simple economics. Investors do not want to throw money away. It was widely believed on theoretical grounds that embryonic stem cells, being more plastic than adult cells, would be more likely to produce viable therapies. However, so far this has not proven to be the case, as there are currently some 60 therapeutic uses of adult stem cells and zero therapeutic uses of embryonic stem cells. The reasons for this are not entirely clear but seem to be related to genetic-level complexities.
One of the reasons for the big push for federal funding of new lines from human fetal tissue seems to be to get funding for projects that would not be funded on economic or scientific grounds. This is being done by portraying those denied federal funding as martyrs to the "pro-life loonies" rather than as people trying to pursue what may very well be a scientific blind alley.
Why the media distortions?
It is no secret that the media has an extremely strong liberal bias. They want to get their candidates elected and advance their ideological agenda and they are succeeding in this area. Polls have shown that their misrepresentation of this issue has created a very negative impression in the minds of many. The pro-life community is frequently believed to be indifferent to the suffering of disease victims. Additionally the media has succeeded to a large degree in portraying anyone who is concerned about the “slippery slope" of this type of research for any reason as “anti-science.”
Wednesday, August 1, 2007
Is The Brain Really Necessary
New research not only supports this shocking claim by the greatest philosopher of the Twentieth Century, but actually goes beyond it to show that this may even be an understatement: indeed consciousness may not depend on your brain.
You have probably read of curiosities such as the well-publicized case of Phineas Gage, a railroad worker who had a 3-foot by 1.25-inch iron tamping rod accidentally shot through his brain. He recovered from the accident with little effect and no memory loss, even after suffering the destruction of one frontal brain lobe and damage to the other. Okay, you say, amazing, but he only lost his frontal lobes. He still had a lot of his brain left, so I'm not convinced that it is possible to think without your brain.
How about more extreme cases? Scientific American, in a May 2007 article titled "Strange but True," reports on a surgery known as a hemispherectomy, in which half of the brain, one entire hemisphere, is removed. These surgeries are most commonly performed on patients who are in neurologically desperate situations, such as massive tumors or extremely frequent, uncontrollable seizures.
In a study of 111 children who underwent the surgery because of uncontrollable seizures, 86% are now either seizure-free or greatly improved. But even more surprisingly, most suffered no memory loss or personality change. Another study of these patients found that they often improved academically. One even went on to become his state's chess champion. However, the outcomes are mixed; the younger the patient, the better the recovery usually is. Most often there will be a substantial or total loss of arm movement and vision on the side of the body opposite the lost hemisphere. Language loss, if any, is often recovered regardless of which hemisphere is removed. Okay, you say, that is even more amazing, but these people still had half of their brains left.
The July 20, 2007 issue of Nature contains an article titled "The Man With a Hole in His Brain." This is the story of a man named Lionel Feuillet, a French civil servant who was recently discovered to have about 80% of his entire brain missing. The discovery was made during a routine CT scan conducted after he complained of weakness in his left leg. His brain tissue had been slowly displaced by spinal fluid as his cerebral ventricles expanded, a process of gradual hydrocephalus. His doctors were very surprised that he was behaviorally normal, holding down a fairly mentally demanding job and raising a family.
Wow! You say 80% of the brain gone but still functioning normally - I’m surprised. It really is shocking that this is possible, but perhaps the remaining 20% is still doing everything.
Okay, then how about this one. Science magazine, in "Is Your Brain Really Necessary?" (Vol. 210, December 12, 1980), reported a similar case. However, in this instance the person in question had received a First Class Honors degree in Mathematics from Sheffield University. After complaining of headaches, he was given a CT scan. His doctors were shocked when the scan revealed that 95% of his brain was missing, having been replaced by water. It was estimated that he had only about 1 millimeter of brain tissue left on the outer edge of his cranial cavity, barely enough to be detected in the scan. In effect, his brain was missing. The article also discussed the fact that other such cases are out there but go undetected because the persons involved have no symptoms.
In all likelihood the mathematician from Sheffield is not the most extreme case in the world. It is very probable that there are a few people walking around right now completely oblivious to the fact that substantially more than 95% of their brain is missing, and who in effect have no brain.
Why This Is Important
One of the central claims of materialism is that the soul does not exist and that consciousness and all of the processes associated with thought and intellect are merely the result of biochemical activity in the brain. Destroy the brain and you destroy the conscious part of you. This argument is held to disprove the possibility of life after death.
While it is true that damage to the brain (in some cases even minor damage) can have catastrophic and tragic consequences, the anomalies discussed above show examples of people with normal levels of cognitive function who, for all practical purposes, have no brain. This is very strong evidence against the materialist claim that consciousness is nothing but natural biochemical reactions. While it seems evident that the brain has a major role in cognitive functioning, there is clearly more to consciousness than physiochemical processes.
Saturday, July 21, 2007
Near Death, the Blind See
NDE has become the acronym for the set of phenomena that people all over the world report during periods of clinical death. By now many people have heard about these phenomena, which occur when people die and are resuscitated. During the period when they are considered dead, about one third undergo an NDE. NDEs usually share several common elements. These include the experience of leaving one's body and viewing oneself, the people, and the environment in the immediate surroundings. Many also report traveling through a tunnel and coming out on the other end in the presence of a Being of Light, frequently identified as Jesus Christ. They often report going through a review of their lives. The most important impression these people are left with is the feeling of intense, total love they experience from the Being of Light. Those who experience an NDE almost universally report not wanting to leave this love to return to Earth. Most people who undergo an NDE report that the experience transforms their lives and leaves them more focused on the truly important things, such as loving and serving others and appreciating what they have, especially their families.
Interestingly, when the blind have NDEs they report exactly the same phenomena. They see both this world and the next. When people who are born blind recover their sight through a natural process, such as a medical procedure, they report an initial period of disorientation because it takes their minds some time to learn how to interpret the new visual input. This is exactly what the blind report during an NDE. Those who were blind from birth report having difficulty relating to what they are seeing, whereas those who lost sight later in life immediately recognize the return of vision.
This research is reported, among other places, in the Summer 1997 issue of The Anomalist. Given the rather amazing result, the authors try to come up with a naturalistic explanation for the phenomenon. They are able to eliminate certain explanations, such as fantasy, by independently verifying details through corroborating evidence. For example, those who died during surgery were asked to describe the operating room, the hospital, and other environmental details that a blind person could not have known were there. These sometimes included observations such as relatives in the waiting room whom the patients had no way of knowing were present at the time. Dr. Kenneth Ring and his co-researcher Sharon Cooper were also able to eliminate a host of other natural explanations for why the blind can see at death, such as dream-based explanations, sensory cueing, skin-based sight, and many others.
Why This Is Important!
Ring and Cooper searched thoroughly for a naturalistic explanation for their findings. This is not surprising: they are scientists, and scientists in most cases argue that the proper focus of science is the realm of the natural. Many, and possibly most, scientists would argue further that there is no supernatural realm. However, it is not clear what Ring and Cooper's conclusion is in this case. Their determination as to the cause of sight in the blind near death seems somewhat muddled, which is understandable given the clearly supernatural implications of this research.
How can the blind possibly see at death? They were blind before and they are blind after the NDE, yet during the death period some can describe what only a sighted person can perceive. They cannot see these things with their eyes, so how do the blind transcend their blindness near death?
The answer is obvious but, for many, hard to swallow. This is clear evidence for the existence of a transcendental part of the human being. It points to the existence of a Soul or Spirit. Vision can be impaired in the physical body, with blindness resulting, but when the Soul is separated from the body this impediment is removed, because the Soul does not depend on the physical body for vision. Therefore, during death, sight is possible, but it leaves again upon resuscitation.
Saturday, June 9, 2007
An Amazing Genetic Coding Discovery
Imagine being hired to write three different books. The first is a mystery novel, the second a history of World War II, and the third a French cookbook. A lot of work, to be sure, but you decide that you are up to the task. However, there is one catch: all three books must be contained within the exact same text. The only difference is that the novel will begin with the first letter of the first word, the history will begin with the second letter of the first word, and the cookbook will begin with the third letter of the first word.
And, of course, all the words must be spelled exactly correctly and all punctuation must be in the proper place, because even one error will result in an error in all three books. So if the first word in the first sentence of the mystery novel is "Help", the first word of the history must begin with "elp" and the cookbook must begin with "lp".
Try to write even one short sentence in this fashion. "Impossible," you say. Yes, for a human mind it is far too complex a task. It is a task too complex even for the most advanced supercomputers. The information density of a system that encodes three different readings in one text simultaneously is truly mind-boggling. For these reasons and others, geneticists had never bothered, until now, to examine the DNA in eukaryotic cells to see whether it was performing this task.
To their astonishment, researchers at Pennsylvania State University, U.C. San Diego, and Vrije Universiteit in the Netherlands have discovered this process in four different species, including humans. The process, labeled Dual Coding or Alternate Reading Frames, is taking place in at least 40 of the genes in your cells right now.
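To make the idea of alternate reading frames concrete, here is a minimal Python sketch. The sequence and the tiny codon table are invented for illustration only; this is simply a toy demonstration of frame-shifted reading, not the statistical analysis the researchers performed. It translates the same DNA text starting at offsets 0, 1, and 2, showing how one string of letters can carry three different protein "readings."

    # Toy illustration of alternate reading frames: the same DNA text,
    # read from three different starting offsets, yields three different
    # protein "messages". Sequence and mini codon table are invented
    # for illustration only (the letter codes follow the standard genetic code).
    CODON_TABLE = {
        "ATG": "M", "GCT": "A", "GAA": "E", "CTG": "L", "CAA": "Q",
        "ACG": "T", "CCT": "P", "TGG": "W", "AAC": "N", "TGC": "C",
        "AAA": "K", "CGC": "R", "GGC": "G", "TGA": "*", "ACT": "T",
        "GCA": "A", "GCC": "A",
    }

    def translate(dna, frame):
        """Translate a DNA string starting at offset 0, 1, or 2."""
        protein = []
        for i in range(frame, len(dna) - 2, 3):
            protein.append(CODON_TABLE.get(dna[i:i + 3], "?"))
        return "".join(protein)

    sequence = "ATGGCTGAACTGCAAACGCCT"   # made-up example sequence

    for frame in range(3):
        print("frame", frame, ":", translate(sequence, frame))

Running it prints three different amino-acid strings from the one sequence; notice that changing a single DNA letter would alter all three readings at once, which is exactly the constraint the three-books analogy above describes.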
Why This Is Important
Let the article published in May 2007 in PLoS Computational Biology speak for itself: "Dual Coding Is Virtually Impossible by Chance." (The capitals are not mine.) The researchers, using new techniques of statistical analysis, further concluded: "Here we show that although dual coding is nearly impossible by chance, a number of human transcripts contain overlapping coding regions," and "Maintenance of dual coding is evolutionarily costly and their occurrence by chance is statistically improbable." When the authors use the word "improbable" here, they really mean, as they stated earlier, that the chances of the phenomenon occurring by chance are so small statistically that in reality it is not going to happen.
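To get a rough feel for the scale of improbability involved, consider a simple back-of-envelope sketch (my own illustration, not the statistical method the authors used). Three of the 64 possible codons are stop signals, so if an alternate reading frame were just random text, each of its codons would avoid a premature stop with probability 61/64, and the frame must stay open along its entire length before we even ask whether it encodes a working protein.

    # Back-of-envelope sketch (not the authors' analysis): probability that
    # a random alternate reading frame of n codons contains no stop codon,
    # assuming codons are independent and uniformly distributed -- a big
    # simplification, but it shows how quickly the odds collapse with length.
    p_no_stop = 61 / 64          # 3 of the 64 codons are stop signals

    for n in (100, 300, 1000):   # rough protein-coding lengths, in codons
        p_open = p_no_stop ** n
        print(f"{n} codons: P(no premature stop) = {p_open:.2e}")

For a frame a few hundred codons long this already falls into the one-in-a-million range, and staying free of stop codons is only the weakest of the constraints dual coding has to satisfy.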
So if this dual coding is "Impossible by Chance," how does it come into being? The fact that these scientists had the courage to publish this result is shocking in itself; they may already have put their jobs and careers in jeopardy. So it should be no surprise that the authors do not deal with the obvious design implications of their work. But there are only two possible explanations for any natural phenomenon: either it comes about as the result of a meaningless, purposeless process (as materialism demands), or it comes about as the result of purposeful design. The design inference here is very strong.
Source:
Wen-Yu Chung, Samir Wadhawan, Radek Szklarczyk, Sergei Pond, and Anton Nekrutenko, "A First Look at ARFome: Dual-Coding Genes in Mammalian Genomes", PLoS Computational Biology, May 18, 2007, Vol. 3, No. 5.
Tuesday, April 3, 2007
New Light on the Dark Ages
"Decline and Fall of the Roman Empire". The book was an immediate sensation and has been

Gibbon's main thrust in the book was an attempt to answer one very important question: what caused the fall of the mighty Roman Empire? Rome had lasted for over a thousand years and had encompassed the entire known Western "civilized" world.
The "Decline and Fall of the Roman Empire" is
considered one of the seminal works of the
"Enlightenment" (That period of time starting in the mid eighteenth century when Rationalism or more broadly put scientific thought emerged as a major philosophical movement). Gibbons work became famous not only because it was one of the first works of modern historiography but also because of its conclusion that one of the chief culprits in the fall of Rome was Christianity.
The demise of Rome has almost universally been regarded as a major disaster for the Western world, because the loss of the Roman Empire brought a loss of Roman knowledge and organizational ability and consequently resulted in a "Dark Age". Following Gibbon's lead came many similar works that over time produced the now popular image of the Dark Ages as a time in which ignorant, bigoted Christians ran rampant over Europe, destroying what was left of the Roman legacy of learning, art, and technology. Christianity has thus been widely thought of as largely responsible for the Dark Ages.
However, new archaeological work is showing that quite the opposite is true. The November/December issue of Archaeology magazine reports on a group of studies conducted by the European Science Foundation and others. While Gibbon and those who followed his lead saw Christianity as the "source of decay" in the ancient world, recent scholarship has concluded that "the Christian Church was actually the mediator of the continuity from late antiquity to early medieval Europe". The eminent historian Judith Herrin of King's College London argues that, rather than hastening or causing the fall of Rome, Christianity actually delayed Rome's demise. Her work also concludes that it was the Christians of the Roman Empire who were largely responsible for the transmission of Roman culture to its eventual medieval form.
For example, it has long been believed that the collapse of the Roman presence in Britain was marked by a sharp discontinuity in Britain's culture and economy. However, new research, such as Herrin's and that of John Blair of The Queen's College, Oxford, shows that the transition from Roman to British culture was gradual and mediated by the organized Christian presence. Blair, using primary archaeological evidence, argues that Britain's towns, and eventually its cities, grew up around church monasteries. The evidence indicates that monks were sent into Britain from the early ecclesiastical centers. These monks established monasteries, which included churches and living compounds, and these eventually grew into towns and cities. They were the only "urban" settlements in Britain through the eighth century and were the vehicle by which Roman learning was preserved and spread through Britain. Early British royalty even settled on the periphery of these Christian proto-villages, and not vice versa as had been believed. This explains the persistent mystery of why churches, and not royal residences, are the focus of the downtown areas of British cities.
Why This Is Important!
For over two hundred years it has been widely believed by scholars and others following the lead of Gibbon that Christianity retarded the economic and cultural development of Europe. This notion is often seen in today's popular media, where the Christians of the Dark Ages are portrayed quite negatively, much as Gibbon characterized them. Instead, new research shows that it was these early Christians who were primarily responsible for preserving and carrying forward Rome's legacy to the modern world. Rather than being responsible for the loss of Roman achievements, Christians were the ones who salvaged what little they could and preserved it for us.
This emerging view is also consistent with the work of the great scholar G.K. Chesterton and others who have argued that it was these early Christians who fought against the rise of Eastern despotism in Europe, a rise that in all likelihood would have followed the fall of Rome. In their opposition to Eastern forms of government, the groundwork of democracy was preserved. It was upon this resistance to theocratic and state despotism that later democratic institutions were able to emerge.