The modern scientific method dates back to Isaac Newton who, ironically, only went to university because his family thought him too poor a farmer to make anything of himself – so they sent him to read theology (Weisstein, 2011). That climate of research and inquiry produced one of the most transformational minds in the history of science. In addition to his laws of motion, Newton developed the basis of a scientific method that still guides research and experimentation in the present day.
Newton’s scientific method boiled down to four rules that, he asserted, should underlie all endeavors involving scientific reasoning. The first is that one should admit only those causes that are both true and sufficient to explain natural phenomena. The second is that, as far as possible, the same causes should be assigned to the same natural effects. The third is that qualities found in all bodies within the reach of experiment should be assumed to be universal. The fourth is that propositions inferred from natural phenomena should be considered valid until contradicted by other observations (Weisstein, 2011). While these rules may seem obvious to us today, at the time they were presented they completely changed the way people viewed the world, taking the existing principles of scientific investigation, which dated back to the days of Aristotle, and turning them on their heads. To summarize his philosophy, Newton wrote that “[a]s in mathematics, so in natural philosophy the investigation of difficult things by the method of analysis ought ever to precede the method of composition” (Weisstein, 2011). This analytic approach to scientific (and other) questions took subjectivity out of the equation, ensuring that scientific results would be verifiable and, in general terms, far more reliable than they had ever been before.
It did not take much time to move from a new system of scientific inquiry to a new way of viewing the individual’s place in society. The Age of Reason, which started to take shape under the guidance of Newton, Descartes and other thinkers, called into question the entire system of morals and ethics. After all, if the Church was wrong about the earth being at the center of the universe, could priests and ministers also be wrong about the afterlife? If weather phenomena were the result of atmospheric conditions, instead of an answer to prayer, did God really deserve the place of priority that so many would give him?
As a result, individualism began to flourish in the Newtonian age to a degree not seen since the days of ancient Greece. Homer’s Iliad and Odyssey provide powerful portraits of individual characters daring to undertake amazing quests without much consideration for the greater good of those around them. Menelaus, for example, summons the Achaians to war because his wife left with a foreign prince. Achilles, irritated because Agamemnon takes away his war-won concubine, sulks in the Greek tents, refusing to join the battle against Troy (yet staying in camp rather than sailing home) until his friend Patroklos, wearing Achilles’ armor, is slain in battle. Odysseus thumbs his nose at the gods by claiming that his own heroism brought him through the Trojan conflict; as a result, Poseidon keeps him from returning home for ten years. Yet it is Odysseus’ exploits that are the focus of the story, while Poseidon comes off as petty and self-serving. Individualism flourished during this period because the defining value was reason. The instrument of reason, by and large, is the brain; since the brain can reach its own conclusions, the individual becomes the most important unit. In more affectively driven cultures, the group matters more than the individual, because emotions propel us, by and large, toward the group. Instead of conclusions, affective processing leads to a system of obligations – and obligations are almost never toward oneself. The period between the waning of Greek and Roman hegemony and the rise of Newtonian thought was characterized by the ascendancy of obligation – to the Church, to one’s feudal superiors, to family. The network of relationships that became feudalism revolved around duty toward others.
When Newtonian thought came into widespread acceptance, though, the logical regained its priority over the affective. After all, wouldn’t a more proper response to the Black Plague have been to isolate the cause of the infection through investigation and observation? Instead, many people organized themselves into groups of “flagellants,” gathering in circles and whipping one another on the assumption that the plague was God’s punishment. These zealots would whip until blood flowed – and yet many of them still fell to the plague. Clearly, this was not the work of reason.
Moving from individualism and logic to liberalism, though, was another step altogether. Saying that the individual is capable of independent reason is one thing; saying that the individual is the highest arbiter of right and wrong is another. Just because the Church proved a cruel wielder of power during the Dark Ages and the Middle Ages does not mean that there is no larger source of ethical or moral teaching than the person. However, this idea took hold – perhaps most violently in the years following the fall of the Bourbons in revolutionary France. Because the individual was now the sole arbiter of right and wrong, violence ensued on a large and horrific scale, and it did not cease until Napoleon seized power and gave the country an autocratic ruler once again. One question that opponents of liberalism rightly ask is “how individuals are constituted and why the rights we attribute to individuals are asserted to be of the highest moral and political importance” (Fairfield, 2000, p. 89).
One problem with the full expression of the Newtonian/Cartesian/liberal viewpoint is that it neglects many possibilities through its insistence on the purely rational and demonstrable. The modern scientific method has three basic stages: forming a theory or theories, collecting data, and interpreting that data. While the collection of data is purely rational and leaves no room for the notion of supernatural agency, that sort of influence remains possible in the formation of theories and the interpretation of data. While Newton’s rules for scientific inquiry insisted that only identifiable causes be linked to their effects, it can also be argued that, while that approach might yield a sufficient explanation of the “mechanical” aspects of a process (Corey, 1999), there are other levels of agency that can be explored in a scientific investigation. Knowing the agents behind a mechanical process can be even more helpful to the scientist than knowing the workings alone, depending on the purpose of the investigation. If the scientist only wants to know how a process works, or what conditions must be in place for a certain phenomenon to occur, then a more limited explanation of phenomena can be acceptable. However, if the scope of the investigation expands to take in the metaphysical milieu in which a phenomenon can appear, then a larger explanation is necessary.
For example, if a boy runs through a field during an electrical storm, holding aloft a pitching wedge, we should not be surprised to see that boy get struck by lightning. Ever since the days of Benjamin Franklin, we have known that lightning carries an electrical charge and that the charge is drawn to metal – particularly metal pointing upward into the air. This is the scientific principle behind the lightning rod, which is designed to conduct the electricity in a lightning strike safely to ground instead of letting it burn the house down.
But what about the metaphysical considerations in such a proposition? After all, not everyone who runs around with metal during a thunderstorm gets struck by lightning. Scientists who have investigated converting the enormous electrical charge in a lightning bolt into usable power have run into several obstacles; for example, strikes close enough to any single collection point are so infrequent that building such receptacles would not be worth the cost in the first place. In other words, lightning really does not strike in the same place twice – at least, not very often.
So, back to the boy who’s running in the field. Newton’s rules help us find the cause of the strike – the presence of metal in an electrical field, connected to a very good conductor known as a person. A larger metaphysical inquiry, though, might ask questions that the mechanical account of electricity does not answer. Why, for example, did that particular boy get hit by lightning? Why did the lightning not come from some other part of the storm? If the boy was killed by the strike, why was he not spared? If he miraculously survived, what supernatural agency was at work in the situation?
And so we come to the limits of the Newtonian view of how the world works. While his rules did change the way people viewed scientific investigation, the limitations that the rules of the modern scientific method placed on the unseen – and the unprovable – removed a great deal of the richness from explanations of how things truly work, and why they work that way.
Another example worth considering is the way the narrator of Twain’s A Connecticut Yankee in King Arthur’s Court “proves” to his audience that he is a deity. Having memorized the dates of historical eclipses from perusing his almanac (and having been fortunate enough that one took place at a highly fortuitous time for him), he tells his doubters that he will blot out the sun at a particular date and time. When the eclipse happens at the appointed moment, the fear that breaks out is so great that the narrator is able to leverage considerable power and respect for himself before the eclipse ends and the sun slowly returns (Twain, c2011).
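The Yankee’s trick depends on the fact that eclipses recur on predictable cycles. One well-known cycle, the saros, separates successive eclipses in the same series by roughly 6,585 days and 8 hours (about 18 years and 11 days). As a rough sketch only – using the June 21, A.D. 528 date the novel assigns the eclipse, and ignoring calendar-reform details – one could step forward through such a series like this:

```python
from datetime import datetime, timedelta

# Saros cycle: eclipses in the same series repeat roughly every
# 6585 days and 8 hours (about 18 years and 11 days).
SAROS = timedelta(days=6585, hours=8)

# The date the novel assigns the eclipse (June 21, A.D. 528).
eclipse = datetime(528, 6, 21)

# Step forward to the next few eclipses in the same saros series.
for _ in range(3):
    eclipse += SAROS
    print(eclipse.date())
```

This is exactly the kind of mechanical regularity an almanac encodes: no divine agency is required to know when the sun will be blotted out.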
For pre-Newtonian Britons, the possibility that a deity could blot out the sun was much more real than it would be for Newtonians. Before the Age of Reason, superstition and fear were two primary motivators – and so it is in Twain’s novel. Nowadays, eclipses are not, at least in Western society, a cause for fear. Instead, schoolchildren run outside, having just poked a small hole in a shoebox, so that they can observe the eclipse without hurting their eyes. Having the opportunity to investigate and observe this phenomenon is a definite improvement in the quality of existence. However, if one were to wonder what the divine purpose of a system that includes eclipses might be, or how much of a role divine intervention plays in climatic events, knowing the simple mechanics of the eclipse would not provide an answer. The mechanics provide a starting point – but not a conclusion.
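The shoebox viewer itself is an exercise in the plainest mechanics: pinhole projection. The sun’s image on the back of the box has a diameter of roughly the box length times the sun’s angular size (about 0.53 degrees). A quick sketch of the arithmetic – the 30 cm box length is merely an illustrative assumption:

```python
import math

SUN_ANGULAR_DIAMETER_DEG = 0.53  # the sun's apparent size in the sky

def pinhole_image_diameter(box_length_m: float) -> float:
    """Diameter (in meters) of the sun's image projected in a pinhole box."""
    return box_length_m * math.tan(math.radians(SUN_ANGULAR_DIAMETER_DEG))

# A typical 30 cm shoebox projects a solar image about 3 mm across.
print(f"{pinhole_image_diameter(0.30) * 1000:.1f} mm")
```

A few millimeters is small but ample to watch the moon’s disk take its bite out of the sun – mechanics enough for observation, if not for meaning.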
How to bridge the gap? The “new sciences,” which began to gain credence near the end of the nineteenth century, sought to turn on religion and spirituality the same analysis that scientists had been applying to microscopic bacteria, tornadoes and other natural phenomena since the days of Newton. This movement into the study of comparative religion, along with attempts to universalize and explain the religious impulse, began with the work of William James (The Varieties of Religious Experience) and the archetypal writings of Carl Gustav Jung. These new sciences sought to give the religious impulse as rational a mechanism as the lightning bolt streaking toward an upraised golf club. While the latter is fairly easy to explain, though, the former is not. The lightning bolt loses none of its power simply because we know where it comes from and how it works, though it does lose some of its associations with fear. The religious impulse, however, while arguably also a natural phenomenon, is not as easily reduced to a mechanism. If rationalists cannot accept the notion that God created people and that, as a result, people seek God, then the study of the religious impulse will likely be extremely reductive.
If one considers the true nature of this reduction, though, its effects on humanity’s general worldview at this point in the 21st century become easier to understand. By removing the wondrous and miraculous from existence, Newtonians create a world in which we are much less likely to feel the fears and thrills associated with superstition; however, we are also much less open to experiences that require faith in the unseen to be fully appreciated. God has become much smaller in the 20th and 21st centuries because the parts of the world we can explain mechanically have expanded dramatically. With ostensibly much less of the world left for God to involve Himself with, humanity’s sense of His power appears to be dwindling over time. If this were not true, then church services would still focus on the majesty of God instead of tailoring self-help lectures to include a small bit of Scripture. If this were not true, then we would be able to agree on more moral universals than the fact that murder is always wrong. (Or do we even believe this? The definition of “murder” appears to vary in our society, depending on the point at which one believes actual life begins.) Places of religious worship, moreover, would not increasingly resemble the places where we shop; if our view of God still held that He is a being of ultimate power, churches would still look unlike any other buildings in our towns and cities, because we would still expect to find experiences in those buildings unlike the experiences we find anywhere else. The worldview of modern science and liberalism, which seeks to put faith and other metaphysical elements under the same umbrella as Newton’s experience with the apple tree and Archimedes’ moment in the bathtub, removes much of the fear from existence. The problem is that it also removes the wonder – and the miraculous.
Corey, M. (1999). Supernatural agency and the modern scientific method. Web. Retrieved 8 November 2011 from http://ai.clm.org/articles/corey_supernatural.html

Fairfield, P. (2000). Moral selfhood in the liberal tradition: The politics of individuality. Toronto: University of Toronto Press.

Twain, M. (c2011). A Connecticut Yankee in King Arthur's court. New York: Simon and Schuster.

Weisstein, E. (2011). Newton, Isaac. Web. Retrieved 8 November 2011 from

Why can't we capture lightning and convert it into usable electricity? (2007). Boston Globe, 29 October 2007. Web. Retrieved 8 November 2011 from