God (cont.)

Maybe God really does exist. Maybe he’s simply decided to hide the evidence of his existence for whatever reason, and he only wants to reveal himself to us after we die. What if that actually is the case? What if nonbelievers like myself, for all our meticulous reasoning, are just flat-out wrong? Well, according to Christianity, the proper punishment for adopting a careful mindset of “I’ll believe it when I see it” (which even Christians would consider to be a perfectly commendable approach toward any religion other than Christianity) is an eternity of fiery torment. But personally, I just can’t square the idea of a God that really is all-knowing and all-loving, as Christianity claims, with the idea that he would want to be so petty and shortsighted and cruel. (As Cheryl Cohen-Greene puts it, “I’ve come to a place where I don’t believe in a God that’s less compassionate than I am.”) Wouldn’t a truly compassionate God understand the difference between willful malice and simple confusion? Wouldn’t he have enough room in his heart to allow for an honest mistake?

TheoreticalBullshit offers his take on this:

Obviously, we can’t say with 100% certainty that God doesn’t exist. We can’t say with 100% certainty that anything exists or doesn’t exist (aside from existence itself). Still, I think it’s fair to say that we can at least achieve functional certainty in a lot of cases. It might not be possible, for instance, to prove with 100% mathematical certainty that dragons and fairies don’t exist – but that doesn’t mean we can’t say “I know dragons and fairies don’t exist” with as much certainty as it’s possible to say that about anything. And I consider supernatural entities like ghosts and spirits and angels and gods to fall into that same category. It might become possible to demonstrate their existence at some point in the future (and we should always remain open to that possibility); but until that point arrives, we ought to just stick with what scientists call the null hypothesis – the default assumption in any scientific study that no real phenomenon is actually occurring, and that none of the proposed hypotheses are true, until demonstrated otherwise.

After all, regardless of my own personal opinion, the question of whether God exists is, like I said before, one of objective reality – and that means it’s a question to be settled empirically, not one whose answer can just be asserted on faith. The tools and techniques of scientific inquiry can and should be brought to bear on this question, because figuring out what’s real is the whole point of science – and if God is real, then that would be a scientifically true fact too. As Harris writes:

Science, in the broadest sense, includes all reasonable claims to knowledge about ourselves and the world. If there were good reasons to believe that Jesus was born of a virgin, or that Muhammad flew to heaven on a winged horse, these beliefs would necessarily form part of our rational description of the universe. [In contrast,] faith is nothing more than the license that religious people give one another to believe such propositions when reasons fail.

Science is often disparaged by believers as being a kind of “faith” in itself; empirically-minded people are accused of having their own “religion” of dogmatically rejecting all religious claims on principle. But this is a misunderstanding of what science actually is and how it works. Science isn’t a specific belief or a set of beliefs; it’s a tool – a process for discovering which ideas are true and which ones aren’t. It’s simply an epistemic approach of only accepting empirical claims after they’ve been empirically demonstrated; it doesn’t say in advance what the answers to those questions have to be. There’s no permanent, unalterable scientific canon in the same way that there’s a permanent, unalterable biblical canon; science adjusts its understanding of what’s true in accordance with the evidence it sees, and as new evidence comes in, that understanding is continually updated.

Here’s Barker again:

Religionists sometimes accuse nonbelievers of having faith. Every time you flip a light switch you exercise faith, they say. But this is not faith; it is a rational expectation based on experience and knowledge of electricity. If the light fails to turn on, my worldview is not shattered. I expect that the light will sometimes fail due to a burnt-out bulb, blown circuit or other natural cause. This is the opposite of religious faith. The light does not turn on because of my expectation. Rather, my expectation is based on experience. If lights were to begin failing most of the time, I would have to adjust my expectations. (Or adjust my electrical system.) But religious faith is not adjustable. It remains strong in spite of a lack of evidence, or in spite of contrary evidence.

Sometimes we nonbelievers might express faith, but when we do it we are not pretending that our faith makes the statement true. We often assert trust or confidence in something that is not known 100 percent. For example, I respect my dad tremendously, from what I know of him. I have “faith” and trust in his character. But this does not mean I know everything, nor does it mean I can’t be wrong about him. It is possible that my dad is actually a serial murderer who has not been caught yet, though I doubt it. (I hope he smiles when he reads this.) The point is that although I do often express sentiments with near absolute confidence, I am open to the possibility that I might be wrong, admitting that my faith claim is not a knowledge claim. My dad has earned my respect. God has not. Scientists do something similar when they claim that a “fact” can be asserted when the evidence passes a certain threshold, such as the common 95 percent level. In fact, I think all knowledge is like this: we probably can’t say we know a thing with 100 percent certainty, except maybe “I think, therefore I am,” and even that has its critics. But scientific confidence is not faith – it is a tentative acceptance of the truth of a hypothesis that has been repeatedly tested, and it is subject to being overturned in the light of new evidence. The data and methods of testing are publicized, peer reviewed and open to any of us for examination. This is nothing at all like religious faith, which makes a leap from possibility to fact. Or, often, from impossibility to fact.

In short, then, the biggest reason why we should trust the findings of science over those of faith is simply that the findings of science can actually be verified. If you ever have doubts about whether some particular scientific finding is really true, you can go out and perform your own tests to confirm or falsify it. As long as you’re willing to put in the necessary time (and obtain the right equipment), you can independently verify it for yourself. Or you might even disprove some longstanding misconception and earn yourself a Nobel Prize. And science not only welcomes that kind of scrutiny – it’s defined by it. Religious belief, on the other hand, declares itself exempt from empirical verification or falsification; it simply tells you that it’s true, and you have to believe it on that basis alone. Its primary mechanism is faith – and accordingly, the “truths” it provides vary wildly from person to person. But science’s primary mechanisms are critical examination and testing – and accordingly, the truths it provides are universally replicable. So when religious believers accuse empirically-minded nonbelievers of putting just as much “faith” in the word of scientists as the believers themselves put in their holy books, that accusation just isn’t true. The whole point of science is that it frees you up from having to accept things solely on faith.

Ricky Gervais illustrates the point this way:

Science is constantly proved all the time. You see, if we [took] something like any fiction – any holy book [or] any other fiction – and destroyed it, [then] in a thousand years’ time that wouldn’t come back just as it was. [There might be other fictions or religions, but they’d be different.] Whereas if we took every science book and every fact, and destroyed them all, in a thousand years they’d all be back – because all the same tests would be the same result.

To put it another way: The reason scientific inquiry is a trustworthy source of knowledge is that it works – not just for some people some of the time, but universally. You might be tempted to respond, “But what about hundreds of years ago when so-called ‘scientists’ said that the world was flat and that objects fell to the earth because they ‘belonged’ there, not because of gravity?” That’s a fair question – but the reason why the “scientists” of those earlier centuries came to such flawed conclusions was that they weren’t actually doing legitimate science, with controlled experiments and such; what they were doing was more like speculative philosophy. It was only when their descendants actually did start using scientific techniques that they were able to disconfirm those old mistaken ideas. And that’s the key point here: the problem with the flat-Earth model and other such mistaken beliefs wasn’t that there was too much science going on; it’s that there was too little. Science is designed to be self-correcting; so while individual researchers might sometimes make mistakes, the scientific process itself ultimately fixes those mistakes and moves us ever closer to the truth. Chuck Klosterman (in conversation with Neil deGrasse Tyson) describes how the field of classical physics (i.e. non-quantum, non-relativistic physics) exemplifies this:

Will our current understanding of how space and time function eventually seem as absurd as Aristotle’s assertion that a brick doesn’t float because the ground is the “natural” place a brick wants to be?

No. (Or so I am told.)

“The only examples you can give [in classical physics] of complete shifts in widely accepted beliefs – beliefs being completely thrown out the window – are from before 1600,” says superstar astrophysicist Neil deGrasse Tyson. […] “You mentioned Aristotle, for example. You could also mention Copernicus and the Copernican Revolution. That’s all before 1600. What was different from 1600 onward was how science got conducted. Science gets conducted by experiment. There is no truth that does not exist without experimental verification of that truth. And not only one person’s experiment, but an ensemble of experiments testing the same idea. And only when an ensemble of experiments statistically agree do we then talk about an emerging truth within science. And that emerging truth does not change, because it was verified. Previous to 1600 – before Galileo figured out that experiments matter – Aristotle had no clue about experiments, so I guess we can’t blame him. Though he was so influential and so authoritative, one might say some damage was done, because of how much confidence people placed in his writing and how smart he was and how deeply he thought about the world… I will add that in 1603 the microscope was invented, and in 1609 the telescope was invented. So these things gave us tools to replace our own senses, because our own senses are quite feeble when it comes to recording objective reality. So it’s not like this is a policy. This is, ‘Holy shit, this really works. I can establish an objective truth that’s not a function of my state of mind, and you can do a different experiment and come up with the same result.’ Thus was born the modern era of science.”

[…]

The “history of ideas,” as [Brian] Greene notes, is a pattern of error, with each new generation reframing and correcting the mistakes of the one that came before. But “not in [classical] physics, and not since 1600,” insists Tyson. In the ancient world, science was fundamentally connected to philosophy. Since the age of Newton, it’s become fundamentally connected to math. And in any situation where the math zeroes out, the possibility of overturning the idea becomes borderline impossible. We don’t know – and we can’t know – if the laws of physics are the same everywhere in the universe, because we can’t access most of the universe. But there are compelling reasons to believe this is indeed the case, and those reasons can’t be marginalized as egocentric constructions that will wax and wane with the attitudes of man. Tyson uses an example from 1846, during a period when the laws of Newton had seemed to reach their breaking point. For reasons no one could comprehend, Newtonian principles were failing to describe the orbit of Uranus. The natural conclusion was that the laws of physics must work only within the inner solar system (and since Uranus represented the known edge of that system, it must be operating under a different set of rules).

“But then,” Tyson explains, “someone said: ‘Maybe Newton’s laws still work. Maybe there’s an unseen force of gravity operating on this planet that we have not accounted for in our equations.’ So let’s assume Newton’s law is correct and ask, ‘If there is a hidden force of gravity, where would that force be coming from? Maybe it’s coming from a planet we have yet to discover.’ This is a very difficult math problem, because it’s one thing to say, ‘Here’s a planetary mass and here’s the value of its gravity.’ Now we’re saying we have the value of gravity, so let’s infer the existence of a mass. In math, this is called an inversion problem, which is way harder than starting with the object and calculating its gravitational field. But great mathematicians engaged in this, and they said, ‘We predict, based on Newton’s laws that work on the inner solar system, that if Newton’s laws are just as accurate on Uranus as they are anywhere else, there ought to be a planet right here – go look for it.’ And the very night they put a telescope in that part of the sky, they discovered the planet Neptune.”

The reason this anecdote is so significant is the sequence. It’s easy to discover a new planet and then work up the math proving that it’s there; it’s quite another to mathematically insist a massive undiscovered planet should be precisely where it ends up being. This is a different level of correctness. It’s not interpretative, because numbers have no agenda, no sense of history, and no sense of humor. The Pythagorean theorem doesn’t need the existence of Mr. Pythagoras in order to work exactly as it does.

And it’s this consistency in predicting and discovering the truth that makes science the superior method for doing so. As Harris writes:

Religion once offered answers to many questions that have now been ceded to the care of science. This process of scientific conquest and religious forfeiture has been relentless, one directional, and utterly predictable. As it turns out, real knowledge, being both valid and verifiable across cultures, is the only remedy for religious discord. Muslims and Christians cannot disagree about the causes of cholera, for instance, because whatever their traditions might say about infectious disease, a genuine understanding of cholera has arrived from another quarter. Epidemiology trumps religious superstition (eventually), especially when people are watching their children die. This is where our hope for a truly nonsectarian future lies: when things matter, people tend to want to understand what is actually going on in the world. Science delivers this understanding in torrents; it also offers an honest appraisal of its current limitations. Religion fails on both counts.

He adds, “I would challenge anyone […] to think of a question [for] which we once had a scientific answer, however inadequate, but for which the best answer is now a religious one.”

Religion may have been the only game in town for a long time; but we’ve reached a point now where we can finally do better, both as a civilization and as individuals. And if we really care about truth, we should embrace the opportunity to do so.
