We’re watching a political party implode.

Unless you've been cut off from all human contact in the backwoods of Maine for the past 6 or 8 months, you know exactly what I'm talking about: the Republican party's freak show of a primary process, complete with debates that are little more than potty-mouthed, insult-laden shouting matches better befitting a bunch of 6-year-olds. And they're not even very creative. If the best they can do is shout over each other, resort to ad-hominem attacks and spew absurd platitudes ("When I'm president it's gonna be GREAT. All my policies are the BEST. You'll see"), there's a serious problem. So far, I've seen no workable policy proposals (other than the elimination of everything Obama has accomplished in the last 8 years), let alone a serious policy debate. And the latest nonsense is that Trump is going to get the Mexican government to pay for his absurd wall on our border with Mexico by twisting their arm. Or something equally juvenile.

The last debate would have been hilarious if it weren't part of the process of selecting a candidate for the most powerful job in the world. This person will literally have the capacity to destroy civilization and send the few human survivors back to the stone age, and they're calling each other names and comparing dick size.

In the run-up to the actual primary process, the conventional wisdom was that it would be Jeb Bush vs Hillary Clinton. On the Republican side, Ted Cruz, Marco Rubio and a whole bunch of others were expected to make it interesting, and for the Democrats it was always Hillary's to lose, but at the end of the day it was widely expected to be a contest between the two party Brahmins. As it turned out, Bernie Sanders has made a surprisingly strong showing against Clinton, while virtually everyone on the Republican side has been stunned by Trump's dominance. He was expected to self-immolate very early on, but the more outrageous and hate-filled the things that come out of his mouth, the better his followers like it. He's even said they are so loyal he could shoot someone in the middle of Times Square and they'd still vote for him. I don't know which is scarier: the disregard he has for his own followers' critical thinking skills, or the fact that he was probably correct.

Anyhow, it was Jeb who never got off the ground, and it currently looks like the Republican nomination is Trump's to lose. So the leaders of the Republican Party are throwing everything they have at Trump to try to prevent that from happening. They see that he represents the loss of control of the party, and that terrifies them.

Of course the proximal cause is Trump's ascendancy and apparently clear path to the Republican nomination, but as I imply, that's just the most visible symptom. I think this started after the shellacking they got with Barry Goldwater in the presidential election of 1964, when the Republican party wonks realized that they had to expand their constituency to have a hope of winning (let alone keeping) the White House and Congress in the future. The simple math revealed that there were more Democrats than Republicans, and that the gap was only going to become more pronounced as the demographics of the country shifted. They were going to have to somehow expand their appeal beyond their base of white (predominately male), wealthy upper- and middle-class voters in order to keep from being forever marginalized.

So they sold their souls to the Devil.

Or more accurately, to the Fundamentalist Christians. To appeal to them, they decided to emphasize what their focus groups told them were hot buttons for this group, so they positioned themselves as the party that would protect people from the Godless Communist hordes (strong on national defense), the criminal drug-users (strong on crime) and the destruction of the American family (anti-abortion, pro-traditional family with Dad as the breadwinner head of the house and Mom in a stay-at-home supportive role). With the exception of that last, it wasn't much of a stretch from earlier positions; and as I think about it, even the pro-traditional family stance was really just finding a parade to march in front of. After all, who would be opposed to a strong family? But in order to make that work, they had to make it appear that they were most closely aligned with that demographic; no small feat when their party was predominately rich, white and male, and their target was working class, less educated, and both men and women. The only commonality up until then was race. So they needed to stir up fear and create a siege mentality by saying that family values and Christianity were (and are) under attack. From whom or what is never clearly stated, except a vague "elitist secular agenda," whatever that means.

We are constantly reminded that "This country was founded on Christianity and the Bible!" and of course this is exactly wrong; the Founding Fathers may have incorporated Christian principles into the Bill of Rights, but they specifically and painstakingly avoided an official religion, be it one of the many branches of Christianity or any other, monotheistic or polytheistic. The Treaty of Tripoli (penned in 1796, ratified by the Senate and signed by President John Adams) goes so far as to explicitly state that "The Government of the United States of America is not, in any sense, founded on the Christian religion" (Article 11). Not sure how it could have been stated more plainly.

The irony is of course lost on the Wing Nuts out there when they trumpet our “Christian Constitutional Foundation.”

Anyhow, back to our Republican decision 50-some-odd years ago and the implosion of today. So they stirred up the evangelicals, got them to vote reliably Republican with nonsense about an attack on Christian Values from some unnamed "elitist" group, and used their votes to stay in power. We've had Reagan, two Bushes with a Gingrich-led rebellion in between, and now Ted Cruz. Today a person running on the Republican ticket has no hope unless they declare themselves unequivocally anti-abortion, pro-gun and pro Christ Jesus.

The problem is that the Republican wonks post-Goldwater never really wanted the party to become so socially conservative. True, most Americans (myself among them) are not pro-abortion. I think life is precious and should be treated as such. But I'm MUCH more opposed to being told by a politician when life begins, or what my wife and I can or cannot do in what is at its core a profoundly personal decision. Those guys 50-plus years ago wanted the votes that would allow them to get elected, but they wanted to keep the party the way it was: fiscally conservative, favoring (and run by) wealthy, nearly all white, old men.

They are horrified by Trump and the fact that he's taken over the party. And Ted Cruz would be just as bad, from their perspective; he's known for his refusal to compromise on anything, even within his own party. Lindsey Graham said that the Trump/Cruz choice is like deciding whether you'd rather be shot or poisoned. That's why we saw Mitt Romney being trotted out to bash Trump. Not because they want Cruz to win, but because almost anyone would be better than Trump.

Except, of course Hillary. Or Bernie. Either way, it’s a disaster for the Republican Establishment. But they brought it on themselves.

Be careful what you wish for.

Posted in General commentary on the world as I see it..., Political commentary

What happens when facts lead us to where we don’t want to go?

As I outlined in the last entry, sometimes the facts take you to a place you don’t want to go. Obviously the first thing to do is make sure you have your facts in order. But if the facts are truly “facts” rather than assertions, assumptions, or faulty conclusions, then you have a problem, and a decision to make. Do you accept that you now have to change your position, or do you decide to ignore the facts?

Interestingly, it isn't necessarily a clear choice. Here are a couple of examples.

Vaccines and autism.
Jenny McCarthy (among many others) believes that there is a correlation between vaccines (or more specifically, thimerosal, the preservative used for decades in vaccines) and autism; the contention is that the significant upswing in the number of autistic children today is caused (or at least worsened) by the vaccines given to kids. A study published in 1998 in a respected medical journal by a Dr. Andrew Wakefield apparently established this cause-and-effect relationship. So today many people have begun refusing to have their kids vaccinated for fear that they will develop autism as a result of exposure to thimerosal. These vaccines, by the way, are the reason that polio, whooping cough, chicken pox and other diseases that killed hundreds of thousands of kids in our past are virtually nonexistent today. Or would be, except for this choice by a significant number of parents to not vaccinate their kids, which is making it possible that some of these terrible diseases may resurface.

But here's the thing: Wakefield's study could not be replicated, even by him. His work was carefully reviewed by the journal that published his paper, which determined that he falsified the data. The study was withdrawn, he was exposed as a fraud and his license to practice medicine was revoked. Additional studies have shown no connection at all between exposure to thimerosal and autism; other models for the cause of autism are emerging. Yet the belief persists that vaccines cause autism. And many of the practitioners in my industry are strongly supportive of the anti-vaccination position. (Aside: my company takes no official position on the vaccination issue, but when the question comes up in my workshops I lay out the facts as we have them and let the audience decide.)

GMO (genetically modified organisms) in our food supply.
Scientists have figured out how to take genes from one organism and splice them into another. One of the first commercialized and perhaps best-known examples of this involves an herbicide called Roundup (glyphosate). A gene-spliced corn called "Roundup Ready" has been developed that is not affected by glyphosate, so it will grow in the presence of the herbicide. Whether glyphosate is as harmful as it is purported to be is not our topic here, but instead the gene-spliced corn. Lots of people in my area of business are concerned that the process of gene-splicing may have unintended (meaning: bad) consequences somewhere down the road.

Another modified food: there is a fish that resists freezing even when the water around it freezes, because of a specific gene found in its DNA. Scientists have taken that gene and spliced it into strawberries, making the strawberries freeze-resistant. The non-GMO crowd (again, largely in my area of business) is very concerned about foods like this, calling them "Frankenfoods" and trying to get them banned. Most farmers are against these bans, or even labeling the foods as GMO, for obvious economic reasons: they can get higher crop yields using GMO plants. (The irony here is that virtually all the food we eat has been genetically modified; it's just been done over many years by grafting or selective breeding, rather than in a laboratory.)

But the science is fairly clear that GMO foods are safe. Study after study has shown no detectable difference in the food quality, nutrient content, use in the human body, or any other known variable in GMO foods when compared to non-GMO foods. I suppose it would be more accurate to say "there has never been any indication that GMO foods are any different from non-GMO foods in how they are metabolized," since it's a fairly new area of research. But the anti-GMO position then takes advantage of the fact that it is logically impossible to prove that GMO foods could never cause a problem; you can't prove that kind of universal negative about any food, GMO or otherwise.

So we have two situations where the facts seem clear: there is no evidence that autism is linked to vaccinations, or that genetically-modified foods are harmful. But my particular branch of health care insists on believing the opposite in spite of the lack of evidence.

I must also point out that, while there may not be any scientific support for avoiding GMO ingredients, the perception on the part of the marketplace that GMO ingredients are bad may still drive the decision to use all non-GMO ingredients. In point of fact that is exactly what my company is doing.

But it is a decision based on market demand and not scientific facts.

Posted in Nutrition and eating, Science

Where do the facts lead?

There is a price to pay for critical thinking, if you are an honest person.

The scientific method is really just an organized way of consistently applying the rules of critical thinking. Let’s take a closer look at the process and see what the implications of using the scientific method might be.

Because the scientific method includes testing one's hypothesis, obviously the results are often negative. In fact, one of the main points of scientific studies is to attempt to disprove the hypothesis. At first this sounds counterintuitive; why would anyone go to the trouble of designing a study specifically to prove themselves wrong? But look at it from this perspective: a scientist believes that their study (and the associated findings) are going to be replicated by other researchers. In fact, the progression of science depends upon it. So if a scientist feels strongly about a hypothesis, it is to his or her advantage to make the strongest case possible. If the hypothesis is able to stand up under the scrutiny of a study designed to disprove it, there is a stronger probability that the hypothesis is a fact. A good study will therefore look carefully at all the weakest areas and present the case why those weaknesses don't disprove or discount the essential premise.

Now back to where the facts lead. If a study appears to be taking the hypothesis off the rails, a reputable scientist will not view that as a failure; instead it helps advance the field. Thomas Edison purportedly failed more than a thousand times in the development of the lightbulb, but he viewed each failure instead as a success: he discovered one more way to not make a working lightbulb! So “failures” in science are not viewed negatively. (The business people who fund studies of course find this very frustrating when they think they are paying for validation of their pet product, but that’s a different issue.)

Translating the process of applying the scientific method into critical thinking skills means, simply put, that as facts emerge, they may lead you to a place that conflicts with what you believe to be true. Intellectual honesty compels you to either accept that you were wrong, or to reject the facts. But if they are facts, they can't be rejected; the best you can do is go back to the data and see if what you thought was a fact was, in fact, not.

Let's take an example. I'm listening to a series of lectures on an age-old philosophical debate: is there such a thing as free will? And please understand that this is an ongoing debate (and has been for literally millennia), so I don't pretend to have an answer. But at the very least, the fact that this IS an old debate means that there's no simple answer. I'm not going into the debate here (I've looked at it before, and will likely revisit it at some point in the future on this blog), but it leads to some interesting (and disturbing) areas, for me at least. At first it seems obvious that, yes, humans can decide to do something or not, which is the definition of free will. But (again, without going into the details), the harder you look at the question the less likely it seems that we do. (If you're really interested in pursuing this, I suggest you purchase the audio program I'm listening to from The Teaching Company called "Great Philosophical Debates: Free Will and Determinism." Parts of it are heavy sailing, but you'll find that many of your rock-solid convictions may not be based as solidly as you thought!)

Back to our example. Let's say that, as I'm implying in the last paragraph, our cherished notions of having free will don't hold up to some deep thinking. (And you'll have to trust me unless you listen to the audio, but they don't. At least not categorically.) So where does that lead? If we don't have free will (or at least not as we understand it), then our actions are determined by outside factors (the opposite of free will). But how do you hold people accountable for their actions if they weren't responsible for them? In our justice system, we already make accommodations for people who don't understand the consequences of their actions (not guilty by reason of insanity is a classic, if sometimes unsatisfying, defense).

We don't punish babies for grabbing candy in a store; they don't know it's not theirs and has to be paid for. We put it back and teach them over time that's not done. But what if an adult has a tumor in their brain that shuts off the "mine/not-mine" equation (which, by the way, has been documented to happen)? Is it fair to punish them for something they have no control over? Most people would say no, or at least have to think about it.

Again, I'm not arguing the point of whether we actually have free will here, but using it to illustrate the point of this post. If we agree on a "fact" (in my illustration, that free will doesn't exist), then there are actions that must flow out of that. If we don't like where that takes us (in this case, the fairness of our justice system), the natural step is to go back and reject the fact. But if it is a fact, by its very definition it cannot be rejected; it just "is."

Most people by now would be saying “OK, so without free will it isn’t fair to hold people accountable for their actions. But since I feel like I can make choices, others can too and thus may make choices to take something that isn’t theirs. It is appropriate to punish them when they do that, so we MUST have free will.” (And intuitively, this makes sense to pretty much everyone. We believe we have free will). This would lead most of us to reject the arguments against free will based on where that would take us.

But that’s exactly the wrong way to go about it. If we reject a “fact” because of where it takes us (again, making the distinction between a fact and an assertion), we’re being intellectually dishonest. We may not like where it leads, and we may even choose to pretend it isn’t really a fact, but it’s not being honest.

To be honest, that’s a hard thing to confront.

Posted in General commentary on the world as I see it..., Religion and philosophy, Science

Picking cherries

Cherry picking data is a logical fallacy where only information that supports one's conclusion is shared; contrary information is either ignored or suppressed. It's also called "the fallacy of incomplete evidence," which probably makes it clearer what is going on.

Scientists are supposed to watch out for this; the process of evaluating data specifically includes performing statistical evaluations of all the findings to determine the possibility that the results are actually a coincidence rather than the result of the intervention. (Nerdly aside: in scientific papers you'll see something like "p=.005" to indicate this; that means that if chance alone were at work, there would be only about 5 chances in 1,000 of seeing results at least as strong as the ones observed. A p value of less than .05 is considered statistically significant.)
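To make the idea concrete, here is a minimal sketch of one common way a p value is calculated, a permutation test. The numbers are invented purely for illustration; they don't come from any real study.

```python
import random

# Two made-up groups of measurements (hypothetical numbers, not real data).
treated = [12.1, 11.8, 13.0, 12.6, 12.9, 13.2, 12.4, 12.7]
control = [11.9, 11.5, 12.0, 12.2, 11.7, 12.1, 12.3, 11.8]

observed_diff = sum(treated) / len(treated) - sum(control) / len(control)

# If the intervention did nothing, the group labels are arbitrary. Shuffle the
# labels many times and count how often chance alone produces a difference at
# least as large as the one we actually observed.
pooled = treated + control
random.seed(0)
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    if sum(pooled[:8]) / 8 - sum(pooled[8:]) / 8 >= observed_diff:
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed_diff:.2f}, p = {p_value:.4f}")
```

Note what the p value does and doesn't say: it measures how often shuffling alone would produce a difference this big, and nothing more. It is not the probability that the same result will show up the next time the experiment is run.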

Ahem.

So cherry picking is a bad thing, scientifically speaking. In politics it happens all the time; you’ll hear this factoid or that statistic being trotted out, and when you dig into it you find that it’s not at all representative of reality.

A great example of this is in the area of anthropogenic global warming (climate change caused by humans). As I've written in previous posts, there is a rather vocal group (predominately Republicans) who deny that our climate is changing (getting warmer), or who say that even if it is, it's not caused by human activity. They have pointed to an increase in arctic ice to prove their point; if the arctic ice pack is growing, it's pretty difficult to say the environment is getting warmer. And to support this they say that the size and thickness of the arctic sea ice was greater in 2009 than it was 20 years prior to that, in 1989.

Here's the thing: the data they point to don't show what they claim. Below is a graph showing the measurements of the thickness and extent of the arctic sea ice in 1989 (blue line) and again in 2009 (red line). Even a casual look shows that there is less ice in 2009 than there was 20 years earlier. But for one (very) brief moment (in March, where the arrow is pointing), the lines crossed and it seemed to show that the ice pack was ever-so-slightly greater in 2009 than in 1989.

This is classic cherry picking. It is obvious when you look over time that the ice pack is getting smaller. It would clearly take more sophisticated statistical analysis to know exactly how much smaller, how fast it’s happening and whether it’s stable, accelerating or slowing, but it’s definitely smaller.
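Just to make the cherry picking concrete, here's a small sketch using invented numbers (not the actual ice measurements) that shows how a single hand-picked data point can tell the opposite story from the full record.

```python
# Twelve made-up monthly readings for two years, 20 years apart. These are
# hypothetical values for illustration only, not real sea-ice data.
year_1989 = [14.2, 14.8, 15.1, 14.6, 12.9, 10.8, 8.1, 6.3, 5.9, 7.4, 9.8, 12.5]
year_2009 = [12.7, 13.4, 15.2, 13.1, 11.0, 8.9, 6.2, 4.5, 4.1, 5.6, 8.0, 10.9]

# Cherry-picked comparison: March (index 2), the one month where the later
# year happens to be higher.
print("March only:", round(year_2009[2] - year_1989[2], 2))        # +0.1 -- "see, it's growing!"

# Honest comparison: the whole year.
mean_1989 = sum(year_1989) / 12
mean_2009 = sum(year_2009) / 12
print("Change in annual mean:", round(mean_2009 - mean_1989, 2))   # clearly negative
```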

This would be exactly like having your bank account be overdrawn every day for a month except for paydays, and then attempting to convince the bank they were being unfair for charging you overdraft fees by pointing only to paydays as evidence of your sterling record-keeping. It’s just plain wrong.

There are lots of examples of the use of this type of bad science, whether in politics, the news, or in religion today. A good example in religion is the so-called “Bible code.” In the book of the same name, the author maintains that a sophisticated statistical analysis of the words and letters in the Bible (particularly the Hebrew Scriptures, or Old Testament) reveals patterns that make startlingly-accurate predictions of events today, written thousands of years before they happened.

The problem is that if you perform exactly the same analysis on Moby Dick, for example, the same types of patterns emerge. And while most people in Western society would agree that Moby Dick is one of the great works of recent history, I doubt if we’d say it contains predictions of the future.

At its most destructive (and least honest), cherry picking starts with having a point you want to make, and then finding and only reporting the evidence that supports your pre-determined position. The most honest way is to first gather as much evidence as you can, and after careful (and objective) review, determine what the evidence is telling you. Scientists work very hard to do just that; and they don’t always get it right. Confirmation bias shows up frequently, even when it’s being guarded against.

Avoiding this confirmation bias is challenging and takes time, so (not very surprisingly) not many people are willing to go looking for it in their own reasoning. It’s way easier to start with your conclusion and cherry pick your “facts.”  In fact, I would suggest that we all do it by default, unless we actively and specifically guard against it.

It’s surprisingly difficult.

Posted in Political commentary, Science

Just the facts, Ma’am.

Fact: something that actually exists; reality; truth.

There's no such thing as a false fact; if it is a fact then it is true. It's true that people argue over facts, but actually they are arguing over the interpretation of facts, because once something is a fact it is no longer arguable. For example, it is a fact that fossils exist of animals that are now extinct. No one contests that; you can touch them. You may argue over the age of the fossil, how it got to where it was found, and on and on, but the fact of the existence of the fossil is not open to debate.

People confuse assertions with facts. An assertion is "a confident and forceful statement of fact or belief, often made without proof or support" (conflating two online definitions). Note that an assertion may be a fact, but not necessarily; not all assertions are true. So I've heard it said "Donald Trump is not electable, and that's a fact." That's easy; it's an assertion and of course it's not a "fact." It would be better to have said "Donald Trump says things in his speeches that are so contrary and offensive to the majority of Americans that I can't see any path that leads to him being elected as our next president." It's equally obvious why the former statement (assertion) is used; it's much more forceful and lots easier to say.

More difficult would be facts as applied to science. Science is always searching for the "facts" that lie behind the observations we make about the world around us. So any reputable scientist will tell us "We know this (fill in the blank) to be a fact, based on our observations thus far. But of course future observations may cause us to change our understanding." Sir Isaac Newton observed an apple fall to earth from a tree and developed the understanding and explanation of gravity. The "fact" is that gravity exists, things fall to the earth and we don't go flying off into space. Newton's explanation of exactly why and how that happened was considered proven for many generations, until Einstein's general theory of relativity opened up a whole new understanding. Newton wasn't "wrong;" apples still fall to earth from trees, but our understanding of how that works has grown.

People often seize on this fact (see what I did there?), that the process of scientific discovery never ends, to somehow call into question the "facts" of science. They'll take a scientific fact (gravity, for example) and, with the awareness that the underlying discovery process is never-ending, conclude that we can't ever really "know" anything. And if that's true, then any explanation is as good as any other.

And there be dragons.

Facts have no emotion or judgment attached to them. But facts carry weight. If a new fact contradicts what you have believed to be true, honesty compels you to abandon what was wrong and embrace the newly discovered truth, regardless of the cost. This is the essence of both scientific and philosophical inquiry (they aren’t the same). Both will say “These are the facts as best I understand them; as new facts emerge I may have to change how I think.” Again, that doesn’t mean that I can’t use the information that I have now to make predictions or calculate probabilities; it just means that I have to go with the facts as they emerge.

That is very different from starting with something I “know” to be true and then cherry picking facts (or interpreting them as I see fit) to conform to these preconceived conclusions.

Posted in General commentary on the world as I see it..., Science

“I have no need of that hypothesis”

Pierre-Simon Laplace was another really smart guy, by all accounts. He was a physicist, mathematician and astronomer who lived in France from 1749 to 1827, during some pretty turbulent times: both the French Revolution and the Napoleonic Wars occurred during his lifetime. He may have even met Ben Franklin or Thomas Jefferson while they were in Paris schmoozing the rich for donations to our own revolution and generally enjoying themselves, although I couldn't find specific evidence of any meeting. But it's kind of cool to think they might have hung out!

Anyhow, Laplace's social life is not the purpose of this entry; it's for something he is credited as having said to Napoleon during a presentation on his work on the movement of the various planets in the solar system.

Back up a step or two. Laplace lived almost exactly 100 years after Sir Isaac Newton (1642-1726 or 7), and followed in Newton's footsteps. Recall that Newton is widely considered one of the greatest scientists (if not THE greatest) who ever lived, leading the Scientific Revolution and the birth of the Enlightenment by developing and then refining the laws of motion (Newton's Three Laws of Motion are still taught in basic physics today), calculus, gravitational theory, optics (his observations established that light can be "bent" and split out into all the colors of the rainbow), and on and on. It's difficult to overstate his contribution to science. Even today, most physicists would say it's a tossup whether Newton or Einstein has contributed more to the body of science we have today. I would suggest that future scientists will add Stephen Hawking to that debate, but that's for another entry.

Anyhow, it's very clear that Laplace had read Newton's work and was heavily influenced by him; he built on, and defended, Newtonian physics. Among many other accomplishments, he developed the mathematics that explained the movement of planets and other heavenly bodies around the sun. The story of Newton and his famous apple is apparently true; he was sitting under a tree one sunny summer day when an apple fell to the ground. He began wondering why it went straight to the ground rather than sideways, and from that came up with the gravitational theory (that mass attracts mass, so large bodies like the earth pull smaller bodies like the apple to them). This led to an explanation of the sun as the center of our solar system "holding" all the planets, asteroids and meteors in orbit because of the massive gravitational pull exerted by the sun on each (much) smaller planet.

All this using just his brain. I don’t know about you, but that is absolutely staggering to me. Little wonder that he is still revered in the scientific world (if not the more mundane world of the rest of us!)

A hundred years later, Laplace took Newton's physics up a notch. He stated that the small irregularities observed in the orbits of the various planets could be explained through mathematics. One of the reasons this is important is that, prior to Laplace, these irregularities were used as evidence of God. Odd as that may sound today, the belief was that God needed to step in every so often and bring things back to their proper place in the universe.

So now we go back to Napoleon. He appointed Laplace to the position of Grand Poobah of Science in his government (not really the title, but you get the idea). Laplace was invited to explain his theory of how the planets were kept in their orbits; when he finished Napoleon purportedly asked where was God in this calculation. And Laplace said “I have no need of that hypothesis.” (While there is some debate as to the exact context, it’s pretty universally agreed that something like that happened).

Recall my entry a few back talking about the historical contention that God was involved in the universe in a very personal way, and that things like volcanoes, earthquakes and crop failures were due to God’s wrath; if we pleased God we’d get good crops and no disasters. During The Enlightenment the great thinkers (like Newton and Laplace) realized that there were laws that controlled the universe. If we understood those laws and how they interacted, we could explain how volcanoes and tornadoes happen, as well as things like how and why the planets move the way they do.

Side note: some have used Laplace’s statement as evidence that he was an atheist and was rather arrogantly dismissing God’s place in the universe; in actual fact he was doing nothing of the sort. He was only saying that God’s intervention was not necessary to explain the undeniable observations of the way the universe works; mathematics was fully capable of doing so. He left the existence of God a question that others could debate, but at the very least, His intervention was not required to explain the observed universe.

Full circle to the Enlightenment and the scientific method. The whole point was that, by using our noggins and applying the rules of the universe, we can understand (and explain) everything that happens. The tragedy is that we still have people saying otherwise: Pat Robertson, a classic example, claimed a few years back that AIDS is God's will, a punishment He sends for people's sins. Or, more currently, that hurricanes, droughts in California and the like are happening because we've somehow displeased God.

Use your heads, people.

Posted in General commentary on the world as I see it..., Religion and philosophy, Science

Science denial today

In my last post I talked about how, in the late 1500s and early 1600s, church leaders persecuted the astronomers Galileo and Bruno for their observations that the earth was not the center of the universe. Galileo recanted and was confined to house arrest for the rest of his life; Bruno refused to recant and was burned at the stake. Pretty brutal treatment for making observations about the way the universe is organized.

So let's take a look around us today to see if we have learned anything. We haven't burned anyone at the stake for a while for saying something the Church finds offensive, which is a good thing; I suppose we can take comfort in knowing that, here in the USA at least, those who promote unpopular positions don't have to worry about that. But I still see an unwillingness to accept what science tells us. Part of this seems to be a misunderstanding of the scientific process, or confusion over definitions; other times it seems more deliberate, with clearly intelligent men and women refusing to accept what scientists have concluded. Sometimes the reasons appear to be religious; other times they seem more politically motivated.

For example, global climate change jumps out. There is a near-universal consensus among environmental scientists that we are experiencing an unprecedented change in our climate (it’s getting warmer), and furthermore that this is being caused by our (that is to say, human) actions. Yet, a significant number of Americans (predominately in the Republican party) deny it. Either it isn’t happening (they say), or if it is, it’s part of a natural process and human activities have nothing to do with it. In support of that position these folks point to changes in our environment throughout geological history. And of course this is true; we’ve had ice ages as well as times when the arctic supported tropical plants. But the scientists know that as well and still say that human activity is what’s causing our current changes in climate. Last year was the warmest year on record; the previous record was the year before that. Experts are virtually universal in their agreement that unless we make significant changes in our consumption of fossil fuels and reduce our carbon footprint immediately, we will be confronted with things like rising sea level caused by melting of the polar ice caps, possibly irreversible changes in climate patterns and the loss of species (like polar bears) caused by those pattern changes.

But that’s a different rant; what I’m focusing on here is the denial of the science, not the effects of human activity on our climate.

The political reason to deny the science that underpins climate change is based on economics: whatever we do will cost jobs. I think this is a false argument being promoted by the fossil fuel lobbies. I suspect that the horse-buggy lobby (had there been one) would have given the same argument against the auto industry back at the beginning of the 20th century, and as it turned out automobile manufacturing became the juggernaut of jobs creation in the US. It continues to be a huge source of jobs today, even since the Japanese figured out how to build cars that would effectively compete. Our economy is pretty fluid; there will be a great many completely new jobs that emerge as we shift to a different source of power. Take a look at the jobs out there now in the communications/internet arena that didn’t even exist as recently as a generation ago.

The other reason given for denying climate change has its roots in religion, and that can be further divided into two positions. One is that God gave us the earth, so it's our God-given right to do anything we want with its resources; the other (that was loosely the JW position) was that God would protect the earth and prevent it from being damaged. (The scripture most quoted in this context was Revelation 11:18, which says that God will "bring to ruin those ruining the earth.") So we didn't need to be concerned; God would handle it for us. For what it's worth, that didn't translate to a total lack of concern (for us at least); we were careful not to litter, we had concerns about the loss of rain forests, etc., but we believed that in the end God would make it all good. I suppose that under that was the tacit belief that Satan was behind the scenes contributing somewhere as well.

Contributing to these two reasons is a disturbing trend evident in our culture over the past couple of decades: a distrust of intellect. Or more specifically, a belief that “elitism” permeates society, leading in some people to an almost perverse pride in not thinking too deeply about things. What might be contributing to that and what it leads to is the subject of a future post, but for this topic, it’s expressed most commonly by a general distrust of science.

And I can kind of see where that came from; during the Vietnam War and anti-war movement (my generation’s defining moment), Monsanto and Dow Chemical played a big role in sowing the seeds of distrust by producing Agent Orange and napalm; or more specifically, what we were told about them and how they were used. In the case of Agent Orange, we were led to believe that any long-term effects were minimal (pretty well disproven today); and few who remember the war think of napalm favorably. For the young of that era, Big Chemical went from being the Friend of Humanity to being disliked and distrusted in a single 10-year span. One could say that it was the government that should be blamed rather than the chemical companies, but the net result was the same: a distrust of “the scientist.”

Recall that it was around the same time that we put a team on the moon; it seems that science should have been the hero based on that accomplishment. But there was a general “distrust authority” feeling in America’s young people at that time, so I think that helps explain what we see today.

Fast forward to today. I doubt if most people who mistrust science today think of Agent Orange and Dow Chemical as the reasons why; I think it’s morphed into a different collective attitude. But here we are. It’s particularly ironic to me that the same people who talk about how we can’t trust science do so on their websites, texts and emails, all of which are founded on the very scientific principles discovered and developed by the “elites” they say can’t be trusted.

Posted in General commentary on the world as I see it..., Political commentary, Science

The Enlightenment and the scientific method

A while ago I wrote about how, when DARPA funded the research that led to distributed computing and our effort to land a man on the moon drove the search for ever-more-powerful and energy-conserving computing capability, no one could possibly have envisioned the internet, cell phones and iPads of today. The message, of course, is that it's impossible to predict where science will take us. It was an appeal for understanding of why basic research is a good thing; most of the time we don't have an inkling of what will come of it. Frequently, of course, it's nothing practical. But every so often it's a cell phone or personal computer or LED headlights. And it's the scientific method that gets us there.

Basically, the scientific method is nothing more than a process for finding stuff out in reliably reproducible ways. There are a few basic steps that make up the core of the scientific method:

  • Ask a question
  • Come up with a possible answer to your question (create a hypothesis)
  • Figure out a way to test your hypothesis (conduct an experiment)
  • Evaluate what happened
  • Draw a conclusion
  • Tell others (publish your results)

If done correctly, each cycle of these steps (the last step should be followed by either you or someone else starting over and asking another question) gradually adds to our knowledge of how things work. Each of these steps is important, but to my mind "telling others" is probably the most critical, for what it implies: telling others exactly what you did and how you did it gives them the opportunity to try the same thing for themselves.
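Here's a toy run through that cycle, just to show how the steps hang together. The coin-flip example is mine, not something from the post.

```python
import random

# Question: is this coin fair?  Hypothesis: yes, heads comes up half the time.
random.seed(42)

def experiment(n_flips=1000, p_heads=0.5):
    """Flip a simulated coin n_flips times and return the number of heads."""
    return sum(random.random() < p_heads for _ in range(n_flips))

heads = experiment()

# Evaluate: under the hypothesis we expect about 500 heads, give or take
# roughly 16 (the standard deviation of a fair 1,000-flip count).
expected, spread = 500, 16
print(f"heads: {heads} (expected ~{expected} +/- {spread})")

# Draw a conclusion, then "publish": report exactly what was done so that
# someone else can run the same test and check the result.
if abs(heads - expected) > 3 * spread:
    print("The 'fair coin' hypothesis is in trouble; time for a new hypothesis.")
else:
    print("The hypothesis survives this test; report the method so others can retest it.")
```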

Not all that long ago, mankind had no idea what caused earthquakes, volcanoes, tornados or what your dog is thinking as he watches you get dressed. With the exception of the last, we’ve now got a pretty good idea and can explain them pretty thoroughly (after the fact); accurate prediction is getting closer but still a ways off. But that’s the point: we believe that there IS a way to predict earthquakes, because they behave by rules; we just don’t yet have a clear enough understanding of the rules (or of how to measure them), but it’s coming. Back when we didn’t understand the physics involved, “God caused it” was the go-to explanation.

The Enlightenment was a period in (predominately) European history when "God caused it" was no longer considered the best answer to phenomena we couldn't explain. It ran roughly from the mid-17th century through the end of the 18th (about 1650-1800), and during that time the great thinkers of the era developed the conviction that the universe and everything in it behaved according to rules; it wasn't just "God's will." The realization that there was an explanation for everything was incredibly important and freeing. Once you knew what those rules were, it was possible to figure out what makes things do what they do, and more importantly, what they are going to do next. For the first time in all history, man was no longer at the mercy of the gods but could begin to predict what was coming. All because, as it turns out, the universe plays by rules.

As we get better and better at understanding how these rules apply, "God did it" or "it's a miracle" can be used less and less. Assigning events we can't explain to divine intervention is called the "God of the gaps" argument. It's considered a logical fallacy, and for good reason; the Enlightenment and an understanding of the scientific method have taught us that we don't need to fall back on divine intervention; we can figure it out. It would be more accurate to say "we can't yet explain it" and leave it at that.

This is obviously a threat to many peoples’ view of the Divine.

Posted in Religion and philosophy, Science

History, heliocentrism and hubris

Any kid of 8 in the US who’s paid the least amount of attention in school knows that the sun is the center of our solar system and that the earth revolves around it. This is of course  contrary to personal observation, because the earth seems to stand still while the sun revolves around it, “rising” in the east in the morning and “setting” in the west 12 hours (give or take) later. The benefit of observations made over the last several hundred years by astronomers beginning with Copernicus is seen in the fact that virtually all people understand this little tidbit of celestial awareness.

But it wasn't always like that. Claudius Ptolemy was a Greek citizen of Rome who lived in the city of Alexandria in what is now Egypt. He was born at the end of the first century C.E. and developed a model to describe the workings of the universe that lasted for nearly a millennium and a half. By all accounts Ptolemy was a smart guy; he wrote important treatises on mathematics and geography in addition to astronomy. He just got it exactly wrong when it came to astronomy; he said that the earth was at the center of the universe, and everything revolved around us. Given our personal experience, it seems a reasonable conclusion: the earth doesn't feel to us like it's moving, and we can clearly see the objects in the night sky slowly move from one horizon to the other each night. As we all know however, that's not what happens. But it took a long time and no little pain (both mental and physical) to change that view, which gives us some idea of how much people respected Ptolemy's work.

Around the 1500s, our aforementioned Nicolaus Copernicus (1473-1543) made the observation that the earth does not in fact sit still; it revolves around the sun once a year. He got much of the rest wrong; he thought that the sun, not the earth, was the center of the universe, and that all the stars spin around the sun. (Nerdly aside: the notion that the earth is the center of the universe is called "geocentrism;" the belief that the sun is at the center of the universe is called "heliocentrism.") Given that lens-grinding was still in its infancy and the first telescopes were just beginning to be developed, I think Copernicus can be forgiven for his mistakes. Especially since it had been "common understanding" for 15 centuries that Ptolemy had it right. Over the next couple of hundred years more and more very smart people started looking differently at the stars and the picture we know to be true today gradually emerged. But as I said, it didn't happen overnight. In fact, there was considerable resistance to this novel concept, and not just from the Ptolemy loyalists among the other night-sky observers.

About 50 years after Copernicus made his observations (which, as I said, were met with considerable skepticism by the other thinkers of that time, particularly The Church), Galileo Galilei (1564-1642) was born. Galileo was strongly influenced by Copernicus, and in fact expanded significantly on Copernicus' observations about heliocentrism, but in so doing he ran afoul of The Church.

The Church (there was really only one at the time; today of course it's known as the Catholic Church) was not pleased, to put it mildly. They took umbrage at Galileo's (and others') position because of its larger implications. After all, Earth was clearly identified in the Bible as God's direct creation, the place where He put the pinnacle of his effort (man, naturally) and where he sent his Son (or Himself, as the Athanasian Creed defines God) to atone for the sins of mankind. And if the earth is that important in God's plan for the universe, it clearly made sense that it be the pivot point around which everything else in the entire universe revolved. The Bible even says that "(God) set the earth on its foundations; it can never be moved." (Psalm 104:5) So for some upstart astronomers to claim otherwise clearly challenged the primacy of the earth, of God, and of His proxies on earth (or at least, that's how they thought of themselves), the Catholic hierarchy.

At that time, the Church was so closely linked to the government that they were virtually the same. If the Church wanted something badly, she was going to get it. So if you displeased the Church you were in deep trouble. Galileo was charged and subsequently convicted of heresy; he was able to save his life by recanting, saying he was wrong and the earth was the center of the universe. The story is that, as he was leaving the presence of the Pope he said (presumably under his breath) “And yet, it moves!” In any case, the Church accepted his reversal, sparing him from a death sentence, but he spent the rest of his life under house arrest. That sounds pretty harsh, but a contemporary of his, a Dominican friar named Giordano Bruno, was not nearly so lucky. He was also an astronomer (in addition to having expertise in mathematics, philosophy and poetry), but he went even farther than Galileo; he postulated (correctly, as it turned out) that the stars were nothing but distant versions of our Sun, with their own planets and moons circling them, and even having the possibility of supporting life on their local planets. For his position (and refusal to recant), he was burned at the stake. (There are some who say that what really caused his death sentence was his mocking of the Pope rather than his positions on the nature of the universe, but that is a minority view, promoted largely by today’s Catholic church. Make your own judgement.)

The Enlightenment (a topic for several future blog entries) took the position that we could obtain answers about the physical world through careful observation; we didn't need to depend on miracles or explanations that "God did it." One clear example is what we've been discussing: the observations of these early astronomers such as Copernicus, Galileo, Bruno and others. They looked up at the sky, watched what happened for a while and then, using nothing more than their brains and the rules of mathematics, figured out that the earth simply couldn't be the center of the universe.

I think about that when I reflect on the vastness of the universe. As I said in my last blog entry, we’re on a planet circling a fairly unimpressive star, in one of the outer arms of a minor galaxy, out on the far end of Laniakea (our Super Cluster). There is absolutely nothing to make us stand out in any way, except for the fact that we’re here. It’s far more likely (given the number of stars we know have planets, and extrapolating that to the universe) that there are many places where life emerged, than that we are alone in the universe. So it makes me think how arrogant those Church leaders were, insisting against all the facts that literally everything in the universe revolved around us.

It would be laughable, except that they burned people to death if they disagreed.

Posted in Religion and philosophy, Science

Space is really, REALLY big!

It's nearly impossible to get a grasp on how truly vast space is. I got interested in space when I was a little kid growing up in Illinois farm country. I had a map of our solar system on my wall; although it wasn't to scale it still gave me a sense of how far apart things are, just in our solar system. The fastest speed anything can travel is the speed of light (at least according to physicists; sci-fi fans know that FTL, or faster-than-light, travel is made possible by warping spacetime. Or something.) Anyhow, a theoretical something traveling at the speed of light (186,282 miles per second) would circle the earth about 7-1/2 times in one second. So for our purposes here on earth the speed of light is pretty much instantaneous. Or close enough it might as well be.

But the distances of space actually make the speed of light a handy measuring tool: space is so vast it's necessary to think in terms of the distance light travels in 12 months, called, logically enough, a light year. I knew that the sun was 93 million miles away and that it takes a little over 8 minutes for light to travel to earth from the sun. That doesn't seem like all that much, but Pluto (which was still a planet then) is so far away from the sun that it takes light about 5-1/2 hours to get there. Our closest neighbor (actually a star system consisting of 3 stars) is Alpha Centauri, just under 4-1/2 light years away. That means that when we look at that star system, the light we are seeing left Alpha Centauri 4-1/2 years ago. To put it a little differently, if our sun were the size of a grapefruit placed on the beach near my house on the west coast, Alpha Centauri would be all the way across the country in Boston, 2,500 miles away. And that's our closest neighbor.
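If you want to check that arithmetic, here's a quick back-of-the-envelope sketch. The sun's diameter (about 864,000 miles), the 6-inch grapefruit and Pluto's average distance (about 3.67 billion miles) are my own assumed figures; the 93 million miles and 186,282 miles per second come from the post.

```python
# Rough checks of the distances and travel times mentioned above.
C_MI_PER_SEC = 186_282            # speed of light
EARTH_CIRCUMFERENCE_MI = 24_901

print(C_MI_PER_SEC / EARTH_CIRCUMFERENCE_MI)        # ~7.5 trips around the earth per second
print(93_000_000 / C_MI_PER_SEC / 60)               # sun to earth: ~8.3 minutes
print(3.67e9 / C_MI_PER_SEC / 3600)                 # sun to Pluto: ~5.5 hours

LIGHT_YEAR_MI = C_MI_PER_SEC * 3600 * 24 * 365.25   # ~5.88 trillion miles
alpha_cen_mi = 4.37 * LIGHT_YEAR_MI                 # ~25.7 trillion miles

# Grapefruit scale model: shrink the 864,000-mile sun down to 6 inches.
scale = 864_000 / (6 / 12 / 5280)                   # real miles per model mile
print(alpha_cen_mi / scale)                         # ~2,800 miles -- roughly coast to coast
```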

We're on the fringes of our galaxy, the Milky Way (a spiral-shaped galaxy of no particular note other than the obvious fact that it's where we live). When we look up in the night sky like I did as a kid in Illinois, the swath of lighter sky is the center of our galaxy; we're looking back toward it from where we are in one of the outer arms. Our Milky Way is somewhere between 100,000 and 180,000 light years across. Again, that means that light from a star on the other side of our galaxy left there up to 180,000 years ago and is just reaching us now.

And our galaxy is one of countless such galaxies. At the Griffith Observatory here in Los Angeles, there’s a huge mural that represents just a small section of the night sky. It’s fascinating because when you first look at it from across the hall it appears to be just dots on a white background and you figure each dot is a star. Walk closer and a different picture emerges (literally), because up close you see that what appear to be dots are actually photographs of other galaxies, placed in the proper position relative to the other “dots” (some are stars in our Milky Way; others are other galaxies visible from earth) in the night sky. So it’s actually a star map, accurately representing what’s out there.

Astronomers estimate there are over 100 billion (with a "b") stars in our galaxy. But there's more. Lots more. The universe has structure: our Milky Way is part of a group of galaxies, called the Local Group. In our Local Group there are about 30 galaxies spanning about 10 million light years. Our Milky Way and the Andromeda galaxy are the largest in the group.

Our Local Group is part of a larger group, which is itself part of a yet larger group. And as I said, the Milky Way is not particularly notable as galaxies go; there are much bigger galaxies out there as well as more interesting ones (younger galaxies where stars are forming; galaxies with massive black holes at their center, and so on nearly ad infinitum).

Take a giant step backward: now the scale of distance becomes nearly impossible to grasp. I read an article in the news a couple of weeks ago indicating that astronomers have now come to understand that our galaxy and associated Local Group structures are part of a "supercluster" of galaxies that's been named Laniakea. Our Milky Way is located in a remote arm of that. (See the small red dot in the right-center of the artist's representation of Laniakea to the right? That's covering the Milky Way). This supercluster is about 50 times bigger than our Local Group.

So here we are, living on a small planet in an unremarkable solar system, out on the outer edge of a galaxy, in the outer reaches of a supercluster of about 100,000 galaxies spanning 520 million light years. And Laniakea is just one of many such superclusters in the observable universe!

The reason this is interesting to me is twofold. First, and what initially stimulated this entry, was no more complicated than thinking about how staggeringly BIG the universe is, the incomprehensible distances it encompasses, and how insignificant we humans are. I mentioned above looking up at the night sky as a kid. My brother Jim (Jimmy at the time) and I would go out in our side yard on a summer evening when there was no moon and lie in the grass, looking up at the night sky. Out in the country and away from the corner streetlights there wasn't a lot of ambient light, so the stars were in full glory. I understood a light year even then, so attempting to come to grips with how far away those little twinkly lights actually are was both entertaining and challenging. And more than a little awe-inspiring.

The second reason this interests me is more historical and takes another post to develop. Stay tuned.

Posted in General commentary on the world as I see it..., Science