Deep Existential Torment is Sexy

April 7, 2008

torment existential

More and more philosophy majors these days. This article proposes that, somewhat paradoxically, tougher economic times are pushing people to ever more general (and arguably less marketable) college majors. The first paragraph made me laugh, as it describes the oh-so-unexpected shift of a pre-med freshwoman to a pre-law senior. Must’ve been Organic Chem.

The money quote is at the end though:

Jenna Schaal-O’Connor, a 20-year-old sophomore who is majoring in cognitive science and linguistics, said philosophy had other perks. She said she found many male philosophy majors interesting and sensitive.

“That whole deep existential torment,” she said. “It’s good for getting girlfriends.”

Proust Really Was a Neuroscientist

March 9, 2008

I just finished Jonah Lehrer’s first book, Proust Was a Neuroscientist. Lehrer is an editor for Seed magazine, blogs at the Frontal Cortex, and did a lot of the research for his book while on a Rhodes Scholarship at Oxford. Proust Was a Neuroscientist was just published last November, and one of the most impressive things about it is that Lehrer is only 25.

Sandwiched between an introduction called “Prelude” and a postscript titled “Coda,” Proust presents eight chapters on 19th-century figures who anticipated a discovery of modern neuroscience in their art. Authors Walt Whitman, George Eliot, Marcel Proust, Gertrude Stein, and Virginia Woolf are profiled along with chef Auguste Escoffier, painter Paul Cezanne, and composer Igor Stravinsky.

Proust is an intellectual tour de force; you get excellent descriptions of the techniques and discoveries of neuroscience alongside biographies of fascinating cultural figures and accessible criticism of their works. Given my background studying biochemistry, the science wasn’t that in-depth for me, but the literary history is something I really needed. My dad is an art professor, so I’m generally more familiar with the history of great art than with great literature.

And that’s exactly the point. My friend Jimmy recommended C.P. Snow’s The Two Cultures a couple years ago. Snow’s classic little text is a call for more communication between the humanities and the sciences. Lehrer summarizes the philosophy behind his book in its Coda by beginning with Snow’s argument for a Third Culture of artists and scientists enjoying knowledge from both spheres. Lehrer’s critique:

Snow turned out to be prophetic, at least in part. The third culture is now a genuine cultural movement. However, while this new third culture borrows Snow’s phrase, it strays from his project… The third culture today refers to scientists who communicate directly with the general public. They are translating their truths for the masses… From Richard Dawkins to Brian Greene, from Steven Pinker to E.O. Wilson, these scientists do important scientific research and write in elegant prose. Because of their work, black holes, memes, and selfish genes are now part of our cultural lexicon.

He then summarizes the theme of Wilson’s Consilience: The Unity of Knowledge–a book I enjoyed immensely but felt ultimately fell short–with this quote from Consilience:

The central idea of the consilience world view is that all tangible phenomena, from the birth of stars to the workings of social institutions, are based on material processes that are ultimately reducible, however long and tortuous the sequences, to the laws of physics.

This is a statement with which I can’t argue, but also find dissatisfying. Jonah’s response?

Wilson’s ideology is technically true but, in the end, rather meaningless…When some things are broken apart, they are just broken. What the artists in this book reveal is that there are many different ways of describing reality, each of which is capable of generating truth.

And on a derogatory reference to Virginia Woolf by Steven Pinker:

But if Pinker is wrong to thoughtlessly attack Virginia Woolf (seeing an enemy when he should see an ally), he is right to admonish what he calls “the priests of postmodernism.” Too often, postmodernism…indulges in cheap disavowals of science and the scientific method. There is no truth, postmodernists say, only differing descriptions, all of which are equally invalid. Obviously, this idea very quickly exhausts itself. No truth is perfect, but that doesn’t mean all truths are equally imperfect. We will always need some way to distinguish among our claims.

The reception for Lehrer’s first offering hasn’t been completely positive (see a rather critical review at Slate), but it seems that most of the complaints I’ve read center on small quibbles with Lehrer’s points or a misunderstanding of the overall thesis. To be sure, I was skeptical going in; it seems that every critique of the epistemology of the likes of Wilson I’ve read has been cover for a religious or postmodern agenda (or both).

Proust Was a Neuroscientist was so refreshing because it was written by someone with an obvious knowledge of and respect for science on a level commensurate with its accomplishments. Because Lehrer can write compelling prose about neuroscience while admitting that our mind really is just the machine and not the ghost, he can also write convincingly about the value of art in giving us knowledge about our human experience.

Greeks Vs. Germans

March 9, 2008

Checklist for Quacks

July 13, 2007

Have you ever sat down and wondered, “Am I a scientific quack?” Well, probably not. But I’ve met enough True Believers in pseudoscience to give myself pause. What if I really am the person to come up with a scientific breakthrough? How will anyone ever believe me? Doesn’t science trudge along, a la Thomas Kuhn, in the dominant paradigm until the evidence suggesting otherwise is just too overwhelming to ignore? What if I’m part of the new paradigm that will supplant the old, and I want to get the word out?

After all, weren’t many breakthroughs originally derided? Who believed the Earth actually revolved around the Sun? Who knew that Helicobacter pylori bacteria played a role in ulcers? Who knew that RNA interference played such a large role in cell function? Or that cells were programmed to self-destruct as a natural part of development? Or that a thing as wacky as prions actually existed? (After heliocentricity, these ideas won the Nobel Prize in Physiology or Medicine in 2005, 2006, 2002, and 1997, respectively.)

Luckily for all of us, Cosmic Variance has compiled an Alternative-Science Respectability Checklist. Excerpts follow:

Believe me, I sympathize. You are in possession of a truly incredible breakthrough that offers the prospect of changing the very face of science as we know it, if not more. The only problem is, you’re coming at things from an unorthodox perspective… Perhaps you have been able to construct a machine that produces more energy than it consumes, using only common household implements; or maybe you’ve discovered a hidden pattern within the Fibonacci sequence that accurately predicts the weight that a top quark would experience on Ganymede, expressed in femtonewtons; or it might be that you’ve elaborated upon an alternative explanation for the evolution of life on Earth that augments natural selection by unspecified interventions from a vaguely-defined higher power. Whatever the specifics, the point is that certain kinds of breakthroughs just aren’t going to come from a hide-bound scholastic establishment; they require the fresh perspective and beginner’s mind that only an outsider genius (such as yourself) can bring to the table.

No sarcasm there. Rule 1:

Acquire basic competency in whatever field of science your discovery belongs to.

But! But! Seems a bit demanding, doesn’t it?

Now, you may object that steering clear of such pre-existing knowledge has played a crucial role in your unique brand of breakthrough research, and you would never have been able to make those dazzling conceptual leaps had you been weighed down by all of that established art. Let me break it down for you: no.

Rule 2:

Understand, and make a good-faith effort to confront, the fundamental objections to your claims within established science.


Scientific claims — whether theoretical insights or experimental breakthroughs — don’t exist all by their lonesome. They are situated within a framework of pre-existing knowledge and expectations. If the claim you are making seems manifestly inconsistent with that framework, it’s your job to explain why anyone should nevertheless take you seriously…. If you claim that the position of Venus within the Zodiac affects your love life, you’re not only positing some spooky correlation between celestial bodies and human affairs; your theory also requires some sort of long-range force that acts between you and Venus, and there aren’t any such forces strong enough to be relevant.

And finally, Rule 3:

Present your discovery in a way that is complete, transparent, and unambiguous.

Not likely. But in case someone still needs convincing, have them submit their theory to the Crackpot Index.

There’s Always an Alternative

June 11, 2007

If your standards are low enough.

I just spent a good hour looking at the website of Scholars for 9/11 Truth and Justice, largely because I came across a website talking about how great the ‘evidence’ is for a controlled demolition. After reading some of their ‘evidence’–which largely consists of a drip of pseudoscience here and a heavy coat of logically indefensible theorizing there–I find myself slightly more educated about their arguments, and overwhelmingly unconvinced.

The most interesting thing for me is the similarity between the 9/11 Truth movement and Creationism/Intelligent Design. Both of these are based on the idea that the status quo simply cannot be correct, largely because of personal incredulity. There’s no way I could have evolved from monkeys… There’s no way there’s a natural mechanism that would create X… There’s no way that those planes and/or fires could have brought down those buildings… And what they end up with is an alternative theory that is less plausible than the mainstream one, which is supported by evidence. This also seems to be similar to other forms of pseudoscience, like that of HIV-denialists.

Here’s an interesting example from the 9/11 site:

The word theory when used in the derisive sense of ‘conspiracy theory’ connotes detailed speculation unfounded in fact. However, a theory can stand apart from a detailed scenario explaining the means and methods behind observed events. For example, a theory of the controlled demolition of the Twin Towers can be proved simply by disproving its converse — that the Towers’ collapses were spontaneous. (emphasis added)

That last sentence is the interesting part. Of course, it’s bad logic, and bad theorizing. One should look for the more parsimonious theory–that which fits better with the available evidence. Even if one were able to disprove specific evidence for the official theories (which, judging by the evidence presented on the website, certainly hasn’t been done, but is assumed to have been done), that wouldn’t mean the alternative theory is more plausible. It’s reminiscent of the Wedge Strategy of the Intelligent Design movement:

The objective (of the wedge strategy) is to convince people that Darwinism is inherently atheistic, thus shifting the debate from creationism vs. evolution to the existence of God vs. the non-existence of God. From there people are introduced to ‘the truth’ of the Bible and then ‘the question of sin’ and finally ‘introduced to Jesus.’

Simply convince people that the main theory is either untrue or unpalatable, and then they’ll have no other choice but to embrace the alternative conveniently offered at the moment of doubt. Moral for the day: if you see this strategy elsewhere, be wary.

The Age of the Impossibility of Disbelief

June 3, 2007


In her excellent book, A History of God, Karen Armstrong takes an interesting detour on the language of belief and doubt. She describes how the proliferation of religious choices may have made faith more difficult, not less:

Indeed, by the end of the sixteenth century, many people in Europe felt that religion had been gravely discredited. They were disgusted by the killing of Catholics by Protestants and Protestants by Catholics. Hundreds of people had died as martyrs for holding views that it was impossible to prove one way or the other. Sects preaching a bewildering variety of doctrines that were deemed essential for salvation had proliferated alarmingly. There was now too much theological choice: many felt paralyzed and distressed by the variety of religious interpretations on offer. Some may have felt that faith was becoming harder to achieve than ever.

So did people respond with atheism, or something else?

Yet in fact a full-blown atheism in the sense that we use the word today was impossible. As Lucien Febvre has shown in his classic book The Problem of Unbelief in the Sixteenth Century, the conceptual difficulties in the way of a complete denial of God’s existence at this time were so great as to be insurmountable. From birth and baptism to death and burial in the churchyard, religion dominated the life of every single man and woman. Every activity of the day, which was punctuated with church bells summoning the faithful to prayer, was saturated with religious beliefs and institutions: they dominated professional and public life—even the guilds and the universities were religious organizations. As Febvre points out, God and religion were so ubiquitous that nobody at this stage thought to say “So our life, the whole of our life, is dominated by Christianity! How tiny is the area of our lives that is already secularized, compared to everything that is still governed, regulated and shaped by religion!”

I wonder, what dominates our own lives in such a way that we cannot realize? How shackled is our thinking by the commonplace? It’s a fascinating line of questioning.

Even if an exceptional man could have achieved the objectivity necessary to question the nature of religion and the existence of God, he would have found no support in either the philosophy or the science of his time. Until there had formed a body of coherent reasons, each of which was based on another cluster of scientific verifications, nobody could deny the existence of a God whose religion shaped and dominated the moral, emotional, aesthetic and political life of Europe. Without this support, such a denial could only be a personal whim or a passing impulse that was unworthy of serious consideration. As Febvre has shown, a vernacular language such as French lacked either the vocabulary or the syntax for skepticism. Such words as “absolute,” “relative,” “causality,” “concept,” and “intuition” were not yet in use. We should also remember that as yet no society in the world had eliminated religion, which was taken for granted.

So, what things enabled people to visualize a world without God, and develop a secular existence? And what things encouraged them along that path? I think the religious wars of Europe and, as Armstrong mentioned, the overwhelming variety of religions from which to choose, both made religion in general seem less desirable. I’m not enough of a philosopher to name pivotal events in that movement that made a secular worldview possible, but in the field of science, Darwin was certainly one of the key turning points. Here’s a quote from Richard Dawkins in The Blind Watchmaker:

An atheist before Darwin could have said, following Hume: “I have no explanation for complex biological design. All I know is that God isn’t a good explanation, so we must wait and hope that somebody comes up with a better one.” I can’t help feeling that such a position, though logically sound, would have left one feeling pretty unsatisfied, and that although atheism might have been logically tenable before Darwin, Darwin made it possible to be an intellectually fulfilled atheist.

But the question remains–what else is holding us back?

The Big Bad Reductionist

January 15, 2007


I picked up a copy of Richard Dawkins’ The Blind Watchmaker used for about $3 the other day. I like to think that the person who sold the book did so because this paperback edition has a horrendously ugly cover, not because of the content.

I’ve been interested in reading The Blind Watchmaker for some time, as it seems to be the most direct response to the “argument from design” out there. Of course, my father, who teaches design as it applies to human art and is therefore rather convinced by this apologetic, is the most vocal proponent I know, but I hear jabs about watches found on walking paths and Boeing 747s found in deserts fairly often from others as well.

Dawkins’ answer is, no surprise, that the watchmaker that designed the complexity of life is a blind one: the “designer” is Darwinian evolution. For now I’ve only gotten to page 13, where he’s currently defending reductionism (or more specifically, the hierarchical variety):

For those that like ‘ism’ sorts of names, the aptest name for my approach to understanding how things work is probably ‘hierarchical reductionism’. If you read trendy intellectual magazines, you may have noticed that ‘reductionism’ is one of those things, like sin, that is only mentioned by people who are against it. To call oneself a reductionist will sound, in some circles, a bit like admitting to eating babies. But, just as nobody actually eats babies, so nobody is really a reductionist in any sense worth being against. The nonexistent reductionist–the sort that everybody is against, but who exists only in their imaginations–tries to explain complicated things directly in terms of the smallest parts, even, in some extreme versions of the myth, as the sum of the parts! The hierarchical reductionist, on the other hand, explains a complex entity at any particular level in the hierarchy of organization, in terms of entities only one level down the hierarchy; entities which, themselves, are likely to be complex enough to need further reducing to their own component parts; and so on. It goes without saying–though the mythical, baby-eating reductionist is reputed to deny this–that the kinds of explanations which are suitable at high levels in the hierarchy are quite different from the kinds of explanations which are suitable at lower levels… Reductionism, in this sense, is just another name for an honest desire to understand how things work.

On a similar note, here’s an article on PubMed about “Reductionism and Antireductionism,” and another on the “Search for organizing principles: understanding in systems biology.” (The latter seems to advocate principles of holism as opposed to reductionism as a means for understanding complex systems.)

Over at Meta-Library, a writer highlights a quote by Francis Crick (co-discoverer of the structure of DNA), that “The ultimate aim of the modern movement in biology is in fact to explain all biology in terms of physics and chemistry.” The writer (evidently a Christian, a dualist, and not a big fan of reductionism based on his disagreements with Dawkins and Wilson) asks three questions to consider whether Crick’s attempt to reduce biology to physics and chemistry has or ever will succeed. The three questions are as follows:

  1. do the laws of physics and chemistry apply to the atoms and molecules of living things?
  2. are the interactions of atoms and molecules according to physics and chemistry sufficient to account for biological phenomena, or are other kinds of interaction needed?
  3. can biological theories be deduced logically from the theories of physics and chemistry?

The author, a certain Dr. Southgate, answers yes to the first two and no to the last. I think the answer is yes to all three–or at least that such deductions are possible in principle, not that we have necessarily already made them.

Today, during the opening lecture of my physics course, the professor made the statement that electromagnetic forces are what make everything work, at least on the scale that we normally experience–the biological scale (he noted the existence of nuclear forces, which we don’t often experience clearly firsthand). If it weren’t for those electromagnetic forces, gravity would pull a dropped ball straight through the earth because none of the particles would hold together. None of chemistry would work (or at least be the same) if the nature of electromagnetic reactions were different, and biochemistry, biology, ecology, and on and on would each in turn be different.
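My professor’s point can be made quantitative with a back-of-the-envelope calculation I find helpful (this is my own illustration, not from the lecture): compare the electric repulsion and gravitational attraction between two protons. Both forces fall off as 1/r², so the ratio doesn’t depend on distance at all.

```python
# Back-of-the-envelope: why electromagnetism, not gravity, holds matter together.
# Compare the Coulomb repulsion and gravitational attraction between two
# protons. Both scale as 1/r^2, so the ratio is independent of separation.

K_E = 8.988e9         # Coulomb constant, N m^2 / C^2
G = 6.674e-11         # gravitational constant, N m^2 / kg^2
E_CHARGE = 1.602e-19  # proton charge, C
M_PROTON = 1.673e-27  # proton mass, kg

# F_coulomb / F_gravity = (k e^2 / r^2) / (G m^2 / r^2) = k e^2 / (G m^2)
ratio = (K_E * E_CHARGE**2) / (G * M_PROTON**2)
print(f"Electric force is ~{ratio:.1e} times stronger than gravity")
# -> roughly 1.2e36
```

That factor of ~10³⁶ is why a dropped ball stops at the floor: the gravitational pull of the entire Earth is no match for the electromagnetic repulsion between the outer electrons of the ball and those of the ground.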

While all the phenomena of biology–such as human consciousness or religious belief–may not be fully reducible yet, I see no epistemic or ontological limitations that should halt our pursuit of reductionism. So Dr. Southgate would have to justify his answer of “no” to question number three. Reductionism works; postulating that it will fail in the future because it might explain realms normally claimed by other disciplines, like psychology or theology, requires an explanation.