Category Archives: Atheism & Religion

Between strident atheism and vanilla ecumenicism

I am a skeptic and an atheist. And now I have to immediately qualify those words. I am a skeptic in the sense that I strive, as David Hume aptly put it, to proportion my beliefs to the available evidence – a principle that Carl Sagan famously turned into “extraordinary claims require extraordinary evidence” in the specific case of pseudoscience.

I am an atheist in the etymologically literal sense of the word: a-theist, without a positive belief in gods. I don’t profess to know that there are no gods, but simply that I don’t see sufficient evidence or reasons in favor of the notion. Likewise, I am an a-unicornist, I don’t believe in unicorns, since they don’t appear anywhere in the fossil record or among contemporary zoological catalogues (yes, yes, I know about narwhals, as well as unicorned rhinos).

These qualifications explain why I am often critical of certain segments of the skeptic and atheist communities. I don’t think “skeptics” do anyone a favor when they engage in silly hoaxes, and I believe they (we, really) could benefit from a bit less arrogance and a bit more virtue epistemology. Likewise, I have never been a fan of the so-called New Atheists, whom I find strident in behavior and philosophically ill-informed.

That said, I’m also not on board with what I’ve come to regard as vanilla ecumenicism, an increasingly popular stance which argues that there is not, and there has never been, a conflict between science and religion, or philosophy and religion, pace Giordano Bruno, Galileo, and a number of others. A recent example (but only one among many) of such an attitude is an article by Peter Adamson in the LA Review of Books. Adamson is a professor of Late Ancient and Arabic Philosophy, and one whose “history of philosophy without any gaps” books I actually use in my introductory courses. In that particular article, he was favorably reviewing Open to Reason: Muslim Philosophers in Conversation with the Western Tradition, by Souleymane Bachir Diagne.

The gist of the book, and the review, is that – contra popular opinion even among philosophers – the Islamic tradition has always been open to reason and science. I honestly think that’s a welcome corrective, and yet at the same time more than a bit of an overstatement. But my beef here is neither with Diagne’s book nor with Adamson’s review of it. Rather, I take issue with the following statement, which appears right at the beginning of the LARB article:

“One of the most common prejudices we historians of philosophy encounter is the notion that philosophy is somehow incompatible with religious belief. Religion is based on faith, philosophy on reason; religion is rigorously imposed doctrine, philosophy is open-ended inquiry; religion is about believing what you’re told, philosophy about figuring things out for yourself. A moment’s reflection will show you that it must be a little more complicated than that. After all, nearly all philosophers in history – famous and obscure, ancient and modern, Western and non-Western, male and female – have been religious believers.”

Let’s start with the last bit: nearly all philosophers in history have been religious believers. Well yes, and so have been nearly all scientists. But the question is whether they were believers in spite of being philosophers or scientists, or whether the two really did go hand in hand. Yes, Galileo was a Catholic, and yet anyone thinking that he experienced no conflicts between his science and his religion as interpreted by Catholic theologians is either not paying attention or is engaging in some seriously misguided historical revisionism (which some people are, yes, I’m aware).

True, Newton spent more time doing biblical criticism than studying physics. But he also spent more time doing alchemy than physics, which is no good reflection on alchemy. And of course, he is celebrated for his physics, not for his biblical criticism or alchemic studies.

Indeed, insofar as science goes, the best way to summarize the conflict with religion was articulated by physicist Richard Feynman, in his The Meaning of It All: Thoughts of a Citizen-Scientist. There he writes that we can discuss general principles and cherry pick historical examples and counterexamples all we want, but when push comes to shove, the ideal scientist is always open to change her mind when new evidence comes in. The religious believer, by contrast, puts faith ahead of reason. Even the many centuries of Christian apologetics are one gigantic attempt to reconcile the “book of nature” with the “book of God.” Church fathers like Basil, Gregory of Nyssa, Augustine, John Cassian, John Chrysostom, Ephrem the Syrian, and Maximus the Confessor believed that the two books tell the same story, and yet whenever religious authors have perceived a conflict between the two, there was no hesitation about which got precedence. Yes, Galileo did write – quoting Tertullian – “We conclude that God is known first through Nature, and then again, more particularly, by doctrine; by Nature in His works, and by doctrine in His revealed word.” But we know how he ended his days…

[Side note: it is more than a bit ironic that Feynman, who was famously contemptuous of philosophy, quipping that it is as useful to science as ornithology is to birds, wrote a number of books that are exquisitely philosophical in outlook.]

What goes for science goes, mutatis mutandis, for philosophy, though things there are a bit murkier. Yes, some of the greatest philosophers in history were also theologians, like Augustine of Hippo or Thomas Aquinas. But most philosophy – and certainly, in my mind, the best – has nothing to do with gods and the like. Huge swaths of metaphysics, ethics, political philosophy, and aesthetics, as well as pretty much all of epistemology, logic, and the many “philosophies of” (science, language, mind, and so forth), have nothing whatsoever to do with religion. Indeed, to bring gods into such discussions would be very rightly frowned upon, as if one were to mention the possibility of supernatural explanations in the discussion section of a scientific paper.

Where religion plays a major role in philosophical discussions, as in the problem of free will, it doesn’t come out very well. The whole notion of “free” (meaning contra-causal) will is incoherent, and has historically been defended by Christian theologians embarrassed by the problem of evil. Even there, it’s not a good response, since at most it takes care of the problem of human evil (you know, we’ve got free will, so the resulting shit in the world is on us), but not of the twin issue of natural evil (i.e., it doesn’t even begin to explain earthquakes, cancer, and so on). Here is a funny rendition of the problem.

Yes, I’m aware that there are arguments (and counter-arguments) for all this. But they are hardly convincing, always feeling like rationalizations in defense of the indefensible. Do I know that there is no God, for a fact? Of course not, see above about unicorns. But as Pierre-Simon Laplace may or may not have told Napoleon when the latter was inquiring about the role of God in the former’s theory of the origin of the solar system: “Je n’avais pas besoin de cette hypothèse-là” (I had no need of that hypothesis). In a similar fashion, we don’t need God in philosophy in order to talk about right and wrong (since Plato’s Euthyphro), or whether the human mind is a computer, or the nature of science, or the structure of language, or the validity of modus ponens, or… you get the point.

And, I would argue, the most fundamental locus of friction between philosophy and religion is precisely the one singled out by Feynman in the case of science: attitude. An ideal philosopher will follow an argument wherever it leads, while a theologian will impose restrictions to guard his faith, and eventually will in fact use faith as a trump card (or respond to a penetrating objection with some entirely uninformative phrase along the lines of “the will of God is inscrutable,” often accompanied by literal hand waving).

So I find myself navigating the treacherous waters between the Scylla of scientistic atheism and the Charybdis of vanilla ecumenicism. No, religion is not “the root of all evil,” and yes, it is historically true that religious institutions have done a lot of good for humanity, alongside the notorious bad. But equally, let us not lull our critical sense and think that there isn’t something radically at odds between an approach that situates faith as fundamental and another one (be it science or philosophy) that values evidence and logic above all.

Why Alex Rosenberg is wrong just about everything

Philosophy is my second academic career. My first one was in science – evolutionary biology, to be specific. Depending on how you look at it, this makes me either unusually competent in two normally widely distinct areas of academic scholarship, or barely making the passing grade in both. Be that as it may, I have made it a personal hobby to observe my new profession from the outside, as much as that is possible, sort of like an anthropologist looking into a different yet sufficiently familiar culture.

One of the things I’ve noticed is that philosophers are unusually critical of their own field, with a number of prominent ones, both now and historically, actually arguing that it should be dismantled, usually in favor of science (or linguistics). I will not get into that debate here, as I’ve covered it in detail before.

Another thing I’ve noticed is the high frequency of colleagues who are fascinating for being very smart and well regarded in the field, and yet – in my admittedly non-humble opinion – completely wrong. Perhaps the quintessential example is David Chalmers, he of “philosophical zombies,” “hard problem of consciousness,” “singularity,” “mind uploading,” “panpsychism,” and similar inane notions. But this post isn’t about David.

It’s about Alex Rosenberg. Alex is on the faculty at the prestigious Duke University in North Carolina, and someone I think should get a medal (together with Chalmers, of course) for the highest number of wrongheaded papers in a philosophical career. I met him a few years ago during a two-day conference on “Moving naturalism forward,” organized by cosmologist Sean Carroll. The conference was fun, but Alex kept trying to convince us of a notion that he called “happy nihilism,” according to which the universe is devoid of meaning (of course it is, meaning is a human construct), free will doesn’t exist (of course it doesn’t, if one uses the term in the contra-causal sense), and yet, somehow, we can still decide to take all of this on board and be happy.

Setting aside the devastating criticism Alex got at the conference from Dan Dennett, Owen Flanagan, Terrence Deacon, and others, this is also the same bleak picture of the world he presented in his dismal The Atheist’s Guide to Reality, which I reviewed for The Philosophers’ Magazine. Here is a taste of my thinking at the time:

“As a former scientist and now philosopher, I have chastised some of my colleagues for their scientistic attitude. … Thanks to [Rosenberg], I can no longer be accused of fighting a straw man. Rosenberg’s attempt is valiant and will give people much to think about. Except, of course, that according to Rosenberg we cannot really think such things because scientism ‘says’ that chunks of matter cannot possibly produce insights about anything at all, on penalty of violating physicalism.”

Never mind that such statements are obviously self-contradictory. What was I doing while reading Alex’s book if not thinking about what he wrote? And what was he doing while writing the book? These are all illusions, claims Alex, apparently using the word “illusion” in a novel and profound way that the rest of us are unaware of. I continued my review:

“Take Rosenberg’s denial of the existence of conscious decision-making. Consciousness for him is an epiphenomenon of the brain’s activity. … His major piece of evidence? Benjamin Libet’s experiments in cognitive science. … We are informed [that] ‘consciousness is probably too big a deal not to have been organized by natural selection to solve some design problem or other, perhaps several. Exactly what its functions are, what design problem it solves, neuroscience has not yet figured out.’”

Seriously? Let us set aside that Alex completely misinterprets the implications of Libet’s famous experiments, even contradicting Libet’s own interpretation. He admits that natural selection must have evolved consciousness – which depends on brain structures that are exceedingly metabolically costly – for some reason, but he can’t think of one. Hmm, let’s see, how about the ability to reflect on our actions, make deliberate decisions, plan things ahead? Oh right, those are all illusions. Naturally. Me again:

“For Rosenberg there is no free will, morality, meaning, aboutness and so on because, you see, ‘the physical facts fix all the facts.’ We are never told exactly what this slogan actually means. Well, I’m a big fan of physics, but last time I checked, it didn’t, for instance, ‘fix’ the fact that 2+2=4.”

Nor does physics fix anything at all in the rest of mathematics. Or in logic. Continuing the review:

“Rosenberg thinks that economics, the social sciences (not to mention literature, the arts, and his own field of philosophy) are all ‘stories’ that may entertain us, but that should by no means be taken seriously. He doesn’t seem to realize that science – not to mention his very book – also tells stories … because that is the way human beings communicate knowledge and achieve understanding. Science is the right type of story if you want to know about cosmology, but not if you want to learn logic.”

Or history. Or art. I concluded:

“Rosenberg’s scientistic nihilism is analogous to radical skepticism about reality. … It’s thought provoking, there is no scientific evidence that can possibly rule in its favor or against it, and it is best promptly forgotten so that you can get back to thinking about the things that really matter to you.”

Alex, impervious to criticism (well, “he” is only a bunch of subatomic particles without will or aboutness, so – to be fair – how could he change his mind, especially given that the latter is an illusion?), has continued in the same vein in recent years. Just in the last few weeks I’ve read two more articles by him that finally prompted me to write this essay.

The first one, published in The Verge, is actually an interview conducted by Angela Chen, in which Alex “explains” how our addiction to stories keeps us from understanding history. The interview is about (but wait, nothing is about anything!) his book How History Gets Things Wrong: The Neuroscience of Our Addiction to Stories. First problem: whenever I hear the words “the neuroscience of…” I instinctively reach for my gun (fortunately, I’m a quasi-pacifist, and I don’t own guns). That’s because nowadays a lot of nonsense is written in the name of neuroscience, unfortunately.

The main thrust of Alex’s argument is that neuroscience undermines what is often referred to as our “theory of mind,” the ability to guess other people’s thoughts and motivations. Since historians deploy – without realizing it – a theory of mind whenever they talk about this or that historical figure’s motivations for acting one way or another, their theorizing is made hopelessly obsolete by the modern science of the brain.

Except that Alex is making an astounding mistake here, very similar to the one made, for instance, by fellow atheist Sam Harris in his The Moral Landscape (see my review here). He is mistaking a mechanistic explanation of X for the explanation of X, apparently forgetting (or simply outright denying) that explanations – which are human constructs, let us not forget – can be given at different levels, and in different languages, depending on how useful they are to their target recipients, i.e., other human beings.

Let me give you an analogous example to show just how bizarre Alex’s claim that neuroscience does away with historical explanations really is. Imagine we were interested in the “neural correlates,” as cognitive scientists call them, of mathematical problem solving. We can stick someone – even a mathematician – into an fMRI machine and find out which areas of her brain light up when she is engaged in simple or complex mathematical thinking, from solving a basic equation to proving Fermat’s Last Theorem.

Now, we will surely find some such neural correlates. We have to, since everything we do, and certainly any kind of higher, conscious thinking, has to be done by way of engaging one part or another of our brains. Otherwise, it would be magic.

But now imagine that our neuroscientist completes his experiment, gets the mathematician out of the fMRI machine, and gingerly informs her that mathematicians are no longer needed, because neuroscience has discovered which areas of the brain they use to solve mathematical problems. Crazy, right? Well, it’s no different from Alex’s reasoning for getting rid of historians, or Harris’ “argument” (I’m using the word charitably) for concluding that science, and neuroscience (which just happens to be his own field) in particular, can now answer moral questions. Ethicists can go play golf.

A few weeks later, Alex did it again! This time in an article he penned himself for 3:AM Magazine, entitled “Is neuroscience a bigger threat than artificial intelligence?” Oh boy. It’s the same basic idea that he has been peddling since The Atheist’s Guide to Reality, though – as in The Verge article – this time it isn’t physics that “fixes all the facts,” it is neuroscience that answers all the questions.

After acknowledging the (alleged, and I think way overblown) threat posed by future advanced AI to humanity (you know, the Singularity again, Terminator, and that sort of thing), Alex informs us that the real existential downfall of humanity comes from the research of four Nobel-winning neuroscientists: Eric Kandel, John O’Keefe, Edvard [sic], and May-Britt Moser. What have they done?

“Between them they have shown that the human brain doesn’t work the way conscious experience suggests at all. Instead it operates to deliver human achievements in the way IBM’s Watson does. Thoughts with meaning have no more role in the human brain than in artificial intelligence.”

By now you have surely guessed that this is, again, about the alleged failure of the theory of mind, and that, once again, Alex is simply confusing different levels of explanation, an elementary mistake that you would think a trained philosopher simply wouldn’t make.

The fascinating thing is that Alex actually acknowledges that there is quite a bit of evidence for the theory of mind:

“Several sources of evidence suggest that we have an innate mind-reading ability more powerful than other primates. It’s an ability to track other people’s actions that is triggered soon after birth. Child psychologists have established its operation in pre-linguistic toddlers, while primatologists have shown its absence in other primates even when they exceed infants in other forms of reasoning. Social psychologists have established deficiencies in its deployment among children on the Autism spectrum. fMRI and transcranial magnetic stimulation studies have localized a brain region that delivers this mind-reading ability. Evolutionary anthropology, game theory and experimental economics have established the indispensability of powerful mind reading for the cooperation and collaboration that resulted in Hominin genus’s rapid ascent of the African savanna’s food chain.”

None of this matters, because neuroscience has (allegedly) “revealed” to us that the theory of mind is “quite as much of a dead end as Ptolemaic astronomy.” Why? Because Kandel and colleagues have shown that if you look into the brain you won’t find beliefs, desires, or reasons, but only specific, dynamic neural pathways.

No kidding, Sherlock. That’s because what we call beliefs, desires, and reasons are instantiated in the brain by way of specific neural pathways. The neurobiological level is more basic – but, crucially, no more true – than the psychological one. The two provide complementary, not competing, explanations of the same phenomenon. One explanation is more useful to biologists and neuroscientists, the other to psychologists, historians, and art critics, among others.

It’s like the much abused and misunderstood example of the chair in which you may be sitting at this particular time. Physics tells us that said chair is “really” just a collection of quarks, interacting in the way prescribed by the fundamental laws of nature. This is certainly the case, but it is by a long shot not the whole picture. Your chair is also “solid” at the level of analysis pertinent to human beings who wish to sit down in order to read a blog post, not to mention those other human beings who designed and built the chair itself. The chair is most definitely not an illusion just because it can (usefully, depending on the context) be described in different ways. Explanatory complementarity, not competition.

A side note, as a biologist, on Kandel et al.’s indubitably fascinating scientific work: it was done on rats, because the pertinent experiments are too invasive, and therefore unethical, to be conducted on human beings. With his usual braggadocio, Alex informs us that this doesn’t matter at all:

“Of course you could argue that what Nobel Prize winning research shows about rats is irrelevant to humans. But you’d be flying in the face of clinical evidence about human deficits and disorders, anatomical and physiological identities between the structure of rat and human brains, and the detailed molecular biology of learning and information transmission in the neuronal circuitry of both us and Rattus rattus, the very reasons neuroscientists interested in human brains have invested so much time and effort in learning how rat brains work. And won Nobel Prizes for doing it.”

I’ve got news for Alex: while, again, Kandel et al.’s research is most certainly important, enough to win the Nobel, translating things from rats to humans is definitely not that obvious or straightforward. It is simply false that rat and human brains have a large number of anatomical and physiological identities, as the perusal of any introductory book on mammalian anatomy will readily confirm. Heck, our brains are substantially different from those of higher primates like chimpanzees and bonobos, which is a major reason we need to be careful when we extrapolate from the latter (let alone rats) to humans. For instance, we have little to go by, in terms of comparative brain anatomy and physiology, to explain exquisitely and crucially human traits like language (not just communication) and iterative cultural evolution. Take a look at this book by my colleague Kevin Laland to appreciate just how careful biologists (as distinct from some philosophers) are when it comes to interspecies comparisons.

Don’t get me wrong. Alex Rosenberg is a really smart guy, and his misguided writings are necessary in order to sharpen our thinking about all sorts of matters. After all, the British Royal Society awarded physicist Fred Hoyle (the author of the steady state theory in cosmology, which for a while rivaled the big bang theory) a medal for the highest number of wrong ideas proposed in a scientific career. This was not an example of British sarcasm, they meant it in all seriousness, as Hoyle’s theories have arguably played an important role in advancing cosmology. Perhaps we should establish a similar prize in philosophy. I have a couple of candidates in mind…

Why I’m still a (non-card-carrying) Skeptic

I just came back from Las Vegas, where I had a lovely time at the annual CSICon event, organized by the folks who bring you Skeptical Inquirer magazine, among other things. As I’ve done almost since the beginning of my involvement with the skeptic movement, back in – gasp – 1997, I delivered a bit of a gadfly talk. This one was about scientism, reminding my fellow skeptics that they have a tendency to overdo it with the science thing, at times coming across as nearly as evangelical and obtuse as their usual targets, from creationists to UFO believers. After asking the audience to be patient with me and not serve me hemlock for lunch, I minced no words and criticized by name some of the big shots in the field, from Neil deGrasse Tyson to Richard Dawkins, from Sam Harris to Steven Pinker. And of course several of those people were giving talks at the same conference, either right before or right after me.

No hemlock was served, and I got less resistance to my chastising than usual from the audience. Some people even approached me later on telling me how much they appreciated my reminder that our community is not perfect and we need to do better. It was all very congenial, set against the perfect backdrop of the ultimate fake city in the world, and accompanied by the occasional dirty martini.

On my way back to New York I got a tweet from a follower linking to yet another “I resign from the skeptic movement and hand in my skeptic card” article, written by a prominent (former) skeptic. It doesn’t matter who. The list of complaints by that author is familiar: a tendency toward scientism, a certain degree of sexism within the movement, and a public failure to lead on the part of some of the de facto leaders. The same issues that I have been complaining about for years (for instance, here). But I have not quit, and do not intend to quit. Why?

The uncharitable answer would be because I’m part of the privileged elite. I doubt anyone would seriously consider me a “leader” in the movement, but I have certainly been prominent enough. And I am a male. White. Heterosexual. The problem is, uncharitable views are highly unhelpful, and I’m on record advocating on behalf of diversity in the movement, against sexual harassment, and – as I mentioned above – have made a mini-career of stinging the big shots every time I think they deserve it, which is rather often. So I’m afraid a casual dismissal based on my gender, sexual preference and ethnicity will not do. Quite apart from the fact that it would be obviously hypocritical on the part of anyone who claims that gender, sexual preference and ethnicity should not be grounds for blanket statements of any kind.

No, I stay because I believe in the fundamental soundness of the ideas that define modern skepticism, and also because I think quitting to create another group is an example of an all too common fallacy: the notion that, despite all historical evidence to the contrary, next time we’ll definitely get it right and finally create utopia on earth. Let me elaborate on each point in turn.

“Skepticism,” of course, has a long history in philosophy and science. The original Skeptics of ancient Greece and Rome were philosophers who maintained that human knowledge is either highly fallible or downright impossible (depending on which teacher of the school you refer to). Consequently, they figured that the reasonable thing to do was either to abstain entirely from any opinion, or at least to hold on to such opinions as lightly as possible. Theirs wasn’t just an epistemological stance: they turned it into a style of life, whereby they sought serenity of mind by detaching themselves emotionally from those opinions (political, religious) that others held so strongly and often died for. Not my cup of tea, but if you think about it, it’s not a bad approach to good living at all.

The philosopher who embodies modern skepticism most closely, however, is the Scottish Enlightenment figure par excellence, David Hume. He held an attitude of open inquiry, considering every notion worth investigating and leaving the (provisional) verdict of such investigations to the empirical evidence. He famously said that a reasonable person proportions his beliefs to the available facts, a phrase later turned by Carl Sagan into his hallmark motto: extraordinary claims require extraordinary evidence.

The contemporary skeptic movement was the brainchild of people like philosopher Paul Kurtz (the founder of the organizations that preceded CSI, as well as of Skeptical Inquirer), magician James “the Amazing” Randi (organizer of the long-running conference that preceded CSICon, known as TAM, The Amazing Meeting), Carl Sagan himself, and a number of others. Initially, the movement was rather narrowly devoted to the debunking of pseudoscientific claims ranging from UFOs to telepathy, and from Bigfoot to astrology.

More recently, mainly through the efforts of a new generation of leaders – including but not limited to Steve Novella and his group, Michael Shermer, Barry Karr, and so forth – the scope of skeptical analysis has broadened to include modern challenges like those posed by the anti-vax movement and, of course, climate change. Even more recently, young people from a more diverse crowd, finally including several women like Rebecca Watson, Susan Gerbic, Kavin Senapathy, Julia Galef, and many others, have further expanded the discourse to include an evidence-based treatment of political issues, such as gender rights and racism.

The values of the skeptic movement, therefore, encompass a broad set that I am definitely on board with. At its best, the community is about reason broadly construed, critical but open-minded analysis of extraordinary claims, support for science-based education and critical thinking, and welcoming diversity within its ranks.

Of course, the reality is, shall we say, more complex. There have been plenty of sexual harassment scandals involving high-profile members of the community. There is that pesky tendency toward closing one’s mind and dismissing, rather than investigating, claims of the paranormal. And there is a new, annoying vogue of rejecting philosophy, despite the fact that a skepticism (or even a science) without philosophical foundations is simply impossible.

But this leads me to the second point: I think it far more sensible to stay and fight for reform and improvement rather than to “hand in my skeptic card” (there is no such thing, of course) and walk away. Because those who have walked away have, quite frankly, gone nowhere. Some have attempted to create a better version of what they left, like the thankfully short-lived “Atheism+” experiment of a few years ago.

The problem with leaving and creating an alternative is that the new group will soon enough inevitably be characterized by the same or similar issues, because people are people. They diverge in their opinions, they get vehemently attached to those opinions, and they fight tooth and nail for them. Moreover, people are also fallible, so they will in turn engage in the same or similar behaviors as the ones that led to the splintering of the group in the first place, including discrimination and harassment. So the whole “I’m leaving and creating a new church over there” approach ends up being self-defeating, dispersing resources and energy that could be far better used to improve our own household from within while we keep fighting the good fights we inherited from the likes of Kurtz and Sagan.

So, no, I’m not leaving the skeptic movement. I will keep going to CSICon, NECSS, the CICAP Fest, and wherever else they’ll invite me. I will keep up my self-assigned role of gadfly, annoying enough people and hopefully energizing a larger number, so that we keep getting things more and more right. After all, this is about making the world an at least slightly better place, not our personal utopia tailored to our favorite political ideology.

They’ve done it again: another embarrassing moment for the skeptic movement

In a few days I will be in Las Vegas. No, it's not what you may be thinking. I'll be the token skeptic at one of the largest conferences of skeptics: CSICon, courtesy of the same people who publish Skeptical Inquirer magazine, for which I wrote a column on the nature of science for a decade. I say "token skeptic" because I have been invited by the organizers to talk about scientism, the notion that sometimes science itself is adopted as an ideology, applied even where it doesn't belong or isn't particularly useful (here is a video about this).

I have been both a member and a friendly internal critic of the skeptic community since the late '90s, and I have been reminded of the value of such a gadfly-like role very recently, with the publication of yet another "skeptical" hoax co-authored by philosopher Peter Boghossian and author James Lindsay, this time accompanied by Areo magazine's Helen Pluckrose. The hoax purports to demonstrate once and for all that what the authors disdainfully refer to as "grievance studies" (i.e., black studies, race studies, women's studies, gender studies, and allied fields) is a sham hopelessly marred by leftist ideological bias. The hoax does no such thing, although those fields are, in fact, problematic. What the stunt accomplishes instead is to reveal the authors' own ideological bias, as well as the poverty of critical thinking among major exponents of the self-professed skeptic community. But let's proceed in order.

Boghossian and Lindsay made a first, awkward attempt at this last year, by submitting a single fake paper entitled "The Conceptual Penis as a Social Construct." It was a disaster: the paper was, in fact, rejected by the first (very low-ranking) journal they submitted it to, and only got published in an unranked, pay-per-publish journal later on. Here is my commentary on why Boghossian and Lindsay's achievement was simply to shine a negative light on the skeptic movement, and here is a panel discussion about their failure at the North East Conference on Science and Skepticism later in the year. That did not stop major exponents of the skeptic movement, from Michael Shermer to Steven Pinker, from Richard Dawkins to Sam Harris and Jerry Coyne, from praising Boghossian and Lindsay, which is why I maintain the episode was an embarrassment for the whole community.

The hoax, of course, was modeled after the famous one perpetrated by NYU physicist Alan Sokal at the expense of the (non-peer-reviewed) postmodernist journal Social Text, back in the '90s, at the height of the so-called science wars. Sokal, however, is far more cautious and reasonable than Boghossian & co., writing about his own stunt:

From the mere fact of publication of my parody I think that not much can be deduced. It doesn’t prove that the whole field of cultural studies, or cultural studies of science — much less sociology of science — is nonsense. Nor does it prove that the intellectual standards in these fields are generally lax. (This might be the case, but it would have to be established on other grounds.) It proves only that the editors of one rather marginal journal were derelict in their intellectual duty.

In fact, Sokal himself published some good criticisms of the conceptual penis hoax.

Not having learned their lesson at all, Boghossian & co. engaged in a larger project of the same kind, this time sending out 21 fake papers to a number of journals, mostly in women and gender studies. Two thirds of the papers were rejected. Of the seven accepted papers, one was a collection of (bad) poetry, and thus really irrelevant to the objective at hand; two were simply boring and confusing, like a lot of academic papers; one was a self-referential piece on academic hoaxes that one independent commentator actually judged to be making “somewhat plausible arguments”; and three more included fake empirical evidence. As Daniel Engber says in Slate:

One can point to lots of silly-sounding published data from many other fields of study, including strictly scientific ones. Are those emblematic of ‘corruption’ too?

Indeed, there are several examples of this in the literature, like a 2013 hoax that saw a scientific paper about anti-cancer properties in a chemical extracted from a fictional lichen published in several hundred journals. Hundreds, not just half a dozen!

It is well worth reading the entirety of Engber's commentary, which exposes several problematic aspects of Boghossian et al.'s stunt. The major issues, as I see them, are the following:

1. Hoaxes are ethically problematic, and I honestly think Portland State University should start an academic investigation of Peter Boghossian's practices. In the first place, I doubt the study (which was published in Areo magazine, not in a peer-reviewed journal!) obtained the standard clearance required for research on human subjects. Second, the whole enterprise of academic publishing assumes that one is not faking things, particularly data. So tricking reviewers in that fashion at the very least breaches the ethical norms of any field of scholarship.

2. The authors make a big deal of the ideological slant of the fields they target, apparently entirely oblivious to their own ideological agenda, which explicitly targeted mostly women and gender studies. Both Boghossian and Lindsay have published a series of tweets (see Engber’s essay) that nakedly display their bias. Is the pot calling the kettle black?

3. While we can certainly agree that it is disturbing that academic journals publish papers that are more or less obviously fake, this is not a good criticism of the target fields. You know what a good criticism would look like? It would take the form of a serious, in-depth analysis of arguments proposed by scholars in those fields. But Boghossian & co. actually proudly proclaimed, after their first hoax, that they have never read a paper in "X studies," which means that – literally – they don't know what they are talking about. Here is one example of how to do it.

4. What Boghossian et al. really want to convey is that "X studies" are intellectually bankrupt, unlike other academic disciplines, particularly scientific ones. But as the anti-cancer hoax mentioned above, and several other examples, show, this is simply not the case. Corruption of academic culture, resulting either from ideological bias or from financial interests (pharmaceutical companies are well known to establish entire fake journals to push their products), is not limited to certain small corners of the humanities.

5. In a related fashion – and surprisingly, given that Boghossian actually teaches critical thinking – while the first hoax fatally suffered from a sample size of n=1, the new one is plagued by the simple fact that it has no control group! Without a similar systematic attempt directed at journals in other fields (particularly scientific ones) we can conclude precious little about the specific state of "X studies."

That said, do I think that the fields targeted by Boghossian & co. are problematic? Yes, as I’ve written before. Here the most useful commentary on the hoax has been published in the New York Times by William Eggington. As he puts it:

The problem is not that philosophers, historians or English professors are interested in, say, questions of how gender or racial identity or bias is expressed in culture or thought. Gender and racial identity are universally present and vitally important across all the areas that the humanities study and hence should be central concerns. The problem, rather, is that scholars who study these questions have been driven into sub-specializations that are not always seen as integral to larger fields or to the humanities as a whole. Sometimes they have been driven there by departments that are reluctant to accept them; sometimes they have been driven there by their own conviction that they alone have the standing to investigate these topics.

That strikes me as exactly right. “X studies” programs should be integrated within a university, either (ideally) in broad multidisciplinary programs, or within the most suitable departments, such as History, Philosophy, Sociology, and the like.

Eggington blames academic hyperspecialization for the current sorry state of affairs in these fields, as well as the "publish or perish" attitude that has plagued academia for decades now. But guess what? "X studies" are most definitely not the only ones to suffer from these problems. They are endemic to the whole of modern academia, including the natural sciences. Indeed, we should be far more worried about the influence of ideology and big money on scientific fields than on small areas of the humanities. After all, it is in the name of science that we spend billions annually, and it is from science that we expect miracles of medicine and technology.

As Engber writes in the Slate commentary, notwithstanding the dire warnings of Boghossian, Pinker, Harris, Dawkins and all the others:

Surprise, surprise: Civilization hasn’t yet collapsed. In spite of Derrida and Social Text, we somehow found a means of treating AIDS, and if we’re still at loggerheads about the need to deal with global warming, one can’t really blame the queer and gender theorists or imagine that the problem started with the Academic Left. (Hey, I wonder if those dang sociologists might have something interesting to say about climate change denial?)

The new Boghossian-led hoax is another example of a badly executed, ideologically driven stunt that targets narrow fields with little impact while leaving the big elephants in the room alone. It is, in the end, yet another embarrassment for the skeptical community, as well as a reflection of the authors' own biases and narrow-mindedness.

Against ecstasy

My friend Jules Evans has recently published an essay arguing that religion has no monopoly on transcendent experience. The essay is in part inspired by his new book, The Art of Losing Control: A Philosopher’s Search for Ecstatic Experience. Despite the title of this post, I have nothing against ecstatic experiences per se, nor do I think that religion has, or ought to have, a monopoly over them. But I do think Jules gets a good number of things wrong, and I’m going to argue why.

Jules’ Aeon piece opens by recounting a mystical experience that occurred to the British author Philip Pullman back in 1969: “[he] was walking down the Charing Cross Road in London, when his consciousness abruptly shifted. It appeared to him that ‘everything was connected by similarities and correspondences and echoes’. [He] wasn’t on drugs, although he had been reading a lot of books on Renaissance magic. But he told me he believes that his insight was valid, and that ‘my consciousness was temporarily altered, so that I was able to see things that are normally beyond the range of routine ordinary perception.’ He had a deep sense that the Universe is ‘alive, conscious and full of purpose.’ He says: ‘Everything I’ve written has been an attempt to bear witness to the truth of that statement.’”

Jules goes on to say that Pullman calls that sort of experience “transcendent,” but that he prefers the term “ecstatic.” I call it hallucination.

Is it possible that a sudden shift in conscious perception (apparently unprovoked by drugs, though it could have been) gives a human being temporary access to a deeper reality (whatever that means)? Sure, it's possible. Is it the most likely explanation of what happened to Pullman? Hardly. And as I wrote in a previous post, confusing mere logical possibility with actual empirical probability is a major portal into woo-thinking, defined as "adj., concerned with emotions, mysticism, or spiritualism; other than rational or scientific; mysterious; new agey. Also n., a person who has mystical or new age beliefs."

Jules continues: “Over the past five centuries, Western culture has gradually marginalised and pathologised ecstasy. That’s partly a result of our shift from a supernatural or animist worldview to a disenchanted and materialist one. In most cultures, ecstasy is a connection to the spirit world.”

Indeed, although I would call supernatural and animist worldviews rather naive and ungrounded in reality, while disenchanted materialism is about looking at the world as it actually is (insofar as we understand it), and not as we wish it would be. There is, based on what is reasonable to know, no such thing as a spirit world.

Notice, incidentally, Jules’ tendentious use of words here: “disenchanted” and “materialism,” rather than, say, “reason-based” and “naturalism.” To be disenchanted is not usually considered a good thing, as disenchantment is next door to cynicism (with a small-c, not the ancient philosophy). And materialism sounds harsher than naturalism (yes, I’m aware that philosophically the two are not the same thing, but the opposite of supernatural is natural, not material).

Jules mentions an interesting statistic: “The polling company Gallup has, since the 1960s, measured the frequency of mystical experiences in the United States. In 1960, only 20 per cent of the population said they’d had one or more. Now, it’s around 50 per cent.” He takes this as a good sign, telling his readers that if they had some such experience they are not alone. But I find it disturbing that half the population has at times lost contact with reality, and am puzzled by the fact that the percentage has more than doubled in the past half century. Why would that be? Are human beings suddenly developing better abilities to get in touch with the Deep Beyond? More likely (again, possibility vs probability!) we live in times that are alienating and disturbing for a larger and larger chunk of the population, which then seeks relief in fantasies, whether induced by drugs or not. Both the problem (alienation) and the response (fantasizing) are worrisome, because wishful thinking has never been an effective answer to life’s difficulties.

Jules tells us that "the philosopher Bertrand Russell, for example, also had a 'mystic moment' when he suddenly felt filled with love for people on a London street. The experience didn't turn him into a Christian, but it did turn him into a life-long pacifist." I'm not so sure it did; Russell was a lifelong liberal-progressive. But at any rate I can hardly see one of the founders of modern analytical philosophy entertaining for a moment the notion that his subjective experience was somehow a reliable window into an alternate, and better, perception of reality. The revealing phrase here being "it didn't turn him into a Christian"…

Jules got interested in ecstasy after he had a bad accident when he was younger, a near-death experience during which he felt “immersed in love and light.” I’m really glad he survived and recovered, but a fleeting sensation one has under extreme circumstances hardly counts as evidence of a deeper reality, as much as I’m sure it was very psychologically useful to him. When he says “I knew that I was OK, I was loved, that there was something in me that could not be damaged, call it ‘the soul’, ‘the self,’ ‘pure consciousness’ or what-have-you,” I would say that no, there is nothing in you that cannot be damaged, and to believe so is a delusion. You just got very, very lucky. But then again, I am a “disenchanted materialist” who thinks that there is no reason to believe in a soul or a pure consciousness. (Though I do believe there is a self, of the Humean type, i.e., a constantly shifting, dynamic bundle of perceptions. That one too, of course, is hardly indestructible.)

Jules departs from the views of the above mentioned Philip Pullman, who thinks that ecstatic experiences just happen, they cannot be sought: “I disagree. It seems to me that humans have always sought ecstasy. The earliest human artefacts — the cave paintings of Lascaux — are records of Homo sapiens’ attempt to get out of our heads. We have always sought ways to ‘unself,’ as the writer Iris Murdoch called it, because the ego is an anxious, claustrophobic, lonely and boring place to be stuck.”

This passage reveals a number of things. First off, Jules is equivocating (in the philosophical sense, and very likely not on purpose, i.e., not in order to deceive his readers) on the meaning of ecstasy. Art surely is an attempt to "get out of our heads," as he puts it, in a loose sense to "transcend" our selves. But so is, for instance, science. Just watch Carl Sagan's Pale Blue Dot if you doubt it.

Indeed, anything that we human beings do beyond taking care of our basic need to survive is an attempt to transcend ourselves, from paintings to music, from science to mathematics, from religion to philosophy. But it seems very strange to me to assent to the notion that our ego is a lonely and boring place. It is whatever we make of it. There is a wonderful world out there, full of other, fascinating human beings. There is a vast universe out there, full of wonders beyond our imagination. What sort of a small mind could possibly find that either lonely or boring?

How do we actively seek ecstasy, according to Jules? “In its most common-garden variety, we can seek what the psychologist Mihaly Csikszentmihalyi called ‘flow.’ By this he meant moments where we become so absorbed in an activity that we forget ourselves and lose track of time. We could lose ourselves in a good book, for example, or a computer game. The author Geoff Dyer, who’s written extensively on ‘peak experiences,’ says: ‘If you asked me when I’m most in the zone, obviously it would be playing tennis. That absorption in the moment, I just love it.’ … Or we turn to sex, which the feminist Susan Sontag called the ‘oldest resource which human beings have available to them for blowing their mind.’”

Of course. And I lose myself, or experience flow, in all sorts of experiences, including — bizarrely, I know — while writing blog posts or books. But none of this has anything whatsoever to do with Jules’ starting point, which, remember, was the perception of a deeper reality about the world. One can be a perfectly thoroughgoing “disenchanted materialist” and still lose oneself in a game of tennis. Or in sex (I much prefer the latter.)

Jules tells us that “such everyday moments might seem a long way from the mystical ecstasy of St. Teresa of Ávila, but I would suggest that there is a continuum from moments of light absorption and ego-loss to much deeper and more dramatic ego-dissolution. Csikszentmihalyi agrees, saying that moments of flow are ‘the kind of experience which culminates in ecstasy.’”

But there is, in fact, no reason at all to think that either Jules or Csikszentmihalyi are right. Rather than a continuum I see a hopeless mix of apples and oranges, and I seriously doubt St. Teresa would appreciate her mystical views being mentioned in the same sentence as tennis playing and sex.

Yet Jules tells us that “that’s what humans have been doing for hundreds of thousands of years, through various ecstatic techniques such as strenuous dancing, chanting, fasting, self-inflicted pain, sensory deprivation or mind-altering drugs.” Okay, if those are the choices, I’ll take sex and tennis, in that order. Or perhaps a dose of my favorite drug, a dirty martini with three large olives, shaken, not stirred.

Despite his skepticism of disenchanted materialism, Jules does bring in science when it seems to favor his take on things, as many people inclined toward mysticism do: “researchers have discovered that one dose of psychedelics reliably triggers ‘mystical experiences’ — moments where people report a sense of ego-dissolution and connection to all things, including to spirit beings or God. … One dose of psilocybin helped to reduce chronic depression and addiction, and also significantly reduced the fear of death in patients with cancer.”

But, insofar as we can reasonably tell, there are no spirit beings or gods, so what psychedelics are triggering are hallucinations, defined as "a sensory experience of something that does not exist outside the mind, caused by various physical and mental disorders, or by reaction to certain toxic substances, and usually manifested as visual or auditory images." And while there is no doubt that drugs can help with medical conditions, that in no way makes them reliable guides to the Deep Beyond, nor does it mean we should take them to buttress our wishful thinking, in turn generated by our "lonely and boring ego." You feel lonely? Get out and meet people. You feel bored? Read a good book, enter into conversation with the best minds humanity has ever produced. Have sex. Play tennis, even.

And then comes more (pseudo)science from the article: “A 1979 study by the Buddhist teacher Jack Kornfield in California found that 40 per cent of participants on a two-week meditation retreat reported unusual experiences such as rapture and visions (including hellish visions). Kornfield writes: ‘From our data it seems clear that the modern psychiatric dismissal of these so-called ‘mystical’ and altered states as psychopathology … is simply due to the limitations of the traditional Western psychiatric mental-illnesses oriented model of the mind.’”

Uhm, no. What the study shows is that meditation can trigger side effects of the hallucinatory type. Which may still be acceptable if meditation provides benefits to its practitioners (it does, and I myself practice), but, again, it is absolutely no reason to reject "Western" science (i.e., science). If you have hallucinations while taking drugs, you are normal. If you have them at frequent random intervals in your regular life, you should see a psychiatrist.

Jules gives us another fascinating personal testimony: “I spent a year exploring the world of charismatic Christianity, including the globally renowned Alpha course, and eventually succumbed to the ecstasy myself. It happened in a church in Pembrokeshire filled with Pentecostal pensioners. Suddenly, I felt filled with a force that knocked me back and took my breath away. It felt like proof. The preacher asked if anyone wanted to commit their life to Jesus and, at the back of the church, I raised my hand. The next week, I announced my conversion on my newsletter, and around a third of my subscribers immediately unsubscribed.
A few weeks later, however, the high passed, and the doubts came back. There were still basic tenets of Christianity that I couldn’t accept, particularly the idea that the only way to God is through faith in Jesus. So what had happened? Had I been hypnotised by the preacher, the ritual and the crowd emotion? Yes, probably. But that doesn’t mean it was unhealthy or unspiritual.”

Actually, Jules, that’s precisely what it means: it was both unhealthy and unspiritual. As shown by your own rather quick de-conversion (“the high passed”), once you had time to reflect on what had happened.

"Ultimately, there's something in us that calls to us, that pulls us out the door. Let's find out where it leads." Well, go ahead, but proceed with caution. As for me, I'm off to share a nice dirty martini with some of my close friends.

An embarrassing moment for the skeptical movement

Twenty-one years ago physicist Alan Sokal perpetrated his famous hoax at the expense of the postmodernist journal Social Text. It was at the height of the so-called "science wars" of the '90s, and Sokal, a scientist fed up with a lot of extreme statements about the social construction of science, thought of scoring a rhetorical point by embarrassing the other side. He wrote a fake paper entitled "Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity," full of scientific-sounding nonsense, and submitted it to the editors of Social Text. They didn't send it out for peer review and published it as a welcome example of a scientist embracing the postmodernist cause.

Sokal then proceeded to unveil the hoax in the now defunct Lingua Franca, a magazine devoted to academic affairs, thus exposing the sloppy practices of the editors of Social Text while at the same time embarrassing the postmodernist community.

Sokal, however, is no intellectual lightweight, and he wrote a sober assessment of the significance of his stunt, for instance stating:

“From the mere fact of publication of my parody I think that not much can be deduced. It doesn’t prove that the whole field of cultural studies, or cultural studies of science — much less sociology of science — is nonsense. Nor does it prove that the intellectual standards in these fields are generally lax. (This might be the case, but it would have to be established on other grounds.) It proves only that the editors of one rather marginal journal were derelict in their intellectual duty.”

Move forward to the present. Philosopher Peter Boghossian (not to be confused with NYU’s Paul Boghossian) and author James Lindsay (henceforth, B&L) attempted to replicate the Sokal hoax by trick-publishing a silly paper entitled “The Conceptual Penis as a Social Construct.” The victim, in this case, was the journal Cogent Social Sciences, which sent out the submission for review and accepted it in record time (one month). After which, B&L triumphantly exposed their stunt in Skeptic magazine.

But the similarities between the two episodes end there. Rather than showing Sokal’s restraint on the significance of the hoax, B&L went full blast. They see themselves as exposing a “deeply troubling” problem with the modern academy:

“The echo-chamber of morally driven fashionable nonsense coming out of the postmodernist social ‘sciences’ in general, and gender studies departments in particular … As we see it, gender studies in its current form needs to do some serious housecleaning.”

And (a large chunk of especially influential people in) the skeptic community joined the victory parade:

“We are proud to publish this exposé of a hoaxed article published in a peer-reviewed journal today.” (Michael Shermer)

“This is glorious. Well done!” (Sam Harris)

“Sokal-style satire on pretentious ‘gender studies.'” (Richard Dawkins)

“New academic hoax: a bogus paper on ‘the conceptual penis’ gets published in a ‘high-quality peer-reviewed’ journal.” (Steven Pinker)

“Cultural studies, including women’s studies, are particularly prone to the toxic combinations of jargon and ideology that makes for such horrible ‘scholarship.'” (Jerry Coyne)

Except that a mildly closer look shows that Boghossian and Lindsay are no Sokals, and that the hoax should actually be treated as an embarrassment for the skeptic community. Let’s do a bit of, ahem, deconstructing of the conceptual penis affair.

(i) Like the Sokal hoax, the sample size is n=1. Since Boghossian teaches critical thinking, he ought to know that pretty much nothing can be concluded from that sort of "sampling" of the relevant population. That's why Sokal properly understood his hoax as a rhetorical success, a way to put the spotlight on the problem, not as showing anything broader than "that the editors of one rather marginal journal were derelict in their intellectual duty."

(ii) The B&L paper was actually rejected by the first journal it was submitted to, NORMA: The International Journal for Masculinity Studies. Boghossian and Lindsay admit this, but add that they were "invited" to resubmit to Cogent Social Sciences, which is handled by the same prestigious Taylor & Francis publishing group that handles NORMA. The reality is that NORMA itself doesn't even make the list of the top 115 publications in gender studies, which makes it an unranked journal, not a "top" one. Also, if you check Cogent Social Sciences' web site you will see that it operates independently of Taylor & Francis. Oh, fun fact: NORMA's impact factor is a whopping zero… And remember, it actually rejected the paper.

(iii) The “invitation” to resubmit to Cogent Social Sciences was likely an automated email directing the authors to an obvious pay-to-publish vanity journal. See if you can spot the clues from the journal’s description of their acceptance policies. First, authors are invited to “pay what they can” in order to publish their papers; second, they say they are very “friendly” to prospective authors; lastly, they say that they do not “necessarily reject” papers with no impact. Does that sound to you like a respectable outlet, in any field?

(iv) But isn’t Cogent Social Sciences said to be “high quality” by the Directory of Open Access Journals (DOAJ)? It may be, but the DOAJ is community run, has no official standing, and to make it on its list of recommended publications a journal “must exercise peer-review with an editor and an editorial board or editorial review…. carried out by at least two editors.” Even vanity journals easily meet those criteria.

All of the above said, I am indeed wary of "studies" fields, of which women's and gender studies are just a couple of examples. As I've written in the past, my experience actually interacting with some faculty and students in those programs has been that they do have a tendency toward insularity, which could be remedied by integrating them into the appropriate classic departments, like philosophy, history, comparative literature, and the like. That, in fact, was the original intention when these programs first appeared decades ago, and my understanding is that it was the traditional departments that did not want to go down that route, in order to protect their turf, faculty lines, and students' tuition money.

It is also the case that many in "X Studies" programs embrace left-leaning politics and see themselves as activists first, scholars second. This is a problem, as the two roles may come into conflict, with activism prevailing at the expense of sound scholarship. But the problem isn't confined to X Studies: it is found, for instance, in ecology (where a lot of practitioners are also involved with environmentalist organizations), in cultural anthropology (protection, not just study, of indigenous populations), and frankly even in critical thinking and philosophy. I have made a career of studying pseudoscience (academically) while at the same time advocating on behalf of science and reason (blogs, books, articles, podcasts). So the two activities shouldn't be seen as ipso facto incompatible (as, for instance, social psychologist Jonathan Haidt does). But one does need to tread cautiously nonetheless.

Finally, my observation from talking to colleagues in X studies and reading some of their papers (an approach that Boghossian and Lindsay boast of having rejected, because they apparently know a priori that it's all bullshit) is that there is a tendency to embrace a form of environmental determinism — as opposed to its genetic counterpart — about human cognitive and cultural traits. This attitude is not scientifically sound, and it even generates internal conflict, as in the case of some radical feminists who reject any talk of being "trapped in the wrong body" by transgender people. As someone who has actually studied gene-environment interactions I am extremely skeptical of any simplistic claim of either genetic or environmental determination. Human beings are exceedingly complex and inherently cultural organisms, and the best bet is to assume that pretty much everything we do is the highly intricate result of a continuous interplay among genes, developmental systems, and environments.

So yes, X Studies are potentially problematic, and they probably ought to undergo academic review as a concept, as well as be subjected to sustained, external scholarly criticism. But this is absolutely not what the B&L stunt has done. Not even close.

And of course, for balance, let’s remember that science too is subject to disturbingly similar problems (thanks to Ketan Joshi for this brief summary, to which many, many more entries could easily be added — here is a similarly good take):

* Andrew Wakefield, a British anti-vaccination campaigner, notoriously managed to publish a fraudulent paper in the (really) prestigious medical journal Lancet in 1998.

* A US nuclear physics conference accepted a paper written entirely using autocomplete.

* A trio of MIT graduate students created an algorithm that produces fake scientific papers, and in 2013 IEEE and Springer Publishing (really seriously academic publishers) found a whopping 120 published papers that had been generated by the program.

* A paper entitled “Get me off your fucking mailing list” was accepted for publication by a computer science journal.

* A 2013 hoax saw a scientific paper about anti-cancer properties in a chemical extracted from a fictional lichen published in several hundred journals.

And of course let’s not forget the current, very serious, replication crisis in both medical research and psychology. Or the fact that the pharmaceutical industry has created entire fake journals in order to publish studies “friendly” to their bottom line. And these are fields that — unlike gender studies — actually attract millions of dollars in funding and whose “research” affects people’s lives directly.

But I don’t see Boghossian, Lindsay, Shermer, Dawkins, Coyne, Pinker or Harris flooding their Twitter feeds with news of the intellectual bankruptcy of biology, physics, computer science, and medicine. Why not?

Well, here is one possibility:

“American liberalism has slipped into a kind of moral panic about racial, gender and sexual identity that has distorted liberalism’s message” — Michael Shermer, 18 November 2016

“Gender Studies is primarily composed of radical ideologues who view indoctrination as their primary duty. These departments must be defunded” — Peter Boghossian, 25 April 2016

Turns out that a good number of “skeptics” are actually committed to the political cause of libertarianism. This is fine in and of itself, since we are all entitled to our political opinions. But it becomes a problem when that commitment is used as an ideological filter on one’s allegedly critical thinking. And it becomes particularly problematic when libertarian skeptics go on a rampage accusing others of ideological bias and calling for their defunding. Self-criticism before other-criticism, people — it’s the virtuous thing to do.

This latest episode does not, unfortunately, surprise me at all. It fits a pattern that has concerned me for years, as someone who has been very active within the movement and who still identifies with its core tenets. When Steven Pinker openly embraces scientism, turning an epistemic vice into a virtue; or when atheists think that their position amounts to anything more than a negative metaphysical stance — and think that being nasty about it is the way forward; or when atheism, skepticism and scientism are confused with each other for ideological purposes; then I get seriously worried about the future of a movement that has so much potential to help keep the light of reason alive in a society that desperately needs it.

The Boghossian and Lindsay hoax falls far short of the goal of demonstrating that gender studies is full of nonsense. But it does expose for all the world to see the problematic condition of the skeptic movement. Someone should try to wrest it away from the ideologues currently running it, returning it to its core mission of critical analysis, including, and indeed beginning with, self-criticism. Call it Socratic Skepticism(TM).


Update: Steven Pinker has admitted on Twitter that the hoax was a bad idea: “‘Gender studies’ is an academic field that deserves criticism, but The ‘Conceptual Penis’ hoax missed the mark.”

Book Club: The Edge of Reason, 1, the eternal God argument

After having spent some posts examining Paul Feyerabend’s Philosophy of Nature, it’s time to tackle the second entry in Footnotes to Plato’s book club: Julian Baggini’s The Edge of Reason: A Rational Skeptic in an Irrational World. Julian is a founding editor of The Philosophers’ Magazine and has written a number of acclaimed popular philosophy books. The Edge of Reason attempts to strike a, well, reasonable balance between the fashionable postmodernist-inspired rejection of rationality (which, arguably, gave us the dreadful age of “post-truth”) and the older, equally unsupportable rationalist-positivist faith in reason’s essentially unlimited powers.

Continue reading

Socrates: ancient Humanist?


Socrates, Roman National Museum, photo by the author

As part of my ongoing occasional series aiming at bringing some of my own technical papers to the attention of a wider public (after all, what the hell is the point of doing scholarship if it only benefits other scholars?), below I reprint a paper I recently published in The Human Prospect. It inquires into the possibility of interpreting Socrates as a proto-Humanist of sorts, and it therefore includes a discussion of Humanism as a philosophy of life, as well as of its likely stemming from the ancient Greco-Roman tradition of virtue ethics (via the mediation of the Renaissance Humanists, who were informed by, and yet were reacting against, medieval Christianity).

Continue reading

Should we be fearing death?

Epicurus, National Roman Museum, photo by the author

Death, therefore, the most awful of evils, is nothing to us, seeing that, when we are, death is not come, and, when death is come, we are not. (Epicurus, Letter to Menoeceus)

Death is one of the major issues in human life, to put it mildly. Because we are blessed and cursed with self-awareness, we know we are mortal, so one of our problems is how to deal with the prospect of our own demise. A lot of religious and philosophical thinking as well as, lately, scientific research, has gone into this. Seneca famously wrote that the point of philosophy is to learn how to die, since death is the ultimate test of who we are. And things don’t seem to have changed much in that department over the past two thousand years.

Continue reading

Richard Dawkins

If you are following at all the skeptic / atheist / humanist / freethought movement(s) (henceforth, SAHF), last week has been an exciting and/or troubling one for you. First came the announcement that the Richard Dawkins Foundation had merged with (or taken over, depending on whom you ask) the venerable Center for Inquiry, up until then the chief remaining operation established by one of the founding fathers of modern skepticism and humanism, Paul Kurtz.

Then, a mere six days later, the organizers of the North East Conference on Science and Skepticism (NECSS), likely to soon become the major skeptic conference in North America (given the apparent demise of The Amazing Meeting), dropped a bombshell: Dawkins was being disinvited — probably a first in his career — on grounds of yet another obnoxious tweet he had thoughtlessly sent out to his 1.35 million followers.

Continue reading