Sunday, February 20, 2011

Increasing Boredom

I've finally started reading Sam Harris's most recent book, The Moral Landscape. While I'm not Harris's biggest fan (I've always felt that he completely lacks the optimism and wit that make his peer Richard Dawkins so worth reading), I was still interested to see how one of the big New Atheist authors would handle my own area of specialization, ethics.

It's still too soon for me to make any pronouncements about that, having just scratched the surface of the book, but this endnote from chapter 1 leaves me skeptical about Harris's entire project:
...I am convinced that every appearance of terms like "metaethics," "deontology," "noncognitivism," "antirealism," "emotivism," etc., directly increases the amount of boredom in the universe. My goal is to start a conversation that a wider audience can engage with and find helpful. Few things would make this goal harder to achieve than for me to speak and write like an academic philosopher.
As someone who spent two years working on a thesis that dealt with almost all of the concepts that Harris thinks increase the amount of boredom in the universe, I obviously disagree with his assessment; but that's beside the point. There are two things that really bother me about this, and make me suspect that The Moral Landscape is going to be shallow and, itself, quite boring.

The first is that Harris seems to be completely missing the point of popular scientific and philosophical literature, which is to present complex ideas in such a way that audiences without an academic background in those areas can begin to understand them. One does not do that by avoiding the subject, but by writing about it clearly and as simply as possible (and, yes, with minimal jargon). Here's another reason I prefer Richard Dawkins. His recent book The Greatest Show On Earth, in which he presents the evidence that supports the theory of evolution, doesn't shy away from dense subjects like genetics, geology, etc. It tackles them head on, and Dawkins proves that he has a gift for writing about them in such a way that lay audiences can grasp their relevance to evolutionary theory, even if we couldn't afterwards teach a class on the subjects. Harris is essentially saying that his goal is simply to avoid subjects that might bore the reader (or him). That's terrible popularizing.

It's also a double standard. When I took a course on the philosophy of mind in my first semester as a grad student, we read a lot of pure neuroscience (Harris's own area of research, and one that is of central importance in The Moral Landscape) early on. I found most of it deadly dull, but trudged ahead anyway because I needed to know it to understand the more philosophical (and to me, more interesting) papers we would be reading later on. Harris simply assumes that what is interesting to him is interesting to everyone, and that what isn't interesting to him isn't worth talking about, even if it directly impacts the thesis of his book. That's not true, and I'm willing to bet that it will leave his argument feeling insubstantial in the end.

The second problem with this is that it makes Harris sound like one of the religious people who have come in for such scathing criticism in his other work. Yesterday I linked to Jerry Coyne's thorough rebuttal to Mark Vernon's incredibly ill-informed article on evolution. I suspect that Vernon feels about evolution much the way Harris does about moral philosophy. He recognizes that it's an important topic (if only because it's often held up as a problem for a position he's committed to defending) and feels compelled to pontificate on it, but all that research he would have to do to actually understand the subject is just so boring. There's nothing wrong with being dilettantish, if all you're concerned about is cocktail party conversation. But if you're going to present yourself as some sort of authority, you really ought to take the time to learn about your subject in detail, even the parts that don't interest you much.

I'm going to continue to read The Moral Landscape, hopefully with an open mind. It is, of course, possible that Harris is some kind of philosophical savant, and that he'll be able to present answers to questions that actual philosophers have been debating for centuries, despite finding the subjects too boring to meaningfully engage. However, I have a feeling that, in the end, Harris will be revealed to have taken a vital metaethical question for granted, and we'll see that he had another reason for refusing to engage with the subject: he has no good argument for his underlying assumptions.

Saturday, February 19, 2011

Read Someone Else

You should be reading Jerry Coyne's great blog Why Evolution Is True anyway, but if you aren't, a post from earlier today might persuade you. Since I likely won't have time to write anything substantive today, it seems only fair to link to something far more substantive than what I usually write.

I have to admit, though, there's something about arguments like the one being critiqued there that I find fascinating. It's not that they're persuasive, because they aren't. But like I said in my previous post about quantum nonsense, I really wonder what it must be like to believe what some people believe. I want to know what they experience when they completely misrepresent an entire field of study. It seems too simple to say that they're being intentionally deceptive.

Friday, February 18, 2011

The 1,000 Console Future

Already having announced a new gaming handheld and a smart phone, it now looks like Sony is preparing to release an Android tablet, as well. These three new devices, combined with Nintendo's 3DS, Apple's three mobile game-playing devices, the wide array of Android devices, Windows Phone 7, OnLive, the three current-gen home consoles, and Steam serving both PC and Mac, make the "one console future" that Denis Dyack was evangelizing four years ago look pretty silly.

Of course the current proliferation of platforms won't be able to continue indefinitely, but its existence is still a good thing for now. If any one of Sony's experiments pays off, it will be because stiff competition from Apple, Nintendo and Microsoft drove them to make a device that did things that others didn't. The same can be said of Apple, Nintendo, HTC, and so on. Things may be starting to stagnate a little, with everyone focusing a little too much on keeping up with Apple rather than surpassing them, but for now it's good enough that a thousand flowers are blooming.

That's why I disagree, at least in part, with Chris Kohler's opinion piece on the subject at Wired. He compares the current situation to the mid '90s, when Sega, Philips and 3DO were flooding the market with hardware, creating confusion among consumers that would eventually lead to all three getting out of hardware development entirely. That analogy is good, but not perfect.

For one thing, devices like the 3DO, CD-i, 32X and SegaCD were, themselves, confusing. The former two wanted to be more than just video game consoles, but they failed completely at creating an identity beyond the nebulous concept of "multimedia" devices. Nobody, even the people creating content, knew what the hell multimedia was, which meant they couldn't make a compelling case for why the general public should care about it. The SegaCD and 32X might have made sense on their own, but they didn't as add-ons for the Genesis released in such close proximity to each other.

Sony doesn't have the same problem. People will know at a glance what the NGP, the Xperia Play, and the tablet (if it exists) are, because they've seen them before. Consumers have the concept of "handheld game system," "smartphone" and "tablet". There's no confusion with the hardware. There's a temptation to say that the Android marketplace is where the real confusion will come in, but let's not forget that Android phones are outselling all other smartphones as of 2010.

All this is not to say that Sony is in the clear--R&D costs on these devices must be astronomical, and none of them are going to be cheap (except maybe the Play). But I'm not ready to say just yet that they're a bad thing, or that the general proliferation of devices is a bad thing. I like choice, and like the innovation that comes from competition. There are smart and dumb ways to compete, but it's way too soon to make judgments about which strategy is which at this point.

Thursday, February 17, 2011

On Offense

Despite reading a fair number of atheist blogs, I entirely missed this controversy (and some responses to it) from a little over a week ago. Which is too bad, because it covers a couple of topics that have been flies in my personal ointment for a while now.

If you don't want to read all of those links, here's the long and short of it. At an atheist conference in Alabama, a panel discussion was held on the problem of attracting more women to the movement. The panel consisted of five men and one woman, and one of the men repeatedly used the word "female" in a way that at least one audience member found troubling. When she commented, she was cut off by the panelist with a rude joke, and angrily left the room. Cue explosive bickering on several blogs.

Not to sound completely milquetoast, but I can identify with both sides here. During my brief time as the music director at my university's radio station, one of our DJs was an outspoken feminist who was so offended by the word "female" that she would spend her shifts going through our back-catalog and crossing out any use of the word (usually in a context like "this band has a female vocalist") and writing "woman" above it.

This struck me as a case of doing surgery with a chainsaw rather than a scalpel. She had complained to management before about comments from some male co-workers making her uncomfortable, and her concerns had been taken seriously, so I was never sure why she didn't talk to anyone about the "female" issue. As at least one of her "corrections" was on an entry I wrote, I would have liked to have had the chance to explain my word choice, which had more to do with thinking "female" reads better than "woman" in some contexts than any desire (conscious or otherwise) to dehumanize women.

On the other hand, though, there are words that are similarly grating on me. For example, even though I know plenty of women who refer to other women as "chicks," I find that particular term cringe-inducing. I can't explain it; it has always bothered me, and likely always will. I personally think that my annoyance with the word "chick" is more justifiable than others' annoyance with the word "female," but really the whole issue is so subjective that finding common ground is likely to be no small task.

And that's the real problem here: determining whether particular offenses are worth speaking out against, or if they're minor annoyances that we just need to swallow. For example, I've decided that "chick" isn't worth fighting against, but "fag" is; but again, I know LGBT people who laugh off "fag" in cases that send me into a rage.

Am I irrational for being disgusted enough with the word "fag" to criticize those who use it? Was the woman at the conference who was offended by "female" irrational? That's a difficult question. I think that, to some extent, I am irrational. "Fag" offends me regardless of its context, but "retard" doesn't. That's at least inconsistent. But what are the options? Are we left with a dichotomy which says that either everyone should be offended by everything, or no one should be offended by anything? Neither option seems particularly appealing.

The closest thing that I can propose to an answer is that we all need more of that consciousness raising that Richard Dawkins likes to talk about, on a wide range of issues. That doesn't mean that we all become hypersensitive and humorless. But it does mean that we start paying more attention to how we use potentially loaded words. Words aren't inherently offensive; they become offensive because of the history of their usage. But just as understanding that history is essential to understanding whatever offense they may or may not cause, it's also essential in moving beyond mere offense.

I think there's a reason that very few people get up in arms about the prolific use of racial slurs in Quentin Tarantino's scripts, but an entire campaign was launched to try to stop kids from using "gay" as a pejorative. We assume that Tarantino knows the history of the words he uses, and uses them to the end of crafting a suitably sleazy world for his characters. To the extent that they elicit laughter, it's because audience and director are both aware of just how inappropriate they are. That is, we've had our consciousness raised.

Compare that to the kid on Xbox Live who uses "fag" like most people use commas. He doesn't know the history of that word, and isn't using it to any end other than being abrasive (or worse, he uses it simply because he hasn't thought about it). That kid is in desperate need of consciousness raising.

So what we need isn't more offense, more righteous anger, or more calls for those who disagree with us to have a sense of humor. It's more understanding, not just of those who have been offended, but those who do the offending. Nobody needs to be singled out and scorned--we all need a chance to explain ourselves.

Wednesday, February 16, 2011

Nostalgia For Nostalgia For a Time That Never Was

Occasionally I get incredibly nostalgic for the early '90s J-Pop subgenre called Shibuya-kei. It was some of the prettiest, most playful music ever created, and we'll probably never hear anything like it again.

Tuesday, February 15, 2011

They Want To Misunderstand

This article from the New York Times, about a group of biologists who set out to educate kids in the rural U.S. about evolution in honor of Darwin Day, has some good news and some bad news. It suggests that people in general aren't as aghast at the idea of their kids being taught about evolution as we might have suspected, but it also suggests that those who want the subject taught more widely are still making one small but important mistake.

First the good news:
The group’s small-town hosts took their own precautions. A high school principal in Ringgold, Va., sent out permission slips so parents could opt out of sending their children to the event (two did).
Only two sets of parents opting out of letting their kids learn basic science is, all things considered, pretty good. Yes, it could have been better (it could have been zero), but it still has to have been fewer than what the event's organizers, and maybe even the principal, were expecting. Maybe I'm too hopeful, but what this suggests to me is that the teaching of evolution is not actually as controversial as its shrillest opponents would have us believe.

Speaking of those opponents, though, some educators may still be making it too easy for them to go on willfully misrepresenting the theory of evolution. I think the most important step educators can take in making evolution clear to younger students is to make clear that there is no intentionality behind it. Of course phrasing it that way would open up a philosophical can of worms that teachers, understandably, would not want to deal with. Still, I think explanations like this don't go far enough in making the point:

Dr. [Craig] McClain, who wrapped up his Nebraska-Montana tour at a middle school on Monday, found himself explaining how giant squid evolved. 
“Smaller squids get eaten by everything,” he said. “It’s not a very good lifestyle to have.”
Hopefully McClain went on to make it clear that the change in squid size was driven by the fact that smaller squids died off, while their larger neighbors survived to pass on their genes to future generations, resulting in a larger population overall, and that this process played out over millions of years. Given that evolution's theistic critics love to claim that evolution happens by "random chance," and that change in species would require a driving intentional force, the above response doesn't go far enough. It leaves it open for some shifty apologist to say "What makes more sense--that the squids got bigger because they wanted to, or because God wanted them to?"

I don't know what McClain's full answer was, or what came before, so I don't want to suggest that he wasn't doing his job properly. But I have heard evolution's staunchest defenders talk about this subject in ways that are too ambiguous. When a large portion of your audience is primed to misunderstand you, you have to work harder.

Saturday, February 12, 2011

In a new post at his blog Why Evolution Is True, Jerry Coyne takes apart the work of yet another scientist with New Age leanings writing for the Huffington Post. Since I've spent the last two days going after the religious tendency to use subjective experience in dubious ways, it's only fair that I point to a more secular ideology based on the same thing.

The post Coyne is critiquing is nothing new if you've read even a paragraph of someone like Deepak Chopra. It's a mangling of quantum mechanics which takes the technical notion of "observers" to mean "humans looking at things" (see Sokal and Bricmont's Fashionable Nonsense for a more thorough discussion of this common mistake), then assumes that anything true of subatomic particles must be true of all sorts of macroscopic objects. It then concludes that since reality is a construct of (human) observers, there's no reason human life can't go on indefinitely.

I once applied for a job at a "natural foods" grocery store, and while waiting on an interview there, I overheard a hilarious conversation. Two young hippies were sitting at a table drinking Naked fruit juice and loudly talking about how all disease is imaginary, and if you can stop believing doctors and "the government" when they tell you you're sick, you'll become immortal. The passion with which they were discussing this nonsense was funny, but I also have to admit that I was a bit envious. What must it be like to really and truly believe that we're in complete control of reality? Surely it's more than a little intoxicating.

I feel the same way when I read peddlers of quantum nonsense. I think that some of them are cynical, and just pushing this stuff on readers who don't know any better (and don't want to), but still, for those who do believe, what a different life it must be. Of course it will end the same way as mine--the inevitabilities of disease or old age will put a permanent stop to our continued observations, and no amount of positive thinking will be able to stop it.

But I confess that I sometimes wonder, in the meantime, which of us is having a better time.