Sunday, February 20, 2011

Increasing Boredom

I've finally started reading Sam Harris's most recent book, The Moral Landscape. While I'm not Harris's biggest fan (I've always felt that he completely lacks the optimism and wit that make his peer Richard Dawkins so worth reading), I was still interested to see how one of the big New Atheist authors would handle my own area of specialization, ethics.

It's still too soon for me to make any pronouncements about that, having just scratched the surface of the book, but this endnote from chapter 1 leaves me skeptical about Harris's entire project:
...I am convinced that every appearance of terms like "metaethics," "deontology," "noncognitivism," "antirealism," "emotivism," etc., directly increases the amount of boredom in the universe. My goal...is to start a conversation that a wider audience can engage with and find helpful. Few things would make this goal harder to achieve than for me to speak and write like an academic philosopher.
As someone who spent two years working on a thesis that dealt with almost all of the concepts that Harris thinks so increase boredom in the universe, I obviously disagree with his assessment; but that's beside the point. There are two things that really bother me about this, and make me suspect that The Moral Landscape is going to be shallow and, itself, quite boring.

The first is that Harris seems to be completely missing the point of popular scientific and philosophical literature, which is to present complex ideas in such a way that audiences without an academic background in those areas can begin to understand them. One does not do that by avoiding the subject, but by writing about it clearly and as simply as possible (and, yes, with minimal jargon). Here's another reason I prefer Richard Dawkins. His recent book The Greatest Show On Earth, in which he presents the evidence that supports the theory of evolution, doesn't shy away from dense subjects like genetics, geology, etc. It tackles them head on, and Dawkins proves that he has a gift for writing about them in such a way that lay-audiences can grasp their relevance to evolutionary theory, even if we couldn't afterwards teach a class on the subjects. Harris is essentially saying that his goal is simply to avoid subjects that might bore the reader (or him). That's terrible popularizing.

It's also a double standard. When I took a course on the philosophy of mind in my first semester as a grad student, we read a lot of pure neuroscience (Harris's own area of research, and one that is of central importance in The Moral Landscape) early on. I found most of it deadly dull, but trudged ahead anyway because I needed to know it to understand the more philosophical (and to me, more interesting) papers we would be reading later on. Harris simply assumes that what is interesting to him is interesting to everyone, and that what isn't interesting to him isn't worth talking about, even if it directly impacts the thesis of his book. That's not true, and I'm willing to bet that it will leave his argument feeling insubstantial in the end.

The second problem with this is that it makes Harris sound like one of the religious people who have come in for such scathing criticism in his other work. Yesterday I linked to Jerry Coyne's thorough rebuttal to Mark Vernon's incredibly ill-informed article on evolution. I suspect that Vernon feels about evolution much the way Harris does about moral philosophy. He recognizes that it's an important topic (if only because it's often held up as a problem for a position he's committed to defending) and feels compelled to pontificate on it, but all that research he would have to do to actually understand the subject is just so boring. There's nothing wrong with being dilettantish, if all you're concerned about is cocktail party conversation. But if you're going to present yourself as some sort of authority, you really ought to take the time to learn about your subject in detail, even the parts that don't interest you much.

I'm going to continue to read The Moral Landscape, hopefully with an open mind. It is, of course, possible that Harris is some kind of philosophical savant, and that he'll be able to present answers to questions that actual philosophers have been debating for centuries, despite finding the subjects too boring to meaningfully engage. However, I have a feeling that, in the end, Harris will be revealed to have taken a vital metaethical question for granted, and we'll see that he had another reason for refusing to engage with the subject: he has no good argument for his underlying assumptions.

Saturday, February 19, 2011

Read Someone Else

You should be reading Jerry Coyne's great blog Why Evolution Is True anyway, but if you aren't, a post from earlier today might persuade you. Since I likely won't have time to write anything substantive today, it seems only fair to link to something far more substantive than what I usually write.

I have to admit, though, there's something about arguments like the one being critiqued there that I find fascinating. It's not that they're persuasive, because they aren't. But like I said in my previous post about quantum nonsense, I really wonder what it must be like to believe what some people believe. I want to know what they experience when they completely misrepresent an entire field of study. It seems too simple to say that they're being intentionally deceptive.

Friday, February 18, 2011

The 1,000 Console Future

Having already announced a new gaming handheld and a smartphone, Sony now looks to be preparing to release an Android tablet as well. These three new devices, combined with Nintendo's 3DS, Apple's three mobile game-playing devices, the wide array of Android devices, Windows Phone 7, OnLive, the three current-gen home consoles, and Steam serving both PC and Mac, make the "one console future" that Denis Dyack was evangelizing four years ago look pretty silly.

Of course the current proliferation of platforms won't be able to continue indefinitely, but its existence is still a good thing for now. If any one of Sony's experiments pays off, it will be because stiff competition from Apple, Nintendo and Microsoft drove them to make a device that did things that others didn't. The same can be said of Apple, Nintendo, HTC, and so on. Things may be starting to stagnate a little, with everyone focusing a little too much on keeping up with Apple rather than surpassing them, but for now it's good enough that a thousand flowers are blooming.

That's why I disagree, at least in part, with Chris Kohler's opinion piece on the subject at Wired. He compares the current situation to the mid '90s, when Sega, Philips and 3DO were flooding the market with hardware, creating confusion among consumers that would eventually lead to all three getting out of hardware development entirely. That analogy is good, but not perfect.

For one thing, devices like the 3DO, CD-i, 32X and Sega CD were, themselves, confusing. The first two wanted to be more than just video game consoles, but they failed completely at creating an identity beyond the nebulous concept of "multimedia" devices. Nobody, even the people creating content, knew what the hell multimedia was, which meant they couldn't make a compelling case for why the general public should care about it. The Sega CD and 32X might have made sense on their own, but they didn't as add-ons for the Genesis released in such close proximity to each other.

Sony doesn't have the same problem. People will know at a glance what the NGP, the Xperia Play, and the tablet (if it exists) are, because they've seen them before. Consumers have the concept of "handheld game system," "smartphone" and "tablet". There's no confusion with the hardware. There's a temptation to say that the Android marketplace is where the real confusion will come in, but let's not forget that Android phones are outselling all other smartphones as of 2010.

All this is not to say that Sony is in the clear--R&D costs on these devices must be astronomical, and none of them are going to be cheap (except maybe the Play). But I'm not ready to say just yet that they're a bad thing, or that the general proliferation of devices is a bad thing. I like choice, and like the innovation that comes from competition. There are smart and dumb ways to compete, but it's way too soon to make judgments about which strategy is which at this point.

Thursday, February 17, 2011

On Offense

Despite reading a fair number of atheist blogs, I entirely missed this controversy (and some responses to it) from a little over a week ago. Which is too bad, because it covers a couple of topics that have been flies in my personal ointment for a while now.

If you don't want to read all of those links, here's the long and short of it. At an atheist conference in Alabama, a panel discussion was held on the problem of attracting more women to the movement. The panel consisted of five men and one woman, and one of the men repeatedly used the word "female" in a way that at least one audience member found troubling. When she commented, she was cut off by the panelist with a rude joke, and angrily left the room. Cue explosive bickering on several blogs.

Not to sound completely milquetoast, but I can identify with both sides here. During my brief time as the music director at my university's radio station, one of our DJs was an outspoken feminist who was so offended by the word "female" that she would spend her shifts going through our back-catalog and crossing out any use of the word (usually in a context like "this band has a female vocalist") and writing "woman" above it.

This struck me as a case of doing surgery with a chainsaw rather than a scalpel. She had complained to management before about comments from some male co-workers making her uncomfortable, and her concerns had been taken seriously, so I was never sure why she didn't talk to anyone about the "female" issue. As at least one of her "corrections" was on an entry I wrote, I would have liked to have had the chance to explain my word choice, which had more to do with thinking "female" reads better than "woman" in some contexts than any desire (conscious or otherwise) to dehumanize women.

On the other hand, though, there are words that are similarly grating on me. For example, even though I know plenty of women who refer to other women as "chicks," I find that particular term cringe-inducing. I can't explain it; it has always bothered me, and likely always will. I personally think that my annoyance with the word "chick" is more justifiable than others' annoyance with the word "female," but really the whole issue is so subjective that finding common ground is likely to be no small task.

And that's the real problem here: determining whether particular offenses are worth speaking out against, or if they're minor annoyances that we just need to swallow. For example, I've decided that "chick" isn't worth fighting against, but "fag" is; but again, I know LGBT people who laugh off "fag" in cases that send me into a rage.

Am I irrational for being disgusted enough with the word "fag" to criticize those who use it? Was the woman at the conference who was offended by "female" irrational? That's a difficult question. I think that, to some extent, I am irrational. "Fag" offends me regardless of its context, but "retard" doesn't. That's at least inconsistent. But what are the options? Are we left with a dichotomy which says that either everyone should be offended by everything, or no one should be offended by anything? Neither option seems particularly appealing.

The closest thing that I can propose to an answer is that we all need more of that consciousness raising that Richard Dawkins likes to talk about, on a wide range of issues. That doesn't mean that we all become hypersensitive and humorless. But it does mean that we start paying more attention to how we use potentially loaded words. Words aren't inherently offensive; they become offensive because of the history of their usage. But just as understanding that history is essential to understanding whatever offense they may or may not cause, it's also essential in moving beyond mere offense.

I think there's a reason that very few people get up in arms about the prolific use of racial slurs in Quentin Tarantino's scripts, but an entire campaign was launched to try to stop kids from using "gay" as a pejorative. We assume that Tarantino knows the history of the words he uses, and uses them to the end of crafting a suitably sleazy world for his characters. To the extent that they elicit laughter, it's because audience and director are both aware of just how inappropriate they are. That is, we've had our consciousness raised.

Compare that to the kid on Xbox Live who uses "fag" like most people use commas. He doesn't know the history of that word, and isn't using it to any end other than being abrasive (or worse, he uses it simply because he hasn't thought about it). That kid is in desperate need of consciousness raising.

So what we need isn't more offense, more righteous anger, or more calls for those who disagree with us to have a sense of humor. It's more understanding, not just of those who have been offended, but those who do the offending. Nobody needs to be singled out and scorned--we all need a chance to explain ourselves.

Wednesday, February 16, 2011

Nostalgia For Nostalgia For a Time That Never Was

Occasionally I get incredibly nostalgic for the early '90s J-Pop subgenre called Shibuya-kei. It was some of the prettiest, most playful music ever created, and we'll probably never hear anything like it again.

Tuesday, February 15, 2011

They Want To Misunderstand

This article from the New York Times, about a group of biologists who set out to educate kids in the rural U.S. about evolution in honor of Darwin Day, has some good news and some bad news. It suggests that people in general aren't as aghast at the idea of their kids being taught about evolution as we might have suspected, but it also suggests that those who want the subject taught more widely are still making one small but important mistake.

First the good news:
The group’s small-town hosts took their own precautions. A high school principal in Ringgold, Va., sent out permission slips so parents could opt out of sending their children to the event (two did).
Only two sets of parents opting out of letting their kids learn basic science is, all things considered, pretty good. Yes, it could have been better (it could have been zero), but it still has to have been fewer than what the event's organizers, and maybe even the principal, were expecting. Maybe I'm too hopeful, but what this suggests to me is that the teaching of evolution is not actually as controversial as its shrillest opponents would have us believe.

Speaking of those opponents, though, some educators may still be making it too easy for them to go on willfully misrepresenting the theory of evolution. I think the most important step educators can take in making evolution clear to younger students is to emphasize that there is no intentionality behind it. Of course phrasing it that way would open up a philosophical can of worms that teachers, understandably, would not want to deal with. Still, I think explanations like this don't go far enough in making the point:


Dr. [Craig] McClain, who wrapped up his Nebraska-Montana tour at a middle school on Monday, found himself explaining how giant squid evolved. 
“Smaller squids get eaten by everything,” he said. “It’s not a very good lifestyle to have.”
Hopefully McClain went on to make it clear that the change in squid size was driven by the fact that smaller squids died off, while their larger neighbors survived to pass on their genes to future generations, resulting in a population of larger squid overall, and that this process played out over millions of years. Given that evolution's theistic critics love to claim that evolution happens by "random chance," and that change in species would require a driving intentional force, the above response doesn't go far enough. It leaves it open for some shifty apologist to say "What makes more sense--that the squids got bigger because they wanted to, or because God wanted them to?"

I don't know what McClain's full answer was, or what came before, so I don't want to suggest that he wasn't doing his job properly. But I have heard evolution's staunchest defenders talk about this subject in ways that are too ambiguous. When a large portion of your audience is primed to misunderstand you, you have to work harder.

Saturday, February 12, 2011

Observe!

In a new post at his blog Why Evolution Is True, Jerry Coyne takes apart the work of yet another scientist with New Age leanings writing for the Huffington Post. Since I've spent the last two days going after the religious tendency to use subjective experience in dubious ways, it's only fair that I point to a more secular ideology based on the same thing.

The post Coyne is critiquing is nothing new if you've read even a paragraph of someone like Deepak Chopra. It's a mangling of quantum mechanics which takes the technical notion of "observers" to mean "humans looking at things" (see Sokal and Bricmont's Fashionable Nonsense for a more thorough discussion of this common mistake), then assumes that anything true of subatomic particles must be true of all sorts of macroscopic objects. It then concludes that since reality is a construct of (human) observers, there's no reason we observers can't go on indefinitely.

I once applied for a job at a "natural foods" grocery store, and while waiting on an interview there, I overheard a hilarious conversation. Two young hippies were sitting at a table drinking Naked fruit juice and loudly talking about how all disease is imaginary, and if you can stop believing doctors and "the government" when they tell you you're sick, you'll become immortal. The passion with which they were discussing this nonsense was funny, but I also have to admit that I was a bit envious. What must it be like to really and truly believe that we're in complete control of reality? Surely it's more than a little intoxicating.

I feel the same way when I read peddlers of quantum nonsense. I think that some of them are cynical, and just pushing this stuff on readers who don't know any better (and don't want to), but still, for those who do believe, what a different life it must be. Of course it will end the same way as mine--the inevitabilities of disease or old age will put a permanent stop to our continued observations, and no amount of positive thinking will be able to stop it.

But I confess that I sometimes wonder, in the meantime, which of us is having a better time.

Friday, February 11, 2011

Living Subjectivism

Yesterday, I wrote about the importance to religious believers of arguing from personal experience. Peter Kreeft's "Twenty Arguments For the Existence of God" has several arguments which hinge on appealing to our perceptions of the world in order to prove that God exists. One, the Argument From Degrees Of Perfection, goes so far as to conclude that there must be objective facts about value judgments, and that one of those objective facts is God's existence.

The argument goes something like this: We notice that things vary in their characteristics, and those variations can be thought of as falling along points on a continuous spectrum. We often make value judgments about a thing based on its position on the spectrum for one of its characteristics. And we can extrapolate from those judgment-making behaviors that there is a fact of the matter at which we're trying to arrive.

This isn't entirely unreasonable. There often is a fact of the matter about whether our perceptions reflect reality, so the assumption that there could be a fact of the matter about our value judgments isn't totally unjustified. Anyone who has ever had a passionate argument about music, art, food, etc. should be able to grasp the reasoning here. If we don't, at some level, think that our judgments are right, what's the point of such arguments?

But that we behave as if there's a fact of the matter doesn't guarantee that there is, and the rest of the argument should make that clear. It continues, and here one can't help but think of the ontological argument, that we also make value judgments about beings. And if there is a fact of the matter about those value judgments, that means that there must be an objective standard against which they are being made--in other words, there must be at least one perfect being. That perfect being is God.

You may want to counter that the argument goes off the rails here--that nothing as grandiose as a perfect being is needed to explain why we value people who treat us well over people who treat us poorly. Kreeft responds that this is further proof of his point, for if there were no fact of the matter, if all value judgments were subjective, we would feel no compulsion to argue. "You can speak subjectivism," he says, "but you cannot live it."

That reply presents a false dichotomy, though. We "live subjectivism" all the time. We might argue about our tastes, and really want to convince others that they should share our tastes. But that doesn't require an objective truth, or at least not an objective truth about value. I might argue that you should appreciate my favorite band because I want you to support them by buying their albums. I also might just like a good argument.

This argument has the same problem that I pointed out about apophatic theology in an earlier post; to the extent that it sounds good at all, it only sounds that way as long as you're talking about the right things. Point to an area of easy consensus, like whether it's better to be loved or not, and nobody's going to complain too much if you conclude that, yes, it's really better to be loved than not.

The problem comes when you start trying to "live objectivism" about less high-minded topics. For example, if I like cold beer and you like warm beer, we couldn't put it down to different tastes. One of us must have the wrong tastes. The same goes for any value judgment you can think of. The Beatles or The Rolling Stones; chocolate or vanilla; cats or dogs; Halo or Call of Duty. If you can't live subjectivism, then there's an objective fact of the matter in every case. Does anyone actually believe that?

Of course my incredulity doesn't prove anything. Kreeft could always just bite the bullet and say that, yes, absurd as it may seem, there really is a right answer to every question of value, no matter how trivial. But again, consider the consequences of that. Not only would there be a fact about whether warm beer is better than cold beer; there would be a fact about whether a 51.02309340923475 degrees Fahrenheit glass of beer is better than a 51.02309340923474 degrees Fahrenheit glass of beer. Don't even think about saying that there could be a range of right answers dependent upon the ability of beer tasters to actually detect differences. That would be living subjectivism.

That we value some things more than others proves only that we do, in fact, value some things more than others. It doesn't prove that some of our values are right and others wrong, and it certainly doesn't prove that a perfect being exists somewhere out there to validate some of our values and invalidate others.

Thursday, February 10, 2011

An Important Difference

Last night, Sam Harris tweeted a link to a debate he had participated in a few years ago on the subject of religion's role in the end of the world. The debate itself is mostly awful, as these things tend to be, with the moderator helping Harris's opponent pile on him to the point that a frustrated audience member finally cries out "You're the moderator!" I'll link to the first clip, though, for the sake of a reference:

Something interesting did emerge in the course of the debate, though, namely the extent to which Harris's religious opponent seems to rely on his own personal experience as his most important source of evidence, and the extent to which his compatriot the moderator is instantly willing to accept that experience as better evidence than any of the statistics Harris offers in return.

Of course we already knew that this was true of religious people to some extent. Powerful personal experience of the numinous does, in my mind, serve as some evidence in favor of religious belief, even if it is vastly outweighed by counterevidence from philosophy and science. But if someone accepts what he takes to be his most important convictions entirely on the basis of such evidence, we shouldn't be surprised to find him privileging that same sort of evidence in all cases.

Of course no human enterprise can ever get off the ground without input garnered from personal experience. But the value of science is that it gives us a set of tools for refining that kind of experience. This is why repeatability is vital to the scientific enterprise. If only one person is ever able to perform an experiment, we shouldn't put as much stock in it as if every competent individual who performed it had the same results.

This is why when Harris says that a single well-designed opinion poll of people in Gaza would be of more value than the personal experiences that his opponent continually cites, he's absolutely correct. It can be hard to accept that one's experiences might not be a perfect mirror of reality, but that's exactly why we shouldn't stick our fingers in our ears and pretend that they are, oh yes they are! At least if what we really care about is getting at the truth and not just maintaining our sense of smug self-satisfaction.

Wednesday, February 9, 2011

All Things Considered, I Feel Fine

You've probably already heard about Activision's announcement from earlier today--you know, the one that finally killed off the Guitar Hero franchise, at least for the time being. As someone with some pretty fond memories of the series, I feel like I should be more upset than I am.

My first experience with music games was on a trip to Japan in 1999. Beatmania was all the rage there at the time, but in one arcade, I also found what looked like a rarely used Guitar Freaks cabinet. A friend and I, both of us guitarists, tried it out, and found it utterly inscrutable.

Because of that, I had to be pressured into trying Guitar Hero, and in fact missed out on the series until its second installment. When I finally gave in and borrowed a copy of Guitar Hero 2, I was instantly sucked in. It helped that the songs I was playing were (mostly) the real deal, or at least pretty good covers. But beyond that, everything about the game just clicked for me. The best songs in that game even sort of approached something not entirely unlike the feeling of playing a real guitar.

Most of them were just damned fun, though. They continued to be damned fun when Guitar Hero 3 came out the next year. I didn't think much about Guitar Hero one day not being fun until the first time I encountered Rock Band. I remember there being a fair bit of skepticism in the game journalism community, about whether people really wanted a full band's worth of plastic instruments in their living rooms, but the first time I played "Wave of Mutilation" in Rock Band in a crowded Best Buy, I knew my relationship with Guitar Hero was over.

The same held for the vast majority of my friends, and Rock Band parties became our preferred way of interacting for at least a couple of years. Meanwhile, Activision ran Guitar Hero into the ground with way too many iterations, each one more mediocre than the last. As a result, I can't say I feel much of anything at today's announcement. Honestly, it was past due. Even if Activision brings back Guitar Hero at some point (they certainly left the possibility open), I doubt I'll be interested. Sometimes, it's best to let the past be the past.

Tuesday, February 8, 2011

Shedding Its Light Silently

A few days ago, I wrote about Square Enix's long fall from grace. Tonight, I was reliving happier times for the company, or at least for Square, by finishing the DS remake of Final Fantasy IV.

This is my favorite game in the Final Fantasy series, and is still my sentimental favorite game of all time (though Persona 4 edges it out just slightly as my absolute favorite these days). I first played FFIV when it was released on the SNES as Final Fantasy II, and the impact it made on me as a gamer was immeasurable. For years, I gave up playing anything but RPGs because of it. I had long conversations with friends about the future of gaming, in which I expected games to continue to look more or less like FFIV, but devote the ever-increasing storage capacity of cartridges and, later, CDs, to creating increasingly realistic worlds.

Of course things didn't turn out that way. The vision I had of a single player MMORPG that sacrificed graphical advances for story and player freedom is just now starting to be realized in games like Fallout 3, but it's still not much like I imagined it. What I had imagined was a kind of graphical Turing test, in which players could communicate with AI controlled NPCs and have more or less realistic conversations (with more or less realistic consequences). There would be an over-arching story, but the player would have as close to ultimate freedom in exploring it as possible.

Looking back on FFIV now, it's kind of amazing that it inspired that vision in me. The game is, by modern standards, aggressively linear. While it does at least give you access to an airship relatively early on, even that freedom is kind of illusory. Sure, you can fly anywhere in the overworld, but unless you've hit the right story triggers, you won't find much to do.

I suppose my desire for a maximally interactive FFIV came from the fact that (again, at the time) its characters were the most engaging I had encountered in a game, and I wanted more of that. Looking back now, it handles its more dramatic moments pretty ridiculously, but it still has some scenes that have scarcely been touched by subsequent games.

For example, I still love the scene in which Cecil, after having (inadvertently) destroyed Rydia's village and more or less kidnapped her in the aftermath, starts to win her over by turning on his own army to protect her. Rydia doesn't come around immediately, and Cecil doesn't pout when she fails to. He understands her anger and resentment, and gives her room to deal with it.

It's moments like that that have kept FFIV high on my list of favorite games, especially with American games becoming increasingly violent and misogynistic and Japanese games getting so lost in their own tropes that many of them have become self-parody. I think we'll see games that improve on those moments of realistic human interaction in the future, and I look forward to it. But in the meantime, I'd still be willing to play that JRPG Turing test that my friends and I dreamed up all those years ago.

Monday, February 7, 2011

Humorless

“Inception is full of brontosaurean effects, like the city that folds over on top of itself, but the tone is so solemn I felt out of line even cracking a smile.”
That quote from David Edelstein’s review of Inception has been with me since I first read it, kicking around in my mind as something I knew I'd have something to say about at some point.

Edelstein's review, despite his protests to the contrary, reads as if he went into the movie looking for reasons not to crack a smile. That aside, though, the above complaint stood out to me as particularly unreasonable. Edelstein makes no effort to support it, which leads me to believe he takes it to be self-evident that artists owe their audiences a smile here and there.

Or rather it would lead me to believe that, if I didn't find it unbelievable. Surely nobody actually believes that all movies ought to have comedic elements. Even those who leap to decry any work which treats its subject matter seriously as "self-serious" (there seems to be no worse sin in contemporary art) are probably inconsistent. Would Schindler's List have benefited from more ironic winks at the audience? If not, why? Surely it's not only that it's a movie about the Holocaust. Surely artists can treat other subjects seriously without being mocked for taking themselves too seriously.

A reasonable argument would be that Inception's subject matter is too fantastical to be treated as seriously as Nolan treats it. But that opens up its own can of worms. Is it ever acceptable to treat fantastical subject matter completely seriously? I see no reason to believe that it's not, though I would agree that it's incredibly difficult. The Twilight movies are an extreme example of why. Their dour-faced teenage vampires and werewolves mope about the perpetually rainy Pacific Northwest as if immortal creatures have nothing more important on their minds than high school romances. Even if you can get caught up in such a story while it plays out, spell out the premise objectively, and it sounds ridiculous.

The reason Edelstein's criticism stuck with me all these months is that it could be directly applicable to a lot of video games. So many games these days take place in worlds so full of grizzled faces and grim architecture that it's nearly impossible not to laugh at them. But that doesn't mean that games should always break up the angst and oppression with some laughs. It just means that they should be more self-aware. In turn, self-awareness in art doesn't entail ironic detachment. It just means having an understanding of where your story fits in the big picture. The story of Killzone is closer to Inglourious Basterds than Saving Private Ryan, but it's up for debate whether its developers understand that. On the other hand, the subject matter of Metal Gear Solid could be handled seriously, even solemnly, but Kojima constantly breaks the mood with stupid and inappropriate humor.

So maybe the problem isn't that artists won't "allow" audiences to smile. Maybe it's that artists and critics alike need to think more about when smiles are really needed.

Sunday, February 6, 2011

The New Retro

Working in a game store gives you a different perspective than you're likely to get just reading about games online (or even getting involved in the discussions with the other people who talk about games online). For example, it's amazing how many people don't realize that the PSP Go can't play UMDs, or who have never connected a console to the internet. The majority of gamers are nothing like those of us who frequent trendy gaming websites and listen to their podcasts.

One trend that has become almost ubiquitous in the store at which I work is frat boys buying N64s and trying to recreate their childhood game collections. To some extent, this was inevitable; back when I first started searching for games on the internet, I was solely concerned with tracking down all the Atari 2600 games I recalled as my first gaming experiences. Lots of people who have fond gaming memories end up trying to recreate them at some point, whether that means digging their old consoles out of their parents' attic, downloading an emulator and scads of ROMs, or buying back as much as possible.

What makes this apparent trend of N64 nostalgia interesting to me is that it looks like the first steps in finally moving beyond the threadbare trend of 8- and 16-bit nostalgia that is especially problematic in indie game circles, where it has been holding developers back from exploring original ideas for at least a decade. Don't get me wrong, I don't really want to see 8-bit nostalgia replaced by 64-bit nostalgia (new ideas are almost always preferable), but it would be a refreshing change of pace. How would this kind of nostalgia look? Would artists try to recreate the N64's muddy, low-rez textures and blocky polygons? Would graphics that were ugly even in their time suddenly become as chic as squat little 8-bit sprites have become?

Also interesting is the fact that this wave of nostalgia seems to be sweeping over the terminally unhip first. The self-aware hipsters who shop at my store still flock to the NES and SNES (and sometimes PS1, but only for games that look 16-bit anyway), leading me to wonder how bad the revisionist history will be in a few years when every indie game looks like Ocarina Of Time.

My prediction: it'll be totally sick, bro.

Friday, February 4, 2011

Is Square Enix Done?

Yesterday Square Enix reported a 76.6% drop in profits from this time last year. In the current fiscal year, the company has had only two million-selling titles, Dragon Quest Monsters: Joker 2, which was released only in Japan, and Kane & Lynch 2, which of course is an Eidos property. It's almost certain that the disastrous and ill-conceived launch of Final Fantasy XIV has a lot to do with the company's steep financial decline. All this should lead longtime JRPG fans to wonder whether the one-time leaders of the genre have lost the magic.

There are some changes that, at least from the outside, appear both essential and easy. Kill Final Fantasy XIV, admit it was a mistake, and move on. Stop milking Kingdom Hearts before it becomes as meaningless as Final Fantasy has become. Re-evaluate whether re-releasing your back catalog on every imaginable platform is actually profitable.

But those are all just stabilizing measures. What Square Enix really needs is something fresh and new that will capture imaginations in the same way that Kingdom Hearts did. But given its current financial peril, it can't just try to make another Kingdom Hearts, i.e. a big-budget epic bolstered by one of the most expensive licenses on the market. Instead, it needs to take a small project and turn it into a hit.

Of course that's not much different from saying they need to make lightning strike with pinpoint accuracy. But there are at least some guidelines they could keep in mind. Look to emerging platforms like iOS and Android. Take a chance on young, untested talent rather than giving a stalwart like Nomura final say in creative matters. Don't put so much faith in well-worn genres, or at least the purest examples of those genres. There are lots of clones out there these days, and people aren't going to buy one more because it has Square Enix's name on it.

As a long-time Final Fantasy fan myself, I want to see Square Enix succeed. But I want to see them do it like the revolutionary company they once were--not the benighted old guys they seem to have become.

Thursday, February 3, 2011

You're In Business, Jonathan

Jason Schreier has written an interesting little piece for Wired about auteurs in video games. Jonathan Blow provided some quotes, and while I would really like to spend some time discussing whether someone who "thinks almost all games are pretty bad" can qualify as an auteur, something else Blow said dovetails nicely with what I was talking about yesterday.

First of all, here's the relevant excerpt:
“For someone like me, who thinks almost all games are pretty bad, and who has very specific ideas about what he wants to make … I can very definitely say that the single-leader model is good,” he said in an e-mail to Wired.com, although he noted that he and THQ are not in the same business.
It's that last bit that interests me. Blow "noted" that he, as someone who makes video games and sells them for a profit, is not in the same business as THQ, who make video games and sell them for a profit. Of course "noted" is Schreier's word choice, but it's a strange one. That's a statement that cries out for justification, and Schreier takes it for granted.

Given Blow's past comments about not making games due to "crass profit motives," I think we can guess what he means. THQ is a business, and recently an increasingly nasty one, what with CEO Brian Farrell essentially saying he wants people to pay $100 for complete games. It's understandable that Blow would want to distance himself from that. It would be even if he didn't appear to buy into the notion that making money is antithetical to making art.

Blow can try to convince us that his desire to sell games (which I can only assume he has, since he sells games) is different from THQ's, but nobody should believe him. Both want you to buy their games so they can make money. That Blow's pricing model is more consumer friendly doesn't mean he's in a different business. He isn't, and he won't be until he starts giving his games away, or only selling enough copies to cover his development costs.

You're in the video game business, Jonathan, even if you hate it.

Wednesday, February 2, 2011

Support and Business

I've strongly disliked everything Twisted Pixel has ever done, but their newest title, Gunstringer, actually sounds somewhat interesting. Far less interesting was IGN's exclusive reveal of the game, which couldn't have been lazier unless it had just been a cut and paste of a press release.

Ars Technica's Ben Kuchera was on Twitter decrying the IGN story earlier this afternoon. Kuchera seemed to be insulted that "sites that helped support" Twisted Pixel's previous games were made to wait on IGN's exclusive content to go live before they could publish their own stories.

Now I don't want to sound like Ben Paddon, but this business of game sites "supporting" indie developers worries me. I'm assuming Twisted Pixel thought the IGN exclusivity deal was in their own best interests as a business, and if they did, then I applaud them for going ahead with it rather than trying to perpetuate the myth that indie developers make games primarily to collect goodwill from press and fans. If you think it's a bad business decision, fine. If you think they should put the feelings of game journalists above promoting their games, you're delusional.

Twisted Pixel, no matter what else they might get out of making games, do it to make money. That's not meant to be a disparaging remark. They don't exist to support gaming blogs. And gaming blogs don't--or at least shouldn't--exist to support developers (even fashionable indie developers), but to report on them. That's the only relationship that's fair to readers.

Sad that it seems to be disappearing.

Tuesday, February 1, 2011

The Most Beautiful Song in the World

"Pink Orange Red" is the first song on the Cocteau Twins' EP "Tiny Dynamine." I've never heard anything more beautiful, and don't expect to.

Monday, January 31, 2011

Twisting

These days, I go into almost every movie I watch expecting a twist ending. They seem to be more or less a requirement, and are at least so common that I'm more surprised when the film I'm watching doesn't try to pull the rug out from under me than when it does.

In the past week, I've watched two twisty movies, The Last Exorcism and Shutter Island, and while watching both I thought a lot about Inception, my favorite movie of the past year, as well as Departures, a 2008 Japanese drama that I watched recently. Neither of the latter movies involves twists, but I found both more surprising than the former.

The Last Exorcism's twist ruined the movie for me. Everything preceding it would have made sense in the real world. Cotton Marcus was a huckster, an unbelieving charismatic preacher clearly based on Marjoe Gortner (and the documentary made about him, which everyone should see). Nell, the ostensibly possessed girl, didn't really do anything that a real girl who was experiencing psychological trauma couldn't have done. It left the reality of Nell's condition to the viewer, who would undoubtedly project his or her own worldview onto the character. That's nice--it always is when directors don't over-explain their films. But the last five minutes of The Last Exorcism spend all that goodwill on a cartoonish twist that suddenly made me not care about any of the characters I'd spent the past hour getting to know (and in some cases, like).

Shutter Island, on the other hand, twists from the opening scene to the final shot. It throws so many "is this really happening?" moments at the viewer that I quickly stopped caring. I couldn't identify with any of the characters, because the movie wanted me to constantly question whether they were who they claimed to be, or whether they existed at all. It was so obvious that a big reveal was going to turn everything on its ear that it seemed like a waste of energy to get involved. Storytelling is, at its heart, the art of getting people to respond emotionally to characters they know don't actually exist. When your entire premise is that the characters in your story probably aren't real, you're not telling a story anymore. You're just trying to show how clever you are.

Really, Inception and Shutter Island are very similar movies. They both feature Leonardo DiCaprio as a man whose obsession with a lost loved one drives him into an unreal world. The difference, and the reason it's possible to care about his character in Inception but not Shutter Island, is that Inception, despite taking place largely in a world the movie tells you isn't real, goes out of its way to explain to you how things work in that unreal world, and unwaveringly abides by its own rules. Shutter Island has no rules, and as a result can't really surprise. Both movies mean to leave you wondering about what their final moments mean, but only Inception makes you feel like you could construct a reasonable answer if you retraced its steps.

Departures surprised me as well, but not with any twists. It surprised me because it played with my own expectations of movies, especially romantic dramas. Relationships develop in a way that real world relationships often do, and I suspected that they would continue to do so. But main character Daigo Kobayashi proves himself to be stronger, perhaps better all around, than most people, and things resolve in a way that is in retrospect predictable, but didn't feel that way as it played out. As with Inception, I could get involved because I knew the rules. The surprise wasn't in finding out they weren't the rules after all, but in finding out that they could be subverted with enough hard work.

When I think back to some of my favorite twist endings, like those of Psycho and The Ring, it makes me sad that they've become such a cliche. I'm sure they can be made meaningful again, but not until they stop being taken for granted.

Sunday, January 30, 2011

Justifying 8-bit Love

I'm really sick of new video games that try to look like NES games, but here's one reason to still love 8-bit games:

Try doing that with a Blu-Ray.

Saturday, January 29, 2011

Dickwolves

If you don't immediately get the significance of that title, you can get caught up here and here, and finally here (scroll down to the last post). Got all that? Good.

My first reaction to this controversy was pretty much in line with that second link above. Everyone gets offended by things, and everyone has the right to voice his or her offense. I've done it myself, and I'm sure I'll do it again. It sucks when someone you like offends you, and it's even worse when you go to that person hoping to make them see your side of things and they make a t-shirt mocking you for being offended in the first place.

That doesn't mean you weren't wrong to be offended in the first place; it just sucks. Life is complicated.

I have to admit, though, reading that last post made me finally see the other side of the issue in a way I had previously failed to. I hadn't thought about what it would be like to be a woman (or maybe a man) whose life has been directly affected by rape, walking through a crowd of hundreds of sweaty adolescents (or adults for whom aging provided no escape from that desperate situation), trying to avoid making eye contact with the ones wearing that t-shirt--the one the people you thought were your friends made to mock you for being hurt by something they said, oh and also for having been hurt by sexual abuse.

That, purely and simply, is bullying, and Penny Arcade was enabling it. I don't believe for a second that that was their intent, but they're in a unique position to both make the t-shirt and provide the forum for victims to be mocked and vilified. Again, life is complicated.

The point of this post is not for me to get on my high horse and point out my own moral superiority. If anything, it's to come to grips with my own wrongness. While I support freedom of speech even when I find that speech abhorrent, I also believe in holding people responsible for what they say. I'll still defend the two comics I linked to above. It's unreasonable and harmful to demand that nobody ever talk or even joke about potentially offensive subjects.

But in retrospect, it's pretty easy to see that the t-shirt took things too far. Like Sarah Palin's gun sights ad, it encouraged solidarity between the sane (those who disagreed with Gabrielle Giffords's politics or found the word "dickwolves" funny) and the insane (those who actually wanted to kill Giffords or those who would actually commit rape). The sane always lose with that arrangement. We should avoid it.

Friday, January 28, 2011

Is It Wrong To Be Almost Perfect?

Earlier this week, PZ Myers took on a different sort of criticism of the New Atheists from what they normally receive. Writing for the Chronicle of Higher Education, Stephen Asma complained that the usual suspects' critique of religion fails because it focuses so much on the big three monotheisms, and ignores other religions, like Buddhism and animism.

That critique isn't as interesting to me as part of Myers' response, in which he essentially says he would reject a world that was perfect in (almost) every way if that perfection was brought about by acceptance of religious faith:
He really doesn't get it. He could show me a religion that is nothing but sweetness and light, happiness and good thoughts and equality for all, and it wouldn't matter: the one question I would ask is, "Is it true?" It wouldn't matter if he could show empirically that adopting this hypothetical faith leads to world peace, the voluntary abolishment of crime, the disappearance of dental caries, and that every child on the planet would get their very own pony — I'd still battle it with every fierce and angry word I could speak and type if it wasn't also shown to be a true and accurate description of the world. Some of us, at least, will refuse to drink the Kool-Aid, no matter how much sugar they put in it.
This reminds me of a question I used to pose to my intro to philosophy students when we were discussing free will. If someone told them that they could go to live in a perfect world (however they defined 'perfect'), but the condition of doing so was that they had to give up their free will, would they accept the offer? In the whole time I taught, only about three students ever said they would. Those who refused it almost uniformly said that they would rather live in a world full of pain and hatred than exist in one that was perfect, but in which they could never choose to do anything that would ruin that perfection.

For my part, I think I would accept the offer with very little further deliberation. I think that, in a perfect world (as I envision it), I would be too busy being wildly happy to worry that I couldn't wake up one morning and choose to rape or murder my neighbor. I wondered if my students meant what they said, or if most of them were just failing to really imagine what a perfect world would be like.

I don't wonder that about PZ--he's incredibly intelligent, and unlike my undergraduate students, has more than enough life experience to understand what he's rejecting. The fact that he says he would reject an offer very similar to the one I was proposing to my students makes me wonder about the morality of my willingness to accept the perfect world.

Just as I would probably sacrifice my free will to have my ideal world, I'm pretty sure I'd be willing to support a religion I knew was bogus so long as it was otherwise entirely benign. That would mean that it made minimal false claims. It wouldn't prohibit the teaching of scientific truths like evolution and the big bang; discriminate against anyone because of gender, race, sexual orientation, etc.; or censor its critics. It would probably, like animism or panentheism, make minimal claims about the nature of divinity. But I'm pretty sure my skepticism about its metaphysical ideas wouldn't trouble me much if it meant that I, and the people I love, would live happy, healthy lives with no worries about money, inequality, or nuclear annihilation.

I might not drink the Kool-Aid, but I also wouldn't run around knocking the cups out of other people's hands.

So does this make me cowardly or immoral? Is there a difference between those and pragmatism? Or is there anything I've missed that should change my mind? Feel free to enlighten me.

Thursday, January 27, 2011

Horror Bored

I don't remember the last time I watched a horror movie that I really loved. That's weird, because for several years, I hardly watched anything but horror movies, and had a long list of more that I needed to see. Most of that list feels long forgotten now, as do the times when I felt certain that almost any horror movie you put in front of me would offer at least a little excitement.

The first horror movie I ever saw was the Japanese version of Ring 2, which I saw in a theater in Osaka on my 21st birthday. In other words, I was a late bloomer to the genre, largely because of a squeamishness about gore. But I loved Ring 2, perhaps all the more because I couldn't really understand much of it through the language barrier and having not seen the first movie. When I got back to America, I started devouring horror movies, though I still shied away from the more disgusting ones.

In 2005, as I mentioned in an earlier post, I ended a horrible relationship, and spent a lot of time feeling miserable and nihilistic. Desperate for any kind of catharsis, I started watching the gore films that I had previously avoided, and eventually found that it gave me a sense of pride, having overcome a fear that I had been carrying since childhood. I came to consider myself something of a gore movie connoisseur, and remember with a sick fondness the time circumstances conspired in such a way that I ended up watching the fake Japanese snuff film Flowers Of Flesh and Blood four times in one week.

As the remake trend of the mid 2000s got into full swing, I retreated deeper into low budget, foreign and retro horror, and that's where my interest started to wane. At the risk of blaspheming, most of that stuff is a lot more interesting to read about than it is to actually watch. I can't count the number of times I would read about a movie in one of the Psychotronic guides, excitedly track it down, then spend most of the run time bored out of my mind. As I got more interested in video games, I drifted farther and farther from horror (and therefore from movies in general). And it should go without saying that Hollywood wasn't doing anything to bring me back.

My growing interest in skepticism didn't help matters either. Even early in my life when I was a Christian, I was pretty apathetic about the existence of things like ghosts, demons, and anything you could call paranormal. But reading books by the likes of James Randi and Richard Dawkins had brought me to the realization that such beliefs are completely untenable. This really hit home for me when I saw The Exorcism Of Emily Rose. A girl in the row behind me spent most of the movie crying and having to be comforted by her friends, while I was fighting to stay awake. If you don't believe in the devil, possession's just not very scary.

That's not to say a well-made horror movie can't still creep me out a bit, even if I don't buy its premise, but well-made horror movies are getting harder and harder to find. Even those with a couple of interesting ideas or good performances always seem to wreck things in the end by explaining too much. If you understand a problem, you can at least try to solve it. Real fear is not knowing what you're up against. Does anyone think that Paranormal Activity wouldn't have been scarier if it had ended with the camera on an empty bed while the young couple screamed their guts out downstairs? Did the girl coming back to (apparently) swallow the camera add anything?


Hopefully one day horror movies will interest me again--I've had some great times with them, and would like to again. But, ironically, they've gotten too cowardly to do anything that's really frightening.

Wednesday, January 26, 2011

The Apophatic Theology of Indie Games

According to apophatic (or negative) theology, God is ineffable, beyond the boundaries of human language. As such, the only meaningful way in which we can speak of God is to say what God is not. While this sort of theology has become a refuge for some modern day Christian thinkers (most notably Karen Armstrong), its roots go at least as far back as the Neoplatonic philosopher Plotinus. And now Michael Thomsen, in a piece for IGN, has applied it to indie games.
Like hipsterism, 'indie' is a state of mind better defined in terms of what it isn't. 'Indie' isn't Bobby Kotick, Wii Fit, Gears of War, or Nathan Drake. 
I like the idea of describing indie games apophatically. It gets right to the heart of how vapid and puffed-up most of the indie scene is.

Here's the problem with apophatic theology: it only works as long as everyone brings the right presuppositions to the table. In his article "God is the Question," apophatic apologist Mark Vernon writes:
Whatever God might be, God is not visible: God's invisible. Whatever God might be, God cannot be defined: God's ineffable. Nothing positive is said. But nonetheless something is said of God.
Well, something is said if you've already accepted a certain fundamentally mystical idea of divinity. If you haven't, you might wonder whether speaking of God in this way actually draws a distinction between the divine and the non-existent.

There's a bigger problem, though. The apophatic view of God falls apart if you start not-saying the wrong things. We can all rub our chins and ruminate on the mystery of the ineffable, but we'd look rather silly smoking our pipes and holding forth on God's fundamental ungerbilness or unforkness.

Take another look at that list of things Thomsen says indie games aren't. Does that actually tell us anything meaningful about them? Of course it doesn't. What could it actually mean to say that a video game isn't Nathan Drake? It's a category error, like asking what purple sounds like. I realize that Thomsen was being tongue in cheek, but that doesn't mean we can't glean some insight from his comments. The indie scene, as celebrated in forums like IGF, is just like the God of apophatic theology: an artificial construct, meaningful only to those who have the right set of presuppositions.

Having said as much, though, we indie game skeptics (and there don't seem to be many of us) open ourselves up to the same claim that snarky Christian apologists often make against atheists: "Why spend so much time arguing against something you don't believe exists?" The answer is simple, though. Believing in something that doesn't exist can have negative effects.

Thomsen is right when he says that "in film and music ['indie' is] a wheezing stereotype long since discredited." By aping the same posture as indie rock and indie movies, indie games have inherited the same disease. Most of them pander every bit as blatantly as their mainstream counterparts, just to a different audience.

It's that pandering that's the real problem, and the reason that I'll keep being skeptical about indie games (and movies, and music). Take away the enablers who demand more pretension and forced quirkiness, who desperately want to define themselves negatively against some (equally imaginary) mainstream, and games will get better. What things aren't doesn't matter--what they are is everything.

Tuesday, January 25, 2011

Does Not Liking Hip-Hop Make Me a Racist?

A few months ago, I replied to a friend on Facebook who was asking people to list their 10 favorite albums. I didn't think my list was particularly controversial, especially to people who know me. Most of my favorite bands were operating in the UK from the late '80s to the mid-'90s. Most were part of the shoegaze movement that took off around that time. Like others who responded, most of the albums on my list landed in one or two closely related genres. It never entered my mind that my love of fey British alternative bands would surprise anyone.

It did, though, and soon the "whiteness" of my list was being mocked--exclusively by other white people. I wasn't offended, since the mockery was good-natured and pretty funny. But after yesterday's post, in which I might have implied that mainstream music critics spend so much time talking about hip-hop because they don't want to be perceived as racists, I thought about the Facebook incident again. Specifically, I thought (as I did at the time) about why it was acceptable for me to be mocked for listening to so much "white" music when I would never have considered firing back that others' lists were too "black."

These are hardly profound thoughts. We're all aware of concepts like "reverse" racism, white guilt and tokenism. While I don't think that white people who love hip-hop are necessarily guilty of engaging in any of the above, I have to admit that my comments yesterday were intended to make readers question whether any are present when white hipsters heap praise on "Kanye." Of course it's completely possible for someone to genuinely love both hip-hop and indie rock, but one wonders whether it's really possible to connect with both in the way some critics claim to.

While I don't think there's a genre of music out there that I inherently dislike, I have to admit that very little hip-hop appeals to me. When it does, it's generally in the vein of early '90s Public Enemy and Ice Cube records, and the appeal is almost entirely technical. Those were some of the best-produced records of their time, but that's as far as my interest goes. I don't--and, thanks to my socio-economic background, can't--identify with the sentiments being expressed in the lyrics. As much as I appreciate the craftsmanship of the beats and the significance of the chaotic soundscapes, it's an intellectual appreciation. My favorite Ice Cube track doesn't move me the way even my least favorite Cocteau Twins track does.

I've always felt like this is a pretty honest assessment of the situation, and not one that leaves me open to charges of racism. I've always rejected the claim, often made by hip-hop's detractors, that it's an inherently inferior form of music since it is often based on samples and allusions rather than completely original musicianship. The same complaint has been lodged against musical styles that I love, like electronic and industrial, for years, and it has always rung just as hollow. I don't have a grudge against hip-hop, but that doesn't change the fact that none of it has ever spoken to me at a level that made me cherish it the same way I cherish records by Slowdive, The Cure, The Pet Shop Boys, and so on.

I don't think my taste in music is "too white," nor do I think that others' taste is "too black." You should listen to the music that moves you, makes you feel sorry for all those poor bastards who didn't live long enough to hear it. Otherwise, you're just wasting your time.

Monday, January 24, 2011

Music Journalists Are Incom--Oh, Never Mind

'I give props to Bruno Mars' Doo-Wops & Hooligans, but I'd rather get lost in Ariel Pink's trippy Before Today, which sounds like an album made by an alien who visited Earth in 1976, listened to a ton of AM gold, then tried to replicate the sounds he heard, from very imperfect memory, some 30 years later—check out "Can't Hear My Eyes" and "Menopause Man."'


This snippet of an article from Slate's Jonah Weiner encapsulates why I don't read much music journalism anymore. I like the idea that the purpose of criticism is ultimately to tell the audience about one's own subjective experience of a work, but there's no lazier way to do that than with tortured similes like the one quoted here. The only thing that can push me away from an article faster is when some cheeky writer decides to invent a new genre to describe a not-particularly-original artist, as when Pitchfork described Marissa Nadler's sound as "narco-folk."

Even when I do manage to get all the way to the end of a piece of music writing these days, I usually find that
every critic has pretty much the same tastes: mostly indie darlings like Vampire Weekend, as well as a few hip-hop superstars (usually Kanye West, or just Kanye as he's invariably called) thrown in to prove that they're not snobbish racists.

Weiner goes on in the same article to praise records that "[burst] with ideas and references and signifiers that can be like oxygen to people whose jobs necessitate that they find interesting, involved things to say about music all day." But is it really interesting and involved to play a public game of spot the allusion with every album you listen to? I submit that Weiner's need to point out that he got it when Vampire Weekend referenced The Source and Wire demonstrates that it is not.

I used to devour music magazines in order to discover new artists, but these days I'd vastly prefer to let Last.fm or Pandora serve that purpose. I still love music, but I've found that I don't much care what artists have to say about their own work, much less what most music critics have to say about it. I imagine this has something to do with my taste for ethereal and shoegaze bands, who put sound above message. I've never wanted to hear Kevin Shields or Liz Fraser say what their songs are really about, because I suspect the truth couldn't possibly live up to my experiences.

Maybe if I listened to more music in which lyrics are of central importance, I'd feel differently, but then again maybe not. A few years ago when I was obsessed with Joanna Newsom's Ys, I intentionally avoided any discussion of the songs' meanings. I knew what they meant to me, and that was good enough.

I'm sure a lot of artists and critics would be appalled by this, but at least I know I'm not a hypocrite. I've written music for most of my life, and one of the most thrilling moments I ever had as a songwriter was when a friend told me what she thought one of my songs meant. She was completely wrong, but I didn't care. I was happier that she had imposed her own subjective meaning on my lyrics than I would have been if she had known exactly what I was singing about.

By now I've completely lost the plot of what I was even writing about at the beginning of this post, so I'm not going to sum up. I'm just going to implore music journalists to be more concerned with passion than the need to make sure everyone knows that they get it.

Sunday, January 23, 2011

Smugness

In my last post, I wrote about a recent trend in liberal Christianity, namely lashing out at the so-called New Atheists for failure to approach disbelief in God with a certain degree of sadness and gravitas. New Atheist writers are often deemed insufficiently intellectual on the basis of tone and unwillingness to be miserable in a godless world--but never, as far as I've seen, on the basis of specific arguments that they've made.

Despite the fact that I took Scott Stevens to task for this in my last post, I want to point out (in the interest of not being unphilosophical myself) that I don't think it's necessarily impossible to make the case that he was trying to make--namely, that the only proper response to disbelief is a kind of monkish solemnity. It's just that neither Stevens nor the leading names in liberal Christianity have even tried to make that case yet. They've asserted the conclusion, then scoffed at people like me who ask for an argument to support it, as if it were self-evident even to children of below-average intellectual capacity.

In short, they do just what they accuse the New Atheists of doing: substituting smug self-confidence for rigorous argument.

This is irritating not just because hypocrisy is always irritating, but because it so willfully ignores the actual state of affairs. I'm sure there are smug atheists--there are smug people in all walks of life, and all belief systems. While I don't deny that the New Atheists have all openly mocked religion, I do reject the implication that mockery and smugness are always the same thing. This leads me to believe that the New Atheists' critics are either reading smugness into attacks on their position, or they aren't reading the New Atheists' books.

One of the most famous bits of supposed smugness in the New Atheist canon is Richard Dawkins's screed against the God of the Old Testament:
The God of the Old Testament is arguably the most unpleasant character in all fiction: jealous and proud of it; a petty, unjust, unforgiving control-freak; a vindictive, bloodthirsty ethnic cleanser; a misogynistic, homophobic, racist, infanticidal, genocidal, filicidal, pestilential, megalomaniacal, sadomasochistic, capriciously malevolent bully.

That's harsh, but it's not smug. If it were smug, it would be focused on deriding the stupid faithful who believe in the existence of a divine super-being that any idiot can see doesn't exist. But Dawkins isn't doing that. His ire is focused on what he believes is a fictional character, not the people who think the character isn't fictional.

Moreover, while I'm sure Dawkins believes wholeheartedly in this assault, I'm equally sure that he recognizes the humor inherent in it. And a good part of that humor plays on the thought of how much trouble Dawkins is in if it turns out that he's wrong. That's not smugness--it's practically self-effacing.

So why shouldn't Dawkins and the other New Atheists have a sense of humor about disbelief? How is this any different from saying that Christians should go shuffling around, staring down at their navels in abject misery, because they've realized they don't believe in Allah? Of course some people do, in fact, feel sad when they cease to believe in God. But it doesn't follow that all of us must feel sad about our lack of belief.

To say that we must is...well, smug.

Saturday, January 22, 2011

"Unphilosophical"

Scott Stevens, Religion and Ethics Editor of ABC Online (that's the Australian Broadcasting Corporation, not the American Broadcasting Company), has written a piece called "The Poverty Of the New Atheism," the content of which should be easy to discern from the title. As with most such pieces, Stevens is unable to discuss the ideas of the New Atheists without spending most of his time criticizing their delivery. In fact, he's so focused on the tone of Dawkins et al. that he forgets to even make an argument.

Therefore, as much as I'd like to, I can't say that Stevens commits an ad hominem fallacy when he calls the New Atheists "unphilosophical," because to call it a fallacy would imply that Stevens has made an error in his reasoning. He hasn't, because he's not reasoning so much as lashing out.

There seems to have been an innate sense among atheists that the Promethean quest to topple the gods demands a certain seriousness and humility of any who would undertake it. Hence those atheists worthy of the name often adopted austere, chastened, almost ascetic forms of life - one thinks especially of Nietzsche or Beckett, or even the iconic Lord Asriel of Phillip Pullman's His Dark Materials trilogy - precisely because our disavowed idolatrous attachment manifest in practices and habits and cloying indulgences, and not simply in beliefs (this was Karl Marx's great observation about the "theological" dimension of Capital).
By comparison, the "New Atheists" look like sensationalist media-pimps: smugly self-assured, profligate, unphilosophical and brazenly ahistorical, whose immense popularity says rather more about the illiteracy and moral impoverishment of Western audiences than it does about the relative merits of their arguments.
In short, Stevens is all for criticism of religion so long as one does it with a proper attitude of reverence for what one is criticizing. This is nothing new--it's become quite the popular sentiment in liberal Christian circles since the New Atheists' rise to fame. But it's not an argument. It has nothing to do with what Dawkins, Harris, Hitchens and Dennett have said, and everything to do with how they have said it. Stevens does go a bit farther than previous critics, though, in directing some scorn at the New Atheists' readers as well. (So is Stevens also illiterate, or has he not read any of the books he's criticizing?)

But for all this abuse to constitute a fallacy, Stevens would have to be using it as a counter argument, and as nearly as I can tell, he's not. He yammers on about Marx for a bit, but it's just paragraphs of throat clearing.

But Marx's critique of religion has an unexpected twist, a barb in the tail that implicates [the New Atheists] by exposing the deeper complicity concealed by their cynicism. For, to be "dis-illusioned" in Marx's sense is not heroically to free oneself from the shackles and blinders of religious ideology and thus to gaze freely upon the world as it truly is, as Dawkins and Harris and even Hitchens would suppose.
Rather, to be "dis-illusioned" is to expose oneself to the anxiety of the bare, unadorned fact of one's existence, to live unaided beneath what Baudelaire called "the horrible burden of Time, which racks your shoulders and bows you downwards to the earth".
Being one of the New Atheists' illiterate readers, I barely feel qualified to comment, but it seems to me that Marx's "dis-illusionment" is precisely to "gaze freely upon the world as it truly is." Unless Stevens is suggesting that exposing "oneself to the anxiety of the bare, unadorned fact of one's existence" is somehow to construct another illusion to replace the religious illusion one has cast aside.

This still isn't an argument, though. It's just a contrast, and a confused one at that. Stevens accuses the New Atheists of cynicism, but praises Marx for recognizing that the world as it truly is, is a pretty terrible place. Illiterate though I may be, I have at least read Dawkins, and his message is that life gets better when you cast off religious illusions, not worse. What's really bowing us downwards to the earth is not reality, but the delusion that an omnipotent, omniscient being is judging us every moment of every day. The good news is that we all have the innate capacity to see through the delusion, if we want to use it. That's not cynicism--if anything, it's too optimistic.

Stevens, on the other hand, thinks that getting rid of illusions should leave us with nothing but misery. Why? His concern with being philosophical might have prompted him to offer an argument, rather than insults and appeals to Marx. Stevens, it seems, is not even unphilosophical--he's just angry and, yes, cynical. That doesn't make him wrong, of course--it just makes him uninteresting.

Friday, January 21, 2011

Good Games Will Save Us

A Twitter friend of mine (and a damn fine writer), Mark Whitney, has written a new piece entitled "Indies Save the Industry." If you know anything about my feelings toward indie games, you know that title makes it impossible for me not to respond.

The thrust of Mark's article is actually pretty inoffensive, even to me: independently developed games have a lot to offer the video game industry. As I said when discussing my favorite games of 2010 a while back, I agree wholeheartedly with this, and the reason I generally have such scorn for indie games is that they so often squander their independence by rehashing ideas that were threadbare twenty years ago. 

Mark also states upfront that he's not actually sure that the game industry needs saving, so I won't devote too much time to taking the title apart. It's hyperbole, but I have no problem with using eye-catching headlines to get people to read articles that are far more nuanced than those headlines suggest. I agree with Mark's basic premise, that the video game industry could use a big infusion of creativity.

What I disagree with is Mark uncritically repeating the meme that indie games are, by their very nature, more innovative than anything put out by major publishers. The only example he cites is Narbacular Drop, the game that would become Portal after Valve hired the students who made it. It's a good example of what a small team with a great idea can do when they don't have a marketing department demanding more blood and bigger boobs.

But Narbacular Drop undermines Mark's premise as much as it supports it. Most people who know about the game know about it precisely because the team that made it was absorbed into a corporate entity that gave them the resources to perfect the concepts with which they were experimenting. If anything, Narbacular Drop is an argument that indies should sell out to the company they think is most likely to put the most faith in their best ideas. Yes, the idea was conceived while the developers were independent, but it wasn't fully realized until they got their hands on some dirty corporate money.

The reasoning behind the "indies as creative saviors" meme (when there is any reasoning at all) comes from the notion that creativity is always best when it's unconstrained. But that's a huge oversimplification. Look through any artist's sketchbook, listen to a band's demo tapes, and it quickly becomes clear that refinement and editing are essential to the creative process. Knowing when and how to edit one's own work--which good ideas are good for the project at hand and which are good in a vacuum--is essential. As much as indie fans are loath to admit it, there are people at major publishers who have great insights into this subject. 

Of course I would be wrong to pretend that nobody in the indie scene realizes this basic fact. We have to keep in mind that video games cost a lot of money to produce, and even indie developers who really want to polish their ideas often don't have the resources to do so. As Mark points out, there's a reason Blizzard's games are as good as they are: they can take as long as they want to release them. Indies don't have that luxury, but they could get closer to it by partnering with publishers or producers who believe in their ideas--and have access to the coffers of an EA, a THQ, or even an Activision.

Of course I realize that my vision of a world in which well-to-do publishers sink money into worthy small-budget projects for the betterment of everyone is utopian. But it's no less misguided than the assertions of people who have looked into the dregs of Xbox Indie Games or the iTunes and Android marketplaces and still claim that indies are, on the whole, more innovative than anything in the mainstream. Good ideas can come from any size team with any size budget, and indies who partner with major publishers aren't necessarily going to be drained of all creativity. We only hurt ourselves as gamers when we pretend otherwise.

Thursday, January 20, 2011

Snow!

Here's the weather I woke up to this morning:



And here's Lovesliescrushing, my favorite band to listen to when it's snowing:

Wednesday, January 19, 2011

How To Launch a Console

Today, Nintendo announced the launch line-up for its next handheld console, the 3DS. Like many launch line-ups, it's completely underwhelming. I've already pre-ordered a 3DS, and still plan on buying it (I mostly pre-ordered because that's the only way to get a new console in the first year of its existence these days), but the announcement did convince me that Kid Icarus: Uprising is likely the first game I'll buy for the system.

That got me thinking, though--have any consoles had a less interesting launch line-up? Have any been wildly better? Before I answer that, go take a look at the 3DS launch line-up. Done? Do any of those titles drive you wild with desire? Didn't think so.

The truth is, though, very few of the most influential consoles in history had great launches. In fact, the most successful launches tend to be the least impressive quantitatively. Systems that launch with over 10 games tend to offer very little of note (e.g. nearly every post-Dreamcast console), though there are some exceptions (the U.S. launch of the NES, which included Excitebike and Super Mario Bros., and the ColecoVision, whose 12 launch games were all perfectly geared to showing off the machine's technical superiority).

Launches with few games can be great successes, if enough care is taken. This is something Nintendo used to recognize, as the SNES and N64 both had strong, if vanishingly slim, debut line-ups. Even the TurboGrafx-16's launch was half-great, featuring Monster Lair (yes!) and Fighting Street (no!). But of course low quantity doesn't guarantee success, as the Atari Lynx, Sega Master System and Virtual Boy (among others) readily attest.

All we can glean from this is that there's no perfect formula for a launch. Ideally, the games will play to a system's strengths, as the SNES and, to a far lesser extent, Sega CD launches did. Having good games isn't necessarily enough--the games need to show why the new console is superior to anything else on the market (which is why Sony purportedly tried to minimize the number of 2D games released early in the Playstation's life). A long list of titles may look good in press releases, but odds are most of them will be forgettable, so it's probably better to focus on putting out a few highly polished games rather than loads of half-assery that will be populating bargain bins within six months.

Unfortunately, the latter is what I see when I look at the 3DS line-up. Nintendo's first-party games tend to hold their value, though I have my doubts about Steel Diver, given Nintendo's history with submarine games. Super Street Fighter IV will probably be the biggest hit, but personally I hate fighting games on handhelds, so I have no interest. And as for Resident Evil: The Mercenaries--when has plucking a bonus mode out of a full game and selling it as a standalone product ever worked out well?

Time may prove me wrong, but I don't expect to get much use out of my 3DS in the first few months. Still, when Christmas rolls around and good games finally start trickling out, I'll be glad not to fall victim to another of Nintendo's notorious, demand-increasing hardware shortages.

Tuesday, January 18, 2011

Oversimplifying Abortion

Apparently a creationist website called Uncommon Descent is attempting to get 25 influential atheists (some may say the 25 most influential) to answer a list of grossly oversimplified questions about what rights, if any, fetuses and newborn babies have. Seriously, go read that list of "five simple questions," for which UD is only willing to accept yes or no answers. The fact that anyone thinks such questions are simple goes a long way in explaining the vitriol fundamentalists spew at anyone who sees abortion as a viable, morally acceptable action.

Well, I may not be one of the 25 most influential living atheists, and I'm definitely not going to play by UD's rules, but I am going to answer the questions. Let's just get this out of the way upfront, though:


I realize these questions are designed to make abortion rights supporters look like monsters, but I'm also hoping that giving reasonable answers to unreasonable questions will help expose the intellectual deficiency and/or dishonesty behind the way a segment of the pro-life side tries to frame the debate.

(a) Do you believe that a newborn baby is fully human? This one's actually easy. Of course a human baby is fully human. But wait..."fully" human? Why "fully"? Given the context of these questions, the inclusion of that word seems to suggest that there could be such a thing as a half-human, or maybe a quarter-human, and further that such an entity might have a different moral status than a full human.

The only way I would consider giving a different response to this question is if the context were evolutionary, and we were speaking hypothetically about a transitional form between humans and a new species that evolved from humans. But then the question would be irrelevant. Species divisions are arbitrary and man-made, and there's often quite a bit of argument about which species a given specimen belongs to. So maybe a newborn human baby could be closer to a non-human species than either of its parents, but it's hardly a yes or no question.

(b) Do you believe that a newborn baby is a person? I loathe this question. The argument over what constitutes personhood is a huge quagmire that is, in my opinion, best avoided even when doing ethics. I vastly prefer to leave everyone to her own definition and look for solutions that work no matter what that definition is. This is why I love Judith Jarvis Thomson's defense of abortion. Rather than arguing over whether fetuses are persons, she simply concedes that they are (on whatever definition of "person" you prefer), then argues for why abortion is still morally permissible.

That said, I tend to view personhood as falling on a continuum between the ability to experience flourishing and suffering, and the lack of said ability. As with all these "simple" questions, there's no easy answer, so I won't give one. Instead, I'll just say that I think it's possible for babies to be more or less persons.

(c) Do you believe that a newborn baby has a right to life? My position on rights is that talk about them is meaningful only within a legal context. You have rights only to the extent that your society grants them. Of course we can argue about whether granting more or fewer rights is in a society's best interests, but convincing me that rights are inherent to human beings would require a truly brilliant bit of metaphysical reasoning.

So technically my answer to this question is "If we're talking about America, then yes, I believe that a newborn baby has a right to life, because it is granted under American law." But let's remember that there's a huge ceteris paribus there, and if a baby is born with no chance to lead a productive life (that is, with no chance to experience anything approaching a reasonable degree of flourishing), then it has less of a right to life than a baby born with a normal capacity for flourishing. Still, I want to make it clear that I see this as an entirely legal question, and not a moral one.

(d) Do you believe that every human person has a duty towards newborn babies, to refrain from killing them? This one actually is simple: absolutely not. Having given that answer, though, let's quibble a bit about semantics. How does the questioner define "killing"? Is he referring to murder, or any action that would directly cause the end of the baby's life? If it's the latter, then it's easy to think of examples in which "killing" a baby is morally permissible, e.g. taking a child with no chance to ever breathe on its own off of life support. If it's murder, then I would still say "no," but with the caveat that it's much harder to think of actual world examples in which I would say such an act is morally permissible. In terms of possible worlds, though, it's much easier. Imagine a possible world in which you could travel back in time and murder Hitler in the crib. Would you have a moral duty to do so? I say you would.

(e) Do you believe that killing a newborn baby is just as wrong as killing an adult? This question comes with the same semantic quibble as (d). If we're talking about causing an end of life, then there are medical cases in which "killing" is not morally wrong, and may even be morally required. If we're talking about murder, then yes, I think a newborn baby has the same rights (again, using my legalistic definition of "rights") as an adult. It's not inconceivable to me that there could be a case in which murdering a baby is the moral thing to do, but I don't know of any such cases in the actual world as it exists at this moment.

My answers to these "simple" questions aren't as important as pointing out how messy and difficult the questions actually are. Only someone firmly in the grip of dogma (religious or otherwise) could think otherwise.


Monday, January 17, 2011

Momus - Hypnoprism

I only discovered yesterday that Momus, once one of the top figures on my list of musical heroes, had put out a new album last year. Hypnoprism isn't a bad record--it's certainly better than his last, JoeMus--but it has me wondering again why he won't just write the songs he seems to want to write.

For the last few years, I've felt like Momus was losing his way. JoeMus was the culmination of that, unbearably self-indulgent and self-sabotaging. After three increasingly absurd and experimental albums (Oskar Tennis Champion, Otto Spooky, and Ocky Milk), JoeMus felt like it wanted to be a catchy, frivolous pop record, but couldn't escape its creator's desire to remain a part of the experimental scene in which he had worked so hard to be accepted. Almost every song found a way to destroy itself just when it was getting good.

Hypnoprism may be a light at the end of a long tunnel, or it may just be a sign that Momus has run out of ideas. On one hand, it eschews the excessive use of pitch shifting and time-stretching that made JoeMus so tedious. On the other, it sounds a little like a greatest hits collection populated with songs that were never actually released before. There are distinct echoes of Momus's heyday, with several songs recalling the 1996 album Ping-Pong, and others going even farther back than that. It's nice to hear that Currie can still write a (relatively) straightforward pop tune, but it's also a little worrying to hear him going back to the same old themes yet again. "Evil Genius" and "Death Ruins Everything" are songs he's already written several times over, and "Datapanik" (a eulogy for a crashed hard drive) might have been witty in 1999, but now, it's a little too universal.

At least Hypnoprism ends on a high note, back-loaded as it is with the two best songs on the album. The first, a cover of Josef K's unrecorded "Adoration," is the most successful realization of Afropop yet in Momus's catalog (and surely inspired by his recent collaboration with Vampire Weekend). The second, "Strawberry Hill," is a brilliant pastiche of Ryuichi Sakamoto and Herbie Hancock. It's both the freshest and most refreshing track on the album--if only more of it could have sounded like that!

Two things still appear certain: Momus will continue to make records, possibly until the day he dies; and he will continue to be equal parts fascinating and frustrating.