Sunday, February 20, 2011

Increasing Boredom

I've finally started reading Sam Harris's most recent book, The Moral Landscape. While I'm not Harris's biggest fan (I've always felt that he completely lacks the optimism and wit that make his peer Richard Dawkins so worth reading), I was still interested to see how one of the big New Atheist authors would handle my own area of specialization, ethics.

It's still too soon for me to make any pronouncements about that, having just scratched the surface of the book, but this endnote from chapter 1 leaves me skeptical about Harris's entire project:
...I am convinced that every appearance of terms like "metaethics," "deontology," "noncognitivism," "antirealism," "emotivism," etc., directly increases the amount of boredom in the universe. My goal...is to start a conversation that a wider audience can engage with and find helpful. Few things would make this goal harder to achieve than for me to speak and write like an academic philosopher.
As someone who spent two years working on a thesis that dealt with almost all of the concepts that Harris thinks so increase boredom in the universe, I obviously disagree with his assessment; but that's beside the point. There are two things that really bother me about this, and make me suspect that The Moral Landscape is going to be shallow and, itself, quite boring.

The first is that Harris seems to be completely missing the point of popular scientific and philosophical literature, which is to present complex ideas in such a way that audiences without an academic background in those areas can begin to understand them. One does not do that by avoiding the subject, but by writing about it clearly and as simply as possible (and, yes, with minimal jargon). Here's another reason I prefer Richard Dawkins. His recent book The Greatest Show On Earth, in which he presents the evidence that supports the theory of evolution, doesn't shy away from dense subjects like genetics, geology, etc. It tackles them head-on, and Dawkins proves that he has a gift for writing about them in such a way that lay audiences can grasp their relevance to evolutionary theory, even if we couldn't afterwards teach a class on the subjects. Harris is essentially saying that his goal is simply to avoid subjects that might bore the reader (or him). That's terrible popularizing.

It's also a double standard. When I took a course on the philosophy of mind in my first semester as a grad student, we read a lot of pure neuroscience (Harris's own area of research, and one that is of central importance in The Moral Landscape) early on. I found most of it deadly dull, but trudged ahead anyway because I needed to know it to understand the more philosophical (and to me, more interesting) papers we would be reading later on. Harris simply assumes that what is interesting to him is interesting to everyone, and that what isn't interesting to him isn't worth talking about, even if it directly impacts the thesis of his book. That's not true, and I'm willing to bet that it will leave his argument feeling insubstantial in the end.

The second problem with this is that it makes Harris sound like one of the religious people who have come in for such scathing criticism in his other work. Yesterday I linked to Jerry Coyne's thorough rebuttal to Mark Vernon's incredibly ill-informed article on evolution. I suspect that Vernon feels about evolution much the way Harris feels about moral philosophy. He recognizes that it's an important topic (if only because it's often held up as a problem for a position he's committed to defending) and feels compelled to pontificate on it, but all that research he would have to do to actually understand the subject is just so boring. There's nothing wrong with being dilettantish, if all you're concerned about is cocktail party conversation. But if you're going to present yourself as some sort of authority, you really ought to take the time to learn about your subject in detail, even the parts that don't interest you much.

I'm going to continue to read The Moral Landscape, hopefully with an open mind. It is, of course, possible that Harris is some kind of philosophical savant, and that he'll be able to present answers to questions that actual philosophers have been debating for centuries, despite finding the subjects too boring to engage with meaningfully. However, I have a feeling that, in the end, Harris will be revealed to have taken a vital metaethical question for granted, and we'll see that he had another reason for refusing to engage with the subject: he has no good argument for his underlying assumptions.

Saturday, February 19, 2011

Read Someone Else

You should be reading Jerry Coyne's great blog Why Evolution Is True anyway, but if you aren't, a post from earlier today might persuade you. Since I likely won't have time to write anything substantive today, it seems only fair to link to something far more substantive than what I usually write.

I have to admit, though, there's something about arguments like the one being critiqued there that I find fascinating. It's not that they're persuasive, because they aren't. But like I said in my previous post about quantum nonsense, I really wonder what it must be like to believe what some people believe. I want to know what they experience when they completely misrepresent an entire field of study. It seems too simple to say that they're being intentionally deceptive.

Friday, February 18, 2011

The 1,000 Console Future

Having already announced a new gaming handheld and a smartphone, Sony now looks to be preparing to release an Android tablet as well. These three new devices, combined with Nintendo's 3DS, Apple's three mobile game-playing devices, the wide array of Android devices, Windows Phone 7, OnLive, the three current-gen home consoles, and Steam serving both PC and Mac, make the "one console future" that Denis Dyack was evangelizing four years ago look pretty silly.

Of course the current proliferation of platforms won't be able to continue indefinitely, but its existence is still a good thing for now. If any one of Sony's experiments pays off, it will be because stiff competition from Apple, Nintendo and Microsoft drove them to make a device that did things that others didn't. The same can be said of Apple, Nintendo, HTC, and so on. Things may be starting to stagnate a little, with everyone focusing a little too much on keeping up with Apple rather than surpassing them, but for now it's good enough that a thousand flowers are blooming.

That's why I disagree, at least in part, with Chris Kohler's opinion piece on the subject at Wired. He compares the current situation to the mid '90s, when Sega, Philips and 3DO were flooding the market with hardware, creating confusion among consumers that would eventually lead to all three getting out of hardware development entirely. That analogy is good, but not perfect.

For one thing, devices like the 3DO, CD-i, 32X and Sega CD were, themselves, confusing. The first two wanted to be more than just video game consoles, but they failed completely at creating an identity beyond the nebulous concept of "multimedia" devices. Nobody, even the people creating content, knew what the hell multimedia was, which meant they couldn't make a compelling case for why the general public should care about it. The Sega CD and 32X might have made sense on their own, but not as add-ons for the Genesis released in such close proximity to each other.

Sony doesn't have the same problem. People will know at a glance what the NGP, the Xperia Play, and the tablet (if it exists) are, because they've seen them before. Consumers have the concept of "handheld game system," "smartphone" and "tablet". There's no confusion with the hardware. There's a temptation to say that the Android marketplace is where the real confusion will come in, but let's not forget that Android phones are outselling all other smartphones as of 2010.

All this is not to say that Sony is in the clear--R&D costs on these devices must be astronomical, and none of them are going to be cheap (except maybe the Play). But I'm not ready to say just yet that they're a bad thing, or that the general proliferation of devices is a bad thing. I like choice, and I like the innovation that comes from competition. There are smart and dumb ways to compete, but it's way too soon to judge which strategy is which.

Thursday, February 17, 2011

On Offense

Despite reading a fair number of atheist blogs, I entirely missed this controversy (and some responses to it) from a little over a week ago. Which is too bad, because it covers a couple of topics that have been flies in my personal ointment for a while now.

If you don't want to read all of those links, here's the long and short of it. At an atheist conference in Alabama, a panel discussion was held on the problem of attracting more women to the movement. The panel consisted of five men and one woman, and one of the men repeatedly used the word "female" in a way that at least one audience member found troubling. When she commented, she was cut off by the panelist with a rude joke, and angrily left the room. Cue explosive bickering on several blogs.

Not to sound completely milquetoast, but I can identify with both sides here. During my brief time as the music director at my university's radio station, one of our DJs was an outspoken feminist who was so offended by the word "female" that she would spend her shifts going through our back-catalog and crossing out any use of the word (usually in a context like "this band has a female vocalist") and writing "woman" above it.

This struck me as a case of doing surgery with a chainsaw rather than a scalpel. She had complained to management before about comments from some male co-workers making her uncomfortable, and her concerns had been taken seriously, so I was never sure why she didn't talk to anyone about the "female" issue. As at least one of her "corrections" was on an entry I wrote, I would have liked the chance to explain my word choice, which had more to do with thinking "female" reads better than "woman" in some contexts than with any desire (conscious or otherwise) to dehumanize women.

On the other hand, though, there are words that grate on me in much the same way. For example, even though I know plenty of women who refer to other women as "chicks," I find that particular term cringe-inducing. I can't explain it; it has always bothered me, and likely always will. I personally think that my annoyance with the word "chick" is more justifiable than others' annoyance with the word "female," but really the whole issue is so subjective that finding common ground is likely to be no small task.

And that's the real problem here: determining whether particular offenses are worth speaking out against, or if they're minor annoyances that we just need to swallow. For example, I've decided that "chick" isn't worth fighting against, but "fag" is; but again, I know LGBT people who laugh off "fag" in cases that send me into a rage.

Am I irrational for being disgusted enough with the word "fag" to criticize those who use it? Was the woman at the conference who was offended by "female" irrational? That's a difficult question. I think that, to some extent, I am irrational. "Fag" offends me regardless of its context, but "retard" doesn't. That's at least inconsistent. But what are the options? Are we left with a dichotomy which says that either everyone should be offended by everything, or no one should be offended by anything? Neither option seems particularly appealing.

The closest thing that I can propose to an answer is that we all need more of that consciousness raising that Richard Dawkins likes to talk about, on a wide range of issues. That doesn't mean that we all become hypersensitive and humorless. But it does mean that we start paying more attention to how we use potentially loaded words. Words aren't inherently offensive; they become offensive because of the history of their usage. But just as understanding that history is essential to understanding whatever offense they may or may not cause, it's also essential in moving beyond mere offense.

I think there's a reason that very few people get up in arms about the prolific use of racial slurs in Quentin Tarantino's scripts, but an entire campaign was launched to try to stop kids from using "gay" as a pejorative. We assume that Tarantino knows the history of the words he uses, and uses them to the end of crafting a suitably sleazy world for his characters. To the extent that they elicit laughter, it's because audience and director are both aware of just how inappropriate they are. That is, we've had our consciousness raised.

Compare that to the kid on Xbox Live who uses "fag" like most people use commas. He doesn't know the history of that word, and isn't using it to any end other than being abrasive (or worse, he uses it simply because he hasn't thought about it). That kid is in desperate need of consciousness raising.

So what we need isn't more offense, more righteous anger, or more calls for those who disagree with us to have a sense of humor. It's more understanding, not just of those who have been offended, but those who do the offending. Nobody needs to be singled out and scorned--we all need a chance to explain ourselves.

Wednesday, February 16, 2011

Nostalgia For Nostalgia For a Time That Never Was

Occasionally I get incredibly nostalgic for the early '90s J-Pop subgenre called Shibuya-kei. It was some of the prettiest, most playful music ever created, and we'll probably never hear anything like it again.

Tuesday, February 15, 2011

They Want To Misunderstand

This article from the New York Times, about a group of biologists who set out to educate kids in the rural U.S. about evolution in honor of Darwin Day, has some good news and some bad news. It suggests that people in general aren't as aghast at the idea of their kids being taught about evolution as we might have suspected, but it also suggests that those who want the subject taught more widely are still making one small but important mistake.

First the good news:
The group’s small-town hosts took their own precautions. A high school principal in Ringgold, Va., sent out permission slips so parents could opt out of sending their children to the event (two did).
Only two sets of parents opting out of letting their kids learn basic science is, all things considered, pretty good. Yes, it could have been better (it could have been zero), but it was surely fewer than the event's organizers, and maybe even the principal, were expecting. Maybe I'm too hopeful, but what this suggests to me is that the teaching of evolution is not actually as controversial as its shrillest opponents would have us believe.

Speaking of those opponents, though, some educators may still be making it too easy for them to go on willfully misrepresenting the theory of evolution. I think the most important step educators can take in making evolution clear to younger students is to emphasize that there is no intentionality behind it. Of course phrasing it that way would open up a philosophical can of worms that teachers, understandably, would not want to deal with. Still, I think explanations like this don't go far enough in making the point:

Dr. [Craig] McClain, who wrapped up his Nebraska-Montana tour at a middle school on Monday, found himself explaining how giant squid evolved. 
“Smaller squids get eaten by everything,” he said. “It’s not a very good lifestyle to have.”
Hopefully McClain went on to make it clear that the change in squid size was driven by the fact that smaller squids died off, while their larger neighbors survived to pass on their genes to future generations, resulting in a larger population overall, and that this process played out over millions of years. Given that evolution's theistic critics love to claim that evolution happens by "random chance," and that change in species would require a driving intentional force, the above response doesn't go far enough. It leaves it open for some shifty apologist to say "What makes more sense--that the squids got bigger because they wanted to, or because God wanted them to?"

I don't know what McClain's full answer was, or what came before, so I don't want to suggest that he wasn't doing his job properly. But I have heard evolution's staunchest defenders talk about this subject in ways that are too ambiguous. When a large portion of your audience is primed to misunderstand you, you have to work harder.

Saturday, February 12, 2011

Observe!

In a new post at his blog Why Evolution Is True, Jerry Coyne takes apart the work of yet another scientist with New Age leanings writing for the Huffington Post. Since I've spent the last two days going after the religious tendency to use subjective experience in dubious ways, it's only fair that I point to a more secular ideology based on the same thing.

The post Coyne is critiquing is nothing new if you've read even a paragraph of someone like Deepak Chopra. It's a mangling of quantum mechanics which takes the technical notion of "observers" to mean "humans looking at things" (see Sokal and Bricmont's Fashionable Nonsense for a more thorough discussion of this common mistake), then assumes that anything true of subatomic particles must be true of all sorts of macroscopic objects. It then concludes that since reality is a construct of (human) observers, there's no reason our observing, and thus our existing, can't go on indefinitely.

I once applied for a job at a "natural foods" grocery store, and while waiting on an interview there, I overheard a hilarious conversation. Two young hippies were sitting at a table drinking Naked fruit juice and loudly talking about how all disease is imaginary, and if you can stop believing doctors and "the government" when they tell you you're sick, you'll become immortal. The passion with which they were discussing this nonsense was funny, but I also have to admit that I was a bit envious. What must it be like to really and truly believe that we're in complete control of reality? Surely it's more than a little intoxicating.

I feel the same way when I read peddlers of quantum nonsense. I think that some of them are cynical, and just pushing this stuff on readers who don't know any better (and don't want to), but still, for those who do believe, what a different life it must be. Of course it will end the same way as mine--the inevitabilities of disease or old age will put a permanent stop to our continued observations, and no amount of positive thinking will be able to stop it.

But I confess that I sometimes wonder, in the meantime, which of us is having a better time.

Friday, February 11, 2011

Living Subjectivism

Yesterday, I wrote about the importance to religious believers of arguing from personal experience. Peter Kreeft's "Twenty Arguments For the Existence of God" has several arguments which hinge on appealing to our perceptions of the world in order to prove that God exists. One, the Argument From Degrees Of Perfection, goes so far as to conclude that there must be objective facts about value judgments, and that one of those objective facts is God's existence.

The argument goes something like this: We notice that things vary in their characteristics, and those variations can be thought of as falling along points on a continuous spectrum. We often make value judgments about a thing based on its position on the spectrum for one of its characteristics. And we can extrapolate from those judgment-making behaviors that there is a fact of the matter at which we're trying to arrive.

This isn't entirely unreasonable. There often is a fact of the matter about whether our perceptions reflect reality, so the assumption that there could be a fact of the matter about our value judgments isn't totally unjustified. Anyone who has ever had a passionate argument about music, art, food, etc. should be able to grasp the reasoning here. If we don't, at some level, think that our judgments are right, what's the point of such arguments?

But that we behave as if there's a fact of the matter doesn't guarantee that there is, and the rest of the argument should make that clear. It continues, and here one can't help but think of the ontological argument, that we also make value judgments about beings. And if there is a fact of the matter about those value judgments, that means that there must be an objective standard against which they are being made--in other words, there must be at least one perfect being. That perfect being is God.

You may want to counter that the argument goes off the rails here--that nothing as grandiose as a perfect being is needed to explain why we value people who treat us well over people who treat us poorly. Kreeft responds that this is further proof of his point, for if there were no fact of the matter, if all value judgments were subjective, we would feel no compulsion to argue. "You can speak subjectivism," he says, "but you cannot live it."

That reply presents a false dichotomy, though. We "live subjectivism" all the time. We might argue about our tastes, and really want to convince others that they should share our tastes. But that doesn't require an objective truth, or at least not an objective truth about value. I might argue that you should appreciate my favorite band because I want you to support them by buying their albums. I also might just like a good argument.

This argument has the same problem that I pointed out about apophatic theology in an earlier post; to the extent that it sounds good at all, it only sounds that way as long as you're talking about the right things. Point to an area of easy consensus, like whether it's better to be loved or not, and nobody's going to complain too much if you conclude that, yes, it's really better to be loved than not.

The problem comes when you start trying to "live objectivism" about less high-minded topics. For example, if I like cold beer and you like warm beer, we couldn't put it down to different tastes. One of us must have the wrong tastes. The same goes for any value judgment you can think of. The Beatles or The Rolling Stones; chocolate or vanilla; cats or dogs; Halo or Call of Duty. If you can't live subjectivism, then there's an objective fact of the matter in every case. Does anyone actually believe that?

Of course my incredulity doesn't prove anything. Kreeft could always just bite the bullet and say that, yes, absurd as it may seem, there really is a right answer to every question of value, no matter how trivial. But again, consider the consequences of that. Not only would there be a fact about whether warm beer is better than cold beer; there would be a fact about whether a glass of beer at 51.02309340923475 degrees Fahrenheit is better than one at 51.02309340923474 degrees. Don't even think about saying that there could be a range of right answers dependent upon the ability of beer tasters to actually detect differences. That would be living subjectivism.

That we value some things more than others proves only that we do, in fact, value some things more than others. It doesn't prove that some of our values are right and others wrong, and it certainly doesn't prove that a perfect being exists somewhere out there to validate some of our values and invalidate others.

Thursday, February 10, 2011

An Important Difference

Last night, Sam Harris tweeted a link to a debate he had participated in a few years ago on the subject of religion's role in the end of the world. The debate itself is mostly awful, as these things tend to be, with the moderator helping Harris's opponent pile on him to the point that a frustrated audience member finally cries out "You're the moderator!" I'll link to the first clip, though, for the sake of a reference:

Something interesting did emerge in the course of the debate, though, namely the extent to which Harris's religious opponent seems to rely on his own personal experience as his most important source of evidence, and the extent to which his compatriot the moderator is instantly willing to accept that experience as better evidence than any of the statistics Harris offers in return.

Of course we already knew that this was true of religious people to some extent. Powerful personal experience of the numinous does, in my mind, serve as some evidence in favor of religious belief, even if it is vastly outweighed by counter-evidence from philosophy and science. But if someone accepts what he takes to be his most important convictions entirely on the basis of such evidence, we shouldn't be surprised to find him privileging that same sort of evidence in all cases.

Of course no human enterprise can ever get off the ground without input garnered from personal experience. But the value of science is that it gives us a set of tools for refining that kind of experience. This is why repeatability is vital to the scientific enterprise. If only one person is ever able to perform an experiment, we shouldn't put as much stock in it as we would if every competent individual who performed it got the same results.

This is why when Harris says that a single well-designed opinion poll of people in Gaza would be of more value than the personal experiences that his opponent continually cites, he's absolutely correct. It can be hard to accept that one's experiences might not be a perfect mirror of reality, but that's exactly why we shouldn't stick our fingers in our ears and pretend that they are, oh yes they are! At least if what we really care about is getting at the truth and not just maintaining our sense of smug self-satisfaction.

Wednesday, February 9, 2011

All Things Considered, I Feel Fine

You've probably already heard about Activision's announcement from earlier today--you know, the one that finally killed off the Guitar Hero franchise, at least for the time being. As someone with some pretty fond memories of the series, I feel like I should be more upset than I am.

My first experience with music games was on a trip to Japan in 1999. Beatmania was all the rage there at the time, but in one arcade, I also found what looked like a rarely used Guitar Freaks cabinet. A friend and I, both of us guitarists, tried it out, and found it utterly inscrutable.

Because of that, I had to be pressured into trying Guitar Hero, and in fact missed out on the series until its second installment. When I finally gave in and borrowed a copy of Guitar Hero 2, I was instantly sucked in. It helped that the songs I was playing were (mostly) the real deal, or at least pretty good covers. But beyond that, everything about the game just clicked for me. The best songs in that game even sort of approached something not entirely unlike the feeling of playing a real guitar.

Most of them were just damned fun, though. They continued to be damned fun when Guitar Hero 3 came out the next year. I didn't think much about Guitar Hero one day not being fun until the first time I encountered Rock Band. I remember there being a fair bit of skepticism in the game journalism community, about whether people really wanted a full band's worth of plastic instruments in their living rooms, but the first time I played "Wave of Mutilation" in Rock Band in a crowded Best Buy, I knew my relationship with Guitar Hero was over.

The same held for the vast majority of my friends, and Rock Band parties became our preferred way of interacting for at least a couple of years. Meanwhile, Activision ran Guitar Hero into the ground with way too many iterations, each one more mediocre than the last. As a result, I can't say I feel much of anything at today's announcement. Honestly, it was past due. Even if Activision brings back Guitar Hero at some point (they certainly left the possibility open), I doubt I'll be interested. Sometimes, it's best to let the past be the past.

Tuesday, February 8, 2011

Shedding Its Light Silently

A few days ago, I wrote about Square Enix's long fall from grace. Tonight, I was reliving happier times for the company, or at least for Square, by finishing the DS remake of Final Fantasy IV.

This is my favorite game in the Final Fantasy series, and is still my sentimental favorite game of all time (though Persona 4 edges it out just slightly as my absolute favorite these days). I first played FFIV when it was released on the SNES as Final Fantasy II, and the impact it made on me as a gamer was immeasurable. For years, I gave up playing anything but RPGs because of it. I had long conversations with friends about the future of gaming, in which I expected games to continue to look more or less like FFIV, but devote the ever-increasing storage capacity of cartridges and, later, CDs, to creating increasingly realistic worlds.

Of course things didn't turn out that way. The vision I had of a single player MMORPG that sacrificed graphical advances for story and player freedom is just now starting to be realized in games like Fallout 3, but it's still not much like I imagined it. What I had imagined was a kind of graphical Turing test, in which players could communicate with AI controlled NPCs and have more or less realistic conversations (with more or less realistic consequences). There would be an over-arching story, but the player would have as close to ultimate freedom in exploring it as possible.

Looking back on FFIV now, it's kind of amazing that it inspired that vision in me. The game is, by modern standards, aggressively linear. While it does at least give you access to an airship relatively early on, even that freedom is kind of illusory. Sure, you can fly anywhere in the overworld, but unless you've hit the right story triggers, you won't find much to do.

I suppose my desire for a maximally interactive FFIV came from the fact that (again, at the time) its characters were the most engaging I had encountered in a game, and I wanted more of that. Looking back now, it handles its more dramatic moments pretty ridiculously, but it still has some scenes that have scarcely been touched by subsequent games.

For example, I still love the scene in which Cecil, after having (inadvertently) destroyed Rydia's village and more or less kidnapped her in the aftermath, starts to win her over by turning on his own army to protect her. Rydia doesn't come around immediately, and Cecil doesn't pout when she fails to. He understands her anger and resentment, and gives her room to deal with it.

It's moments like that that have kept FFIV high on my list of favorite games, especially with American games becoming increasingly violent and misogynistic and Japanese games getting so lost in their own tropes that many of them have become self-parody. I think we'll see games that improve on those moments of realistic human interaction in the future, and I look forward to it. But in the meantime, I'd still be willing to play that JRPG Turing test that my friends and I dreamed up all those years ago.

Monday, February 7, 2011

Humorless

“Inception is full of brontosaurean effects, like the city that folds over on top of itself, but the tone is so solemn I felt out of line even cracking a smile.”
That quote from David Edelstein’s review of Inception has been with me since I first read it, kicking around in my mind as something I knew I'd have something to say about at some point.

Edelstein's review, despite his protests to the contrary, reads as if he went into the movie looking for reasons not to crack a smile. That aside, though, the above complaint stood out to me as particularly unreasonable. Edelstein makes no effort to support it, which leads me to believe he takes it to be self-evident that artists owe their audiences a smile here and there.

Or rather it would lead me to believe that, if I didn't find it unbelievable. Surely nobody actually believes that all movies ought to have comedic elements. Even those who leap to decry any work which treats its subject matter seriously as "self-serious" (there seems to be no worse sin in contemporary art) are probably inconsistent. Would Schindler's List have benefited from more ironic winks at the audience? If not, why? Surely it's not only that it's a movie about the Holocaust. Surely artists can treat other subjects seriously without being mocked for taking themselves too seriously.

A reasonable argument would be that Inception's subject matter is too fantastical to be treated as seriously as Nolan treats it. But that opens up its own can of worms. Is it ever acceptable to treat fantastical subject matter completely seriously? I see no reason to believe that it's not, though I would agree that it's incredibly difficult. The Twilight movies are an extreme example of why. Their dour-faced teenage vampires and werewolves mope about the perpetually rainy Pacific Northwest as if immortal creatures have nothing more important on their minds than high school romances. Even if you can get caught up in such a story while it plays out, spell out the premise objectively, and it sounds ridiculous.

The reason Edelstein's criticism stuck with me all these months is that it could be directly applicable to a lot of video games. So many games these days take place in worlds so full of grizzled faces and grim architecture that it's nearly impossible not to laugh at them. But that doesn't mean that games should always break up the angst and oppression with some laughs. It just means that they should be more self-aware. In turn, self-awareness in art doesn't entail ironic detachment. It just means having an understanding of where your story fits in the big picture. The story of Killzone is closer to Inglourious Basterds than Saving Private Ryan, but it's up for debate whether its developers understand that. On the other hand, the subject matter of Metal Gear Solid could be handled seriously, even solemnly, but Kojima constantly breaks the mood with stupid and inappropriate humor.

So maybe the problem isn't that artists won't "allow" audiences to smile. Maybe it's that artists and critics alike need to think more about when smiles are really needed.

Sunday, February 6, 2011

The New Retro

Working in a game store gives you a different perspective than you're likely to get just reading about games online (or even getting involved in the discussions with the other people who talk about games online). For example, it's amazing how many people don't realize that the PSP Go can't play UMDs, or have never connected a console to the internet. The majority of gamers are nothing like those of us who frequent trendy gaming websites and listen to their podcasts.

One trend that has become almost ubiquitous in the store at which I work is frat boys buying N64s and trying to recreate their childhood game collections. To some extent, this was inevitable; back when I first started searching for games on the internet, I was solely concerned with tracking down all the Atari 2600 games I recalled as my first gaming experiences. Lots of people who have fond gaming memories end up trying to recreate them at some point, whether that means digging their old consoles out of their parents' attic, downloading an emulator and scads of ROMs, or buying back as much as possible.

What makes this apparent trend of N64 nostalgia interesting to me is that it looks like the first step in finally moving beyond the threadbare trend of 8- and 16-bit nostalgia that is especially problematic in indie game circles, where it has been holding developers back from exploring original ideas for at least a decade. Don't get me wrong, I don't really want to see 8-bit nostalgia replaced by 64-bit nostalgia (new ideas are almost always preferable), but it would be a refreshing change of pace. How would this kind of nostalgia look? Would artists try to recreate the N64's muddy, low-rez textures and blocky polygons? Would graphics that were ugly even in their time suddenly become as chic as squat little 8-bit sprites have become?

Also interesting is the fact that this wave of nostalgia seems to be sweeping over the terminally unhip first. The self-aware hipsters who shop at my store still flock to the NES and SNES (and sometimes PS1, but only for games that look 16-bit anyway), leading me to wonder how bad the revisionist history will be in a few years when every indie game looks like Ocarina Of Time.

My prediction: it'll be totally sick, bro.

Friday, February 4, 2011

Is Square Enix Done?

Yesterday Square Enix reported a 76.6% drop in profits from this time last year. In the current fiscal year, the company has had only two million-selling titles, Dragon Quest Monsters: Joker 2, which was released only in Japan, and Kane & Lynch 2, which of course is an Eidos property. It's almost certain that the disastrous and ill-conceived launch of Final Fantasy XIV has a lot to do with the company's steep financial decline. All this should lead longtime JRPG fans to wonder whether the one-time leaders of the genre have lost the magic.

There are some changes that, at least from the outside, appear both essential and easy. Kill Final Fantasy XIV, admit it was a mistake, and move on. Stop milking Kingdom Hearts before it becomes as meaningless as Final Fantasy has become. Re-evaluate whether re-releasing your back catalog on every imaginable platform is actually profitable.

But those are all just stabilizing measures. What Square Enix really needs is something fresh and new that will capture imaginations in the same way that Kingdom Hearts did. But given its current financial peril, it can't just try to make another Kingdom Hearts, i.e. a big-budget epic bolstered by one of the most expensive licenses on the market. Instead, it needs to take a small project and turn it into a hit.

Of course that's not much different from saying they need to make lightning strike with pinpoint accuracy. But there are at least some guidelines they could keep in mind. Look to emerging platforms like iOS and Android. Take a chance on young, untested talent rather than giving a stalwart like Nomura final say in creative matters. Don't put so much faith in well-worn genres, or at least the purest examples of those genres. There are lots of clones out there these days, and people aren't going to buy one more because it has Square Enix's name on it.

As a long-time Final Fantasy fan myself, I want to see Square Enix succeed. But I want to see them do it like the revolutionary company they once were--not the benighted old guys they seem to have become.

Thursday, February 3, 2011

You're In Business, Jonathan

Jason Schreier has written an interesting little piece for Wired about auteurs in video games. Jonathan Blow provided some quotes, and while I would really like to spend some time discussing whether someone who "thinks almost all games are pretty bad" can qualify as an auteur, something else Blow said dovetails nicely with what I was talking about yesterday.

First of all, here's the relevant excerpt:
“For someone like me, who thinks almost all games are pretty bad, and who has very specific ideas about what he wants to make … I can very definitely say that the single-leader model is good,” he said in an e-mail to Wired.com, although he noted that he and THQ are not in the same business.
It's that last bit that interests me. Blow "noted" that he, as someone who makes video games and sells them for a profit, is not in the same business as THQ, who make video games and sell them for a profit. Of course "noted" is Schreier's word choice, but it's a strange one. That's a statement that cries out for justification, and Schreier takes it for granted.

Given Blow's past comments about not making games due to "crass profit motives," I think we can guess what he means. THQ is a business, and recently an increasingly nasty one, what with CEO Brian Farrell essentially saying he wants people to pay $100 for complete games. It's understandable that Blow would want to distance himself from that. It would be understandable even if he didn't appear to buy into the notion that making money is antithetical to making art.

Blow can try to convince us that his desire to sell games (which I can only assume he has, since he sells games) is different from THQ's, but nobody should believe him. Both want you to buy their games so they can make money. That Blow's pricing model is more consumer friendly doesn't mean he's in a different business. He isn't, and he won't be until he starts giving his games away, or only selling enough copies to cover his development costs.

You're in the video game business, Jonathan, even if you hate it.

Wednesday, February 2, 2011

Support and Business

I've strongly disliked everything Twisted Pixel has ever done, but their newest title, Gunstringer, actually sounds somewhat interesting. Far less interesting was IGN's exclusive reveal of the game, which couldn't have been lazier unless it had just been a cut and paste of a press release.

Ars Technica's Ben Kuchera was on Twitter decrying the IGN story earlier this afternoon. Kuchera seemed to be insulted that "sites that helped support" Twisted Pixel's previous games were made to wait on IGN's exclusive content to go live before they could publish their own stories.

Now I don't want to sound like Ben Paddon, but this business of game sites "supporting" indie developers worries me. I'm assuming Twisted Pixel thought the IGN exclusivity deal was in their own best interests as a business, and if they did, then I applaud them for going ahead with it rather than trying to perpetuate the myth that indie developers make games primarily to collect goodwill from press and fans. If you think it's a bad business decision, fine. If you think they should put the feelings of game journalists above promoting their games, you're delusional.

Twisted Pixel, no matter what else they might get out of making games, do it to make money. That's not meant to be a disparaging remark. They don't exist to support gaming blogs. And gaming blogs don't--or at least shouldn't--exist to support developers (even fashionable indie developers), but to report on them. That's the only relationship that's fair to readers.

Sad that it seems to be disappearing.

Tuesday, February 1, 2011

The Most Beautiful Song in the World

"Pink Orange Red" is the first song on the Cocteau Twins' EP "Tiny Dynamine." I've never heard anything more beautiful, and don't expect to.