About this Author
Andrew Phelps is an assistant professor at the Rochester Institute of Technology, in Rochester, NY. He is the founding faculty member of the Game Programming Concentration within the Department of Information Technology and his work in games programming education has been featured in The New York Times, CNN.com, USA Today, National Public Radio, and other publications. Email: amp-at-it.rit.edu
April 26, 2003
I got up and spoke as one of Tim O'Reilly's "alpha geeks" at ETCON about massively multiplayer games. Immediately afterwards, someone rushed up to me in the hallway and asked, "Do you think there will ever be another successful MMORPG after Everquest?" Well, yes, I do. But there is a reason most of the second-generation (2G) MMOGs are failing - namely, that they deny the existence of the games that came before them.
In the "original" crowd of MMOGs like Ultima Online and Everquest, players arrived having never seen one before. It was brand new. And while you might have played with a small group of friends from work or home, the games threw you together into larger groups (i.e., "guilds") to take on end-game content. This was 5-7 years ago.
The problem with the 2Gs is that they mistakenly assume that players are once again going to trickle in by ones and twos and want new groups. That is not my experience. Instead, as guilds get bored or outstrip the content of their favorite games, they look for places to move en masse. The colony is looking for a new hive. This means that when I enter a game, I *have* a social structure already - I *know* who the leaders are, and whom I turn to for organizing raids and distributing loot. I *want* a game in which, regardless of game power, my position in the guild community is maintained. But none of the 2Gs are offering that - they only offer the ability to "start fresh". We want new games, but we also want to keep playing with the hundred other people that we know and love - and that is what is causing the next generation to fail: a callous insensitivity to that simple fact.
An example: I was playing Shadowbane with a few players from my EQ days. My friends and I formed a "group", which is a little construct that is well known to us all. But we were also in a "guild" by virtue of which city we started in. I didn't want to be in some strange guild; I wanted to be in *mine*. So then we leveled up some characters by exploring the world. And one of them leveled up faster than the rest of us, by virtue of the fact that he doesn't have to teach a night class. And he hit the magic 'player-killer' level and *woosh* - he was no longer in our guild and couldn't be in the same area as the rest of us. Hey, Mr. Game Designer - you just separated me from the friends I want to play the game with. Mistake.
Now I'm sure that there is a world-balancing reason for this, that it makes sense within the game. But the game companies are trying to move people onto their second-generation games (with the exception of Sony, who prefers to release expansions for EQ) - and design choices like this one work directly against that goal.
The next successful MMOG will be one that realizes 3 things:
1. There were two generations of these games, and a series of MUDs/MOOs before those. The game should make it easy to transfer my existing social fabric from whatever game I am playing now to the new world. It doesn't matter if I have a level 1 character that can be killed by a wet mop; I should still be able to be an officer in a guild structure that can all be created on day 1.
2. We want to communicate with people in the game, even when we're not in it. This is the reason that player groups have message boards and phone trees. People want to be able to find out, on their lunch break, what is happening - that even though they weren't there, the guild beat the big red dragon. That's still a victory, however vicarious. The player community has hacked together tools to do this - putting IRC and AIM inside of EQ, for example. Most game companies try to stop this - why?
3. The experience of some players matters a lot more than others'. Sorry, but it's true. If I buy a game, try it, and don't like it, then the game developer is out one customer. But if I log in, try a game, and decide it isn't a good fit for my guild, then all of a sudden I've turned off a whole lot of potential customers. Most guild websites these days for Asheron's Call, EQ, etc. have running polls on other games, about whether or not they are any good. When one of those games is perceived as "better" than what the guild is doing now, they "hop". I've seen this happen when a very large guild left my server on EQ for Asheron's Call. It means that the group is making the decision, not the individual player - but game companies are pitching to players, not to groups.
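To make point 1 concrete: the "social fabric" a guild would want to carry between games is a small amount of data. Here is a minimal sketch of what a portable guild roster could look like - the class and field names are hypothetical illustrations of the idea, not any real game's API. The key design point is that rank in the guild is independent of character level, so the whole structure can be recreated on day 1 with fresh level 1 characters.

```python
# A hypothetical, portable guild roster: rank is social data,
# level is game data, and the two are deliberately decoupled.
from dataclasses import dataclass, field


@dataclass
class Member:
    name: str
    rank: str       # e.g. "leader", "officer", "member"
    level: int = 1  # everyone starts over at level 1 in the new world


@dataclass
class Guild:
    name: str
    members: list = field(default_factory=list)

    def add(self, name: str, rank: str, level: int = 1) -> None:
        self.members.append(Member(name, rank, level))

    def officers(self) -> list:
        # Guild standing survives the move, regardless of in-game power.
        return [m.name for m in self.members if m.rank in ("leader", "officer")]


# "Importing" an existing guild into a brand-new game world on day 1:
guild = Guild("Phank")
guild.add("Alice", "leader")
guild.add("Bob", "officer")
guild.add("Carol", "member")
print(guild.officers())  # ['Alice', 'Bob']
```

A real game would need far more (permissions, chat channels, loot rules), but even a structure this simple would let a hundred people arrive together and keep their organization intact from the first login.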
Stop trying to design my social experience - instead, pay attention to the fact that I already have a very complex one. I want better tools to communicate and facilitate the actions of that group, not a change in the group or in the rules by which it is formed.
ETCON was a lot of fun. I'll have more stuff up in the next few days.
April 25, 2003
Liz [and friends] have started a blog on Social Software at Corante. Knowing Liz, and having met Clay, I'm sure that it will be a very interesting place. It is my belief that games and entertainment spaces will define a lot of the future of social software - I know Clay and I disagree on this. But already we are talking about some play between the blogs and getting some things going. Have fun over there, sounds like a great crew, and we'll all be reading!
April 23, 2003
I am here. I haven't run into tons of people yet; I'm sitting here eating some breakfast, learning the ropes and blogging in real time. Real... time.... technology still amazes me sometimes. The first session, in a half hour, is Howard Rheingold, talking about his book Smart Mobs, though everyone reading this blog should know him best from The Virtual Community. Lots and lots of fun.
UPDATE: Ran into Jane Pickard and Justin Hall from GameGirlAdvance! Hooray for having friends (although we'd never met). Had an interesting discussion with Jane that I want to write up later. Basically, I can't keep up with the conference - too fast to digest; it will take me a bit to get things out into words. But there will be coverage, oh yes, there will be coverage. A billion ideas crammed into your head in a compressed time. It's a good kind of 'ouch'.
I went to the Wednesday panel on DRM, but my machine died for a bit. Here is a better real-time blog than I could ever hope to make.
Many of the folks from the Social Software Alliance (SSA) are here - you should really check it out if you aren't here or even if you are and missed them somehow. I spoke at a session Tim O'Reilly did about stuff on his 'radar', of which what is happening in online games is an example. I'll flesh all this out into a full post. I used PHANK as an example of a player community. I wonder if anyone in Phank was there...
April 21, 2003
April 20, 2003
Several of you sent me love/hate mail after the Guilty As Charged article - which is great. Some readers apparently missed the sarcasm running through the article: I do not think games are a waste of time, but I do think that they are perceived as such. Anyway, there are several discussions going on about that thread at various sites. I'll link up the discussions; some of them were good to read. Also - we *are* working on comments on this blog, and an RSS feed. Really. I don't run the tech for the site, but it is important, and it will happen (not just by popular demand, but also by author request). I recently became involved in the Social Software Alliance - which seems interesting. I will be at ETCON next week, meeting lots of people and talking about multi-user games. I hear there is also this big glowing thing in the sky called the 'Sun' - I am on a fact-finding mission from Rochester, NY to see if this is true. Depending on the network situation there will either be (a) lots of small Got Game? updates during the conference as things happen or (b) one huge update sometime early next week. Either way, I'm excited, and I will record my thoughts. Also looking forward to the plane ride to better acquaint my thumb with my GameBoy. More soon.
April 15, 2003
So I was having this talk with some friends last week. Surprisingly, the same topic came up more than once, which doesn't happen all that often when it's outside our normal conversation paths. Liz was one of these people, and she was remarking about the fact that she had spent the night before playing Pokemon on her Gameboy Advance SP (which is a really nice piece of hardware). And she was sitting there in her office saying (to paraphrase), 'Yeah, I spent all last night playing games. Sometimes I wonder about playing games with my time; you never really have anything to show for it.' And I thought this was pretty interesting, because you don't really have anything to show for it. I mean, I have a level 9 million wizard in Neverwinter Nights - but so what? That doesn't really mean anything to anyone, and it shouldn't. As Liz and I were talking I came to realize something - she felt guilty. Guilty at having wasted the time playing a game, when other pastimes would offer a more tangible result. But then she said a very interesting thing. She said (again to paraphrase), "I know myself. I wouldn't have spent that time working. Don't get me wrong, I'm not lazy - I work a lot - but sometimes you need 'non-work' time. I would have spent it reading a book or watching a movie or something, but instead I wasted it."
Now that's pretty interesting, because it starts to point at a kind of societal acceptability scale for our free time, in which games rank last. Why is that? Doesn't playing games teach things like good logic skills and hand-eye coordination? Granted, not many of us make a living off of hand-eye coordination anymore, but still. Some folks have been using games very much like Sim City to teach economics and city planning in junior high. The military uses games to teach planning and strategy, as well as real-time reaction drills. But no, playing games is wasted time. And to be perfectly honest, I've sat up and blinked after a few good hours of Everquest and said, 'Boy, that was a complete waste of a day.' And I felt guilty. And in many cases it is a waste of a day: I didn't do anything around the house, or wash the car, or anything. But as Liz pointed out, I wouldn't really have done that anyway.
What I might have done is read. Or watched a movie. Or TV. If I was watching the Discovery Channel, that would be better, right? More productive than Animal Crossing? But if I were watching Dexter's Laboratory, would that still be better? If I were reading, that would clearly be better than all of them - does that include Star Trek novels? OK, I don't read Star Trek novels, but you get the idea. We, as a society, are quick to dismiss any relative merits of game playing as compared to almost any other media: but are we willing to apply the same scrutiny to what we actually do with other media? Part of this is that we really don't know a lot about what games teach us, or our youth. Don't worry, though: Joe Lieberman is calling for national funding to study the effect of games. I smell a non-biased study if there ever was one [insert dripping sarcasm]. But this is not just a Congress out of touch with mainstream culture. I offer you this challenge: think of something to do in your free time that is 'less meaningful' or 'worse' than playing the new Zelda, that doesn't involve breaking the law. I bet the list is short.
The final issue is one that a student at another university and I were discussing the other day. One of the groups on campus wants to have a very large LAN party on campus. (A LAN party is one in which the party members each bring a computer, network them all together, and then play a multi-user game with/against each other.) And the university in question wasn't sure if that was really 'a social event' that they could fund, because the students would just be 'staring at the computers instead of meeting and greeting one another'. Now, is that just ignorance on the part of the school officials because they have never seen a LAN party, or is that how we as a society think of gaming culture and collaboration? Is the stereotype still a teenage male in the basement all by himself? Haven't we moved beyond that?
April 7, 2003
I was having a conversation with a colleague the other day (before the power went out all over Rochester due to our Ice Storm 2003) about the development of the MS in Game Design & Development. And we were talking about the degree, and its overlap with our social software initiative, and how all of this ties together, and what we were going to do academically, and what courses they would take, and which path, and this and that, and blah blah blah. And something struck me in the middle of the conversation, which was: the study of games (and, more generally, of any other social software architecture) will depend in large part upon the social community of the degree itself - possibly more so than in other disciplines.
First off, understand that college is in itself an incredibly social thing. We (faculty) argue about courses and content, but there are a thousand other factors in whether students get the experience they need to have. Who they date. Where (or if) they do their laundry. What clubs are on campus, and whether they identify with them. Who they live with. Who they have classes with. Whether or not they like their classes. Whether or not they like their campus. (Both liking and not liking something have their pluses and minuses.) Millions of things that somehow form the collective event of 'being a student'. There are thousands of ways to do it, and some of them work for some people. It is my belief that studying social software will be impossible in a vacuum.
I can almost believe that people can study 'how to build X' without a lot of the above - provided there is a set of predefined goals and step-by-step instructions. Thus, 'how to build a castle that looks exactly like this with this set of building blocks' can be done either alone or one on one. No interaction required. But once we get to 'why' we would build such a castle, the whole thing changes - the faculty member is only a facilitator of the correct experience or set of experiences (who in fact has only so much control over the process, because all we are responsible for are courses), because there is no single correct response to 'why'. (Although one is tempted to give the childhood response of 'Because'.)
Life, in its absolute, becomes the vehicle of education, because to truly understand a culture, you have to be a part of it. I am not convinced that you can effectively understand what it is to be a gamer, for example, without being one (and if you try to, you will likely become a gamer before you can claim any reasonable measure of success). This makes such cultures very hard to study, and it also makes building a degree in gaming not only an academic exercise but also one of careful social engineering. This runs contrary to the very notion of the impartial observer and experimental theory.
I can see this with games, because games affect their makers so strongly. When students come to me, their questions are not 'will I study technique XYZ?' or 'will I get a job?' but 'how can I learn to make [game X]?' (where [game X] is a game that they played in childhood). Bard's Tale, Asteroids, Pitfall, Zelda - these are all popular choices. Yes, they all want to do it with new, fully immersive, ultra high-def 3D and whatnot, but generally people getting their feet wet are not interested in studying new forms of play - they are interested in lavishly recreating the old ones with better technology. And there is nothing wrong with studying the old forms first; indeed, it is difficult to explore new alternatives without first understanding the major genres and niches. But at some point originality is key... regardless of what we try to study, anyone in this culture invariably relates any idea back to a basis in another game, and anyone who isn't in this culture has long since left the room out of disinterest.
As I sit here (and I've had two prospective-student emails even while writing this, in spite of the fact that we only have a concentration at the moment and not a degree), I am floored by the number of people who wish only to recreate, albeit with better tech, that which has been seen and described before. And it is in part because they are a part of the gaming culture, and that is what drew them here. The culture is so iconic, and worships its past with such fervor, that it is nearly impossible to break the mold. You can see this in the chicken-and-egg problem described in the GDC coverage about sequels and licensed property, and in the venture-capital models that are only willing to fund games that are almost exactly like other successful games. I've seen game proposals that literally say 'We want to build a Quake-like thing, but that takes place in a kind of giant beehive with insectoid enemies'. ("Not that there's anything wrong with that!") This attitude is now beginning to permeate the study of games in academia, or at least the way we as faculty and administration are thinking about it.
Example question: What will these students be able to do when they graduate?
Sample response: Build games. You know like [insert example of modern successful game here]. They can be successful like company XYZ, who grew by four thousand percent last year.
So where do they learn to think outside of that box? If all you can point to are successful past instances and what made them successful, how do you learn the art of innovation? When I went to art school, no one (OK, a few people, but not a lot) said 'paint like this person does'. So why do we force developers (through a variety of market forces and cultural norms) to 'build games like that person does'?
This happens in part because we all still lack a frame of reference for the study of this. We all still lack a vocabulary (as was pointed out at the Academic Summit at GDC 2002; see a synopsis here). Academics are great at coming up with mumbo jumbo like 'A successful student will be able to effectively evaluate the competing forces of designs that produce a successful title in the electronic entertainment marketplace'. What did I just say? A student will be able to look at a game and determine which features are most like other successful games, and thus how likely it is that the game in question is going to be successful. Except - all of the huge successes in the games industry have been the oddballs, the things that have flown in from 'off the radar' to take the world by storm. Can you teach that?
I see the same problems in social software on the whole. We study the technology, but we do not study with the same effectiveness the culture that makes the technology useful - we look instead only at the past: how this is like / not like the telephone. How this is like / not like the television. And now we're starting in on how this is like / not like email - like / not like the MUD, or the MOO. And this is not meant as derogatory towards those many studies; it's a great way to begin to understand - to fit current phenomena into a frame of reference that can be understood. But it is not the only way to place this in a context, and it may be that placing it in several contexts at once could be more academically interesting. We try to separate content from presentation from delivery, when it is only by merging these things that anything usable, anything desirable, exists. Perhaps the age-old adage of 'break it apart into component pieces until it can be understood' fails in the study of modern computing/communication.
Are we already so far past the birth of the Internet that we can only stare now at the original birth of the browser, forever focused on our collective navel? Or is it deeper than that? It is easier, and one can claim more success, if we focus solely on what we already almost know. It is easy to claim advances in delivery mechanisms, leaving for someone else how they will eventually be used. And that is good and important work, basic research at its core. And we can study how something is already used, what made it successful - digital archeology, even if it happened last month. But how do you get at the in-between, the thing that springs with/from a culture and produces technology molded so form-fittingly around it as to nurture it and grow it, producing not only a product but a set of people with it, in unison, the way online games have done since their beginning? How do you study that? How can you begin to come to terms with that?
Today is a day my job feels hard. I took a duck in the face (an homage to Pattern Recognition).
UPDATE: An interesting counterpoint to this was presented here by Clive Thompson. Worth the read, he thinks I'm nuts (which may be true).
April 3, 2003
Greetings, Professor Falken... All references to bad movies starring Ferris Bueller aside, there seems to be some recent impetus with games and the military (OK, not so recent, but recently publicized given world events). The New York Times reports on soldiers training through the use of games, including a virtual Iraq. Will Wright has been having all kinds of Sim fun (possibly with the CIA), reported in a recent USA Today article and mentioned at Ludology and GameGirlAdvance. I can almost picture a little Sim Hussein running around with player-community packs for army outfits and super special desert fatigues. Spy glasses to fool the American intelligence community are a special download for $3.99.
The military training with games technology is nothing new. They were doing this with Flight Simulator a long time ago. What's interesting is that they are starting to try and capitalize on "gaming culture" (a la America's Army), and in part on the idea that players are desensitized to killing after doing it over and over in a game (which I argue is a false assertion - no amount of virtual carnage can compare to the known act of taking a human life). Is it true that somehow "gaming culture" may be more suited to acts of war? Indeed, the very imagery seen in the last Gulf War and in this one has made war a sort of spectator sport; all you need is a joystick to take part (a la Ender's Game by Orson Scott Card). It's interesting to see the military reference EG in the NY Times article above, and yet they missed the subtle point at the end of the story: that Ender Wiggin himself is destroyed by what he has (unknowingly) done. A careful read of EG is really about psychological conditioning and the warping of a young mind, to the point where Bean does not immediately recognize that the killing is even wrong... I am reminded of Real Genius, when Lazlo is told that what he was doing was killing people and he 'cracked', living in the basement of the university for years on end.
I buy all the arguments that simulation is cheaper than the real thing, that virtual places are great for wargames. I buy the argument that training in virtual space saves lives because of fewer accidents. I buy the arguments about reaction time and mindset, about drilling for combat scenarios in differing terrains and environments. I can take almost at face value all the arguments presented by General Paul Gorman at PoP! Tech (Windows Media video stream here: http://stream.knowtechnology.net/poptech/ ), which argue for the marvelous effectiveness of games as training tools for the military. But I am personally horrified at the thought of actually implementing a system like the one described in Ender's Game, or any one of a thousand other sci-fi scenarios.
My reasoning goes something like this: The point of going to war is to win - and if you don't want to win, don't go to war. More formally, the point of going to war is to reach one's political objectives through the use of military force. Thus, there is (or at least should be) some end condition in which one side will perceive itself a "winner" and cease hostilities. Or, if you choose the inverted view, one side will perceive itself the "loser" and surrender by capitulating to the political demands of the "winner". (I am using "winner" and "loser" in the common game vernacular, note that in real warfare these are not nearly so absolute).
The point of a game is not to win. The point of a game is to have fun. Most people associate winning with having fun. But I could build a game that just starts up and immediately gives you 8 billion points and says 'you win'. Fun game? Lots of replay value? And yet, such a scenario would be preferable in war, if you 'won' without the battle, perhaps by amassing all your troops along the border and intimidating the enemy into surrender. If I played the game described earlier, I would want my money back. If I were a soldier, I'd be high-fiving my platoon.
Games can offer a very real sense of what I will label 'pseudo-warlike mentality' (there are many formal studies that cover this in greater depth). Players are under pressure to stay alive, advance towards strategic goals, and slay enemies. There is a great deal of camaraderie and gallows humor in multi-user games that allow players to serve in a pseudo-military unit (either with or against each other). This even works in games in which the backdrop to combat is fantastical - Everquest or Star Command. But the analogy doesn't really hold.
One of the traditions in my EQ guild is that we sacrifice a gnome whenever we fight a dragon. We send someone in to die. There are some strategic reasons for this, and often in real combat soldiers need to be sent into impossible situations (just like a single gnome against a dragon). But the gnome, unlike the soldier, isn't really going to die. My guild laughs when the gnome gets eaten, with a lot of good-natured (and sometimes profane) ribbing. Death, in a game-world has very little meaning. When you attach real-world death to a game, it fails to be a game, because the very goal of a game (to have fun) is lost. Instead, you have a system that uses some display and AI routines to simulate a very serious exercise.
Isn't the whole problem with a lot of the despots we've seen over history that they regard war as a game? They look at the cost sheets and balances, think of the little men on the board and where they can move, how many are projected to live and die, what kinds of equipment and terrain they have to deal with - and miss the whole damn point.