Wednesday, April 30, 2008
A: "So, from your definition of the word 'game,' real life would be a game."
B: "Yes, I think life is a game."
A: "Then, is death Game Over or just a checkpoint?"
Friday, April 25, 2008
One of the challenges for a games-based classroom is transitioning learners from their onscreen experiences to real world applications. A game that teaches algebra should keep that fact well-hidden. Kids immediately get suspicious when threatened with something that seems too much like a learning tool. Instead, conceal the algebra training inside an economic or management sim along the lines of Zoo Tycoon (which conveniently would also teach about animals, basic geometry, problem solving, etc.), and ramp it up gently. But at some point you have to help the learner make the mental connection, the “oh wow” moment… to realize, essentially, that skills learned in interactive zoo management work in life as well.
That "oh wow" moment is key for learning, but not just for game-based learning as Matt suggests. It's critical to draw the parallels between what you're doing in a classroom and how it's actually used in the Real World, whether you use games or not. Without that connection, you run into all sorts of problems:
- Students learn rote facts and methods without understanding them in a broader context. When it comes time to apply their classroom knowledge, they have to go back and learn it all over again, because they never considered how to actually use it.
- Humans are inherently good at understanding and remembering stories, more so than random factoids. Course content is the latter; showing how it's used is the former. Without that context, it's harder and less efficient for students to learn the material.
- More to the point, a lot of students won't even pay attention if they don't see the value. If your class is perceived as just being an arbitrary hoop to jump through so your students can get a piece of paper, let's just say that you're not going to have your students passionately learning your subject.
And honestly, students are hungry for this understanding. If you teach a class and don't already do this, try it: one day, just take two minutes at the start of class to tell a story about how the stuff you're covering that day was actually used to do, well, anything useful or cool. See if your students don't pay a whole lot more attention for the entire day.
And this is a problem with a lot of college classes. Many professors have no idea how their course material is used in practice (career academics are especially vulnerable to this), or they know but they aren't telling. When I first took Linear Algebra, we learned everything except practical application, so I did the familiar cram-for-the-exam-then-forget-everything method of study. Then I took Computer Graphics, which was really cool, and I realized that maybe I should have paid more attention since we were using scaling, rotation and translation matrices on a regular basis. And then I took another neat course where we learned about the math behind sending a space shuttle into orbit, which required a whole lot of dealing with vectors and matrices. And then I worked in the industry as a game designer of all things, and found that you could use matrices to solve certain types of game balance problems. This would have been nice to know when I was taking the course!
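As a taste of the practical application that Linear Algebra course never showed -- purely my own toy illustration, not anything from those courses -- here's the 2D rotation matrix that computer graphics leans on constantly, sketched in Python:

```python
import math

def rotate_2d(point, angle_rad):
    """Rotate a 2D point about the origin using the standard rotation matrix:

        [x']   [cos t  -sin t] [x]
        [y'] = [sin t   cos t] [y]
    """
    x, y = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

# Rotating (1, 0) by 90 degrees lands on (0, 1), up to floating-point error.
rotated = rotate_2d((1.0, 0.0), math.pi / 2)
```

Scaling and translation work the same way: one matrix per transformation, and you chain them by multiplying the matrices together -- which is exactly why the abstract matrix-multiplication drills from the course turn out to matter.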
It's like a lot of professors out there imagine themselves as Mr. Miyagi from The Karate Kid. Wax on, wax off. Do that a few thousand times. After you're done, then I'll tell you how to kick the other guy's face in. That makes for great storytelling, but lousy pedagogy.
So, I see this as a huge advantage for professors who have actual, honest-to-goodness industry experience: we can share that experience with our classes. Why am I spending perfectly good class time talking about something abstract and obscure like positive feedback loops? Glad you asked, let me tell you about a game I worked on that had game balance issues because of a feedback loop that was unintentionally embedded in the core mechanics, and here's what we did to fix it. I'm not teaching you this stuff because the IGDA Curriculum Framework says I should, I'm teaching you the stuff that I've actually used myself on the job. So pay attention. (And they do, most of the time.)
Now, there is a danger here: you have to have the context but also the content. There is a perception in academic circles that the only thing an industry person does is come into the classroom and tell a bunch of entertaining war stories. You've gotta deliver the goods, too, so your students actually have the knowledge and skills that they're supposed to apply. But in my own experience as a student and as a teacher, there's more danger of too little context than too much.
Tuesday, April 22, 2008
- Encourage students to fail early and often. Being in school is the one time where you can do this without losing millions of publisher dollars in the process. BUT,
- Punish students harshly for failure. It's a tough industry, and classes should reflect that.
These aren't necessarily mutually exclusive, although it seems like it at first glance. The former is primarily concerned with taking creative risks: trying forms of gameplay that have never been done before. The latter mostly involves setting and achieving reasonable goals: controlling the scope of a project, keeping to a schedule and meeting deadlines.
However, the two viewpoints collide in a studio class where the output is a complete game -- when the students try hard but, in spite of their efforts, end up making a game that just isn't fun or interesting. As a teacher, do you grade them harshly, because a comparable professional project would mean their studio going out of business and everyone looking for new work? Or do you grade them generously for their ability to try hard, stick with a process and complete the project? Either way would seem to send the wrong message.
Saturday, April 19, 2008
It occurred to me the other day that this might not be the case for teachers. I've never heard of an instructor putting together a portfolio of their own students' work to show how much their students have improved under their tutelage, but I don't see why something like that wouldn't be valuable if you're marketing yourself as a first-rate teacher.
Likewise, a university might consider this for its promotional materials, the same way the beauty industry shows lots of before/after photos so you can see how much of a change their products can make. Again, I've never seen it done, but at the moment I'm having a hard time thinking of a reason why not.
Saturday, April 12, 2008
The conference itself was great; it was small (about 700 people, compared to the ~30,000 at GDC), which meant you actually had the time to have real conversations with people, without having to leave to say a quick 'hi' to twenty more. You had time to actually play games with other game developers. You got to meet, for the first time, people you'd previously passed a dozen times in the hallways of GDC, like ships in the night.
Some quick thoughts that I wrote down from all of my various side discussions with people, in no particular order:
- There's a common pattern in teaching game design: many students start out wanting to make a copy of their favorite big-budget game; as students, they have this huge gift that is the academic freedom to innovate, and they just want to make Something of War-something. Then they get into the industry, the novelty wears off, and they want the freedom to create and innovate that they no longer have. Those of us teaching from an industry background are already at the point where we value creative freedom, and we set up our classes to provide what we wish we'd had as students -- forgetting that we didn't appreciate what we had at the time, either. I'm not sure there's a way to fix this, other than to treasure the few students who are exceptions and hold them up as examples for everyone else.
- The term "independent" (or "indie") as applied to game development is vague, because it can mean any combination of three different things: business model (not owned by a publisher), money (low-budget, not AAA), or experimental gameplay (not just a derivative clone). It might be better to abolish the term "indie" from our vocabulary, and be more specific about what we're talking about.
- Women and minorities are still being horribly marginalized in the mainstream game industry (okay, no news there). But, most of the efforts to fix this so far have focused on attracting more of them to the industry. I'm thinking that an equally important piece of the puzzle is raising awareness within the existing industry that this is a major problem. In the past, I've suggested that every game designer should take Women's Studies as a class; I should add Minority Studies to the list. And I should specify that these would not be electives or suggestions, but required coursework for anyone seeking a game degree.
- Since the beginning of time, some games have been designed with technical constraints first. Today, it's something like "point-and-click is easy to implement in Flash, so what games can we make where the only player action is point-and-click?" A couple hundred years ago, it was "we have all these maps, what games can we play that use maps?" Three thousand years ago, it was "we have all this wood and rocks and pebbles, so what games can we play with wood and rocks and pebbles?"
Tuesday, April 08, 2008
However, the methods of cheating differ between a community college and a more typical four-year university. Honestly, policing a class at a community college is much easier.
At a four-year school, there are student dorms, and fraternities and sororities and student clubs, all places where students can save old assignments and tests to form a study bank (which forces professors to vary their test questions, or else have some students who suspiciously seem to know every answer as if it were memorized...). Most students have a social network of friends and they typically study together, which opens the door to having them do each other's assignments.
At a community college, ironically, there is no community; it's a day campus only. Students may have friends, but a lot of those friends aren't fellow students, so there's less group study. Most students don't stay around longer than two years, either, which limits the amount of old tests they can pass on to the next "generation" of students (since this generation is only one year behind them).
The net result is that it's easier to repeat test questions at a community college, without fear that my students are going to walk in with a study sheet cribbed from last year's exam. It's also easier to request that homework assignments be done on an individual basis, because a lot of students don't have the means to work in groups anyway.
Of course, the down side to this is that assignments that require work in a team outside of class are much harder. As with everything in life, there are tradeoffs.
Friday, April 04, 2008
It occurs to me that my students get course credit for taking my classes, but I don't get course credit for teaching them! Some day it might be nice to, you know, have a Bachelor's degree in game design from all the courses I've taught. But it's never going to happen, because I don't actually get course credit for teaching my courses.
In some ways it shouldn't matter. In theory, if I'm qualified to teach a course, it may as well be on my transcript anyway. In other ways it most certainly matters; if I teach a class at the graduate level when I don't have a graduate degree (yes, this can happen), it would be nice to get some credit hours towards my own graduate degree!
Tuesday, April 01, 2008
Whenever a student of mine gets a job in the industry, it reflects on me, personally (because most of the time, I'm the only person from industry they've had direct contact with in a classroom). In other words, my students may affect my ability to get industry contract work at some point.
The assumption is that if I taught them everything they know, then their skills and abilities are a reflection of my own. This isn't entirely true, of course, but it doesn't matter. A lot of people believe it's true, therefore it influences their perception, and perception is everything when it comes to reputation. I say "my" here, but this really applies to any industry-based teacher, especially at a school where they're the only one of their kind.
Sometimes this works in my favor. Last year I had two absolutely brilliant students who made it into the industry, and they're making me look good, through no fault of my own.
Sometimes this works against me. Maybe some day I'll have an absolutely horrible student, who somehow blunders into an industry job and screws things up horribly. If I'm asked to provide a recommendation I can be reserved about it, but beyond that, I have no defense against this. But it's still a potential black mark on my record.
I suppose if one is really paranoid, the best defense is to work for a university that has overly selective admission requirements, and get oneself installed on the admissions board. For the rest of us... I suppose we just have to cross our fingers and hope that we get more good than bad (and that maybe we can be enough of an influence on enough of our students to make the difference).