Showing posts with label Culture Shock. Show all posts

Thursday, March 03, 2011

Culture Shock: Broken Terminology and How to Fix It

Going through the Expo floor at GDC is a unique experience. Like a highly expressive game (Minecraft or Sim City, for example), the experience is different for each person depending on their personal actions and goals. Students and unemployed developers looking to get jobs will be networking in the aisles, hanging out at the company booths, and maybe walking randomly through other areas to pick up swag or play with cool-looking toys. Professors network with industry on behalf of their students, and also visit the other school booths so they can all compare notes on who is best representing themselves. Exhibitors do... um... whatever they do. And so on.

Personally, I walk around looking for themes. This year I saw heavy representation of IT companies, cloud computing services, social media support services, middleware, geographic regions trying to attract companies, and schools trying to attract students. (Note the irony in that last one: most students at GDC are already enrolled at a school, so it's a bit late to recruit them.)

School booths are interesting. Some show off student projects and allow you to play them. Others just show video. One showed a bunch of design docs and art bibles and other written works for a current student project in progress. Most have at least some printed documents talking about their academic programs, classes and curriculum. I always look at these to see what they're teaching kids these days.

One particular school I encountered, I won't name names, had a degree in "game design" (those were the exact words used). When I looked at the core classes, I saw: a token programming class, a level design class, and a dozen art/animation classes. Hopefully anyone reading this immediately sees the problem here. Obviously, the school did not.

I brought this to the attention of someone at the booth, who pointed me to a director-level person in the same booth. This director looked at me like I was from Mars, as if she couldn't understand why I was concerned or what the problem was. It concerns me because this isn't just a problem with a single school; it becomes a problem for every school. Imagine if a biology degree at School A were equivalent to a chemistry degree at School B, a physics degree at School C, and a science degree at School D. Biotech labs would have to sort all of this out to figure out who actually learned the skills they need to hire. Even if just one school screws this up, the message to industry is "ignore the name of the degree on every resume you receive, because it might be lying to you about what it means." But I'm an outsider to that school, and I have no influence to fix this myself, even though it hurts me too, even if I'm teaching somewhere else.

I think the solution here has to come from industry. I would love to see more industry professionals stopping by the school booths, taking an honest look at what they're presenting, and calling them out on it -- in public, right there on the show floor, if need be -- if they are peddling the academic equivalent of snake oil. If every developer that passes through the expo takes five minutes to do this with just one school, any school that is just blatantly lying to its students about the value of its curriculum will hear that message over and over, loud and clear. So... any takers?

Sunday, February 20, 2011

Games are Publications

I was just asked an interesting question by a Ph.D. candidate: "how would I, as an academic researcher, contribute to the field of video game design?"

This is interesting because it seems so straightforward and obvious, but thinking it through led me to a series of conclusions that show both the similarities and the differences between academia and industry.

Here's the "obvious" answer (well, obvious to anyone in industry): you don't. Industry largely ignores academic research. This isn't because game designers are a bunch of haters, it's for purely pragmatic reasons:
  • Academic papers tend to be "rigorous" which is a nice way of saying that they take forever to read before you get to the useful parts;
  • Even if we did have time, there's a dearth of peer-reviewed academic journals that specifically address game design, so we would have to hunt through all kinds of unrelated journals just to find something that's even relevant to the field;
  • A lot of academics have no understanding or experience of industry, and therefore publish research that is useless to practitioners, so you have to read through multiple game design papers just to get one that you can use at all.
All of these things mean that finding useful academic research takes an awful lot of time, and time is the one thing game developers never have. We're working on a game, for Pete's sake, and it has to ship yesterday. Who has time to read through journals? We'll read Game Developer Magazine and maybe some articles on Gamasutra, but that's the most we can hope for. And publishing there won't get an academic researcher promotions or tenure, so forget it. Hence, most of the time, researchers don't bother.

But wait -- does that mean that the field of game design is stagnating, if there is no way to push cutting-edge research to the field? Quite the contrary; we see innovative and iterative designs all the time. So how does the field build on itself, if there's no research? Thinking about this uncovers a flaw in the original question: it is built on some assumptions about the function and form of academic research.

In the sciences, at least, here's how it works. Suppose you're a research faculty member. You do some research. You publish your results in a peer-reviewed journal. Professional R&D folks in industry follow at least the top-tier journals to stay current on cutting-edge techniques and technology. The academic journal represents a primary source of knowledge that builds on itself over time. The original question assumes that game design works like this too. It doesn't.

Here is how professional game designers do research: they play games. Unlike other parts of a video game, the design is laid bare whenever you play. By playing you can explore the mechanics that were designed. Any mechanics that are hidden from you, such as combat formulas or enemy stats, can be found in a published hint guide (which is the closest thing we have to a public design document, most of the time). This allows us to analyze and study games directly. We ask questions like "how do these mechanics create a positive or negative player experience?" and "why did the designer choose to implement that feature in this particular way?" and "what are the weak points of this game, and how would we fix it?"

It is really a wonderful form of "publication"; imagine if scientists did not merely publish the results of their experiments, but also made their petri dishes and cell lines and laboratory equipment and whatnot available, and these were included in each journal so that the reader could precisely replicate their experiments at home! This is what published games are for game designers. Play is the game design equivalent of reading an academic journal. (Oh, how I love this field.)

So, this brings us to the non-obvious answer to the original question: to contribute to the field of knowledge that is game design, make a game and release it. If your game does something interesting, game designers will play it, analyze it, pick it apart, learn from it, and incorporate its lessons into their future designs.

It also means that all game designers, whether in academia or industry, are doing cutting-edge research, and every published game is peer-reviewed.

Wednesday, January 19, 2011

Placement of Students in Industry

Wow, it's been a while since I wrote anything here. The busiest schedule ever will do that to some people, so for those of you patiently waiting here, I apologize.

I just finished having an epic Twitter discussion with @bbrathwaite and others today, and it made me want to write out in long form something that's been bugging me a bit lately.

Entry-level jobs in the game industry are definitely not one-size-fits-all. The best entry-level jobs offer outstanding work environments under amazingly talented senior staff; students who land these kinds of jobs are likely to learn a lot, and to go on to positions of prominence in their own right years later. The worst entry-level jobs are little more than meat grinders, throwing inexperienced students into a bullpen and working them to a soul-crushing death on largely uninteresting and unrewarding projects, without providing much in the way of learning opportunities (let alone decent pay or benefits). The majority of jobs fall somewhere between these extremes.

Likewise, students themselves fall along a bell curve. Some are superstars, some are abysmal, and the majority are somewhere in between. Now, the really terrible students probably won't even graduate, let alone make it into the hyper-competitive game industry, so that problem solves itself. The mediocre students can get mediocre jobs, and hopefully the reality of the industry will give them enough of a new perspective to bring out the best in them (or conversely, they'll decide that the industry isn't all it's cracked up to be, and they'll gracefully exit); again, problem solved. But what to do with the really amazing students?

My personal feeling is that the really amazing students deserve better than the worst the industry has to offer. I'm talking about the students who have already distinguished themselves before they graduate -- the ones I would hire myself, in a second, if I owned a game company. I do not want the best and brightest our schools have to offer getting thrown into a meat grinder. There are better jobs out there, and I would like to see the most deserving students get the best opportunities. Ideally, their school (or at least one of their instructors with industry connections) helps place them in a good studio. At minimum, they should be taught how to sniff out and avoid the really bad studios, how to detect the warning signs of a toxic work environment.

Mine is not the only school of thought on this matter. Maybe you'll recognize some of these attitudes:
  • Industry experience matters a lot. Even the best student can't hold a candle to an average person with even 1 to 2 years of experience on a real development team. Don't hold students in such high esteem. (To which I would respond: as a teacher, I'm supposed to disrespect my students?)
  • The first job always sucks. That's typical for the industry. Newly-minted graduates need to "pay their dues" just like the rest of us. (I would say: just because something is commonplace doesn't make it right.)
  • Don't forget how competitive the game industry is, especially these days with so many layoffs, and industry-experienced people applying for entry-level positions. Any job is better than no job, and even the best students should be thankful for even the worst opportunity. At worst, they can still add "industry experience" to their resume. (I think this sets up a false choice, as if a student's only options are "bad job" or "no job." As I mentioned, there are hugely positive entry-level experiences out there, even if they are rare. Maybe I'm too much of an idealist, but I think that a few rare people are good enough that they deserve better.)
I wonder, though, if this comes down to a difference between the viewpoint of an educator and that of a hiring manager. I'm thinking primarily about what is best for my students; they are thinking about what is best for their company; and the two are not always the same.

Friday, June 11, 2010

Takeaways from GECS

I'm just getting caught up from attending GECS last week, where I met a bunch of really awesome people. The focus of this workshop was using games to teach STEM courses; usually the crowd I hang around with is game developers who got into teaching, but here I saw more educators who were taking their first steps into games, so it was a bit of a different perspective. Here are the lessons I learned:

There is interest in games beyond "game development" schools and departments. Some traditional educators see games as a means to an end, a way to make their content more accessible. From their perspective, they couldn't care less whether it's games, or inquiry-based learning, or circus clowns, as long as it gets their critical course content to stick in their students' brains. This is certainly not always the case -- there are plenty of professors who delve into games because they are gamers -- but there are others who are unfamiliar with games and are still trying to use them because they want to be effective teachers. The game industry (especially those of us who teach) needs to reach out more to other departments, rather than staying in our own comfort zones.

Games are not the only way to teach. While some "serious games" people like to tout games as a panacea that makes all learning activities more fun and engaging, the best examples of so-called "games" that I saw were not taking advantage of interactivity so much as of non-game elements that happen to be engaging. One example, by engineering professor Brianno Coller, illustrates this. He opens a course in Control Systems by presenting a racing-car game, where the car is controlled by some very simple source code. It starts out not doing anything; he tells it to move forward, and the car drives straight into the first wall. He then tries to get it to take a corner by steering towards the center of the road, with the tightness of the turning proportional to how off-center the car is -- if you're at the side of the road, you swerve hard, while a slight displacement only requires a slight correction.

This seems intuitively like it would work... but when you run the simulation, something strange happens. The car takes the first turn, but then starts veering wildly out of control, vastly overcorrecting for its position, until it eventually gets so far out of line with the road that it crashes into a wall. This leads into a discussion and exploration of why that happened, how to correct it through a phase shift, and all of the calculus and other heavy math that you need to derive it. He has found that this method of teaching is far superior to simply diving into the equations with no context.
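That overcorrection is easy to reproduce yourself. Here's a minimal sketch of my own (an illustration, not Coller's actual course code; the function name and all parameter values are my arbitrary choices): a car steers toward the road's center in proportion to its lateral offset, but acts on a slightly stale offset reading. With essentially no lag, the car settles into a bounded wobble around the center; add lag, and each correction arrives too late, so the oscillations grow until the car would leave the road.

```python
from collections import deque

def max_offset(delay_steps, steps=4000, dt=0.01, speed=1.0, gain=4.0):
    """Largest lateral offset reached by a car that steers toward the
    road center in proportion to a delayed offset reading."""
    x, heading = 0.5, 0.0                 # start half a lane off-center
    readings = deque([x] * delay_steps, maxlen=delay_steps)
    worst = abs(x)
    for _ in range(steps):
        stale_x = readings[0]             # oldest reading = sensor lag
        heading += -gain * stale_x * dt   # proportional steering correction
        x += speed * heading * dt         # lateral drift of the car
        readings.append(x)
        worst = max(worst, abs(x))
    return worst

print(max_offset(delay_steps=1))    # ~0.5: bounded oscillation
print(max_offset(delay_steps=25))   # vastly larger: unstable overcorrection
```

In control-theory terms (my gloss, not something from Coller's talk), the standard cure is to add phase lead, e.g. steering partly on the rate of change of the offset rather than the offset alone, which damps the oscillation instead of amplifying it.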

Is Brianno's course superior because it uses games to teach? I don't think so. Instead, what he's doing is opening his lecture with a real-world mystery, something the students can see that is interesting and counterintuitive, and then he goes through the course material to solve it. Once he's got that "hook" the students are much more interested in learning the material, because it's not just a bunch of random facts and equations anymore... the learning has a purpose. And while that mystery may be presented within a game world, I don't think it's the game that gets student interest as much as the mystery itself.

A storm is coming, and it is going to suck. One concern I'm seeing from a number of people is that game industry growth is not keeping pace with the number of graduating students from game-related programs, and yet the number of academic programs is still increasing. As a result, I think the industry is going to get more and more competitive over time, and things are going to be pretty rough for students for a while (until we find some kind of equilibrium). Corollary: it's likely that we will see more industry "abuse" of fresh students, in terms of expecting long hours and lower pay, since there is more labor supply than demand. Reputable schools should warn their current and prospective students about this trend. (Don't worry about dropping your enrollment numbers; in practice, you're not going to be able to talk most students out of choosing a game development major, anyway.)

Another storm is coming, and it is also going to suck. One by-product of the many industry layoffs this last year is that a lot of ex-developers are considering teaching as a career, which is a great thing. However, to save costs, a lot of schools have been taking advantage of this by hiring more adjuncts and reducing their full-time staff. This is exceedingly dangerous on the part of the schools that do it, and here's why: the game industry is cyclical in nature. When the next upswing hits and the industry goes on a hiring binge again, schools can expect at least half of their adjuncts to leave. If a department that used to be 50/50 between full-timers and adjuncts goes down to 20/80, and then half of the adjuncts leave, the result would be devastating.

We think there are more academic standards than there actually are. How many schools has the average faculty member taught at? I don't know, but the answer seems to be pretty low. And yet, a lot of people I talked to just assumed that their experience would extrapolate to every school in the country. One example is the assumption that adjuncts always get paid less than full-time faculty; I've run into some schools that pay them about the same per course (it's the same course, after all), and other schools that actually pay adjuncts more, on the theory that (a) they need to partly make up in cash what they don't pay in full-timer benefits, and (b) a lot of adjuncts have day jobs, so teaching is effectively "overtime" work for them, and they need the extra pay as an incentive to put in the extra hours. Another assumption is that full-time faculty always teach a certain number of courses each term; I've seen requirements of anywhere from five courses per term down to one course per year, depending on the school, the department, and how much research the faculty member is doing outside of their classes. Another assumption: everyone complains about how hard it is to work across departments because they are "silos," and yet I've seen some rare schools where inter-departmental collaboration is the norm. It seems to me that each school is different, and there are few if any standard practices that really apply everywhere. I was just a bit surprised at how many career faculty seemed unaware of this.

Thursday, December 17, 2009

Escapist article

My debut as an Escapist columnist was just posted. It amazes me that I'm at a point where I can just send emails to some well-known game developers asking interview questions, and they actually respond. I'm still not sure how this happened.

For students, I'd also recommend reading the editor's note to this issue. It describes a little of the culture of game development, and is a reminder that this is not just an industry of gamers, but of human beings. Sometimes as a rabid game fan it is easy to lose sight of that.

One interesting thing I just realized is that you might or might not be able to ask the same question ("what games do you keep playing obsessively?") of teachers. Yes, many game development teachers are rabid gamers... but I've run into more than a few that have no personal interest in games. But I don't think I've ever met a game designer who didn't love games. It's strange, the differences between the two worlds.

Now, I just have to decide whether The Escapist counts as a peer-reviewed publication for purposes of my CV... probably not.

Wednesday, November 18, 2009

Culture Shock: the role of policy

In academia, there is less of a team spirit than there is in the game industry. Probably this is because there is less of a threat of, say, an entire department getting laid off because their collective product didn't sell enough units at retail. This difference has many manifestations.

One difference is in how closely people follow written policies.

In industry, while most workplaces do have some kind of Employee Manual with a list of policies, these are usually seen more as guidelines (except in the obvious cases where there would be legal repercussions if the policies are ignored). Getting an exception to, say, a sick leave policy is a matter of talking to your boss about it and having a good reason, especially if that reason ultimately benefits the team and the game.

In the academic world, policy seems a lot stricter. Asking for an exception is essentially asking your boss to go through some kind of appeals or justification process on your behalf. It is asking for more work, in a world where everyone already has quite enough work on their plate, thank you. So such a request is far more likely to meet a stone wall. You are more likely to see bosses and administrators hiding behind official policy than explaining it or working around it, because following the rules is the path of least resistance, and there isn't much personal reward for putting in the extra effort.

For my peers in industry considering academia, this is one of those annoying things you can expect to run into. Ultimately, it means you have to choose your battles carefully, because you won't have enough time to fight over every silly little thing (at least, not if you expect to get all your work done).

Obviously, not all schools are this bureaucratic, and not all game companies are this relaxed. I'm talking general trends here, based on my experience. As usual.

Sunday, May 03, 2009

Culture Shock: Academic Freedom vs. Industry Constraints

When reading about Brenda Brathwaite's series of non-digital games (this includes games about such heavy topics as the Middle Passage, the Trail of Tears, and the Holocaust), it struck me that this kind of project would never happen in the game industry.

I don't mean that it would never get publisher funding. I mean, it wouldn't, but that's not my point. My point is, even if it were on her own time with her own money outside of work, this would never be allowed to happen.

Think about it. Suppose you were a working game developer and you casually mentioned to some co-workers that you were thinking of making an art piece and showing it at galleries, and that the topic was highly controversial and this was sure to have a lot of people cheering, and a lot of other people up in arms. How many nanoseconds would it take before your producer found you at your desk and asked you very nicely not to do this, out of fear that the Company would receive negative media backlash, and this is the last thing we need when we're courting three publishers for our next contract, so if you're interested in working on non-digital games maybe you could make something about fluffy bunnies instead? (I suppose some companies make controversy part of their business plan, but I'm talking about everyone else.)

This is a completely different paradigm than academia, where the whole concept of tenure is (at least in theory) supposed to be about the freedom to do anything, no matter how controversial. As an academic, you actually get support for things like this. You can sometimes get funding for things like this. Not everywhere, I'm sure, but it seems more likely that a random school will at least not get in your way if you want to take on a controversial product, compared to a random game company. One more point to consider if you're considering a career in either and you prefer to have total creative freedom.

Sunday, March 15, 2009

Culture Shock: Learning Disabilities

Autism. Asperger's. OCD. ADD. ADHD. Tourette's. Bipolar. You name it, someone in the game industry has it. Probably several someones, and probably at least one someone who is incredibly successful.

For this reason, it's hard for me to even call these "disabilities" -- given that the word "disabled" literally means that the person is not able to do something, and clearly it is possible to make games regardless of what psychological label might be applied to someone. But then, I'm not a psychologist.

For the most part, people in the game industry don't care if you've been diagnosed with anything, as long as you can help them make great games. You could be criminally psychotic for all we care, as long as it doesn't impact the development schedule. (Okay, I exaggerate. But only slightly.)

So, it took me by surprise the first time a student gave me this little slip of paper from the campus office of disabilities, several years ago (I've since gotten used to this ritual; it seems there's always at least one per class, and usually more).

For those of you who have not taught before, here's how it works: the student brings you this paper that gives you (as the teacher) no practical information, except to tell you that the student requires some special privilege (commonly, extra time and privacy when taking exams). You have to sign it -- in all the places I've taught, I've never been allowed to keep a copy -- and then the student takes it back. Presumably it gets filed somewhere, I don't know.

And then, naturally, you forget about it, because you're not allowed to keep a copy. Until exam time comes, and you remember that two of your students have special requirements, but you can't remember which students (many students with so-called "disabilities" are quite high-functioning), and one of them might have dropped your class a few weeks back anyway. Oops. I've been doing this for a few years and I still manage to screw this up most of the time.

The most frustrating thing, though, is that you're given no information about how to teach more effectively. I understand and accept that we're dealing with confidential information on a need-to-know basis, and I will often be getting the bare minimum of relevant information. But this conflicts with a desire to teach properly, and if I knew whether (for example) talking more slowly or repeating myself would help or hurt the situation, or whether making my lecture notes available would be useful, or whether I should avoid calling on a student in class because it would embarrass them... well, it'd be good to know, but there's no way for me to find out without a confidentiality breach.

The obvious thing to do in these situations is to talk to the student directly, and simply ask if there's anything you can do... but often the student doesn't know, because they aren't a professional educator.

Best solution, I suppose, is to take matters into my own hands. Read books on as many of these disabilities as I can find, particularly any that might give clues on how to teach better, and hope for the best.

Tuesday, January 20, 2009

Awkward Moments

A short collection of social awkwardness as experienced by a game-designer-turned-educator, in no particular order:
  • Having several students admit that they played a game you worked on, when you know the game in question wasn't particularly good. (Additional awkwardness: when the game in question is M-rated, and you know that the students were underage when they played it.)

  • Giving a game design constraint for an in-class exercise, and repeatedly being asked questions about the exact boundaries of the constraint... and realizing simultaneously that my students are trying to weasel out of the constraint (and that I should be annoyed), and also that my students are trying to precisely define the constraint (which is an important skill for game designers, and something I should be proud of).

  • Witnessing a student fall asleep in class, and hoping that it's because the student got no sleep and not because I've really become that boring. (Additional awkwardness: waking the student up, and hoping that I've done it in a way that doesn't cruelly humiliate them.)

  • Assigning homework that's not only easy but actually fun, and seeing that half the class didn't bother to complete it. And then wondering if my definition of "fun" has changed.

  • Writing something out (an assignment, a syllabus, an email, etc.) that I thought was clear as could be, and having students not understand it. This either means I'm not as good a writer as I thought, or that my students aren't functionally literate, or that my students are lazy... and no matter which it is, there's nothing I can be happy about.

Tuesday, December 30, 2008

One Myth About Teaching

I was confronted with the opinion the other day that teachers are overpaid because they only have to work 9 months out of the year but get paid for a full year. I'm pretty sure the person who expressed this opinion has never actually met a teacher before, but it seemed like a thought that a lot of people might have. So, I thought I'd set the record straight.

First, a teacher in any field typically gets paid a bit less than a working professional in that field, even though they have to know just as much (if not more). This is why a lot of teachers feel underpaid for their work -- because with their qualifications, they could make more if they weren't teaching.

As for being underworked, I know of very few teachers who sit idle on summer/winter break. In my own experience, the time fills up fast:
  • There's a lot of prep work to do for classes before they start: revising syllabi and course content, evaluating new textbooks, and keeping current with industry trends all take time.
  • If you're teaching any brand new courses, you have to develop everything from scratch, which typically takes about as much time as teaching the course itself (i.e. one new course = two old courses, in terms of time commitment).
  • Keeping professional skills sharp is important. Over breaks I usually end up doing some kind of freelance contract work.
  • Ever heard of summer and winter classes? A lot of teachers hold classes over these supposed "break" periods.
  • And of course, during the academic year teaching is a lot more than just a 9-to-5 job. In theory you're supposed to have a 40-hour work week, which is 4 or 5 classes if you're full time (that includes face time in lecture or lab, and also out-of-class time spent grading). But in addition to that, you have other duties: academic advising, office hours, faculty meetings, and (if you're really unlucky) being on a committee.

In reality, teaching is more than a full-time job.

Does that mean that these thoughts of "lazy" teachers who only work "30 weeks out of the year" are completely inaccurate? Unfortunately, no. It is possible to reduce the workload. You can hold office hours for your classes simultaneously, and then use the time to get other work done if no students show up (although this means you'll end up treating students like they're interrupting you when they show up for scheduled office hours). You can just copy your course notes from earlier classes without updating them, which reduces prep time to almost zero (but then you cheat your students out of a modern education). You can set up your assignments so that they're easy to grade (but anything easy to grade is usually not that meaningful -- for example, you can tell a lot more about a student's understanding by reading an essay than you can get from a multiple-choice question, but multiple-choice is easier to grade).

So, it is possible to have lots of time off, work 40 (or fewer) hours per week for 30 weeks a year, and have the rest of the time free to... um... do whatever teachers do when they're not working. But so far, the only way I've found to do that is to cheat your students. If you want to be a good teacher, forget any thoughts you had of annual three-month vacations...

Thursday, August 21, 2008

Culture Shock: Deans and Heads and Chairs, Oh My!

When I was a student, my only real faculty contact came from my professors. Sure, there were all these other people out there with titles like "dean" and "department chair" and "provost" but I never had any dealings with them, nor did I have any idea what they did. I had no concept of departmental politics or inter-departmental territorial disputes; I couldn't see beyond the exam next week.

Now, as a faculty member, I see all this stuff (even though I sometimes wish I didn't... sort of like if you enjoy eating sausage and then see how it gets made). But it occurs to me that it's still off the radar of most students (and, indeed, most industry professionals who have some dealings with educators).

I'm not sure what, if anything, to do about it. Part of me feels like students should probably at least know who the dean of their department is and why that matters. Maybe the non-teaching faculty should do more to have contact with students in informal settings (not that they would necessarily have the time, with their overloaded schedules)? Or maybe the teachers who have a lot of student contact should speak a little bit about departmental issues in their classes so that the whole thing is a little more transparent?

Sunday, August 17, 2008

Culture Shock: Student Passion (or lack thereof)

I've recently mentioned the lack of passion I've seen in teachers compared to game developers. It occurs to me that the same complaint can be made of students.

Admittedly, this is largely the teachers' fault. How hard is it to get excited about something when you're learning from someone in the field who just isn't excited about their own work? Still, it's a bit of a surprise for me, coming from a job where everyone is working together as a team to make games... and seeing students working in a totally different way.

In the game industry, at least on the projects I've worked on, most people care about the project. Sure, if you work really hard to finish the work on your plate, your "reward" is to get even more work piled on you. So if you're cynical, you could say that the best "strategy" is to just do the bare minimum you need to not get fired. After all, you're salaried, so it's not like working harder actually means more money or rewards or anything. And yet... that almost never happens in practice, because the real reward is that your game is better. And if you care about the game, and you want it to be a good game, then you'll do whatever you can to make it the best game you possibly can. If you don't care about the game... well, there's a whole big software development industry out there that has nothing to do with games, which will pay you more money for less work. So people don't tend to become game developers unless they have this drive to make great games.

You'd think that the same would be true of game dev students, wouldn't you? Put a group of students together to make a game, and you'd expect them to all work insane hours and do everything they can to make it the best student project ever. After all, it's not like students can't do amazing work.

But in practice, you don't always see this. Sometimes you get an outstanding student team (usually the result of a single outstanding student leader who pulls the team together, and if you removed that one student the whole thing would collapse). But I'm seeing a lot of cases where this isn't happening at all. Some students don't show up for meetings and don't do any work at all -- as if they want a free ride, just a grade, and don't care that this project is something that could go in their portfolio and get them a job (among other things). Students make excuses about why their work is late, when I know full well it's because they were just goofing off and procrastinating -- a sign that they don't really care much about their project (they just see it as classwork, not an original creation).

I'm still trying to find ways to make sure students get it, that game projects are an opportunity to create something experimental and new and different and original and really really cool (possibly the last opportunity they'll have for the next ten years of their career), and that they should really care about it. But I feel like it's an uphill battle sometimes, like I'm fighting against a dozen years of "education" that teaches students to jump through hoops for a piece of paper with the attitude that the real stuff comes later after graduation.

And it's a bit of a shock for me, even now, because I don't have to deal with this in the industry. I don't have to ask the programmers on a big-budget game to show up to work and give their best effort, because they already do.

Wednesday, June 04, 2008

Culture Shock: Retention and Turnover

In the game industry (and in fact, in any professional industry), employee turnover is expensive. If someone leaves the company and you have to replace them, there's the expense of interviews (which take a lot of time away from senior people) and then the extra time it takes the new hire to get productive. Companies that realize this do what they can to retain their employees. Indefinitely.

Being a professor is different. In my case, "turnover" means that a student has graduated. It means I'm doing my job correctly. It also means fighting against the instinct of "gotta keep our best people around" that I'm used to from being in the industry.

Tuesday, April 01, 2008

Students Modify the Teacher's Reputation

I was talking with Brenda recently (we do that a lot) and she gave me something new to worry about.

Whenever a student of mine gets a job in the industry, it reflects on me, personally (because most of the time, I'm the only person from industry they've had direct contact with in a classroom). In other words, my students may affect my ability to get industry contract work at some point.

The assumption is that if I taught them everything they know, then their skills and abilities are a reflection of my own. This isn't entirely true, of course, but it doesn't matter. A lot of people believe it's true, therefore it influences their perception, and perception is everything when it comes to reputation. I say "my" here, but this really applies to any industry-based teacher, especially at a school where they're the only one of their kind.

Sometimes this works in my favor. Last year I had two absolutely brilliant students who made it into the industry, and they're making me look good, through no fault of my own.

Sometimes this works against me. Maybe some day I'll have an absolutely horrible student, who somehow blunders into an industry job and screws things up horribly. If I'm asked to provide a recommendation I can be reserved about it, but beyond that, I have no defense against this. But it's still a potential black mark on my record.

I suppose if one is really paranoid, the best defense is to work for a university that has overly selective admission requirements, and get oneself installed on the admissions board. For the rest of us... I suppose we just have to cross our fingers and hope that we get more good than bad (and that maybe we can be enough of an influence on enough of our students to make the difference).

Thursday, March 27, 2008

Terminal Degrees

Every field has its jargon. Game designers will happily talk about HUDs and avatars and positive feedback loops, oblivious to the fact that most people who haven't been doing this for the last few years of their career might have no idea what they're talking about. This is a particular danger when teaching, by the way: it's easy to lapse into "designer-speak" without defining your terms, only to be met with blank stares.

People in academia do this too, and it can sometimes be confusing for the new designer-turned-teacher to keep up. A recent discussion on the IGDA game educators mailing list reminded me that one of the new terms an industry person is likely to run into is the terminal degree.

(Disclaimer: since I've only been doing this the last couple of years, I might get some details wrong. If you see any errors, feel free to post in the comments and I'll fix the post. Thank you.)

What is a terminal degree? The best description I can think of is a degree higher than a Bachelor's that is the highest degree offered, at the institution where you received it, at the time you received it. Normally this means a Ph.D., but some fields don't offer one (the best-known are probably the MFA and MBA), so those are referred to as terminal Master's degrees. Typically, a non-terminal Master's takes less time than a terminal Master's, which takes less time than a Ph.D. (in case you're trying to get an advanced degree as fast as possible).
Edit: Looks like I was wrong about this, it's just the highest degree offered in a field -- still, usually a Ph.D. but in some fields an MFA, or other degree. I'm not sure what happens at boundary conditions, such as if you get an MFA in Game Design (the highest degree offered today) and then one school decides to offer a Ph.D. in Game Design. Does that invalidate the 'terminality' of these other degrees?

Note that under this definition, if a new Ph.D. is offered at a university that previously only offered a Master's, whether the Master's is terminal or not depends on when the student enrolled; if you started before the Ph.D. was available, yours is terminal. Timing matters.

Why should you care? Terminal degrees are important if you're planning to make a career of teaching. Having one means that you get paid more; at some places it's even a requirement for certain positions. If you don't have one already, think about getting one. Unfortunately, leaving a full-time career in industry to go back to graduate school is difficult for most people. Fortunately, once you do have a full-time job at a university, one of the more common benefits is being able to take classes for free or almost-free; if it's not practical to get your terminal degree first, it's quite possible to get it second.

From seeing a number of people going through graduate school, I also secretly suspect that it's called a "terminal degree" because it has a good chance of killing you.

Saturday, March 15, 2008

Love is in the air...

So, two of my former students got married today. This was not a surprise; I don't think I ever saw one of them without the other for as long as I'd known them. (You can tell they were my students, because the bride walked down the aisle to Aerith's Theme. And the two figures on the top of the wedding cake were Tidus and Yuna.)

It was a strange feeling, being part of that. I certainly never would have dreamed of inviting any of my professors to meet my family when I was an undergrad. (My wife, who tried to get to know some of her professors outside of class, was repeatedly told that it was somehow "wrong" or "inappropriate" or "unprofessional" for reasons that I've yet to understand.) Yet, the whole thing doesn't make me feel weird or freaky. It makes me feel pretty special, actually.

I think this kind of thing might be specific to game professors, and maybe a few other professions. I'm teaching people how to go after their dream jobs, and part of that is learning about what their dreams actually are. Students don't typically seek game jobs for fame or prestige or high pay; their goals and hopes and dreams are always on the surface, and those things are very personal. So, I suppose it's much easier to know students on a personal level if you're teaching in this field, as opposed to teaching calculus, or quantum physics, or signals and systems.

At the same time, there was another strange thing I didn't expect: I didn't actually know anyone in the room. I saw these two students outside of class a lot of the time, so I figured I'd spot at least a few of the people I saw them hang out with. Instead I was in a room with a hundred strangers, and it made me realize just how much I didn't know about them. And I realized I'd felt this way before when I was a student, when I'd see one of my professors in the bathroom or the grocery store or something, and it was like "whoa, they're an actual human being with a life outside of the lecture hall?" And now I see the same thing in reverse -- whoa, my students have actual lives outside of my classes that I don't actually know about! (I used to think this was because there were all these formal barriers between students and professors that the professors put in the way intentionally; now I think it's just a by-product of seeing a person on a regular basis for only a few hours a week, so that you "know" them but only in a narrow context.)

So, for those of you in the industry who are considering teaching, this is the kind of stuff that I hope you have to look forward to. And yes, since I know you'll ask, the cake was delicious and moist.

Good luck, you crazy kids.

Friday, February 01, 2008

Culture Shock: Professional Humility

If you're a game designer for long enough, eventually you'll work on a game that ends up just not being all that fun. (No matter how great you are, everyone has their mediocre projects.)

As a teacher, there are analogous cases. Eventually you'll meet a student who you know is capable of grasping the course material, and they genuinely try hard, but for whatever reason they just don't seem to get it. Or, eventually you'll create an exam or assign a homework and half the class completely bombs it.

In either case, there are two types of people. There are the ones who look at the tiny amount of positive feedback (the one glowing game review, the one student who gets everything perfect) and decide that if this tiny minority understood what they were trying to do, then everyone could, and it's just that the others aren't trying hard enough. It's certainly not their fault if these people just don't put in the effort to see their obvious brilliance. They feel better, but they don't actually get any better.

Then there are the ones who learn humility. Yes, others may have made mistakes along the way, but something in your own work went wrong as well -- enough that the good parts couldn't save the rest. By figuring out what you did wrong, you become stronger.

In games, this is actually pretty easy. Reviewers will quite happily lay your game on the table and dissect each of its flaws, for all to see. Post-mortems, likewise, show us that most development teams are painfully aware of their issues; if you as a designer have made mistakes, even if you don't know what they are, someone else on your project team certainly does.

In teaching, it's much harder. There is no "project team" -- in fact, it's rare to have even a single other education professional observe your work and offer any constructive feedback at all. Students can help you identify weak points, but they can't tell you what to do about them. Teaching, then, forces one to go beyond humility into the realm of self-reflection in a way that game development does not.

Saturday, January 12, 2008

Culture Shock: Parking Policies

It didn't occur to me until recently that different institutions have wildly different ways to treat faculty when it comes to parking lots, of all things.

In my experience, parking for a normal job is pretty uneventful. Some buildings have private lots and you get a sticker or an assigned space or something, while smaller companies might need you to park on the street nearby. Either way, the company is paying you to come to work, so it's in their best interests to make it easy for you to actually arrive. Also, most companies aren't so huge and bureaucratic that Parking Enforcement is its own private department... especially in game development.

For colleges and universities, by contrast, parking is a huge problem. When you've got thousands of people commuting every day, you need some way to ensure that parking spaces are allotted fairly. Unfortunately, the parking office is usually not part of Human Resources, which means that a lot of places inadvertently send some pretty harsh messages. Here are three policies at three different institutions that I've observed (feel free to add your own in the comments):
  • Separate parking lots. If you're a student and pay a fee, you get a sticker for the student lots. If you're faculty, you get to park in the faculty lots for free, but you're not allowed in the student ones. There are also some "seniority lots" at prime locations on campus; when someone with a space dies or retires, whoever's most senior on faculty gets the space. What this says: Students and faculty are different people, so we're putting up artificial walls to make sure they don't accidentally get to know each other. Also, no matter how great a teacher or researcher you are, what really matters is that you stay here for a long time.
  • Separate parking spaces within many lots. Each lot has some student and some faculty spaces, and if you pay a fee you get a sticker that gets you in to the right kinds of spaces in each lot. You have to pay even if you're faculty. What this says: This institution doesn't care who you are. You are nothing to us, a mere insect. If you don't like it here, there's a line of people out the door waiting to replace you, so put up and shut up.
  • Common lots. There are a variety of parking lots and every space is exactly the same. Students pay for a parking pass, faculty get one for free, but everyone is fighting for the same spaces (and they're a bit scarce at certain times during the day). I haven't yet had a space stolen in front of me by one of my own students, but I'm sure it happens occasionally. What this says: We're all in this together... or, every man for himself... depending on how you look at it.

Friday, November 30, 2007

Culture Shock: Assumed knowledge and skills

Everyone thinks they're a great game designer. Even if they have no experience. Even if their idea of a great game is "the game I want to play, even though no one else will want to." If you tell people you're a game designer, one of the two typical reactions is "hey, I have this great idea for a game..." (the other reaction is "what's a game designer?"). Basically, it doesn't matter if you're a Junior Designer on your first gig or if you've got twenty years' experience; everyone you meet thinks they're a better designer than you. This is something you get used to pretty quickly.


Strangely, I didn't see this much when I became a teacher. I don't think I ever had a single student who felt that they knew more about game design than I did. The student/teacher social dynamic is apparently stronger than the "everyone's a game designer" thought process. I have no explanation for this. (It's not just that I have the power of assigning grades; students who didn't even take my classes treated me with a respect that I'm totally not used to.)

Tuesday, May 15, 2007

Culture Shock: Doing the Work

My class ran under time today, and students left about half an hour early. It occurs to me that the decision to cut things short was met with enthusiasm -- not because my students hate taking my classes (I hope!), but because they've been conditioned through years of education to get out of class as soon as possible.

This sort of thing doesn't happen in industry. If I'm working as a game designer and spontaneously decide to take half of the afternoon off, the rest of the team isn't going to heave a sigh of relief that I'm leaving (nor is the publisher, or the client). But here, my "customers" are perfectly happy if I'm not doing the job I get paid to do, at least to a certain point. It's rather unsettling, really.