A "news" article today (I hesitate to call this journalism without adding quotation marks) suggests that few people understand how universities change their content to keep up with industry. In particular:
And that's what makes Qantm unique - we're not a university, because what people don't realise is that with a university it takes at least three years to change a course. If a lecturer now sees that companies are using a new language to program in, it'll take him three years to implement it in the course.
This is simply not true. There is a subtle but important difference between the content of a course, the course listings in the catalog, and the course curriculum. I will explain.
Course content is the stuff that actually gets taught within a single course. In my Game Industry Survey course, I have modified the content in minor ways on a day-to-day basis; for example, if the Activision/Vivendi merger happened the day before my talk about game publishers and how EA is by far the largest, you can bet I'd be modifying that content the same day. I certainly wouldn't be waiting three years before I started talking about it in class! Minor stuff like this gets incorporated all the time.
Granted, minor content changes like that aren't quite as drastic as, say, switching from Java to C# in an intro programming class. In that case, the instructor might have to wait until the next quarter/semester before switching the language, but it can certainly be done. And even if the game industry suddenly decides tomorrow that every programmer absolutely must know C# from now on and it's the last week before finals, an instructor could still modify the last lecture of the quarter to talk a little about this newfangled C#: how it's similar to the language we learned during this course, how it's different, and how all of the students should learn it on their own over winter break if they want to get jobs, or whatever. There's still no three-year delay.
Course listings, i.e. what courses are available for students to take, obviously cannot be modified in the middle of a semester. However, they can be (and frequently are) modified on a semesterly basis in the form of "Special Topics" courses. Special Topics is this wonderful catch-all that lets professors offer whatever the heck they want. Sometimes it involves the professor talking about their (very narrow) area of research; sometimes it's a brand-new course that should be added to the curriculum, but the professor wants to try it out first just to make sure; sometimes it's a course that's important but offered so infrequently that it never got its own dedicated course number; and sometimes it's just an off-the-wall experimental course idea that a professor has been dying to try out one of these days.
At any rate, there's always a selection of Special Topics in each department, and they change on a regular basis. Again, no three-year lag time here.
The course curriculum is what changes every three years (an approximation -- I'd assume that at a four-year institution it would change every 4+ years, while a two-year community college could modify its curriculum every 2+ years). These numbers are not set in stone, by the way: they are a practical consideration. How can an established university build a reputation for producing quality graduates if the core curriculum of this year's graduating class is different from last year's? This is not (entirely) about universities being slow, bloated bureaucracies... I mean, they are, but that's beside the point... this is about universities not kowtowing to every little whim and fad of the industry, and waiting for trends to become established before they force them on the unsuspecting student population.
So, let's suppose a new programming language becomes popular. C# is so 2005, today it's all about Ruby (not really, this is just a contrived example). Starting next quarter, a Programming in Ruby special topics course magically appears. Academic advisors let their students know that Ruby is the next big thing in the game industry, and they'd better learn it before they graduate if they want even so much as a job in QA or the mail room or something, and they're highly encouraged to take the special topics course if they don't just learn it themselves over winter break. Unfortunately, it only counts as an elective, but creative advisors might be able to substitute the special topics for the Programming in C# course that used to be a core requirement; for students who took that already, at least it's a programming elective. A few years later when it's time to revamp the curriculum, the Ruby course displaces the C# course, and all is well.
Until two years later when the industry suddenly decides that Ruby is so 2008, and the thing that everyone really needs to know is this newfangled C+@#$ (where you program by shouting profanities at the computer, with voice-to-text support in the IDE), and the cycle begins again.
Oh, and one other quote from that stupid article that I feel the need to correct:
Initially there was no master plan for SAE, for colleges and things, but it turned out that I invented practical education, because nobody before me was doing that. In a way I formalised education.
I'm supposed to believe that it never occurred to anyone in the thousands of years that humans have inhabited the planet, to have education that's actually practical? If David Braben invented practical education, then I invented the internet.