Lessons from Hollywood?
August 19, 2007
Jane at the always-fun-to-read Game Girl Advance raises several interesting points regarding "game literacy" in response to an essay on authorship by game designer Clint Hocking. From her post:
Just as children are taught to read - narrative is built in blocks of meanings; visual arts, too, are not necessarily intuitively appreciated. The *reason* that so many of what I would have once called low-brow mid-culture consumers appreciate Monet's waterlilies and Handel's Water Music is because those cultural forms, the language of classicism, have been so embedded in the west that we can understand them intuitively. When the Impressionists were first painting plenty of people thought them crazy, or degenerate, or both. Now imitations of their work hang in second-rate hotel rooms everywhere. We've tamed the language of what was once radicalism so that now it seems safe and normal.
She goes on to ask an important question:
...ok, we're raising a generation of gamers who understand interactivity and systems; how come we are running out of good programmers then? Shouldn't this generation be ideally suited to become natural programmers? And yet every industry that requires engineers, programmers, or other system-builders is having the toughest time finding talent. Is it just a matter of boom time economy and too much work to go around?
Or is something broken?
I think the critical issue here is one that separates video games from other forms of art. The creatively inspired artists and designers who conceive games require an increasingly skilled team of specialized programmers to bring their vision to life (so to speak). Chris Crawford, Steve Meretzky, and Shigeru Miyamoto understand the language of video games in the same way that Edwin Porter, D.W. Griffith, and Georges Melies understood the language of film. Their pioneering creative works were achieved only through the raw tactile process of using primitive tools (often of their own invention) to render their individual visions. This kind of personal auteurist approach, from the inside out, is typical of the early years in other media as well (radio, television, newspapers).
The problem, now that the games industry has reached its second phase of development (often characterized in other media by a "golden age" - Hollywood in the '30s, television in the '50s), is that a body of skilled artisans and trade professionals sufficient to serve the growing needs of the industry has yet to arise. The reality is that a game developer can't simply hire the bright-eyed kid just off the bus from Kansas and assign him/her as an apprentice to the 2nd unit gaffer to learn the ropes. The amount of specialized training required today is too great. Consequently, as Jane observes, we have a shortage.
One more analogy to the cinema might be useful. The question for all the Hollywood studios at the dawn of the sound era was how to find enough people to make the transition quickly and capitalize on the new technology of sound. History suggests that two solutions were identified. If you were a big studio with lots of money (MGM, Paramount... Blizzard? Valve?), you hired the best people from all over the world and developed your own self-sufficient in-house production systems. If you were a smaller studio like RKO or Warner Bros., you did one of two things (or sometimes both): you hired a genius and gave him a small amount of money and total control (Orson Welles, Michael Curtiz), or you developed a less costly, more efficient way to make good movies (David Selznick's unit production system). Looking back on the first decade of the sound era, I think you could make a decent case that RKO and Warner Bros. released just as many truly great films as MGM or Paramount.
Could there be a helpful lesson in this for today's game developers and designers?