Andy Serkis and Unreal teamed up at GDC to showcase what may just be the future of video game motion capture technology.
It began with a video of Andy Serkis performing a monologue from Macbeth. The original performance is certainly captivating – it is Serkis, after all – but it didn't seem to have anything to do with video game technology. However, Unreal then showed how a new technological concept known as "Project Spotlight" can project Serkis' facial expressions onto the face of a digital alien creature:
While the visuals of that technology are impressive, the truly amazing aspect of the demonstration is the implication that the rendering was done in real time. You could theoretically achieve similar effects using modern technology; the problem is the sheer amount of resources such an effect demands and the way you essentially have to shape your entire game around it. That's the issue the revolutionary L.A. Noire encountered. With Spotlight, though, developers may be able to render such effects without compromising the performance of the game itself. That could quite literally be a game changer.
Before you get too excited, though, this technology is still in what should probably be considered a "developmental" stage. In other words, it's not quite ready for prime time and may not become commonplace for a generation or so (if it becomes commonplace at all). The same is true of some of the other technology showcased during GDC, including this amazing Star Wars demo that looks too smooth to be true (it is):
Of course, that's just how technology works: nobody makes progress without stretching the limits of what currently seems possible. As such, we believe this demo offers a fairly accurate preview of how video games will look sometime in the next decade or so.
On a related note, we’d be perfectly happy to hear that someone has cast Andy Serkis as an evil alien overlord in some kind of Mass Effect-like sci-fi epic.