Are you ready for virtual actors in movies?

[Embedded video: kyte.tv, channels/6118/311444]

One of the most interesting conversations I had at the Consumer Electronics Show last week was with Charlie Boswell of AMD. He works with movie companies to implement the data centers they need to build the movies of the future, and he told me about a new technology, called “Light Stage,” which lets movie companies capture human actors and then turn their images into software-controlled “virtual actors.”

Until now this technology looked cheesy. But no longer. You probably have already seen virtual actors in movies and haven’t realized it (all done with Light Stage).

Here are the two videos so you can see how movies are changing.

1. Charlie Boswell, who has the coolest job at AMD working with movie studios on special effects, talks to me about what he’s working on and tells me about Light Stage. If you are into movies, he also talks about a bunch of movie houses and how they are using the technology.
2. Jules Urbach, CEO of Light Stage/OTOY, showing me some clips of what these virtual actors can do. He was also up on stage during the AMD keynote, and Barron’s Online has a live blog of that. On stage, AMD and OTOY announced they were working on the fastest supercomputer ever.

Anyway, it’s interesting to see how technology continues to change our movies. Boswell blows my mind when he says this technology will soon be affordable for everyone (soon being years, not decades).

Are you ready?

22 thoughts on “Are you ready for virtual actors in movies?”

  1. How is this new? We’ve had hand-drawn animation in live-action films, pure CGI actors in live-action films, and CGI compositing of archival footage and new footage. This is, if anything, just a small step in the direction of pure CGI ‘live-action’ movies.

    Like

  2. Chris Barts,

    It’s different from all those things you mention because it’s an animation-performance hybrid. Which is to say, a real live actor provides the animating “vita” for a digitally rendered image, so it’s neither pure animation nor pure live-action cinema (and some hard-core animation purists will fight you to the death over whether it’s ‘animation’ at all). Which in itself is not new – “Gollum” in LOTR, for example (Andy Serkis should have been a supporting-actor Oscar nominee, but the Academy still hasn’t really worked out how to handle these cases yet), or even rotoscoping (last seen in A Scanner Darkly). Or Beowulf, even.

    The difference is the level of *performance* detail that these guys claim to be able to capture – and off cameras too (rather than mo-cap suits).

    We’ve been able to *render* 3D environments (including humans) in quite some detail for a number of years. Making them move convincingly (and not have the audience cry out “zombie!” and laugh) is proving a more difficult trick, one that technology such as this has only just started to overcome in the past few years.

    Like

  3. Aaah, when the clip started I thought “This is just plain ol’ CGI, what’s the big deal?” Then it got scary – the level of detail captured there is incredible, and when they add the backgrounds it seems to add an extra level of realism to the face. Any idea how much it costs to use this tech? Meaning, is there potential, at some point, to see actors getting priced out of the movie biz?

    Like

  4. I call humbug.

    The whole DeBevec-Lightstage-virtual-actors thing is a red herring, much like the tired “will robots rule the world?” question (plus it’s old – first announced at Siggraph 2000). The real goal is “cloud rendering” to replace the RedHat render farms, or try to at least: server-side graphics processing. But the energy required and the cost of scale could kill it. Why pay someone else tons to render for you, at a seriously tremendous price, when you can go distributed computing, centralizing content instead of horsepower? The only thing you gain from moving from the client to the server is latency, and even if that changes, there’s still no real market. Another doomed venture, as the whole thing seems like just one big AMD commercial. Do something real-world impressive THEN go Marketing-Blogger-Dumb-Press heavy, not before, while throwing tons of “possibilities” out of the plane.

    “Boswell blows my mind”

    Easy enough to do, just add snake and oil.

    Like

Comments are closed.