newtboysays...

When will this tech progress enough that they can render the effects in real time, using something like Xbox motion tracking to keep track of the dancers/props and have them actually affect the projections, rather than projecting a pre-rendered 'movie' that the dancers try to keep pace with and find their place in? It would erase all the lag created when the dancers are 1/4 second off, or 3 inches out of place. That would go a long way towards creating suspension of disbelief for many, and it would sharpen up the performance immensely. Then we could have things like the bioluminescent forest done on stage with moving objects...like Avatar in theater form.
I love what they do with it, I just want to see it progress...and fast!

ChaosEnginesays...

I see no reason it couldn't be done right now.

Once you have the dancers' positions, the rendering is pretty trivial.

Getting the dancers' positions with a Kinect is certainly doable... I'm not sure how well the Kinect would cope with multiple people in that lighting environment, but you could certainly rig the dancers with invisible IR emitters and track those.

It's simply a matter of cost.

edit: technical info on how it was actually done
http://vezerapp.hu/blog/project-showcase-pixel/
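To illustrate the idea, here is a minimal, hypothetical sketch in plain Python (the `predict` function and the sample format are invented for illustration, not from the linked article): given timestamped positions from any tracker (Kinect skeleton, IR blob camera, etc.), you can extrapolate a frame ahead so the projection lands where the dancer will be rather than a quarter second behind where they were.

```python
# Hypothetical sketch: predict a tracked marker's position at render time
# so the projected effect leads the stale tracker data instead of lagging it.
# Assumes some tracker delivers (timestamp, x, y) samples.

def predict(samples, render_time):
    """Linear dead reckoning from the last two tracker samples."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    if dt == 0:
        return x1, y1
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    lead = render_time - t1          # how far ahead we must extrapolate
    return x1 + vx * lead, y1 + vy * lead

# A dancer moving right at 2 units/sec, sampled at t=0.00 and t=0.25;
# we render the frame for t=0.50, a quarter second after the last sample.
samples = [(0.00, 0.0, 1.0), (0.25, 0.5, 1.0)]
print(predict(samples, 0.50))  # (1.0, 1.0): ahead of the last known position
```

Real systems would use a proper filter (e.g. a Kalman filter) rather than two-point extrapolation, but the principle is the same.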

billpayersays...

@newtboy There are shots that are definitely real time and certainly in sync (20-25 sec & 30-40 sec are good examples), at least as far as 24 fps video goes.

Any console and most phones or laptops bought in the past couple of years are powerful enough to generate this stuff in real time.

newtboysays...

@billpayer...Not to my eye.
All of this is pre-rendered, and the 20-25 sec and 30-40 sec marks show me that clearly, just like the rest. The video is not in sync with the dancers in either portion. With the guy 'hand dancing', the blocks fall out in many places his hand never goes (but obviously was supposed to go); the floor projection at 30-40 was closer, but not perfect by any means. None of this is 'real time' that I can see, and it's also not in perfect sync in many places.
I think you don't understand what I mean by 'real time', because this is obviously not done in 'real time'; it's all 100% pre-rendered.

ChaosEnginesays...

The link isn't actually that clear. It states that "Each scene is a Composition. They are trigged via OSC from eMotion or from QLab. This allow us to be perfectly in sync with sound."

So yes, the graphics are rendered in real time, but I think the actual control is done by hand rather than by automatically tracking the dancers' movements.

I could be wrong, but that's the way I read it.
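For readers unfamiliar with how such cues are triggered: OSC (Open Sound Control) is just a small binary message format sent over UDP, which is how tools like QLab fire scene changes in sync with sound. Below is a hypothetical, stdlib-only sketch of building an OSC 1.0 message by hand (the cue address `/scene/4` is made up; real shows would use a library and whatever address scheme their software expects).

```python
import struct

def osc_pad(b):
    """OSC 1.0 strings are NUL-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address, *ints):
    """Build a minimal OSC message carrying int32 arguments."""
    msg = osc_pad(address.encode())                      # address pattern
    msg += osc_pad(("," + "i" * len(ints)).encode())     # type tag string
    for n in ints:
        msg += struct.pack(">i", n)                      # int32, big-endian
    return msg

# A hypothetical cue: "trigger scene 4, variant 1".
packet = osc_message("/scene/4", 1)
print(len(packet))  # 20 bytes: 12 (address) + 4 (type tags) + 4 (int32)
```

Sending `packet` over a UDP socket to the show-control machine is all a cue trigger amounts to; the point is that this synchronizes compositions to sound and operator input, not to the dancers' bodies.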

billpayersaid:

It clearly states in the link that @ChaosEngine provided that they are triggering events live, altering camera angles, frictions, gravity, viscosity, some triggered by sound, i.e. real-time, not pre-rendered.

newtboysays...

Then why is it SO far off in time and space? If they tracked the dancers, it would affect the projection, but it obviously does not. Maybe they are triggered by sound, as in by the music...sound is not the dancer.

EDIT: OK, I read the link, and it's not rendered in real time at all. It's pre-rendered effects/movies triggered by sound (cued by the music) or by "eMotion", which is NOT motion tracking but apparently only motion SENSING (it only notices that there IS motion, not what or where that motion is), and more often by people using iPads to 'draw' in 'real time'. That is NOT at all what I was talking about, which is tracking the dancers and rendering based on that tracking, in real time and in 3D. That did not happen here.
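The sensing-versus-tracking distinction being drawn here can be made concrete with a toy example (hypothetical, stdlib-only; the frames are tiny grayscale grids invented for illustration): sensing answers *whether* anything moved, tracking answers *where*.

```python
# Toy frames: 4x4 grids of grayscale values (0-255).

def moved_pixels(prev, curr, threshold=10):
    """All pixel coordinates whose value changed beyond the threshold."""
    w = len(prev[0])
    return [(x, y) for y, row in enumerate(curr) for x in range(w)
            if abs(row[x] - prev[y][x]) > threshold]

def motion_sensed(prev, curr):
    """Sensing: a yes/no -- did anything move?"""
    return bool(moved_pixels(prev, curr))

def motion_tracked(prev, curr):
    """Tracking: an actual position -- the centroid of the changed pixels."""
    pts = moved_pixels(prev, curr)
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
curr[1][2] = 255                      # something moved at column 2, row 1
print(motion_sensed(prev, curr))      # True
print(motion_tracked(prev, curr))     # (2.0, 1.0)
```

A motion-sensing trigger only needs the first answer; conforming a projection to a dancer's body needs the second, continuously and in 3D.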

billpayersaid:

they are triggering events live and altering camera angles, frictions, gravity, viscosity. that is r e a l t i m e

eric3579says...

See pictures 4 and 5 here http://www.am-cb.net/projets/pixel-cie-kafig/

Just a normal projection that is maybe being manipulated a bit, in terms of the graphics or when something starts. No dancers are being tracked and projected onto or around due to their changing positions. It's imperative the dancers are in the right place at the right time to make it seem as if they are being tracked.

newtboysays...

No sir.
It is pre-rendered video (or really a series of pre-rendered videos), some of them slightly 'tweaked' in semi-real time, most of them not, but never by tracking the dancers in real time, which was the whole point of my original post and the follow-ups.
Syncing the pre-rendered videos with the pre-recorded music is not 'real time rendering' of any effects.
An image you 'tweak' on an iPad with your finger(s) to (poorly) SIMULATE real time effects BASED ON MOTION TRACKING is not at all the same thing as real time rendered projections ON DANCERS BASED ON MOTION TRACKING THE DANCERS AND MORPHING THE IMAGE TO THEIR POSITIONS. (Apologies for the 'shouting', but you keep missing the point.) I'm sorry that's hard for you to understand or admit.

EDIT: For it to be "real time renderings" it would HAVE to track the dancers...which you admit it does not....so I don't get how/why you want to argue a point you already conceded.

billpayersaid:

@ChaosEngine the realtime comment was for @newtboy

@newtboy I never said they were tracking dancers. Only that it is a r-e-a-l-t-i-m-e animation not a pre-rendered video, which is 100% confirmed by the f-a-c-t that they are tweaking cameras, friction, gravity and viscosity in r-e-a-l-t-i-m-e

jmdsays...

Why would it have to track the dancers to be considered realtime? All my games are real time 3d, they certainly don't track me dancing, or doing much of anything really.

I think the term we want to use is pre-calculated. The particle movements were chosen beforehand, much like a motion capture, but the rendering is still realtime, and thus things like camera angles can be changed.

I was disappointed because I noticed the lack of true interaction, and that some things the performers did would affect the particle effects while others wouldn't. That even annoyed me. We have years of tech to monitor actors in real time and space; it may have taken a bit more work, but this could all have been done with realtime interactions.
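The "pre-calculated but realtime-rendered" distinction can be sketched in a few lines (hypothetical, stdlib-only; the circular particle path and orthographic projection are invented for illustration): the particle trajectory is baked in advance, yet every frame is projected through a camera an operator can adjust live.

```python
import math

# A path baked ahead of time: one 3D point per frame, on a circle.
baked_path = [(math.cos(t / 10), math.sin(t / 10), 0.0) for t in range(100)]

def project(point, yaw):
    """Rotate a baked 3D point around the vertical axis, then drop depth.
    Orthographic projection, for simplicity."""
    x, y, z = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x + s * z, y)

# Same pre-calculated frame, two different live camera angles:
frame = 30
for yaw in (0.0, 0.5):       # operator nudges the camera between renders
    print(project(baked_path[frame], yaw))
```

The path never changes (pre-calculated), but the on-screen result responds instantly to the camera parameter (realtime rendering). Neither implies the dancers are being tracked.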

newtboysays...

Please, if you will, read my original comment: it's about tracking the dancers in real time and conforming the effects to their actual movement, also in real time. Somehow that was lost in the replies. I get that the image is being rendered in real time, but that's not what I was commenting on. (Even though it's a pre-rendered video background, the effects, 'camera', and lighting are changed on the fly, just not by the dancers' positions/movements.)
Your term 'real time interactions' describes what I mean better than I did.
