Photo-Realistic Virtual World Rendered LIVE server-side

The game is running on OTOY, the 3D engine that renders graphics in the cloud. The technology allows relatively weak computers (or even mobile phones) to display incredibly detailed graphics comparable to those seen in Hollywood movies. -TechCrunch

*edit: It appears that at least one part of this video was not rendered in real time. Sounds fishy; maybe it's just a marketing video. We'll have to wait for more updates on this.
10175says...

I'm an environment artist and designer for games, so I can say with a good level of experience that this whole thing is most likely faked.
What is probably going on here is that the content is being PRE-rendered server-side, and they just spit movies of it out. Most likely their content would be something showcasing that medium, rather than a realtime environment a la GTA or <pick the 3D game of your choice>. I could be wrong, but it's highly doubtful. Even if it were procedurally generated, you don't have enough bandwidth to deal with the amount of data in that environment over your network connection.

I don't know what to say about them using things from people's demo reels. Yeesh.

cybrbeastsays...

Good research BicycleRepairMan, sounds plausible.

And about the bandwidth: I can already stream HD video from sites like GameTrailers. Granted, it's not as high a resolution as some people play their games at, but it's still quite impressive.
I'm more worried that they won't have enough computing power any time soon to render all these images to users.

AeroMechanicalsays...

I suspect that someday video games will work this way using thin clients (and pretty much all software). They'll need very low lag I/O and HD streaming, but that's not so hard to imagine in the future.

That would solve all of the hassle of video-game hardware: games could become increasingly advanced without the need for the consumer to keep buying new hardware all the time. It would also eliminate piracy, and for various other reasons, I'm sure the game developers would love it. Except, of course, for the sort of hardware they're going to need.

If you figure a successful console game sells 10 million copies, and you could probably imagine a few hundred thousand people playing at any given time after release, they're going to need some pretty heavy-duty iron on their end and a ton of bandwidth.
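To spell that back-of-envelope estimate out as a sketch, where every input (concurrency fraction, per-stream bitrate, streams per GPU) is an illustrative assumption rather than anything from the video:

```python
# Rough, illustrative sizing for a server-rendered game service.
# Every input below is an assumption made for the sake of the estimate.
copies_sold = 10_000_000                      # "a successful console game"
peak_concurrency = int(copies_sold * 0.03)    # ~3% online at once (assumed)
bitrate_mbps = 6                              # one HD stream per player (assumed)
streams_per_gpu = 1                           # no sharing between viewers (assumed)

egress_gbps = peak_concurrency * bitrate_mbps / 1000
gpus_needed = peak_concurrency // streams_per_gpu

print(f"egress: {egress_gbps:.0f} Gbps, GPUs: {gpus_needed}")
```

Even at a modest bitrate, both the egress bandwidth and the GPU count scale linearly with concurrent players, which is the "heavy-duty iron" problem in a nutshell.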

I doubt this video is fake, but let's see it scale to thousands of simultaneous users spread hell-and-gone all over the internet. There was another video not so long ago demonstrating software you could use to run a game on your desktop but stream it to a laptop to actually play it.

I dunno, very cool possibilities anyways. Maybe 10 years from now.

Lolthiensays...

I'll believe this when I see it. "Server side rendering"... so if you have 5000 concurrent users, the server re-renders the scene 5000 times from 5000 different viewpoints simultaneously? 10,000? 100,000??

Like was said above, I'll believe in pre-rendered movies, and those changing according to 'up' down 'left' right commands and what not... moving from scene to scene...

rychansays...

It's a neat idea but it doesn't quite seem feasible to me. There are some big issues:
* It's computationally, and thus financially, expensive to render all of that stuff.
* It's expensive to pay for all the bandwidth, especially if you want it to be high-definition. At best a player will have to live with low-res graphics and compression artifacts.
* And the biggest show-stopper is latency. You can't have any form of client-side prediction, which is what hides latency in online games, and your latency is increased by having to go through a video compression step. Best case, I imagine 1 second of latency, and that just isn't fun to play with.
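To make the latency point concrete, here is a hypothetical end-to-end budget. Every figure is an assumption for illustration; the point is just that the stages add up, and none of them can be hidden by client-side prediction because the client never simulates anything itself:

```python
# Hypothetical end-to-end latency budget for server-side rendering, in ms.
# All figures are illustrative assumptions, not measurements.
budget_ms = {
    "input to server (one way)": 40,
    "simulate + render frame": 33,   # roughly 2 frames at 60 Hz
    "video encode": 30,
    "server to client (one way)": 40,
    "decode + display": 25,
}
total_ms = sum(budget_ms.values())
print(total_ms)  # 168
```

Even with generous assumptions the total lands well above what a locally rendered game feels like, though below the 1-second worst case.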

Online games just don't seem like a good application for thin-clients. It's so much faster and cheaper to send a compact description of the world state instead of sending an entire visualization. Mobile phones could be an exception, since they are low-res and have limited graphics acceleration, but they also have limited and expensive network bandwidth.

reedsays...

Yeah, "cinematic" effects make it look "realistic". Here, "live" apparently means "prerendered video streamed live" (or prerendered textures baked onto simple geometry, with that rendered live), and "the cloud" likely means "some servers".

But it does look pretty good and the movement looks really smooth.

cybrbeastsays...

>> ^rychan:
And the biggest show-stopper is latency. You can't have any form of client-side prediction, which is what hides latency in online games, and your latency is increased by having to go through a video compression step. Best case, I imagine 1 second of latency, and that just isn't fun to play with.


The article about the server-side render engine claims: "As for lag, he experiences 12-17 milliseconds on the west coast (where his current test server is located) and 100 ms in Japan."

rychansays...

Well those numbers are surprisingly good. I wonder if that's the one-way trip they're timing. The user experiences latency as both the outbound and return trip. Anyway 12-17ms you'd hardly feel, 100ms would be annoying but maybe survivable, 200ms I couldn't tolerate at all but maybe some people could.

Remember this isn't like latency in World of Warcraft. If you have a 500ms ping in WoW (which I frequently do), your client still responds instantly to character movement, inventory browsing and many other things. If you're playing two characters side by side it's surprising how far the clients can be out of sync with the server, but it works for the most part.

I don't really believe those numbers, I must say. The internet round trip alone, in the best-case scenario, will cause that much latency. Streaming video would have to add more latency on top.

I mean, the most important element of video compression is motion estimation: taking advantage of temporally redundant data. That's not possible here. You don't know how quickly the client will be rotating their camera for the next frame. It seems like the compression would be pretty poor because of this (nearly as inefficient as single-image compression, with some efficiency gained from re-using codebooks/palettes from previous frames where possible), and the bandwidth required very high, and that's just more opportunity for stuttering because of dropped or delayed frames. And each frame has to travel independently. You can't group frames and compress them together, or you're adding huge amounts of latency.
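A toy comparison shows how much the bandwidth hinges on temporal prediction working. The compression ratios below are rough illustrative assumptions, not measurements:

```python
# Toy comparison of streaming bitrate with and without effective motion
# estimation. The compression ratios are illustrative assumptions only.
width, height, fps = 1280, 720, 30
raw_bits_per_frame = width * height * 24      # uncompressed 24-bit RGB

intra_only_ratio = 1 / 20     # every frame coded alone, JPEG-like (assumed)
inter_ratio = 1 / 100         # temporal prediction working well (assumed)

intra_mbps = raw_bits_per_frame * intra_only_ratio * fps / 1e6
inter_mbps = raw_bits_per_frame * inter_ratio * fps / 1e6
print(f"intra-only: {intra_mbps:.1f} Mbps vs inter: {inter_mbps:.1f} Mbps")
```

If unpredictable camera motion pushes the encoder toward the intra-only end, the required bitrate multiplies accordingly.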

I'll believe it when I see it working with low latency and high resolution in a large scale deployment.

mechadeathsays...

I also work in the game industry and am very familiar with current and near-future rendering technology, and I am certain that this video is a hoax, a pitch, or that the gentleman is misinformed about what can be done server-side.

Whenever you hear a designer/producer saying "you can do anything you want in this game blah blah blah..." they are lying and quite possibly incompetent. I've seen a lot of these promises before: every feature of the game is supposedly infinite, with no real basis in what's possible or how to do it.

Assuming this is actually an interactive environment in the video, the biggest reason this can't happen is that the rendering strategy is dead wrong. This is like the opposite of BitTorrent: instead of the clients spreading the rendering load, the server's processing and bandwidth are slammed. The farms that run WoW servers can barely keep track of player positions and data validation, let alone render "hundreds of animated characters on screen at once with full raytraced reflections in realtime." This would be a neat feat for a server to do in realtime for ONE offsite client, let alone thousands.

I long for the day when this is possible, but what we have here isn't even smoke and mirrors, it's just bullshot.

xxovercastxxsays...

Definitely some bullshit here. Even a gigabit connection doesn't compare to bus speeds on basic video cards, let alone top-of-the-line 3D cards, and I don't know anyone with a gigabit internet connection. There's no way someone is rendering photo-realistic, real-time, interactive 3D over the wire.

Mind you most of this isn't even close to photo-realistic, but I still don't buy it.

spoco2says...

Absolute shit.

I'm not even going to bother looking into this.

* The visual style changes multiple times through the 'demo'
* The aspect ratio of the image changes multiple times
* There's no coherent runthrough of anything, because they've just grabbed a host of clips and bunged them together.
* The logic of having 1000 people connected and therefore having to do 3D rendering of those 1000 people's views is utter, utter shite. It's hard to do this sort of rendering on a PC for ONE person with a friggen top-end graphics card burning away under the hood; how the HELL are you going to do it for all 1000 clients?

I so hate people who invent shit like this and present it as fact.

F*ck them with a drainpipe.

jmdsays...

Meh, I had an explanation of everything but it was way too long. In short, this isn't fake, but it IS most likely a demo produced with the intent of selling his idea to a bigger company.

Aside from the massive amounts of memory needed to keep most of the world "on demand" for thousands of users, rendering 320x240 windows isn't too hard (think YouTube, people. Before YouTube, people thought free movie-streaming sites that people could upload movies to would be impossible to support. It isn't impossible; you just have to customize your method of streaming to suit low bitrates). I mean, take a look at your video card: the only reasons we want monster cards are higher resolutions, higher AA (which is achieved by rendering at larger resolutions than what your monitor displays), and more memory for the textures. Reduce your window to 320x240 and your only issue left is memory.

The camera's movements are exactly those of a mouse flying through a virtual world, and the world exhibits many of the common flaws of LOD rendering that we encounter today in virtual worlds:

a) When zooming in on skyscrapers quickly, you can see a reflection texture that isn't rendered at a distance get applied. This shouldn't happen in pre-renders, where LOD is not applied nor needed.

b) Many outdoor scenes get caught with low-resolution shadow maps, especially the mall floor. This is an optimization for real-time rendering and would not show up in pre-rendering.

c) When high up in the air looking down, roadways appear sparse and buildings featureless, much like in GTA4 when you do the same. Realtime LOD rendering removes a lot of detail to keep frame rates sane; pre-rendering would not need this.

spoco2says...

>> ^jmd:
Meh, I had an explanation of everything but it was way too long. In short, this isn't fake, but it IS most likely a demo produced with the intent of selling his idea to a bigger company.


Oh you're so going to have to eat all those words... this is faker than a fake thing covered in fake stuff and served in a fake store.

Look at the first part of this video. WHY would you have depth-of-field changes (focus pulling) unless it's a pre-rendered demo from a CG artist, which it is?

Then go to OTOY's website (the 'tech' behind this), look at their screenshots of environments, and notice how much lower quality they are than many shots in this patched-together collection of video from all over the place.

Then read that it requires a plugin, whereas this video states 'no plugin required', which raises the question: how do you get any sort of interactive environment on a phone, server-generated or not, without some sort of plugin? Going to render it in HTML?

Sure, there are those looking at server-side rendering, but the graphics shown in this demo are NOT server-side-rendered realtime images that can be created for x thousand people all using the system at once.

Saying that they hired him because they use more of his video in this than just the opening is madness; it just shows they stole MORE than just that opening render of his. They ALSO stole the renders he did of the Mustang and the truck.

Also, Cityspace exists as another company, but then it's tagged as Liveplace. Liveplace exists as a website with... shock, NOTHING on it other than an email harvester.

Such a steaming load of crappy fake crap.

Just ask yourself, if it wasn't fake at all, WHY have the clips from the guy's renderings in there at all? WHY?

And some of the stuff in there is just previous OTOY material..

And then there are a collection of still renders thrown in there for no good reason.

Argh... Stop believing everything you see, people.

jmdsays...

Spoco, uhmm... does Flash count as a plugin? 'Cause yeah, Flash could do it. Although maybe you wouldn't need Flash... web cams have supported HTML controls for panning/tilting for many years now, although it's not nearly as graceful as you would like.

Why are there point-of-view changes? Uhmm... why not? MOST rendering engines of this caliber support such features; it's practically done in hardware. Would you normally, in this type of application? No, not really, but it's pretty clear the controls were not "final" in any way. In fact there was no interface, which probably means he's using a bunch of developer functions to move around. This also gave him easy access to adjusting light sources.

I'm not saying this is going to be a product. This vid could easily be a couple of years old. It obviously has no user interface yet, and the avatars themselves have had little to no work done on them. Clearly an early alpha stage, if not still a working concept.

However, what I see in the video looks a lot like what we see in today's large-scale virtual worlds, complete with the rendering nuances we still encounter. Also, the movie DOES show both still shots and motion shots, and the still shots DO look far more detailed; those most likely ARE high-quality pre-rendered passes. But the fact that the still shots and the motion video have such a stark change in quality should be further proof the motion footage is rendered in real time.

spoco2says...

>> ^jmd:
Spoco, uhmm... does Flash count as a plugin? 'Cause yeah, Flash could do it. Although maybe you wouldn't need Flash... web cams have supported HTML controls for panning/tilting for many years now, although it's not nearly as graceful as you would like.

I definitely count Flash as a plugin if they're talking about doing this on phones. Yes, you can get Flash for phones, but it can't be assumed to be installed. Also, you can't say that it doesn't need a plugin if you need Flash; that is a plugin, just one with huge penetration (tee hee).


Why are there point-of-view changes? Uhmm... why not? MOST rendering engines of this caliber support such features; it's practically done in hardware. Would you normally, in this type of application? No, not really, but it's pretty clear the controls were not "final" in any way. In fact there was no interface, which probably means he's using a bunch of developer functions to move around. This also gave him easy access to adjusting light sources.

But you're missing the point that there's NO consistency between the various bits of video. Some have full-on HDR, bloom, focus effects, high detail, the works... these are the pre-rendered bits that took artists a seriously long time to create and then a seriously long time to render. Then there are some still shots, for no good reason other than to try to show some more detailed environments that a system like this could never hope to produce.

Check out OTOY's website itself... it shows the sort of quality they currently claim to have with their tech, which is pretty nice in itself, but NOTHING like the pre-rendered portions of this vid.

To say that current online worlds already look somewhat like this ignores the fact that all of those worlds are rendered by EACH person's computer... the rendering is handled one to one, one PC to one user. The servers just need to keep track of where everyone is and what they're doing, and they need HUGE numbers of servers to handle just that.

Now, put on top of all of that the need for the servers to ALSO render scenes 'server side', and it really doesn't sound very plausible.

Now, I just tried to run the OTOY demo 'city' (which I'm guessing most of this Liveplace thing is taken from), and in Firefox the page did nothing, while in IE it tried to do an ActiveX install and then failed (running Vista here).

And on top of that, the info pdf states

OTOY 2.0 is currently in beta testing and is scheduled for deployment
in Q2 2006.


So... yeah... kinda a long way out of date there.

Right at the end of the pdf it states:
virtual runtime tools - enables OTOY content to stream onto thin
clients using either AJAX, Flash or Java - no binary plug-in required.


So, Flash and Java are plugins, and there's no way you're getting scenes anywhere near the first part of this video via AJAX...

However the mustard is cut, it's going to require horsepower on the user's machine.

In that same pdf:
cinematic rendering using 3D hardware on systems with Shader Model 2
cards (i.e. DX9 cards introduced in 2002 or later), including Intel
integrated graphics chipsets which account for nearly half of 3D
hardware enabled systems. (All screen shots in the gallery are direct
captures of the engine rendering in real time with no post processing, using shader model


i.e., the machine itself needs the processing power, not the server.

dagsays...


It may be fake, but I think we're looking at the future here. To me it makes a lot of sense to render server-side and just pump video to the end user.

Yes, yes, the inefficiencies of rendering in real time for 5,000 users at once seem prohibitive, but I'm sure there are a lot of savings to be made depending on how the software is designed. E.g., if you are rendering objects in a room that are being viewed by two people with almost the same perspective, then just render the objects once and pump the result to everyone viewing from approximately the same perspective.

Please note, IANASE and I'm completely talking out of my ass. But this stuff still gives me hot pants.

cybrbeastsays...

spoco, here you can see a live demo of the software running in a browser. It's still not completely smooth, but it's definitely working. BUT apparently it uses 3 GPUs to make these images, so a Second Life-type application would be very expensive with today's technology. But playing super-beautiful games on a low-end computer might be possible. Consider this: if you have an expensive GPU, you only use it when you play games; the rest of the time it's pretty much idle. Three GPUs could then provide content for a lot of users, provided they don't all play at the same time. With a moderate subscription fee, or a pay-by-the-hour fee, I think this would be possible. With further scaling of technology, maybe Cell-like super-multi-core processing, the costs could come down significantly.
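Putting that sharing argument into rough numbers, where every figure (GPU price, utilization, service lifetime) is a made-up assumption for illustration, not anything from the demo:

```python
# Rough GPU-sharing economics. Every figure below is an assumption
# made purely for illustration.
gpu_price = 400.0              # dollars per server GPU (assumed)
gpus_per_session = 3           # as reported for the demo
utilization = 0.2              # fraction of time a GPU is earning (assumed)
lifetime_hours = 2 * 365 * 24  # two years of service (assumed)

paid_hours = lifetime_hours * utilization
cost_per_play_hour = gpu_price * gpus_per_session / paid_hours
print(f"hardware cost: ${cost_per_play_hour:.2f} per play-hour")
```

Under these assumptions the raw hardware cost per play-hour is small; the open question is whether bandwidth and the rest of the datacenter scale as kindly.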

If I had a low-end machine, I wouldn't mind paying to play a beautiful version of, say, GTA without having to spend $1500 upgrading my computer. And having to upgrade again a year later, etc.

I wouldn't exactly classify this video as lies, but as very optimistic and some years into the future. It needs a lot of development, but I long for the day that I don't have to upgrade to play the newest games.

jmdsays...

spoco, does the demo go online? The only reason I can think of for requiring any 3D hardware is if it's rendering locally, in which case it's demoing content, not the actual application.

And again, on the server processing multiple users: the tricky part is what will render the graphics. I could possibly see an engine optimised for rendering 320x240 windows, or maybe using actual 3D hardware and copying the frame buffer out (the video card would ultimately need to perform fast enough to handle multiple users; one user per video card is not even close to economical). Other than that, you just need a lot of memory, because you will have multiple users calling high-resolution textures from all over. CPU-wise, aside from video rendering, it will not be stressed by the number of users in any way. Aside from tracking the user, no computations really need to be made for things like AI (there is no AI) or server-side collision detection (walking through walls ftw); any non-video use of the server CPU is very minimal.

ponceleonsays...

Yeah, I smell dead fish. The guy is narrating through it as if it were one presentation, but it's really disjointed and looks like it came from multiple sources... also, WTF is up with the constantly changing focus? Yeah, I get it, this "out of focus" look is all popular and shit in recent games, but this used WAY too much of that effect.

Oh, and shaky-cam = shite. I don't know which director first decided that shaky-cam is a good method to film something, but I hope you got herpes.

NordlichReitersays...

I've done some work in DirectX and rendering to screen. Every time something happens, the buffers have to update. That's a huge world... that needs to update for every user.

That's why World of Warcraft has that cartoon texture, and even then it may or may not work well. In everything I have played or researched, for college papers on the industry, rendering is done on the client side, with the information for the world stored in database files. To do this on the server side is going to require a massive connection and a huge bank of hardware. That is also why it is so easy to hack client-server applications, including games; Greg Hoglund and Gary McGraw go into detail about this in their book Exploiting Online Games.

I find it hard to believe that they can do this with no latency over a horrible commercial internet connection. This seems a lot like the Loch Ness monster, or Iraq having WMDs.

Even in Unreal, all the meshes and textures were stored as ASCII files that held location points for the objects, and the rest was rendered on the user's computer; the goal was to make all objects as low-poly as possible, because once the lighting and geometry are added, things start to slow down. That's why they came up with new ways to map textures so that they wrap around the geometry like bubble wrap, as seen in Gears of War.

In short (not short): it is not best practice to render from the server; the server is just there to facilitate communications. That is what a client-server system does: the clients talk to each other through the server, with the simplest of messages.

Article about Otoy:
http://www.ubergizmo.com/15/archives/2008/07/otoy_serverside_3d_rendering_is_taking_the_wrong_path.html

mechadeathsays...

The one thing that I find amazing is that there are multiple references to co-development with AMD in the text and in his other videos... I wonder if anyone can verify this, as it sounds like the only plausible aspect of the story, which raises the question: why would AMD associate with this type of company?

12694says...

A lot of the assumptions being made here miss the point that this rendering is based on ray tracing, that it is server-side, and that all that is being streamed is video (negating the need for a hefty client: if you can watch YouTube on it, you can "play" this virtual world on it), and that there are definitely some tricks you can pull based on all of this to make it less problematic than many of you think.

They're raytracing, which means rendering a given scene (say, the bar) for one user costs a certain amount, but adding other viewpoints doesn't cost much more, as all you're doing is adding points of view. The scene itself is already being drawn from multiple angles, because you have to in order to bounce the rays properly; so if you're rendering a chair, for example, it doesn't cost much more to output two POVs than one.

In this way, what you have is a situation where you're rendering the world for 1+n users, where n is the number of extra people who are interacting in a given frame of reference with each other.
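That "1+n" claim can be sketched as a simple cost model. The constants here are invented purely to illustrate the argument, not taken from OTOY, and the argument itself is contested elsewhere in this thread:

```python
# Simple cost model for the "1+n viewpoints" claim: one shared scene pass
# plus a marginal per-viewpoint cost. Constants are invented to illustrate
# the argument, not taken from OTOY.
def frame_cost(viewers, scene_cost=100.0, per_view_cost=10.0):
    """Cost of rendering one frame of a shared scene for `viewers` users."""
    return scene_cost + per_view_cost * viewers

print(frame_cost(1))    # 110.0
print(frame_cost(50))   # 600.0, far less than 50 separate renders (5500.0)
```

The whole dispute is over whether per_view_cost really is small relative to scene_cost; if each viewpoint's primary rays dominate, the sharing advantage mostly disappears.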

Add on top, this is obviously a prototype, probably intended to help generate buzz and/or investment capital, and you have what we see here. Its an interesting idea and while I don't think we've seen the end of client based rendering, nor will we for a very long time, this could end up being very popular for a certain niche of virtual world or augmented reality applications.

It's also possible I'm wrong.

spoco2says...

I do just LOVE all these people saying how server-side rendering sounds really good, and wow, what a great idea... and then going on to say "I have no experience at all in this, I don't understand any of the issues, I don't know anything about render paths".

The best anyone can come up with is that if you render a tiny 320x240 window it doesn't take much processing. This has two problems:
1) Rendering the actual number of pixels onto the screen is but one part of the issue; all the rest still has to be done (object placement, building, physics, collision detection, shadows, etc.). Rendering x number of pixels is but one small part of the equation.
2) Who the crap wants to actually play a game at that resolution in this day and age? Are we not in the HD era now?

Really, when a quad-core PC with SLI graphics cards, with x cores on each of them, still has trouble running a game at the quality we want these days, FOR ONE PERSON, how the hell do you expect servers to do it for thousands of people? And if you try the "well, you only have to do the physics etc. once for everyone" argument, think about it: in a one-person, one-PC world, you only have to do the physics within the view of that player. Are you suggesting it's going to be easy for a server (or servers) to do the physics for the ENTIRE world at any given time?

Yeah...

Dreamin' for any sort of scale of world with any large scale of people connecting to it in any sort of resolution you'll actually want to play in.

spoco2says...

And just because I like having people in the industry back up what I've been saying, we have here. Then if you watch Jules (the guy behind OTOY) doing a demo of OTOY, LISTEN to the bandwidth requirements: "If you have a 6Mb connection we can go to full HD and render that"... OK, so they're going to stream 6Mbps of video to every client connected? With just 1,000 people connected... JUST 1,000 clients... you're talking 6Gbps of bandwidth required out of that host. 6 GIGAbits per second, for god's sake. This DOES NOT SCALE.
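That arithmetic, spelled out as a sketch (taking the quoted 6Mbps-per-HD-stream figure at face value; the user counts are arbitrary):

```python
# Linear bandwidth scaling at the quoted 6 Mbps per full-HD stream.
per_stream_mbps = 6
for users in (1_000, 10_000, 100_000):
    gbps = users * per_stream_mbps / 1000
    print(f"{users:>7} users -> {gbps:.0f} Gbps out of the host")
```

Since each viewer needs a unique stream, there is no multicast trick to share the cost; egress grows strictly linearly with concurrent users.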

And this is supposedly more logical than the tiny amounts of bandwidth users need today to play their online games in full HD using local rendering.

OOOOKKKKK.

That demo shows more of where they currently are, but what does it show? A SMALL little area, with a few buildings and two 3D models, running in a SMALL window on the monitor, over a HUGE-bandwidth internet connection. And that's running on 3 7700 video cards on their servers... HOW IS THIS GOING TO SCALE?

Yes, this has some uses for specific cases in movie production or the like, where you want to show ONE or TWO people some 3D models on computers with bandwidth dribbling out of your bum. But in the REAL world, with REAL numbers of people connecting to a 3D world? YEAH. RIGHT. I mean, just watch that and see how little he can do with it: he has a TINY world where he can change the lighting (and how many times does he do that), move around, and make the Optimus model do its animation routine. So, OTOY exists indeed, and it can do basically what they say it can do, but on a MUCH, MUCH, MUCH smaller scale than is implied in the video.

(It does show, though, that they really do have that guy on staff, or bought those 3D models/animations from him, seeing as they are in the "personal" section of his website, where he says he did them for himself.)

Also, I got the OTOY demo running in IE 7 on XP, and indeed it's exactly as expected: it's NOT doing the rendering server-side at all, but rendering in the browser, CLIENT SIDE (you can watch your bandwidth usage doing NOTHING while you're using it). And it's a demo with quite a limited area, a low-resolution window, low-resolution textures, pre-rendered shadows, etc. (NOT realtime)... this is basically just an in-browser 3D plugin, of which there are already plenty around, and ones that do a damn sight better job than this. "Cinematic quality 3D rendering" my arse.


Am I a little obsessed by this? A bit... I hate stuff like this being thrown out there as truth, and people eating it up as fact and getting all pant-wetty about their puny Pentium 4 machines being able to play games that look like they were rendered on a top-of-the-line machine.

Well, yes, you could, if you have 6Mbps of bandwidth to spare, and are the only person connected to the servers, and you're happy playing in a world the size of a football field... then yeah, go for it, enjoy your game.

This is aimed at the movie industry (with which they seem to have ties, through ILM and a thing called LightStage for capturing objects in 3D), and it surely has applications there, but for it to scale to the number of people who play online games... well, it's madness.

zeoverlordsays...

So I was playing Saints Row: The Third earlier today, and save for some reflections and the quality of some of the textures and lighting, it's not far off. In fact I'm pretty sure next-generation consoles will be able to reproduce most of this and more without modification (reflections are still tricky, but you can fake those).
