
AlexGlass


704 Forum Posts, 5 Wiki Points, 8 Following, 4 Followers

AlexGlass's forum posts


Fuck resolution. Noise is the topic of the future! Although I have a feeling some people will not understand what's going on here.

In short, this is a real-time path tracing engine running at 720p and 30fps that has been in development for roughly five years. To me and others who understand what they have achieved, this is absolutely astonishing. You will see noise as the path tracer converges toward a clean image. Despite the resolution, this engine crushes Titans.
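The grain-then-clean-up behavior described here is just Monte Carlo estimation at work. A toy sketch of the principle (not Brigade's code; the `trace_path` stand-in below is purely illustrative):

```python
import random

def estimate_pixel(trace_path, n_samples):
    # A path tracer's pixel value is the average of many random path samples.
    return sum(trace_path() for _ in range(n_samples)) / n_samples

def trace_path():
    # Stand-in for tracing one random light path. The pixel's true
    # radiance here is 0.5, but each individual path returns 0 or 1,
    # so low sample counts give visibly noisy estimates.
    return 1.0 if random.random() < 0.5 else 0.0

random.seed(1)
# Error shrinks roughly as 1/sqrt(N): quadrupling the samples only
# halves the noise, which is why real-time convergence looks gradual.
rough = abs(estimate_pixel(trace_path, 16) - 0.5)    # a frame or two
clean = abs(estimate_pixel(trace_path, 4096) - 0.5)  # after accumulation
```

That 1/sqrt(N) rate is why a path tracer can look grainy at 30fps yet converge to a clean frame in under a second when the camera stops: stationary frames let samples accumulate.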

And yes, true next-gen will actually start at 720p. Despite that, once we have GPUs that can clear up this noise, games will look more clear and pristine, more technically impressive, and more realistic than anything we're currently accustomed to, even at much higher resolutions. You will get higher geometry counts than you have ever seen in anything other than CG. You will get lighting effects previously possible only in CG. And you will get all the materials that devs currently shy away from showing in video games because they generally look bad: mirrors, glass, water and other fluids, transparency, shiny surfaces, fully dynamic lighting, indirect lighting, etc. Standard.

Brigade 3

Time for an update on Brigade 3 and what we've been working on. Until now, we have mostly shown scenes with limited materials, i.e. either perfectly diffuse or perfectly specular surfaces. The reason we didn't show any glossy (blurry) reflections so far is that they generate a lot of extra noise and fireflies (overbright pixels), and the glossy material from Brigade 2 was far from perfect. Over the past months, we have completely replaced Brigade's material system with the one from OctaneRender, which contains an extraordinarily fast-converging and good-looking glossy material. The sky system was also replaced with a custom physical sky where sky and sun color vary with the sun position.

We've had a lot of trouble finding a good way to present the face-melting awesomeness that is Brigade 3 in video form, and we've tried both YouTube and Vimeo at different upload resolutions and sample counts (samples per pixel). Suffice it to say that both sites have ultra shitty video compression, turning all our videos into a blocky mess (although Vimeo is still much better than YT). We also decided to go nuts on glossy materials and Fresnel on every surface in this scene, which makes everything look a lot more realistic (in particular Fresnel, which causes surfaces to look more or less reflective depending on the viewing angle), but the downside of this extra realism is a lot of extra noise.
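The view-angle-dependent reflectivity mentioned above is usually modeled in real-time renderers with Schlick's approximation to the Fresnel equations. A minimal sketch (illustrative only, not Brigade's actual shader code):

```python
def schlick_fresnel(cos_theta, f0):
    # cos_theta: cosine of the angle between the view ray and the surface
    # normal; f0: reflectance at normal incidence (~0.04 for glass/plastic).
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Looking straight at a glass surface, only ~4% of light reflects...
head_on = schlick_fresnel(1.0, 0.04)
# ...but at a grazing angle the same surface acts almost like a mirror,
# which is exactly what makes surfaces look "more or less reflective
# depending on the viewing angle."
grazing = schlick_fresnel(0.05, 0.04)
```

The extra noise he mentions follows directly: those bright grazing-angle reflections add high-variance light paths that take many more samples to average out.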


The scene in the video is the very reason why I started this blog, and it's depicted in one of my very first blog posts from 2008 (see http://raytracey.blogspot.co.nz/2008/08/ruby-demo.html). The scene was created by Big Lazy Robot for a real-time tech demo for ATI's Radeon HD 4870 GPU. Back then, the scene used baked lightmaps rendered with V-Ray for the diffuse lighting and an approximate real-time ray tracing technique for reflective surfaces like cars and building windows. Today, more than five years later, we can render the same scene noise-free using brute force path tracing on the GPU in less than half a second, and we can navigate the entire scene at 30 fps with a bit of noise (mostly apparent in shadowy areas). When I started this blog, my dream was to render that specific scene fully in real time at photoreal quality, and I'm really glad I've come very close to that goal. We plan to show more videos of Brigade 3 soon, so stay tuned...

Now we just need to demand that AMD and Nvidia stop putting out rasterization-focused GPUs at $700+ and actually make a GPU that can clean up this noise. One more GPU generation, two tops, should do it, so late 2014 to early 2015. I have a feeling Maxwell will go this route. If they target compute rather than rasterized graphics, it could take just one generation, and the Maxwell cards might be the first to do the trick.


#3  Edited By AlexGlass

@tourgen said:

Well, resolution is a quick metric of the processing power of a box these days; programmable shader pipelines & GPGPU and all that. So if it doesn't have the power to do 1080p reliably, it won't have the power to do any of those other things you seem to think are more important than resolution.

It depends on which areas of your GPU are used for which effects. Resolution taxes one specific area.

It's not at all unusual for the same power budget to produce either a lower-polygon game running a weaker physics engine at 30fps but with dynamic global illumination at 1080p, or a game running at 60fps with far more polygons at 720p but without dynamic global illumination. In that case you can't draw a direct comparison at all. Resolution means absolutely nothing: even if it were the same, the frame rate, polygon count, and physics are not, so you still could not compare the two directly.

There is really no way to tell whether, for example, Driveclub with a GI lighting engine at 30fps is more technically demanding than Forza 5 at 60fps with shadow maps. You can't draw a direct comparison, and that still ignores a whole bunch of other areas: polygon count, physics, particle effects.

Or you can have a game running at 720p at the same frame rate as a 1080p game, where the 720p game has room for improvements in other areas of its graphics.

If you double your polygon count, or the objects on screen, then your dynamic global illumination engine is going to take a massive hit on frame rate, because those lights and shadows now have to be calculated for twice the geometry.

It's ALL relative. So to take one aspect and use it as the metric means you'd better be sure the game is exactly the same in every OTHER area. And resolution is just the clarity aspect of graphics.


#4  Edited By AlexGlass

@trafalgarlaw said:

@alexglass said:

@trafalgarlaw said:

Damage controlling the fact that xbox one games are going to be 720p or 900p already? Melting down are we?

No damage control, just frustration. No doubt this prompted me to write this, but it's been festering for a long time, and it's a general disease, not just in relation to next-gen consoles.

A topic like that sets the internet aflame, with everyone losing their shit and giving resolution so much importance, and not just now but for years.

Then here's me posting about something like Brigade, which is doing real-time path tracing, something that will actually have A MAJOR impact on graphics and games, and the general response is people scratching their heads going, "duh, I don't get it."

It's just plain ignorant. It's absolutely mind boggling to me.

Calm down. Forza 5, albeit in 1080p, has prebaked light sources and no real-time day-night cycles. That's when you know Xbox One is in trouble, not even impressing in ways other than resolution. I'm always down for tech talk but resolution always plays an important role.

Oh, it does, does it? Fantastic.

Now answer me this. Is a game running at 30fps with a dynamic global illumination engine at 1080p more impressive than a game running a shadow-map-driven lighting engine at 60fps? Is it more power hungry? More technically demanding? What if it's pushing fewer triangles? What if it's weak on physics?

Care to tell me how many triangles per frame it's pushing? How about Call of Duty? How many objects? How about BF4? How about Assassin's Creed?

You just proved the ignorance I'm talking about. You took ONE aspect and came to an EXTREMELY FLAWED conclusion. This is exactly the kind of stupid, ignorant comparison taking place everywhere, and such comparisons truly are ignorant, meaningless, and wrong in every possible way.


#5  Edited By AlexGlass

@amonkey said:

I'm not sure I follow your logic. 1080p should really be the industry standard by now, since it's somewhat of a bottleneck for nice-looking games. On PC I feel that 1920 is as high as it needs to go, though.

My logic is that nobody talks about anything else anymore. It's as if gamers are completely freaking ignorant of all the things that make a game look good.

It's ONE aspect. Geometry count, lighting, animation, physics, and particle effects are no less important than final resolution. But nobody talks about them anymore. Nobody.


#6  Edited By AlexGlass

@trafalgarlaw said:

Damage controlling the fact that xbox one games are going to be 720p or 900p already? Melting down are we?

No damage control, just frustration. No doubt this prompted me to write this, but it's been festering for a long time, and it's a general disease, not just in relation to next-gen consoles.

A topic like that sets the internet aflame, with everyone losing their shit and giving resolution so much importance, and not just now but for years.

Then here's me posting about something like Brigade, which is doing real-time path tracing, something that will actually have A MAJOR impact on graphics and games, and the general response is people scratching their heads going, "duh, I don't get it."

It's just plain ignorant. It's absolutely mind boggling to me.


First of all, I want to say that I am a HUGE fan of graphics techniques, envelope-pushing engines, power, and everything that makes a game look gorgeous. I research it on a regular basis. I look at the latest techniques, the latest advancements in GPU tech, and the latest up-and-coming engines, from polygons to point cloud data to voxels to ray tracing engines. I LOVE all of this stuff. Absolutely love it.

Having said that, there is something that has been driving me crazy for years, and that is this obsession with resolution, partly driven by the PC crowd and by manufacturers' desire to sell high-end GPUs without major upgrades. I really feel this has made the gaming world a stupid place, an ignorant place, or BOTH when it comes to judging graphics and power.

It's beyond ignorant, so ignorant that gamers have forgotten what actually makes up graphics. I don't even know if resolution belongs in the top 5 categories, yet nobody ever freaking talks about anything else anymore.

How about polygon count, or geometry in general? Is a 1080p game running 2 million triangles per scene more technically impressive than a 720p game running 10 million (with everything else being the same)? What do you get when you look at that 2 million triangle game at 1080p? A CLEARER view of a lower-polygon game. Characters look less rounded. Fewer objects. Less detail in every other aspect of the game. The only thing you gain is a clearer picture. Does that aspect alone represent a better-looking game? You have a clearer view of a shittier-looking game.
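The arithmetic behind this trade-off is worth spelling out: 1080p pushes 2.25 times the pixels of 720p, so at a fixed GPU budget that factor has to come out of something else, whether geometry, lighting, or frame rate. A quick sanity check:

```python
# Per-frame pixel counts for the two common targets.
pixels_720p = 1280 * 720      # 921,600 pixels
pixels_1080p = 1920 * 1080    # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p  # 2.25

# At a fixed shading budget, a 720p target leaves 2.25x the per-pixel
# work available to spend on triangles, lighting, or effects instead.
```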

Have gamers forgotten about this?

How about the lighting engine? Is a 1080p game running a shadow-map-driven lighting engine better looking or more impressive than a 720p game running fully dynamic global illumination? I would argue that the real-time path-traced games at 720p which will soon start popping up are going to wipe the floor with rasterized games even at 4K! I would take a game running an engine like Brigade at 720p, without noise, any day of the week before a rasterized game at 4K. That Brigade game is likely going to be more impressive in every single aspect and will look better overall.

How about physics?

How about simultaneous enemies on screen?

How about draw distance?

What about animations?

Oh, and here's a thought: how about actual texture asset resolution? Do you guys ever pay attention to this? You know, the actual textures that get UV-mapped onto your characters and objects in the game. What is the resolution of THOSE textures? Are they all at the minimum resolution necessary to even support the game's output resolution? That matters just as much. If your game runs at native 4K but your assets are textures authored for 1080p, do you think you truly have a "native 4K game"?
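One rough way to quantify this (hypothetical numbers, just to illustrate the bookkeeping) is to compare a texture's texel count against the screen area it covers; below about one texel per pixel, the display resolution has outrun the asset:

```python
def texels_per_pixel(tex_w, tex_h, screen_w, screen_h):
    # Assumes the texture is stretched over the given screen area exactly
    # once (no tiling, no perspective); real engines measure this per pixel.
    return (tex_w * tex_h) / (screen_w * screen_h)

# A 2048x2048 texture filling a 4K frame: each texel is smeared across
# roughly two screen pixels, so "native 4K" output shows 2K-class detail.
density_4k = texels_per_pixel(2048, 2048, 3840, 2160)

# The same asset at 720p is oversampled, with texels to spare.
density_720 = texels_per_pixel(2048, 2048, 1280, 720)
```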

What about the things that actually, you know, make up your graphics? Not just the screen filter through which you see them, because that's all resolution is: a clarity measurement. Not the actual objects, 3D geometry, effects, or any of the graphical components that are far more important. It drives me nuts that the dominating topic in graphics, or power, is resolution.

And here's an interesting idea for those who are actually interested in the future of graphics: the best-looking games of the future will be the ones that don't rely on traditional image textures at all, at any resolution. Textures are static. They will never scale properly to higher resolutions, and they don't contain true 3D data that can be manipulated or played with in a video game. You will know graphics has reached its holy grail when you can throw textures as you know them out the window and add that detail through actual 3D geometry or 100% procedural textures. Then it can actually scale properly. Detail sculpted out of geometry can cast shadows properly within a dynamic global illumination engine, reflect things, and be taken into account by a physics engine.

If you plan on buying that 4K TV and a GPU stout enough to actually drive it, but the game is covered in textures meant to be displayed at 1080p, then scaling it up will most likely just reveal the flaws you couldn't notice before.
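The scaling argument can be made concrete: a procedural texture is a function evaluated per sample, so there is no source resolution to outrun. A toy sketch, with a checker pattern standing in for real procedural materials:

```python
import math

def procedural_checker(u, v, scale=8.0):
    # The "texture" is pure math over UV coordinates in [0, 1): sample it
    # at any density, for 720p or 8K, and it never pixelates or blurs.
    return 1.0 if (math.floor(u * scale) + math.floor(v * scale)) % 2 == 0 else 0.0

# The same function serves every output resolution; only the number of
# samples taken per frame changes, never the amount of source detail.
a = procedural_checker(0.03, 0.03)  # inside an "even" cell
b = procedural_checker(0.03, 0.15)  # one cell over
```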

For the love of god, stop this shit, internet! Just stop it! Resolution is not the be-all, end-all of graphics, or of what makes graphics look good or technically impressive. It really isn't, and you are missing the forest for the damn trees if you use it as the measuring stick for graphics or power. There are so many more exciting and more important aspects of graphics that should be discussed and are not. There are devs pushing gaming tech and graphics in areas that actually matter much more, and everyone is just busy counting freaking pixels.


#8  Edited By AlexGlass

Considering the ENTIRE internet had no clue K.I. ran at 720p until the devs stated it, or that Ryse ran at 900p upscaled until the devs stated it, not only do I find this recent obsession with resolution as the be-all, end-all of graphics and power ridiculous, but you would also think people wouldn't trust some guy who merely thinks he saw it running at 720p.

There were millions of people online analyzing screens for months, and I don't remember ONE person who came out and said, "Wait a minute, guys, it's NOT 1080p," BEFORE someone asked the devs and they confirmed it.

Does anyone else remember? Because I sure as hell don't. I remember Twitter confirmations. That's all.


@jgf said:

@giganteus said:

The only thing that sets off an alarm in my head about this possibly being true is Albert Penello giving a non-answer on Twitter. That's not a good sign when PR does that. I'm reminded of Driveclub's director (or whoever it was) doing the same thing just last week before it was delayed. So I'm not going to call bullshit, yet.

Either way, Ghosts won't look too stellar in either resolution.

Yeah, I also don't get this marketing-speak babble. If the rumor is not true, why not just confirm that it's native 1080p/60fps and call it a day? Instead we get a wall of text basically saying nothing. Or if you don't want to respond to rumor and speculation, don't respond at all. It's weird.

Guess we'll just have to wait and see who was right when next-gen hits the stores. It'll be fun watching all the rumors get debunked or confirmed. At least then we won't have to argue about this stuff anymore and can finally start gaming :)

Yeah, that assumes Albert has access or prior knowledge when it comes to all of this type of information. But if he doesn't, wouldn't he need to find out?

If they do know it's true and MS is just hiding it from its fan base in order to collect pre-orders, fans are not going to be very happy. They'd better have a good reason.

Another update:

Ok.

Well, fuck being coy, obviously this is the rumor I heard. It's really thin, guys. Really thin. I want two sources on anything I post here or talk about on my podcast, but this I really, really wanted two sources on. And there are a few wrinkles to it as well which are interesting (if they are true). I just cannot say whether it's true or not, and this is obviously a huge fucking deal. Sales are at stake. At this point I wouldn't be surprised if MS put a goddamn hit out on me.

Anyway, this is the rumor. And it's thin. It's fucking thin. To be completely honest, I don't believe it. I mean... it's just weird. Anyway: supposedly Activision is doing a controlled-environment review process that only huge-ass games can get away with. What I heard was you get to choose a version, either Xbox One or PS4, and you don't get to see the other one. You then go in and play it at their setup for a certain number of hours, and that's it. Someone who saw the Xbox One version claims it's 720p.

Did this person count pixels? Is this known info they are telling people? Is it upscaled? I don't know the answers to any of these questions. It doesn't even make sense to me... CoD is a pretty shitty-looking game; why in the hell could it not run at 1080p? I feel like if they took the time they could get the 360 version to 1080p if they wanted to... the engine is older than me. But that's what was said.

It's impossible to get a second source because everyone is, obviously, under NDA. No one wants to get blacklisted by Activision.

And now for my opinion....

Driveclub being delayed I was pretty damn certain about. It wasn't a flimsy rumor like this one. But the one thing I'm noticing here is that the non-responses are the same. I was nervous about the Driveclub thing because I was out here by myself with some random-ass site I'd never heard of (kudos to them for being right, though), and if the info ended up being wrong I was going to look like a huge asshole. Each non-answer given made me feel better about being right.

If the game is running at 1080p, there's no reason to give non-answers; that's not info you want to hide. So looking at it from this angle, with the rumor existing and non-answers being given, I feel like that gives the rumor some credence.

But... but... but...! I've been right about a lot of shit. I should never have teased this, because this rumor is fucking flimsy. More flimsy than "PS4 in October," which the entire internet believed. I'm not saying it's true. I expect 900p, to be honest. That's a guess; I don't know. I do know there is a lot of chatter about the Xbox struggling to hit 1080p.

Also, why the fuck not: Titanfall has been rumored to be targeting 720p on the Xbox One as well. I'm sure that probably came up in this thread already... but yeah, that rumor seems more solid than this CoD one. http://www.neogaf.com/forum/showpost.php?p=87095707&postcount=2179

So, a 1,000-word essay explaining that it's a weak rumor, that he doesn't know jack shit, and that he's starting another one.


@alexglass said:

@the_laughing_man said:

So is he more or less going, like, "trolololol, it's not true"?

Mumbo jumbo?

I don't know, but it's pretty ironic for him to talk about holding himself to higher journalism standards. Personally, I applaud the IGN mods for just locking threads with baseless NeoGAF rumors. I don't trust that their anonymous insider sources are any better than any other website's anonymous insider sources, the kind they used to ban people for posting. That's one thing I actually used to respect about GAF: you knew the news was legit. Now it's like they live to start rumors and spread them.

Supposedly this is what started all this.

http://i.imgur.com/eBn6ODI.png

Yeah, see, I don't get that. Put your name on it. If you have an inside source, you don't have to reveal it, but put your username on it, be clear, and stand by it. I think if you know you have this kind of popularity and this power to spread information, then it demands more responsibility and higher standards than they have shown this year.