Ray tracing is a nice step, but look at SOTTR and its 1080p 30fps demo. Yeah, I won't be using it until the tech and hardware mature, at any rate.
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
EVGA 1080TI SLI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 Pro x64 (Windows 7 dual boot)
Performance does look like it will be a problem to start. In principle, though, I think you could halve the number of rays calculated per eye in 3D; doubled across the two eyes, you'd be back to 2D performance with it still showing the effect. With the greater-than-2x perceptual bump you get from the different parallax in each eye, this would probably still look terrific.
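Back-of-the-envelope version of that idea, just to show the ray budget works out (the per-pixel ray count here is an invented, purely illustrative number):

[code]
#include <cstdio>

int main() {
    const long long pixels       = 2560LL * 1440;  // one 1440p frame
    const int       raysPerPixel = 2;              // hypothetical 2D budget

    // 2D: the full ray budget goes to the single view.
    const long long rays2D = pixels * raysPerPixel;

    // 3D idea: halve the per-eye budget, then render two eyes.
    const long long raysPerEye = pixels * (raysPerPixel / 2);
    const long long rays3D    = 2 * raysPerEye;

    std::printf("2D rays per frame: %lld\n", rays2D);
    std::printf("3D rays per frame: %lld (same total cost, effect in both eyes)\n", rays3D);
    return 0;
}
[/code]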
I'm not an expert, but it seems to me that there is no fundamental reason that ray tracing using the RTX here would not work with 3D Vision. The only real question for us is whether 3D Vision Automatic will pick it up and automatically force it to draw again for the 2nd eye.
From a technical perspective, the problems we run into getting games to work in 3D are because the developers use hacks/workarounds to get the effects they want, and those effects only work in 2D.
Typically this is for performance reasons. As you know, shadows are very often a problem in games, and that's because developers almost always use deferred shading nowadays, again for performance: if you add 'too many' lights to a scene, deferred shading has been the only way to get enough performance. The same is true of tiled lighting, where the idea is that lights far away from the tile you are working on won't have any impact on those pixels. Not strictly true, but as an approximation it's pretty good.
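As a toy illustration of the tile idea (a CPU-side sketch in plain C++, nothing engine-specific; the screen-space light radius is the usual simplification):

[code]
#include <algorithm>
#include <cstdio>
#include <vector>

// Simplified screen-space light: position and radius of influence, in pixels.
struct Light { float x, y, radius; };

int main() {
    const int width = 1920, height = 1080, tileSize = 16;
    std::vector<Light> lights = { {100, 100, 200}, {1800, 900, 150}, {960, 540, 50} };

    const int tilesX = (width + tileSize - 1) / tileSize;
    const int tilesY = (height + tileSize - 1) / tileSize;

    // For each tile, keep only the lights whose circle of influence touches it.
    // Distant lights are skipped entirely; that is the approximation that makes
    // "too many" lights affordable.
    std::vector<std::vector<int>> tileLights(tilesX * tilesY);
    for (int ty = 0; ty < tilesY; ++ty)
        for (int tx = 0; tx < tilesX; ++tx) {
            const float minX = float(tx * tileSize), maxX = minX + tileSize;
            const float minY = float(ty * tileSize), maxY = minY + tileSize;
            for (int i = 0; i < (int)lights.size(); ++i) {
                // Distance from the light to the closest point of the tile.
                const float cx = std::clamp(lights[i].x, minX, maxX);
                const float cy = std::clamp(lights[i].y, minY, maxY);
                const float dx = lights[i].x - cx, dy = lights[i].y - cy;
                if (dx * dx + dy * dy <= lights[i].radius * lights[i].radius)
                    tileLights[ty * tilesX + tx].push_back(i);
            }
        }

    // Shading then only loops over tileLights[] for the tile a pixel falls in.
    std::printf("Tile (0,0) shades %zu of %zu lights\n",
                tileLights[0].size(), lights.size());
    return 0;
}
[/code]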
Deferred shading breaks transparency effects in general, so there's another hack to work around that. And that's why doing things in 3D doesn't work well out of the box: all these hacks are in place with the sole goal of making a 2D image work.
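The usual workaround, as far as I know, is to pull transparent objects out of the deferred path entirely and blend them in a separate forward pass afterwards. A bare-bones sketch of that frame structure (stub functions only, no real graphics API calls):

[code]
#include <cstdio>
#include <vector>

struct Mesh { bool transparent; const char* name; };

// Stub passes, just to show the frame structure.
void deferredGeometryPass(const Mesh& m)   { std::printf("  G-buffer: %s\n", m.name); }
void deferredLightingPass()                { std::printf("  resolve lighting from G-buffer\n"); }
void forwardTransparentPass(const Mesh& m) { std::printf("  forward blend: %s\n", m.name); }

int main() {
    std::vector<Mesh> scene = { {false, "terrain"}, {false, "character"},
                                {true,  "glass"},   {true,  "smoke"} };

    // 1) Opaque geometry goes through the deferred path.
    std::printf("Deferred (opaque only):\n");
    for (const auto& m : scene) if (!m.transparent) deferredGeometryPass(m);
    deferredLightingPass();

    // 2) Transparent surfaces can't all be stored in a single G-buffer layer,
    //    so they get their own forward pass, blended on top afterwards.
    std::printf("Forward (transparent, typically sorted back to front):\n");
    for (const auto& m : scene) if (m.transparent) forwardTransparentPass(m);
    return 0;
}
[/code]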
Ray tracing flips that on its head. With rays, you calculate the exact path of lighting, and at least with this first gen, there won't be weird hacks like deferred shading. Quality will be more important than performance, at least to start.
So in principle at least, I would expect ray tracing to work in 3D out of the box, with no changes necessary, and no ShaderHacking required. That would be super nice for transparency effects and specular highlights, a couple of the hardest for us to solve.
But really, this will depend upon the 3D Vision team, and whether it is easy or hard for them to add it to 3D Vision Automatic. In 3D Vision Direct, the developer could easily enable this.
Just a guess, but it seems to me that adding the ray tracing SDK/API to 3D Vision Automatic would not be very hard. Not like adding G-Sync to 3D Vision, for example. It would just double the calculations for the extra eye, just like it already does for all the geometry in a game. We can always hope.
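To illustrate why the second eye is conceptually just "run it again from a shifted origin", here is a toy CPU sketch (standard C++ only, not the real RTX/DXR API; the eye separation value is just an example):

[code]
#include <cstdio>

struct Vec3 { float x, y, z; };

// Stand-in for the real work (BVH traversal, shading). Here it just reports
// which eye the primary ray belongs to.
void trace(const Vec3& origin, const Vec3& dir, const char* eye) {
    std::printf("%5s eye: origin (%.4f, %.1f, %.1f), dir (%.2f, %.2f, %.2f)\n",
                eye, origin.x, origin.y, origin.z, dir.x, dir.y, dir.z);
}

int main() {
    // One pixel's view ray. Parallel stereo cameras for simplicity; a real
    // stereo projection also offsets the frustum, but the point stands.
    const float separation = 0.065f;       // roughly 65 mm interocular, illustrative
    const Vec3  dir = { 0.10f, -0.05f, 1.0f };

    const Vec3 leftEye  = { -separation * 0.5f, 0.0f, 0.0f };
    const Vec3 rightEye = { +separation * 0.5f, 0.0f, 0.0f };

    // The second eye is literally the same trace with a shifted origin,
    // no per-effect ShaderHacking needed.
    trace(leftEye,  dir, "left");
    trace(rightEye, dir, "right");
    return 0;
}
[/code]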
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
bo3b, do you have any insight at all as to whether the basic 3d vision drivers (and automatic mode) will be supported by the 2xxx series? That's my only real concern at this point and it's what's stopped me from preordering.
GTX 1070 SLI, i7-6700K ~ 4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 Pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310
[quote="bo3b"]So in principle at least, I would expect ray tracing to work in 3D out of the box, with no changes necessary, and no ShaderHacking required. That would be super nice for transparency effects and specular highlights, a couple of the hardest for us to solve.[/quote]
Really glad to hear your optimistic predictions, Bob! I got very excited after watching some YouTube videos last night and came to realize this is something on its own, something NVIDIA developed. So my first thoughts, predictions and speculation might be way off, and I hope so. It sounds too good to be true. The team at NVIDIA working on this really knows their shit. It's way bigger than it seems when Jensen talks about the engineering and architectural development. It seems like a totally new audience for NVIDIA, and it made me realise where 3D Vision sits in NVIDIA's priorities. The world's most powerful solid modelers were also there: Catia and SolidWorks. Software I dream to master one day.
Watched this one like 5 times already xD
https://www.youtube.com/watch?v=KJRZTkttgLw
I'm just curious whether this would be more prominent in cutscenes or during real-time gameplay? Because it seems like these beautiful scenes are rendered on Quadros in real time.
Also very curious how outdoor scenery would be rendered (global illumination), because I think this is where the training wheels come off. So far there isn't much content in this area. It's probably too soon to expect too much. But I saw Unity is on board, and photorealism seems to be the next big thing. Is Unity using their own ray tracing algorithms? This video got me a bit confused...
https://www.youtube.com/watch?v=Ke4aqxhSeDQ
[quote="DJ-RK"]I've seen discussion elsewhere talking about how much better fidelity ray tracing can potentially create for VR[/quote]
Can you remember where you saw this, DJ-RK?
@RAGEdemon, where did you download that 3D model? It looks pretty cool. I've never used the stereoscopic settings in KeyShot because it's grayed out. We never got it for that type of work (things like animations...).
I usually use GrabCAD to download 3D models to save me some time redrawing stuff when I need it fast.
For those who haven't seen it yet, it's definitely worth the watch!
https://www.youtube.com/watch?v=jY28N0kv7Pk
[quote="KoelerMeester"]
Watched this one like 5 times already xD
https://www.youtube.com/watch?v=KJRZTkttgLw
I'm just curious whether this would be more prominent in cutscenes or during real-time gameplay? Because it seems like these beautiful scenes are rendered on Quadros in real time.
[/quote]
This is not 100% ray tracing. Only reflections, lighting effects and some minor FX are ray traced. Everything else is normal 3D rendering. The RTX 20xx GPUs won't bring real-time full ray tracing, and neither will the RTX Quadros.
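To be clear about what is and isn't traced, the frame in these hybrid renderers looks roughly like this (a conceptual sketch with stub functions standing in for the real passes, not actual DXR code):

[code]
#include <cstdio>

// Stubs standing in for the two halves of a hybrid frame.
void rasterizeGBuffer()        { std::printf("raster: geometry, materials, depth\n"); }
void rasterizeDirectLighting() { std::printf("raster: direct lighting as usual\n"); }
void traceReflectionRays()     { std::printf("rays:   reflections only\n"); }
void traceShadowOrAORays()     { std::printf("rays:   shadows / AO only\n"); }
void denoiseAndComposite()     { std::printf("post:   denoise ray results, composite\n"); }

int main() {
    // One frame: the bulk of the image is still rasterized, and rays are spent
    // only on the effects rasterization fakes badly (reflections, shadows, AO),
    // then denoised and composited over the raster result.
    rasterizeGBuffer();
    rasterizeDirectLighting();
    traceReflectionRays();
    traceShadowOrAORays();
    denoiseAndComposite();
    return 0;
}
[/code]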
Nvidia marketing is misleading people so they can have tons of preorders.
I don't doubt they want to maximise preorders, but I'm not convinced that they're misleading anyone.
To be fair I haven't studied all the presentations in detail, but my impression is that they are selling the extra RT/tensor cores as enabling the additional ray tracing (RTX) effects, since those will be hardware accelerated.
You're right, the idea that the whole scene is ray traced would be ridiculous, but I don't think that's what they're saying.
I think someone else said that the RTX effects are analogous to PhysX, and I agree with that. I would expect that the ray tracing still adds an overhead to the 'standard' rendering cores/pipeline.
GTX 1070 SLI, i7-6700K ~ 4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 Pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310
[quote="rustyk21"]
I think someone else said that the RTX effects are analogous to PhysX, and I agree with that.[/quote]
It was me :D
I said they're misleading people because talking about "real-time ray tracing" when not 100% of the scene is actually ray traced (like in a 3D renderer such as Maya, 3ds Max, ...) is not totally fair.
Yes, not the same thing at all... I remember years ago setting up POV-Ray to ray trace a globe on a 286, and it took all night to render one frame.
OK, my memory is hazy, it might have been a 386SX; the point being it was a huge thing and took so long. Full-frame ray tracing is massive and has been talked about for a while. I see this as being a halfway house that just leverages the hardware to get the best results they can.
GTX 1070 SLI, i7-6700K ~ 4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 Pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310