Comment by pixelpoet
Teenage me from the 90s telling everyone that ray tracing will eventually take over all rendering and getting laughed at would be happy :)
As you can tell, I'm patient :) A very important quality for any ray tracing enthusiast lol
The ability to do irregular sampling, efficient shadow computation (every flavour of shadow mapping is terrible!) and global illumination is already making its way into games, and path tracing has been the algorithm of choice in offline rendering (my profession since 2010) for quite a while already.
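To make that concrete, here's a minimal sketch of what this buys you; the scene interface (an invented occluded() query standing in for the BVH / RT-core traversal) and all names are made up for illustration, not from any real engine. A shadow becomes a single visibility ray instead of a shadow map with its resolution and bias problems, and the same query reused with randomly sampled hemisphere directions gives you the irregular sampling that GI estimators are built from:

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <cstdlib>

    struct Vec3 { float x, y, z; };
    Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    float dot(Vec3 a, Vec3 b)        { return a.x * b.x + a.y * b.y + a.z * b.z; }
    float rnd()                      { return std::rand() / (float)RAND_MAX; }

    // Stub for the scene's ray query: true if a ray from p in direction d hits
    // anything before tMax. In a real renderer this is the BVH/RT-core traversal.
    bool occluded(Vec3 /*p*/, Vec3 /*d*/, float /*tMax*/) { return false; }

    // Uniform direction on the hemisphere around n, by rejection sampling a point
    // in the unit ball and flipping it: every shading point gets its own rays.
    Vec3 sampleHemisphere(Vec3 n) {
        for (;;) {
            Vec3 d{rnd() * 2 - 1, rnd() * 2 - 1, rnd() * 2 - 1};
            float len2 = dot(d, d);
            if (len2 < 1e-4f || len2 > 1.0f) continue;
            d = d * (1.0f / std::sqrt(len2));
            return dot(d, n) < 0 ? d * -1.0f : d;
        }
    }

    // Direct light plus a crude hemisphere bounce for point p with normal n.
    // The shadow is just a visibility ray: no shadow map, no bias tuning.
    // A full path tracer would recurse where the 0.1f "sky" term sits.
    float shade(Vec3 p, Vec3 n, Vec3 toLight, float lightDist) {
        float direct   = occluded(p, toLight, lightDist) ? 0.0f : std::max(0.0f, dot(n, toLight));
        float indirect = occluded(p, sampleHemisphere(n), 1e30f) ? 0.0f : 0.1f;
        return direct + indirect;
    }

    int main() {
        Vec3 p{0, 0, 0}, n{0, 0, 1}, toLight{0, 0, 1};
        std::printf("radiance ~ %f\n", shade(p, n, toLight, 10.0f));
    }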
Making a flexible rasterisation-based renderer is a huge engineering undertaking; see e.g. Unreal Engine. With the relentless march of processing power, and with ray tracing finally getting the kind of hardware acceleration that rasterisation has enjoyed for decades, it's going to be possible for much smaller teams to deliver realistic and creative visuals (see e.g. Dreams[0]) with far less engineering effort. Some nice recent examples of this are Teardown[1] and Tiny Glade[2].
It's even more inevitable from today's point of view than it was back in the 90s :)
[0] Dreams: https://www.youtube.com/watch?v=u9KNtnCZDMI
[1] Teardown: https://teardowngame.com/
[2] Tiny Glade: https://www.youtube.com/watch?v=jusWW2pPnA0
AFAICT it's not really different; they're just calling it something else for marketing reasons. The system described in the Sony patent (having a fixed-function unit traverse the BVH asynchronously from the shader cores) is more or less how Nvidia's RT cores have worked from the beginning, as opposed to AMD's early attempts, which accelerated certain intersection tests but still required the shader cores to drive the traversal loop.
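For anyone wondering what "drive the traversal loop" means in practice, here's a rough sketch with invented types; it's not the real interface of either vendor. In the early AMD scheme the shader cores execute a loop like this, with only the box/triangle tests handed off to a hardware instruction, whereas with Nvidia-style RT cores (and the unit described in the Sony patent) the equivalent of the whole loop runs in the fixed-function unit and the shader just receives the final hit:

    #include <cstdint>
    #include <vector>

    struct Ray { float org[3], dir[3]; };
    struct Hit { float t = 1e30f; uint32_t prim = ~0u; };

    // One BVH node with two children, each either an inner node or a leaf.
    struct Node {
        float    box[2][6];    // child bounding boxes: min xyz, max xyz
        uint32_t child[2];     // child node index, or primitive index for a leaf
        bool     leaf[2];
    };

    // Stubs standing in for the accelerated intersection tests
    // (ray/box slab test and ray/triangle test).
    bool intersectBox(const float /*box*/[6], const Ray&, float /*tMax*/) { return false; }
    void intersectPrim(uint32_t /*prim*/, const Ray&, Hit&) {}

    // The traversal loop itself. In the "shader-driven" scheme this while-loop,
    // the stack and the scheduling all consume shader-core instructions; in the
    // fixed-function scheme this whole function is what the dedicated unit does,
    // asynchronously from the shader.
    Hit traverse(const std::vector<Node>& bvh, const Ray& r) {
        Hit hit;
        uint32_t stack[64];
        int sp = 0;
        stack[sp++] = 0;                          // push the root node
        while (sp > 0) {
            const Node& n = bvh[stack[--sp]];
            for (int c = 0; c < 2; ++c) {
                if (!intersectBox(n.box[c], r, hit.t)) continue;
                if (n.leaf[c])     intersectPrim(n.child[c], r, hit);
                else if (sp < 64)  stack[sp++] = n.child[c];
            }
        }
        return hit;
    }

    int main() {
        std::vector<Node> bvh(1, Node{});          // trivial one-node BVH
        Ray r{{0, 0, 0}, {0, 0, 1}};
        return traverse(bvh, r).prim == ~0u ? 0 : 1;   // no geometry -> miss
    }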
My old ray tracer could do arbitrary quadric surfaces, toroids with 2 minor radii, and CSG of all those. Triangles too (no CSG). It was getting kind of fast 20 years ago - 10fps at 1024x768. Never had good shading though.
I should dig that up and add NURBS and see how it performs today.
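For the curious, this is roughly the classic ray/quadric intersection such a tracer is built on: a quadric is the set of homogeneous points X with X^T Q X = 0, so substituting the ray O + t*D gives a quadratic in t. The matrix layout and names below are my own sketch, not the original code:

    #include <cmath>
    #include <cstdio>
    #include <utility>

    struct Quadric { double Q[4][4]; };   // symmetric 4x4 coefficient matrix

    // a^T Q b with homogeneous 4-vectors a and b.
    static double qform(const Quadric& q, const double a[4], const double b[4]) {
        double s = 0;
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                s += a[i] * q.Q[i][j] * b[j];
        return s;
    }

    // Smallest t > tMin where the ray o + t*d hits the quadric, or -1 if none.
    double intersectQuadric(const Quadric& q, const double o[3], const double d[3],
                            double tMin = 1e-6) {
        const double O[4] = {o[0], o[1], o[2], 1.0};   // point: w = 1
        const double D[4] = {d[0], d[1], d[2], 0.0};   // direction: w = 0
        double A = qform(q, D, D);
        double B = 2.0 * qform(q, O, D);               // assumes Q is symmetric
        double C = qform(q, O, O);
        if (std::fabs(A) < 1e-12)                      // degenerate (e.g. planes): linear in t
            return (std::fabs(B) > 1e-12 && -C / B > tMin) ? -C / B : -1.0;
        double disc = B * B - 4.0 * A * C;
        if (disc < 0) return -1.0;
        double s = std::sqrt(disc);
        double t0 = (-B - s) / (2.0 * A), t1 = (-B + s) / (2.0 * A);
        if (t0 > t1) std::swap(t0, t1);
        if (t0 > tMin) return t0;
        if (t1 > tMin) return t1;
        return -1.0;
    }

    int main() {
        // Unit sphere x^2 + y^2 + z^2 - 1 = 0 as a quadric; ray from (0,0,-5) along +z.
        Quadric sphere = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,-1}}};
        double o[3] = {0, 0, -5}, d[3] = {0, 0, 1};
        std::printf("hit t = %f\n", intersectQuadric(sphere, o, d));   // expect 4
    }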
It's not, though. The use of RT in games is generally limited to secondary rays; the primaries are still rasterized. (Though the rasterization is increasingly done in “software rendering”, aka compute shaders.)
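A caricature of that split, with made-up names rather than any real engine's API: the primary hit comes out of the raster pass (hardware raster or a compute-shader rasterizer writing a G-buffer), and rays are only spawned from those already-found surface points for secondary effects like shadows:

    #include <cstdio>

    struct Vec3    { float x, y, z; };
    struct GSample { Vec3 pos, normal, albedo; };

    // Stand-in for the raster pass producing the G-buffer for one pixel.
    GSample rasterizePixel(int x, int y) {
        return {{float(x), float(y), 0}, {0, 0, 1}, {1, 1, 1}};
    }

    // Stand-in for a hardware ray query (shadow ray against the BVH).
    bool shadowRayBlocked(Vec3 /*from*/, Vec3 /*towardsLight*/) { return false; }

    int main() {
        const Vec3 toLight{0, 0, 1};
        for (int y = 0; y < 2; ++y)
            for (int x = 0; x < 2; ++x) {
                GSample g = rasterizePixel(x, y);              // primary hit: rasterized
                bool lit = !shadowRayBlocked(g.pos, toLight);  // secondary effect: ray traced
                std::printf("pixel %d,%d lit=%d\n", x, y, lit);
            }
    }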