Nvidia is developing a new approach to improve the ray-tracing performance of future GPUs. Ray tracing is a rendering technique that produces detailed lighting, reflections, and shadows in video games by simulating the paths of light rays. It's computationally demanding, but it looks fantastic in games like Control, and it can even breathe fresh life into older titles like Half-Life. The latest development is called Subwarp Interleaving, and while it's not yet ready for real hardware, it sounds like a promising step forward for future GPUs.
Future Nvidia GPUs could be 20 percent better at ray tracing
Nvidia has published a research paper outlining the technique, which was spotted by Ox22h on Twitter (via Tom's Hardware). Subwarp Interleaving targets the significant thread divergence and low warp occupancy that ray-tracing workloads exhibit. In the researchers' tests it delivered gains of 6.3 percent on average, with improvements of up to 20 percent in the best-case scenario.
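To picture what that divergence looks like, here is a minimal CUDA sketch, not taken from Nvidia's paper: the kernel name, parameters, and material logic are all hypothetical. Neighboring rays in the same warp hit different materials and take different branches, so the warp ends up running each branch in turn with some lanes sitting idle, which is the thread divergence the researchers describe.

// Hypothetical shading kernel (illustrative only, not from the paper).
// Rays in the same warp branch on what they hit, so the warp serializes
// the miss, diffuse, and glossy paths while masking off inactive lanes.
__global__ void shadeHits(const int *materialId, const float *hitT,
                          float *radiance, int numRays)
{
    int ray = blockIdx.x * blockDim.x + threadIdx.x;
    if (ray >= numRays) return;

    if (hitT[ray] < 0.0f) {
        radiance[ray] = 0.02f;                 // miss: cheap sky color
    } else if (materialId[ray] == 0) {
        radiance[ray] = 0.8f * hitT[ray];      // diffuse: short path
    } else {
        // "Glossy" path: deliberately heavier loop to mimic an expensive BRDF
        float acc = 0.0f;
        for (int i = 0; i < 64; ++i)
            acc += __sinf(hitT[ray] * i);
        radiance[ray] = acc;
    }
}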
To explain the challenge, the researchers outlined how contemporary GPUs work. “To begin, GPUs organize threads into units called warps that fetch data from a single program counter (PC) and execute SIMT (single instruction, multiple threads) instructions. Second, GPUs mask stalls by scheduling multiple active warps at the same time.”
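That warp behavior can be observed directly from CUDA today. The short, self-contained sketch below (my own illustration, not Nvidia's) launches a single 32-thread warp and splits it with a branch; __activemask() reports which lanes are active on each side, showing that the two halves of the divergent warp execute separately under the shared program counter.

#include <cstdio>

// One warp of 32 threads shares a program counter. The branch splits it,
// and __activemask() shows which lanes are running on each side.
__global__ void warpDivergenceDemo()
{
    int lane = threadIdx.x % warpSize;

    if (lane < 16) {
        if (lane == 0)
            printf("lower half active mask: 0x%08x\n", __activemask());
    } else {
        if (lane == 16)
            printf("upper half active mask: 0x%08x\n", __activemask());
    }
}

int main()
{
    warpDivergenceDemo<<<1, 32>>>();   // exactly one warp
    cudaDeviceSynchronize();
    return 0;
}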
This architecture works against ray tracing, which often leaves warps starved of work. Subwarp Interleaving is meant to make better use of the hardware in those conditions, improving utilization by putting divergent paths to work and reducing the time warps spend stalled.
“When a long-latency operation stalls a warp and the GPU’s warp scheduler is unable to find an active warp to switch to, a sub warp scheduler can switch execution to another divergent sub warp of the current warp,” the researchers say.
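The interleaving itself is a proposed hardware change, so it can't be expressed in today's CUDA; the hypothetical kernel below only sketches the stall pattern the quote describes. Both sides of the branch wait on long-latency global-memory loads, so while one divergent sub warp is stalled, the other has ready work that the proposed sub warp scheduler could switch to.

// Illustrative only: names and data layout are assumptions, and the
// interleaving happens in hardware, not in this source code. Each divergent
// path issues a long-latency gather, the situation where a sub warp scheduler
// could hide one path's stall behind the other's work.
__global__ void divergentGather(const int *indexA, const int *indexB,
                                const float *table, float *out, int n)
{
    int tid = blockIdx.x * blockDim.x + threadIdx.x;
    if (tid >= n) return;

    float value;
    if (tid & 1) {
        // Divergent path A: dependent global load, long latency
        value = table[indexA[tid]];
    } else {
        // Divergent path B: a different long-latency load to overlap with
        value = table[indexB[tid]];
    }
    out[tid] = value * 2.0f;
}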
Subwarp Interleaving requires architectural changes to current GPU designs, so we won't see it in Nvidia's new RTX 3090 Ti monster GPU. Still, it's always encouraging to see this kind of technical progress, and a 20 percent increase in ray-tracing efficiency is nothing to scoff at. No doubt, by the time this technology is ready for shipping graphics cards, we'll have seen a slew of other cool developments to go along with it.