
Real-time is unfortunately a sort of vague term.

If you mean raster rendering pipelines, then I don’t believe it’s possible, because the nature of GPU pipelines precludes it. You’d likely need to build it out of compute shaders, at which point you’ve just written a path tracer anyway.

If you mean a path tracer, then real-time becomes wholly dependent on what your parameters are. At a small enough resolution, Mitsuba with Dr.Jit could theoretically start producing frames after the first one quickly enough to be considered real-time.
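To see where the extra cost of spectral rendering comes from, here is a minimal, hedged sketch of the core step a spectral renderer adds: integrating a spectral power distribution over wavelength against the CIE colour matching functions to get XYZ. This is illustrative only and is not Mitsuba's actual implementation; the colour matching functions use the well-known piecewise-Gaussian analytic fits rather than the tabulated CIE data, and the scene is stood in for by a bare blackbody emitter.

```python
import math
import random

# Piecewise-Gaussian analytic fits to the CIE 1931 colour matching
# functions (approximate; good enough for illustration).
def _g(x, mu, s1, s2):
    t = (x - mu) / (s1 if x < mu else s2)
    return math.exp(-0.5 * t * t)

def cie_x(w):
    return (1.056 * _g(w, 599.8, 37.9, 31.0)
            + 0.362 * _g(w, 442.0, 16.0, 26.7)
            - 0.065 * _g(w, 501.1, 20.4, 26.2))

def cie_y(w):
    return 0.821 * _g(w, 568.8, 46.9, 40.5) + 0.286 * _g(w, 530.9, 16.3, 31.1)

def cie_z(w):
    return 1.217 * _g(w, 437.0, 11.8, 36.0) + 0.681 * _g(w, 459.0, 26.0, 13.8)

def blackbody(w_nm, t_kelvin):
    # Planck's law (unnormalised): spectral radiance at wavelength w_nm.
    w = w_nm * 1e-9
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2.0 * h * c * c) / (w ** 5 * (math.exp(h * c / (w * k * t_kelvin)) - 1.0))

def spectrum_to_xyz(spd, n=10000, lo=380.0, hi=720.0, seed=1):
    # Monte Carlo estimate of X, Y, Z: this wavelength integral is the
    # extra dimension a spectral path tracer samples per camera ray.
    rng = random.Random(seed)
    x = y = z = 0.0
    for _ in range(n):
        w = lo + (hi - lo) * rng.random()  # uniform wavelength sample
        s = spd(w)
        x += s * cie_x(w)
        y += s * cie_y(w)
        z += s * cie_z(w)
    scale = (hi - lo) / n
    return x * scale, y * scale, z * scale

# A ~6500 K blackbody should come out roughly neutral white.
xyz = spectrum_to_xyz(lambda w: blackbody(w, 6500.0))
```

In a real spectral path tracer this per-wavelength integration is carried through every bounce (often amortised with hero-wavelength sampling), which is exactly where the render-time increase over RGB rendering comes from.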

However, the reality is that even in film, with offline rendering, very few studios find the gains of spectral rendering to be worth the effort. Outside of Wētā with Manuka, nobody else really uses spectral rendering. Animal Logic did for The LEGO Movie, but solely for lens flares.

The workflow changes needed to make everything work with a spectral renderer, combined with the very subtle visual differences, are just not worth the large increase in render time.


