Hacker News
Ada Lovelace GPUs Shows How Desperate Nvidia Is (semianalysis.substack.com)
12 points by darkteflon on Sept 23, 2022 | 2 comments


Thought this was a good technical write-up from a dependably great source.

While I agree that the value of hardware ray-tracing performance remains questionable both to gamers and to Nvidia’s market position, I would be less enthusiastic about lumping hardware “AI-based rendering techniques” in with that assessment.

DLSS is still the gold standard for AI upscaling, in no small part due to the hardware approach. It's very popular with gamers because it delivers a significant performance uplift without noticeably degrading image quality; in some scenarios it even improves on native rendering. Yes, it must be implemented by developers, but that's a limitation all credible upscalers currently face, because they need access to engine internals such as per-pixel motion vectors.
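
To illustrate why motion vectors matter: temporal upscalers reproject the previous frame's output along per-pixel motion before blending in the new low-resolution sample. Here's a rough Python sketch of that idea; every name in it is illustrative, not the actual DLSS/FSR/XeSS API:

    # Minimal sketch of why temporal upscalers need per-pixel motion vectors.
    # All names are illustrative; this is not the real DLSS/FSR/XeSS interface.
    import numpy as np

    def temporal_upscale(curr_lowres, prev_highres, motion_vectors, blend=0.1):
        """Accumulate detail across frames by reprojecting history.

        curr_lowres:    (h, w) current frame, rendered at low resolution
        prev_highres:   (H, W) previous upscaled output (the "history buffer")
        motion_vectors: (H, W, 2) per-pixel screen-space motion in output
                        pixels, supplied by the game engine -- the data an
                        upscaler cannot recover on its own
        """
        H, W = prev_highres.shape

        # Naive spatial upscale of the current frame (nearest-neighbor).
        ys = np.arange(H) * curr_lowres.shape[0] // H
        xs = np.arange(W) * curr_lowres.shape[1] // W
        upscaled = curr_lowres[np.ix_(ys, xs)]

        # Reproject history: for each output pixel, look up where it was
        # last frame according to the engine-provided motion vectors.
        yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
        src_y = np.clip(yy - motion_vectors[..., 1].astype(int), 0, H - 1)
        src_x = np.clip(xx - motion_vectors[..., 0].astype(int), 0, W - 1)
        history = prev_highres[src_y, src_x]

        # Blend: mostly accumulated history, a little new sample.
        return (1 - blend) * history + blend * upscaled

Without engine-supplied motion vectors, that reprojection step has nothing to work with, which is why a purely driver-level, drop-in upscaler can't match this approach.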

On the other hand, it's also fair to say that AMD and - more recently - Intel are doing some very interesting work in this space, and the outcome of this race is not a foregone conclusion. We also still don't know, I think, what sort of hardware innovations AMD might bring with RDNA3.

Certainly we’re at an interesting inflection point.


I agree with you. Raytracing really isn't useful for most gamers, and I don't think it's realistic for the video game industry to move towards raytracing when rasterized rendering has delivered such good results thus far. ML acceleration is much more useful: upscaling alone can justify the extra cost, and there are many other potential use cases, especially now that more and more consumers are running fancy ML models like Stable Diffusion, which require large amounts of VRAM.
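
For a rough sense of the VRAM point: even the weights alone of a roughly ~1B-parameter model like Stable Diffusion v1 take about 2 GB in fp16, before activations, latents, or batching. A back-of-the-envelope sketch (the parameter count is an approximate public figure, used only for illustration):

    # Back-of-the-envelope VRAM estimate for holding model weights in fp16.
    # Parameter counts are rough public figures, not exact.

    def weights_vram_gb(params: float, bytes_per_param: int = 2) -> float:
        """Memory for weights alone; activations and latents add more on top."""
        return params * bytes_per_param / 1024**3

    # Stable Diffusion v1.x is roughly ~1B parameters across
    # UNet + text encoder + VAE.
    print(f"~{weights_vram_gb(1.0e9):.1f} GB for weights in fp16")  # ~1.9 GB

Real-world inference typically needs several times that once activations and intermediate buffers are counted, which is why consumer cards with more VRAM are suddenly attractive beyond gaming.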



