
Why does this GPU cost so much? It seems to be like an RTX 3060 Ti/3070 with more VRAM and less power draw. Does that really justify the price tag?


NVIDIA's licensing. Note that the 3060/3070 are "GeForce" cards, whereas the RTX 4000 is a "Quadro" (now the NVIDIA RTX professional line). Then look at the GeForce driver EULA: https://www.nvidia.com/content/DriverDownloads/licence.php?l...

> No Datacenter Deployment. The SOFTWARE is not licensed for datacenter deployment, except that blockchain processing in a datacenter is permitted.

I could never figure out why they allowed blockchain processing, given how wasteful much of it was and how it impacted card supply for years, but there you go.


Because they made a lot of money selling to miners at prices they maybe couldn't have asked of gamers?


Wouldn't the GeForce license then say "no blockchain", with a different NVIDIA brand (BlockForce?) allowing it, letting NVDA charge more for large-scale mining? Otherwise it looks like gamers and miners are paying the same prices.


They do. But since most miners would just ignore the license, they implemented LHR (Lite Hash Rate) starting with the GeForce RTX 3000 series: the driver detects the memory-access pattern characteristic of Ethereum mining and artificially throttles the card to roughly half its actual hash rate.
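
For intuition, here's a toy Python model of what an LHR-style limiter does conceptually: spot a mining-like memory access pattern and cap throughput. To be clear, the heuristic and the numbers are made up for illustration; NVIDIA's actual check is proprietary and lives in the driver/firmware.

    # Toy model of an LHR-style limiter; purely illustrative assumptions.
    import random

    FULL_HASHRATE_MHS = 60.0   # hypothetical unrestricted Ethash rate
    LHR_FACTOR = 0.5           # LHR roughly halved Ethash throughput

    def looks_like_ethash(addresses):
        # Ethash fetches pseudo-random 128-byte slices from a multi-GB DAG,
        # so nearly every access jumps far from the previous one. Gaming
        # workloads read far more sequentially. Crude stand-in heuristic:
        if len(addresses) < 2:
            return False
        big_jumps = sum(1 for a, b in zip(addresses, addresses[1:])
                        if abs(a - b) > (1 << 20))  # jumps over 1 MiB
        return big_jumps / (len(addresses) - 1) > 0.9

    def effective_hashrate(addresses):
        if looks_like_ethash(addresses):
            return FULL_HASHRATE_MHS * LHR_FACTOR
        return FULL_HASHRATE_MHS

    mining_like = [random.randrange(4 << 30) for _ in range(1000)]  # scattered DAG reads
    gaming_like = list(range(0, 1000 * 4096, 4096))                 # sequential reads
    print(effective_hashrate(mining_like))  # 30.0 (throttled)
    print(effective_hashrate(gaming_like))  # 60.0 (unrestricted)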


They did, but with mixed success. One random link from a Google search: https://old.reddit.com/r/EtherMining/comments/t4mwxj/workaro...


NVIDIA's source code got leaked, and the limiter was reverse engineered and defeated. That said, it all came fairly late in the game; GPU mining died not long afterwards, once Ethereum moved to proof of stake.


I bet the AI demand spike will now be the new crypto demand spike that NVDA latches onto.


This seems insane to me. It makes me even more confused that AMD/Intel aren't trying their hardest to go after AI compute. Nvidia is basically leaving the door open.

But I suppose the idea could be to protect their gaming market: if datacenters were buying up gaming GPUs, it could cause shortages for regular customers.


AMD is trying their hardest.


Because it's "enterprise".


Datacenter customers don't have options, gamers do.


RTX 4000 isn't even data-center grade - it's in the workstation/pro segment. This is an important difference, as some server vendors (such as Dell) won't validate workstation-segment cards to run in their servers.

The segmentation is a huge mess from NVIDIA. Consumer-grade GPUs aren't supposed to be run in data centers (forbidden by the EULA), and the workstation GPUs sit in a weird middle-ground limbo: many times more expensive, yet not exactly attractive for server use for the reasons above.
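
If you want to check which side of that EULA line a given box falls on, the marketing name is queryable via the pynvml bindings (pip install nvidia-ml-py). A minimal sketch, assuming the brand substring in the name is a good enough proxy; that check is my own crude heuristic, not an official NVIDIA classification:

    # Flag GPUs whose GeForce branding carries the "no datacenter
    # deployment" EULA clause. The substring check is an assumption.
    from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                        nvmlDeviceGetHandleByIndex, nvmlDeviceGetName)

    nvmlInit()
    try:
        for i in range(nvmlDeviceGetCount()):
            name = nvmlDeviceGetName(nvmlDeviceGetHandleByIndex(i))
            if isinstance(name, bytes):  # older pynvml returns bytes
                name = name.decode()
            # e.g. "NVIDIA GeForce RTX 3070" vs "NVIDIA RTX 4000 Ada Generation"
            segment = ("consumer (GeForce EULA)" if "GeForce" in name
                       else "workstation/datacenter")
            print(f"GPU {i}: {name} -> {segment}")
    finally:
        nvmlShutdown()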


Like Apple, they use RAM for market segmentation.



