The original post: /r/hardware by /u/Imnotabot4reelz on 2024-12-21 14:08:03.
Does anyone know the reason Intel isn't selling high-VRAM cards? If they're trying to break into the market, wouldn't 64GB of VRAM strapped onto one of their Battlemage GPUs sell really well for AI?
Sure, we get that Nvidia doesn't want to do that, because it uses VRAM as a gatekeeper to force people to buy more expensive cards.
But AI workloads are hard-limited by VRAM capacity, so there seems to be pretty massive demand for GPUs with lots of it. And you really don't need all that much compute; to people running AI models, raw speed matters far less than whether something is possible at all within the VRAM limit. The difference between “I couldn’t create this image if I had 100 years because I don’t have the VRAM” and “I had to wait an extra few seconds because Intel is slower” is massive.
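To make the “hard limit” concrete, here's a rough back-of-the-envelope sketch in Python. The model sizes, precisions, and the 1.2× overhead factor are just my illustrative assumptions, and it only counts the weights (KV cache and activations need more on top), but it shows why capacity decides whether a model loads at all, long before speed matters:

```python
# Rough back-of-the-envelope VRAM sizing for running a local LLM.
# Weights only; KV cache and activations add more on top.
# Sizes, precisions, and the overhead factor are illustrative assumptions.

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_vram_gb(params_billion: float, precision: str, overhead: float = 1.2) -> float:
    """Approximate GB of VRAM needed to hold a model's weights at a given precision.

    `overhead` is a fudge factor for runtime buffers and fragmentation.
    """
    total_bytes = params_billion * 1e9 * BYTES_PER_PARAM[precision] * overhead
    return total_bytes / 1e9  # decimal GB, close enough for sizing

for size in (7, 13, 34, 70):
    line = ", ".join(f"{p}: {weights_vram_gb(size, p):.0f} GB" for p in BYTES_PER_PARAM)
    print(f"{size}B params -> {line}")
```

By that rough math, a 70B-parameter model quantized to 4-bit still needs on the order of 40+ GB just for its weights: it simply won't load on a 24GB card, but a cheap 48/64GB card would handle it.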
I feel like Intel could create a whole new market segment pretty easily. If relatively cheap 32/48/64GB cards came out, models and AI programs that use that much VRAM would certainly become more common, and plenty of them can already take advantage of it today.
Even a 24GB card to start would be something. But I don’t get why they aren’t doing something like this when they’re supposedly all about “edge computing” and finding niches. It seems like a massive niche that will only grow with time. Plus they could tell their investors all about the “AI”.
Nvidia is using VRAM as a gatekeeper. It’s such an obvious vulnerability to attack, yet Intel won’t for some reason.