• @[email protected]
    13 months ago

    But consider that if you get a more powerful card at the same price, you don’t need as much upscaling or frame generation. FSR being slightly worse is irrelevant if you can run the game at native resolution.

    I’m on a 3080, and if I’m getting 40fps in a title at settings I’m happy with (which is happening more often than I’d like), not even a 7900xtx is going to give me the 90fps I’d much prefer. And, lest you think I’m being vastly unfair, I’ll also say there are no Nvidia cards that will do so either. And yes, this is entirely dependent on your resolution, but the ultrawide I’m quite fond of has essentially the same pixel count as a 4k144 monitor, which is a lot of pixels to attempt to draw at once.
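    For a rough sense of scale, here’s the back-of-the-envelope math - taking 5120x1440 as a stand-in ultrawide resolution, since I haven’t named the panel:

```python
# Back-of-the-envelope pixel math. 5120x1440 is an assumed ultrawide
# resolution (the exact panel isn't named above); 3840x2160 is 4K.
ultrawide = 5120 * 1440   # ~7.37 million pixels per frame
uhd = 3840 * 2160         # ~8.29 million pixels per frame

print(f"ultrawide: {ultrawide / 1e6:.2f} MP/frame")
print(f"4K:        {uhd / 1e6:.2f} MP/frame")

# Pixels the GPU has to shade per second at a few target frame rates:
for fps in (40, 90, 144):
    print(f"{fps:>3} fps -> {ultrawide * fps / 1e9:.2f} gigapixels/s")
```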

    The only way to get there (at least until the 5090 shows up, I guess?) is to do some sort of upscaling. And, frankly, FSR is - subjectively - not ‘slightly worse’ but rather such an artifact-y mess (at least in the games I’m playing) that I’d rather have 40fps than deal with how ugly it makes everything.

    XeSS is a lot better, and works fine on AMD cards (via its DP4a fallback path), but until FSR gets a lot cleaner, or everything starts supporting XeSS, DLSS is still the best game in town.

    As for NVENC, you’re absolutely right - unless you’re using it for streaming and have a hard cap on upper bitrates because you’re not Twitch royalty. I’ll admit that’s an edge case most people don’t have, or even need to consider, but if you do need low-bitrate streaming and don’t want to burn CPU on x264 doing it in software, it’s NVENC or sub-par quality from AMF. I’m honestly surprised AMD haven’t invested time in fixing the one real use case that hardware encoding still has (real-time encoding at low bitrates), but I suppose someone somewhere has an Excel sheet showing the market that cares about it is too small to be worth spending time on.
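    For anyone curious, here’s a minimal sketch of that comparison as ffmpeg invocations - the encoder names are real ffmpeg encoders, but input.mkv is a placeholder and 6000k is just roughly the cap a non-partnered Twitch stream lives under:

```python
import subprocess

# Rough sketch, assuming a recent ffmpeg build with NVENC and AMF support.
# "input.mkv" is a placeholder; 6000k approximates a non-partner Twitch cap.
CAP = "6000k"

def encode(codec, out, extra=()):
    subprocess.run([
        "ffmpeg", "-y", "-i", "input.mkv",
        "-c:v", codec, *extra,
        "-b:v", CAP, "-maxrate", CAP, "-bufsize", "12000k",
        out,
    ], check=True)

encode("h264_nvenc", "nvenc.mp4", ["-preset", "p5"])    # Nvidia hardware H.264
encode("libx264", "x264.mp4", ["-preset", "veryfast"])  # software x264, real-time preset
encode("h264_amf", "amf.mp4")                           # AMD's AMF equivalent
```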

    • @[email protected]
      13 months ago

      If there are no Nvidia cards that can run your game at 90fps, not even the 4090, then I assume you’re using ray tracing? In which case I’ve already agreed: the gap is too large, and a product-tier offset in AMD pricing isn’t going to make up for it. My comments about FSR vs DLSS in this scenario assume a superior performance baseline for AMD, where you’re comparing no FSR to DLSS “quality”, or maybe FSR “quality” to DLSS “performance”. AMD would need to tank their prices to an absurd degree to close that gap when ray tracing is involved.
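      To make those tier names concrete: both DLSS and FSR 2 document roughly the same per-axis scale factors (“quality” ≈ 0.67x, “performance” = 0.5x; individual games sometimes tweak them), so at 4K the comparison works out to something like:

```python
# Approximate internal render resolutions behind the upscaler mode names,
# using the per-axis scale factors DLSS and FSR 2 both document (games
# can override these, so treat the output as ballpark).
MODES = {"native": 1.0, "quality": 1 / 1.5, "balanced": 0.58, "performance": 0.5}

def render_res(w, h, mode):
    s = MODES[mode]
    return round(w * s), round(h * s)

for mode in MODES:
    rw, rh = render_res(3840, 2160, mode)
    print(f"{mode:>11}: {rw}x{rh}")
# So "no FSR vs DLSS quality" is 3840x2160 vs 2560x1440 worth of shading,
# and "FSR quality vs DLSS performance" is 2560x1440 vs 1920x1080.
```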

      As for why AMD haven’t put more time into their encoder, I have a suspicion they were banking on people moving away from AVC to HEVC or, more recently, AV1. Their HEVC and AV1 encoders are much closer in quality to Nvidia’s than their AVC encoder is, and have clearly had more attention paid to them. Hell, as far back as Polaris, AMD’s HEVC encoder was faster than their AVC encoder while also looking better.
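      A hedged sketch of what leaning on those newer encoders looks like - hevc_amf and av1_amf are real ffmpeg encoder names, but hardware support varies (AV1 encode needs an RDNA 3 card, and both need a recent ffmpeg build):

```python
import subprocess

# AMD's newer hardware encoders via ffmpeg. hevc_amf works on most
# AMF-capable cards; av1_amf needs an RDNA 3 GPU and a recent ffmpeg
# build (assumptions worth checking locally). input.mkv is a placeholder.
for codec, out in (("hevc_amf", "out_hevc.mp4"), ("av1_amf", "out_av1.mp4")):
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mkv", "-c:v", codec, "-b:v", "6000k", out],
        check=True,
    )
```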