It’s the year when consoles change, and that bumps requirements. PC GPUs are also getting DXR as a more mainstream feature, and since they are hitting performance limits, new tech like Nvidia's DLSS or VRS has to be added to work around those limitations.
And I’m curious what will happen with laptops. Turing mobile variants were already much weaker than their desktop counterparts. At some point it will all move to game streaming, where you won’t need to own the hardware at all.
Honestly the only reason I have a laptop is because I didn’t have room for a desktop in my old apartment. But now that I have moved, that’s probably what I will buy next…
Well… with a 3070 you can get away with a 650 W PSU, since its power draw is about the same as a 2080 Ti. Anything less and you’re pushing it, as it also depends on your CPU's power usage.
An i5-9600K + 2080 Ti / 3070 would need 650 W.
For a 3080 or higher, you'd best get 750 W or better.
So it's something to consider if you're still on a 550 W unit; it will have to be replaced with a higher-rated PSU.
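To put rough numbers on that reasoning, here's a minimal power-budget sketch. The TDP figures, the 75 W allowance for the rest of the system, and the 50% headroom factor are my own assumptions for illustration, not official requirements, so check your actual parts.

```python
# Rough PSU sizing sketch: sum typical board-power figures (assumed values)
# and leave ~50% headroom for transient spikes and PSU efficiency.
GPU_TDP_W = {"RTX 2080 Ti": 250, "RTX 3070": 220, "RTX 3080": 320}
CPU_TDP_W = {"i5-9600K": 95}
OTHER_W = 75  # motherboard, RAM, drives, fans (assumption)

def recommended_psu(gpu: str, cpu: str, headroom: float = 1.5) -> int:
    """Return a rough PSU wattage, rounded up to the nearest 50 W step."""
    load = GPU_TDP_W[gpu] + CPU_TDP_W[cpu] + OTHER_W
    target = load * headroom
    return int(-(-target // 50) * 50)  # ceiling to a 50 W step

print(recommended_psu("RTX 3070", "i5-9600K"))  # 600 -> a 650 W unit is sensible
print(recommended_psu("RTX 3080", "i5-9600K"))  # 750
```

Run it and you land right around the 650 W / 750 W recommendations above.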
NVENC/CUDA helps with gaming plus streaming in one PC box anyway. I haven't had much of a performance hit streaming to Twitch while using the hardware H.264/H.265 (NVENC) encoder and playing a game at the same time.
So the GPU's built-in encoder is getting good enough to replace dedicated capture hardware, or having a second PC just for the streaming part.
But I would wait until October/November to upgrade anyway, to see what the reviews say on each card, and whether there are any driver bugs with the new graphics cards as well.
Wasn’t there a problem with the 2080 Founders Edition at its release two years back, with some cards dying within a week or so?
Best to see what the situation is there. Let early adopters find that stuff out.
This is not really true.
RTX on the 20 series was a forced gimmick that makes you lose a ton of FPS when you turn it on. It was a marketing ploy by NVIDIA. Even with a top card you would lose a ridiculous amount of FPS; the card just brute-forced it.
The new cards that were announced recently can render “real” ray tracing, which will not cost you many FPS, if any, by turning it on. The same can be said for the AMD cards that will be announced in a month. If anyone wants ray tracing in SL, I recommend getting one of the new cards. Even if you buy the cheap one at £300, you will be able to run ray tracing with no problems.
If you already have a 20 series card then good luck, but it is doubtful it will work out, since WoW is notorious for FPS dips. Running a feature that costs you 20–30 FPS in a game that can drop to 60 FPS on the best systems is a bit stupid.
That’s why it needs DLSS or a similar system, no matter whether it's Turing or Ampere. Ray tracing is more expensive to compute than rasterization. And it’s not really a plot by Nvidia; it’s just the advancement of the industry, pushed especially by animation and major game developers. At some point you have to release the first generation, which will be inferior to the second or third…
For AMD we know there will be DXR support, but we don’t know if they have any resolution-cheating system like DLSS 2.0, and without that DXR performance is bad.
In WoW it's just shadows; no reflections, no other effects. There is zero reason to get DXR for WoW until they revamp their assets and implement DXR in full. An RTX 3070 or 3080 is worth it for Cyberpunk, but not solely for WoW.
You are constantly confusing a few technologies. DXR on Turing and Ampere will decrease performance by a lot; the more ray tracing, the bigger the drop. The RTX 3080 can do it better than the 2080 Ti, but on their own neither card will give you anything playable at 1440p or 4K. But if you render at 1080p and upscale to 1440p or 4K with DLSS's ML upscaling, you get higher performance and ray-traced effects. Both generations of cards NEED DLSS to make DXR perform efficiently.
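To show why rendering internally at a lower resolution buys so much headroom, here's a back-of-the-envelope sketch. The assumption that per-pixel work (shading plus ray tracing) scales roughly with the number of pixels actually rendered is mine for illustration, and the 1080p internal resolution is just an example; real DLSS internal resolutions vary by quality mode.

```python
# Back-of-the-envelope: per-pixel cost scales roughly with pixels rendered
# before upscaling (illustrative assumption, not a benchmark).
RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def pixel_ratio(output: str, internal: str = "1080p") -> float:
    """How many times fewer pixels are rendered internally vs. native output."""
    ow, oh = RES[output]
    iw, ih = RES[internal]
    return (ow * oh) / (iw * ih)

for target in ("1440p", "4K"):
    print(f"{target}: ~{pixel_ratio(target):.1f}x fewer pixels at 1080p internal")
# 1440p: ~1.8x fewer, 4K: ~4.0x fewer -> that's where the DXR headroom comes from
```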
And WoW FPS limitations are often CPU-bound, not GPU-bound, so DXR's impact won't add that much on top of the existing FPS drops.