You’re just a clown trying to justify the fact that you’re technologically illiterate. Can’t wait for the day Blizzard drops RT on Diablo IV with DLSS 3, just to see you crying. In the meantime, I play Diablo IV maxed out on a 4K monitor at 133Hz while encoding an h.265 movie to an AV1 file without causing even a minimal stutter in the game.
Stay with your GTX 1080. That is an amount of power I would only accept in a mobile device. Even my cell phone can do better.
Who stepped on your toes? Can you please explain why you react to a person like this? Are you hurt, or do you simply want to make fun of people who have a weaker GPU than you?
There are still a decent number of people who own a GTX 1080, including me. If you have a problem with this, then I would suggest you look for a doctor, because this clearly indicates that something is wrong with you. Do you see me bragging about having access to an Nvidia L40G GPU?
I can play Diablo on high settings in 4K just fine. Of course the GTX series doesn’t support ray tracing or path tracing, but is that a reason to respond to other people like this?
Do you feel “proud” that you have something better than someone else? Is that it?
And about DLSS 3: please do some research before you shout. This technology is designed to deliver a stable framerate in games. It’s AI that predicts and generates frames, filling in the missing frames to reach a stable FPS, simply because even the best hardware can’t produce a stable framerate in games with ray tracing, and especially path tracing. Cyberpunk 2077 Overdrive (path tracing) runs at 18 fps on an RTX 4090 in 4K; you need to enable DLSS 3 to get a decent FPS to play the game with path tracing. See my video above.
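If it helps, here’s a toy Python sketch of the frame-generation idea (the frames and the plain halfway blend are made up for illustration; real DLSS 3 uses motion vectors and a dedicated optical flow accelerator, so this is only the concept, not NVIDIA’s algorithm):

```python
# Toy illustration of frame generation: insert a synthesized frame between
# two rendered ones so the presented frame rate doubles (e.g. 18 fps
# rendered -> ~36 fps shown). The halfway blend below is purely conceptual;
# real DLSS 3 reconstructs the in-between frame from motion vectors.

def render(t: float) -> list[float]:
    """Stand-in for the GPU rendering a tiny 4-pixel frame at time t."""
    return [t, t * 0.5, t * 0.25, 0.0]

frame_a = render(0.0)            # rendered frame N
frame_b = render(1.0)            # rendered frame N+1

# "Generate" the in-between frame without actually rendering it.
generated = [(a + b) / 2.0 for a, b in zip(frame_a, frame_b)]

# Presented sequence: rendered, generated, rendered.
for label, frame in [("A", frame_a), ("gen", generated), ("B", frame_b)]:
    print(label, frame)
```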
And another thing: if Blizzard decides to add ray tracing to Diablo 4, it will still be an option, so people with a GTX 1080 can still play perfectly fine.
I love my GTX 1080; it has let me play every game flawlessly at 1440p.
I have the EVGA FTW2, and I also overclocked it +140 MHz on the core and +500 MHz on the VRAM.
I have a 4K setup in my home gym, the PS5 is on that, and in all fairness, 4K vs. 1440p in this generation of games is “meh”.
I mostly play Battlefield, Diablo, RTS games, and Sea of Thieves, so anything more powerful would be ridiculous.
I’m not interested in bells and whistles when I get the other 90% of the fidelity with my 8-year-old card. I enjoyed Cyberpunk, but the ray tracing implementation didn’t “feel” right.
I am not alone in this thinking; an 85% drop in performance to get RT instead of pre-baked shadows is not good value, and it definitely does not significantly improve fidelity.
I was over the moon with the performance of my rig in the D4 betas: max settings, 1440p, a smooth 70 fps or higher. Looking at my FPS logs, it averaged 90. For an isometric ARPG, that’s perfect.
So I promptly cancelled my 4080 order.
Glad you enjoy your scenario of rendering and gaming, really cool. Glad that 1200 quid is getting its ROI.
Turning off post-processing for a slight performance gain isn’t worth it. It’s better for the map to generally load faster than to get slowdown when a mob appears due to a lack of CPU power.
Ray Tracing is a gimmick. We agree on that.
I use an AMD 3600X CPU, an RX 580, 32 GB of RAM, and an M.2 drive. The game runs perfectly fine. Fight scenes look just as blurry on my screen as I bet they do on yours.
As with any game, as long as the FPS doesn’t drop below 20 it doesn’t actually matter. In the beta I usually sat around 45-65 depending on the area, and that’s a non-issue for me. I am not a fighter pilot, so I don’t need to see a tank moving across a 4" screen at 144 fps; 1080p is plenty on a 32" 1 ms screen.
The problem with ray tracing, and why it feels like a “gimmick”, is that ray-traced games still use rasterized lighting underneath. That’s one reason why some people might not see any difference.
It feels like a gimmick in most games, but I have to disagree when it comes to Cyberpunk 2077. Now that they’ve updated the game with path tracing, it feels like a completely different experience. They removed rasterized lighting completely and rely solely on path-traced lighting, and you can clearly see the difference.
Every object now casts a shadow, light can bounce multiple times and even pick up colour from the objects it hits, and there’s no more light bleeding, thank god, because that was one of the most annoying things about rasterized lighting.
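To give a rough idea of the bounce part in code, here’s a toy Python sketch (the surfaces and colours are made up, and the random “hit” just stands in for a real ray intersection; no engine works this simply). Each bounce multiplies the carried light by the colour of the surface it hits, which is exactly how bounced light picks up colour from objects:

```python
import random

# Stand-in surfaces with their albedo (reflected colour). Made up for
# illustration; a real path tracer finds these via ray intersections.
SURFACES = [
    ("red wall",    (0.9, 0.1, 0.1)),
    ("white floor", (0.8, 0.8, 0.8)),
]

def trace_path(max_bounces: int = 3) -> list[float]:
    """Follow one light path; each bounce tints the carried light."""
    throughput = [1.0, 1.0, 1.0]                 # light carried along the path
    for bounce in range(max_bounces):
        name, albedo = random.choice(SURFACES)   # pretend the ray hit this
        # Multiplying in the albedo is why a white floor next to a red
        # wall ends up with a reddish tint after a few bounces.
        throughput = [t * a for t, a in zip(throughput, albedo)]
        print(f"bounce {bounce + 1}: hit {name}, carried light {throughput}")
    return throughput

trace_path()
```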
If we’re talking about games like Elden Ring, then yeah, ray tracing is just a gimmick.
It contributes to the atmosphere, and it can also look way more realistic. I can agree when we’re talking about Diablo 4.
But in Cyberpunk 2077, for example, it’s a whole different story. The characters feel more alive, it looks more realistic, and the whole feeling is more impressive, more immersive. The muzzle flashes from guns also look way more impressive, and areas that are supposed to be dark are dark. No more annoying light bleeding, and everything has shadows, which creates more depth.
Like I said, it contributes to immersion. But I can agree when it comes to Diablo 4; I wouldn’t see much use for ray tracing or path tracing there.
A 1080 Ti will definitely outperform a 2060 any day of the week. The only thing the 2060 has going for it is ray tracing capability; the 1080 Ti is a much better card overall.
My specs: SSD, RTX 3050, i5-11400H, 32 GB Kingston Fury DDR4-3200 RAM, and a 1080p 15.6-inch laptop screen. It was running on max during the test; most of the time it could hit 60-90 fps, though it could occasionally lag or freeze a bit. That’s it.
The point is, unless you’re really into “realistic or fancy graphics and textures…”, most people turn RT off in the options, just like they turn off bloom and other annoying options.
Nah, I am a serious cinephile and audiophile. I have my displays calibrated to THX standards, and I have 4K and 1440p displays. In none of the games I have played or do play, including Cyberpunk, did the fidelity benefit the gameplay experience; it only detracted from it by creating a massive loss of smoothness.
Also, “Ray tracing generates computer graphics images by tracing the path of light from the view camera (which determines your view into the scene), through the 2D viewing plane (pixel plane), out into the 3D scene, and back to the light sources.”
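In code terms, that quoted definition boils down to something like this toy Python sketch (one made-up sphere, one pixel, and a simple Lambert brightness term; not any real renderer’s implementation):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a normalized ray, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is normalized, so a = 1
    if disc < 0.0:
        return None                 # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

camera = (0.0, 0.0, 0.0)                 # the view camera
pixel_dir = (0.0, 0.0, 1.0)              # ray through one pixel of the 2D viewing plane
center, radius = (0.0, 0.0, 5.0), 1.0    # the 3D scene: a single sphere
light = (0.0, 5.0, 0.0)                  # the light source

t = ray_sphere(camera, pixel_dir, center, radius)
if t is not None:
    hit = [o + t * d for o, d in zip(camera, pixel_dir)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    to_light = [l - h for l, h in zip(light, hit)]
    norm = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / norm for v in to_light]
    # "Back to the light source": a Lambert term sets the pixel brightness.
    brightness = max(0.0, sum(n * v for n, v in zip(normal, to_light)))
    print(f"pixel hits sphere at {hit}, brightness {brightness:.2f}")
```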
The claim you made which I objected to wasn’t about (your perceived) “benefit” but about “realism”.
What you subjectively prefer to look at is completely unrelated to that.
The bottom line is still that you got it completely backwards.