Advice on upgrading to 144hz

Hi guys and girls,

I’m looking for some advice regarding upgrading my current PC. Ideally, I’d love to upgrade to a 144 Hz monitor, and I wondered whether I’d need to upgrade my GPU to do so. Here’s a list of my current components:

GPU - Radeon R9 280x
CPU - AMD FX 8-Core Black Edition FX-8350
MB - 990FXA-GD65
PSU - CX600M
CPU Cooler - H100i V2

According to Bottlenecker, there is a 16% bottleneck, suggesting that my GPU is too weak for my CPU. Therefore, I’m presuming the GPU should be the first thing I upgrade?

Ideally, I’d be quite happy playing OW on low settings as long as I’m able to play it at 144 Hz; however, if I’m able to push the settings higher without dipping below 144 FPS, then that’s great! The only other game I play is WoW, so I’d also like something that will make BFA look great and run smoothly.

Any recommendations as to what I should be upgrading would be massively appreciated whether it’s a specific GPU or monitor model!

Apologies if these posts aren’t allowed here, and thank you in advance for your help!

Ryder

Upgrade to 240 Hz gaming :smiley:, it makes a world of difference.
Those Asus 240 Hz monitors being used in OWL are sweet; I own one myself.
The Asus ROG PG258Q is the monitor.
You’ll need an Nvidia 1080 Ti and a good Intel processor to ensure your PC can dish out a minimum of 240 FPS.

You would need to use a DisplayPort cable to hook your PC to the monitor.
Then, within Overwatch, go to your video settings and pick 1920 x 1080 (240).
Note that OW won’t show you the (240) refresh rate option until you have a monitor that supports it and a cable from your GPU to the monitor that can handle that bandwidth. That’s why using a DisplayPort cable is important.

If you’re upgrading from 60 Hz 1080p to 144 Hz 1080p, it’s quite easy to test it out: go in-game and lift the FPS limit. I have a 144 Hz monitor and I’ve capped my FPS at 150 so I get stable FPS; going higher only helps slightly. So I’d recommend having a stable 150 before buying a 144 Hz monitor. The higher you can go the better, of course, but anything past 150 is really, really minor.

You’re pretty much pulling numbers out of your ass.

  1. you don’t need Nvidia
  2. you don’t need a 1080 Ti
  3. you don’t need an Intel processor

I have a 1070 and an AMD processor, with a 144 Hz monitor that I run at 1440p. I could almost run it at 240 Hz stable (I get something like 230-250 FPS, so my stable FPS is almost enough). That means if you go 240 Hz at 1080p, you need a bit over half the power I have, so a 980 should be enough. (I don’t have a 980 to test with, but comparing the 1070 vs the 980 at max settings suggests that if both scale up similarly at lower settings, even a 980 would easily be enough for 240 Hz 1080p on low settings, while a 1080 Ti would be super duper overkill.)
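The “bit over half” estimate can be sanity-checked with back-of-envelope pixel-throughput arithmetic. This is only a rough sketch: it assumes GPU load scales with pixels rendered per second, which real games only approximate.

```python
# Rough pixel-throughput comparison: resolution * refresh rate.
# Assumes GPU load scales with pixels per second, which is only
# an approximation of real-world behaviour.

def pixels_per_second(width, height, hz):
    return width * height * hz

load_1440p_240 = pixels_per_second(2560, 1440, 240)  # what the 1070 nearly sustains
load_1080p_240 = pixels_per_second(1920, 1080, 240)  # the proposed target

ratio = load_1080p_240 / load_1440p_240
print(f"1080p/240 Hz needs {ratio:.0%} of the 1440p/240 Hz workload")  # ~56%
```

So by raw pixel count, 240 Hz at 1080p is about 56% of the work of 240 Hz at 1440p, which lines up with the “bit over half” claim.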


Don’t follow the advice above if you don’t want to build a new PC.

Anyway, the first thing you’d most likely have to do is change your motherboard, as you’re already running one of the best CPUs for that model of board. So if your current setup is bottlenecking your GPU, you’ll have to switch them out.

You want as high a framerate as possible, as it reduces input lag and ensures you have fresher information on screen.

If you can push 200 FPS on your card at low settings, there’s no need to upgrade it. If you can push 144 FPS on low, get the display first, because 144 Hz is a much bigger jump from 60 Hz than 200 FPS is from 144 FPS.

I absolutely don’t agree with the 240 Hz comment; it’s a gimmick. As an objective fact, the difference between 144 Hz and 240 Hz is small due to diminishing returns, frame pacing, etc. There’s a very good, simple graph showing this.

Edit - I see you’re not allowed to post links :frowning: So, search for Scott Wasson (Radeon Technologies Group, co-founder of Ars Technica & The Tech Report) on Twitter with the text “FPS is non-linear” and you should find the image.

You can also check out Linus as he has a video on it. Basically, difference is negligible. What is NOT negligible is the power needed to drive it.
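The non-linearity is easy to verify with simple frame-time arithmetic (frame time in milliseconds is 1000 / FPS): each jump in refresh rate saves fewer milliseconds than the last. A quick sketch:

```python
# Frame time shrinks non-linearly with FPS, so each refresh-rate
# jump saves fewer milliseconds per frame than the previous one.

def frame_time_ms(fps):
    return 1000 / fps

for low, high in [(60, 144), (144, 240)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} Hz saves {saved:.2f} ms per frame")
# 60 -> 144 Hz saves 9.72 ms per frame
# 144 -> 240 Hz saves 2.78 ms per frame
```

Going from 60 to 144 Hz shaves almost 10 ms off each frame; going from 144 to 240 Hz shaves under 3 ms, which is why the second upgrade feels so much smaller.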

Put it this way: I have an 8700K, an Asus ROG Strix 1080 Ti (6% faster than Nvidia’s blower style) and 16 GB of 3200 RAM. The display is a 165 Hz (overclocked with the push of a button, but so pointless that I keep it at 144 Hz, since the downsides like ghosting and longevity outweigh the benefits) ROG Swift PG287QR 1440p display.

It’s a good computer, one of the best gaming PCs currently without going to a ridiculous enthusiast build with Titan Vs etc.

Very few newer AAA titles will run at 120+ maxed out; Overwatch is, of course, one of them. The good thing about my screen is that I have a choice between G-Sync, ULMB, or fixed refresh. I had never bothered with ULMB until Overwatch, and wow, I’m actually impressed. Admittedly, I still keep it on G-Sync because I dislike tearing, but if you don’t care about that, ULMB is pretty amazing; even at the same FPS it makes things feel even more crystal clear.

As I’ve said though, diminishing returns are a big part of this. You won’t notice much difference, if any, between 120 and 144 Hz; even 90 is pretty solid. Once you have a 144 Hz display, though, going back to 60 is as jarring as 30 FPS looks to the people who scream “60 or go home!”. So unless you plan to upgrade in the future when you’re done with OW, it will be very hard to go back to a normal display. With that said, going from 60 to 72 is actually a huge leap; 72 is when it starts to feel more than adequate, because 60 won’t.

I don’t know much about the latest AMD cards, but looking at it, I feel it would be best to upgrade the card before the monitor. Again, though, no experience with them.

First, disable Vsync and see if your current setup can maintain a constant 120+ in-game. If that’s the case, you can safely buy that monitor.
The change from 60 to 120+ is insane, and in terms of aim and delay/responsiveness it’s literally a game changer. You’ll be able to hit flick shots, and your mouse won’t feel “slow” thanks to the lower input lag.

Whether or not you have to change your GPU, the monitor is the best purchase (along with an SSD) that you can make for playing Overwatch, guaranteed.

No one who takes this game “seriously” plays with maxed graphics. It’s a disadvantage. Sure, if you want nice graphics, why not go for it. But if someone is getting a 144 Hz monitor just for OW, I’d guess it’s because of the competitive nature, not because the game looks slightly better. And the OP even says he’s happy with low settings. Low settings at 144+ FPS is doable even with a budget setup: a GTX 1060 3 GB is enough for 144 Hz and is in the ~250€ price range. Even a 1050 Ti, a sub-200€ card, should be enough for 144 Hz gaming on low settings.

https://www.youtube.com/watch?v=KjbbS_sRcEo

The guy is recording, and the FPS drops to ~130 at its lowest, so I’d assume that if you don’t do anything extra while playing, you should be able to get a stable 150+. At least for me, recording/streaming eats my FPS.

That’s hardware I own and can recommend. I get a stable 250+ FPS with it, so yeah, sure, you can get close to that with other hardware too. But from my experience, those parts work like a charm. So, like you put it, I wasn’t pulling numbers out of my ass.

I recommended what I know works.

And to anyone who says they couldn’t tell the difference between 144 and 240: it’s not super huge, but there is a level of fluidity you do notice at 240 compared to 144. It’s true there are diminishing returns for the money you spend on it, though. So yeah, that’s a build for if you’re not on a budget :wink:

However, with higher-end hardware, besides the fact that some of it might be overkill now, you’re future-proofing yourself for the years ahead, so it evens out. Lower-level GPUs will indeed get the job done now, but in a few years they’re going to show their age.

I can hit 200fps with the game maxed out, so why would I need to lower that? If I lock it to 141fps, it stays at that without wavering at all.

I also stated OW is one of the few that can, so again.

Unless the OP plans on using this display just for Overwatch for the rest of his life, sure. But if he ever plans on playing other games, which unlike OW more often than not look horrific on low settings, I don’t see your point.

I didn’t say you couldn’t see the difference, I said the difference is negligible, and it is.

However, I also agreed with you against the other person, who said you’re pulling numbers out of your ass. I have a very good computer, and I know that if you want to play other games at any respectable frame rate, you’re going to need to future-proof your PC; a 240 Hz monitor makes that even more true. Lowering your graphics might work for this game, but as I said, unless they plan on sticking to OW for the rest of their life, I don’t think it’s a sensible option. Lowering the graphics won’t get you a frame rate of that level in every game.


I actually used a 1050 Ti until a month ago, coupled with a Sandy Bridge i7, so an OOOLD one. I never dipped under 150 FPS, and usually ran higher.

Keep in mind that Overwatch is quite heavy on both CPU and memory. Specifically, with fewer than 6 cores (or without HT) you most likely won’t get above 200 FPS in a 12-player match, no matter what video card you have.

I’m so out of touch with AMD models that I don’t know how many threads the OP’s CPU can run at once, or which DDR generation he’s on, but if it’s fewer than 6 threads at 3 GHz, a top-end video card won’t add anything other than higher settings at 200 FPS.

And higher settings make the game harder: bloom, extra speculars, particles, and junk are distracting and obscure things.

Of course it works. My point was that your setup is “overkill” for 240 Hz.

You said you “need” it to ensure 240 FPS. Well, you don’t, really. You can go way lower, and that was my point.

Like I said, go ahead if you want to. But note that on low settings, some objects that might block your line of sight disappear, which puts higher settings at a disadvantage.

Well, I didn’t doubt the 1050 Ti could make it; I haven’t tested it myself, so I can’t say for sure. And yeah, OW is heavy on the CPU, but the OP said his GPU is the bottleneck. Also, I’d recommend 8 GB of RAM for any game these days; going over isn’t needed unless you stream or something along those lines. Just for gaming, 8 GB is easily enough, and 4 GB is really low by today’s standards.

The difference between DDR3 and DDR4 in games is next to nothing, according to some tech pros. I’m not a tech pro, but I think Linus Tech Tips has a video on that, or someone else I trust enough to believe did. (Linus Tech Tips also had a video on 8 GB of RAM being easily enough for gaming today, and I’ve noticed that myself, even though my new rig has 16 GB for reasons.)

The difference is frequency. DDR4 can run much higher than DDR3, and that extra GHz makes a big difference when you have more threads hitting memory at the same time. And when I say memory is a bottleneck for many in OW, I don’t mean the amount, but the speed.
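For a rough sense of what that frequency difference means, the theoretical peak bandwidth of a single 64-bit memory channel is just the transfer rate times 8 bytes. This is a simplification that ignores timings and latency, which also matter in practice:

```python
# Theoretical peak bandwidth of one 64-bit memory channel:
# megatransfers per second * 8 bytes per transfer.
# Ignores timings/latency, which also affect real performance.

def bandwidth_gb_s(mt_per_s):
    return mt_per_s * 8 / 1000  # MT/s * 8 B = MB/s, then convert to GB/s

print(bandwidth_gb_s(1600))  # DDR3-1600: 12.8 GB/s per channel
print(bandwidth_gb_s(3200))  # DDR4-3200: 25.6 GB/s per channel
```

So common DDR4 speeds can offer roughly double the raw bandwidth of common DDR3 speeds, even though per-transfer latency is broadly similar between the generations.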

Finally, it comes down to your budget, something the OP didn’t mention.
If you’re on a tight budget, you could go with a lower config to hit 144 Hz. However, if you can spend more, I’d recommend future-proofing your purchase so that you can run the same rig for at least the next 5 years.

I built my first serious rig 5 years ago, and until recently it was able to pull high FPS. However, there’s only so much you can do with future-proofing, as core technologies themselves start changing. So don’t go overboard with future-proofing; keep it reasonable.

Give us your budget, and I’m sure the community here can help with either upgrade builds or completely new hardware :smiley:

Well, not according to Linus Tech Tips, unless you’re rendering videos etc. I don’t know, maybe you’re more tech savvy.

You don’t even remember what you wrote, nor are you getting the point, as usual. You said “You’ll need a 1080 Ti”, which was wrong, and the other guy proved it to you. That was the issue, not the numbers out of your ass.

DDR4 has a slight benefit over DDR3, even in gaming, but if you compared a high-end DDR3 module to a mid-range DDR4 module, the difference would be really minimal.

Compare a high-end DDR3 module to a high-end DDR4 one and you’ll start seeing a bit more of a gain, but still nothing that significant.

In general, games tend to use one or maybe two threads. Memory speed isn’t an issue in those cases, and an i5 is as good as an i7. That’s in general, though. Overwatch is different.

And yes, I am likely to be more “savvy” in this case, as I’ve been coding for 30 years, a lot of it games and low-level code.

A high-end DDR3 is similar to a low-end DDR4.