I’ll list my specs, which seem to be relatively close to yours, along with a few thoughts on my graphics settings and some principles I followed during setup.
My specs:
- i5 8600K (I always run it without overclock; it’s 6 cores compared to the 4-core i5 6600)
- MSI Gaming X GTX 1060 6GB (no overclock)
- 16GB RAM (2666MHz)
- Dell S2417DG (144Hz 1440p G-Sync)
I’m always running MSI Afterburner while playing Overwatch, mainly to set up an aggressive fan curve (for very low GPU temps), but I also have the Afterburner OSD overlay on all the time without problems.
When I tweaked my graphics settings I had a few goals in mind:
I wanted 60-80% GPU usage on average, for two reasons. The less important one is lower temperatures. The more important one is having some headroom to deal with peaks. Peaks are caused mainly by looking at more complex parts of a map and by having more heroes and effects on screen during intense fights. For this reason, creating custom games (on many different parts of many maps) with easy bots can be a very good idea when testing settings. Always benchmark during gameplay, not in menus: on the hero selection screen, for example, I usually get 90-100% GPU usage for some reason, but it drops back to lower, healthier levels after selecting a hero and running around on the map.
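To put a rough number on that headroom, here is a minimal sketch of the estimate I have in mind, assuming GPU load scales roughly linearly with frame rate (a simplification, and the numbers are illustrative rather than measured):

```python
# Rough GPU headroom estimate from a capped-FPS benchmark.
# Assumption: GPU load scales roughly linearly with frame rate.
fps_cap = 120
avg_gpu_usage = 0.70  # e.g. 70% measured while running around an empty map

estimated_uncapped_fps = fps_cap / avg_gpu_usage
extra_load_margin = (1.0 - avg_gpu_usage) / avg_gpu_usage

print(f"~{estimated_uncapped_fps:.0f} FPS if the GPU were maxed out")             # ~171
print(f"fights can be ~{extra_load_margin:.0%} heavier and still hold the cap")   # ~43%
```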
When it comes to the CPU, I have 6 cores instead of your 4, but the core speeds are very close (3.3GHz vs 3.6GHz). Overwatch seems to use about 50% of my CPU capacity at 120 FPS and only about 15% in the menus. I don’t know how this usage is distributed across my 6 cores, and I don’t know if there is a maximum number of cores OW can utilise. Some games are written to use only a specific number of cores, and those games are pretty much capped by core speed, so a lot of cores don’t help. In some cases my CPU usage goes a bit higher, but that might be due to other Windows background processes. CPU usage depends mainly on FPS (twice the FPS, twice the usage); in some games the quality of special effects can also affect it, especially when the effects require physics simulation. For these reasons, FPS capping is the best way to control CPU usage.
In my opinion it’s a good idea to cap the FPS to a value that your CPU can handle in a very stable manner. Inconsistent FPS can degrade your aiming performance and perhaps other things. While FPS scales GPU usage too, GPU usage can be controlled in a lot of other ways, unlike CPU usage. For this reason I usually set the FPS cap based on the CPU, unless that FPS is so high that graphics would have to be dumbed down too much; in that case I lower the FPS further for the sake of the GPU too. It is a good idea to dumb down graphics to low or medium while you are tweaking the FPS based on your CPU usage - this way you can experiment with CPU limits without interference from GPU limits.
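A minimal sketch of how I think about picking the cap, using the rough “twice the FPS, twice the usage” scaling from above (the numbers are examples, not measurements):

```python
# Pick an FPS cap from CPU usage, assuming usage scales roughly
# linearly with frame rate (the "twice the FPS, twice the usage" rule).
measured_fps = 120
measured_cpu_usage = 0.50  # 50% CPU usage observed at 120 FPS
target_cpu_usage = 0.70    # leave ~30% for peaks and background processes

max_stable_fps = measured_fps * target_cpu_usage / measured_cpu_usage
print(f"cap somewhere at or below ~{max_stable_fps:.0f} FPS")  # ~168 FPS
```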
I’m using a 120 FPS cap, which is very stable for me in every situation. The fact that it’s less than 144Hz doesn’t cause stutter. This might be helped by the fact that I have G-Sync, but some games can cap the FPS fairly well in software, without hardware sync and without causing much input delay, while other games get stutter and/or serious input delay.
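For reference, a software frame cap is conceptually just the game waiting out the rest of each frame’s time budget; how precisely that wait is implemented is a large part of why some caps feel smooth and others add stutter or input delay. A minimal sketch of the general idea (not Overwatch’s actual limiter):

```python
import time

TARGET_FPS = 120
FRAME_TIME = 1.0 / TARGET_FPS  # ~8.33 ms budget per frame

def render_frame():
    pass  # placeholder for the game's actual simulation + rendering work

next_deadline = time.perf_counter()
for _ in range(10):  # stand-in for the game loop
    render_frame()
    next_deadline += FRAME_TIME
    # Coarse sleep first, then a short busy-wait for precision.
    while (remaining := next_deadline - time.perf_counter()) > 0:
        if remaining > 0.002:
            time.sleep(remaining - 0.002)
```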
As you can see, I have a 1440p monitor and a GTX 1060, which can’t run recent games at 1440p at 100+ FPS with max settings. Despite this, it can do a pretty good job at relatively high settings. Here is how:
After settling on 120 FPS, I tweaked my graphics settings to get 60-80% average GPU usage during fight-free map navigation. Obviously it is better to dumb down settings that have a high impact on GPU usage but relatively little impact on perceived render quality. The things I usually turn down immediately when I can’t use max settings are very high quality shadows (ambient occlusion off, lower shadow detail) and reflections, then lower quality antialiasing, then medium effect quality if needed. If a setting has “High”, “Ultra”, “Epic” options, then “High” usually gives excellent quality, sometimes with much less GPU usage.
If lowering the above things doesn’t help you reach the desired GPU usage, then instead of lowering graphics settings further (and compromising a lot of render quality) I recommend lowering the “render scale” setting, which has a huge impact on GPU usage. I lowered it to 75% in my case. This means that instead of the native resolution (which is 1440p in my case), the complex 3D scene rendering is done at 75% size and then upscaled to the native resolution. Overwatch is very smart and draws the HUD (HP, menus, crosshair, etc.) after upscaling the rendered 3D scene, so those elements make full use of my crisp 1440p resolution.
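To show why this helps so much, here is the pixel math as a small sketch, assuming the percentage is applied to each axis (which is how it behaves for me):

```python
# Pixel-count math for render scale, assuming the percentage
# applies to width and height separately.
native_w, native_h = 2560, 1440  # 1440p native resolution
render_scale = 0.75

render_w, render_h = int(native_w * render_scale), int(native_h * render_scale)
pixel_ratio = (render_w * render_h) / (native_w * native_h)

print(f"3D scene rendered at {render_w}x{render_h}")            # 1920x1080
print(f"that is about {pixel_ratio:.0%} of the native pixels")  # ~56%
```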
Both 144Hz and 1440p were risks in my case with a GTX 1060, but in my opinion buying the monitor was a very good choice. One reason is that dealing with my issue was fairly simple with the previously described methods. Another is that browsing and reading text at a higher resolution and higher DPI is much better - this monitor is more general purpose if you have only one display for everything. Another is that 1440p gaming is the next step, so this monitor is a bit future-proof. After upgrading my GPU and perhaps my CPU, I’ll still have a pretty decent monitor.
Regarding temperatures: I prefer going for lower temps and hardware health, which means I don’t really overclock and I accept noisier/higher fan speeds. My GPU fan curve is aggressive and keeps my GPU at very low temps, around 60C. My CPU temperature fluctuates a lot between 35C and 55C depending on usage, but most of the time it’s between 45-50C at 50% CPU usage. Obviously, your normal low GPU and CPU temperatures might be different, but I recommend buying a very large CPU cooler. I’m using an NH-D15 (about 1kg), and the smallest I’d go for with my CPU is perhaps an NH-D9L (about half a kilogram), but I’m not sure it could properly cool my i5 8600K at 100% usage (although Overwatch uses only 50% with my settings).
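For illustration, an “aggressive” curve basically means ramping the fans hard well before the card gets warm. A hypothetical sketch (made-up points, not my exact Afterburner curve):

```python
# Hypothetical aggressive GPU fan curve: (temperature C, fan %) points,
# with linear interpolation in between. Not my exact Afterburner settings.
CURVE = [(30, 30), (45, 55), (55, 75), (65, 100)]

def fan_percent(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(60))  # already ~87% fan speed at 60C
```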
My opinion about 60Hz vs 120Hz: the difference is definitely visible and the game feels smoother, so I recommend it. However, it doesn’t necessarily improve aiming a lot. In my experience humans are pretty good at predicting moving objects at 60Hz too (I’m playing hitscan with decent accuracy). However, with a 60Hz monitor I’d check out some reviews regarding total input latency. Total input latency is usually good for most 144Hz gaming monitors but can be pretty bad in the case of some average 60Hz desktop monitors. I’ve heard that some people use 60Hz TVs, which are even more prone to very high input latency.
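Just to put rough numbers on the refresh-rate part of that latency (a back-of-the-envelope sketch; it ignores the display’s internal processing, which is usually where average 60Hz monitors and TVs lose the most):

```python
# Latency contributed by the refresh interval alone; total input lag
# also includes game/render time and the display's own processing.
for hz in (60, 120, 144):
    interval_ms = 1000 / hz
    print(f"{hz:>3} Hz -> {interval_ms:5.1f} ms per refresh "
          f"(~{interval_ms / 2:.1f} ms average wait for the next one)")
```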