I realise that this will be a very simple question, but something I’m trying to get my head around.
With an ability such as Bloodthirst for example, it’s rated to hit at 45% of AP. The actual damage to mobs is different every time. I can hit the same mob with white hits or yellow hits and the damage is never the same.
I get that the damage is being reduced somehow, but how is this calculated? Is there just a specific roll for damage to be reduced, with a certain percentage knocked off dependent upon that roll?
They made abilities have a random +/- 5% dmg for “flavor”.
Different mobs have different amounts of armor.
Add onto that random trinket/azerite/etc. procs increasing dmg/strength, and it's not hard to see why it varies.
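A rough sketch of how those factors could stack up. Blizzard doesn't publish the live formulas, so the armor mitigation here is the commonly cited Classic-style approximation, and all the numbers are made up for illustration:

```python
import random

def roll_damage(base_hit, armor, attacker_level=60, proc_multiplier=1.0):
    """Hypothetical damage roll: +/-5% variance, then armor mitigation.

    The armor formula is the commonly cited Classic-style one;
    treat it as an illustration, not the live game's exact math.
    """
    # Random +/-5% "flavor" variance on the raw hit
    varied = base_hit * random.uniform(0.95, 1.05)

    # Armor reduces physical damage by a percentage that depends on
    # the attacker's level (Classic-style approximation).
    reduction = armor / (armor + 400 + 85 * attacker_level)

    return varied * (1.0 - reduction) * proc_multiplier

# Same ability, same mob: every call lands somewhere different in the range.
hits = [roll_damage(base_hit=1000, armor=3000) for _ in range(5)]
print([round(h) for h in hits])
```

With 3000 armor against a level-60 attacker the reduction works out to about 35%, and the ±5% variance then spreads the hits around that mitigated value.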
I think that the damage range of the weapon(s) factors in too.
Say you have a sword that does 100-200 damage; then each hit does between 100 and 200 (plus talents and AP and stuff, minus armor). That's why it's different every hit.
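A quick sketch of that idea. The AP contribution uses the commonly cited Classic rule of thumb (14 AP = 1 DPS, scaled by weapon speed); the function name and all the numbers are assumptions, not the game's exact math:

```python
import random

def white_hit(weapon_min, weapon_max, attack_power, weapon_speed):
    """Hypothetical auto-attack roll: weapon damage range plus an AP bonus.

    Classic-style rule of thumb: 14 AP = 1 DPS, scaled by weapon speed.
    Armor mitigation would then apply on top (omitted here).
    """
    weapon_roll = random.uniform(weapon_min, weapon_max)
    ap_bonus = (attack_power / 14.0) * weapon_speed
    return weapon_roll + ap_bonus

# A 100-200 damage sword with 700 AP and 2.6 speed: hits land anywhere
# in the rolled range, so no two swings are likely to match even before
# armor and procs enter the picture.
print([round(white_hit(100, 200, 700, 2.6)) for _ in range(5)])
```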
What Valkia is talking about is called Spell Variance.
It’s how it worked from the start of WoW; it’s a staple design of RPGs and has been for longer than WoW has been a thing.
They removed it in Legion, and it’s what they’re bringing back in Shadowlands at ±5%. (No clue what the variance was before Legion, but it seems it wasn’t just ±5%.)
It’s also a thing in Classic, because Classic doesn’t have Legion’s scaling which is what caused it to stop functioning in the first place.