One question that's come up from time to time in gaming is whether older GPUs get slower over time. ExtremeTech has examined this question before with respect to software updates, meaning we checked whether older GPUs lost performance as a result of later driver updates that were not as well optimized for older GPU architectures. While our driver tests found no evidence of software-driven slowdowns, we didn't check whether aging GPU hardware could affect performance.

A new investigation of an 18-month-old RTX 2080 Ti claims to have uncovered evidence that an old GPU will run more slowly than a newer card, based on a comparison of two (we think) MSI GeForce RTX 2080 Ti Gaming X Trio GPUs. Unfortunately, based on available information, there's no way to confirm that conclusion. At best, what the YouTube channel Testing Games has established is that long-term mining might slow down a GPU.

At first glance, the findings seem unequivocal. Testing Games runs a suite of games, including Assassin's Creed Valhalla, Battlefield V, Cyberpunk 2077, Forza Horizon 4, Horizon Zero Dawn, Kingdom Come: Deliverance, Mafia: Definitive Edition, and Red Dead Redemption. The MSI Gaming X Trio card used for crypto mining for the past 18 months is typically about 10 percent slower than the new MSI Gaming X Trio GPU that hasn't been used for mining. Spot checks of various games show that the used RTX 2080 Ti runs 15-20 degrees hotter than the new card, uses somewhat less power, and hits a lower maximum clock speed. This would seem to be an open-and-shut demonstration that mining can wear out a GPU, but there are some problems with this analysis.

First, the authors don't appear to have re-pasted or dusted the used GPU. Dust is an absolutely magnificent insulator, and enough of it will easily destabilize a gaming rig. This alone could account for the higher temperatures and lower clocks on the used card, no other explanation needed.

Second, these are two different cards rather than the same GPU tested before and after being used extensively for mining. The latter would be more useful. The wide use of turbo clocks in GPUs and CPUs today allows for variations in binning that can affect the final result. It could be that the newer card fielded a better core, allowing for higher base performance and effectively invalidating our ability to derive any useful information from this comparison. The official boost clock on the MSI Gaming X Trio is 1755MHz, which means both GPUs are shown running above this specification. It is possible that some of the variance between the two GPUs reflects SoC quality.

If these are two different GPUs, we also don't know if they use an identical VBIOS version or exactly the same brand of RAM. Micro-timings and VBIOS updates can introduce their own performance changes. The newer GPU also often outperforms the older GPU by more than the difference in clock speed would indicate. The clock gap as measured in-game is on the order of 3-5 percent (it varies depending on where you are in the run), while the performance variation is 8-12 percent. The RAM clock is supposedly locked to an effective 7GHz (14Gbps) across both cards.

There's another point I want to bring up: These numbers are a little odd as far as the implied relationship between clock variation and actual observed performance.

From Testing Games. The clock speed gap, current FPS difference, and the average FPS score across the benchmark run do not agree with each other. A nearly 20 percent momentary performance difference is shown to correspond to a 6.6 percent clock variation, with the older card running nearly 10 percent slower overall. These gaps are present in nearly every title, at any given moment of measurement.

GPU clocks and performance results do not typically move in lockstep. Increase the GPU core and memory clocks by 10 percent, and a game's performance may only improve by 6-8 percent. This is expected, because there's always the chance that a benchmark is slightly limited by some other aspect of the system. The expected result from a linear clock speed increase is a linear-to-sublinear improvement in performance. It therefore follows that the expected impact of reducing the clock is a linear-to-sublinear reduction in performance.
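That sublinear expectation can be sketched with a simple Amdahl's-law-style model, in which only the clock-bound share of frame time shrinks as the clock rises. The fractions below are illustrative assumptions for the sake of the arithmetic, not measurements from any real game:

```python
def expected_speedup(clock_gain, clock_bound_fraction):
    """Estimate the frame-rate gain from a GPU clock increase.

    Only the clock-bound share of frame time scales with clock speed;
    the rest (CPU, memory, driver overhead) stays fixed.
    """
    old_time = 1.0
    new_time = (1.0 - clock_bound_fraction) + clock_bound_fraction / (1.0 + clock_gain)
    return old_time / new_time - 1.0

# Fully clock-bound: a 10 percent clock bump yields exactly 10 percent more FPS.
print(f"{expected_speedup(0.10, 1.0):.1%}")  # 10.0%
# If only 70 percent of frame time is clock-bound, the gain shrinks to ~6.8 percent.
print(f"{expected_speedup(0.10, 0.7):.1%}")  # 6.8%
```

Under this model, a clock change can never produce a larger percentage swing in frame rate than the percentage swing in clock speed, which is why a two-to-one ratio in the other direction stands out.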

The results in this video show the opposite, in almost every case. Apart from Kingdom Come: Deliverance, the gap between the reported GPU clocks is about half the size of the performance gap. 4-6 percent clock speed differences are associated with 8-15 percent performance shifts.

This could be a result of polling errors in the utilities being used to gather this information. Alternately, it suggests some other variable in play that hasn't been accounted for in the YouTube video above. The used GPU could be hitting thermal limits and throttling itself back, but doing so more quickly than the monitoring utility can detect. Most polling utilities only poll once per second, while GPUs are capable of adjusting their clocks in a matter of milliseconds. It's possible that the used GPU's clock looks more stable than it would if we had finer-grained reporting tools.
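A toy simulation shows how once-per-second polling can misrepresent a clock that throttles on millisecond timescales. The throttle pattern here is entirely hypothetical, chosen only to illustrate the sampling problem:

```python
# Hypothetical clock trace: the card drops from 1900MHz to 1500MHz for
# 50ms out of every 300ms (illustrative numbers, not measured data).
def clock_at(ms):
    return 1500 if ms % 300 < 50 else 1900

trace = [clock_at(ms) for ms in range(10_000)]  # 10 seconds at 1ms resolution

true_avg = sum(trace) / len(trace)
polled = trace[::1000]  # once-per-second sampling, like most monitoring tools
polled_avg = sum(polled) / len(polled)

print(f"true average clock: {true_avg:.0f}MHz")  # 1832MHz
print(f"1Hz polled average: {polled_avg:.0f}MHz")  # 1740MHz
```

Depending on how the 1-second samples happen to line up with the throttle period, a monitor polling at 1Hz can overstate the real average clock, understate it, or miss the dips entirely.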

Testing Games has not released any follow-up information on its testing protocols or whether this comparison was performed on the same GPU at two different points in time or on two different GPUs purchased at different times. It also hasn't released any discussion of why these results point to greater-than-linear performance differences despite linear differences in GPU clock and no changes to memory clock.

Until these questions are answered, the idea that heavily mined cards lose gaming performance can only be considered a theory. We're not saying the theory is wrong, but it hasn't been properly tested yet. More information is needed, either from Testing Games or from other sources, to demonstrate the accuracy of this claim.

Now Read:

  • Nvidia's RTX 3060 Picks Up a 10 Percent FPS Boost
  • Nvidia Hints at More GPU Mining Restrictions
  • Nvidia Says It Won't Nerf Crypto Mining on Existing GPUs