Frame rate caps, set in-game or through the graphics driver, can be useful for several reasons. A locked frame rate (fps) can improve frame pacing when the CPU, GPU, or other hardware is a bottleneck, and it can conserve power when the output frame rate would otherwise exceed your monitor's refresh rate. AMD leans into this with its Radeon Chill feature, which cuts power draw when the user is idle or away from the screen, and similar behavior would be expected from Nvidia GeForce cards when the frame rate is locked.
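To illustrate why a cap can save power at all: a frame limiter inserts idle time between frames whenever the hardware finishes early, and during that idle time the GPU can (in principle) drop clocks and voltage. The sketch below is a generic, simplified pacing loop, not Nvidia's or AMD's actual limiter implementation; the function names are illustrative.

```python
import time

def run_capped_loop(render_frame, fps_cap, num_frames):
    """Run `render_frame` num_frames times, pacing to fps_cap.

    Returns the total time spent sleeping. That sleep is where the
    power savings come from: hardware that finishes a frame early can
    idle until the next frame is due -- assuming the driver's boost
    algorithm actually drops clocks during the idle gap, which is
    exactly what CapFrameX's results call into question for Ampere.
    """
    frame_budget = 1.0 / fps_cap          # seconds allotted per frame
    idle_time = 0.0
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                    # stand-in for real GPU work
        next_deadline += frame_budget
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:                 # frame finished early
            idle_time += remaining
            time.sleep(remaining)         # the GPU could clock down here
    return idle_time
```

With a fast renderer and a 60 fps cap, most of each ~16.7 ms frame budget ends up as idle time, which is why a capped 60 fps workload should draw far less power than an uncapped one.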
However, according to tests by CapFrameX, maker of the performance-monitoring tool of the same name, Nvidia's Ampere-based RTX 3000 cards don't deliver those savings. CapFrameX believes Nvidia's stock boost algorithm fails to scale core clocks and voltages down efficiently: the cards tested behaved as if maxed out even with the frame rate limiter in place.
In the image below, an RTX 3090 was tested with a 60 fps limit in Shadow of the Tomb Raider (SotTR). At default settings (right), the GPU consumed almost twice as much power (~195 W) as a manually configured RTX 3090 (left), which drew just ~110 W.
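Readers who want to check their own card's behavior under a frame cap can sample board power with the `nvidia-smi` command-line tool, which ships with Nvidia's driver. A minimal sketch, assuming the standard `--query-gpu` interface (the helper names here are our own, not part of any tool):

```python
import subprocess

def parse_power_csv(text):
    """Parse `csv,noheader,nounits` output from nvidia-smi into floats,
    one value per GPU. Split out as a pure function so it can be
    tested without an Nvidia GPU present."""
    return [float(line) for line in text.strip().splitlines()]

def gpu_power_draw_watts():
    """Query the current board power draw of each installed GPU.
    Requires an Nvidia GPU and driver to actually run."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # one line per GPU, e.g. "195.30"
    return parse_power_csv(out)
```

Sampling this while a capped game runs is a rough way to reproduce the kind of default-vs-tuned comparison CapFrameX made, though their tool records far more detail.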
The behavior was the same with the RTX 3070 Ti. Caps of 60, 120, and 144 fps, as well as an uncapped scenario, were tested at two resolutions, 1080p and 1440p. Besides SotTR, the test suite included Strange Brigade, Doom Eternal, and Battlefield V.
On the AMD side, a Radeon RX 6800 XT was used for comparison. We contacted CapFrameX about the driver versions used in the test, but so far we haven't heard back.
For more details, see CapFrameX's original article linked under the source below.
Source and images: CapFrameX