Nvidia's latest DLSS iteration has arrived, and it is proving more power-hungry than anticipated. While the new version promises enhanced image stability and performance gains, benchmarks reveal a substantial increase in power consumption that could reshape how gamers approach high-end graphics.
The RTX 5090, Nvidia's flagship GPU, shows a particularly dramatic increase when running DLSS 4.5 in demanding titles like Cyberpunk 2077. In some cases, the card consumes up to 50 watts more than with DLSS 4, while frame rates do not always improve proportionally. The discrepancy suggests that the transformer-based model delivers its visual enhancements at a significant computational cost.
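One way to make that discrepancy concrete is a frames-per-watt comparison. The FPS and wattage figures in the sketch below are hypothetical placeholders, not measured results; only the roughly 50-watt delta comes from the benchmarks described here.

```python
# Illustrative frames-per-watt comparison. All FPS and wattage figures
# are hypothetical placeholders; only the ~50 W delta reflects the
# benchmark observations discussed above.

def frames_per_watt(fps: float, watts: float) -> float:
    """Efficiency metric: frames rendered per watt consumed."""
    return fps / watts

# Hypothetical Cyberpunk 2077 numbers on an RTX 5090:
dlss4 = frames_per_watt(fps=110.0, watts=450.0)
dlss45 = frames_per_watt(fps=118.0, watts=500.0)

change = (dlss45 - dlss4) / dlss4 * 100
print(f"DLSS 4:   {dlss4:.3f} FPS/W")
print(f"DLSS 4.5: {dlss45:.3f} FPS/W")
print(f"Efficiency change: {change:+.1f}%")  # small FPS gain, lower FPS/W
```

With these placeholder numbers, a modest frame-rate gain still works out to a few percent worse efficiency, which is exactly the pattern the benchmarks point to.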
Even older GPUs like the RTX 3060 Ti see increased power draw, though the absolute figures are less extreme given their lower baseline wattage. The trend holds across multiple game engines: in tests of Spider-Man Remastered and Stalker 2, the power differential remains consistent despite differences in rendering approaches.
For users already running high-wattage GPUs like the RTX 5090, the impact may be less concerning. However, those on tighter power budgets or using mid-range hardware could face noticeable power and thermal trade-offs when enabling DLSS 4.5's most aggressive presets. The question remains whether the visual improvements justify the increased consumption in real-world scenarios.
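To put the extra draw in household terms, a quick back-of-the-envelope estimate helps. The 50-watt delta comes from the benchmarks above; the hours of play and electricity price are assumptions you would adjust to your own usage and local rates.

```python
# Back-of-the-envelope annual cost of the extra draw. The 50 W delta is
# from the RTX 5090 benchmarks above; hours per day and electricity
# price are assumed values.

extra_watts = 50       # additional draw with DLSS 4.5 (from benchmarks)
hours_per_day = 2      # assumed daily gaming time
price_per_kwh = 0.15   # assumed electricity price in USD

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
annual_cost = extra_kwh_per_year * price_per_kwh

print(f"Extra energy: {extra_kwh_per_year:.1f} kWh/year")  # 36.5 kWh
print(f"Extra cost:   ${annual_cost:.2f}/year")            # ~$5.48
```

Under those assumptions the bill impact is small; the bigger practical concerns are heat output and power-supply headroom rather than cost.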
- Performance Impact:
  - Up to 50 watts additional power draw on the RTX 5090
  - Consistent increases across multiple GPU generations
  - Frame rate gains vary by preset and game title
The new DLSS version was unveiled at CES 2025, building on its predecessor's transformer architecture, an approach roughly five times more compute-intensive than the earlier CNN-based models. While the technology promises stability benefits, particularly in 4K gaming, users should weigh those against the real-world power implications before adopting it widely.
For those already equipped with high-end hardware, the trade-off may prove worthwhile. But for others, the increased demand could push GPUs closer to their thermal and electrical limits, potentially requiring careful monitoring or system adjustments to maintain stable operation under heavy loads.
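For readers who want to verify the delta on their own systems, NVIDIA's NVML library exposes live board power. Here is a minimal logging sketch, assuming the nvidia-ml-py bindings (`pip install nvidia-ml-py`) and a single NVIDIA GPU at index 0:

```python
# Minimal power-draw logger using NVIDIA's NVML bindings.
# Assumes `pip install nvidia-ml-py` and an NVIDIA GPU at index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes
    name = name.decode()

try:
    # Sample board power once per second; run a game session with
    # DLSS 4, then again with DLSS 4.5, and compare the averages.
    while True:
        milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)
        print(f"{name}: {milliwatts / 1000:.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Logging both presets over the same in-game scene gives a cleaner comparison than spot-checking a utility overlay, since power draw fluctuates heavily from frame to frame.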
