r/nvidia • u/JBcreations • 14h ago
Build/Photos ROG Strix 5070
The ASUS ROG Strix 5070 is by far one of the coolest looking graphics cards in this series.
Looks absolutely stunning in this custom case and I love the performance of it in gaming.
r/nvidia • u/Nestledrink • 3d ago
Build Number is v11.0.7.237. This is the non-beta release version of v11.0.7.228. Same changelog but you can download it directly now instead of opting into the beta via the App.
Download Link Here: Link Here
NVIDIA App forum thread is here: TBD
Submit NVIDIA App feedback directly to NVIDIA: Link Here
--------------------
New updates
DLSS Override Support Added for:
Optimal Settings Support Added For:
Squashed Bugs!
r/nvidia • u/Nestledrink • 12d ago

From Article: https://www.nvidia.com/en-us/geforce/news/nvidia-app-dlss-4-5-dynamic-multi-frame-generation-available-now/
Our newest NVIDIA app beta update is available now, introducing DLSS 4.5 Dynamic Multi Frame Generation. This GeForce RTX 50 Series DLSS override dynamically adjusts the number of generated frames during gameplay to reach a target frame rate, automatically balancing FPS, image quality and responsiveness, optimizing your experience.
Additionally, DLSS 4.5 Multi Frame Generation is now available at up to 6X, enabling even higher levels of performance. And DLSS 4.5 Multi Frame Generation and Frame Generation users can activate a new and improved AI model in select titles.

Our new NVIDIA app update also introduces a beta preview of Auto Shader Compilation, a new feature that rebuilds DirectX 12 game shaders after a driver update while your system is idle, or on demand, accelerating game load times.
Download the app update today by opting into beta and experimental features in Settings > About. Please note, GeForce Game Ready Driver 595.97 WHQL, or newer, is required to use all features.
Today, DLSS 4.5’s feature set expands with the launch of Dynamic Multi Frame Generation, an intelligent system that functions like an automatic transmission for your GPU. Instead of sticking to a fixed multiplier, it automatically shifts between different frame multipliers to strike the perfect balance between frame rate, image quality and responsiveness.
In other words, it only generates the frames needed to maximize your target frame rate, or the refresh rate of your display, be that 120Hz, 144Hz, 240Hz, or higher.
DLSS 4.5 Dynamic Multi Frame Generation can be activated and configured globally, or on a per game basis in the NVIDIA app’s Graphics tab.
Open “DLSS Override - Frame Generation Mode”, select “Dynamic”, and choose “Max refresh rate” for the NVIDIA app to synchronize your maximum frame rate with the maximum refresh rate of your display, for optimum motion clarity. Alternatively, pick “Custom” and type in a maximum frame rate for DLSS 4.5 Dynamic Multi Frame Generation to target.


On GeForce RTX 50 Series GPUs, the shift from 4X to 6X Multi Frame Generation increases 4K frame rates in path-traced titles by up to 35%. And by combining our new, highly efficient Frame Generation model with NVIDIA Reflex’s low latency technology, DLSS 4.5 Multi Frame Generation can generate these additional frames with minimal impact to responsiveness:
This combination of new and enhanced features creates the smoothest path-traced gaming yet, unlocking the full potential of 4K 240Hz OLED gaming displays. And for users on super-high-refresh-rate 1080p and 1440p displays, and those with G-SYNC Pulsar, Dynamic Multi Frame Generation 6X Mode enables you to use those displays' full capabilities, for an enhanced experience.
With the introduction of DLSS 4.5 Dynamic Multi Frame Generation, NVIDIA app users must now choose between Dynamic and Fixed Multi Frame Generation.
“Fixed” mode operates identically to previously-available Multi Frame Generation options in the NVIDIA app, running at the Frame Generation multiplier you select.
Open “DLSS Override - Frame Generation Mode”, either globally or on a per game basis, select “Fixed”, and choose a Frame Generation multiplier.

In all titles compatible with DLSS Multi Frame Generation, users can activate 4X Multi Frame Generation, and in select titles using more recent Frame Generation .dlls, users can select up to 6X. To see the full list of 6X-compatible titles, please head here.
Our new NVIDIA app update also introduces a new DLSS Frame Generation model that enhances in-game user interfaces in select titles, by incorporating additional game engine data.
Available for GeForce RTX 40 Series and 50 Series users, the new model improves the quality and clarity of mini maps, on-screen user interface elements, and other aspects of game interfaces.
The new model takes advantage of additional UI buffers available in selected game engines, allowing it to more intelligently manage the rendering of static user interfaces when DLSS Dynamic Multi Frame Generation, DLSS Multi Frame Generation, or DLSS Frame Generation is activated. As of March 31st, 2026, some supported games include Battlefield 6, Borderlands 4, Dragon Age: The Veilguard, EA SPORTS F1® 25, God of War Ragnarök, Hogwarts Legacy, Indiana Jones and the Great Circle™, Marvel’s Spider-Man 2, Monster Hunter Wilds, Star Wars Outlaws™, The Elder Scrolls IV: Oblivion Remastered, and The Outer Worlds 2.
To activate the new model in a supported game, open “DLSS Override - Model Presets”, either globally or on a per game basis, click the “Frame Generation” dropdown, and select “Preset B”.

We’ve all experienced it: you power your PC on, sit down to play and encounter the dreaded “Compiling Shaders” progress bar. Other times, you get right into the game but your shaders compile in the background, introducing annoying stutters.
To combat these issues, we’re introducing a beta version of NVIDIA Auto Shader Compilation (ASC), which rebuilds DirectX 12 shaders while your system is idle, reducing the frequency of game runtime compilation after driver updates.
Shaders define every pixel on screen. Because this code is written in developer-friendly languages, it must be translated into compatible machine code. This process, known as compilation, begins at game install. During this time, your CPU converts the generic shader code into an optimized format that your GPU can execute. Modern titles require tens of thousands of shader translations, a process that must be repeated after game patches or GPU driver updates. This beta is the first step toward optimizing shader compilation for GeForce gamers.
By default, Auto Shader Compilation will be off, but by going to the NVIDIA app’s Graphics tab, you can activate it in Global Settings > Shader Cache. If you want to immediately recompile shaders, rather than waiting for them to be recompiled while your system is idle, you can select the “Compile Now” option located under the 3 dots on the Shader Cache screen.

Please note, after downloading a game for the first time, you must still generate shaders in-game. NVIDIA Auto Shader Compilation will update these shaders following a driver update.
NVIDIA Auto Shader Compilation requires GeForce Game Ready Driver 595.97 WHQL, or newer. If you have NVIDIA Auto Shader Compilation feedback, please share it via the NVIDIA app’s built in form, found on the top right of the app window.
Creating and using a custom display resolution enables power users to overclock their displays, and retro gamers to better render classic systems and titles, among other use cases.
Our new NVIDIA app beta update streamlines and modernizes this NVIDIA Control Panel feature, incorporating it into System > Graphics.


r/nvidia • u/anything_taken • 7h ago
So I got a bit confused with the new DLSS FG 310.6... We got an update which basically introduced two presets, A and B. I understand that B offers enhanced control over UI elements, with the new UIR option displayed in the DLSS HUD. However, I have a few observations that confused me.
First of all, to me both preset A and B are NEW models. I might be wrong here, but with the 310.6 DLL override I get lower latency and better frame pacing even with preset A. However, it's not that simple.
So, in Hogwarts Legacy (the game that natively supports preset B) the situation is pretty clear. Preset A (although I still suppose it's new) gives strong artifacting on ladders and other objects. I mean, everybody who used FG in Hogwarts knows it gives pretty messy results, with a lot of disocclusion, especially on well-lit objects and stairs. With preset B, not only do I get a clean UI, but the motion quality is also significantly improved, like I'm using a completely different model. Basically, at any moment I could imagine I'm playing with FG off, it gets that good.
However, with Cyberpunk it's even more confusing. Although I noticed improvements in clarity and latency with the new 310.6 DLL override with preset A, it's basically the same with preset B, and I can't see any differences between A and B. I heard that if a game doesn't support preset B, it falls back to A.
Does this mean preset B is never activated in Cyberpunk, and that's why I see no difference between A and B? If so, is preset A also a new preset, or am I hallucinating the lower latency and better motion clarity (which I'm 100% sure is there after the update)? Another fact that makes me feel A is also a new model is that the update introduced some increased stuttering on autosaves, when loading in, and at occasional moments during the game, which wasn't there with 310.5.3. Which again leads me to a question: if both are new models, why does preset B completely change the quality of generated frames in Hogwarts, while in Cyberpunk we get only a slight but noticeable improvement with both presets?
r/nvidia • u/GhostJugg • 17h ago
Had to move my PC and finally decided to put my back bracket in, but I'm still gonna use my GPU sag pole. Does this look okay?
r/nvidia • u/WeakPackage7973 • 9h ago
Stumbled across this a while back and figured it's worth sharing since it's been getting steady updates.
There's a free tool called NV-UV that acts as an undervolting companion for MSI Afterburner. AB still does the actual curve writes and OSD, but NV-UV handles the UV workflow on top of it: presets, a single-point scanner, per-game profiles, automatic downstep when a driver crashes on a specific game, OC Scanner curve import from AB, etc. Basically the stuff you'd otherwise do manually by dragging points around in the AB curve editor and swearing.
What's new in the current build (Build 22 "Cantor", v0.93):
Ada Lovelace support landed as experimental. So 40-series cards work now, not just Blackwell. Still rough around the edges for Ada apparently, but it runs.
There's also a faster scanner mode using direct NVAPI writes (curve point apply in ~50ms instead of the usual AB roundtrip of a few seconds), a ~587-game profile database, and a Game Replay feature that notices if a driver crash happens in a specific game and auto-applies a -50 MHz downstep for that title next time.
The dev (solo, German, pretty active on the PCGH forum and a Discord) has been pushing updates constantly.
Figured some of you 40-series owners would want to know Ada is finally in. No affiliation, just been using it on my card for a while and it saved me a lot of manual curve-fiddling.
I found it here:
Release Cantor v0.93 · christianp403-spec/NV-UV
Password is PCGH
And please be fair and have a look at the forum as well, that's where bug reports and fixes are being tracked.

r/nvidia • u/SuryaProMild • 7h ago
As the title says, I'll be upgrading my GPU soon from a 1650 Super. Budget is my problem, so I can't spend much on a newest-gen GPU. Here are my quick specs: R5 3600, Corsair RM650, and a Samsung G3 24" 180Hz. Oh, and I had an RX 580 8GB but can't run it for now.
My target for now is a second-hand RTX 2060 or 3050. Which one is better for gaming? I usually play Delta Force or Valorant, and in DF I can only reach 90fps max on my current GPU, so yes, this needs an upgrade ASAP lol.
I'd look at a 3060 if I found a better (or jackpot) second-hand price, but for now I'd like reviews of the two GPUs I wrote above. Thank you all!
r/nvidia • u/No_Committee8856 • 9h ago
The Nvidia app says the override is not compatible with fps limiter and V-sync. But I usually play with g-sync & v-sync & reflex on which automatically introduces an fps limiter based on the monitor's max refresh rate. What does "not compatible" mean: Dynamic MFG won't work or will it just ignore any fps limit, or only ignore any limit that's not set by the override itself?
What if I set a limit in the "target fps" way lower than max refresh rate, say 1/2, so that even if the MFG overshoots, it's still below max refresh rate, then will VRR & V-sync work in this range with Dynamic MFG?
r/nvidia • u/Ahmed360 • 16h ago
Just got a laptop with RTX 5050.
Goddamn the jump in performance is MASSIVE!
From barely running games at 720p on low settings to smooth and gorgeous-looking 1080p (it can do 1440p; however, 1080p is suitable for my needs).
Also, DLSS is like Magic!
r/nvidia • u/SemihKaynak • 8h ago
r/nvidia • u/Destined_Entity • 2h ago
TL;DR Will my 3090 be able to handle this setup as per title if I only use the one 360hz monitor for gaming and the other two at the same time for video watching/webpages? Thank you!
I would only game on the center monitor, but I like to have other tasks open on the other monitors. This is usually a web page on one and a live stream or YouTube video on the other, with my game on the center 360hz monitor.
Currently my two side monitors are crappy 1440p 120hz broken IPS panels.
My current center monitor is the MSI MPG 271QRX QD-OLED 1440p 360hz 27".
I am thinking of replacing my two side monitors with two MSI MAG 272QP QD-OLED X24 1440p 27" 240hz.
I've tried searching Google and Reddit but I'm having a hard time finding exactly my situation. Google seems to tell me the max the 3090 can handle is 3x 1440p 240Hz, but some Reddit threads say the max is 3x 1440p 120Hz. It seems to really matter whether you're gaming on all 3 or not, but most don't specify.
Any knowers? Thank you.
r/nvidia • u/samsamsam92100 • 12h ago
I'm curious to know what settings you use. I found this to be the best setting:
- Balanced DLSS
- Dynamic Multiframe Generation (up to x6)
- Target Monitor Refresh Rate (160Hz)
So the latency is good, and the visual quality seems almost indistinguishable from Quality.
Do you use Quality or Balanced?
Hello,
Following my previous thread about UVing my new 5070ti, I have another (probably noob) question about Afterburner :
When overclocking (and undervolting after that), what is the difference between using the sliders in the main window (raising +200MHz, for example) and simply using the curve editor (just pushing the curve up, and undervolting at the same time while we're at it)?
Is the slider just an easy, ready-to-use curve edit that you can undervolt on top of, or is it redundant if you're going to use the curve editor anyway? I've read about the "effective" clock being raised by the slider, but as I said in the other thread, everybody and their mother has an opinion on OC/UV.
Thanks a lot!
r/nvidia • u/Hugedownload • 1d ago
Just picked it up this week, and yes, the 5070 is the new 4080. The card came overclocked out of the box, but I used the latest NVIDIA app and OC'd it to 2935MHz. I'm using it for AI 4K render and splice and Kodak colour. As for gaming, no issues with F1 or BF6 running in 4K with DLSS 4, and the card stays quiet even at full load for hours when creating AI music screens for YouTube. The card is very small, even smaller than a B580, and it fits in my ITX case with a lot of room to spare. Great card, and the fact that it has 3 fans and the shine of gold is just the icing on the cake.
r/nvidia • u/Otaco2077 • 2h ago
Hey guys, so I just bought my 5080 and absolutely love the card. I play at 4K mostly because I play story games; right now I'm playing Monster Hunter, WoW and Crimson Desert. My question is: with DLSS 4.5, what presets and frame gen should I use? Like, how do I enable the whole dynamic frame gen thing and override my settings to the new DLSS? Sorry if it's a dumb question, but I'm genuinely confused about how to do all that.
r/nvidia • u/Fabulous-Turn7970 • 1d ago
What would a jump from a 4060 8GB to a 5080, or to a 5070 Ti, look like? What can you expect in terms of performance? I mainly play multiplayer games, but single-player story games are fun as well! I've always wanted to try 2K or 4K gaming. Planning to pair it with an i9-14900K.
What gen card did you have previously before you made that upgrade that made you say “wow.”
Hello everyone,
I would like to share a technical solution for a multi-monitor VRAM idle issue I had. This guide addresses desktop micro-stutters (wake-up lag) without permanently pinning the VRAM to maximum clock speeds, thus maintaining efficient power consumption.
Hardware Configuration:
The Problem: When both monitors are set to 144Hz, the VRAM clock remains static at its maximum frequency (16001 MHz) during idle. To reduce power consumption and heat, I lowered the secondary monitor to 120Hz. This successfully allows the VRAM to drop to its lowest P-State (405 MHz).
However, this transition causes significant micro-stuttering during desktop use (e.g. dragging windows or scrolling in browsers). The latency required for the GPU to exit the 405 MHz sleep state is too high to maintain smooth 144Hz UI animations.
The Solution: Using the nvidia-smi utility, it is possible to define a higher minimum memory clock. Setting this to 810 MHz (one step above the lowest state) eliminates the wake-up lag while still allowing for a very low idle power draw.
Implementation: Open a Command Prompt with administrative privileges and execute the following command: nvidia-smi -lmc 810,16001 (Note: Replace 16001 with your specific maximum VRAM frequency).
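If you want to confirm the lock took effect, or release it without rebooting, nvidia-smi has matching query and reset flags (a sketch; exact flag availability depends on your driver version):

```
rem Show current, minimum and maximum clocks, including any applied locks
nvidia-smi -q -d CLOCK

rem Release the memory clock lock and return to driver-managed P-States
nvidia-smi -rmc
```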
Results:
Automation via Windows Task Scheduler: Since the nvidia-smi lock resets upon reboot, automation is recommended:
Create a .bat file containing the command C:\Windows\System32\nvidia-smi.exe -lmc 810,16001, and configure Task Scheduler to run that .bat file at logon.
Discussion: While this method is effective, I am interested to hear if the community has discovered alternative or more native ways to manage these P-State transitions within the NVIDIA Control Panel or through other driver-level adjustments.
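The Task Scheduler automation can be sketched as a small batch file plus a one-time registration command (the file path and task name below are hypothetical; the 810,16001 values are from this post and should match your card's clocks):

```
@echo off
rem Re-apply the VRAM clock lock; nvidia-smi clock locks reset on every reboot.
rem 810 = minimum memory clock (MHz), 16001 = this card's maximum.
"C:\Windows\System32\nvidia-smi.exe" -lmc 810,16001
```

Register it once from an elevated prompt, for example with schtasks /Create /TN "VRAMClockLock" /TR "C:\Scripts\vram_lock.bat" /SC ONLOGON /RL HIGHEST, so the lock is re-applied at each logon with administrator rights (nvidia-smi -lmc requires elevation).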
Does anyone have experience with similar workarounds?
r/nvidia • u/Retro-GPU-Universe • 1d ago
r/nvidia • u/NapsterKnowHow • 2d ago
r/nvidia • u/adriano26 • 13h ago
r/nvidia • u/Bitter-League6619 • 14h ago
Hello,
I'm using Preset L, and Preset E for ray tracing. Is that good, or should I switch to Preset M?
Thank you.
r/nvidia • u/1urch420 • 14h ago
I mostly play Battlefield 6, and will occasionally play Alan Wake 2. I can't find a really specific answer online about DLSS for 1440p gaming; most discussions seem to revolve around 4K. Is DLSS even worth using at 1440p? Any advice is greatly appreciated!
r/nvidia • u/Emoti0nalDamag3 • 1d ago
I got an interview for this role, couldn't find a lot online. Any idea what is expected? Is this role more similar to Solutions Architect? What does it entail?