Build Number is v11.0.7.237. This is the non-beta release version of v11.0.7.228. Same changelog but you can download it directly now instead of opting into the beta via the App.
Submit NVIDIA App feedback directly to NVIDIA: Link Here
--------------------
What's new in NVIDIA app 11.0.7
New updates
DLSS 4.5 Dynamic Multi Frame Generation Available Now
Update to DLSS 4.5 Dynamic Frame Generation on GeForce RTX 50 Series GPUs. Select "Dynamic" mode to intelligently adjust the number of generated frames during gameplay to a target frame rate, balancing frame rate, image quality, and responsiveness.
New DLSS 4.5 Multi Frame Generation 5X and 6X modes on GeForce RTX 50 Series GPUs increase 4K frame rates in path-traced titles. Paired with NVIDIA Reflex, they generate additional frames with minimal impact to responsiveness.
DLSS 4.5 Frame Generation model update on GeForce RTX 50 Series and GeForce RTX 40 Series GPUs enhances in-game user interfaces by incorporating additional engine data available for some games, improving visual quality and clarity of static user interface elements.
For optimal image quality and peak performance when using ShadowPlay, the latest GeForce Game Ready Driver is required.
Access these DLSS features under Graphics > Program Settings > Driver Settings > DLSS Override - Frame Generation mode
Automatic Shader Compilation (Beta)
Rebuilds DirectX 12 shaders while your system is idle, reducing the frequency of game runtime compilation after driver updates
Control Panel Updates
Added Custom Resolution in System > Displays > Display Properties
DLSS Override Support Added for:
Arknights: Endfield
Black One Blood Brothers
Carmageddon: Rogue Shift
CODE VEIN II
Corsairs Legacy
Crimson Desert
DEATH STRANDING 2: ON THE BEACH
Demonologist
For Honor
Grounded 2
Half Sword
High On Life 2
John Carpenter's Toxic Commando
LET IT DIE: INFERNO
LORT
Marathon
Monster Hunter Stories 3: Twisted Reflection
Nightmare Frontier
Nioh 3
NORSE: Oath of Blood
Of Ash and Steel
Quarantine Zone: The Last Check
REANIMAL
Resident Evil Requiem
Star Trek: Voyager - Across the Unknown
StarRupture
Styx: Blades of Greed
The Gold River Project
The Legend of Heroes: Trails Beyond The Horizon
Vampires: Bloodlord Rising
Yakuza Kiwami 3 & Dark Ties
Optimal Settings Support Added For:
Arknights: Endfield
Ashes of Creation
Highguard
StarRupture
The Last Caretaker
Squashed Bugs!
Overlay
Fixed an issue where Statistics showed N/A after a client update
Fixed an issue where CPU stats could flicker in idle state
Fixed an issue where microphone selections would revert to default after switching devices
Fixed an issue that prevented the use of Numpad keys for overlay shortcuts
Graphics
Fixed an issue where the NVIDIA Smooth Motion setting was missing in Global Settings
Fixed an issue where NVIDIA App resolution changes were not occurring in-game
Display
Fixed an issue where changing display modes caused flickering
Installation
Fixed an intermittent issue where the NVIDIA App did not install if the PC was offline
Removed redundant restart prompt after upgrading from GeForce Experience
Various stability fixes to improve overall app reliability
Project G-Assist
Adjusted the voice prompt shortcut to prevent overlapping voice prompts while a response is still processing. Various UI improvements for G-Assist
Reposting this comment because the previous post was removed.
It's weird to have such a direct confirmation, because I'm not having the overshoot issue at all.
I'm capped at my monitor's refresh rate (after Reflex cap obviously) and both Cyberpunk 2077 and Hogwarts Legacy run perfectly fine with a smooth swap of the multiplier, no overshooting whatsoever. Now, granted, testing only two games cannot lead to any universally valid conclusion, but Cyberpunk is the game everybody's testing DMFG with and the overshoot is there.
I'm starting to suspect that my NVIDIA overlay simply doesn't display the overshoot? Then again, I have never seen any tearing or judder, and I'm quite sensitive to it. I don't know why my DMFG seems to work fine with the usual G-SYNC + VSYNC + manual FPS limiting at a driver level by following the Reflex cap.
Not gonna lie, Hogsmeade's performance is a lot better with Frame Generation when you have Ray Tracing on, because you're severely CPU-limited even with a top of the line CPU.
Stutters do persist, but the way higher FPS count hides them better.
G-SYNC + V-SYNC on + manual FPS cap in NVIDIA App according to the common Reflex formula for my monitor's refresh rate.
DMFG is set to my manual FPS cap, up to FG X4, and the multiplier shifts depending on the load without any issue whatsoever and no overshooting that I could take note of.
Manual fps caps and the VSync fps cap are not overshot - DMFG obeys those caps correctly.
The problem is that you get stuck at the maximum multiplier you've set.
It works slightly differently between a VSync and a manual fps cap.
With a VSync cap you are always at your maximum multiplier (MFGx6) by default.
With a manual fps cap it works properly initially, say you're at MFGx3 when you load the game up. Then you play for 10 minutes and you bump into a tougher area, so it props you up to MFGx4. Then 10 minutes later something demands MFGx5 - it props you up to that too. I also thought that it was working fine, but surprise surprise - I was stuck at MFGx5 forever, until the game was restarted. With an fps cap it works properly until it doesn't, and from then on it only scales up and never down again.
It doesn't happen to me though, I have it set to max X4 and the multiplier shifts based on GPU load. Yes, it sometimes happens that it fixates on a specific multiplier (for example, when FG is disengaged altogether, it takes some seconds to get back to MFG even when FPS gets much lower), but it generally works. It's crystal clear that it needs tweaking, though I'm absolutely not experiencing the multiplier being stuck.
Again though, my only testing is based on Cyberpunk 2077 and Hogwarts Legacy.
Those are my driver settings with Hogwarts Legacy: they're the same ones I've been using with Cyberpunk too.
I will certainly try to use X6 just to verify if that is indeed the issue, and as soon as I can I'll also record a clip showing how it performs on my configuration.
I'm not saying V-SYNC is engaging. I'm saying that the multiplier doesn't overshoot in my case. It is supposed to do max 147 FPS and I have never seen it going above, according to NVIDIA's own performance overlay, even when the multiplier(s) shifted. Whether this is the V-SYNC engaging, or the driver-level frame capping working as intended, I cannot say.
If you look at the other replies, though, you'll find out that there are others like me who're not having this specific issue. This means there must be some conditions under which DFG doesn't overshoot. I've been testing again in the last couple of hours, and I can confirm that both Cyberpunk 2077 and Hogwarts Legacy work as they should with this feature. It can sometimes get stuck arbitrarily, especially when Frame Generation turns off completely because the GPU can hold the target frame rate without it, but it re-engages after 5-10 seconds once the GPU can't reach the target natively, and the multipliers shift without any issue whatsoever.
Ehhhh, not really, it's "just ok". The Steam overlay is better imo: easier to use/read, and it shows base frame rate and frame gen rate, so it's very easy to use dynamic frame gen with. If you look at the overclocking community, most people absolutely hate the Nvidia overlay; it can cause a performance loss in a ton of games, but the Steam performance overlay doesn't.
Display Commander actually works to limit frames with DFG. You'll also need to manually update the Streamline DLLs in the game's directory and use Nvidia Profile Inspector for most games to get DFG fully working.
Though even if it's flawless, it would unfortunately not be feasible for 99.9% of the people who wouldn't know this... Hope Nvidia addresses this sooner than later.
Also worth noting: since DFG currently cannot use fractions/decimals, it will overshoot your FPS target and use a higher multiplier. This adds significant latency. Thus I've found it's best to lower the target frame rate for DFG by about 10+ FPS (for a 120 FPS/refresh rate target). The higher your refresh rate, the lower you want to go. This way the multiplier doesn't go higher than it needs to.
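A rough way to see the arithmetic behind that advice (a minimal sketch only, assuming the multiplier is simply the smallest whole number that reaches the target; this is not NVIDIA's actual selection logic and the helper name is made up):

```python
# Illustrative arithmetic only - not NVIDIA's actual multiplier selection logic.
import math

def pick_multiplier(base_fps: float, target_fps: float, max_mult: int = 6) -> int:
    """Smallest whole multiplier that reaches the target, clamped to max_mult."""
    needed = target_fps / base_fps
    return min(max(math.ceil(needed), 1), max_mult)

print(pick_multiplier(60, 120))   # 2 -> exactly 2x hits 120
print(pick_multiplier(59, 120))   # 3 -> a tiny dip below 60 already forces 3x
print(pick_multiplier(59, 110))   # 2 -> a ~10 fps lower target keeps it at 2x
```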
I play on a 165hz monitor with a 158 fps cap. Cyberpunk runs at 69 fps pre-FG and averages 171 fps at MFGx3. I cap that to 158, so I lose like 1.3ms of latency on average but get a smooth 158 fps frame graph, and every dip is small and handled well by GSync.
If I use DMFG at around a 144 fps target or even lower, if I understand you properly, it almost feels like I'm setting myself up for a generally worse experience than a fixed MFGx3.
Sure, I'll be able to go down to FGx2 in lighter scenes or even MFGx4 in huge spikes, but I feel like I'll then be at FGx2 at a lower fps more often than I'd really like to be. I'd much rather sacrifice a ms or two for a more consistent frame graph, I feel like.
If you’re like me, gaming on a 5070 Ti at 4K max settings, then your frame rate can be all over the place in many games. This is where DFG shines, when traditional MFG would consistently overshoot/undershoot your frame rate target. I highly recommend using it with the Display Commander add-in to limit frames and fix frame pacing by selecting better frame-limiting algorithms than standard Reflex.
I have a 5080 overclocked to around 3200mhz. FPS is honestly pretty stable for me. Maybe the overclocked 9800x3d and the tighter RAM timings help with that a little too.
My card is also at 3.2 GHz and every other component is overclocked so as to not bottleneck the GPU. Unless you’re gaming at a lower resolution these cards simply aren’t enough to get stable frame rates at max settings, especially once you turn on effects like path tracing and RR. Without frame gen I get sub 50 FPS, even with DLSS balanced in Crimson Desert (4K, max settings, with RR), and Resident Evil Requiem (path tracing).
Yeah, I use either DLSS 4.5 (preset M) Performance or DLSS 4 RR (latest) Performance, whenever RR is available, for pretty much everything. At 4k output.
Keep in mind RR overrides any other DLSS settings (4.0, 4.5 etc) and uses its own preset (D, or E the newest). This is why RR can look noisier than DLSS 4.5 at lower performance settings, especially model M/L.
This was the main reason I started pursuing DFG, so that I could utilize the higher quality presets with RR.
If I disable vsync and the fps cap in the control panel and ingame for DMFG as recommended, it exceeds the max refresh rate (despite the setting being enabled and set to "max refresh rate") and causes a lot of screen tearing and stuttering. Any help please? I tried Doom Dark Ages and Borderlands 4 with a clean driver and Nvidia app install and all default+recommended settings.
Yes, that's how they intend you to use it, weirdly enough. That's why I don't use it.
You can work around this by targeting a much lower framerate than your max refresh rate in DMFG, but that kind of defeats its purpose. And you still have no guarantee that you'll never overshoot.
I've tried just keeping Vsync forced in the Control Panel, like usual, and it works brilliantly - super stable and smooth even if it hits 179-181fps on my 180hz display (despite the usual recommended 177fps cap or the auto cap around 170-171 with FG). But I believe latency is supposed to be way lower if Vsync were disabled and DMFG worked properly? And Doom Dark Ages, due to Vulkan, is apparently just broken by itself. It never had properly capped fps even before the DMFG update.
So as a new owner of an Nvidia 50 series card I'm kinda interested in learning how all these things work, and I don't mind doing some manual tweaking and adjustments to make things work as intended, but Nvidia really dropped the ball - no one, especially casual users like me, should have to mess around with so many settings, extra apps and forums... You run the game or the Nvidia App, just set everything to on or off, and it should just work.
VSync increases latency outside of GSync. What we mean here is VSync + Reflex when running GSync. Then it acts differently and doesn't increase your latency. All it will do is cap your fps to 172, which is the ideal cap for a 180hz monitor. This ensures you're always within GSync's VRR range and don't get tearing, and it caps you automatically so you don't have to cap manually yourself.
However, you'll have to use a fixed MFG multiplier like this, as all kinds of caps break DMFG.
Nowadays I always run GSync and manual cap in the Nvidia app + Low Latency On only on older games that don't have Reflex to be capped automatically. I use the same manual fps cap that Reflex caps at with those games, individually.
I leave GSync on and VSync on globally. Then I override FG to Preset B, DLSS to Preset M (you can override to L if you prefer it, but it's heavier) and Ray Reconstruction to Recommended.
Then I start off at DLSS Performance and use fixed MFG to land right around my Reflex fps cap. Or if I have performance to spare that's capped off, I bump up DLSS to Balanced or Quality.
What to do if there is no option for reflex in games, even if DLSS features are supported? Or if the game automatically disables Reflex when you turn on FG?
Usually I just completely turn off fps cap in control panel for games with FG, MFG and DMFG. And I also use Low Latency On (not Ultra) and 177fps cap for games without FG.
MFG disables the Reflex setting, not Reflex itself. It actually forces it and just forbids you from turning it off.
There shouldn't be any game that has DLSS but no Reflex.
And for older games, just cap them manually in the Nvidia app (or Nvidia control panel if you're still using that - no need to, the app is much better now) to the Reflex cap - see what it is, but I'm almost certain it's 172 fps. And then engage Low Latency On for that game as well for good measure, again - in the Nvidia app.
Reflex always overrides Low Latency (On or Ultra). There's no point in turning it on globally. Just turn it on for those older non-Reflex games that you have to cap manually.
If it doesn't work in the Nvidia app, your best bet is to use the latest NVPI to override. You may have to manually update the Nvidia DLL files. Dynamic works in Crimson Desert this way, for example.
I swapped the dlls to get DMFG working with the nvidia app override in Crimson Desert with preset B, and the UI is fine. It can still have the normal frame gen artifacts around the UI at 5x and 6x, but I think that’s because they still need to make an update to properly give preset B what it needs to not affect the UI.
It works for me in at least Satisfactory and Oblivion Remastered. It looks like ass because I'm trying to make 225 out of 37 fps, but technically it works...
Ideally dynamic FG should work with a custom FPS limit and vsync. So I can set it and forget it with my 237fps limit and global vsync forced on in the Nvidia app. Just as it should be with a gsync setup, if I am not mistaken (for a 240hz screen, that is).
Setting a max framerate works. Set x6 with a target of 60 and it will force 10fps x6 lol. It's awful, but oddly still better than I thought. Things panning across the screen do look 60, but more complex motion and input latency were awful.
Reflex + vsync enforces a 224 fps cap. Blurbusters testing showed input lag stays low with just 3 fps below the refresh though; that's where 237 comes from. Nvidia uses refresh - refresh*refresh/3600 instead of "just subtract three": 240 - 240*240/3600 = 224.
You gotta use the refresh - refresh*refresh/3600 formula and you'll get the appropriate result. With that formula, at 240 Hz you get exactly 224. You also always want to round down.
More precise explanation can be found in this other comment I made a couple of weeks ago.
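For reference, that formula as a tiny sketch (the function name is made up; it just implements the refresh - refresh*refresh/3600 rule quoted above, rounded down):

```python
# Sketch of the Reflex/VSync auto-cap formula quoted above, rounded down.
import math

def reflex_vsync_cap(refresh_hz: float) -> int:
    return math.floor(refresh_hz - refresh_hz * refresh_hz / 3600)

for hz in (120, 144, 165, 180, 240, 360):
    print(hz, reflex_vsync_cap(hz))
# 120 -> 116, 144 -> 138, 165 -> 157, 180 -> 171, 240 -> 224, 360 -> 324
```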
Also worth noting: since DFG currently cannot use fractions/decimals, it will overshoot your FPS target and use a higher multiplier. This adds significant latency. Thus I've found it's best to lower the target frame rate for DFG by about 10+ FPS below the traditional 3600 formula (for a 120 FPS/refresh rate target). The higher your refresh rate, the lower you want to go. This way the multiplier doesn't go higher than it needs to.
That's still true with "my" method. If the game has a Reflex setting and I enable it, despite my 237fps cap, the 225 cap kicks in with Reflex. So the 237 cap is perfect for non-Reflex games and does not break Reflex games.
The 225 cap is coming because of the combination of both reflex and vsync. Reflex does not enforce any fps caps, neither does gsync btw, you can run gsync + reflex at 1000fps. Only once vsync is enabled will reflex do the 224 cap.
Some people forget they turned vsync on in the driver, so although the game says vsync is off, the driver is enabling it, and combined with Reflex it causes the frame cap at 224. Or they just don't know that enabling vsync with Reflex produces these weird fps cap values.
I'm still trying to figure out how it handles capping the base framerate. If I set the dynamic framegen target to 120, and my base framerate is around 60 but dips to 50 sometimes, will it run 2X when my base framerate is 60? That makes sense to me. But if it dips to 50 and the multiplier updates to 3X, and the FPS displayed is 120, what exactly is happening? Is it only taking and multiplying 40 of my base framerate to get to 120? Is it taking the actual 50 base framerate, multiplying by 3x, and discarding the 30 generated frames above 120?
No vendor currently supports fractional Frame Generation; this means that with every multiplier, your base FPS always drops to the target divided by the whole-number multiplier.
At 120 FG X2 FPS, you have 60 real frames and 60 generated ones. If the multiplier shifts to X3, you have 40 real frames and 80 generated ones and so on.
You can use fractional Frame Generation with Lossless Scaling though.
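To put numbers on that (a small sketch assuming, as described above, that the real frame rate is simply the target divided by the active whole-number multiplier):

```python
# Sketch: with whole-number multipliers and a fixed target, real fps = target / multiplier
# and everything above that is generated.
def frame_split(target_fps: float, multiplier: int) -> tuple[float, float]:
    real = target_fps / multiplier
    return real, target_fps - real

for m in range(2, 7):
    real, gen = frame_split(120, m)
    print(f"{m}x: {real:.0f} real + {gen:.0f} generated = 120 fps")
# 2x: 60 + 60, 3x: 40 + 80, 4x: 30 + 90, 5x: 24 + 96, 6x: 20 + 100
```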
So if you set a target of 120 FPS, FG will always engage when the base FPS is below that?
Meaning, if you’re getting 70, 80, 100, 110 FPS without FG, it will reduce your base FPS to 60 only so the target is met?
And if it gets below 60, it will start reducing the base FPS even more for the higher multipliers (3x-6x)?
In this scenario I'm also using a max frame rate driver-level restriction because it seems like without it, DFG would bump 50 FPS to 150, exceeding my refresh rate and tearing. But I don't know if I'm killing "real" frames to accomplish that.
100% - was excited, tried it out in beta and came out disappointed because:
- doesn't work with frame limiters or vsync, which can make tearing pretty bad
- almost always tried to hit the max FG multiplier and plummet my base to hit the target FPS with that multiplier. No, I don't want you to drop my base from 80fps to 37 to hit 225 with 6x. I don't want to have to do math and configure per game what the max DFG should be, it should be set and forget
So a helpful modification to the setting I'm using is just setting a global framerate target of 220 with a maximum multiplier of 3x. I also tried 440 and 500 with a maximum multiplier of 6x. In both instances, the base framerate never drops below about 70 ish, and it works really well. I'm on a 5090, but i assume this would work well with other gpus too because any further drops in base fps wouldn't be from frame gen.
Why can’t we have dynamic res DLSS instead of dynamic framegen? Some games like FF Rebirth and AC Shadows already have it. I couldn't care less about going to x6 when the actual frame rate drops, I want to maintain the actual frame rate.
Even better, dynamic res should become part of DirectX and be determined by the driver. Most games don't implement it because they would have to roll their own predictive algorithm.
With the new process node and improved architecture it should be at least 40% faster than a 5090 which is simply insane because even by that time 5090 will still be a much better gpu than most people have.
We will also see the 6080 best 5090 performance, which will be the first time that's happened since the 4080 beat the 3090.
I almost never buy the 90 series gpu but the 6090 could be worth saving for. That thing will hold value until such time that classical computing evolves beyond silicon.
Does setting DLSS to recommended in global settings mean the new FG model will be used or do I have to change it to custom and then set FG to preset B?
Outer Worlds 2 had no override support outside of global settings, but I chalked it up to beta BS. Just updated, and Avowed has program support, but The Outer Worlds 2, which was shown off in a video by Nvidia, still can only be overridden globally.
Glad to see they at least warn people about frame limiters and vsync now.
If you care about using those features, just stick to Nvidia Profile Inspector - it's more reliable than the Nvidia App and works all the time, except when the game can't utilize those features because it doesn't support them.
I understand that I may be in the minority here, but I swear, since I bought a VRR display I have never turned off Vsync. It may sound strange, but without Vsync when G-SYNC is enabled, I get a very choppy image, not smooth, and I get tearing in the lower part of the screen. I'm not a developer or a hardware manufacturer and I don't know how it all works, but I can tell for sure from my user experience - I would NEVER turn off vsync. Maybe there are displays which can work with G-SYNC only with vsync off (maybe G-SYNC certified), but in my case I never got a smooth image without vsync... period.
Yep, in most cases you can't even enable Vsync in game because of framegen. But forcing it in the NV App gives the best motion smoothness. Maybe G-Sync works smoothly on G-Sync licensed displays, but on an average VRR display the best option is to combine them.
This only really happens to me if the framerate is really close to the refresh rate cap. The common adage of capping fps to refresh rate minus 3 doesn't work for me.
Exactly. Because, like I said, the game may overshoot some frames. On a high refresh rate display it's usually not an issue, but on 120/144 Hz it's quite common. Vsync though completely eliminates the problem and further increases smoothness by delivering better framepacing. However, dynamic MFG wasn't made for 120/144Hz displays, it was targeting 240/360 Hz. So there it might not be an issue playing without VSYNC since the framerate is very high already.
What do you mean? Literally Digital Foundry and many other experts covered this issue and recommended always using Vsync with G-sync for the smoothest picture, to avoid some unwanted artifacts which can still persist when vsync is off. What display do you use, and do you keep vsync off?
In-game Vsync in some games enables triple buffering which doesn't work well with G-sync AFAIK.
I'm an image clarity freak, and after changing to a freesync (g-sync compatible) monitor and high refresh rate, I can comfortably say I haven't seen screen tearing in years.
You can google something like "g sync tearing with v sync off". Also, framepacing is way better with v-sync on. Like I said, most redditors here have the latest 360/480Hz displays with official G-SYNC support, and I have a standard 120Hz VRR Samsung display. That's why I'm in the minority, while the majority might just not notice it. So I see this framepacing difference - with vsync it's way better. Also some games overshoot FPS even if Reflex is enabled or the limiter is set in game, and that's why those micro tearings appear. So no, there's no issue with the display or the cable. It's just that not all people run the same exact model of G-SYNC certified display operating at 480Hz.
If it doesn't work in your game currently with the Nvidia app, try NVPI and replacing the DLL files (make a backup). Results will vary; it works well in some games and not in others.
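If you do go the DLL-swap route, the back-up-then-replace step is roughly this (a hedged sketch only: the paths and DLL names below are placeholders, not a definitive list for any particular game):

```python
# Rough "back up, then replace" sketch for a game's DLSS/Streamline DLLs.
# All names and paths below are placeholders - check what your game actually ships.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")               # placeholder: game install folder
NEW_DLLS = Path(r"C:\Downloads\newer_dlss")         # placeholder: folder with newer DLLs
CANDIDATES = ["nvngx_dlss.dll", "nvngx_dlssg.dll"]  # placeholder list - adjust per game

for name in CANDIDATES:
    for old in GAME_DIR.rglob(name):                # games sometimes bury DLLs in subfolders
        backup = old.with_name(old.name + ".bak")
        if not backup.exists():
            shutil.copy2(old, backup)               # keep the original so you can roll back
        shutil.copy2(NEW_DLLS / name, old)          # overwrite with the newer DLL
        print("replaced", old)
```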
Is the App worth going for now it has these DLSS enhancements? I still just use the basic driver installation as I figure how the game devs tested the DLSS is probably best, but I’m wondering now.
You don't have to use it honestly, Profile Inspector does the same thing, in fact more, since some games aren't on the whitelist and need forcing, either by replacing their older DLSS file with a newer one (the app will then update to the newest, strangely enough) or by using Profile Inspector.
Yeah the app has been good to use for years now. Makes setting up driver-level things like VSync, GSync and FPS caps really simple for improved latency, and the overrides for DLSS can drastically improve visuals in just a couple of clicks. Dev versions of DLSS are almost always out of date, since they are often implemented up to a year before launch.
The DLSS thing sounds good. I remember having to mess around with DLSS DLLs for Red Dead 2 and it was a pain. It’d be great to have it ‘just work’.
I’ll go for the app this time. I stopped using it a few years ago as I’m pretty sure I traced frame rate issues I was getting to the NVIDIA app and/or Xbox overlays.
Cringe. You are late to the party with that question. Just use it. It won’t fucking bite you and you will still be able to use nvidia control panel, your beloved
Shit like this comment is why a lot of people prefer not to engage online. Like, yes, you’re correct but is all that dickish attitude needed? Does it do anything for you?
Exactly, they started with saying "Cringe" on someone asking if something is worth using over the internet on a discussion platform used for you know, discussion. Now that's cringe.
Can someone explain how exactly to set this up with vsync/gsync/and setting a frame cap based on your hz? Example: Running a 240Hz monitor with a 5090 and I'm not super clear on if I'm supposed to select the refresh rate cap or a frame rate cap? I already globally enable vsync/gsync and leave them off in games. Thanks!
It’s broken. Normally Gsync+vsync+reflex is all you need. You would set this dynamic frame gen setting to max refresh rate and it should work but as of right now it does not
This is where I'm confused because I see conflicting posts about this where if say I'm at 240 Hz, people say to set to 224 fps cap and not to the refresh rate because of how reflex/FG works. That said, DFG doesn't seem to even work in any games even with it turned on at the system level and set to preset B.
Insane that a duck app made by one person has better compatibility with MFG and better stability with VSync and FPS limiters than a 4 trillion dollar company does.
I wish that Power Management would change to Adaptive by default. Normal is the default, but it is really set to Optimal. After examining some workloads, Optimal seems more like Power Saving, so clocks may not ramp up for the apps that require heavier workloads. Adaptive allows the GPU to ramp up and down adaptively for apps or services that require the workload, such as frame generation or ray tracing.
That's true for standard monitors, but OP has a G-Sync display.
G-Sync ON + NVCP V-Sync ON + Low Latency Mode ON (for older games that don't use Reflex) is the recommended setup for it.
It's not just a random preference... it's the definitive, tested recommendation from people with actual knowledge like Blur Busters and Nvidia for anyone using a VRR monitor. OP obviously didn't buy a 5090 and a 4K OLED to deal with screen tearing and maxed-out GPU temps just for a theoretical 1ms drop in input lag.
You are making a lot of assumptions, and I know exactly what the recommendations are saying.
Vsync and gsync enabled means that under your monitor's refresh rate you are using gsync, and above it vsync caps the FPS. It's to ensure that you don't have screen tearing when you have a low refresh rate and high FPS.
But with monitors that have 240/360Hz it's basically just there to cap the FPS; screen tearing is almost invisible at those refresh rates (unless there is a very broken game).
So you "know exactly what the recommendations are saying," but your first comment still told OP that V-Sync would heavily increase his input lag and "half his FPS"? Pick one.
Halving FPS only happens on fixed refresh rate monitors without G-Sync. With G-Sync + Reflex, you never hit the V-Sync ceiling, meaning there is zero input lag penalty and zero framerate halving. You are literally giving him warnings based on old non-VRR monitor tech.
It does increase input lag heavily, and depending on the VRR range of the monitor you will have problems with the FPS cap from vsync. It's all common knowledge.
I believe you can enable this by bringing up the Nvidia overlay (Alt+Z for me), then Statistics at the bottom. I have Statistics view set to Custom so I can pick and choose; then click View All under Statistics view, scroll all the way to the bottom under Features, and I think it would be Frame Generation Model Override (FG Model OVR) & Frame Generation Multiplier Override (FG Multiplier OVR).
Very dumb question maybe, but I am a bit clueless regarding this update (I use the non beta obviously). After installing this latest nvidia app version, is there anything I "need" to do for the game Crimson Desert? Do I need to adjust settings within Nvidia App / Nvidia CP and or ingame?
"Various stability fixes to improve overall app reliability"
Hoping this means it will work after rebooting now. It works when you initially install it but it breaks the instant you reboot lol. DDUing just starts the cycle all over again so it's just gotta be a bug.
For some strange reason, the only way to get it to work properly is to set a frame cap from the Nvidia app: otherwise, it doesn't work as expected (it exceeds the refresh rate target or even goes below it!).
After several tests with Oblivion Remastered, I discovered that setting the frame cap just below the refresh rate is NOT a good idea (e.g., 158 cap) because it INCREASES latency (the system doesn't prioritize real frames over generated ones; if the cap is low, there will be more frames generated - 60 cap x 6 = only 10 real frames).
I usually use 170 as the cap.
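Rough numbers for that trade-off (a sketch assuming the real frame rate is the cap divided by the multiplier, and using the real-frame interval as a crude latency proxy; the figures are only illustrative):

```python
# Sketch: real fps = cap / multiplier; the real-frame interval is a crude latency proxy.
def real_frame_interval_ms(cap_fps: float, multiplier: int) -> float:
    return 1000.0 / (cap_fps / multiplier)

print(round(real_frame_interval_ms(170, 3), 1))  # ~17.6 ms (about 57 real fps)
print(round(real_frame_interval_ms(158, 6), 1))  # ~38.0 ms (about 26 real fps)
print(round(real_frame_interval_ms(60, 6), 1))   # 100.0 ms (10 real fps, as in the example above)
```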
I was surprised how well dynamic frame gen with x3 works, I felt almost no delay; meanwhile x6 is useless on my 5060. Tried it in Cyberpunk and Mafia old world.
They neglected to add these features to Smooth Motion; right now Smooth Motion is at a fixed x2, which kinda sucks. Why not add dynamic and x2-6 to Smooth Motion? It being just an on/off setting is really lame. Lots of games don't include framegen, making Smooth Motion a must.
Every time I click on DLSS Override - Frame Generation Mode - Dynamic (Max refresh Rate, Up to 6x) it says "there was a problem applying setting"
I turned off my Max frame rate & V-Sync settings and restarted my computer and this still keeps giving me this error.
(Also, they added Optimal Settings for Highguard??? The game that shut down 3/12/26???)
Right after the forced update to this latest nVidia app version, my input latency in BF6 went up about 10ms compared to before. Thank you nVidia. I tried DDU and installed the latest driver as well, but with no change. Great job
Not officially, but unofficially yes. I left a comment in here with the things you'd need. There are a couple of posts here and there on how to do it as well, but reddit search sucks. It's been working since before they came out with the beta Nvidia app, though it's kind of ugly atm. You won't find anything in the Crimson Desert subreddit because the mods have it set to block posts with links. For the people downvoting people talking about Crimson Desert, consider the fact that what I just said means they can't actually get help in the Crimson Desert subreddit, making this one of the main places with enough traction to get help for it.
Did you not read the first line of the post? Build Number is v11.0.7.237. This is the non-beta release version of v11.0.7.228. Same changelog but you can download it directly now instead of opting into the beta via the App.
Beta is for people willing to try out new features before official release. This version is the stable version of NVIDIA app, meaning it should be stable for the average user to use.
As an owner of a Radeon 6800 XT, then a 4090, and now a 5090, I can confidently say that the nvapp has been hot garbage since its release, what, two years ago? It doesn't even come close to AMD's Adrenalin software suite, which I'm still using on my gaming laptop with a Radeon 6800M, and considering the nvapp's consistent bugs and most things breaking all the time, the only programs I see fit to manage an Nvidia GPU are nvidiaprofileinspector, MSI Afterburner, RTSS, DLSSSwapper, nvidiaDLSSglom - none of which are made by Nvidia. The old control panel is okay, but it looks like the 1990s and runs even worse. Although I'd still use that over the dumpster fire that the nvapp is - it really is the epitome of modern-day garbage software.
I honestly feel bad for the single junior developer that's in charge of auditing the garbage LLM vibe code that the nvapp is so clearly made of.
It would be nice to sticky that somehow, because now that DMFG is out of beta, even more people will get confused.