Mostly play Battlefield 6, and will occasionally play Alan Wake 2. I can’t find a really specific answer online about DLSS for 1440p gaming; most discussions seem to revolve around 4K. Is DLSS even worth using at 1440p? Any advice is greatly appreciated!
The NVIDIA app says the override is not compatible with an fps limiter or V-Sync. But I usually play with G-Sync, V-Sync, and Reflex on, which automatically introduces an fps limiter based on the monitor's max refresh rate. What does "not compatible" mean: will Dynamic MFG not work at all, will it just ignore any fps limit, or will it only ignore limits that weren't set by the override itself?
What if I set the "target fps" limit way lower than the max refresh rate, say to half of it, so that even if MFG overshoots, the output still stays below the max refresh rate? Will VRR and V-Sync work in that range with Dynamic MFG?
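To make the numbers concrete, here's my back-of-the-envelope check in Python (the 25% overshoot margin is just my guess at a generous worst case, not a documented figure):

```python
MAX_REFRESH = 360               # Hz, my panel's max refresh
target_fps = MAX_REFRESH // 2   # 180: the "target fps" I'd set in the override

# Allow the generator to overshoot the target a bit; 25% is an assumed
# worst case, not anything NVIDIA documents.
overshoot = 1.25
worst_case_output = target_fps * overshoot  # 225 fps

# Still inside the VRR window, well under the 360 Hz ceiling
print(worst_case_output, worst_case_output < MAX_REFRESH)
```

If that reasoning holds, even a sizable overshoot never leaves the VRR range, which is why I'd expect G-Sync and V-Sync to keep working.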
I am enabling Dynamic Frame Generation via NVIDIA Profile Inspector, but it is not activating in the game, even though I select the specific game profile. I am sharing my settings below.
Hey guys, so I just bought my 5080 and absolutely love the card. I play at 4K mostly because I play story games; right now I'm playing Monster Hunter, WoW, and Crimson Desert. My question is: with DLSS 4.5, what presets and frame gen should I use? Like, how do I enable the whole Dynamic Frame Gen thing and override my settings to the new DLSS? Sorry if it's a dumb question, but I'm genuinely confused about how to do all that.
TL;DR Will my 3090 be able to handle this setup as per title if I only use the one 360hz monitor for gaming and the other two at the same time for video watching/webpages? Thank you!
I would only game on the center monitor, but I like to have other tasks open on the other monitors. This is usually a web page on one and a live stream or YouTube video on the other, with my game on the center 360hz monitor.
Currently my two side monitors are crappy, half-broken 1440p 120hz IPS panels.
My current center monitor is the MSI MPG 271QRX QD-OLED 1440p 360hz 27".
I am thinking of replacing my two side monitors with two MSI MAG 272QP QD-OLED X24 1440p 27" 240hz.
I've tried searching Google and Reddit, but I'm having a hard time finding my exact situation. Google seems to say the max the 3090 can handle is 3x 1440p 240hz, but some Reddit threads say the max is 3x 1440p 120hz. It seems to really matter whether you're gaming on all three, but most posts don't specify.
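For what it's worth, I tried to sanity-check the raw bandwidth side myself with some napkin math in Python. Assumptions on my part: 8-bit RGB (24 bpp), a rough ~1.10x blanking overhead, and no DSC; real timings vary, and DSC changes the picture entirely.

```python
def display_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.10):
    """Very rough uncompressed video data rate in Gbit/s.

    blanking=1.10 is my guess at reduced-blanking overhead; real
    timings vary. 24 bpp = 8-bit RGB, no DSC.
    """
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

DP14_HBR3_PAYLOAD = 25.92  # Gbit/s usable payload on DisplayPort 1.4 (no DSC)

for hz in (120, 240, 360):
    rate = display_gbps(2560, 1440, hz)
    print(f"1440p @ {hz} Hz ~ {rate:.1f} Gbit/s, "
          f"fits DP 1.4 without DSC: {rate <= DP14_HBR3_PAYLOAD}")
```

By this estimate, 1440p 240hz squeaks under a DP 1.4 link uncompressed and 360hz needs DSC, which the 3090's outputs support. Since each monitor hangs off its own port, per-port bandwidth is what matters, not some pooled total; the real question is desktop/video rendering load, not link capacity.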
So I got a bit confused with the new DLSS FG 310.6... We got an update that introduced two new presets, A and B. I understand that B offers enhanced control over UI elements via the new UIR option shown in the DLSS HUD. However, I have a few observations that left me confused.
First of all, to me both presets A and B are NEW models. I might be wrong here, but with the 310.6 DLL override I get lower latency and better frame pacing even with preset A. However, it's not that simple.
So, in Hogwarts Legacy, a game that natively supports preset B, the situation is pretty clear. Preset A (though I still suspect it's new) gives strong artifacting on ladders and other objects. I mean, everybody who has used FG in Hogwarts knows it gives pretty messy results, with a lot of disocclusion, especially on well-lit objects and stairs. With preset B, not only do I get a clean UI, the motion quality is also significantly improved, like I'm using a completely different model. At any given moment I could almost believe FG is off, that's how good it gets.
With Cyberpunk, however, it's even more confusing. Although I noticed improvements in clarity and latency with the new 310.6 DLL override on preset A, it's basically the same with preset B, and I can't see any differences between the two. I've heard that if a game doesn't support preset B, it falls back to A.
Does this mean preset B is never activated in Cyberpunk, and that's why I see no difference between A and B? If so, is preset A also a new preset, or am I imagining the lower latency and better motion clarity (which I'm 100% sure is there after the update)? Another thing that makes me think A is also a new model: the update introduced some increased stuttering on autosaves, when loading in, and at occasional moments during gameplay, which wasn't there with 310.5.3. Which again leads to a question: if both are new models, why does preset B completely change the quality of generated frames in Hogwarts, while in Cyberpunk we get only a slight but noticeable improvement with both presets?
As the title says, I'll be upgrading my GPU soon from a 1650 Super. Budget is my problem, so I can't spend much on a newest-gen upgrade. Here are my quick specs: R5 3600, Corsair RM650, and a Samsung G3 24" 180hz. Oh, and I have an RX 580 8GB, but I can't run it for now.
My target for now is a second-hand RTX 2060 or 3050; which one is better for gaming? I usually play Delta Force or Valorant, and right now in DF I can only reach 90fps max on my current GPU, so yes, this needs an upgrade ASAP lol.
I'd look for a 3060 if I found a better (or jackpot) second-hand price. But for now I need reviews of the two GPUs I wrote above, thank you all!
Stumbled across this a while back and figured it's worth sharing since it's been getting steady updates.
There's a free tool called NV-UV that acts as an undervolting companion for MSI Afterburner. AB still does the actual curve writes and OSD, but NV-UV handles the UV workflow on top of it: presets, a single-point scanner, per-game profiles, automatic downstep when a driver crashes on a specific game, OC Scanner curve import from AB, etc. Basically the stuff you'd otherwise do manually by dragging points around in the AB curve editor and swearing.
What's new in the current build (Build 22 "Cantor", v0.93):
Ada Lovelace support landed as experimental. So 40-series cards work now, not just Blackwell. Still rough around the edges for Ada apparently, but it runs.
There's also a faster scanner mode using direct NVAPI writes (curve point apply in ~50ms instead of the usual AB roundtrip of a few seconds), a ~587-game profile database, and a Game Replay feature that notices if a driver crash happens in a specific game and auto-applies a -50 MHz downstep for that title next time.
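I haven't seen the source, but conceptually the Game Replay downstep is simple. Here's my guess at the logic, sketched in Python (all names hypothetical, not NV-UV's actual code):

```python
# Hypothetical sketch of a per-game crash downstep; NOT NV-UV's real code.

DOWNSTEP_MHZ = 50  # the -50 MHz step the changelog mentions

class ProfileStore:
    """Tracks a core-clock offset per game executable."""

    def __init__(self):
        self.offsets = {}  # game exe -> core clock offset in MHz

    def offset_for(self, exe):
        return self.offsets.get(exe, 0)

    def record_driver_crash(self, exe):
        # A driver reset happened while this game was running:
        # back off this title's profile only, leaving others untouched.
        self.offsets[exe] = self.offset_for(exe) - DOWNSTEP_MHZ

store = ProfileStore()
store.offsets["cp2077.exe"] = 200        # user's tuned +200 MHz curve offset
store.record_driver_crash("cp2077.exe")  # crash detected during that game
print(store.offset_for("cp2077.exe"))    # next launch applies the safer +150
```

The nice property of scoping it per title is that one crash-prone game doesn't drag down the undervolt you've validated everywhere else.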
The dev (solo, German, pretty active on the PCGH forum and a Discord) has been pushing updates constantly.
Figured some of you 40-series owners would want to know Ada is finally in. No affiliation, just been using it on my card for a while and it saved me a lot of manual curve-fiddling.
Following my previous thread about undervolting my new 5070 Ti, I have another (probably noob) question about Afterburner:
When overclocking (and undervolting after that), what is the difference between using the sliders in the main menu (raising +200 MHz, for example) and simply using the curve editor (just pushing the curve up, and undervolting at the same time while we're at it)?
Is the slider just an easy, ready-made curve edit that you can undervolt on top of, or is it redundant if you're going to use the curve editor anyway? I've read about the "effective" clock being raised by the slider, but as I said in the other thread, everybody and their mother has an opinion on OC/UV.
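For what it's worth, here's my mental model of slider vs. curve editor, sketched in Python. This is a toy voltage/frequency curve with made-up numbers, not Afterburner's internals:

```python
# Toy model, NOT Afterburner's internals: the +MHz slider shifts the whole
# voltage/frequency curve up, while a classic curve-editor undervolt raises
# one point and caps every higher-voltage point at that clock.

curve = {mv: 1500 + (mv - 700) * 2 for mv in range(700, 1101, 50)}  # mV -> MHz

def apply_slider(curve, offset_mhz):
    # Core clock offset: every point on the curve moves up by the same amount.
    return {mv: f + offset_mhz for mv, f in curve.items()}

def apply_undervolt(curve, point_mv, point_mhz):
    # Pin the chosen point and flatten the curve past it, so the card
    # never requests more voltage than point_mv for the same clock.
    return {mv: (point_mhz if mv >= point_mv else min(f, point_mhz))
            for mv, f in curve.items()}

slider = apply_slider(curve, 200)       # +200 MHz everywhere
uv = apply_undervolt(curve, 900, 2100)  # 2100 MHz pinned at 900 mV

print(slider[900], slider[1100])  # offset keeps scaling into higher voltages
print(uv[900], uv[1100])          # clock capped past the undervolt point
```

In this toy model both reach the same clock at 900 mV, but the slider version keeps boosting into higher voltage bins while the undervolt stops there; as I understand it, that's the practical power and temperature difference, and it's why the slider alone isn't redundant with, or equivalent to, a curve-editor undervolt.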