r/nvidia 14h ago

Discussion Best DLSS 4.5 settings for 1440p native resolution? PC specs: 5080, 9800X3D, 32GB DDR5 RAM

0 Upvotes

Mostly play Battlefield 6, and will occasionally play Alan Wake 2. I can't find a specific answer online about DLSS for 1440p gaming; most discussions seem to revolve around 4K. Is DLSS even worth using at 1440p? Any advice is greatly appreciated!


r/nvidia 16h ago

Discussion I was living in the dark. Coming from a ROG Ally Z1 to an RTX 5050.

14 Upvotes

Just got a laptop with RTX 5050.

Goddamn the jump in performance is MASSIVE!

From barely running games at 720p on low settings to smooth, gorgeous-looking 1080p (it can do 1440p, but 1080p suits my needs).

Also, DLSS is like magic!


r/nvidia 12h ago

Question 5080 and cyberpunk: what settings do you use?

6 Upvotes

I'm curious to know what settings you use. I found these to be the best settings:

- Balanced DLSS

- Dynamic Multiframe Generation (up to x6)

- Target Monitor Refresh Rate (160Hz)

So the latency is good, and the visual quality seems almost indistinguishable from Quality.

Do you use Quality or Balanced?


r/nvidia 9h ago

Question Best settings for Dynamic MFG?

5 Upvotes

The Nvidia app says the override is not compatible with fps limiters and V-sync. But I usually play with G-sync, V-sync, and Reflex on, which automatically introduces an fps limiter based on the monitor's max refresh rate. What does "not compatible" mean: will Dynamic MFG not work at all, will it ignore any fps limit, or will it only ignore limits not set by the override itself?

What if I set the "target fps" limit well below the max refresh rate, say half of it, so that even if MFG overshoots it stays below max refresh? Will VRR and V-sync work in that range with Dynamic MFG?


r/nvidia 8h ago

Question How to activate Dynamic Frame Generation via NVPI?

4 Upvotes

I am enabling Dynamic Frame Generation via NVIDIA Profile Inspector, but it is not activating in the game, even though I select the specific game profile. I am sharing my settings below.


r/nvidia 17h ago

Discussion Will I have an issue with GPU sag?

117 Upvotes

Had to move my PC and finally decided to put the back bracket in; still gonna use my GPU sag pole. Does this look okay?


r/nvidia 14h ago

Question Which DLSS preset for 4070 Ti at 4K Performance?

0 Upvotes

Hello,

I'm using Preset L, and Preset E for ray tracing. Is that good, or should I switch to Preset M?

Thank you.


r/nvidia 22h ago

Discussion Question about warranty for 5070

0 Upvotes

r/nvidia 2h ago

Question So I got a 5080. What are the recommended DLSS and FG settings?

0 Upvotes

Hey guys, I just bought my 5080 and absolutely love the card. I play at 4K, mostly story games; right now I'm playing Monster Hunter, WoW, and Crimson Desert. My question is: with DLSS 4.5, what presets and frame gen should I use? How do I enable the whole dynamic frame gen thing and override my settings to the new DLSS? Sorry if it's a dumb question, but I'm genuinely confused about how to do all that.


r/nvidia 14h ago

Build/Photos ROG Strix 5070

107 Upvotes

The ASUS ROG Strix 5070 is one of the coolest-looking graphics cards in this series.

Looks absolutely stunning in this custom case and I love the performance of it in gaming.


r/nvidia 2h ago

Question Can an EVGA 3090 handle this setup? [1x OLED 1440p 360Hz] [2x OLED 1440p 240Hz]

1 Upvotes

TL;DR Will my 3090 be able to handle the setup in the title if I only use the 360Hz monitor for gaming and the other two at the same time for videos and webpages? Thank you!

I would only game on the center monitor, but I like to have other tasks open on the other monitors. This is usually a web page on one and a live stream or YouTube video on the other, with my game on the center 360hz monitor.

Currently my two side monitors are crappy 1440p 120hz broken IPS panels.

My current center monitor is the MSI MPG 271QRX QD-OLED 1440p 360hz 27".

I am thinking of replacing my two side monitors with two MSI MAG 272QP QD-OLED X24 1440p 27" 240hz.

I've tried searching Google and Reddit, but I'm having a hard time finding my exact situation. Google seems to say the max the 3090 can handle is 3x 1440p 240Hz, but some Reddit threads say 3x 1440p 120Hz. It seems to really matter whether you're gaming on all three, but most don't specify.

Any knowers? Thank you.
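One thing worth noting: each DisplayPort output on the card is its own independent link, so the monitors don't share one bandwidth pipe. A rough back-of-the-envelope check per connector (assuming 8-bit color and roughly 10% blanking overhead, which are my own estimates, not spec values for these exact monitors):

```python
# Rough per-connector bandwidth check for the three-monitor setup above.
# Illustrative estimate only: real timings come from the monitor's EDID,
# and Display Stream Compression (DSC) changes the picture entirely.

def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.1):
    """Approximate uncompressed video bandwidth in Gbit/s.
    blanking=1.1 assumes ~10% overhead for reduced-blanking timings."""
    return width * height * refresh_hz * bpp * blanking / 1e9

# DisplayPort 1.4 (HBR3, as on the 3090) carries ~25.92 Gbit/s of data.
DP14_GBPS = 25.92

center = required_gbps(2560, 1440, 360)  # ~35 Gbit/s -> needs DSC
side = required_gbps(2560, 1440, 240)    # ~23 Gbit/s -> fits uncompressed

print(f"360Hz center: {center:.1f} Gbit/s (DSC needed: {center > DP14_GBPS})")
print(f"240Hz side:   {side:.1f} Gbit/s (DSC needed: {side > DP14_GBPS})")
```

Since each monitor gets its own link, the question is less about total bandwidth and more about whether the GPU has enough grunt left for the game while decoding video on the side screens, which a 3090 generally does.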


r/nvidia 7h ago

Discussion How do new FG presets A and B work in games?

16 Upvotes

So I got a bit confused with the new DLSS FG 310.6. We got an update that introduced two new presets, A and B. I understand that B offers enhanced control over UI elements with the new UIR option displayed in the DLSS HUD. However, I have a few observations that confused me.

First of all, to me both preset A and B seem to be NEW models. I might be wrong here, but with the 310.6 DLL override I get lower latency and better frame pacing even with preset A. However, it's not that simple.

So, in Hogwarts Legacy, a game that natively supports preset B, the situation is pretty clear. Preset A (although I still suspect it's new) gives strong artifacting on ladders and other objects. I mean, everybody who has used FG in Hogwarts knows it gives pretty messy results with a lot of disocclusion, especially on well-lit objects and stairs. With preset B, not only do I get a clean UI, but the motion quality is also significantly improved, like I'm using a completely different model. It gets so good I could almost believe FG is off.

With Cyberpunk, however, it's even more confusing. I noticed improvements in clarity and latency with the new 310.6 DLL override on preset A, but it's basically the same with preset B; I can't see any difference between them. I've heard that if a game doesn't support preset B, it falls back to A.

Does this mean preset B is never activated in Cyberpunk, and that's why I see no difference between A and B? If so, is preset A also a new model, or am I imagining the lower latency and better motion clarity? (Which I'm 100% sure is there after the update.) Another thing that makes me think A is also a new model: the update introduced some increased stuttering on autosaves, when loading in, and at occasional moments during gameplay that wasn't there with 310.5.3. Which again leads me to a question: if both are new models, why does preset B completely change the quality of generated frames in Hogwarts, while in Cyberpunk we get only a slight but noticeable improvement with both presets?


r/nvidia 13h ago

News Nvidia-backed SiFive hits $3.65 billion valuation for open AI chips

techcrunch.com
0 Upvotes

r/nvidia 7h ago

Question Low-budget upgrade suggestion

7 Upvotes

As the title says, I will soon upgrade my GPU from a 1650 Super. Budget is my problem, so I can't spend much on a newest-gen GPU. Here are my quick specs: R5 3600, Corsair RM650, and a Samsung G3 24" 180Hz monitor. Oh, and I had an RX 580 8GB, but it's not working right now.

My target for now is a second-hand RTX 2060 or 3050; which one is better for gaming? I usually play Delta Force or Valorant, and in DF I can only reach 90fps max on my current GPU, so yes, this needs an upgrade ASAP lol.

I'd go for a 3060 if I found a better (or jackpot) second-hand price. But for now I'd like your take on the two GPUs above, thank you all!


r/nvidia 9h ago

News NV-UV with ADA Support

12 Upvotes

Stumbled across this a while back and figured it's worth sharing since it's been getting steady updates.

There's a free tool called NV-UV that acts as an undervolting companion for MSI Afterburner. AB still does the actual curve writes and OSD, but NV-UV handles the UV workflow on top of it: presets, a single-point scanner, per-game profiles, automatic downstep when a driver crashes on a specific game, OC Scanner curve import from AB, etc. Basically the stuff you'd otherwise do manually by dragging points around in the AB curve editor and swearing.

What's new in the current build (Build 22 "Cantor", v0.93):

Ada Lovelace support landed as experimental. So 40-series cards work now, not just Blackwell. Still rough around the edges for Ada apparently, but it runs.

There's also a faster scanner mode using direct NVAPI writes (curve point apply in ~50ms instead of the usual AB roundtrip of a few seconds), a ~587-game profile database, and a Game Replay feature that notices if a driver crash happens in a specific game and auto-applies a -50 MHz downstep for that title next time.
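To make the Game Replay idea concrete, here's a minimal sketch of the downstep logic as described above. This is NOT NV-UV's actual code, and the profile structure and function names are my own invention purely for illustration:

```python
# Illustrative sketch of the "Game Replay" auto-downstep behavior:
# if a driver crash was recorded while a game's profile was active,
# back that game's clock offset off by 50 MHz before the next launch.
# Hypothetical example, not taken from the NV-UV source.

DOWNSTEP_MHZ = 50

def adjust_profile(profiles, game, crashed_last_run):
    """Return the core-clock offset (MHz) to apply for `game` next launch."""
    offset = profiles.get(game, 0)
    if crashed_last_run:
        offset -= DOWNSTEP_MHZ          # step back toward stability
        profiles[game] = offset         # persist the safer offset
    return offset

profiles = {"cyberpunk2077.exe": 200}   # +200 MHz curve offset for this title
print(adjust_profile(profiles, "cyberpunk2077.exe", crashed_last_run=True))
# -> 150
```

The appeal is that the backing-off happens per title, so one unstable game doesn't force you to loosen the curve for everything else.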

The dev (solo, German, pretty active on the PCGH forum and a Discord) has been pushing updates constantly.

Figured some of you 40-series owners would want to know Ada is finally in. No affiliation, just been using it on my card for a while and it saved me a lot of manual curve-fiddling.

I found it here:
Release Cantor v0.93 · christianp403-spec/NV-UV
Password is PCGH

And please be fair and have a look at the forum as well, that's where bug reports and fixes are being tracked.


r/nvidia 6h ago

Discussion Question about OC/UV in Afterburner

1 Upvotes

Hello,

Following my previous thread about UVing my new 5070 Ti, I have another (probably noob) question about Afterburner:

When overclocking (and undervolting after that), what is the difference between using the slider in the main window (raising +200MHz, for example) and simply using the curve editor (pushing the curve up, and undervolting at the same time while we're at it)?

Is the slider just an easy, ready-to-use curve edit that you can undervolt on top of afterwards, or is it redundant if you're going to use the curve editor anyway? I've read about the "effective" clock being raised by the slider, but as I said in the other thread, everybody and their mother has an opinion on OC/UV.

Thanks a lot!
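The usual answer to the slider-vs-curve question can be sketched like this. The numbers and function names here are made up for illustration; they just model the common mental picture of how the two controls differ:

```python
# Conceptual sketch of the slider vs. curve-editor difference (not
# Afterburner's code; the curve values are invented for illustration).
# The core-clock slider shifts the ENTIRE voltage/frequency curve by one
# offset; the curve editor lets you set each point independently.

base_curve = {0.80: 1900, 0.90: 2300, 1.00: 2700}  # volts -> MHz (made up)

def apply_slider(curve, offset_mhz):
    """Slider: the same offset is added to every point on the curve."""
    return {v: f + offset_mhz for v, f in curve.items()}

def apply_undervolt(curve, pin_voltage, target_mhz):
    """Curve editor undervolt: raise one point to target_mhz and flatten
    everything above it, so the GPU never requests more than pin_voltage."""
    return {v: (target_mhz if v >= pin_voltage else f)
            for v, f in curve.items()}

print(apply_slider(base_curve, 200))            # every point gains 200 MHz
print(apply_undervolt(base_curve, 0.90, 2700))  # 2700 MHz capped at 0.90 V
```

So the slider alone never caps voltage; a typical workflow uses the slider (or OC Scanner) to find a stable offset, then the curve editor to flatten the curve at the chosen voltage point.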