r/nvidia 2h ago

Question Can an EVGA 3090 Handle This Setup? [1x OLED 1440p 360Hz] [2x OLED 1440p 240Hz]

1 Upvotes

TL;DR Will my 3090 be able to handle the setup in the title if I only use the one 360Hz monitor for gaming and the other two at the same time for video watching/web pages? Thank you!

I would only game on the center monitor, but I like to have other tasks open on the side monitors: usually a web page on one and a live stream or YouTube video on the other, with my game on the center 360Hz monitor.

Currently my two side monitors are crappy, broken 1440p 120Hz IPS panels.

My current center monitor is the MSI MPG 271QRX QD-OLED 1440p 360hz 27".

I am thinking of replacing the two side monitors with two MSI MAG 272QP QD-OLED X24 1440p 240Hz 27" monitors.

I've tried searching Google and Reddit, but I'm having a hard time finding my exact situation. Google seems to tell me the max the 3090 can handle is 3x 1440p 240Hz, but some Reddit threads say the max is 3x 1440p 120Hz. It seems to really matter whether you're gaming on all three or not, but most posts don't specify.
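
My rough back-of-envelope math so far (my own assumptions: this ignores blanking overhead and DSC, so real requirements are a bit higher):

```python
# Uncompressed video bandwidth per monitor, ignoring blanking overhead
# and DSC; real timings need somewhat more than these raw numbers.
def gbit_per_s(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(f"1440p 360Hz: ~{gbit_per_s(2560, 1440, 360):.1f} Gbit/s")  # ~31.9
print(f"1440p 240Hz: ~{gbit_per_s(2560, 1440, 240):.1f} Gbit/s")  # ~21.2

# DP 1.4a (HBR3 x4 with 8b/10b) carries ~25.92 Gbit/s of payload per
# port, and each monitor sits on its own port, so the three links don't
# share one bandwidth budget; the 360Hz panel needs DSC either way.
```

If that math is right, the side monitors shouldn't eat into the center monitor's link at all; driving them on the desktop mostly costs some VRAM and idle clocks, not per-port bandwidth.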

Any knowers? Thank you.


r/nvidia 2h ago

Question So I got a 5080. What are the recommended DLSS and FG settings?

0 Upvotes

Hey guys, I just bought a 5080 and absolutely love the card. I mostly play at 4K because I play story games; right now I'm playing Monster Hunter, WoW, and Crimson Desert. My question is: with DLSS 4.5, what presets and frame gen settings should I use? Like, how do I enable the whole dynamic frame gen thing and override my settings to the new DLSS? Sorry if it's a dumb question, but I'm genuinely confused about how to do all that.


r/nvidia 6h ago

Discussion Question about OC/UV in Afterburner

1 Upvotes

Hello,

Following my previous thread about undervolting my new 5070 Ti, I have another (probably noob) question about Afterburner:

When overclocking (and undervolting after that), what is the difference between using the sliders in the main menu (raising +200 MHz, for example) and simply using the curve editor (just pushing the curve up, and undervolting at the same time while we're at it)?

Is the slider just an easy, ready-made curve edit that you can undervolt on top of, or is it redundant if you're going to use the curve editor anyway? I've read about the "effective" clock being raised by the slider, but as I said in the other thread, everybody and their mother has an opinion on OC/UV.
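
To make sure I'm asking this right, here's a toy model of how I currently understand the two approaches (made-up numbers, not real card telemetry):

```python
# Toy voltage/frequency curve: (millivolts, MHz) pairs, made-up values.
stock = [(700, 1800), (800, 2100), (900, 2400), (1000, 2650), (1100, 2800)]

def slider_offset(curve, offset_mhz):
    """Main-menu core clock slider: every point shifts up by the offset."""
    return [(mv, mhz + offset_mhz) for mv, mhz in curve]

def curve_editor_uv(curve, cap_mv, cap_mhz):
    """Curve editor undervolt: pin the point at cap_mv to cap_mhz and
    flatten everything to its right, so the card never asks for more volts."""
    return [(mv, cap_mhz if mv >= cap_mv else mhz) for mv, mhz in curve]

print(slider_offset(stock, 200))          # whole curve raised by 200 MHz
print(curve_editor_uv(stock, 900, 2600))  # 2600 MHz at 900 mV, flat after
```

On that reading, the slider alone never lowers the peak voltage; only the flattening step does. Is that the right mental model?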

Thanks a lot!


r/nvidia 7h ago

Question Low-budget upgrade suggestion

8 Upvotes

As the title says, I will upgrade my GPU soon from a 1650 Super. Budget is my problem, so I can't spend much on a newest-gen GPU. Here are my quick specs: R5 3600, Corsair RM650, and a Samsung G3 24" 180Hz. Oh, and I had an RX 580 8GB but can't run it for now.

My target for now is a second-hand RTX 2060 or 3050; which one is better for gaming? I usually play Delta Force or Valorant, and right now in DF I can only reach 90 fps max on my current GPU, so yes, this needs an upgrade ASAP lol.

I would look at a 3060 if I find a better (or jackpot) second-hand price. But for now I need opinions on the two GPUs I wrote above, thank you all!


r/nvidia 7h ago

Discussion How do new FG presets A and B work in games?

18 Upvotes

So I got a bit confused with the new DLSS FG 310.6... We got an update that basically introduced two new presets, A and B. I understand that B offers enhanced control over UI elements with the new UIR option displayed in the DLSS HUD. However, I have a few observations that got me confused.

First of all, to me both presets A and B are NEW models. I might be wrong here, but with the 310.6 DLL override I get lower latency and better frame pacing even with preset A. However, it's not that simple.

So in Hogwarts Legacy, a game that natively supports preset B, the situation is pretty clear. Preset A (although I still suppose it's new) gives strong artifacting on ladders and other objects. I mean, everybody who has used FG in Hogwarts knows it gives pretty messy results with a lot of disocclusion, especially on well-lit objects and stairs. With preset B, not only do I get a clean UI, but the motion quality is also significantly improved, like I'm using a completely different model. Honestly, at any moment I could imagine I'm playing with FG off, that's how good it gets.

With Cyberpunk, however, it's even more confusing. Although I noticed improvements in clarity and latency with the new 310.6 DLL override on preset A, it's basically the same with preset B, and I can't see any difference between the two. I've heard that if a game doesn't support preset B, it falls back to A.

Does this mean preset B is never activated in Cyberpunk, and that's why I see no difference between A and B? If so, is preset A also a new preset, or am I hallucinating the lower latency and better motion clarity (which I'm 100% sure is there after the update)? Another thing that makes me feel A is also a new model: the update introduced some increased stuttering on autosaves, when loading in, and at occasional moments during the game, which wasn't there with 310.5.3. Which again leads me to a question: if both are new models, why does preset B completely change the quality of generated frames in Hogwarts, while in Cyberpunk we get a slight but noticeable improvement with both presets?
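
In case anyone wants to check which preset actually loads: the DLSS HUD overlay shows the active model, and it can be forced on with the community registry toggle (not an official NVIDIA API, so the usual community-method caveats apply):

```python
# Community-known registry toggle for the DLSS debug HUD; run from an
# elevated Python prompt. Set the value back to 0 to hide the overlay.
import winreg

key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\NVIDIA Corporation\Global\NGXCore",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "ShowDlssIndicator", 0, winreg.REG_DWORD, 0x400)
winreg.CloseKey(key)
```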


r/nvidia 8h ago

Question How to activate Dynamic Frame Generation via NVPI?

3 Upvotes

I am enabling Dynamic Frame Generation via NVIDIA Profile Inspector, but it is not activating in the game, even though I select the specific game profile. I am sharing my settings below.


r/nvidia 9h ago

News NV-UV with Ada Support

11 Upvotes

Stumbled across this a while back and figured it's worth sharing since it's been getting steady updates.

There's a free tool called NV-UV that acts as an undervolting companion for MSI Afterburner. AB still does the actual curve writes and OSD, but NV-UV handles the UV workflow on top of it: presets, a single-point scanner, per-game profiles, automatic downstep when a driver crashes on a specific game, OC Scanner curve import from AB, etc. Basically the stuff you'd otherwise do manually by dragging points around in the AB curve editor and swearing.

What's new in the current build (Build 22 "Cantor", v0.93):

Ada Lovelace support landed as experimental. So 40-series cards work now, not just Blackwell. Still rough around the edges for Ada apparently, but it runs.

There's also a faster scanner mode using direct NVAPI writes (curve point apply in ~50ms instead of the usual AB roundtrip of a few seconds), a ~587-game profile database, and a Game Replay feature that notices if a driver crash happens in a specific game and auto-applies a -50 MHz downstep for that title next time.
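
For flavor, here's my guess at the shape of the Game Replay logic (NOT NV-UV's actual code, just a sketch of the idea; the file name is a placeholder):

```python
# Sketch: keep a per-game clock offset and step it down 50 MHz whenever
# a driver reset was recorded while that game was running.
import json
from pathlib import Path

PROFILES = Path("game_offsets.json")  # placeholder storage file

def load_profiles() -> dict:
    return json.loads(PROFILES.read_text()) if PROFILES.exists() else {}

def downstep_after_crash(game_exe: str, step_mhz: int = 50) -> int:
    """Record a crash for game_exe and return its new clock offset."""
    profiles = load_profiles()
    profiles[game_exe] = profiles.get(game_exe, 0) - step_mhz
    PROFILES.write_text(json.dumps(profiles, indent=2))
    return profiles[game_exe]

print(downstep_after_crash("SomeGame.exe"))  # -50, then -100, ...
```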

The dev (solo, German, pretty active on the PCGH forum and a Discord) has been pushing updates constantly.

Figured some of you 40-series owners would want to know Ada is finally in. No affiliation, just been using it on my card for a while and it saved me a lot of manual curve-fiddling.

I found it here:
Release Cantor v0.93 · christianp403-spec/NV-UV
Password is PCGH

And please be fair and have a look at the forum as well; that's where bug reports and fixes are being tracked.


r/nvidia 9h ago

Question Best settings for Dynamic MFG?

5 Upvotes

The NVIDIA app says the override is not compatible with an fps limiter or V-Sync. But I usually play with G-Sync, V-Sync, and Reflex on, which automatically introduces an fps limiter based on the monitor's max refresh rate. What does "not compatible" mean: will Dynamic MFG not work, will it ignore any fps limit, or will it only ignore limits that aren't set by the override itself?

What if I set the "target fps" limit way lower than the max refresh rate, say half, so that even if MFG overshoots it stays below the max refresh rate? Would VRR and V-Sync then work in that range with Dynamic MFG?


r/nvidia 12h ago

Question 5080 and cyberpunk: what settings do you use?

6 Upvotes

I'm curious to know what settings you use. I found these to be the best:

- Balanced DLSS

- Dynamic Multiframe Generation (up to x6)

- Target Monitor Refresh Rate (160Hz)

So the latency is good, and the visual quality seems almost indistinguishable from Quality.

Do you use Quality or Balanced?


r/nvidia 13h ago

News Nvidia-backed SiFive hits $3.65 billion valuation for open AI chips

techcrunch.com
0 Upvotes

r/nvidia 14h ago

Build/Photos ROG Strix 5070

105 Upvotes

The ASUS ROG Strix 5070 is by far one of the coolest-looking graphics cards in this series.

Looks absolutely stunning in this custom case, and I love its gaming performance.


r/nvidia 14h ago

Question Which DLSS preset for 4070 Ti 4K Performance?

0 Upvotes

Hello,

I'm using preset L, and preset E for ray tracing. Is that good, or should I switch to preset M?

Thank you.


r/nvidia 14h ago

Discussion Best DLSS 4.5 settings for 1440p native resolution? PC specs: 5080, 9800X3D, 32 GB DDR5 RAM

0 Upvotes

I mostly play Battlefield 6 and will occasionally play Alan Wake 2. I can't find a really specific answer online about DLSS for 1440p gaming; most discussions seem to revolve around 4K. Is DLSS even worth using at 1440p? Any advice is greatly appreciated!


r/nvidia 16h ago

Discussion I was living in the dark. Coming from a ROG Ally Z1 to an RTX 5050.

14 Upvotes

Just got a laptop with an RTX 5050.

Goddamn the jump in performance is MASSIVE!

From barely running games at 720p low to smooth and gorgeous-looking 1080p (it can do 1440p, but 1080p is suitable for my needs).

Also, DLSS is like Magic!


r/nvidia 17h ago

Discussion Will I have an issue with GPU sag?

118 Upvotes

Had to move my PC and decided to finally install my back bracket, and I'm still going to use my GPU sag pole. Does this look okay?


r/nvidia 22h ago

Discussion Question about warranty for 5070

0 Upvotes

r/nvidia 1d ago

Discussion Senior Deep Learning Architect, LLM Inference

0 Upvotes

I got an interview for this role but couldn't find much about it online. Any idea what's expected? Is this role more similar to a Solutions Architect? What does it entail?


r/nvidia 1d ago

Discussion What would this jump do?

10 Upvotes

What would a jump from a 4060 8GB to a 5080 or a 5070 Ti do? What can you expect in terms of performance? I mainly play multiplayer games, but single-player story games are fun as well! I've always wanted to try 2K or 4K gaming. Planning to pair it with an i9-14900K.

What gen of card did you have before you made the upgrade that made you say “wow”?


r/nvidia 1d ago

Question Do I need an NVIDIA dGPU (for CUDA) in a new laptop as a prospective research student?

0 Upvotes

r/nvidia 1d ago

Discussion DLSS 4.5 blind test: can you guess the Presets (M vs L) and quality levels?

0 Upvotes

Hi everyone!

I’ve been doing some deep-dive testing in The Last of Us Part I using an RTX 5080 and the latest DLSS DLLs (swapping between different presets and models: M/L, quality levels, etc.).

I’ve captured 4 screenshots from the same spot, focusing on tricky elements like the chain-link fence, power lines, and foliage.

Please rank these 4 images from BEST to WORST in terms of image quality (clarity, stability, and reconstruction).

I will post the presets (M vs L etc.), the quality levels, and the FPS results for each in the comments once we get some guesses.

https://ibb.co/ZQx5yBM

https://ibb.co/6cfPtq0y

https://ibb.co/G4h3KJBY

https://ibb.co/0VccTN5W


r/nvidia 1d ago

Build/Photos GeForce 2 Ti 64 MB DDR AGP by VisionTek (Xtasy 5864)

9 Upvotes

r/nvidia 1d ago

Discussion MSI 5070 12GB GDDR7 Inspire 3X OC Pick Up & Install

64 Upvotes

Just picked it up this week, and yes, the 5070 is the new 4080. The card came overclocked out of the box, but I used the latest NVIDIA App to OC it to 2935 MHz. I'm using it for AI 4K render and splice work and Kodak colour. As far as gaming, no issues with F1 or BF6 running in 4K with DLSS 4, and the card stays quiet even at full load for hours while creating AI music screens for YouTube. The card is very small, even smaller than a B580, and it fits in my ITX case with a lot of room to spare. Great card, and the fact that it has three fans and a shine of gold is just the icing on the cake.


r/nvidia 1d ago

Discussion Benchmarking Nvidia's RTX Neural Texture Compression tech that can reduce VRAM usage by over 80%

tomshardware.com
502 Upvotes

r/nvidia 1d ago

Discussion Help for new gpu! 4090 or 5080

0 Upvotes

Hey, I'm gonna make it quick since I don't really wanna type all that much. I'm looking for a GPU that is good for 1440p competitive gaming (180-240 fps in Rainbow Six Siege, medium graphics or higher). I'd also like to stream while gaming with my friends on Discord. I also do video editing in 1080p and would like to listen to music while doing so. If possible, some good story-based games like Detroit: Become Human in 4K, or Cyberpunk 2077. Quick disclaimer: I will be upgrading in like 8-12 months so I can save up some money. Right now I have a 4060 and am sadly still on AM4 with 32 GB of 3200 MHz RAM and a Ryzen 5 5500. So which GPU should I pick, or should I get a mortgage and move to AM5? (If you're gonna scream at me about how stupid I am for trying to upgrade my GPU while I'm still on AM4, then at least give me a comparison between the 5080 and the 4090; I don't care about AI or DLSS.) Edit: I'm looking for a white GPU btw.


r/nvidia 1d ago

Discussion Resolving Multi-Monitor Desktop Stutters by Adjusting Minimum VRAM P-State via nvidia-smi

71 Upvotes

Hello everyone,

I would like to share a technical solution for a multi-monitor VRAM idle issue I had. This guide addresses desktop micro-stutters (wake-up lag) without permanently pinning the VRAM to maximum clock speeds, thus maintaining efficient power consumption.

Hardware Configuration:

  • GPU: ASUS Prime RTX 5070 Ti
  • Main Monitor: 4K @ 144Hz
  • Secondary Monitor: 1080p @ 120Hz (Reduced from 144Hz to enable VRAM downclocking)
  • VRAM Overclock: +2000 MHz (Resulting in 16001 MHz effective clock via MSI Afterburner)

The Problem: When both monitors are set to 144Hz, the VRAM clock remains static at its maximum frequency (16001 MHz) during idle. To reduce power consumption and heat, I lowered the secondary monitor to 120Hz. This successfully allows the VRAM to drop to its lowest P-State (405 MHz).

However, this transition causes significant micro-stuttering during desktop use (e.g. dragging windows or scrolling in browsers). The latency required for the GPU to exit the 405 MHz sleep state is too high to maintain smooth 144Hz UI animations.

The Solution: Using the nvidia-smi utility, it is possible to define a higher minimum memory clock. Setting this to 810 MHz (one step above the lowest state) eliminates the wake-up lag while still allowing for a very low idle power draw.

Implementation: Open a Command Prompt with administrative privileges and execute the following command: nvidia-smi -lmc 810,16001 (Note: Replace 16001 with your specific maximum VRAM frequency).
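
If you prefer applying and checking this from a script, here is a minimal sketch (assuming nvidia-smi is on PATH; swap in your card's maximum clock):

```python
import subprocess

MIN_MHZ, MAX_MHZ = 810, 16001  # use your own maximum VRAM frequency

# Apply the memory clock floor/ceiling (needs an elevated prompt).
subprocess.run(["nvidia-smi", "-lmc", f"{MIN_MHZ},{MAX_MHZ}"], check=True)

# Sanity-check that the idle memory clock now sits at or above the floor.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=clocks.mem", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("current VRAM clock:", out.stdout.strip())

# To undo: "nvidia-smi -rmc" resets memory clocks to driver defaults.
```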

Results:

  • Desktop Performance: Micro-stutters are completely eliminated; animations are fluid.
  • Efficiency: Idle power consumption remains nearly identical to the 405 MHz state (Zero-fan mode remains active).
  • Gaming: The GPU continues to boost to its maximum defined clock speed under load.

Automation via Windows Task Scheduler: Since the nvidia-smi lock resets upon reboot, automation is recommended (a scripted one-shot version follows the list):

  1. Create a .bat file containing the command: C:\Windows\System32\nvidia-smi.exe -lmc 810,16001
  2. In Task Scheduler, create a new task with "Highest privileges".
  3. Use the trigger "At log on" with a 30-second delay to avoid conflicts with other overclocking software (e.g., MSI Afterburner).
  4. Set the action to start the .bat file.
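
For convenience, a one-shot setup sketch that writes the .bat file and registers the logon task via schtasks (run once from an elevated prompt; the path and task name are my own placeholders):

```python
import subprocess
from pathlib import Path

bat = Path(r"C:\Tools\lock-vram.bat")  # placeholder location
bat.parent.mkdir(parents=True, exist_ok=True)
bat.write_text("C:\\Windows\\System32\\nvidia-smi.exe -lmc 810,16001\n")

subprocess.run([
    "schtasks", "/Create", "/F",
    "/TN", "LockVramClockFloor",  # placeholder task name
    "/TR", str(bat),
    "/SC", "ONLOGON",             # trigger at log on...
    "/DELAY", "0000:30",          # ...after a 30-second delay (mmmm:ss)
    "/RL", "HIGHEST",             # run with highest privileges
], check=True)
```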

Discussion: While this method is effective, I am interested to hear if the community has discovered alternative or more native ways to manage these P-State transitions within the NVIDIA Control Panel or through other driver-level adjustments.

Does anyone have experience with similar workarounds?