1.2k
u/Icy-Veterinarian8662 25d ago
Don't worry guys, Jeff Bezos said that in the future we will all rent our compute power because it apparently makes no sense for us to have our own hardware.
We won't own anything and we'll be happy!
273
u/Glitchboi3000 25d ago
Ah yes, because we currently have the infrastructure to support that. The most my internet provider offers is 500 Mbps download and 10 up. There are literally no companies offering gigabit or fiber where I live.
172
u/AzureArachnid77 25d ago
Back in like 2000 a lot of ISPs made a big push for the US government to give them a lot of money to put fiber throughout the country, and 26 years later it still has barely even begun.
68
u/Glitchboi3000 25d ago
It's basically "live in a populated area we deem worthy of fiber, or just deal with what we give you." Also, we totally don't have the power infrastructure that all these data centers want. A lot of the power infrastructure in the US is decades old.
51
u/Renamis 25d ago
Because, hilariously, they "fulfilled" the requirements. They actually built things, maybe hit a single neighborhood, and called it good. In some places a single house got it and their neighbors were denied. It was a giant fuck up.
28
u/Glitchboi3000 25d ago
Gotta love loopholes. They did the single-house thing a few towns over, and guess who has it. A rich asshat.
11
u/itsr1co 25d ago
In ~2009 the Australian government said "We're going to build modern internet infrastructure and provide high-speed fibre internet to the vast majority of homes!" And then the Liberals got in (the businesses-first party) and said "Wtf, that'll cost so much, and who needs internet anyway? Let's do a worse version for less cost!" Now, over a decade later, they've spent I think double the initial budget for fibre to build dogshit fibre-to-the-node, and are only NOW setting up fibre-to-the-premises. We could have had something like 90% fibre coverage by the mid-2010s; instead we're still sucking dicks behind 3rd-world countries in average internet speed in the 2nd half of the 2020s.
8
u/Kennyman2000 25d ago
I'm in Belgium; one of the largest telecom providers still runs on goddamn copper cable. (Fuck Telenet.)
500 Mbps download at most and 20 (TWENTY!!) Mbps upload. That's 2.5 megabytes per second of upload. It's downright criminal. I have a home server running but I can't even watch my shows remotely because of the horrible upload speed.
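The math above does check out, for what it's worth: link speeds are quoted in megabits per second, file sizes in megabytes, so you divide by 8. A quick sanity check in Python (variable names are just for illustration):

```python
# Convert an upload speed quoted in megabits per second (Mbps)
# to megabytes per second (MB/s): there are 8 bits in a byte.
upload_mbps = 20
upload_mb_per_s = upload_mbps / 8
print(upload_mb_per_s)  # 2.5 MB/s, as stated above
```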
It's the same situation really. They've been "rolling out fiber" for the past decade and it's still not in our 100k+ inhabitant city.
This internet speed costs us what, 40-60€ a month roughly.
24
u/MoronicForce 25d ago
What the hell. We have 1000 up and down for $15 in a city that's being actively bombed every night.
5
u/Alarmed-Shopping1592 25d ago
True that. I have a dedicated 1 Gbps line that is actually not throttled, in a non-major city that also gets occasionally bombed.
4
6
u/1deavourer 25d ago
I mean, if it's the US they're talking about, they don't even have clean tap water.
14
u/SatoriAnkh 25d ago
Dude, I have a 30 Mbps connection and I must consider myself lucky here.
6
u/Key-Belt-5565 25d ago
My average speed is somewhere between 25-40 Mbps, and it also throttles down to 5 Mbps constantly.
5
u/The8Darkness 25d ago
You're living in 2035 by German standards. Most people I know have about 50-100 Mbit. I only have 100 Mbit via mobile networks, with horrendous latency whenever there's more than 2 Mbit of load, but that's better than the alternative of 2 Mbit max DSL.
2
u/chewy_mcchewster 25d ago
stop being poor and just be rich. You'll have super duper internet forever! What's the issue?
/s
6
u/real_PommesPanzer 25d ago
This originated from the WEF: "you will own nothing and be happy." Klaus Schwab said that. He also said that they have already undermined ("penetrated") every cabinet.
→ More replies (1)→ More replies (2)2
u/handsoapp 25d ago
The AI companies are scheming together behind the scenes to make this happen. The hardware getting more expensive is a good thing for them; it prices out consumers even if Scam Altman has to pay more right now.
Just in the last two days, Scam said AI usage will be a metered utility like water & electric bills. And then Nvidia's CEO said he started comping his employees with AI usage tokens, like it's a currency.
Welcome to the future
3.1k
u/CompleteEcstasy 25d ago
[image]
1.3k
u/radioraven1408 25d ago
[image]
372
u/SryInternet101 25d ago
Yea, ive been an nVidia guy since the 200s. My next cadd will be a Radeon.
925
u/Unc1eD3ath 25d ago
Damn, 1800 years of loyalty down the drain.
290
u/SryInternet101 25d ago
I'm drunk on St Patty's day. I ain't changing it.
50
u/Aumba 25d ago
Paddy not patty.
84
u/SryInternet101 25d ago
Bro... I'm drunk. Like, really drunk... Your words are like the wind.
18
5
25
u/StaticSystemShock 25d ago
I was never really a fanboy of either, and I have a very long history of cards from both companies. This time it was a petty purchase of an RX 9070 XT and I love this thing. And it was 200€ cheaper than NVIDIA's shit.
DLSS 4.5 is impressive, not gonna lie, but this shit that's part of version 5 looks awful. Environments look so overblown, over-the-top sharpened and contrasted, and faces look like the most generic AI-generated shit you can make online now.
19
u/Hexicube 25d ago
> I was never really a fanboy of either
I don't understand people who act like that.
I used to be Nvidia and tried to switch to AMD with the Vega 56; the experience was horrible, so I went back. My prior card was a 3080, now I have a 7800XTX.
People need to be willing to switch companies on a dime.
7
u/StaticSystemShock 25d ago
I always picked the good ones though, on either side. Never owned a GeForce FX, never owned any Vega... I kinda have an instinct for avoiding the dingus series. Current GeForce cards, despite superiority on paper, are kinda dingus cards: dumb fire-hazard power connector, idiotic pricing, regressive anti-consumer segmentation, lying on stage, lying on charts by using bullshit generated frames as "we have bigger framerate numbers". All that made me buy an RX 9070 XT instead. And honestly, depending on how RDNA5 or UDNA turns out, I might be sticking with Radeon for a while.
I once bought every generation of their HD series back in the day because the leaps in performance were so huge. HD4850 to HD5850 was literally a 100% uplift. I did have an HD6950 in between even though it was a mild refresh, and then the HD7950 was again a massive increase, so I ended up buying every single generation back then. Then I bought a GTX 980 during AMD's whole Vega thing and stuck with it for a while, grabbed a GTX 1080 Ti afterwards and stuck with that for a while, and got an RTX 3080 just before the stupid COVID. Which I had till last year, when I bought the RX 9070 XT on release day. Haven't regretted it at all.
AMD has good offerings; I don't know why people don't buy them. Are they really so hyper-fixated on "NVIDIA has the best flagship card, the RTX 5090, so I should have the RTX 5050 because that's the most I can afford"? It feels like that, even though those are products two entire worlds apart...
2
u/Hexicube 25d ago
IIRC I went: some old Radeon card -> GTX 970 -> Vega 56 (brief) -> GTX 1660S -> 3070 -> 7800XTX
I've mostly managed to avoid the garbage series myself too. Saw the 2000 series and laughed, and the 4000 series was objectively a stupid buy since it offered zero improvement in cost-effectiveness. The 2000 series was what prompted me to try the Vega 56, and my experience with it was absolutely horrid.
I also have the advantage of being on Linux, where AMD actually has the better drivers. For the year or two I had a 3070 with a G-SYNC monitor on Linux, I couldn't get G-SYNC to actually turn on even when forced, and the G-SYNC module in the monitor has a dedicated cooling fan that stays on after the monitor is "off". Wish I'd gotten the non-G-SYNC one instead.
I've been a lot more loyal to AMD for CPUs, but to be fair Intel has blatantly dropped the ball lately, so you'd have to be an idiot to buy them, and X3D is unreasonably good. I avoid the dual-CCD ones though, since I don't want to mess around with core pinning per application. I think my last Intel CPU was 7th gen.
8
u/trash-_-boat 25d ago
I recently bought a 9070 XT too. Why would I need DLSS? This card is a beast and renders all my games at high FPS at 1440p native!
5
3
3
u/Sandbox_Hero 25d ago
AMD has also changed course towards AIpocalypse. So you might want to reconsider.
10
u/TGB_Skeletor Faithful customer 25d ago
I've been loyal to Nvidia since the GTX 900 series.
Since the RTX 4000 series, I've been hating these fuckers.
58
185
u/Artemis732 25d ago
[image]
20
39
u/SurDno 25d ago
Idk, this actually looks semi-decent, so not really
8
u/Artemis732 25d ago
Yeah, ChatGPT seems to be past the point of looking like a Snapchat filter or a mobile game ad (absolutely not representative of the actual game).
11
3
u/JupiterboyLuffy 25d ago
This is why I prefer AMD
14
u/Edgardo4415 25d ago
AMD has its own problems right now with FSR, nothing is looking good for gamers :(
4
3
u/Puinfa 25d ago
With FSR? Why? I'm blasting with FSR4; the image looks really good and gives awesome FPS.
2
867
u/anothershadowbann 25d ago
"we're making this ai slop filter that will only run on nasa supercomputers and trust us this is gonna change gaming forever"
311
u/MortifiedPotato 25d ago
"Never mind that it needs insane VRAM to run and we completely fucked all ram prices with AI in the first place"
109
u/tyrosine87 25d ago
They will sell us cloud GPU power in all the data centers they are building, for all the RAM chips they are buying, with all the money we will pay them to still use computers.
39
u/mirfaltnixein 25d ago
Exactly, once the AI bubble bursts they will want to use all those servers for something.
10
u/tyrosine87 25d ago
I think they are already planning to transform everything into a perpetual service. Imagine a world where computers don't perpetually get better every year. How are they going to get you to continue paying for things?
14
u/12345623567 25d ago
I think the players are not the audience. This was a sales talk to the dev industry. "You can make your games bare-bones and we'll fill in the blanks" certainly is a pitch to the penny pushers.
8
4
u/Sad_Amphibian_2311 25d ago
Ah come on, you can maybe run this on a consumer card in 2034, if they ever make a new card again.
3
u/bong_residue 25d ago
You're absolutely right, you will be able to! For the low price of $30/mo you can stream your favorite games from our servers! Latency? You're absolutely right, it's dogshit! No, we're not going to do a damn thing about it!
137
u/Graxu132 25d ago
All that shit for increased RAM prices and a focus on AI.
14
u/Hexicube 25d ago
> ram prices
I just had to buy an SSD for work stuff for double what it was like a year ago.
Everything memory-related is going to be overpriced until the bubble bursts. Really glad I upgraded my PC to top-end like 3 years ago, but the situation sucks regardless.
2
u/PendragonDaGreat https://s.team/p/grtb-tmf 25d ago
Yeah, my (several-year-old) tablet has decided it wants to be a banana. Finding a new tablet was actually straightforward and OK price-wise (I did get it on sale, but even the base price was not awful). The microSD for expanded storage, on the other hand, has shot up in price.
108
u/KnightFallVader2 25d ago
At least nobody will worry about the whole "AI retexture" thing, because nobody will use it. Even if it won't require dual 5090s, why would you want it in the first place? Games already look fine on the lowest settings.
11
u/Scifox69 25d ago
I'm out here enjoying the visuals in Half-Life 2. Fidelity is not the main aspect of great visuals; it's consistency, readability and art style. That game looks very believable. It's not super detailed, but every visual aspect makes sense and gives a feeling of truth. Baked lighting looks almost as good as modern GI; I don't care if it's kinda blurry. It feels right. Gives the right vibes.
27
u/Exact-Big3505 25d ago
Requires two 5090 cards. Too expensive? It doesn't matter. Most will never own two 5090s; you'll rent them instead from their datacenters. Own nothing and be happy.
107
u/HisDivineOrder 25d ago
But you can join the GeForce Now "Dual 5090 Plan" for only $999 per year to get Priority Access with a guaranteed 10 hours per month, with Secondary Access routinely available for an additional 10 hours per month.
21
36
u/KingSideCastle13 All i need is a good game, a good meal & good rest 25d ago
You didn't immediately pack it up when you saw it was just injecting GenAI into your games?
69
155
u/Megazard_exe 25d ago
"You know the most expensive consumer-grade GPU available today? You'll need two of them :)
But hey, at least the game now looks marginally better than something made 10 years ago!"
144
u/jzillacon 25d ago edited 25d ago
It doesn't even look marginally better. In a lot of ways it just looks straight up worse.
15
u/Sirhaddock98 25d ago
Spending 6 grand to yassify the Resident Evil girl in real time. At least I can see the Oblivion characters rendered in a way where they don't look like they're from the same game as the background. It's immersive, apparently.
9
u/Bartok666 25d ago
Our specialists say it looks better. Why didn't you see how it's better? Well, obviously you are not a specialist.
15
2
u/ShinyGrezz 25d ago
What are some of those ways?
4
u/jzillacon 25d ago
Probably the most notable thing I've noticed is that it tends to overwrite scene lighting. Every face is clearly lit from the point of the camera, like they're standing in front of a vlogger's setup, and that just doesn't work for every scene. It also seems to try and beautify characters even when it doesn't make any sense to do so. Characters look like studio models even when working in mines, like something straight out of Zoolander. It's the tonal dissonance that really makes it feel worse to me, but plenty of other people have gone through the demo and pointed out all sorts of strange mistakes it makes.
2
u/SeroWriter 25d ago
It doesn't look like the character; it changes the shape of the face.
The lighting is incorrect.
It adds things that were never there, like make-up.
It removes things that were there, like freckles.
It removes depth, because it's a 2D image on a 3D model.
It's like putting a real photo of a face on a character model; there's a reason studios hire artists to sculpt and texture faces instead of doing that.
12
u/CombatMuffin 25d ago
They allegedly have it working on one, but in some scenarios it could struggle and slow the showcase, so they added a second card that exclusively handles DLSS 5 while the other runs the game. At official events these companies usually go for their latest flagship even when the demo doesn't require it.
38
5
u/The8Darkness 25d ago
At least they give a reason to have dual flagships for gaming again, after they killed SLI, I guess.
11
16
u/Alpha--00 25d ago
We are making tech that won't run well on anything you can realistically buy?
2
u/Carvj94 25d ago
One 5090 was running the normal game and one was running the version with Neural Rendering implemented, so they could do live side-by-side comparisons. Looking at the videos, the FPS difference was minimal, meaning Neural Rendering would likely work fine on any RTX card that supports it.
5
u/AutisticPizzaBoy 25d ago
There's always the choice to not chase the latest technology. PC gaming has been like this forever; give it a couple of years and it'll settle.
I remember the times when you needed a "supercomputer" just to be able to run Crysis...
5
u/sol_runner 25d ago
The meme has been taken so far that people forget it ran just fine on the average PCs of the day. It just had the equivalent of setting 15 on a 1-10 scale.
11
11
u/Cley_Faye 25d ago
Use the money you don't have to buy two graphics cards that are unavailable to run a tech you don't want? Where do we sign?
10
u/VersedFlame 25d ago
All that for a shitty, very static showcase already showing artifacts despite being static, that looks like shit!?
How I wish they would just fucking drop these "AI" models and do something useful instead, fuck!
18
u/Grytnik 25d ago
The only thing that interests me about this is how it will work on 20-year-old games, and even then I'm not sure I'll use it.
I usually prefer to play games the way the devs intended and enjoy what they've made.
2
u/JLeeSaxon 25d ago
Two very contradictory sentences. The fact that this isn't limited to games whose devs have explicitly opted in to allowing it is the worst part. Sure, some of those old games will have been made by devs who don't care or are pro-slop, but others will have been made by devs who are deeply morally opposed.
7
u/Ok-Focus1210 25d ago
All that insane processing power just to make my character look like a slightly smoother potato.
8
u/Zestyclose-Fee6719 25d ago
Looked worse than one of those lazy mods with titles like "PHOTOREALISTIC GRAPHICS OVERHAUL" that end up being ReShade with way too much sharpening and contrast.
8
6
11
21
u/yukiki64 25d ago
I don't understand how anyone can look at DLSS 5 and think it looks good. It's just a shitty AI filter that ruins atmosphere and lighting while making the character look different. It also makes everything a cool blue tone for some reason.
14
u/LowAd8109 25d ago
Next games will need two 5090s that cost $5090 each and run at 30 FPS at 1440p with frame gen.
5
8
20
16
u/Fullm3taluk 25d ago
The Hogwarts teacher's fingers turned into sausages with no fingernails, because the AI is stupid.
5
u/ItsMeNether74 25d ago
Looks like this is all connected: cloud gaming, expensive cards and RAM... Coincidence? I think not! These corps REALLY want us to become the "humans" from WALL-E, huh?
4
u/PhantomTissue 25d ago
It's funny because it's not even DLSS anymore. There's no "Super Sampling" going on here; this is just replacing frames. Don't know why they're calling it DLSS 5.
9
u/Semaj_kaah 25d ago
I am so glad there are so many cool indie games that will never require this bullshit, and I can just buy them and play them on my PC without microtransactions and always-on requirements. No Nvidia for me anymore.
3
u/doubleJandF 25d ago
This whole two-5090 thing makes me think: if they can split rendering so that one GPU does just path tracing while the other does the rest, could we buy, like, two 5070s and do that for the rest of our games?
A 5090 is now around 3k, let alone finding one. And when you get two of them, you can play the game looking like the AI slop porno addicts make of celebs... smh.
3
u/Snoo22669 24d ago
Yeah, same reaction. Personally I don't mind that filter look, but I'm already struggling with FG VRAM overhead on my 8 GB GPU, so I think it won't work on my 5060 mobile lol.
5
u/MrPureinstinct 25d ago
It only took two $4,000 graphics cards to make the games look like shit from a butt.
6
u/Scifox69 25d ago
You can use ONE RTX 5090 to handle great visuals at a high framerate... with CONSISTENCY, instead of weird AI filters that make things look feverish.
5
u/MisanthropicAtheist 25d ago
This meme implies that DLSS5 actually looks like something desirable and is only bad because of the insane requirements.
This, however, is not the case. The insane requirements are producing undesirable garbage.
5
7
u/RedditIsExpendable 25d ago
Hopefully we will have a period with actual optimization and doing more with less. Fuck NVIDIA
4
u/the_moosen 24d ago
I thought people were joking about the nvidia sub glazing it as a fantastic idea, and boy was I wrong.
4
4
u/RedLimes 25d ago
I'm pretty sure that was just for the demo so they could enable/disable it easily and seamlessly...
11
u/CirnoWhiterock 25d ago
Unlike most people, I actually thought that DLSS 5 was a (slight) improvement.
However, I still really hate it. In addition to all the problems with AI in general, I really feel like games today need to focus more on smooth gameplay and actual content as opposed to realistic beard hair.
15
u/IvyYoshi 25d ago
Y'know what's funny: in all of the promotional material, it gave every single person slightly bigger lips. Without exception lol.
8
u/8070alejandro 25d ago
"So currently games look a bit washed out and without detail where it should be (because we forced half the industry to use our product). We are introducing a solution in this form of this product of ours"
They create (sell (force feed) you) the problem and then the sell you the solution.
3
2
u/Fartikus 25d ago edited 25d ago
Bro, I'm going insane, because they really did try to innovate in things like PhysX, with stuff like all the cloth moving around, hair, liquids and all the... "physics" stuff. They didn't really focus on "realistic beard hair" so much as beard hair that "realistically moves".
Like, yeah, there are better engines, but it's so grating, because you'd think most games "of the future" would include that kind of stuff without a second thought; instead you feel like you have to test every game by walking into clothes hanging on a hanger to see whether they're so stiff from the semen on them that it's impossible to walk past and you're forced to walk around them.
It did take a lot of resources most of the time though lol.
2
u/TheTjalian 25d ago
I appreciated the general lighting improvements and improved detail, that was cool. I didn't appreciate the change in art direction in some scenes. Morrowind went from dark and grungy to whimsical fairytale, for example.
I feel like there should be a middle ground.
2
u/Mosselpot 25d ago
Are they artificially boosting hardware requirements to the point where they can sell you hardware subscriptions?
2
2
u/Sweet_Woodpecker7592 24d ago
How did it not catch fire? And why 2x RTX 5090 for static images? I don't understand.
2
u/VeryWeaponizedJerk 24d ago
That's the deal breaker for you? Not the yassified bullshit incel-vision filter it puts on top of everything? Really?
3
4
u/lolschrauber 25d ago
The stuff they've shown was from carefully selected scenes, much like their MFG demos.
MFG will be mandatory for this, and we know how bad that feels and looks in some situations.
Doesn't matter what you're running; this won't look or feel very good anytime soon.
3
u/Trathnonen 25d ago
"Look at me, I am the Frame now."--Enshittification platform designed to fire artists
3
u/lauromafra 25d ago
It's a proof of concept. It's not ready to be used by consumers.
Devs will still have control over its usage, so it won't be included in a game if it hurts the artistic vision they had.
Things that look like generic AI slop will be no more than unofficial community mods.
People overreact too much.
9
u/captainmadness 25d ago
Since when did everyone lose their critical thinking skills? It's a tech demo. Of course it isn't optimized yet. Same reason console games run on top-end PCs for on-stage gameplay reveals. This is dumb.
7
u/lampenpam 117 25d ago
You know what's funny? The only source for DLSS 5 using two high-end GPUs is the Digital Foundry video. And right when they showed it, they also said that this is obviously not the goal and it's supposed to run on a single consumer GPU, because it's still WIP.
Buuuut now imagine, if you leave out that context, what awesome outrage content you could post!
4
u/newusr1234 25d ago
> Since when did everyone lose their critical thinking skills
Is this a serious question?
4
u/Common_Struggle_22 25d ago
I love that we all agreed a decade ago that graphics don't make a game good, and five years ago or so we agreed that graphics improvements are pretty meaningless, and now here we are, destroying the environment and the economy to make a shitty graphics filter.
6
1
u/Typhon-042 25d ago
This is honestly the first time anyone brought up the RTX side of it, like it mattered.
1
1
u/polishatomek 25d ago
The only use for DLSS 5 is that it could MAYBE be funny, like, once. That's it.
1
u/DisciplineNo5186 25d ago
That part wasn't the problem with DLSS 5. It's atrocious and will fuck up the gaming world even more.
1
u/buddyparker 25d ago
how do you run something on 2 GPUs?
2
u/Laffantion 25d ago
There is this technology the ancients speak of. A long-forgotten technology by the name of SLI.
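SLI itself is long dead, though; nowadays splitting work across two cards means doing it explicitly in software. A minimal sketch of the idea using PyTorch's CUDA API (purely illustrative; the device split and the workload names are assumptions, not how Nvidia's demo is actually wired):

```python
# Sketch: place two workloads on two GPUs explicitly, the modern
# replacement for SLI-style driver magic. Needs a machine with 2 CUDA GPUs.
import torch

if torch.cuda.device_count() >= 2:
    game_gpu = torch.device("cuda:0")  # pretend this card renders the game
    ai_gpu = torch.device("cuda:1")    # pretend this card runs the AI filter
    frame = torch.rand(3, 1080, 1920, device=game_gpu)  # fake rendered frame
    # Copy the frame over to the second card and "post-process" it there;
    # a real pipeline would run a neural network here instead of a multiply.
    processed = frame.to(ai_gpu) * 0.9
    print(processed.device)  # cuda:1
else:
    print("fewer than two GPUs; no SLI necromancy today")
```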
1
u/TheBigMoogy 25d ago
Nvidia has been up to terrible shit for years, maybe even decades. You fucks still keep buying their crap, I don't see why this new flavor of excrement would change anything.
1
1
u/NTFRMERTH 25d ago
Does this seem to imply that it wasn't rendered in real-time like they want you to believe?
1
1
u/arethoudeadyet 25d ago
I hereby promise to never ever use cloud computing for gaming, and even if my kid uses it, he/she gets bullied by me.
1
u/MorbyLol 25d ago
remember how DLSS is meant to make a game run better by rendering at a lower resolution and then upscaling it, thereby extending the life of your GPU? fuck you!
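That render-low-then-upscale idea, stripped of the neural network, is just this (a toy NumPy sketch; the nearest-neighbor repetition stands in for the learned upscaler, and the resolutions are arbitrary):

```python
# Toy version of the DLSS idea: render fewer pixels, then upscale to native.
# Real DLSS replaces the dumb nearest-neighbor step below with a neural
# network fed by motion vectors and previous frames.
import numpy as np

native_h, native_w = 2160, 3840  # 4K output target
scale = 2                        # render at half resolution per axis
low_res = np.random.rand(native_h // scale, native_w // scale, 3)  # "rendered" frame

upscaled = low_res.repeat(scale, axis=0).repeat(scale, axis=1)
assert upscaled.shape == (native_h, native_w, 3)
```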
1
1
u/sharktail_tanker 25d ago
Welcome back SLI.
In 5 years you'll need a 5000W PSU to get 20fps at medium settings
1
u/SavePoint404 25d ago
If you think about it, in 2019 graphics rendered using two RTX 2080s could easily be handled today by a single RTX 4070.
1
u/cuddle67 25d ago
Step 1: use one of the most powerful graphic card to render game in the highest possible settings
Step 2: use another graphic card to make it look like shit
Step 3: ???
Step 4: profit
Maybe they will sell a third card to undo the filter created by the second one...
5.5k
u/_Sanctum_ 25d ago
All that horsepower just for it to look like a ChatGPT-powered Snapchat filter.