- cross-posted to:
- gpu@programming.dev
“Jensen sir, 50 series is too hot”
“Easy fix with my massive unparalleled intellect. Just turn off the sensor”
If you needed any more proof that Nvidia is continuing to enshittify their monopoly and milk consumers: hey, let's remove one of the critical things that lets you diagnose a bad card and catch bad situations that might result in GPU death! Don't need that shit, just buy new ones every 2 years, you poors!
If you buy a Nvidia GPU, you are part of the problem here.
Yeah, NVIDIA is a bullshit company and has been for a while. AMD and Intel need to get their ray tracing game up so they become real competitors for NVIDIA, especially now that there are more games that require ray tracing.
No games “require” ray tracing, and the ones that support it may look better when you turn it on, but it’s not worth the performance cost.
This is incorrect. The new Indiana Jones game requires ray tracing, as does the upcoming Doom game. Whether you like it or not, traditional rasterized graphics are starting to be phased out, at least across the AAA gaming space. The theoretical workload benefits for developers make it pretty much an inevitability once workflows and optimizations are figured out. Though I doubt rasterized graphics will ever completely go away, much like how pixel art games are still very much a thing decades after becoming obsolete.
Sorry for not knowing that a game that released 1 month ago makes ray tracing mandatory.
Considering that most gamers don’t even have ray tracing capable hardware (at least with decent performance), this seems like a pretty shitty and lazy decision.
My point still stands though. In most games it’s not that much better and in the games where it is better, it has a very exaggerated performance impact. Personally, I prefer playing at 100+ fps.
To be clear: the only required RT Indiana Jones uses is ray traced global illumination, which needs hardware ray tracing support to work. But as long as you have a 20 series card or later, you should be able to get playable performance if you manage your settings correctly. It only becomes super heavy when you enable RT reflections, RT sun shadows, or full path tracing. The latter is VERY expensive and is what I’d assume most people picture when they think of ray tracing.

It does look really, really good though, and personally I’d rather play that game at 60 fps (or lower, let’s be real) with full path tracing than with just the RTGI at a much higher fps. I’d at least recommend turning on the RT sun shadows if you can, because shadows without them are very shimmery and aliased, especially on foliage.

In games like Indiana Jones that have been designed from the ground up with ray tracing in mind, it makes a gigantic difference in how grounded the world feels. The level of detail they baked into every asset is insane, and path tracing elevates the whole experience a huge amount compared to the default RTGI, because every nook and cranny on every object casts accurate shadows and bounce lighting onto itself and the environment.
I assume Doom is going to be the same way.
Just chiming in that I played Indiana Jones with 0 problems and great performance on my 6800 XT. And that was without any FSR, which I’m not sure is even available yet.
Not true anymore.
The Indiana Jones game that just came out does require ray tracing. There are a few others coming out that do as well.
Which other currently released games require it? I haven’t found any.
Hopefully this doesn’t become a trend because it’d be pretty dumb and anti-consumer.
As far as I know Indiana is the only currently released game in this category.
Let’s hope it stays that way.
So anyone with an older or non-nvidia card can just get fucked?
This sounds like an Nvidia monopolistic backroom deal or something. Bet you’ll see an Nvidia splash screen when those games start up.
AMD cards can do ray tracing too. Not nearly as well, but they can.