NVidia GeForce RTX 50x0 cards - 70% performance increase, but AI > you


Mist

REEEEeyore
<Gold Donor>
31,285
23,539
That's a disingenuous point. There are clearly series of games that have reduced in quality as they've acquired DEI bs. Anything Ubisoft, Bethesda, etc. I mean, it's really not arguable. Claiming it's the same with Asian games is silly, as that's a much more recent phenomenon which correlates with their late adoption of DEI. Like Concord. You know, the biggest flop in gaming history.

REALLY DISINGENUOUS point my friend. Almost intentional gaslighting.
You're probably one of those people that thinks programmers do art and story content.

We're talking about very different things here.

Utnayan is trying to claim that games running like dogshit is a "DEI" thing, when A) the phenomenon is not new and long pre-dates "DEI", B) Asian games run like dogshit and don't have "DEI", and C) Bethesda games like the Elder Scrolls series have especially run like dogshit forever, for as long as I've owned a computer. Ubisoft and EA games run like dogshit because they're built by 300+ person teams doing agile scrum nonsense.

Concord, for instance, ran very well on PC. Excellent port, along with all of the other major recent Sony ports, from a technical standpoint. You're conflating things that have nothing to do with each other.
 
  • 1Moron
Reactions: 1 user

DickTrickle

Definitely NOT Furor Planedefiler
13,438
15,611
Nowadays most programmers use so much abstraction they know a lot less about how to extract performance from their code. There's a lot more framework and tool usage that is designed for general purposes such that it's not going to be as effective for specific uses. I think this is true for basically every industry but certainly so for gaming.
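To make that concrete with a toy sketch (made-up code, not from any particular engine or framework): the general-purpose path typically hands you an array of fat objects, while a layout specialized for the hot loop streams only the field it actually reads.

Code:
#include <cstdio>
#include <vector>

// What a general-purpose framework typically gives you: one fat entity
// struct, iterated as an array-of-structs (AoS).
struct Entity {
    float x, y, z;        // position
    float vx, vy, vz;     // velocity
    int   health, flags;  // cold data that rides along through the cache
};

float sumHeightsAoS(const std::vector<Entity>& es) {
    float sum = 0.0f;
    for (const Entity& e : es)
        sum += e.y;        // reads 4 bytes but drags the whole 32-byte struct in
    return sum;
}

// What a use-specific layout looks like: struct-of-arrays (SoA), so the
// hot loop only touches the data it actually needs.
struct Entities {
    std::vector<float> x, y, z;
};

float sumHeightsSoA(const Entities& es) {
    float sum = 0.0f;
    for (float y : es.y)   // contiguous floats: cache- and SIMD-friendly
        sum += y;
    return sum;
}

int main() {
    std::vector<Entity> aos(1000, Entity{0, 2.0f, 0, 0, 0, 0, 100, 0});
    Entities soa{std::vector<float>(1000), std::vector<float>(1000, 2.0f),
                 std::vector<float>(1000)};
    std::printf("%f %f\n", sumHeightsAoS(aos), sumHeightsSoA(soa));
}

Both loops compute the same thing; the second one is just laid out for this specific access pattern instead of for generality.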
 
  • 3Like
Reactions: 2 users

rectifryer

Molten Core Raider
242
573
You're probably one of those people that thinks programmers do art and story content.

We're talking about very different things here.
AND a typical goalpost shift.

You're talking about very different things because your premise is detached from reality. You'll keep raising your pedantic assertions until you find a technicality of language, then declare victory while ignoring the original principles of the argument. This leads me to believe you are a useful idiot.
 
  • 4Like
  • 1Truth!
Reactions: 4 users

rectifryer

Molten Core Raider
242
573
Nowadays most programmers use so much abstraction they know a lot less about how to extract performance from their code. There's a lot more framework and tool usage that is designed for general purposes such that it's not going to be as effective for specific uses. I think this is true for basically every industry but certainly so for gaming.
Yes, exactly. It's harder to dive in under the hood of Unreal with code than it is with Unity. You can blueprint easily in Unreal, however.
 
  • 4Like
Reactions: 3 users

Tuco

I got Tuco'd!
<Gold Donor>
47,680
81,764
You know, I opened this thread thinking, "I hope the same group of clowns is having the same tired arguments about wokeism and DEI in yet another thread." Thanks for not disappointing.

I installed my RTX 4080 Super last night. It's so absurdly large. It's almost the same surface area as my micro ATX motherboard (453 cm² vs 595 cm²). My CPU fan is also absurdly large, but so far it's quiet.

With the leaked power draw numbers for the RTX 5000 series, you know this shit isn't getting smaller.
 
  • 2Mother of God
Reactions: 1 user

Tuco

I got Tuco'd!
<Gold Donor>
47,680
81,764
Nowadays most programmers use so much abstraction they know a lot less about how to extract performance from their code. There's a lot more framework and tool usage that is designed for general purposes such that it's not going to be as effective for specific uses. I think this is true for basically every industry but certainly so for gaming.
Unreal Engine provides expansive profiling tools, but even with all that power it's still common practice in UE to just measure the time before and after a function is called.
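Stripped of the UE-specific stat macros and the Unreal Insights tooling, that "measure before/after" practice is basically just this (plain std::chrono so it stands alone; UpdateAI is a made-up stand-in for whatever call you're suspicious of):

Code:
#include <chrono>
#include <cstdio>

// Stand-in for whatever expensive call you're suspicious of.
void UpdateAI() { /* ... */ }

int main() {
    using clock = std::chrono::steady_clock;

    const auto before = clock::now();
    UpdateAI();
    const auto after = clock::now();

    const double ms =
        std::chrono::duration<double, std::milli>(after - before).count();
    std::printf("UpdateAI took %.3f ms\n", ms);
}

UE's scoped stat macros amount to the same idea; they just feed the result into the stats system for you.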

 
  • 1Like
Reactions: 1 user

Noodleface

A Mod Real Quick
38,292
15,136
You know, I opened this thread thinking, "I hope the same group of clowns is having the same tired arguments about wokeism and DEI in yet another thread." Thanks for not disappointing.

I installed my RTX 4080 Super last night. It's so absurdly large. It's almost the same surface area as my micro ATX motherboard (453 cm² vs 595 cm²). My CPU fan is also absurdly large, but so far it's quiet.

With the leaked power draw numbers for the RTX 5000 series, you know this shit isn't getting smaller.
My guess is at some point they're going to ditch the current GPU design and you end up with two towers. One is your GPU tower and one is your CPU tower
 
  • 1Worf
Reactions: 1 user

Khane

Got something right about marriage
20,420
14,094
My guess is at some point they're going to ditch the current GPU design and you end up with two towers. One is your GPU tower and one is your CPU tower

Separate, but equal, towers?

 
  • 3Worf
Reactions: 2 users

Furry

🌭🍔🇺🇦✌️SLAVA UKRAINI!✌️🇺🇦🍔🌭
<Gold Donor>
22,065
28,901
Isn't the 5090 supposedly smaller than the 4090? At least I think I've heard that in the rumor mill.
 

Tuco

I got Tuco'd!
<Gold Donor>
47,680
81,764
My guess is at some point they're going to ditch the current GPU design and you end up with two towers. One is your GPU tower and one is your CPU tower
I know this is a thing for laptops: https://www.amazon.com/Razer-Core-Aluminum-External-Enclosure/dp/B08J5J8C1H

I think it definitely makes sense for laptops, because a dedicated GPU is a huge sacrifice to portability. I can't see ever wanting a separate tower for a GPU on my desktop PC, though. And it's not like CPUs are getting smaller either. The Ryzen 7950X3D I got is huge and so is the fan for it.
 

Mist

REEEEeyore
<Gold Donor>
31,285
23,539
My guess is at some point they're going to ditch the current GPU design and you end up with two towers. One is your GPU tower and one is your CPU tower
Strong doubt.

I'd bet it moves in the opposite direction, with GPU and CPU using a shared pool of very fast memory.

I actually think the days of discrete cards and DIY PCs are numbered.

Longer term, I think monitors will have GPUs in them, and some kind of DisplayPort SLI-style protocol that allows workload distribution between an internal GPU and an external one built into the display. So maybe the internal GPU handles the basic rendering, and the display GPU handles all the AI interpolation and image enhancements like FSR/DLSS.
 
  • 1Imbecile
Reactions: 1 user

Furry

🌭🍔🇺🇦✌️SLAVA UKRAINI!✌️🇺🇦🍔🌭
<Gold Donor>
22,065
28,901
Strong doubt.

I'd bet it moves in the opposite direction, with GPU and CPU using a shared pool of very fast memory.

I actually think the days of discrete cards and DIY PCs are numbered.
I'll move to China before I buy a prebuilt PC.
 
  • 2Like
  • 2Worf
Reactions: 3 users

Tuco

I got Tuco'd!
<Gold Donor>
47,680
81,764
Strong doubt.

I'd bet it moves in the opposite direction, with GPU and CPU using a shared pool of very fast memory.

I actually think the days of discrete cards and DIY PCs are numbered.

Longer term, I think monitors will have GPUs in them, and some kind of DisplayPort SLI-style protocol that allows workload distribution between an internal GPU and an external one built into the display. So maybe the internal GPU handles the basic rendering, and the display GPU handles all the AI interpolation and image enhancements like FSR/DLSS.
I wonder how much memory bandwidth is actually consumed on my GPU. My RTX 4080 has 716GB/s, which just seems like an absurdly high number, and the RTX 50x0 with GDDR7 has twice that at 1.5TB/s. Conversely, my DDR5 system memory has a bandwidth of like 90GB/s. I'm guessing bandwidth and response times are the bottleneck preventing GPUs from using system-accessible memory, but how far is that gap?
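Back-of-the-envelope, using the usual published figures (peak bandwidth is just bus width x data rate, so treat these as rough numbers):

Code:
#include <cstdio>

// Peak bandwidth in GB/s = (bus width in bytes) * (data rate in GT/s).
// Figures below are the commonly published specs; treat them as rough.
constexpr double bandwidthGBs(double busBits, double gtps) {
    return busBits / 8.0 * gtps;
}

int main() {
    // RTX 4080: 256-bit GDDR6X at 22.4 Gbps per pin.
    std::printf("GDDR6X (RTX 4080):  %6.1f GB/s\n", bandwidthGBs(256, 22.4));

    // Dual-channel DDR5-5600: 2 x 64-bit channels at 5.6 GT/s.
    std::printf("DDR5-5600 dual ch.: %6.1f GB/s\n", bandwidthGBs(128, 5.6));

    // PCIe 4.0 x16 link the GPU would cross to reach system RAM:
    // 16 lanes at ~2 GB/s per lane, per direction.
    std::printf("PCIe 4.0 x16:       %6.1f GB/s\n", 16 * 2.0);
}

That comes out to roughly 717 GB/s for the card's GDDR6X, ~90 GB/s for dual-channel DDR5-5600, and ~32 GB/s for the PCIe 4.0 x16 link the GPU would have to cross to reach system RAM. So it's not just the DIMMs; the link itself is an order of magnitude short of local GDDR, before latency even enters the picture.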
 

Kirun

Buzzfeed Editor
19,309
15,677
I'll move to China before I buy a prebuilt PC.
I think Mist is correct. Slowly but surely PCs will just be "cloud gaming" and you'll "rent" your PC from now on.

You'll own nothing and you'll be happy, peasant.
 
  • 3Like
Reactions: 2 users

Noodleface

A Mod Real Quick
38,292
15,136
I wonder how much memory bandwidth is actually consumed on my GPU. My RTX 4080 has 716GB/s, which just seems like an absurdly high number, and the RTX 50x0 with GDDR7 has twice that at 1.5TB/s. Conversely, my DDR5 system memory has a bandwidth of like 90GB/s. I'm guessing bandwidth and response times are the bottleneck preventing GPUs from using system-accessible memory, but how far is that gap?
There's nothing stopping a GPU from using system memory; it just doesn't really make sense when its own GDDR is so much faster.
 

Tuco

I got Tuco'd!
<Gold Donor>
47,680
81,764
There's nothing stopping a GPU from using system memory; it just doesn't really make sense when its own GDDR is so much faster.
Right. But if Mist's prediction is that future GPUs will use system memory, how big is the bandwidth / latency gap and how will they close it?
 

Noodleface

A Mod Real Quick
38,292
15,136
Well, Nvidia is making its own CPUs now, so if anyone is going to close it, it will be Nvidia.

My guess is there will be dedicated GDDRx
 

Mist

REEEEeyore
<Gold Donor>
31,285
23,539
I think Mist is correct. Slowly but surely PCs will just be "cloud gaming" and you'll "rent" your PC from now on.

You'll own nothing and you'll be happy, peasant.
Cloud gaming has failed miserably.

Handhelds have taken off spectacularly.

I would expect to see future handhelds with monolithic chips similar to the M3 Ultra but from Nvidia or Qualcomm.

That said, I still think you could do stuff like DLSS/FSR on a separate chip closer to the display. Integrate those functions into the scaler instead of doing them on the GPU.
 
  • 2Like
  • 1Moron
Reactions: 2 users