NVidia GeForce RTX 30x0 cards

mkopec

<Gold Donor>
25,424
37,545
IDK, I have a cool $500-$600 stowed away for this gen, and who is going to win my cash probably depends on availability first. (If the AMD specs and benches hold true)
 

Quineloe

Ahn'Qiraj Raider
6,978
4,463
A discrete GPU or the new onboard on the 11 series?

A discrete GPU, not the on-CPU chip. It's called Iris Xe Max, aka DG1.

[attached: benchmark comparison screenshots of the Iris Xe Max / DG1]

that comparison looks like a cripple fight.
 

jooka

marco esquandolas
<Bronze Donator>
14,438
6,160
I'll believe AMD can pump out 4K ultra at a reasonable frame rate when real benchmarks come around, not numbers from a game developer being paid by AMD. I'm all for it, but reading propaganda doesn't make shit true
 

Uriel

Blackwing Lair Raider
1,654
2,052
So if my monitor supports variable refresh rate for both G-Sync and FreeSync (Asus XG279Q), there's no reason not to get an AMD card, right?
 

Break

Silver Baronet of the Realm
4,290
11,870
Has anyone tried Minecraft RTX with their 30x0? I have this weird issue where my FPS is great until I start flying, but as soon as I fly with an elytra the FPS drops to around 5 until I land. Even if I'm just gliding and barely moving at all, FPS drops below 5 and immediately shoots back up as soon as I land. It does this with ray tracing enabled or not, and the standard Win 10 Bedrock client doesn't have this problem. Found some Google results of one person having this issue like a year ago, but no resolution it seems.
 

Quineloe

Ahn'Qiraj Raider
6,978
4,463
It's like the early 2000s, when the latest cards just barely met the requirements for the games released at the same time.
 

Neranja

<Bronze Donator>
2,605
4,143
So a 3090 with less memory.
It's a 3090 with less memory and less memory bandwidth. The cores have to be fed work, and for that you need input from memory. AMD may have a winning strategy with the Infinity Cache here: by reducing bandwidth requirements on the main GPU memory, they can get away with cheaper GDDR6 instead of GDDR6X.
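
Quick back-of-napkin sketch of why that can work (my numbers, not an official model; the cache bandwidth and 4K hit rate below are just AMD's marketing figures taken at face value):

Code:
# Back-of-envelope model of "effective" bandwidth with a big on-die cache.
# Bus widths / data rates are the commonly quoted specs; the Infinity Cache
# bandwidth and 4K hit rate are AMD's own marketing numbers (assumptions here),
# so treat the output as illustrative, not a benchmark.

def dram_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    # Peak DRAM bandwidth in GB/s: bus width (bits) * data rate (Gbps) / 8 bits per byte
    return bus_width_bits * gbps_per_pin / 8

def effective_bandwidth_gbs(dram_gbs, cache_gbs, hit_rate):
    # Requests that hit the cache see cache bandwidth, the rest go out to DRAM
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

gddr6x_3080 = dram_bandwidth_gbs(320, 19.0)   # RTX 3080: 320-bit GDDR6X @ 19 Gbps -> 760 GB/s
gddr6_6800xt = dram_bandwidth_gbs(256, 16.0)  # RX 6800 XT: 256-bit GDDR6 @ 16 Gbps -> 512 GB/s

infinity_cache_gbs = 1664.0   # AMD's quoted Infinity Cache bandwidth (assumption)
hit_rate_4k = 0.58            # AMD's claimed average hit rate at 4K (assumption)

print(f"GDDR6X (3080):          {gddr6x_3080:.0f} GB/s")
print(f"GDDR6 (6800 XT):        {gddr6_6800xt:.0f} GB/s")
print(f"GDDR6 + Infinity Cache: "
      f"{effective_bandwidth_gbs(gddr6_6800xt, infinity_cache_gbs, hit_rate_4k):.0f} GB/s effective")

Even with a modest hit rate, cheap GDDR6 plus a big cache lands above what the 3080's GDDR6X bus alone delivers, at least on paper.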

I think Nvidia bet on AMD not being competitive with "only" GDDR6 and bought out most of the GDDR6X supply. Now Nvidia has a lot of GDDR6X memory chips lying around.