NVidia GeForce RTX 30x0 cards


Quineloe

Ahn'Qiraj Raider
6,978
4,464
Going from an i7 to the R7 1700 a few years back when I built a new rig saved me a lot of money, and the performance today is still perfectly fine: no problems with games (which is arguably to be expected anyway), and other CPU-heavy tasks such as video encoding also run very quickly. It's really hard to get good benchmarks for that, though.
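If you want a rough number yourself, timing a fixed encode is the usual quick-and-dirty check. A minimal sketch, assuming ffmpeg built with libx265 is on your PATH and input.mkv is whatever test clip you have lying around:

```python
import subprocess
import time

# Time a fixed x265 encode as a crude CPU benchmark.
# Assumes ffmpeg (built with libx265) is on PATH and input.mkv exists.
start = time.perf_counter()
subprocess.run(
    [
        "ffmpeg", "-y",
        "-i", "input.mkv",      # any test clip
        "-c:v", "libx265",      # CPU-heavy encoder
        "-preset", "medium",
        "-t", "60",             # only encode the first 60 seconds
        "out.mkv",
    ],
    check=True,
)
print(f"Encode took {time.perf_counter() - start:.1f} s")
```

Same clip, same preset, same settings on both machines, or the comparison is meaningless.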
 

Neranja

<Bronze Donator>
2,624
4,192
So outside of price, why would anybody go for an AMD CPU right now?
Yes, at the cost of power usage, which you have to cool somehow.

Also, you don't buy a CPU right now. Zen 3 is coming out October 8th, and if they can really get a 10-15% (some say 20%) improvement per clock cycle together with improved single-core frequency boosts, then Intel is royally fucked, because Ryzen will take both the single-core and multi-core performance crowns.
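Back-of-envelope on what that would mean for single core, with both the IPC gain and the boost clock as pure guesses:

```python
# Rough single-core uplift if the Zen 3 rumours pan out (all numbers are guesses).
ipc_gain = 0.15      # rumoured 10-20% IPC improvement, take the middle
zen2_boost = 4.7     # GHz, roughly a 3900XT single-core boost
zen3_boost = 4.9     # GHz, assumed modest frequency bump

uplift = (1 + ipc_gain) * zen3_boost / zen2_boost - 1
print(f"Combined single-core uplift: {uplift:.1%}")   # ~19.9%
```

Something in that range would be enough to take the single-core crown, which is the only one Intel still holds.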
 
  • 1Like
Reactions: 1 user

dizzie

Triggered Happy
2,509
3,939
Supposedly Intel is concentrating on the enterprise market for their GPU offerings and not really vying for a place in the consumer segment at the moment.
 

nu_11

Avatar of War Slayer
3,310
21,974
There are more RDNA2 rumours floating around. There's a video on YouTube, but the guy's a dick and has bad teeth, so here's a summary.

*60% performance-per-watt uplift
*No HBM2
*Not using a 512-bit bus; lower bus width
*128MB of Infinity Cache on the GPU, which helps make up for the lower memory bandwidth of GDDR6 (rough math after the list)
*Clock frequency similar to/around the PS5
*80 CUs for the top SKU
*6700, 6800, 6900 SKUs
*6700 will compete with the 3070
*6800 will compete against the 3080
*6800XT will compete against the 3080 Ti
*6900 will compete with the 3090; it will be faster than a 3080 Ti, but not sure if it will beat a 3090
*No word on whether they will undercut Nvidia's pricing
*Hybrid ray tracing (AMD patent)
*Up-sampling handled via lower-precision operations
*Decompression: unknown at this time
*This will not be another Vega 64
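For the Infinity Cache point above, the usual napkin model is that effective bandwidth is a blend of cache and VRAM bandwidth, weighted by hit rate. Minimal sketch; every number here (bus config, cache bandwidth, hit rate) is an assumption for illustration, not a leaked spec:

```python
# Crude effective-bandwidth model for a large on-die cache in front of GDDR6.
# All numbers are illustrative assumptions, not leaked specs.
gddr6_bw = 512     # GB/s, e.g. a 256-bit bus at 16 Gbps
cache_bw = 1600    # GB/s, assumed on-die cache bandwidth
hit_rate = 0.60    # assumed fraction of traffic served by the 128MB cache

effective_bw = hit_rate * cache_bw + (1 - hit_rate) * gddr6_bw
print(f"Effective bandwidth: ~{effective_bw:.0f} GB/s")   # ~1165 GB/s
```

The whole thing lives or dies on what the hit rate actually is at 4K, and nobody outside AMD knows that yet.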
 
  • 3Like
  • 1Worf
Reactions: 3 users

Fucker

Log Wizard
12,628
28,733
Supposedly Intel is concentrating on the enterprise market for their GPU offerings and not really vying for a place in the consumer segment at the moment.

Another idea headed straight for failure. Intel should stick to what it's good at: making products that cause rolling brownouts and adding a bunch of ++++++++++ after their technology descriptions.
 
  • 1Like
Reactions: 1 user

Neranja

<Bronze Donator>
2,624
4,192
From everything I've seen, Intel CPUs generally run cooler than whatever AMD has to offer.
That is a thing of the past with the TSMC 7nm node. Intel's flagship 14nm++ 10900K is 125W TDP. Compared to the average cheap, entry-level gaming CPU (Ryzen 3600: 65W), that is a lot of heat to dissipate. If you don't have water cooling or steady airflow, your CPU is going to throttle after a while, especially in the summer.

For comparison: the 7700K was 90W, the 8700K was 95W, and only workstation-level CPUs went way above that, with 140W to 160W on the 2066 socket. Intel's top line there now goes up to 255W.
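Rough illustration of why that gap matters for cooling; the thermal resistance figure is an assumed ballpark for a decent tower air cooler, and real package temps will be higher since this ignores die-to-heatspreader resistance and case airflow:

```python
# Back-of-envelope temperature rise over ambient: delta_T = power * thermal_resistance.
# 0.15 C/W is an assumed figure for a good tower air cooler, not a measured spec.
cooler_resistance = 0.15   # degrees C per watt (assumption)
ambient = 28               # degrees C, a warm summer room

for name, watts in [("Ryzen 3600 (65W TDP)", 65),
                    ("10900K (125W TDP)", 125),
                    ("10900K boosting (~250W)", 250)]:
    print(f"{name}: roughly {ambient + watts * cooler_resistance:.0f} C at the heatsink")
```

The point is just the scaling: double the watts, double the temperature rise over ambient, and that's before the chip starts boosting past its rated TDP.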
 
  • 1Like
Reactions: 1 user

Neranja

<Bronze Donator>
2,624
4,192
The i5 10600k has better performance than the 3600 at almost exactly the same stock temperatures
That graph doesn't mean what you think it means. Measuring max temperature is super iffy and depends on a lot of variables: case, cooling solution, thermal paste/pad, ambient temperature, humidity.

Here's another one, and suddenly it doesn't look that good:

[attached image: another 10600K vs. 3600 temperature comparison]
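If you actually want comparable numbers, log them yourself under the same case, cooler, and ambient instead of trusting a single max-temp screenshot. Minimal sketch (Linux only, needs psutil; sensor names vary by board, so treat these as placeholders):

```python
import time
import psutil  # pip install psutil; sensors_temperatures() is Linux-only

# Sample the package temperature once a second while your benchmark runs,
# then compare averages instead of a single "max temp" reading.
samples = []
for _ in range(60):
    temps = psutil.sensors_temperatures()
    # Sensor names differ per platform: "coretemp" on Intel, "k10temp" on Ryzen.
    chip = temps.get("coretemp") or temps.get("k10temp") or []
    if chip:
        samples.append(chip[0].current)
    time.sleep(1)

if samples:
    print(f"avg {sum(samples) / len(samples):.1f} C, max {max(samples):.1f} C")
```

Run the exact same load on both chips and at least the cooler and ambient variables drop out of the comparison.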
 

Quineloe

Ahn'Qiraj Raider
6,978
4,464
Supposedly Intel is concentrating on the enterprise market for their GPU offerings and not really vying for a place in the consumer segment at the moment.
They can at least advertise a 1000% performance increase over the Intel HD 630, no matter how shit their cards will be.
 

Quineloe

Ahn'Qiraj Raider
6,978
4,464

What the fuck is this? "You deserve to know this"... know what, exactly? That Nvidia wants high margins and high profit shares like literally every single corporation out there? 15 minutes of nothing. Oooooh, I have a source. My best source. They want money!
 

Intrinsic

Person of Whiteness
<Gold Donor>
15,019
13,118
What the fuck is this? "You deserve to know this"... know what, exactly? That Nvidia wants high margins and high profit shares like literally every single corporation out there? 15 minutes of nothing. Oooooh, I have a source. My best source. They want money!

Yeah, I liked him for about 3 videos, but now it's getting old. I listened to one of his Broken Silicon podcasts and he comes across as condescending with almost every statement.
 

Hateyou

Not Great, Not Terrible
<Bronze Donator>
16,627
43,258
What the fuck is this? "You deserve to know this"... know what, exactly? That Nvidia wants high margins and high profit shares like literally every single corporation out there? 15 minutes of nothing. Oooooh, I have a source. My best source. They want money!

I took it as: they have a very tiny number of low-cost but well-made Founders Editions, and everything else is going to be either cheaply made or expensive.

I do admit the "you deserve to know" part is dumb. Companies want to make money, obviously. I think the video's point is that this amazing leap came at a much higher price than Nvidia is letting on. They made it seem like the tech was so amazing that it was way cheaper to produce, and it probably isn't if all the third-party cards are going to be super expensive compared to the FE. They created false hope/hype with their reveal.
 
Last edited:

Brahma

Obi-Bro Kenobi-X
12,506
45,528
There are more RDNA2 rumours floating around. There's a video on YouTube, but the guy's a dick and has bad teeth, so here's a summary.

*60% performance-per-watt uplift
*No HBM2
*Not using a 512-bit bus; lower bus width
*128MB of Infinity Cache on the GPU, which helps make up for the lower memory bandwidth of GDDR6
*Clock frequency similar to/around the PS5
*80 CUs for the top SKU
*6700, 6800, 6900 SKUs
*6700 will compete with the 3070
*6800 will compete against the 3080
*6800XT will compete against the 3080 Ti
*6900 will compete with the 3090; it will be faster than a 3080 Ti, but not sure if it will beat a 3090
*No word on whether they will undercut Nvidia's pricing
*Hybrid ray tracing (AMD patent)
*Up-sampling handled via lower-precision operations
*Decompression: unknown at this time
*This will not be another Vega 64

Nah. They would have leaked this info themselves already if it were true. If they have ANYTHING close to a 3080/3090 for less money, people will wait and see. Minus those here who are clearly enthusiasts.
 
  • 1Like
Reactions: 1 user

kegkilla

The Big Mod
<Banned>
11,320
14,739
I haven't followed the whole Intel drama too closely, so I only know they had trouble switching over to smaller process nodes. But most benchmarks (especially gaming related) still show Intel at the top, except for very specific workloads most of us don't give a shit about.
So outside of price, why would anybody go for an AMD CPU right now? The i5 10600k seems to beat the shit out of most Ryzen chips pretty handily and overclocks better, if you care about that.
Most of it comes down to individuals in the build-a-PC nerd culture having a lot of pent-up rage towards Intel for various retarded reasons and (stupidly) envisioning AMD as some sort of scrappy underdog valiantly fighting to bring better value to CPUs. This is nothing new and has been happening for decades; the only thing that's changed is that AMD has been making better chips the last few years. To put it all in perspective, there were morons riding AMD's jock even when they were putting out absolutely abysmal chips that no sane, emotionally level person should ever consider buying, such as Bulldozer.

If gaming is your top priority, Intel is still the way to go.

Cue AMD nuthuggers swearing that Zen 3 will beat Intel's gaming performance.
 
Last edited by a moderator:

Vorph

Silver Baronet of the Realm
11,490
5,240
Half the battle will be driver performance and stability though, and this is where they still have to convince people to buy into their architecture.
Which is absurd, because AMD "driver problems" have been a myth for at least a decade now. I can count on one hand the number of times I've had to roll back a driver update, and GeForce driver rollbacks are certainly no less common, judging by the threads here and on reddit about AAA/bleeding-edge games.
 
  • 1Like
  • 1Dislike
Reactions: 1 users