Desktop Computers


Lurkingmoar

Molten Core Raider
74
75
AMD-Ryzen-5-3.jpg

AMD Ryzen 5 Processors Start At $169 and Launch on April 11th - Legit Reviews

This is all rumor so take it for what it is, but I am far more interested in how this lineup compares to Skylake/Kaby Lake than the R7 lineup.
 

mkopec

<Gold Donor>
26,228
39,931
Yup, and their per-core performance still lags behind Intel's, especially in gaming, which is why most of us here buy these chips. Until they get their shit together on the gaming front, they will still be at the back of the bus.

But if they are basically the same as their big brother Ryzen, this is a game changer for coders and the video and photo geeks, because they are getting an i7 for the price of an i3, or a six-core i7 for the price of an i5. So if you're into decoding/encoding shit, this might be it.
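
Rough illustration of why that matters for the encode/photo crowd: independent files are embarrassingly parallel, so throughput scales close to linearly with worker count until you run out of cores. This is just a sketch with a made-up stand-in workload, not any particular encoder:

Code:
# Toy example: batch "encode" jobs are independent, so more workers = more
# throughput. transcode_one is a stand-in for real per-file work (an x264
# pass, a RAW develop, etc.), not an actual encoder.
import time
from multiprocessing import Pool, cpu_count

def transcode_one(path):
    total = 0
    for i in range(5_000_000):   # burn some CPU to simulate per-file work
        total += i * i
    return path

if __name__ == "__main__":
    files = [f"clip_{i:03d}.mp4" for i in range(32)]   # made-up file names
    for workers in (4, cpu_count()):                   # e.g. quad core vs 6C/12T
        start = time.time()
        with Pool(workers) as pool:
            pool.map(transcode_one, files)
        print(f"{workers} workers: {time.time() - start:.1f}s for {len(files)} files")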
 
Last edited:

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
46,381
98,530
I think the 1600 will be the best of the Ryzen lineup. At its price point it will be competing not with the 7700k but with the 7600 and 6600.

That 1500 though, I dunno what that thing is going to do. It's not a single four-core chip, but two modules with only two cores enabled each. Seeing how it's been demonstrated that when Ryzen has to communicate across modules there's a noticeable performance hit, I'd imagine that will be even more pronounced with the 2+2 layout.
 

Fulorian

Golden Knight of the Realm
104
46
I feel like there's such a disconnect between all the talk about Ryzen and the reality of quality of life with a given piece of hardware. There's this assumption that all that matters, at the end of the day, is the raw FPS number on screen, and that FPS number (on a few existing titles) divided by price is the only metric of consequence.

We've become used to ignoring AMD, and our only scale of relative comparison between Intel generations has become 'Where is the CPU bottleneck FPS?', on the automatic assumption that virtually all threads are fully utilized. Intel has absolutely had teething issues in the past on new architectures (and avoiding this, and a backlash similar to the Ryzen reception, may be why they haven't bothered to introduce any meaningful architectural changes in 5 years).

I've been running on a 1700X system for about a week now, and while my comparison point is my old 4690k instead of the 7700k most reviewers use, the differences between those two aren't really that dramatic (~15% more instructions through a combination of IPC and higher clocks, 30% more total throughput on the Kaby Lake due to HT). But the 7700k is 100%, to the balls maxed out in all the comparison circumstances. When someone sees Ryzen producing fewer frames, the automatic assumption is that it, too, is either maxed out or has a single-threaded bottleneck. It's even worse with these nonsense 300-500 FPS tests, which are only testing the draw call capacity of the CPU, not the actual computational bottlenecks that exist in a real gaming situation.

But that isn't what's happening, in any of those circumstances. On most of the games I've tested, I'm getting one of two results:

- The CPU is the 'bottleneck', but no single core is over 75% and the total use is under 30-40% - the Windows scheduler is moving data across the CCXs and, not knowing how to manage that subsystem properly, is keeping any thread from even maxing out. As a result, my FPS is about the same as on my 4690k, and lower than the 7700k's in reviews - but at very low utilization, power use, fan use, noise, minimal thermals - while the slightly better results are going up against max fan use, 100% of threads maxed out, 80C thermals, etc etc. With a few exceptions, this usually puts me at 120 FPS instead of 140 FPS. Overclocked Kaby Lakes evidently run about as hot as the surface of the sun, by comparison.

- The GPU is the bottleneck, the threads are properly distributed, and I'm getting the same frames as the 4690k/7700k, but again with massively lessened power use, heat, temperatures.

Once there are some optimizations to scheduling, new games that are programmed to use more threads (it wasn't worth anyone's time in the past, when 99% of the population used 4 cores or less), and the memory issues are sorted out, this thing is going to be insane. AMD rushed the launch, no question, but the reception that Ryzen isn't worth looking at because, hurr durr, the frames are lower! - total idiocy. The single thread performance is awesome, the threaded performance is awesome, but the co-ordination of the two needs some work.
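
If anyone wants to sanity-check the cross-CCX scheduling theory at home, the quick test is to pin the game to the logical CPUs of a single CCX and see whether the frame rate moves. A minimal sketch with psutil - the process name is a placeholder, and it assumes logical CPUs 0-7 map to the first CCX on an 8C/16T part, which you should verify against your own topology first:

Code:
# Rough experiment: restrict a running game to one CCX and compare FPS.
# Assumes logical CPUs 0-7 are CCX 0 on an 8C/16T Ryzen; check your layout.
import psutil

GAME_EXE = "game.exe"        # placeholder process name, change to suit
ONE_CCX = list(range(8))     # logical CPUs believed to live on CCX 0

for proc in psutil.process_iter(["name"]):
    try:
        if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
            proc.cpu_affinity(ONE_CCX)   # keep the scheduler on one CCX
            print(f"pinned PID {proc.pid} to logical CPUs {ONE_CCX}")
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass                             # run elevated if nothing gets pinned

If the FPS jumps with the pin in place, the penalty really is cross-CCX traffic rather than raw core speed.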

As an aside, it's unbelievable how efficient this thing is when you clock it down and drop the voltage. I was pulling a 40 watt delta over idle while running stress tests at 3 GHz/0.88 volts, and temperatures of 35C. And it keeps scaling very efficiently up until you get to about 3.8 GHz. That's going to make for a monster server processor.

Now, Vega, I have zero faith in, but you know Intel is feeling the pressure behind the scenes.
 
Last edited:

Intrinsic

Person of Whiteness
<Gold Donor>
15,027
13,124
I do agree with you for the most part. It still isn't a lose-lose scenario, and it's a pretty good time for people to upgrade. A lot of us were ready to move on from the 2500k, which has been a beast for a long time. And at the least we have actual options now.

35F? Wtf cooling are you running?

I got the rest of my case fans in today and put everything back together. Only took my 7700k to 4.9 GHz but am sure I could do 5. Just seems like it is fine where it is. The 960 Evo M.2 drive tests at 3 GB/s tho lol
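
For anyone curious, that 3 GB/s figure is sequential read throughput. A crude way to eyeball it yourself - just a sketch, the path is a placeholder, and OS caching will inflate the number unless the file is big and freshly written:

Code:
# Crude sequential-read check, not a substitute for a real benchmark tool.
# Reads an existing large file in 1 MiB chunks and reports GB/s.
import time

TEST_FILE = r"D:\testfile.bin"   # placeholder path to a multi-GB file
CHUNK = 1024 * 1024

start = time.time()
total = 0
with open(TEST_FILE, "rb", buffering=0) as f:
    while True:
        block = f.read(CHUNK)
        if not block:
            break
        total += len(block)
elapsed = time.time() - start
print(f"{total / elapsed / 1e9:.2f} GB/s over {total / 1e9:.1f} GB")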
 

kegkilla

The Big Mod
<Banned>
11,320
14,739
Fulorian said:
I feel like there's such a disconnect between all the talk about Ryzen and the reality of quality of life with a given piece of hardware. ...
Yet another "Ryzen is going to be awesome once stupid developers learn to harness the awesome power!" We've been hearing this same shit from AMD for the last 10 years.
 

Fulorian

Golden Knight of the Realm
104
46
Intrinsic said:
35F? Wtf cooling are you running?

That's with a Noctua NH-D15, which is massive overkill for my use case. I play in a fairly small room that doesn't have great ventilation, so I've always paid attention to my power draw/use - so I've settled in pretty happily at 3.75 GHz/1.15v, which only gets to 50C running Prime95. I just don't see why I would want to double my heat/power draw for another 5-10% FPS.

I really should have just gotten a 1700 with the Wraith cooler, since I'm not really pushing it very hard (although I ran it up to 4 GHz/1.45v a few times for shits and giggles - and had almost a 250W power delta from idle running Prime95, but still only hit the upper 70s on temperatures).
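
For what it's worth, the textbook dynamic-power rule of thumb (power scales roughly with frequency times voltage squared, leakage ignored) already predicts about a 3.6x jump from the 3 GHz/0.88v point to 4 GHz/1.45v; the measured ~40 W vs ~250 W delta is steeper, which is about what you'd expect once leakage starts climbing with voltage and temperature. Quick back-of-envelope, nothing chip-specific:

Code:
# Back-of-envelope: relative dynamic CPU power assuming P ~ f * V^2.
# Leakage is ignored, so real deltas run higher at high voltage/temps.
settings = {
    "3.00 GHz @ 0.88 V": (3.00, 0.88),
    "3.75 GHz @ 1.15 V": (3.75, 1.15),
    "4.00 GHz @ 1.45 V": (4.00, 1.45),
}
base_f, base_v = settings["3.00 GHz @ 0.88 V"]
base = base_f * base_v ** 2
for label, (f, v) in settings.items():
    print(f"{label}: {f * v ** 2 / base:.1f}x the dynamic power of 3 GHz/0.88 V")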

kegkilla said:
Yet another "Ryzen is going to be awesome once stupid developers learn to harness the awesome power!" We've been hearing this same shit from AMD for the last 10 years.

I'm not arguing that, but there's a provable difference here. I used an FX-8150 for years and hated it. The fact of the matter was, Bulldozer simply lacked the power. It wasn't a lack of multithreading support in games, it was just garbage at single-threaded performance. But where it sat at 60% of the comparable Intel CPU's single-threaded performance, the Ryzen processors are at 85-90%. The Bulldozers were just flat-out hitting a processing wall they couldn't overcome - the primary core would hit 100%, the fans going crazy, the heat off the charts, and you were just fucked. Whereas with my 1700X, when none of my cores are hitting 75% and the CPU is almost asleep while it rips through -almost- the same performance as the Intel CPUs that are totally maxed out, there's something else entirely going on. With a few exceptions, I've pretty much managed 100+ FPS in every game before hitting a so-called CPU bottleneck, and the difference between gaming at 100 FPS and 120 is negligible. The experience has been utterly smooth in everything I've done. Bulldozer was choppy and useless.
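
To put rough numbers on why that 60% vs 85-90% gap matters: when a game really is limited by a single thread, the FPS ceiling scales roughly with single-thread performance. Purely illustrative, reusing the 140 FPS figure from earlier rather than any actual benchmark:

Code:
# Illustrative only: FPS ceiling if one thread is the bottleneck and the
# reference chip tops out at 140 FPS.
reference_fps = 140
for label, ratio in [("Bulldozer era (~60%)", 0.60), ("Ryzen (~85-90%)", 0.875)]:
    print(f"{label}: roughly {reference_fps * ratio:.0f} FPS ceiling")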
 

Intrinsic

Person of Whiteness
<Gold Donor>
15,027
13,124
Fulorian said:
That's with a Noctua NH-D15, which is massive overkill for my use case. ...

I was just giving you a hard time for saying 35F, that's like 1C or some shit. Heh.