Desktop Computers

gogusrl

Molten Core Raider
1,365
108
Samsung 850 Pro just came out, here is the review from Anandtech. Spoiler alert: Fastest SSD ever.
FFFFFUUUUUUUU, I just got a 256GB 840 Pro a week ago :(
 

Joeboo

Molten Core Raider
8,157
140
Samsung 850 Pro just came out, here is the review from Anandtech. Spoiler alert: Fastest SSD ever.
Very nice. I hope we see an Evo line based on the 850 V-NAND design soon. The 840 Evos are still the best bang for your buck as far as Samsung SSDs go. Heck, a 512GB EVO isn't much more than just a 256GB Pro. I'll happily take double the capacity over 10% more performance, especially considering that performance degrades the fuller the SSD gets, so by the time you throw 200GB on the drive, the 512GB Evo is probably outperforming the 256GB Pro anyways.
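If you want to sanity-check that value math, here's a quick back-of-the-envelope in Python. The prices are hypothetical placeholders, not quotes from the review or any retailer, so swap in whatever the street prices actually are:

# Rough $/GB comparison. Prices below are assumed/illustrative,
# not actual retail quotes - plug in current ones.
drives = {
    "840 EVO 512GB": (512, 280.00),  # (capacity in GB, assumed price in USD)
    "840 Pro 256GB": (256, 230.00),
}
for name, (gb, price) in drives.items():
    print(f"{name}: ${price / gb:.2f}/GB")

With numbers in that ballpark the EVO lands around $0.55/GB vs. $0.90/GB for the Pro, which is the whole argument in one line.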
 

Quaid

Trump's Staff
11,859
8,265
Pretty much every single game that comes out, excluding Watch Dogs.
You must be satisfied with extremely low frame rates. There is no way you are playing modern games (Battlefield 3 and 4, Tomb Raider, The Witcher 2, Blood Dragon) at anything more than 20-30 FPS on low settings.
 

Kuriin

Just a Nurse
4,046
1,024
You must be satisfied with extremely low frame rates. There is no way you are playing modern games (Battlefield 3 and 4, Tomb Raider, The Witcher 2, Blood Dragon) at anything more than 20-30 FPS on low settings.
You would be wrong about that. I don't think I have ever gone below 50 FPS. I don't max out settings and I always keep vsync disabled. But to say that I only get 20-30 FPS (max) is hilariously wrong. You should be ashamed of yourself.
 

spronk

FPS noob
23,828
29,073
i play all new games at 1440p with no problems on an Nvidia 680. People are way, way off on how much GPU games really need nowadays; almost all the AAA games target the PS4 (or even the PS3), so pretty much any shit video card will work fine. SLI is a joke and will almost universally work against you. The only exception is if you are trying to run a 2+ monitor setup and turn it into a single screen for gaming; then, yeah, you need some massive GPU power. I think we've gotten to the point again where CPU power seems to be more important than GPU for a lot of games, and it probably plays a bigger factor when going from 1080p to 1440p or higher.

But with a 3-monitor setup and only one Korean 1440p as my gaming window, everything from BF4, Titanfall, Watch Dogs, AC4, Wolfenstein, Landmark, Wildstar, FF14, Transistor, and dozens of other games has run perfectly fine at 40-90 FPS. I just played a Titanfall match and it was a constant 60 FPS, no problem. I do usually set shadows one notch lower than max, always have vsync off, and play around with the AA settings until I get the FPS I want - usually 4x MSAA.

don't get me wrong, G-Sync is fucking amazing and worth it if you can get it, and it kinda forces you to 1080p (but should be able to do 1440p soon). But the extra real estate of a 1440p monitor is awesome, especially at 90-120Hz, and there are almost no games made anymore that really target high-end PC gaming.
 

Mist

REEEEeyore
<Rickshaw Potatoes>
31,800
24,478
Very nice. I hope we see an Evo line based on the 850 V-NAND design soon. The 840 Evos are still the best bang for your buck as far as Samsung SSDs go. Heck, a 512GB EVO isn't much more than just a 256GB Pro. I'll happily take double the capacity over 10% more performance, especially considering that performance degrades the fuller the SSD gets, so by the time you throw 200GB on the drive, the 512GB Evo is probably outperforming the 256GB Pro anyways.
Doubt you will see an EVO that's a whole lot cheaper. The EVOs used TLC instead of MLC to make them cheaper; V-NAND is a whole other type of flash.
 

Joeboo

Molten Core Raider
8,157
140
As an alternative, I wouldn't mind the current line of EVOs becoming even less expensive, rather than bringing out a new line of them. I'd be goddamned thrilled if I could pick up a 1TB Evo for about $250-300 sometime in the next year.
 

jeydax

Death and Taxes
1,421
960
I think we've gotten to the point again where CPU power seems to be more important than GPU for a lot of games, and it probably plays a bigger factor when going from 1080 to 1440 or higher.
And you're completely wrong. It is quite the opposite.

You have to go back to an AMD Athlon II X4 640 (now $100), a CPU released in 2010 (fucking eons ago in the tech world), to see any FPS degradation in BF4 at 1920x1200. You see a whopping (:rolleyes:) 15 FPS drop between that same AMD CPU and the more powerful CPUs in Bioshock Infinite. From another link, "Clocking the 3770K from 2.5GHz up to 4.5GHz made for a marginal 4fps difference." Going from an i3-3220 to an i7-3960X only made a 7 FPS difference. Now comparatively, going from a 650 Ti (~$175, released in 2012) to a Titan (~$1,100) made the FPS jump from 34 to 88. Hell, just jumping from the 650 Ti to the 670 Ti made the FPS jump by 30.

You literally see no difference between a stock 2500k and a stock 4770k in Crysis 3 at max details at 1920x1080. Similarly, the difference between an i5-3550 (stock) and an i7-3960X (OC'd) was only 2 FPS. And afaik, as resolution goes up, the CPU becomes less and less important. Playing Crysis 3 at 1280x1024 with a 4770k resulted in 96 FPS vs. 78 FPS with the 2500k. Using the same setup and bumping the resolution up to 1920x1080, the FPS was the EXACT SAME.
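A toy model makes the resolution point concrete: a frame isn't done until both chips are done, so FPS is roughly 1000 / max(CPU ms, GPU ms), with the CPU cost roughly flat across resolutions and the GPU cost scaling with pixel count. The millisecond figures below are invented to mimic the shape of the Crysis 3 numbers above, not measured from anything:

# Toy bottleneck model: FPS ~ 1000 / max(cpu_ms, gpu_ms).
# cpu_ms is (roughly) resolution-independent; gpu_ms scales with pixels.
def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    gpu_ms = gpu_ms_per_mpix * (width * height / 1e6)
    return round(1000 / max(cpu_ms, gpu_ms))

for w, h in [(1280, 1024), (1920, 1080)]:
    fast_cpu = fps(10, 8, w, h)  # hypothetical 10 ms/frame CPU
    slow_cpu = fps(13, 8, w, h)  # hypothetical 13 ms/frame CPU
    print(f"{w}x{h}: fast CPU {fast_cpu} FPS, slow CPU {slow_cpu} FPS")

At 1280x1024 the CPUs split (95 vs. 77 FPS) because the GPU finishes early; at 1920x1080 the GPU becomes the wall and both CPUs land on the exact same number.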

Here is another link comparing an i5-2500k (~$200 - 2 or 2.5 generations old, depending on how you look at it) to an i7-4770k (~$340 - current or 0.5 generations old). TLDR version of the link, here is the FPS difference between the CPUs in each game:


Game (2560x1440, highest graphics): FPS Difference
BF3: 1.5
Bioshock Infinite: 0.7
Crysis 3: 0.3
Dirt3: 4.7
Metro: No Change
And so on and so forth...

If you took a GPU that was 2 generations old and cost $200 (560 Ti, $250 at launch but w/e) and compared it to a current-generation $340 GPU (R9 280X, $280 actually but w/e), you'd see an FPS jump of 33.
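Treating those sticker prices at face value, the same $140 spent on the CPU side vs. the GPU side isn't even close - here's a minimal sketch using only the numbers quoted above:

# FPS gained per upgrade dollar, using the figures from this post.
cpu = {"cost": 340 - 200, "fps_gain": 1.5}  # i5-2500k -> i7-4770k, BF3 (best case above)
gpu = {"cost": 340 - 200, "fps_gain": 33}   # 560 Ti -> R9 280X
for name, u in (("CPU", cpu), ("GPU", gpu)):
    print(f"{name}: {u['fps_gain'] / u['cost']:.3f} FPS per dollar")

That works out to about 0.011 FPS per dollar on the CPU upgrade vs. 0.236 on the GPU upgrade, roughly a 20x difference.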

I could get more data, but a video card matters WAY the fuck more than a CPU does when it comes to gaming.
 

Mist

REEEEeyore
<Rickshaw Potatoes>
31,800
24,478
i play all new games at 1440p with no problems on an Nvidia 680. People are way, way off on how much GPU games really need nowadays; almost all the AAA games target the PS4 (or even the PS3), so pretty much any shit video card will work fine. SLI is a joke and will almost universally work against you. The only exception is if you are trying to run a 2+ monitor setup and turn it into a single screen for gaming; then, yeah, you need some massive GPU power. I think we've gotten to the point again where CPU power seems to be more important than GPU for a lot of games, and it probably plays a bigger factor when going from 1080p to 1440p or higher.

But with a 3-monitor setup and only one Korean 1440p as my gaming window, everything from BF4, Titanfall, Watch Dogs, AC4, Wolfenstein, Landmark, Wildstar, FF14, Transistor, and dozens of other games has run perfectly fine at 40-90 FPS. I just played a Titanfall match and it was a constant 60 FPS, no problem. I do usually set shadows one notch lower than max, always have vsync off, and play around with the AA settings until I get the FPS I want - usually 4x MSAA.

don't get me wrong, G-Sync is fucking amazing and worth it if you can get it, and it kinda forces you to 1080p (but should be able to do 1440p soon). But the extra real estate of a 1440p monitor is awesome, especially at 90-120Hz, and there are almost no games made anymore that really target high-end PC gaming.
This is all completely wrong. Only poorly coded MMOs and some other fringe shit are CPU bound. And it's mostly the crappy scripted UI modules in those games that use up all the CPU power.

Real games, like FPS games, scale pretty much directly with GPU power.
 

Zodiac

Lord Nagafen Raider
1,200
14
Some of you guys kill me... Turning shadows off and running shitty FXAA or lower quality AA isn't max settings.
 

Mist

REEEEeyore
<Rickshaw Potatoes>
31,800
24,478
I turn off FXAA because it kinda looks like shit if you think about it. Why would I want games to be blurry? I also feel like it blunts your motion-recognition response time and accuracy in FPS games.

But I always run everything else on Ultra.
 

Kuriin

Just a Nurse
4,046
1,024
Some of you guys kill me... Turning shadows off and running shitty FXAA or lower quality AA isn't max settings.
I don't know if you're directing that comment at me, but I never once said ultra/max settings. I set every game to Medium/High, and even then I customize it to get rid of the extraneous things that I know will absolutely rape my framerate.
 

jeydax

Death and Taxes
1,421
960
Dudes and bros, critique this $1,000-$1,100 gaming build for me. Yes, I know it is missing a hard drive, but assume they already have one...

Intel Core i5-4690K, Sapphire Radeon R9 290, NZXT Source 210 (Black) - System Build - PCPartPicker

The R9 290 seems to be the best pick for the GPU at this price point. TH index scores put the AMD R9 290 neck and neck with the nVidia 780 at 1080p, and it actually comes out ahead at 2160p, probably due to the extra GPU RAM. Battlefield 4 and Bioshock seem to favor the 780 by 8-10 FPS at 1080p, but the R9 290 pulls ahead in other games. Again, at 2160p the AMD R9 290 pretty much wins across the board. So for ~$100 cheaper, I think the R9 290 is the best fit. Though I've personally never used them (I have been tempted to), I went with the ASRock board since it has received some glowing reviews as a budget Z97 that doesn't sacrifice too much.

Thoughts? Opinions? I always recommend watching Slickdeals (Slickdeals: The Best Deals, Coupons, Promo Codes & Discounts) for a good power supply deal, so that is really the only thing I feel would change. The one listed is perfectly fine, imo, just maybe not the best deal. That could be said for all parts though, I guess. Whatever... let me know what you think.

[attached image: screenshot of the PCPartPicker build]
 

Quaid

Trump's Staff
11,859
8,265
Solid build. Not a single thing I would change. I'm team green though (for the time being, anyway) so I'd probably go with a GTX 780. The value is really high on those 290s right now though, so I understand the decision.
 

jeydax

Death and Taxes
1,421
960
I am also an nVidia dude but the value on those R9 290s is just too nuts to pass up. And they're damn near equal.