Desktop Computers

Crone

Bronze Baronet of the Realm
9,714
3,211
Looking to upgrade to a near worry-free experience at 1440p/144 Hz.

8700K @ 5.0 GHz OC
16 GB DDR4
GTX 1060 6 GB

Which video card should I be looking at, a 2060 or a 2070? Does anyone think the new 1660 would push 1440p decently? Obviously if budget allows I'd grab the 2070, but if a 2060 can perform, the difference can go towards a display; as a compromise I'd lean towards doing that. Also, can anyone weigh in on TN vs. IPS for MMO gaming? Now that G-Sync compatibility is widespread, I'm curious whether IPS wins out.
I went through this recently, and a 2060 is going to be just fine if you don't mind dropping some graphics settings to hit your desired FPS. My debate was between a new 2070 or a used 1080 Ti; I went with the 1080 Ti, but in my research I felt the 2070 would handle 1440p just fine.

As for TN vs. IPS? @Dom has a link to a site where you can see the color shifts. I have a new TN panel and I really don't mind it at all. I think it's fantastic, but it may be something that bothers you.

Malakriss

Golden Baronet of the Realm
12,659
11,973
While reality doesn't follow the plain math exactly, you can get a rough idea of how GPU load scales by scaling frame rate with pixel count.

If you can do 1080p @ 240 fps, then quadrupling the pixels to 2160p should yield 60 fps (1/4 of 240).
Similarly, you would expect around 1440p @ 135 fps (9/16 of 240).
Once we can hit 1440p @ 240 fps, we should be around 2160p @ 106.67 fps (4/9 of 240).

Now imagine how far off 8K / 4320p is...
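
If you want to plug in your own numbers, here's a minimal sketch of that back-of-the-envelope math (the resolution table and helper function are mine, not from any benchmark tool):

```python
# Back-of-the-envelope estimate: fps scales inversely with pixel count.
# This ignores CPU overhead, which the next post gets into.
PIXELS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "2160p": 3840 * 2160,
    "4320p": 7680 * 4320,
}

def estimated_fps(base_res: str, base_fps: float, target_res: str) -> float:
    """Scale a measured frame rate by the ratio of pixel counts."""
    return base_fps * PIXELS[base_res] / PIXELS[target_res]

print(estimated_fps("1080p", 240, "2160p"))  # 60.0   (1/4 of 240)
print(estimated_fps("1080p", 240, "1440p"))  # 135.0  (9/16 of 240)
print(estimated_fps("1440p", 240, "2160p"))  # ~106.7 (4/9 of 240)
print(estimated_fps("1080p", 240, "4320p"))  # 15.0 -- that's how far off 8K is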

a_skeleton_05

<Banned>
13,843
34,510
Yeah, the pixel count only serves as a rough ballpark estimate. A lot of a game's work is fixed overhead that doesn't depend on how many pixels are being pushed, and higher frame rates bring other factors into play, like CPU and memory speed. In my real-world experience the average is closer to 1080p giving about 2.2x and 1440p about 1.6x the frames of 4K, but this can vary wildly by game and engine. Take the new game Dawn of Man, for example: my system can't even manage 120 fps at 1080p when not at full load, simply because the game shits itself at high frame rates, which is a common issue with Unity games.
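
To make the overhead point concrete, here's a toy frame-time model. Both constants below are made up, chosen only to show how a fixed per-frame cost bends the ratios toward the 2.2x/1.6x figures above instead of the 4x/2.25x that pure pixel math predicts:

```python
# Toy frame-time model: every frame pays a fixed CPU/engine overhead,
# plus a GPU cost proportional to the number of pixels rendered.
CPU_MS = 3.1          # per-frame overhead independent of resolution (hypothetical)
GPU_MS_PER_MPX = 1.0  # GPU milliseconds per million pixels (hypothetical)

def model_fps(width: int, height: int) -> float:
    megapixels = width * height / 1e6
    frame_time_ms = CPU_MS + GPU_MS_PER_MPX * megapixels
    return 1000 / frame_time_ms

fps_4k = model_fps(3840, 2160)
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440)]:
    print(f"{name}: {model_fps(w, h) / fps_4k:.2f}x the 4K frame rate")
# Prints roughly 2.20x and 1.68x -- much closer to the observed ratios
# than pure pixel scaling, which would predict 4.00x and 2.25x.
```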

That's why I recommend picking 1-3 games you know you'll put a lot of time into and designing your system around those games. That lets you target your interests and hopefully not waste money on things you won't really need (4K if you're an FPS player, for example).

This hobby blows for people who suffer from "grass is greener" issues like myself. It's impossible to be satisfied, even if you toss money at all of it. There's something to be said for just going mid-range and not thinking about the rest so you are a gamer, rather than a PC hardware enthusiast doing the car equivalent of washing your shiny new car every single day, but never going for a long drive.

Daezuel

Potato del Grande
23,408
50,229
Overclocking my Vega 56 and Ryzen 2600, I just ran a two-minute FPS test running around Toussaint (Witcher 3) at 4K with most settings on Ultra and a few turned down, things I find acceptable to turn down given the performance gain and IQ loss, and averaged 58.6 fps.

Not bad for a $250 video card; who says I can't play games in 4K? Stock cooling too, not too shabby, and I'm sure there's more fine-tuning to do.

a_skeleton_05

<Banned>
13,843
34,510
Switched over to my dinky little 24" 1080p screen to help alleviate some neck pain I've been having, and the lack of screen space is really frustrating. I've also come to the conclusion that anti-glare coating is massively important to monitor picture quality. The TV I've been using has a really light coating, and you need to be within inches to notice it. The monitor, on the other hand, has a medium-heavy coating that's obvious from 1-2 feet away, making everything look like it's being viewed through a dirty filter whose grid you can see. It's a bigger problem for picture quality than the 1080p resolution is.

It's not something most people think about (I didn't when I bought this or my previous monitors), so I'm mentioning it for those of you looking to buy one.

Fucker

Log Wizard
12,639
28,767
This hobby blows for people who suffer from "grass is greener" issues like myself. It's impossible to be satisfied, even if you toss money at all of it. There's something to be said for just going mid-range and not thinking about the rest so you are a gamer, rather than a PC hardware enthusiast doing the car equivalent of washing your shiny new car every single day, but never going for a long drive.

I used to always have the latest stuff. Now I can't be bothered. I don't play hardcore anyway. 4790K/1070/1440p.

All I care about is reliability and quiet, so long as it's pleasingly fast.

I got a Ryzen 2600/RX 570 for cheap as my second box. It's good enough for movies and serves as a backup PC in case the main one shits itself, so I don't have to wait on parts. I have a few much older boxes sitting around the house that don't do anything other than surfing and distributed backups.

I built this 4790K machine when they came out, what...4 years ago? I used to spend $2k a year before that. That's $8k saved, more or less.

It would be a different matter if I did video encoding or something else heavy, but I don't, so this is fine for now.

Lambourne

Ahn'Qiraj Raider
2,863
6,834
CPU performance gains (for games, at least) have been so minor over the last decade that it's really not worth building a whole new system regularly. Single-thread performance has barely doubled in the last 10 years; it went up by more than a factor of 10 in the decade before.
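
Putting those two rates side by side (a quick compound-growth sketch; the factors are just the rough ones from the sentence above):

```python
# Annualized growth implied by "2x in a decade" vs. "10x in a decade".
def annual_growth(total_factor: float, years: float) -> float:
    return total_factor ** (1 / years) - 1

print(f"2x over 10 years:  {annual_growth(2, 10):.1%} per year")   # ~7.2%
print(f"10x over 10 years: {annual_growth(10, 10):.1%} per year")  # ~25.9%
```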

I still have an E8400 Core 2 that I use as a home-theater PC, and it has zero issues with 1080p Netflix streaming or decoding MKVs. That chip came out in 2008; when before could you ever productively use an 11-year-old CPU?

For a game PC, just throw in a new video card every 2-3 generations and you're good. I put a 2070 in my 7-year-old 3570K box and it gets 60 fps at 4K just fine, and it's not even overclocked anymore.

[Two attached charts: microprocessor trend data, from Real World Technologies - Forums - Thread: Macro-economic consequences of Moore's law? and 42 Years of Microprocessor Trend Data | Karl Rupp]

a_skeleton_05

<Banned>
13,843
34,510
You aren't playing many games if you think a 3570K isn't going to bottleneck you in plenty of situations. Core count matters on top of IPC now, not to mention memory speed.

My current system was built specifically because my own 3570K was shitting itself in many situations.

Lambourne

Ahn'Qiraj Raider
2,863
6,834
Yes, you're going to leave something on the table sticking with an older CPU. It depends on the game somewhat, but the difference can be disappointingly small, especially at 4K. Games are still mostly GPU-bound.

Have a look at CPU 2019 Benchmarks - Compare Products on AnandTech
It compares a 7-year-old 3770K with Intel's latest $2,000 monster CPU, the 9980XE. Six generations newer and fourteen more cores, yet at 4K the difference even in a newer game like F1 2018 is only around 10%. There's a huge difference at 720p, but 150 fps vs. 250 fps doesn't really translate to any real-world improvement. (The stats I'm referring to are at the bottom of the page.)

Meanwhile, a single generation of video card will give you 20-30% in the same game, at any resolution.

There have been huge performance gains in various office applications, but if you're an average home user who just games, browses, and watches videos, you're not missing much by not upgrading the CPU frequently.

mkopec

<Gold Donor>
26,235
39,953
I used to think exactly like this, and then I tried playing BF5 on my i5 3570, thinking that with the 1070 I was good to go. But in game my frames would hover between 60 all the way down to 30, with sudden one-second freezes, at 1080p.

So after reading up a bit, I went out and bought a new i5 9600K, mobo, and memory. Cost around $450. And holy shit, it's like the proc opened the door for the video card to do its thing again.

Same game, same settings, I was getting 100-130 frames with no more stutters at all.

So go ahead and keep thinking the way you're thinking. You're absolutely wrong.

a_skeleton_05

<Banned>
13,843
34,510
I don't think you'll find anyone disputing that CPUs have slowed down significantly in how fast they're improving. Moore's law hasn't been law for a long time now.

The key issue with CPUs and gaming right now is that engine makers have finally gotten off their arses and started putting multiple cores to use, whereas engines were almost solely single-core just a few years ago. IPC matters now just as much as it did then, but core count now factors into several game engines as well. So you get situations where a CPU is more than good enough for many games because it still has decent IPC, but it falls over itself in games like the new AC or BF titles because it doesn't have the cores, or the RAM support, to handle them.

The reason core count matters so much now is that consoles rely on it heavily: super-high IPC isn't feasible for them for various reasons, so engine makers had to start designing around the cores consoles have available and offloading work across them.

~3 years ago, people saying a 2500K was all you'd need were pretty safe in that statement. As time went by and people got their shiny new GPUs, they started realizing something wasn't right and that they weren't getting the performance they expected from the upgrade. The community noticed how CPUs were holding things back, and the mentality has shifted significantly since.

The shift to high refresh rates has magnified the situation even more. A 3570K that only costs you ~5 fps when you're under 60 fps isn't a big deal, but once you break that 60 fps ceiling you're looking at losing upwards of 50% or more of your frames. That's huge.
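
A crude way to see it (all numbers below are hypothetical): treat the delivered frame rate as capped by whichever of the CPU or GPU is slower.

```python
# Delivered fps is bounded by whichever of the CPU or GPU finishes
# its share of the frame last. Both caps are made-up numbers.
OLD_CPU_CAP = 110  # fps an aging quad-core can feed (hypothetical)
NEW_CPU_CAP = 200  # fps a modern CPU can feed (hypothetical)

def delivered_fps(cpu_cap: int, gpu_fps: int) -> int:
    return min(cpu_cap, gpu_fps)

for gpu_fps in (60, 144):
    old = delivered_fps(OLD_CPU_CAP, gpu_fps)
    new = delivered_fps(NEW_CPU_CAP, gpu_fps)
    print(f"GPU good for {gpu_fps} fps -> old CPU: {old}, new CPU: {new}")
# At a 60 fps target the old CPU costs nothing; at 144 fps it throws
# away ~24% of the frames the GPU could have delivered.
```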

This isn't to say you need the biggest, baddest CPU in all the land by any means, as even those hit bottlenecks due to IPC limitations, but core count cannot be ignored now, and even hyperthreading (and AMD's SMT equivalent) can have a significant effect in some cases.

So yeah, you can get by with a 3570K in many situations if you're not too fussy about things, but you can also get by with a 5-year-old GPU. Or why bother at all: just buy a console.

mkopec

<Gold Donor>
26,235
39,953
I don't think you'll find anyone disputing that CPUs have slowed down significantly in how fast they're improving. Moore's law hasn't been law for a long time now.

I think you'll start to see more of it now that AMD is back in the game and there's actual competition. For the past 10 years Intel could do its 10%+ per generation and no one even batted an eye. But now? They actually have to step up. This is why you're seeing 6- and 8-core products from Intel: not because they want to, but because they HAD to.

a_skeleton_05

<Banned>
13,843
34,510
I think you'll start to see more of it now that AMD is back in the game and there's actual competition. For the past 10 years Intel could do its 10%+ per generation and no one even batted an eye. But now? They actually have to step up. This is why you're seeing 6- and 8-core products from Intel: not because they want to, but because they HAD to.

Core counts are eventually going to hit the same wall as IPC. There's only so much you can do before you run out of die space or hit literal physics barriers. There are also only so many cores you can usefully throw at a video game without adding impossible levels of complexity on the programming side, and even then you're stuck at the IPC limits of the core game logic/loop/whatever.
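
Those diminishing returns fall straight out of Amdahl's law; the 70% parallel fraction below is a made-up figure for illustration:

```python
# Amdahl's law: if a fraction p of each frame's work parallelizes,
# the speedup on n cores is 1 / ((1 - p) + p / n).
def amdahl_speedup(p: float, cores: int) -> float:
    return 1 / ((1 - p) + p / cores)

P = 0.7  # hypothetical: 70% of the frame's work parallelizes
for cores in (2, 4, 8, 16, 64):
    print(f"{cores:>2} cores: {amdahl_speedup(P, cores):.2f}x")
# 1.54x, 2.11x, 2.58x, 2.91x, 3.22x -- diminishing returns, capped at
# 1 / (1 - P) = ~3.33x no matter how many cores you add.
```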

Hopefully something like quantum computing or some other breakthrough will end up being a giant leap forward. That's if gaming hasn't mostly moved to cloud computing by then.

mkopec

<Gold Donor>
26,235
39,953
Yeah, I mean, there are limits to going smaller and smaller, so a new breakthrough will need to happen.

a_skeleton_05

<Banned>
13,843
34,510
It has been great to see AMD kick Intel's ass into high gear, though. I got into custom builds back in the Cyrix days, and my second build was an AMD K6-2 back when AMD was the man. Watching them slowly turn to shit over the years while Intel intentionally retarded its own progress was frustrating. One of those clear signs that competition is the lifeblood of progress.

Lanx

<Prior Amod>
65,309
147,252
It has been great to see AMD kick Intel's ass into high gear, though. I got into custom builds back in the Cyrix days, and my second build was an AMD K6-2 back when AMD was the man. Watching them slowly turn to shit over the years while Intel intentionally retarded its own progress was frustrating. One of those clear signs that competition is the lifeblood of progress.
Haha, a 6x86 with the PR200+ rating. I don't know where that old-ass chip is; I was recently cleaning stuff out and found my old Slot 1 450 and 550.

Lambourne

Ahn'Qiraj Raider
2,863
6,834
I used to think exactly like this, and then I tried playing BF5 on my i5 3570, thinking that with the 1070 I was good to go. But in game my frames would hover between 60 all the way down to 30, with sudden one-second freezes, at 1080p.

So after reading up a bit, I went out and bought a new i5 9600K, mobo, and memory. Cost around $450. And holy shit, it's like the proc opened the door for the video card to do its thing again.

Same game, same settings, I was getting 100-130 frames with no more stutters at all.

So go ahead and keep thinking the way you're thinking. You're absolutely wrong.

If you were getting one-second freezes at 1080p, you had a software problem somewhere. The Battlefield engine doesn't like quad-cores without hyperthreading though, so that's definitely a case where you'd benefit from upgrading your 3570K.

I'm not advocating never upgrading; in fact, I built a 6700K box a while later that I currently use for VR. I still use the 3570K for regular games, though, and I stand by my point that it gives more than acceptable performance with an up-to-date video card. (I was responding to the post above about not spending $2k a year on upgrades anymore.)