Sinistkir13
I saw that video posted on retardera and the posters there were mad as hell at him for going after DF.
Gotta say, those DF dorks started annoying me a long time ago when all they could talk about was "muh waytwathing". I'm a layman when it comes to graphics so it felt really odd hearing them use the "buzzword" with such regard yet not elaborating on how or why it was an improvement.
It's like... There could be two games, one that uses raytracing and looks 7.5/10, and one that uses meticulous baked lighting techniques and looks a solid 9/10... Yet the raytraced game would get all the praise just by virtue of using the newer method.
I like Nvidia's hardware, but I was skeptical about ray tracing from day 1 because of how much it obviously hit performance on the 2000 series cards for what I would consider almost negligible visual payoff. Sure, you get some really wild mods now, like the realistic Cyberpunk 2077 videos, but I couldn't personally say how well those actually run even on the most expensive hardware available, and there aren't that many games that can even utilize that level of ray tracing.
It's only now that we're starting to see older games get ray tracing implemented in a way that actually looks like a visual overhaul for the better.
I'm with you on baked lighting generally being the superior choice when done well. With ray tracing, you're dependent on "real" light sources in the environment, which can easily result in a lot of overly dark areas or a general lack of artistic vision if they're just letting the environment do all of the work.
Non-ray traced lighting is perfectly functional without tanking performance on details nobody is going to notice in real gameplay, but people sure as hell notice when the frame rate isn't high enough while the image is upscaled and smeared to compensate because some twat needed his geometrically accurate light casting and shadows.
On another interesting note as I'm done with SB, tonight I saw FF16 went on sale for $32 so I figured I'd give it a go. Don't want to be like mgftp and talk too much chit about things I never even tried myself. Will report back soon with initial impressions.
I thought FF16 looked really cool, but then I read mixed things about various aspects. The combat sounded the most promising since it was done by the DMC guy, I think, so I'd like to hear your take.
One of the main issues I read about the latest FF games is that they're kind of shit in the "RPG" department, which hurts their depth (and replayability) versus being a straight-up action game, yet they still have all of the items, materials, etc. that you would see in the former. That's the kind of thing Square needs to get sorted, rubbish optimization aside.
I never liked those DF dorks. I always thought of them as those turbo-nerds who overanalyze things while using as much industry jargon as possible to make their bullshit sound like fact. I mean, it really isn't that hard to tell whether you think graphics look good or not, and you don't need to be pausing frames and analyzing them under a microscope, trying to detect whatever tech they've been using, to figure that out. They always seemed to spend so much time micro-analyzing things I would never care about or notice, when all I do is watch how a game looks in action and decide for myself whether it looks good.
I think they really popularized making a big deal about effects technology as opposed to just simply enjoying how a game looks. They always seemed to be popular among the type of gamer nerds who take this kinda shit way too seriously and try to make academic statements about them or some shit to win arguments on the internet. It's just always been so stupid to me.
Truth, man.
DF kind of gives away both themselves and the entire gaming industry's stagnation with how stupidly impractical their analyses have become, pausing shots just for every normal person to go, "they're the same picture."
It's only going to keep getting increasingly muddied now that they're basing everything off of their own normalized low standards for smeared temporal anti-aliasing and upscaling from some sub-720p base resolution. By the time you've filtered everything through all of that, what was even the point of having so many more polygons packed with higher-resolution textures and extra detailed lighting if it's all just going to be smeared and interpolated in the end?
I was playing a modded Dark Souls 3 the other day, an old game for sure, and was somewhere in the Ringed City looking up at the skyline, just in awe of how good it still looked. I'm using my old-ass PC with a 980 Ti and a regular 1440p monitor - no blurring or smearing or aliasing, just a gorgeous view that wasn't even some intentional set piece.
Most new games just don't look good enough to justify the dramatic jump in requirements, which still isn't enough and thus requires those aforementioned workarounds that degrade the finished look.
While I can't recall ever seeing a single thing from Digital Foundry, gaming media and advertising really started pushing graphical quality as the most important aspect of games, almost to the exclusion of everything else, back in the early 2000s. The console-tards seemed to eat that shit up even more than PC gamers, and now a whole generation has been raised on "OMG, the graphics are all we care about."
Today, in large part due to this mindset and companies like EA pouring money into gaming "journalism" (so they can pump out the same game with "better" graphics every year), we have a AAA & AAAA gaming market that is 95% shit.
Digital Foundry is just another extension of the IGN conglomerate, so at the end of the day they're nothing but access journalists like the rest of them.
The push for better visuals has definitely gotten the worst with the latest console generation, because despite how outdated the PS4 and Xbone were for so long in their lifespans, games were hitting their peaks the entire time. It was much easier to see on the PC equivalents, but the PS5 and Xbox Series really drive it home, because only now are console users seeing pretty much what they could have gradually gotten over time with progressively improving hardware versus a fixed state for many years. Some developers could really milk the hardware, like with some of Sony's exclusives, but even that drives home just how little headroom was left for what is feasible to render in real time.
However, even the latest console hardware is insufficient to handle ray tracing or stable 60fps, and certainly not 4K resolution, yet there is a fixation on the first and last of those three regardless of how they get there. Yeah, games have "ray tracing," but it kills the performance for hardly any visual gain that barely anyone will notice on a 4K screen, which isn't even playing at 4K because everything has to be upscaled to cover all of the performance cost on an all-in-one device that can't exceed several hundred dollars.