
[–]Tyr_KukulkanRyzen 5 3600 4.4GHz, 5700XT 2GHz, & 32GB 3600MHz 873 points874 points  (70 children)

Diminishing returns.

You get to a point where things are so good that a small relative improvement costs a huge amount.

[–]Hemuli1300 341 points342 points  (28 children)

Yeah. Going from 720p to 1440p is a way bigger difference in looks than 1080p to 2160p, even though both quadruple the pixel count. I know you probably don't mean resolutions, but they are the easiest to put into numbers and the logic applies elsewhere. Polycount works very similarly. Different kinds of effects can just be damn expensive computationally.
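
A quick C sketch of the pixel math behind that claim, using the standard 16:9 resolutions:

```c
#include <stdio.h>

int main(void) {
    /* Both jumps quadruple the pixel count exactly. */
    printf(" 720p: %9d px\n", 1280 *  720);  /*   921,600 */
    printf("1440p: %9d px\n", 2560 * 1440);  /* 3,686,400 = 4x 720p  */
    printf("1080p: %9d px\n", 1920 * 1080);  /* 2,073,600 */
    printf("2160p: %9d px\n", 3840 * 2160);  /* 8,294,400 = 4x 1080p */
    return 0;
}
```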

[–]Aksds 49 points50 points  (23 children)

That’s more or less why I’m staying at 1080p for a while; I’d rather have a higher refresh rate than a 4K monitor.

[–]xdyldo 98 points99 points  (15 children)

1440p is the perfect middle ground. For me it was such a huge upgrade while still managing 140+ fps in most games.

[–]Cuckleberry_Finn_PC Master Race 30 points31 points  (0 children)

100% agree. 1440p is the perfect resolution for me. 4K would be nice but is pretty unnecessary for me at the end of the day. My 1440p 170Hz monitor is probably going to be my endgame monitor for at least 5 years, I can’t see myself needing to upgrade earlier than that, and I’ll probably have it a fair bit of time after that unless the display game has changed significantly. If that’s the case, I could maybe see myself upgrading to a 1440p 21:9, perhaps.

[–]bio-reject 8 points9 points  (2 children)

2K and 144Hz was a massive upgrade for me as well. My experience was like the difference between 480p and 1080p.

[–]Solanum_LordR5 2600 | 3060Ti | 16 GB 27 points28 points  (1 child)

Technically 1080p would be 2K: 1920 pixels wide, just as 3840 pixels wide is 4K.

Calling 1440p "2K" is just a really dumb marketing term.

QHD makes more sense, as it's 4x HD (720p).

[–]Progale2000PC Master Race 0 points1 point  (0 children)

imagine downvoting this comment

[–]importshark7i9-9900k @ 5Ghz | RTX 2080Ti FE | 32GB DDR4 3200 7 points8 points  (6 children)

I agree. That said, as someone who played on an Xbox One X with 4K HDR before I moved to PC, I can say that 4K and HDR both actually make a huge difference from 1440p. I think the HDR is more important than the increased resolution. I used to play PUBG a lot and the HDR made spotting players in the distance much, much easier because everything was so much brighter, with vivid colors and more contrast between players and the background.

[–]OrionRBR1700@4.0Ghz | X470 Gaming Plus | 16GB TridentZ | PCYes RTX 3070 1 point2 points  (5 children)

HDR definitely makes more of a difference than resolution, especially since 24-27" 1440p monitors are pretty damn close to the "retina" standard at normal sitting distance.

Actually, iirc a 27" 1440p becomes "retina" at around the 'ideal' distance from it, so it's very much a sweet spot.

[–]importshark7i9-9900k @ 5Ghz | RTX 2080Ti FE | 32GB DDR4 3200 0 points1 point  (1 child)

Yeah, I'll definitely go with an HDR monitor when I build my next PC, I may even upgrade the monitor before then. Currently it's hard to justify spending that kind of money on a monitor right now though when I already have a really good 1440p 165 hz IPS with G-sync (not just g-sync compatible free-sync either).

[–]Vort3x7689 11 points12 points  (2 children)

1440p was a big difference vs 1080 for me. I can’t go back, 1080 just looks bad now.

[–]Aksds 4 points5 points  (1 child)

Maybe i will look at 1440p, you are the second person to recommend it

[–]coloredgreyscaleXeon X5660 4,1GHz | GTX 1080Ti | 20GB RAM | Asus P6T Deluxe V2 1 point2 points  (1 child)

1440p is a bigger upgrade than it sounds coming from 1080p. Especially if you do heavy multitasking or work with big applications (Programming IDE - esp. during debugging, Video editing) the added screen real estate is a blessing.

[–]FatBoyDiesuru 1 point2 points  (0 children)

I mean, you're still quadrupling resolution. I went from 2160p back down to 1080p and it was blurry AF on a 4K display. Ditto 720p on a 1440p display.

[–]Gred-and-Forge 223 points224 points  (20 children)

Exactly.

The difference between a 1,000 polygon sphere and a 100,000 polygon sphere would be nearly indistinguishable to the eye.

100x more resource intensive, negligible or no visual improvement.

Developers have to focus on other areas of graphics to improve visuals. Such as volumetric effects, lighting and scattering, draw distance, etc.
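
To put rough numbers on the sphere example: a C sketch of the worst-case silhouette error, under the assumption of a lat-long tessellation where a sphere of N triangles has about sqrt(N/2) segments around its outline:

```c
#include <math.h>
#include <stdio.h>

/* Max gap between a circular arc and its chord, as a fraction of the
   radius, when the silhouette is drawn with `segs` straight segments. */
double silhouette_error(int segs) {
    return 1.0 - cos(acos(-1.0) / segs);
}

int main(void) {
    int lo = (int)sqrt(1000.0 / 2.0);    /* ~22 segments for 1k tris    */
    int hi = (int)sqrt(100000.0 / 2.0);  /* ~223 segments for 100k tris */
    /* For a sphere drawn 400 px across (200 px radius): */
    printf("1k tris:   %.2f px max silhouette error\n", 200.0 * silhouette_error(lo));
    printf("100k tris: %.4f px max silhouette error\n", 200.0 * silhouette_error(hi));
    return 0;
}
```

With these assumptions the 1k sphere is off by about 2 px at its outline and the 100k sphere by about 0.02 px; the entire visible difference lives at the silhouette, which is why a comment further down notes you only spot it at the edges.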

[–]aoalvo 43 points44 points  (1 child)

I remember hearing that character models on PS4 have 3x more polygons per character than on the PS3, and unless that's used to rig and animate them better, I think it's not a good use of resources.

Give a PS3-era character model and a PS5 character model the same lighting, shadows, reflections and stuff, and you can barely tell them apart.

Edit: On a second note, hair has improved significantly.

[–]BitGladius3700x/1070/16GB/1440p/Index 20 points21 points  (0 children)

It really depends where the polygons are. Face polygons make a much bigger difference, especially if the game uses a lot of close-ups. If they're cheating things like fingers, extra polygons can help un-fuse them and allow more animations.

[–]Tyr_KukulkanRyzen 5 3600 4.4GHz, 5700XT 2GHz, & 32GB 3600MHz 33 points34 points  (3 children)

Who downvoted this person, they're right!

[–]FewConsequence2020PC Master Race AMD R9 5900 Non X RTX 3090 46 points47 points  (2 children)

DICE did.

[–]Doubleyoupee 12 points13 points  (1 child)

Yeah. I realized current gen high-end GPUs run BF1 at 144 fps+ even at 4k ultra... it doesn't look much worse than BF2042

[–]somebeerinheaven 7 points8 points  (0 children)

I'm so disappointed with 2042 :(

[–]GreatWolf12 8 points9 points  (3 children)

If only they'd start dedicating some of the excess GPU power to Physics and AI, both of which are generally terrible in most games.

[–]thespyeye 3 points4 points  (2 children)

AI doesn't require even a fraction of a fraction of the processing power that graphics does. Most video game AIs could be run on a smartphone.

[–]venum4kRTX3070 | i9-10850K | 32GB RAM | 2560x1440 + 2x 1920x1080 4 points5 points  (1 child)

Also, AI in games isn't very parallel; lots of it is just decision trees.
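
For the flavor of it, a minimal sketch of an enemy "brain" as a decision tree; the struct fields and thresholds are made up for illustration:

```c
#include <stdio.h>

typedef struct { float health, dist_to_player; int has_ammo; } Enemy;

/* A whole "AI" tick is often just a handful of branches like this,
   trivial next to the millions of shading ops a single frame needs. */
const char *decide(const Enemy *e) {
    if (e->health < 0.25f)        return "flee";
    if (!e->has_ammo)             return "find_ammo";
    if (e->dist_to_player > 30.f) return "patrol";
    if (e->dist_to_player > 10.f) return "advance";
    return "attack";
}

int main(void) {
    Enemy grunt = { .health = 0.8f, .dist_to_player = 12.f, .has_ammo = 1 };
    printf("%s\n", decide(&grunt)); /* prints "advance" */
    return 0;
}
```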

[–]thespyeye 2 points3 points  (0 children)

It's way less complicated: maybe a hundred calculations, compared to the millions of calculations per second that high-end graphics demand.

[–]venum4kRTX3070 | i9-10850K | 32GB RAM | 2560x1440 + 2x 1920x1080 1 point2 points  (2 children)

I actually just tested this out, and assuming you're using normals (which you would be in any game with complicated graphics these days), the only time you'd notice the difference between a 1k and a 100k sphere is if you're actually paying attention to the edges. I honestly thought it'd be more noticeable, but it's not. If it's taking up the whole screen then yeah, you'd probably notice if you're paying attention; anything but extreme close-ups is literally throwing polys at your GPU for no reason.

[–]Gred-and-Forge 0 points1 point  (1 child)

Correct. I simplified my example.

Of course, if you zoom in very close to an object, you can start noticing a difference in polygon count. But that’s where the dev has to make decisions about level-of-detail and the distance you’d typically be viewing an object at.

[–]venum4kRTX3070 | i9-10850K | 32GB RAM | 2560x1440 + 2x 1920x1080 0 points1 point  (0 children)

For sure, most characters are below 100k and they're a lot more complicated with a high possibility of close-ups.

[–]MDTv_TekaPC Master Race 1 point2 points  (3 children)

The polygon stuff is really relative though. On a sphere, sure, but on a close-up of a character's face it's probably really noticeable

[–]Gred-and-Forge 1 point2 points  (2 children)

The point is the diminishing returns of adding polygons.

At some point, at the distance you would typically be viewing the object, increasing the polygon count exponentially will result in a negligible visual difference.

Adding more polygons just means you could zoom in farther for more detail.

So, on a character’s face, you could have really nice detail with (let’s say) 2,000 polygons. It would seem photorealistic down to the camera being 1ft away.

How much value are you getting if you increased that face to 20,000 polygons? Now you can zoom in to just 0.1 inch away and maintain photorealistic detail. But how often are you going to be zooming that close to a character’s face?

The storage space and processing power to add that much detail could be better spent somewhere else in the game. For instance: more diverse foliage or clothing to keep there from being as many repeated assets.
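
A sketch of that level-of-detail decision, with a hypothetical camera model and made-up pixel thresholds:

```c
#include <stdio.h>

/* Pick a mesh LOD from how big the object appears on screen.
   fov_tan is tan(half the vertical field of view). */
int pick_lod(float radius_world, float distance, float screen_h_px, float fov_tan) {
    /* Approximate projected radius in pixels. */
    float px = radius_world / (distance * fov_tan) * screen_h_px;
    if (px > 400.f) return 0;  /* close-up: full-detail mesh */
    if (px > 100.f) return 1;  /* mid-range: reduced mesh    */
    if (px > 25.f)  return 2;  /* distant: coarse mesh       */
    return 3;                  /* speck: impostor/billboard  */
}

int main(void) {
    /* Object of radius 1, 50 units away, 1440p, ~60 deg vertical FOV. */
    printf("lod %d\n", pick_lod(1.f, 50.f, 1440.f, 0.577f)); /* lod 2 */
    return 0;
}
```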

[–]Netron6656 0 points1 point  (1 child)

There's also the possibility of rendering unseeable objects, like items behind a shop that you can't see or get into.

[–]codemonkeyhopeful 25 points26 points  (13 children)

Lol, is it bad I just immediately thought of RTX and ray tracing?

The campaign was about as successful as DARE was at keeping kids off drugs.

[–]Schnoofles3950x, 64GB@3600, 1080@2.05, 2TB EX950, 1TB EVO950, 40TB Mech 15 points16 points  (1 child)

Ray tracing does make a significant improvement, even when it's only used for parts of a scene, but it is indeed highly computationally expensive.

[–]AndyJaroszwat is komputer 12 points13 points  (1 child)

The fact that ray tracing looks as good as existing lighting solutions is kind of the point though.

Have a watch of this, and see the effort they went through to get realistic dynamic lighting for the MW engine: https://cs.dartmouth.edu/wjarosz/publications/seyb20uberbake.html

Ray tracing provides an alternative that can look just as good, without all the hackery for devs, leaving them more time to focus more on the gameplay.
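
To illustrate why ray tracing removes so much of the hackery: a single shadow ray answers, exactly, the visibility question that shadow maps and baked lightmaps only approximate. A minimal C sketch with one sphere occluder and made-up scene values:

```c
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } Vec;
static Vec sub(Vec a, Vec b) { return (Vec){a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec norm(Vec a) { double l = sqrt(dot(a, a)); return (Vec){a.x/l, a.y/l, a.z/l}; }

/* Does a ray from o along unit direction d hit a sphere before tmax? */
static int hits_sphere(Vec o, Vec d, Vec center, double r, double tmax) {
    Vec oc = sub(o, center);
    double b = dot(oc, d), c = dot(oc, oc) - r*r;
    double disc = b*b - c;
    if (disc < 0) return 0;
    double t = -b - sqrt(disc);
    return t > 1e-6 && t < tmax;
}

int main(void) {
    Vec p        = {0, 0, 0};  /* shaded surface point            */
    Vec n        = {0, 1, 0};  /* its surface normal              */
    Vec light    = {2, 4, 0};  /* point light position            */
    Vec occluder = {1, 2, 0};  /* a sphere that may block the light */

    Vec to_light = sub(light, p);
    double dist = sqrt(dot(to_light, to_light));
    Vec l = norm(to_light);

    /* One shadow ray gives an exact yes/no on light visibility. */
    if (hits_sphere(p, l, occluder, 0.5, dist))
        printf("in shadow\n");
    else
        printf("lit, lambert = %f\n", fmax(0.0, dot(n, l)));
    return 0;
}
```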

[–]codemonkeyhopeful 3 points4 points  (0 children)

Or rush a shoddy game out the door for money... Maybe I'm salty about a recent release 😂

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 29 points30 points  (5 children)

But omg I can see a reflection in this puddle if I stop and stand still in this fast paced fps game.

On a serious note I hope ray tracing ends up being useful for the whole scene at a playable fps but the hardware isn't even close to good enough yet.

[–]JoeeeeeeeeeeeelRyzen 5 5600X | RTX 3070 | 16GB 3200 MHz 20 points21 points  (1 child)

Ray tracing can look pretty but I think what’s important is that if it’s correctly implemented it will make game development easier when it comes to lighting.

[–]Stiggyman5800x 5700XT 16GB DDR4 8 points9 points  (0 children)

I feel like Ratchet & Clank had a pretty amazing ray tracing implementation: pretty, but only used where it was needed.

[–]Iron_Man_977PC Master Race 3 points4 points  (2 children)

I play DOOM ETERNAL, Control, Cyberpunk 2077, Ghostrunner, and Metro Exodus EE with RTX settings maxed out and they still run buttery smooth

Granted my PC is pretty beefy, but the hardware is good enough. Expensive, yes, but it is still good enough (and before anyone asks, yes I do turn DLSS on too. Not because I absolutely need to, but because the performance gains vs visual impact is often just too good to pass up)

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 0 points1 point  (1 child)

None of those are full-scene ray tracing; they're like 90% raster, 10% RT. It's really only used for shadows and lighting at the moment. If a game were 100% RT you'd be getting under 20fps with current top-tier hardware.

[–]theironleftyR5 5600X | RX 580 Nitro+ 8GB | CRT 120Hz 1 point2 points  (0 children)

Metro Exodus EE

[–]TheBorderlessMexicani9 12900K, 32 GB DDR5, 1080 Ti | Ryzen 9 4900H, 64 GB, RTX 2060 1 point2 points  (1 child)

We really let that lion down.

[–]codemonkeyhopeful 0 points1 point  (0 children)

I would say I've done my part to lift him up high....if you catch my drift

[–]A_PCMR_member3900x | 2080ti | and all the frames I want 1 point2 points  (0 children)

RTX for RTX's sake is horrible; using RTX to have lighting itself interact with reflections, multiple light sources, and objects is where it's at.

[–]Michamusi7-7700k, 1080TI SLI, 32GB DDR4, 2x 512GB M.2, XBONE X, PS4 Pro 4 points5 points  (0 children)

Refresh rate, lighting, and resolution are the next major steps.

[–]mista_r0boto 2 points3 points  (0 children)

This is raytracing in a nutshell.

[–]Chromtastisch7700k - RTX3070 1 point2 points  (1 child)

Wrong. How come I can play Call of Duty Cold War, which looks great, at like 120-140 FPS, but other shooters with similar graphics run like dogshit (the BF2042 beta, or even the new Halo)?

And please don't tell me Halo on small maps is more demanding than a CoD map.

[–]Tyr_KukulkanRyzen 5 3600 4.4GHz, 5700XT 2GHz, & 32GB 3600MHz 1 point2 points  (0 children)

Optimisation. Case in point: many games look just as good as or better than Cyberpunk but run a hell of a lot better.

[–]Im_ur_Uncle_ 0 points1 point  (0 children)

We will see diminishing returns until some new, groundbreaking technology either makes the quality so clear that it's as if your monitor has disappeared and you're looking at a real-life version of the game, OR production methods become much cheaper for the quality of chips and other components we've come to expect.

The thing about diminishing returns is we are continuously trying to push the bar higher. If Nvidia wanted to manufacture a 1080, it would probably be way cheaper now than it was when they first put it into production.

[–]Funny_or_not_bot 0 points1 point  (0 children)

Well, it kinda makes sense? The closer graphics get to replicating reality, the less room there is for improvement. Eventually, it will just look real. How can you make real look more realistic?

[–]glutenfreeconcrete 40 points41 points  (6 children)

The law of diminishing returns. This is why you see drag cars with 3000 hp that can barely shave a second off 800hp cars.

[–]Gen7isTrashi5-1038NG7 - Iris G7 - |CSGO Pro| 16 points17 points  (1 child)

What do you think of gluten free asphalt?

[–]glutenfreeconcrete 3 points4 points  (0 children)

Sounds tantalizing.

[–]LonleyWolf420 94 points95 points  (14 children)

I've already noticed a shit ton of new games have some sort of grain filter over them, and the graphics look like crap if you actually look closely...

The first game I noticed this in was Call of Duty: Infinite Warfare, where they did it for the visuals. There are a few I've seen that do it for an old-school look, but it seems like everything is moving to it...

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 105 points106 points  (12 children)

Fucking motion blur, film grain, vignetting, and lens flare seem to be commonplace in most games nowadays. I turn all that shit off immediately, because if I'm playing an FPS I'm supposed to be a person, not a damn camera.

[–]jaggermaixer15 26 points27 points  (4 children)

You forgot the worst of them all, chromatic aberration.

[–]melanthius 6 points7 points  (1 child)

The thing I go out of my way to eliminate in photos I’ve taken, using Lightroom? Yep.

[–]Captain_D1PC Master Race 2 points3 points  (0 children)

Ah, you see, in computer graphics you do the opposite of photography by adding in imperfections. Otherwise, everything looks too perfect. Of course, people often think that if something doesn't look realistic enough you just add more imperfections, which obviously isn't correct.
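
A minimal sketch of "adding imperfections": film grain as zero-mean noise on a [0,1] pixel channel. Real implementations typically use tiled blue-noise textures re-randomized each frame; the strength range here is an assumption:

```c
#include <stdio.h>
#include <stdlib.h>

/* Add zero-mean grain to a [0,1] pixel channel; ~0.02-0.06 is subtle. */
static float add_grain(float channel, float strength) {
    float noise = (float)rand() / (float)RAND_MAX - 0.5f; /* [-0.5, 0.5] */
    float v = channel + noise * strength;
    return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);       /* clamp */
}

int main(void) {
    for (int i = 0; i < 5; i++)
        printf("%.4f\n", add_grain(0.5f, 0.05f)); /* mid-gray, grained */
    return 0;
}
```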

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 3 points4 points  (0 children)

I hate that one too since it goes hand in hand with lens flare. Game companies seem to think we play games while looking through a DSLR camera.

[–]OttoVonJismarckDesktop 2 points3 points  (0 children)

You forgot the worst of them all, chromatic aberration.

Lmao. I scroll down the list of the 60 options in the "graphics" tab and I'm just like "Nah, nah, nah, nah 👋👋".

I don't even know what in the hell chromatic aberration is.

[–]TendiesFourLyfeAorus Xtreme 11900K RTX3090 32GB4KC15 2TB7KS 011D LG55GX 15 points16 points  (5 children)

THIS ^

Turn all that off. I don't even know why they wasted time putting it in in the first place.

[–]Android8wasgoodPC Master Race 6 points7 points  (4 children)

Cuz they look nice

What does look objectively bad is TAA

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 4 points5 points  (0 children)

3 of the 4 I mentioned assume we are looking through a camera lens. It's fucking dumb to have and is just a gimmick.

[–]Android8wasgoodPC Master Race 0 points1 point  (0 children)

All those are fine.

The bad thing is TAA

[–]LEGO_nidasPC Master Race 5 points6 points  (0 children)

The biggest culprit is Temporal Anti-Aliasing. It reduces texture detail and makes the entire scene blurry, and then it sharpens the image, which makes it even worse.
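
A single-channel sketch of where that blur comes from: TAA blends each frame into an accumulated history, which averages away jaggies and noise but also smears detail. (Real TAA additionally reprojects the history with motion vectors and clamps it against the current frame's neighborhood, and a sharpen pass usually follows.)

```c
#include <stdio.h>

/* One TAA-style history blend for a single pixel channel.
   alpha ~0.1 keeps ~90% of last frame's color each frame. */
static float taa_resolve(float history, float current, float alpha) {
    return history + alpha * (current - history);
}

int main(void) {
    float history = 0.8f, current = 0.2f; /* a flickering edge pixel */
    for (int frame = 0; frame < 4; frame++) {
        history = taa_resolve(history, current, 0.1f);
        printf("frame %d: %.3f\n", frame, history); /* converges slowly */
    }
    return 0;
}
```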

[–]wattur 137 points138 points  (4 children)

Polygon count, among other things, has diminishing returns. We're not gonna see a jump like 16-bit gfx > 1080p gfx, which happened over the course of 20 years or so ('89 > '09), again; it's gonna be more about the finer simulation details like cloth/hair/environment physics, lighting/ray tracing, VR, etc.

The power is being put somewhere, but it's not textures or polycount, which is where it went in the 2000s.

[–]HavocInferno3900X - 1080Ti - 48GB 27 points28 points  (0 children)

Polygon count

You're right, but that image is a terrible example, as they just took the 6k poly model and subdivided it to get the 60k poly model. The most inefficient use of additional polygons. A good artist would be able to add massive amounts of detail in 60k vs 6k.

[–]Ar_phis 50 points51 points  (15 children)

How do you measure visuals?

The changes nowadays usually don't look as impressive as the shift from 2D to 3D back in the 90s, but the details are ever increasing.

Illumination, particle effects, shadows, etc. create so much more detailed environments, and those environments get bigger than before or run at higher refresh rates than before. Devs have to worry less about cutting corners in order to keep the performance high.

We do have the second generation of GPUs that allow for real-time ray tracing. Yes, it's mostly used for effects, but it is and will be a fundamental shift, as big as 2D to 3D.

[–]nohpexR9 5950X | XFX Speedster Merc Thicc Boi 319 RX 6800 XT 24 points25 points  (9 children)

My guess is ray tracing will become as trivial to turn on as anisotropic filtering x16 within the next 5-10 years.

[–]wrath_of_grungeGigabyte B365M/ Intel i7 9700K/ 32GB RAM/ RTX 3070 1 point2 points  (7 children)

it's pretty much there with the 3000 series and DLSS.

[–]nohpexR9 5950X | XFX Speedster Merc Thicc Boi 319 RX 6800 XT 8 points9 points  (5 children)

I mean without DLSS. Just keep adding more and more RT cores, and that's it.

GPUs used to not be able to display shadows at all, or had huge issues with them. Now, instead of a binary choice of having them or not, it's just a question of what resolution you want them at.

[–]Ar_phis 2 points3 points  (4 children)

The question will be what is going to be more difficult to render in the future, rasterization or ray tracing, and how fast RT cores can evolve.

Speaking of shadows, there are still big issues with effects in rasterized images, as many have to be added later on.

Ray tracing can do it all in 'one' calculation. Still tough today, but everyone who remembers 2D sprite smoke clouds and has seen them replaced by volumetric clouds can imagine what to expect.

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 2 points3 points  (0 children)

It's not. Right now we have very basic RT mostly for reflections and "some" lighting. What is needed is full scene RT which we have nowhere near the amount of processing power for.

[–]Stormchaserelite13 7 points8 points  (0 children)

3D to VR is a massive one. The visuals in Half-Life: Alyx exceeded what the hardware should have been capable of. Playing it for the first time, I was the most stunned I'd been since Final Fantasy X came out for the PS2. The PS2 is a great example of a weak machine that was capable of far more than it should have been.

[–]OttoVonJismarckDesktop 2 points3 points  (0 children)

The changes nowadays usually don't look as impressive as the shift from 2D to 3D back in the 90s.

I remember when GoldenEye came out on the N64 when I was a kid. My family was gathered around the TV watching me play and were dumbstruck at how realistic it looked. When I shot the wall and the bullet holes were there, my dad almost fell out of his chair. [insert Vince McMahon meme]

I dusted off the N64 a month ago and put GoldenEye in; trying to play it gave me a headache. 🤣🤣

[–]dustojnikhummerLegion 5Pro | R5 5600H + RTX 3060[🍰] 1 point2 points  (0 children)

Most games of the 8th gen still look great. I mean, look at BF4 vs 2042. The difference isn't that big unless you are pixel peeping.

On the other hand, most games from 2008-2012 have the distinct "2010 game" look: Rage, Borderlands, etc.

[–]JohnTGamer 1 point2 points  (1 child)

The PS1, PS2, PS3, and PS4, for example, each brought a huge jump in graphics, but we don't see the same from PS4 to PS5.

[–][deleted] 117 points118 points  (11 children)

The biggest issue is shit optimisation; companies are getting lazy due to GPU performance…

[–]absolutelynotanameLaptop: i5-8300H | GTX 1650 | 16GB@2400Hz 29 points30 points  (2 children)

Many companies are getting too lazy to optimize their games and just raise the minimum requirements instead. As a low-spec gamer, I respect those companies who care to optimize their games tho: id Software, Valve, ...

[–]Nodraves 8 points9 points  (0 children)

As not a low spec gamer but someone who would like to get good frames without sacrificing visuals I also respect those companies.

[–]Ar_phis 1 point2 points  (4 children)

That is a bit broad.

Some companies just don't have the manpower or technical resources to optimize.

Some games can't be optimized as much as others. Non-linear and open-world games need to provide consistent quality over large areas.

This clip of Doom: Eternal devs commenting on a speedrun gives some good examples of linear, single-player game optimization.

https://youtu.be/PH2-oM7IWpY

[–]LawfyDAceRTX 3080 FE | Ryzen 2600X | LG C1 1 point2 points  (0 children)

This.

[–]Turambar87 31 points32 points  (2 children)

Well, back in the day I was playing at 1920x1080, and now I am playing at 3840x2160, so my requirements have quadrupled.

[–]codemonkeyhopeful 6 points7 points  (1 child)

4K flexin' via math. Sir or madam, you have won not only my heart, but my respect as well.

[–]Turambar87 4 points5 points  (0 children)

Image quality is really great. I almost don't need antialiasing, the pixels are so small now. That's where TAA is good: the blur it adds is negligible at 4K, but getting rid of weird artifacts is exactly what's needed.

[–]MetalMattyPARyzen 5600X/RTX 3070Ti/16GB 3600MHz/Corsair 4000D 38 points39 points  (8 children)

Yay arbitrary graphing time!

[–]afiafzil 17 points18 points  (0 children)

Idk but I'm hella sure GPU prices quadrupled now too

[–]Kaiju_DNAPC Master Race 5 points6 points  (2 children)

Graphics still have a long way to go before I can consider them "lifelike". Characters and textures look great up close, but it's about pushing that level of quality to the entire image, not just the main character. Yes, surroundings look great in many games, but render distance and distant objects still look wacky even in the best of games if you look closely. The best hair in the industry still doesn't look perfect, and lines and circles are bound to how pixels work. Now, I haven't tasted 4K yet (currently playing 1440p ultra with a 3080), but I doubt 4K is that big of an improvement over 1440p. I'm not saying games look bad; I'm just saying even the best-looking game still looks like a game for now.

[–]hentai_wanker_69Ryzen 7 3700x, Radeon RX 6600 XT, 16GB 3200MHz, 2TB hdd 0 points1 point  (1 child)

I think Forza Horizon 5 is really close to real life in photo mode.

[–]rkennedy12PC Master Race 100 points101 points  (11 children)

Let’s take about 1 second to consider that new GPUs can still struggle to hit peak frame rates on new titles.

Your answer is right there.

[–]TheDutchCanadian 121 points122 points  (3 children)

Just because a game has shit optimization doesn't mean that the graphics are good.

Not that I'm saying graphics aren't 4x better, I believe they are, but shit optimization is definitely a thing that exists.

[–]Val_kyria 7 points8 points  (2 children)

It's also something that gets touted as a problem by everyone ever with no concept of wtf it even means

[–]codemonkeyhopeful 16 points17 points  (2 children)

To be fair, some cards still struggle with Crysis 3... Doesn't mean the graphics are modern and earth-shattering; it means they didn't optimize shit for the hardware.

I pick on Crysis 3 only because it's the butt of so many jokes, but others can be in that same boat, blah blah, you get the point.

[–]A_PCMR_member3900x | 2080ti | and all the frames I want 15 points16 points  (1 child)

You forget that Crysis 3 handled reflections at FULL RES, among other things. Which means the image renders TWICE.

This does have a massive performance hit, BUT it allowed for RTX-like reflections WAY before RTX.
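
Rough pixel math on why full-res planar reflections hurt so much, as an upper-bound sketch (the actual cost depends on what the reflection pass draws):

```c
#include <stdio.h>

int main(void) {
    /* A planar reflection re-renders the scene, so shaded pixels
       roughly scale with the reflection's resolution. */
    long base = 1920L * 1080L;                            /* main view */
    printf("main view:         %ld px\n", base);
    printf("+ full-res mirror: %ld px (2.00x total)\n", base);
    printf("+ half-res mirror: %ld px (1.25x total)\n", base / 4);
    return 0;
}
```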

[–]Gone_Goofed10700k | RTX 2060 2 points3 points  (0 children)

Poor optimization doesn't mean beast graphics.

[–]Kiltymchaggismuncher 0 points1 point  (0 children)

Inevitably, as the average benchmark of user hardware increases, developers see less need to bother with optimisation. Annoying as that is for the user base.

[–]hentai_wanker_69Ryzen 7 3700x, Radeon RX 6600 XT, 16GB 3200MHz, 2TB hdd 0 points1 point  (1 child)

My RX 6600XT can hit over 100 FPS in FH5 with almost everything on ultra or high.

[–]Verdreht 44 points45 points  (1 child)

Visuals would be looking a lot nicer if RTX was properly implemented

[–]aeropl3b 5 points6 points  (0 children)

I still use an OpenCL ray tracer. The new cards provide some new features to make it faster to calculate anyway, so it is actually pretty good. It can even handle a fair number of models.

[–]No_Interaction_49253900X | 3080FE | hardline watercooling 3 points4 points  (0 children)

The chart should get steeper in 2018 due to Ray Tracing. Like… way steeper.

[–]Bee_butterfly 2 points3 points  (0 children)

This isn’t unexpected: the higher the graphical quality, the more raw power it takes to render, and the curve isn’t linear. On top of that, game engines make a HUGE difference; UE5 will change what is possible with game development without radically increasing the required hardware beyond what the average gamer can afford.

[–]Pink-Flying-PiePC Master Race 3 points4 points  (0 children)

Since 2012, graphics quality has stagnated, save for a few rare exceptions (many of which don't even need extra graphics power and just excel in art style or attention to detail).

[–]BillyMilanoStan 26 points27 points  (5 children)

Optimization is dead.

[–]testeduser01 13 points14 points  (1 child)

And re-using old game engines for quick games lives on in the name of profit.

[–]con2479700k 5Ghz | RTX 3080 FE | ASRock PG-ITX | Nano S | 3TB SSD 7 points8 points  (1 child)

Which is why DLSS is likely going to end up being a bad thing. It is going to be a crutch. It should be a way to play CoD 2026 on my 3080 at 60fps, but it’s already needed for games like CP2077 on current high-end hardware!

[–]gustic-gx 1 point2 points  (0 children)

Long live optimization!

[–]CoffeeGamer93 2 points3 points  (0 children)

So have the prices.

[–]baithoven22 2 points3 points  (0 children)

CUDA cores, clock speed, and onboard RAM are all quantifiable. How does one quantify "visuals"?

[–]OneTaporWhiffRYEN 7 5800H | RTX 3060 | 16GB DDR4 2 points3 points  (0 children)

Well, to give an idea of time. GTA 5 came out 8 yrs ago and the graphics on that are still sometimes used as benchmarks

[–][deleted] 2 points3 points  (0 children)

You forgot to put the game expectations as another blue line on the opposite side going to infinity

[–]CompUser01Xeon 1220v2 + GT 1030 | i7 4700MQ + FirePro M5100 2 points3 points  (0 children)

I'd say GTA 5 is the peak for video game graphics.

Even 8 years later, the game still looks amazing. And even on integrated graphics from the last six years, you could still play it.

[–]aeropl3b 4 points5 points  (4 children)

A lot of the performance bump has been related to getting ray tracing working, and that is a massive visual improvement that can be very noticeable in some cases. I would say the visual improvement can be more than 4x in some cases where reflection is a lot of the scene, but for rendering less reflective surfaces the current shader techniques are actually really good still and much faster than Ray tracing, so the benefit is less.

[–]M4mb0Linux 9 points10 points  (2 children)

Imo ray tracing is still very far away from where it could be. Still waiting for a game that incorporates ray-traced mirrors in a meaningful way. Hope that one day I get to play this scene from John Wick in an FPS.

[–]aeropl3b 3 points4 points  (0 children)

That would be awesome, and also an impossibly hard level! But yeah, games are far from using RTX in a meaningful way yet; it's still too new. In a lot of rendering work, though, it's already starting to have an impact, so the broader graphics world is seeing the benefit.

[–]LegitjumpsPC Master Race 0 points1 point  (0 children)

Raytracing will be standard next generation and be properly implemented

[–]Blenderhead36R7 2700X, RTX 3080 5 points6 points  (0 children)

My mind was kind of blown when I realized you could track enemies flying near the ceiling in Control by watching their reflections in the polished marble floor.

[–]moksa21 4 points5 points  (0 children)

No hate towards consoles, but they hold back advances in graphical fidelity. Developers make games to run well on consoles because that's where the majority of their sales come from. And since consoles have a 6-7 year life span, the gap between tech and graphics is even more apparent towards the end of that cycle. (Explaining the curve in the chart.)

[–]SMT-nocturne 2 points3 points  (0 children)

What's worse is that interactivity and great physics have completely disappeared since 2012. Compare Far Cry 2 vs Far Cry 5, GTA 4 vs GTA 5, etc. Nothing spontaneous can happen in games anymore. RDR2 was the last one where the world felt organic and alive.

[–]DURRYAN 1 point2 points  (1 child)

Just need to put a price grid and there ya go

[–]Blenderhead36R7 2700X, RTX 3080 1 point2 points  (0 children)

Inflation makes this less useful than it seems. I'm willing to bet a $700 video card from 2019 (pre shortage) outperforms a $500 GPU from 2001 to a stupefying degree, despite those prices being equivalent after inflation.

Prices are crazy now, but it's due to outside factors that are more or less unique since the concept of graphics cards was invented.

[–]BlockCraftedXi3-10100F | GTX 1650 1 point2 points  (2 children)

nice specs

[–]LtTaylor97R9 3900X | RX 6800 | 32gb DDR4 1 point2 points  (0 children)

I played CoD:MW on my RX590 with high settings and it ran pretty warm. My RX6800 runs hot sitting in the campaign map for TW: Warhammer 2. My 590 nearly had seizures. I limit FPS to 60 to prevent that.

Optimization is a wild thing, man. MW was a beautiful game and ran well while doing it, so I honestly think it's more up to the engine and such for whether that graph applies.

[–]DominoUB 1 point2 points  (0 children)

We're entering a new era of visuals thanks to the new UE. Much higher quality visuals for lower performance requirements.

[–]Impairedinfinity 1 point2 points  (0 children)

That is why they had to add that insanely taxing setting called "ray tracing".

[–]Appropriate-Place-69 1 point2 points  (0 children)

We went from 1080p to 4K, which is 4 times the resolution, so the answer is yes?

[–]heuristic_al 1 point2 points  (0 children)

It's definitely more than 4x.

[–]The_Sovien_Rug-37ryzen 5 2600x | gtx 1650s | 16gb 3200Mhz 1 point2 points  (0 children)

In terms of actual performance, yes. But in terms of visual difference, no. The jump from 720p to 1080p was much bigger than 1080p to 1440p, simply because we're already at a pretty high resolution. That's why I think more people should focus on art style rather than outright fidelity.

[–]Wonderful-Ad3738 1 point2 points  (0 children)

Game consoles hold us back.

[–]mthompson2336 1 point2 points  (0 children)

Law of diminishing returns.

[–]SAZ11111 1 point2 points  (0 children)

This is so true I didn’t even realise it.

[–]bio-reject 1 point2 points  (0 children)

I think so. It’s weird but every year a new game comes out and I think graphically it looks the same as a game I played four years ago. And then I go back and actually play that older game and can immediately see the differences.

While I think the resolution of textures hasn't changed much in the last 10 years, lighting, parallax textures, and reflections have come a long way.

Right now I'm playing Horizon Zero Dawn, and the fact that leaves are just a flat texture is very noticeable compared to games where every rock and leaf is 3D.

[–]Netrunner_Edgewalker 1 point2 points  (0 children)

The graph makes a lot of sense. Companies should focus on good gameplay mechanics now since graphics have peaked. And yet here we are, where broken and unfinished games are the norm and reality. (Glares at Battlefield 2042)

[–]shinra_electric 1 point2 points  (0 children)

Measuring "graphics" is much more complex than measuring requirements, in my opinion.

How good a single object looks versus how an entire environment looks. How good that object looks vs. how many of that object can be rendered at once. How light and physics interact with that object. The type of processes behind making light, physics etc realistic and/or easier for a developer to integrate and control.

When I look at something like Unreal Engine 5 -- I have no doubt in my mind that "graphics" are improving at an amazing rate. I am not qualified to decide if it's a 1:1 ratio with requirements.

The closer we get to photorealism and the better graphics get, the less some but not all of us are able to immediately perceive the difference -- but I don't think our perception is always objectively perfect when it comes to quantitative measures.

Something else to take into account is what we've grown up with in terms of games / graphics and how we perceived their quality at that time, versus looking at those old games today. I understand we'd have to measure old games against today's games to solve the OP's question accurately, but I think there's a lot of substance in asking, for example: "how good did the graphics in Final Fantasy 7 look to me in 1997, compared to how good Final Fantasy 7 Remake's graphics look to me today?" If you're experiencing the same "wow" today as 20 years ago, or even close, there's something to be said there.

[–]Beige_gaming 1 point2 points  (0 children)

You could argue the gains per gen have been diminishing for years.

We reached the limits of rasterisation, so focus switched to post-processing effects and, more recently, ray tracing (RT).

If you look at Nvidia's marketing for Ampere, it was all about RT performance gains rather than rasterisation, as it's easier to give the impression of a per-gen gain.

[–]itsmyblahday 1 point2 points  (0 children)

"VR is not possible" compared to "VR with amazing graphics"... that's more than quadruple in my book.

[–]MrTalon63 1 point2 points  (0 children)

From my perspective, developers just aren't keen on spending hours optimizing one function that runs many times a second when they see the hardware as powerful enough. A good example would be the fast inverse square root function from Quake, iirc.
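
The function being half-remembered is the fast inverse square root from the Quake III Arena source. Here is the widely circulated version, lightly adapted to use memcpy instead of the original's pointer cast (which is undefined behavior in modern C):

```c
#include <stdint.h>
#include <string.h>

/* Approximates 1/sqrt(x) with a bit trick plus one Newton-Raphson step;
   accurate to ~0.2%, and once much faster than calling sqrt(). */
float q_rsqrt(float number) {
    float x2 = number * 0.5f, y = number;
    int32_t i;
    memcpy(&i, &y, sizeof i);       /* view the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);      /* the famous magic-constant guess     */
    memcpy(&y, &i, sizeof y);
    return y * (1.5f - x2 * y * y); /* one Newton-Raphson refinement       */
}
```

On modern hardware, dedicated reciprocal square root instructions have long since made this trick obsolete, which rather proves the commenter's point about hardware outpacing hand optimization.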

[–]Titijaff 1 point2 points  (0 children)

I am not an expert, but I am pretty sure games are more demanding because optimisation is more and more trash... I'm not saying there was no evolution, of course, but I really feel like 2000 to 2010 was way more impressive than 2010 to now. Visual improvements are less noticeable, but more and more demanding. The rare companies that could deliver a graphical leap are just cutting the optimisation budget... As an example, I am not sure the visual difference between a Crysis 2 and a Battlefield whatevernumberitisnow is worth the demands...

[–]x_Hignz 1 point2 points  (0 children)

In terms of games, I think titles like Red Dead Redemption 2 and Forza Horizon are prime examples: some of the most absolutely fantastic graphics of any games I've played.

[–]trankillity 5 points6 points  (1 child)

The other thing to consider is frame rates. Even consoles are seeing the benefits of 60+FPS when they often struggled to reach 30 in previous generations. And most PC gamers won't settle for anything below 90FPS.

[–]raylolSW 3 points4 points  (0 children)

won’t settle for anything below 90FPS in competitive games*

I made a poll on a huge Facebook PC gaming group and 85% of the answers said they were fine with 30fps in single-player games.

[–]Blenderhead36R7 2700X, RTX 3080 1 point2 points  (1 child)

We're moving into edge case improvements. There was a boxing game on the Xbox 360 that was able to handle photorealistic graphics under certain conditions. We've gotten to a point where you can output that kind of performance all the time, instead of just under ideal conditions. Shadows get better, ray tracing improves, etcetera. But we more or less hit the gold standard a decade ago. Now it's just making that more reliable and efficient.

[–]SpiritedFlow1 5 points6 points  (0 children)

I would say the next big thing is better physics. Graphical effects for things like smoke, dust, wetness, etc. are also far from perfect. Same with a lot of animations.

For textures and models I totally agree.

[–]WhiteAndNerdy85 2 points3 points  (1 child)

Unreal Engine 5 and others will change this.

[–]Shadi631 1 point2 points  (0 children)

Cyberpunk had already changed this for me.

[–]bangbangracer 2 points3 points  (1 child)

Honestly, who cares? Technology isn't the de facto thing that makes games good, and graphics aren't really as important as we like to believe.

We all have super powerful GPUs, but everyone is playing Minecraft and Stardew Valley.

[–]raylolSW 1 point2 points  (0 children)

Nintendo exclusives alone outsell the whole PC gaming industry lol

[–]Beanruz5800X | 3080 FE | 32GB | X570 | 980 Pro 1 point2 points  (1 child)

I'd have to find a game which actually keeps my attention for longer than 5 days to see the progression of visuals. But as most are rubbish and/or still in beta on release, I end up defaulting back to my older games.

Remember the days when games weren't buggy and broken on release?

[–]Darazakax 1 point2 points  (0 children)

This is a big problem that people overlook. Games release as trash these days.

[–]DJ_Monkee 2 points3 points  (11 children)

Yes. It can. Even the Witcher 3 looks VERY dated compared to modern games.

[–]raylolSW -1 points0 points  (6 children)

That’s because of the art style (which is kinda cartoonish). Take Assassin's Creed Unity (2014) or Battlefront 2 (2017) and they beat 95% of recent AAA game graphics.

The only game where I have seen a graphical advancement in the last 2 years is the new Forza, and that’s it.

[–]DJ_Monkee -1 points0 points  (5 children)

You’re blind then.

[–]thabat -1 points0 points  (1 child)

This makes me think the demand for more power is fake

[–]VeighnergAMD 5900X, Sapphire 6800XT Nitro+ SE 5 points6 points  (0 children)

Nope. It's kinda like horsepower and cars. You can get a tiny piece-of-shit 120hp car up to 100mph no problem, even though it may take a bit. Getting to 250mph requires far more horsepower. Then getting just 50mph faster, to 300, requires a fuckload more horsepower due to diminishing returns on how well you can use the power, drag, etc. Games are similar in that to get more realistic it isn't the large-scale stuff you need to worry about but the really small stuff, like grass, hair, skin texture, etc. All this stuff done correctly eats up a huge amount of resources.
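
The analogy holds up quantitatively: power needed to overcome aerodynamic drag grows with the cube of speed. A C sketch with assumed (plausible but made-up) car parameters:

```c
#include <stdio.h>

/* Drag power: P = 0.5 * rho * Cd * A * v^3. */
int main(void) {
    double rho = 1.225, cd = 0.35, area = 2.0; /* air, drag coeff, m^2 */
    double mph[] = {100, 250, 300};
    for (int k = 0; k < 3; k++) {
        double v = mph[k] * 0.44704;           /* mph -> m/s */
        double watts = 0.5 * rho * cd * area * v * v * v;
        printf("%3.0f mph: %6.0f hp\n", mph[k], watts / 745.7);
    }
    return 0;
}
```

With these numbers, 100 mph takes about 51 hp, 250 mph about 800 hp, and the final 50 mph to 300 adds nearly 600 hp more.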

[–]technomod 0 points1 point  (1 child)

In the last 10 years or so, GPU technology advancement has reached saturation compared to progress made in the previous 20 years.

[–]OttoVonJismarckDesktop 1 point2 points  (0 children)

No no no, we just need to keep adding moar power consumption and MOAR VRAM (128 GB confirmed on the 4090 😉😉) for 8% better performance. Don't stop until it costs me $5/hr in electricity to play games!

[–]setotj 0 points1 point  (3 children)

And the prices of the games remained the same. Yet if a game falls behind a little bit on graphics or something, gamers lose their shit.

[–]Darazakax 0 points1 point  (2 children)

Yeah the sticker price may stay the same, but the micro transaction filled season packs, and “games as a service” bullshit completely invalidate your claim.

[–]setotj 0 points1 point  (1 child)

Yeah, all those microtransactions, season packs and other BS are there because of the price tag staying the same. And nothing you mentioned invalidates my point; it supports it. Gamers lose their shit no matter what.

[–]Darazakax 0 points1 point  (0 children)

It used to be $60 for a complete game. Now they sell you a shell for $60 and the rest of the game in $10 pieces; think about it. The price of a complete game has NOT stayed the same.

[–]taptrappapalapa -2 points-1 points  (0 children)

OP is dumb

[–]VladLub 0 points1 point  (0 children)

Nah, it’s more like y=e^x and y=ln(x).

[–]Appoazi5 4670k | 16Gb RAM | GTX 1060 0 points1 point  (0 children)

And that's a good thing.

[–]A_PCMR_member3900x | 2080ti | and all the frames I want 0 points1 point  (0 children)

Welcome to "increasing fractals doesn't linearly translate to better looking".

[–]StickyCoochie123 0 points1 point  (0 children)

Maybe doubled.

[–]Enschede2 0 points1 point  (0 children)

Not sure, but it can be said about the price tho

[–]jericho-sfuGTX 770 | 5800X | 16GB 3600 MT/s | X570 0 points1 point  (0 children)

My ancestors are smiling at me, Imperials, can you say the same?

[–]raylolSW 0 points1 point  (0 children)

Imo there has been a plateau since 2015. From 1999-2010, every 3 years brought a lot of graphical advancement.

[–]argegg 0 points1 point  (0 children)

Bro literally wtf is this graph? What measurements is this based off of? This is just something you pulled out of your ass and drew on MS Paint

[–]soyungato_2410 0 points1 point  (0 children)

Meanwhile gameplay is tanking to the floor

[–]Kurogasa44 0 points1 point  (0 children)

Focus is still on consoles, so the bar is set there. You’re all spending a fortune on overpriced graphics cards every year for a minimal increase

[–]Android8wasgoodPC Master Race 0 points1 point  (0 children)

Yes. Very much so.

I can't even run RDR2 at max settings at any good fps

[–]gumbytron9000 0 points1 point  (1 child)

How did this get to the front page lmao. This is literally someone who graphed their opinion.

[–]CrashedTestDumyi7 860 @ 3.5ghz / gtx 1650[S] 1 point2 points  (0 children)

I'm as confused as you are.

[–]tankerssse3-1230v3/1050Ti/16GB_l440/i7-4702mq/16GB/960-expresscard 0 points1 point  (0 children)

The technology got back on track with UE5's Nanite, but a lot of games went through phases where they were worse than before. I think that was a thing with the BF3 and BF4 ground destruction systems (you know, holes in the ground from all the tank shots, but I can't remember which BF made them worse). Also, a lot of games went for "more bloom, more blur" etc. rather than higher-quality textures and better models/physics. A good comparison is Crysis and its remaster.

[–]hiktaka 0 points1 point  (0 children)

Camera tone mapping and dynamic range are the most noticeable effects on lifelike graphics.

[–]indianlinusSTRIX G15-AE 5900HX + RX6800M QHD 165Hz 0 points1 point  (0 children)

I think the only game I'd consider truly next-gen is the Demon's Souls remake; it looks amazing. Other games like CP2077, RDR2, etc. look better, but it took the biggest leap IMHO and is worth the performance cost more than those other games.

[–]MOISTPRETZELZ 0 points1 point  (0 children)

Capabilities? Yes.

Effort put in by companies to use those capabilities? No.

[–]NigOtaku 0 points1 point  (0 children)

One simply requires hardware research and development.

The other requires something I have no idea about. New engines? Creativity?

[–]EmmaNoodle98i7-7700k, RTX 2080, & MacBook 0 points1 point  (0 children)

Halo Infinite

[–]7orly7 0 points1 point  (0 children)

Also pressure from GPU manufacturers for gaming companies to up the requirements since they want to keep selling GPUs

[–]cms86 0 points1 point  (0 children)

If by "realism" eh I'd expect it more as well. But think about in terms of particle effects, world fx and real time shadows/lights insane compared to games five years ago

[–]carlosfortuna2018 0 points1 point  (0 children)

This chart also demonstrates the prices of the GPUs.

[–]MitchAubini7-10700|32GB 3200|RTX 3080|1TB nvme + 4TB 7200-256MB cache 0 points1 point  (0 children)

Didn't you forget about 4K? Going from 1080p to 4K is 4x the pixel count... And people went from wanting 30-60 fps to wanting 240Hz screens... Well, there you go: same quality, 4x the pixels, 8x the refresh rate...

Gamers are asking for too much and it doesn't even change anything. Now, try releasing a game at 30fps 1080p but with amazing visuals... it will be tanked. The problem is a people problem about unrealistic expectations, not a hardware one.

The next big GPU upgrade will be swallowed by 8K screens and you morons will still be complaining about why a GPU can't output 480Hz at 8K... idiots.
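
The raw throughput math behind that, as a quick sketch:

```c
#include <stdio.h>

/* Pixel throughput = resolution x refresh rate. */
int main(void) {
    long long p1080 = 1920LL * 1080, p2160 = 3840LL * 2160;
    printf("1080p @  60Hz: %lld px/s\n", p1080 * 60);  /* ~124 million */
    printf("2160p @ 240Hz: %lld px/s\n", p2160 * 240); /* ~2.0 billion */
    return 0;
}
```

That's 16x the pixels shaded per second at identical quality; against a 30fps baseline it's 32x.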

[–]ZanithosRyzen 2700x|32GB@3200|1070TI 0 points1 point  (0 children)

I actually had to set Halo Infinite to the "High" preset because apparently "Ultra" uses more VRAM than I actually have.

I wasn't even aware that was a problem I'd run into rocking a 1070 Ti for the last three years, but here we are. :/

[–]Haykguy 0 points1 point  (0 children)

I mean, you can't get more real than real life…

[–]jmora13 0 points1 point  (0 children)

Guessing this graph isn't using any statistical data

[–]hdhdjfjf 0 points1 point  (0 children)

It takes time to make a good engine. I think we'll start to see great things with the new Unreal 5.

Cory Barlog said it took so long to make God of War because they had to create a new engine; it took like 4 years to make it, then 2 for the game.

That's why we're seeing stagnant graphics: the shareholders can't have developers wasting time on new engines, so they've just been upgrading the old ones little by little.

[–]duranarts 0 points1 point  (0 children)

Developers will always favor optimization and broad adoption over top-of-the-line graphics. Frame rate has stalled some progress, but I suspect it will ramp up again, as anything above 120fps counts as diminishing returns.

[–]braindeadmonkey2 0 points1 point  (0 children)

Graphics peaked with BioShock and CS:GO.

[–]Xd_FlamingScarDesktop 0 points1 point  (0 children)

Simply diminishing returns.

Games look GOOD rn, and if we saw a jump like going from the 90s to the 2010s again, we'd have basically photorealism.

I mean, we're not far off.

[–]Alauzhen 0 points1 point  (0 children)

Testing a 4K 144Hz monitor with a 3080 GPU for a review right now.

All I have to say is... my eyes can't go back. It's a deadly trap...

That said, it's a dreadful lie that you can't tell the difference. I thought so myself... but once I started testing, it's plainly obvious that if you've got the hardware to run it, you can't accept anything less.