Why have graphics hardly improved in 12 years?

Attached: crysis.jpg (960x600, 258K)

Because the video games market has become 75% console + mobile

Hardware is (wait for it) hard to improve.

How do modern games look worse than Crysis and run way worse than Crysis? I remember playing this with contemporary tech and it ran decently.

no it's not, just make more and smaller transistors forever and ever

crysis does not look nearly as good as modern games

OP here,

Brainlet question, but have programmers gotten sloppier and sloppier to the point they need overpowered hardware just to make modern games look on par with Crysis?

Give this man the Nobel prize

look back at games 12 years prior to crysis (1995) and tell me graphics have come as far from 2007 to today.

>smaller transistors
Meanwhile, powerful GPUs in 2019 are even bigger than powerful GPUs from 12 years ago

games today are optimized better than crysis but it takes a lot of extra GPU muscle to squeeze out visual gains

never said they made as much progress as in the previous 10 years, just said they made progress
compare it to Battlefield 5 or something and the difference is pretty clear

Aiming for photorealistic graphics is fucking expensive and the result is often not great. You can make a game look good spending way less if you aim for a more stylized art direction.
Also, hardware isn't evolving nearly as fast as it once was.

Look up "Software Crisis"... We've been there once already

you can't just slap more polygons on something anymore and call it a day

This, and this is why we need raytracing

True, but AMD toddlers will disagree.

>never said they made as much progress in the previous 10 years
You're right, we've made more.

> you can't just slap more polygons on something anymore and call it a day

300 KILO OF POLYGON ASS

Attached: gMnj0Mq.jpg (539x657, 45K)

AMD already said their future GPUs will eventually support it.
AMDrones will be the first to use the "AMD does raytracing for cheaper" line when they release. Just wait

Bayonetta 2 on Cemu is the best looking game I have played this year, on my 270€ laptop with a 950M at 720p, 30-50 fps. Games can look amazing without realism

Companies discovered you don't need ultra realistic graphics to make enjoyable, fun, profitable games.

>You can make a game look good expending way less if you aim for a more stylized art direction.
This. Not every game must be photorealistic. I wish more devs would acknowledge it.

I think the capability of technology has exceeded the average skill of most developers and programmers. And what began as a niche industry built on the backs of a very talented few has become an almost automated industry filled with barely trained bodies. Certain skills have potentially been lost with time, just like in the animation business, which no longer has anyone left capable of high-quality 2D animation even if they wanted it.

That's right.
You only need a multiplayer game with a Battle Royale mode, DLCs, and microtransactions

>Spend millions trying to make super duper real game
>"next-gen" blockbuster hit game
>it does poorly because people's expectations were too high
>lose millions
>meanwhile 1 guy makes a game like undertale in his spare time
>2d sprites
>makes millions
>does well and is getting a sequel

GEE, I WONDER WHY.

only a handful of games have been good/memorable since 2007, i disagree.

Plus microtransaction/loot box cancer.

The difference between 20 and 200 polygons on a player model was more significant than the difference between 200 and 2,000, or 2,000 and 20,000.

Diminishing returns, I guess, so it's hard to see.
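
A rough sketch of why (toy numbers of my own, using a regular N-gon as a stand-in for a round silhouette):

```python
import math

# Toy measure of perceived detail: worst-case gap between a regular N-gon
# inscribed in a unit circle and the circle itself, which is 1 - cos(pi/N).
for n in (20, 200, 2000, 20000):
    err = 1 - math.cos(math.pi / n)
    print(f"{n:>6} polygons: max silhouette error {err:.2e}")

# 20 -> 200 closes a gap of ~1.2% of the radius; 2000 -> 20000 is fighting
# over millionths of the radius, well below anything a pixel grid can show.
```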

to be real, the foliage is very detailed and all, but the interior scenes do not look good at all

Attached: Untitled-1.jpg (2560x1440, 480K)

crysis doesn't look that good; if you actually play it you will see the age. foliage looks good but a lot looks really bad
it also runs like garbage, only really uses 1 thread (at most 2), and can't handle many actors or a good draw distance
there are plenty of games that could do the same thing crysis did and release software that hardware can't run. the 3D models used in offline renders are far beyond what can be done in real time, but just shitting those into a game like crysis did doesn't mean much

Peak was reached long ago, realism is dogshit.

When a game becomes too real it starts feeling less like a game and more like an interactive movie.

more significant in the sense of perceived detail i should say.

>pic related

Attached: polygons.jpg (1280x720, 171K)

>realism
>good looking
pick one

Attached: 142055.jpg (1195x2048, 365K)

Yes and no; graphical fidelity is moot when the world itself is dead, imo. They've spent insane amounts of money on graphics because we covet what we see; interactivity has suffered because of it, and now what we have are highly detailed and thoroughly dead or otherwise completely mundane worlds. Nothing on the input end has changed since the early 80s.

This all breaks apart when you start animating it. The 25k poly model will obviously move more fluidly.

realism brainrot

Retard

We're nowhere near diminishing returns on anything but character models. Go into any game and you will see huge flat surfaces everywhere. All they have is normal maps to try and fake detail but they don't come anywhere close to proper displacement mapping.
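
A minimal 1D sketch of that difference (my own toy example, not any engine's code): normal mapping only bends the shading normal, so the geometry, and thus the silhouette, stays flat; displacement actually moves the surface.

```python
import numpy as np

# Flat 1D "wall" with fine detail stored in a height map.
x = np.linspace(0.0, 1.0, 64)
height = 0.05 * np.sin(8 * np.pi * x)

# Normal mapping: geometry stays flat, only the lighting normal is bent.
slope = np.gradient(height, x)
normals = np.stack([-slope, np.ones_like(slope)], axis=1)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
light = np.array([0.6, 0.8])
shading = normals @ light                      # looks bumpy under light...
silhouette_normal_mapped = np.zeros_like(x)    # ...but the edge is still flat

# Displacement mapping: the vertices themselves move.
silhouette_displaced = height                  # the edge actually has the bumps

print(float(shading.min()), float(shading.max()))   # shading varies
print(float(np.ptp(silhouette_normal_mapped)),      # 0.0: dead flat
      float(np.ptp(silhouette_displaced)))          # ~0.1: real relief
```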

If it progressed the same as it did back then, video games would look more real than reality, which is impossible.

is this why the game runs like ass?
heheheheheheheheheheheheheheheheheheheheheheheheheheheheheheh

no, it just looks bad. I laugh at people who say pixel art is "oversaturated" in indie games but have no problem with reAAAlism, when it's the art style of reality; nothing is more overdone than that

Only on winblows, runs great maxed out on lignux AMD

And when you can't make smaller transistors,
build more and smaller cores forever and ever.
And when you can't do even that,
just submerge the entire thing in mineral oil just above freezing point and overclock

Why are you retards so obsessed with graphics? The question should be why games have stayed the same for the past decade within the same relative scope.
The only game really pushing boundaries is Star Citizen, but that is a mismanaged piece of shit and will never come out; even "AAA" developers like Rockstar just focus on making games that look pretty within the same limited confines of what console CPUs can accomplish.

Do you get mad because movies look like real life? Imitation of reality is obviously a separate axis from visual artistic expression.

I think we've already peaked in video game graphics. A realistic looking game is pointless if your input is limited to button presses.

It's time to develop VR further until we reach full body immersion.

bro how could you ever say video game graphics have peaked when there still is not a single video game that looks like real life

>but have no problem with reAAAlism when it's the art style of reality, nothing more overdone than that
Are you claiming you don't see God rays all the time when walking in the street?

Crysis was coded from scratch for the game and optimized.

Modern AAA games just import 50 GB of libraries and use 10% of it

Recording realistic video with a phone is minimal effort; achieving similar but worse visuals by spending millions of dollars and hours on HD textures and models is pointless when there are SNES games that look better than any recent AAA game

That would be rock bottom. Perfect realism is the death of art.

Because if they wanted to make a game that looked like real life they could, it's just not worth the resources, especially since it still wouldn't feel 'real' due to your limited interaction with it.

Attached: 1.jpg (2700x1619, 513K)

Crysis has bad graphics.

Sure, I'm just saying it doesn't look like games have improved as much, but I bet if you look at total polycounts and animation frames they have, significantly

I think this really sums it up. You used to always hear about the learning curve for devs on consoles like PlayStation and Nintendo because the hardware was custom made. You could usually see the difference in quality from launch titles to games made towards the console's EOL. Now most of it is standard PC parts, and it just seems that the devs don't have any "tricks of the trade" to really push the hardware like they used to.

Personal preference is not an argument. You have no grounds to pretend to be superior to people who have the opposite preference. You'll see that you're in the minority if you look at how well games with realistic graphics sell.

Obviously not true given that game companies spend millions of dollars on realistic graphics and still can't come close. The image you posted doesn't look like real life either.

>The image you posted doesn't look like real life either.
Real life looks worse, you're right.

We've reached the limits of rasterization.
Fully ray-traced games are the future.

>unrealistic shadows and poor contrast
That looks bad compared to Exodus' raytracing

That looks like shit. It has dense vegetation, so retards think it has good graphics when it really doesn't.

>Real life looks worse
And that's why anime waifus and furry thots will always be superior to Human roasties

They haven't? That looks like a PSP game

What game is this from?

I think Louis Castle, of Westwood C&C fame, had the right idea. He said they would put out an ad looking to hire artists with 10+ years' experience.
Artists applied, found out it was for computer games, and said they didn't really know how to use a computer, much less program. To which Louis replied,
>I can teach you how to program, that's the easy part, but I can't teach you how to "see" the world as an artist. That takes life experience. Programming is the easy part.
Look how those C&C games came out. Still hailed today as what a good game should be.

Attached: louiscastle.jpg (316x235, 14K)

>2019
>studios still not using performance capture so human characters in their video games stop moving like bulky robots

almost every AAA game uses motion capture for everything important

>all these retards citing art direction

Not an argument. Crysis was an AAA game (which had plenty of artistic direction, by the way; realism and art direction are compatible) and tried to wow with a very good looking jungle. It was a huge step up from even just Far Cry, the game that came before it.

Battlefield 5 just doesn't look visually impressive next to Crysis; it's at best an incremental step up, and it's on the same axis as Crysis: an AAA game trying to wow with realistic environments.

Most of the culprit here is middleware. Everything runs on Unity, UE4, whatever, with specific tools used for any number of tasks. Lots of games use the CPU version of PhysX, for instance. The dev isn't making the decisions concerning code; they're buying someone else's code and trying to adapt it.

Can't think of a single game that doesn't use performance capture, even low budget indies do it because it's so cheap.

Then why are character motions still so unnatural?

In real life, contrast is shit. Everything is too bright. The surroundings are dull.

Why would anyone want to replicate that? I can't even find a picture of a decent parking lot to compare this to, because real life just looks that shit.

Attached: 1298767.jpg (1200x960, 162K)

>A realistic looking game is pointless if your input is limited to button presses.
>It's time to develop VR further until we reach full body immersion.
It's pointless but for other reasons.
The hardware just isn't powerful enough. Look what 1440p does to one of the best GPUs money can buy at this point:
youtu.be/Zfxs6PjjXyc
Even 1440p isn't there yet, forget 4K and VR gaming.

Name a real example.

>real life looks like shit
That's because you need to go outside to see the beauty of the world, something developer nerds never do.

youtu.be/2NsdM1VS5u8

Developers spend weeks squeezing out single cycles in shaders and percentage points of culled triangles. They're not poorly optimized; it's just a hard problem.

>Westwood
And then EA came around.

Anything you want: Battlefield 5, Far Cry, Assassin's Creed, ...

Diminishing returns.
And a lot of improvements have come in areas that aren't as clear in static images, like lighting and animations.

If you're piecing it together with middleware then it's poorly optimized. Take modern fighting games, for example. They frequently use UE4, which also means they all ship with 7 frames of input lag, something that has so far required years of patching to fix. Fighting games cannot accept more than 4 frames of input lag.

The error was in the choice of middleware, and then in the inability to fix the issues quickly.

This goes past graphics, though graphics are still affected.
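
A toy model of where those frames come from (the stage count here is illustrative, not UE4's actual pipeline): every buffered stage between input sampling and scanout is one more frame of lag, regardless of frame rate.

```python
from collections import deque

# Each stage (sim, render submission, GPU, display queue) buffers one frame.
PIPELINE_DEPTH = 4                      # hypothetical, for illustration
pipeline = deque([None] * PIPELINE_DEPTH, maxlen=PIPELINE_DEPTH)

for frame in range(8):
    pipeline.append(f"input from frame {frame}")         # newest work enters
    print(f"frame {frame}: screen shows {pipeline[0]}")  # oldest work exits

# Input sampled on frame 0 first reaches the screen on frame 3. Patching the
# lag out means collapsing stages, which is hard when they aren't your code.
```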

diminishing returns
>pic related

Attached: 3BA0C380-B45D-4291-A430-BA500144D969.jpg (549x275, 29K)

All of those games have good human animations, much better than the majority of the rest of the industry.

middleware that does not run on the GPU has nothing to do with GPU performance
input lag has little to do with graphics

Why does ray traced lighting still look worse than faked GI like in AC Unity or Battlefront 2?

Physically based rendering is mostly to blame for the stagnation in graphics

Attached: rage 1 vs rage 2.jpg (1920x2157, 573K)

Yes, clyde, we know. The whole damn world knows. You don't have to remind us, again.

Nier: tomato soup

Because you can't just import motion data and have it look good in game without an animation team tweaking it. Not to mention the difficulty of blending animation data smoothly: abrupt changes in direction, sudden interactions, etc. That costs a lot of money and a talented development team to implement properly, which is why only top-tier AAA developers manage it.

Because it's barely functional. Current hardware simply can't handle it yet.

Pareto Principle in effort.
You can go 80% of the way with 20% of the work.
But getting that last 20% of the way needs 80% of the work.

Seeking perfection is asymptotic. You'll never get there, and you'll exert more and more effort for smaller and smaller gains.
There's a bit of revolution still to be had (raytracing, for example), and that's going to be significant. But better lighting won't improve texture quality or motion capture. It's ultimately going to be incremental evolution.

Attached: asymptote.png (401x378, 10K)
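
To put toy numbers on that asymptote (made-up model, mine: assume each unit of effort removes a fixed 20% of the remaining visual error):

```python
import math

# If each unit of effort removes 20% of the *remaining* error, the effort
# needed to reach a target error grows without bound as the target shrinks.
r = 0.20
for target in (0.20, 0.05, 0.01, 0.001):
    effort = math.log(target) / math.log(1 - r)
    print(f"within {target:.1%} of perfect: {effort:5.1f} units of effort")

# ~7 units get you 80% of the way; the last fractions of a percent cost
# several times that again. The curve flattens but never touches the line.
```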

>watermark
anon...

Attached: 1406567882111.jpg (500x447, 19K)

Untrue. Graphics are affected by any number of things outside the GPU. Lots of decisions with game environments are predetermined long before we get to graphics. A simple example would be a space game where 90% of the screen is just a skybox. The graphics are going to look considerably different, even if all we're talking about is the other 10% of the screen.

>raytracing
>significant

Attached: 1565386292448.png (753x960, 29K)

BF2 and AC:U don't have fake GI; they have offline-baked GI, which is still higher quality than anything realtime. They spend hours on a whole server farm ray tracing the light bakes.
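
For reference, a bare-bones sketch of what one of those bakes does per lightmap texel (my own stand-in scene and tracer, not Frostbite's pipeline): Monte Carlo integrate the light arriving over the hemisphere and store the result in a texture.

```python
import math, random

def trace_ray(origin, direction):
    # Stand-in for the server farm's real ray tracer: a bright "sky"
    # above a cutoff angle, darkness everywhere else.
    return 1.0 if direction[2] > 0.3 else 0.0

def bake_texel(position, samples=1024):
    """Cosine-weighted hemisphere integration for one lightmap texel
    (surface normal assumed to be +Z to keep the sketch short)."""
    total = 0.0
    for _ in range(samples):
        u1, u2 = random.random(), random.random()
        r, phi = math.sqrt(u1), 2 * math.pi * u2
        d = (r * math.cos(phi), r * math.sin(phi), math.sqrt(1 - u1))
        total += trace_ray(position, d)
    return total / samples    # written into the lightmap, sampled at runtime

print(bake_texel((0.0, 0.0, 0.0)))
```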

This
I think OP forgets that there are a lot of games with different engines and art directions. An aesthetically beautiful game doesn't require photorealism. And besides, I think most people would want to be reminded that the game they're playing isn't reality. A part of playing a game is the escapism of it. Why would you escape to a world that's 1:1 with reality?

Whatever happened to the megatextures hype? I played Rage after all the patches were out, and it looked great and had fantastic performance on a Radeon 5770. Does Rage 2 have megatextures?

It's a better way to render light. It is both more elegant development-wise and looks more realistic. It also leaves fewer places for bugs to hide, whereas rasterized alternatives have many more.
It's just a better mousetrap.
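
The "elegant" part is easy to show: a whole (unshaded) renderer fits in a few lines, with one intersection routine doing all the work (toy sketch of mine, ASCII output instead of pixels):

```python
import math

def hit_sphere(origin, d, center, radius):
    # Solve |origin + t*d - center|^2 = radius^2 for t (d is unit length).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(a * b2 for a, b2 in zip(oc, d))
    c = sum(a * a for a in oc) - radius * radius
    disc = b * b - 4 * c
    return None if disc < 0 else (-b - math.sqrt(disc)) / 2

for y in range(10):
    row = ""
    for x in range(20):
        d = (x / 10 - 1.0, 1.0 - y / 5, 1.0)          # ray through the pixel
        n = math.sqrt(sum(v * v for v in d))
        t = hit_sphere((0, 0, 0), tuple(v / n for v in d), (0, 0, 3), 1.0)
        row += "#" if t is not None else "."
    print(row)
# Shadows, reflections, GI: all just more rays through the same routine,
# instead of a separate hand-tuned hack per effect like rasterizers need.
```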

ok reddit

Well argued.

Honestly I'd rather have a game with proper storytelling, an interesting hook, and a world full of interactions, with just OK current-generation graphical fidelity.

Then why bother? Sure it's easier for the developers but why would I give a fuck about making their jobs easier when they are fleecing us whenever possible, all the while reporting record profits year after year?

Rage 2 isn't even on id Tech 6. It's a soulless cash grab. True megatextures went out with Doom 2016 because GPUs are fast enough now to do tons of decals in realtime without a big performance hit. Virtual texturing is the core technological innovation in megatextures, and most engines these days use some form of it.
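
For anyone wondering what "virtual texturing" actually means here, a minimal sketch of the idea (names and sizes are mine, not id's code): one huge virtual texture, split into pages, with an indirection table redirecting lookups into a small resident cache.

```python
PAGE = 128                  # texels per page side (illustrative)
VIRTUAL_SIZE = 1 << 17      # e.g. a 131072 x 131072 virtual texture

indirection = {}            # virtual page coord -> slot in physical cache
physical_cache = {}         # slot -> that page's pixel data
next_slot = 0

def request_page(px, py):
    """First use streams the page in (stand-in for disk I/O + transcode)."""
    global next_slot
    if (px, py) not in indirection:
        indirection[(px, py)] = next_slot
        physical_cache[next_slot] = f"pixels for page ({px}, {py})"
        next_slot += 1
    return indirection[(px, py)]

def sample(u, v):
    tx, ty = int(u * VIRTUAL_SIZE), int(v * VIRTUAL_SIZE)
    slot = request_page(tx // PAGE, ty // PAGE)
    return physical_cache[slot]   # a real shader would offset within the page

print(sample(0.5, 0.5))           # faults the page in
print(sample(0.5001, 0.5))        # nearby texel: served from the same page
print(len(physical_cache), "page(s) resident")
```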