>pinnacle of graphics in gaming released in 2007
>graphics have barely improved since 2007
>at most they even got worse
>yet for some reason you now need 8GB of RAM, a 4GB GPU and a 6-core 3.8GHz CPU to run modern games on medium
Not really, it was a pinnacle in the normie market and probably the last game to actually raise the bar a little. Ever since then (and already at the time), games have become dull because they stopped innovating.
We had better graphics than Crysis when it released and we still do now. Those specs are pretty low, even; the problem with modern games is bad optimization, combined with them having to be able to run on the lowest configuration.
Isaac Flores
>2019
>Still can't max out Crysis on a single GPU
Asher Lewis
2 (in words: TWO) GB of RAM to run Crysis on max settings.
New kinds of engines have been made. Higher resolutions. Textures in games are much better. New graphics tech, for example ray tracing, anti-aliasing (it's old but now at least optimized), volumetric lighting and all that stuff. Bigger maps, so you need more resources. Smarter AI, better programming and much more.
>We still have better graphics now and did when Crysis released
Name one and post a picture.
Michael Ross
Console APUs have been worse than a Q6600 since 2013, and their GPU is a mid-range card from 2012.
Consoles.
Cameron Baker
Graphics-wise, nah, not much has changed; better lighting and higher res are the only answers we got.
Robert Jones
Compare this to 4K EA Star Wars Battlefront on Endor.
John Diaz
Witcher 3? Even shitty MMOs like ESO? There are so many I can't even bother to list them all. Take your pick; Crysis was impressive as a tech demo game, but it was barely atmospheric or good-looking.
Google your own pictures/videos.
If you're talking about when Crysis released: I was talking about the graphics possible on a consumer desktop. There were plenty of actually photo-realistic demos that rendered in real time.
Mason James
>no transparency AA
Okay fag
Bentley Cooper
Unironically this. As shitty as modern EA games are, something like Battlefront 2 (the new one) running at 4K looks just stunning; Crysis looks like a game from 12 years ago next to it.
Aaron Thompson
Diminishing returns. There's a much larger difference between a 50 poly model and a 550 poly model than there is between a 550 poly model and a 1050 poly model
John Harris
>4k ea star wars battlefront endor
You mean PRE-RENDERED BULLSHIT like this? You seriously believe they're not going to downgrade it?
>Higher resolution. Textures in game are much better.
Higher resolution/better texture resolution are the result of technological progress in both display and memory technologies.
>ray tracing
Already a thing since the 90s
>anti aliasing
Same as higher res
>volumetric lighting
Lighting again
>Bigger maps so you need more resources.
Same as higher res
>Smarter AI, better programming and much more.
Not graphics related.
So pretty much no progress, except that we can now put more pixels on the screen so it looks less rough.
Bentley Howard
Witcher 3 is shit; even W2 had better graphics.
Camden Davis
Well, I have to agree on this one. But still, we went from Crysis to fucking fortnut graphics. It looks like it came out of a 90s cartoon. And then we have the new Need for Spirts and other high-tech games. It has changed, but not a lot.
Eli Russell
>.jpg
Ethan Price
>muh compression
Ethan Lewis
Well, yes. But graphics can change. Like, look at Minecruft. It doesn't really have god-tier graphics, but it still looks good for a blocky graphics style. Graphics depend on the developer and their experience. But yeah, I agree again. We haven't moved a bit.
Jason Rogers
>rear wheel on vehicle not rendered
Carson Perry
You should also mention that past this point, the computing power you have to throw at extra polygons grows much faster than any visible improvement they buy you.
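To put a rough number on those diminishing returns: treat a model's silhouette like a circle approximated by an n-gon, and the worst-case error shrinks roughly with 1/n². A quick Python sketch (the circle stand-in is just an illustration, not how engines actually measure detail):

```python
import math

def ngon_error(n, r=1.0):
    # Worst-case gap between a regular n-gon and the circle it
    # approximates: the edge midpoint sits at r*cos(pi/n), so the
    # error is r * (1 - cos(pi/n)) ~ r * (pi/n)^2 / 2.
    return r * (1.0 - math.cos(math.pi / n))

for n in (50, 550, 1050):
    print(f"{n:5d} polys -> error {ngon_error(n):.2e}")

# 50 -> ~2.0e-03, 550 -> ~1.6e-05, 1050 -> ~4.5e-06.
# The first 500 extra polys cut the error ~120x; the next 500
# only cut it ~3.7x, which is the diminishing-returns point above.
```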
Ethan Morgan
Crysis has to be the game that leverages 4K the most. All this foliage everywhere really doesn't like 1080p. I was blown away the first time I tried it.
Elijah Smith
W2 had better atmosphere. W3 has more "realistic" graphics.
Take your pick, it's not a definitive answer.
Levi Jackson
this looks like diarrhoea
Jordan Reyes
>pinnacle of graphics in gaming released in the 1980s
>graphics have barely improved since the 1980s
>at most they even got worse
>yet for some reason you now need 8GB of RAM, a 4GB GPU and a 6-core 3.8GHz CPU to run modern games on medium
And don't even get me started on holographic games, which haven't existed since the hit arcade game "Time Traveler". It's like we took 10 steps back and never progressed since the 80s/90s.
Add better lighting (the only improvement since 2007) and it would look like a 2019 game.
Ryder Lee
>being this retarded
holy shit
Alexander Powell
You know, you're not forced to post if you don't have arguments, user.
Chase James
No arguments are needed against retards like you. Phones these days have better graphics than the game in the OP, but autistic idiots don't see it. It's like face-blindness, but for graphics instead.
Matthew Gray
>phones these days have better graphics than in OP
Imagine believing this.
Julian Hughes
Whatever you say zoomer aspie
Isaiah James
Because they know they have their addicted customers by the balls. Nothing needs to be optimized. Gaymers not only have an inelastic demand for gayming products, they end up becoming shills, little marketing drones that work for free.
Bentley Brooks
Graphics faggotry is retarded; games have looked good enough since like 2001. Modern graphics are a waste of system resources.
Cooper Bennett
Developers got lazy. No one is actually doing any low level engine work anymore.
Josiah Ross
>recommended = max settings
Jason Evans
>You mean PRE-RENDERED BULLSHIT like this?
3D scanning is the newest fad. Just like photo-realistic textures were a fad in the Max Payne / Mafia 1 days, it's going to take some time before game creators learn how to tone it down and make it blend well with the overall style of the game. Battlefront and current games using 3D-scanned assets are only the first generation; that's why they look so uncanny/fake. When Battlefront came out, nobody even knew how to properly de-light 3D-scanned albedo maps; now we do. Source: I work with 3D-scanned assets for games and VFX.
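For anyone wondering what "de-lighting" means: the raw scan still has the shoot's lighting baked into the colour map, and you divide an estimate of that shading back out to recover a flat albedo. A minimal numpy sketch of the idea only (the file names and the AO-based shading estimate are placeholder assumptions; real de-lighting solvers are far more sophisticated):

```python
import numpy as np
import imageio.v3 as iio

# Hypothetical inputs: the scanned colour map plus a baked
# ambient-occlusion map used as a crude stand-in for the shading
# that was present at capture time.
captured = iio.imread("scan_color.png").astype(np.float32) / 255.0
ao = iio.imread("scan_ao.png").astype(np.float32) / 255.0

# Crude shading estimate: uniform sky light modulated by occlusion.
# Real pipelines also solve for directional light and bounce.
shading = np.clip(ao, 0.05, 1.0)[..., None]  # avoid divide-by-zero

albedo = np.clip(captured / shading, 0.0, 1.0)
iio.imwrite("scan_albedo.png", (albedo * 255).astype(np.uint8))
```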
The reqs are bullshit. Crysis is shittily optimized, and you can make a 2019 high-end build choke down to sub-60 fairly easily because of it. On a 2008 high-end build it's 720p high with no AA, and you will still spend half the game in the 20fps range with massive stuttering fairly often.
Eli Nguyen
Godrays lmao
Colton Morales
Will Zen 2 finally allow AMD users to run Crysis at 60fps?
I completely agree with this. Turn on everything possible in Crysis and that high-end modern rig becomes mediocre.
Cooper Jenkins
Diffusion, godrays, better fog algorithms, better AO algorithms, better AA algorithms, better lighting techniques, clothing/hair bone structures and of course AAA games today push way more polygons.
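Godrays, for what it's worth, are one of the cheaper tricks on that list: essentially a radial blur of a bright-pass buffer toward the light's screen position. A rough numpy sketch of the classic screen-space version (buffer and parameter names are made up for illustration):

```python
import numpy as np

def godrays(bright, light_xy, samples=32, decay=0.95):
    # March every pixel toward the light's screen position,
    # accumulating the bright-pass buffer with a falloff weight.
    h, w = bright.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    lx, ly = light_xy
    acc = np.zeros_like(bright)
    weight = 1.0
    for i in range(samples):
        t = i / samples
        sx = np.clip((xs + (lx - xs) * t).astype(int), 0, w - 1)
        sy = np.clip((ys + (ly - ys) * t).astype(int), 0, h - 1)
        acc += weight * bright[sy, sx]
        weight *= decay
    return acc / samples
```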
Nolan Reed
So much bullshit in those requirements.
I ran Crysis on my old build: Pentium D 950, WinXP, 2GB RAM, 768MB Quadro FX1800 (basically a GeForce 9600GT), and it ran like ass.
My current build doesn't run it much better, though I can turn on a few more options (Xeon X3220, basically a C2Q 6600, 4GB RAM, 2GB GeForce GT710, Win7), and my workstation (i5 4570, 16GB RAM, 1GB Radeon HD 8570) runs it even better, but there's no way I can approach turning everything on unless I like playing games at 1-3fps.
Joshua Powell
Buy a not-shitty PC?
Michael Gray
>phones these days have better graphics than in OP
Show me proof of that, you retard.
Julian Brown
Crytek has always made poorly optimized games. They look great for the time, but when you play games that look similar but are released a few years later, they run much better.
Ian Wilson
Far Cry never had much issue running halfway decent; I played through on a P3 @ 1GHz / GeForce 2 MX. Crysis 3 has really good multithreading and ran really well on FX as a result.
Thomas Perez
I'm not a gaymer, and everything else runs well enough. You're missing the point here: the specs were utter bullshit. A C2Q with a GT710 (which, despite being low-end, is significantly more powerful than a 9600GT) does not deliver even a decent game experience, nor does a Haswell i5 with a GPU faster than a former "sweet spot" GPU (my 8570 is quite a bit faster than a GT730, which was once lauded for being great for its price point).
Hunter Wood
>is significantly more powerful than a 9600GT
Nope. Worse fillrate and no memory bandwidth; the 710 is a dumpster fire of a card.
Hudson Cox
I may be wrong about this, I never modded Crysis, but engines today are capable of using a "PBR" pipeline, which is to say they can handle several maps at once for a texture. You're talking metalness, roughness, normal, bump. When you're pushing 4K textures this starts eating up memory and cycles, since you're using numerous variables in the shader instead of a single flat diffuse map. Subsurface scattering wasn't integrated back then. AO wasn't either as far as I'm aware (though I may be wrong), and that's also an intensive process depending on the method. I can't speak for everyone, but I was playing on a 1280x1024 display; that's about 1.6x the rendered real estate of 1024x768.
Of course there's just the straight-up texture and model resolution as well. I don't know what leaps and bounds have been made in the polygonal department, but textures are easily pushing 4K if not more in certain aspects of games, like environment tiles. You're talking a huge leap with every increase in texture res. Compression probably plays into it as well.
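To put rough numbers on the memory point (a back-of-the-envelope sketch; the four-map set and BC7 compression are assumptions for illustration, not any specific engine's layout):

```python
# Rough VRAM cost of one 4K PBR texture set (assumed maps:
# albedo, normal, roughness, metalness).
RES = 4096
MAPS = 4
BYTES_RGBA8 = 4     # uncompressed: 4 bytes per texel
BYTES_BC7 = 1       # BC7 block compression: 1 byte per texel
MIP_FACTOR = 4 / 3  # a full mip chain adds ~33%

def set_size(bytes_per_texel):
    return RES * RES * bytes_per_texel * MAPS * MIP_FACTOR

print(f"uncompressed: {set_size(BYTES_RGBA8) / 2**20:.0f} MiB")
print(f"BC7:          {set_size(BYTES_BC7) / 2**20:.0f} MiB")
# ~341 MiB vs ~85 MiB for a single material. A 2007-style 512x512
# diffuse-only map was ~1.3 MiB by the same arithmetic.
```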
And this generation of consoles is probably just paralleling the 8800's of the time if we're being honest.
Connor Long
All that shit's destructible too. All of it.
Hudson Sullivan
>And this generation of consoles is probably just paralleling the 8800's of the time if we're being honest.
Not a chance. youtube.com/watch?v=OcnYrSB9xvM
Ryder Martinez
Games have not improved much because PC hardware hasn't improved much. Simply adding shitloads of cores that are nearly the same speed isn't helping gaming in the slightest. You are seeing the end phase of a dead-end technological development. If individual cores could double and triple their processing speeds like they did during the PC video game heyday, you'd see game graphics developing far beyond what you can imagine right now. If the speed increases had kept following the curve of PC gaming history, you'd have single cores clocking in around 20GHz right now in the bargain-bin section.
There are ways to create better-looking games on current technology, but that would require devs who actually know how to code and who don't rely on copy/pasting. They'd need the intelligence to actually optimize their game for the hardware and reduce its performance bloat. That simply isn't going to happen.
Smarter AI eats up resources for everything else. You either have godlike AI or godlike graphics, but not both currently.
Minecraft may look blocky, but it is actually one of the better graphics technologies. It uses voxels. Voxels are extremely resource-intensive. The terrain in Crysis is also voxel-based. We've had voxel-based games since the 1990s, though. Try playing "Comanche" for some old-school voxel stuff. Voxelstein 3D shows how you can use voxels for bodies and everything in the entire game.
Brayden Collins
>Minecraft may look blocky, but it is actually one of the better graphics technologies. It uses voxels.
It uses normal 3D polygons with textures.
John Flores
>Minecraft is a sandbox video game that uses voxels to store terrain data, but does not use voxel rendering techniques. Instead it uses polygon rendering to display each voxel as a cubic "block." Minecraft also contains "entities," such as the player and creatures, that exist outside of the voxel data structure.
en.wikipedia.org/wiki/Voxel#Computer_games
Aiden Flores
>Games have not improved much because PC hardware hasn't improved much
lol, more like the so-called PCMR would go apeshit if their new RGB-filled setup couldn't run the game. I mean, look at how much they bitched about RTX, and you can still get 60fps with that shit.
Luis Carter
>but does not use voxel rendering techniques. Instead it uses polygon rendering to display each voxel as a cubic "block."
The terrain is stored as voxels, since the world is made up of cubes, but when you're playing the game, you're rendering cubes like any other game would render cubes.
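A toy sketch of that storage-vs-rendering split (pure illustration, nothing like Minecraft's actual code): the world lives in a 3D array of block IDs, and the renderer only emits ordinary polygon quads for faces that border air:

```python
import numpy as np

SIZE = 16
NEIGHBOURS = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]

# Voxel part: storage only. A 16^3 chunk of block IDs (0 = air).
chunk = np.zeros((SIZE, SIZE, SIZE), dtype=np.uint8)
chunk[:, :4, :] = 1  # four layers of "stone"

def visible_faces(grid):
    # Polygon part: walk the grid and emit quads only for faces
    # exposed to air (or the chunk edge); hidden faces cost nothing.
    faces = []
    solid = grid != 0
    for x, y, z in zip(*np.nonzero(solid)):
        for dx, dy, dz in NEIGHBOURS:
            nx, ny, nz = x + dx, y + dy, z + dz
            outside = not (0 <= nx < SIZE and 0 <= ny < SIZE and 0 <= nz < SIZE)
            if outside or not solid[nx, ny, nz]:
                faces.append(((x, y, z), (dx, dy, dz)))
    return faces

print(f"{int((chunk != 0).sum())} blocks -> {len(visible_faces(chunk))} quads")
# 1024 blocks -> 768 quads instead of the naive 6144: only the
# shell gets rendered, which is why blocky worlds stay cheap.
```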
Asher Edwards
You can't run modern Wolfenstein with AMD hardware. It just doesn't work: way below 60 FPS. It's actually unplayable. It all has to do with operating systems and optimization techniques that don't translate to modern hardware.
Connor Campbell
Realistic my ass
Luke Howard
I'm still using a PC with hardware from 10 years ago and everything runs fine on all modern games at default settings and res. The only thing I had to upgrade was from HDD to SSD for installs, so that games that need to constantly load sandbox terrain would render it at a proper speed.
Andrew Johnson
I would say that the gaming industry became more of a normie business. Now investors pump certain companies full of money expecting a quick buck. Most game studios nowadays don't even try to experiment anymore, since it's perceived as a financial liability. Also, many assets are reused in order to build new ones and push products out as fast as possible, so the time spent on optimization is always close to the bare minimum.