VRAM

If I play at the lowest settings possible, maybe even with a config file tweaked for the highest fps, will 6GB of VRAM still be enough in 10 years' time?

Will only be playing at 720p btw, but a difference in res doesn't affect VRAM that much; the difference from 720p to 4K is only about 500MB.
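
Rough back-of-the-envelope on why the res gap stays small: the render targets themselves are tiny next to texture data, which doesn't change with output resolution. This assumes 4 bytes of colour plus 4 bytes of depth per pixel and three buffered frames; a real engine's G-buffer and post-processing targets multiply this a few times, which is where the few-hundred-MB gap comes from.

```python
# Rough render-target-only estimate. Texture memory (the bulk of VRAM use)
# stays the same at any output resolution, which is why the 720p -> 4K gap
# is small. Assumes 4 bytes colour + 4 bytes depth per pixel, 3 buffered frames.

def render_target_mb(width, height, bytes_per_pixel=8, frames=3):
    return width * height * bytes_per_pixel * frames / 1024**2

print(f"720p: {render_target_mb(1280, 720):.0f} MB")    # ~21 MB
print(f"4K:   {render_target_mb(3840, 2160):.0f} MB")   # ~190 MB
```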


I could upgrade to 12GB of VRAM with the same GPU performance, but it would cost 70% more.

I think I'll still have driver support for an OS that's supported to run games until like 2040. How long could 6GB last vs 12GB? Or would they last the same, because you'd hit processing limits before VRAM ones?

Attached: stf,small,600x600-c,0,0,1000,1000.jpg (559x451, 65K)

Reason I ask is that in 2012 people were asking how long 6GB would last, and now in 2018 the max any game uses is 4GB... yet we will have 8/16GB cards soon.

Depends entirely on the game
Most games in existence run just fine even with onboard graphics.
You have to be more specific.

frogposting
reddit spacing
crying
poorfaggotry
Fucking bingo! You won a suicide. Now go neck yourself.

Reason I still think I'll have driver support for a relevant OS that long is that the 8800 GTX from Nvidia, a 2007 card, is the oldest card to have Win10 drivers, and Windows 10 will probably run games until 2027.
But then a mainstream game like WoW dropped DX9 support in 2017, so that is only 10 years. On the other hand, the GeForce 400 from 2010 has DX12 support and also has Win10 drivers, and WoW might have a DX12 mode until 2030 or even later, so the lifespan might be even longer.


Asking how long 6GB will last on a 10+ year old card might be a dumb question, but I'd like to know if I should spend 70% more for 12GB. Thanks.

I mean any game. I'm asking if 6GB of VRAM will be enough to run the hardest-to-run games in 2028 on lowest settings, with config edits to make them run on weak decade-old GPUs.

Also, if you're wondering about the level of performance of the card, it's like a Titan V. Sure, a 6GB Titan V doesn't exist, but imagine if it did.

I used a 1gb vram card until 2016 at 1440p and it was barely an issue

Stop playing games you nigger. Right fucking now.

Attached: 1533426531741.jpg (250x140, 3K)

Yeah, I question if it will even matter: by the time games go over 6GB, my card might be so old that it won't make a difference. I'll probably be upgrading the CPU and everything else, so the PC will be optimal in all other ways, just with a really old GPU.

I'm more worried that games in like 2025 or something will require 8GB to even launch. But then 7 years ago people were freaking out over whether 6GB would be enough, and it turns out 2GB is enough for most things, with a few games using 4GB, and I'm not even sure that affects performance much if you only have 2GB and your card is already old.

this is my 1252nd pepe.
thank you frogposter.
i will give you a random one out of my collection

Attached: 0980pepe.png (318x308, 50K)

inferior smoker pepe

Attached: 1534084600078.png (399x322, 12K)

ive got plenty of smoker pepes, but that was more of a Hunter S. Thompson pepe

Attached: 0057pepe.png (500x485, 192K)

>doesnt state what games he wants to play

Attached: 1527123885015.jpg (321x340, 51K)

I don't know what games will exist in 2028 idiot.

The only games that matter schoolfag

>GTA
>Counter Strike
>KSP
>BeamNG

it isn't possible to enjoy games after your early 20s

>720p in 2028

Yikes...

Well, I could increase the res a bit if the OS or a game increases its minimum supported res. Windows is 800x600 minimum atm and some games are 1024x768 minimum.

In 10 years I could see that lifting to 1080p perhaps, but even that's doubtful, as lots of 720p and 1300x800 laptops and such are still being made.

But increasing the res doesn't really help me answer the question; like I said, the VRAM usage difference between 720p and 4K is only 500MB or so, about half a gig.

PLZ HALP

Attached: 1532796579344.jpg (680x521, 63K)

by 2025 your current gpu probably won't even work in latest games. for all we know dx13 will be out

Like I said, a GeForce 8800 GTX could run WoW for 10 years, and a GeForce 400 from 2010 can run the DX12 WoW mode that was just added, which should stay a feature until 2030, so that card will technically run the game for 20 years.

I don't think it's unreasonable to assume cards from 2018 will be able to run games for 10-15 years. DX12 is dying out and Vulkan/Mantle is taking over, and all Nvidia cards from 2014-2019 will have the same feature set, DX12_1. That's six years of cards with the same support; that's never happened before, things have slowed down.

Wouldn't be surprised if the cards from today last even longer than the 8800 GTX or GeForce 400 I mentioned.

But yes, obviously if games suddenly require a new feature, like Shader Model 3.0 did in 2009, which invalidated top AMD cards that were only 1-2 years old, that would suck. Same if everything goes ray tracing or something. But I think both of those situations are more unlikely now.

Once the PS5/Xbox-next come out we'll probably have a better gauge of how long 9/10/20 series Nvidia cards will last. But then again, the SM3.0 thing was patched into the Xbox 3 years after launch, so maybe it's hard to tell right away.

An HD 6970 2GB can still run BF1 at 70fps at 1080p, and that's an 8-year-old card.

Attached: 1533700980850.jpg (500x492, 32K)

Yeah, I really hope my GTX 970's meme magic will hold its own for years to come as well

I think performance will stop your 970 being usable before its feature set does. I'm more wondering about high-end cards like 980 Ti SLI or 1080 SLI or a 2080. All have 6-8GB of VRAM, all are about as powerful as the 2080 (if you ignore the ray tracing), a card that came out in 2018, and all are DX feature level 12_1 with the same OpenGL/Mantle/Vulkan support.

Sure, they are from 2015/2016/2018 respectively, but they are all basically the same card (ignoring shitty SLI support, which might improve with Vulkan/Mantle supporting it natively and DX12 being able to do a better version of SLI).

Assuming all this, how long could these last?

Sure, you were 100% required to have a DX9c SM3.0 card in 2009 for Black Ops and 100% required to have a DX10.1 card for BF3 in 2011, but things were advancing faster back then, imo. We have had DX12 cards for 9 years and DX12_1 cards for 5 years. That huge install base makes me think DX12 will last longer. Also, considering the Xbox One is only DX11.1, it's possible the next Xbox will be just DX12.1 or something and not DX13.

That means we could have DX12 games until 2028, and even then that would be 18 years of DX12 hardware, so even in the 2030s it would make sense to have a DX12 mode even if the game is DX13/14 or whatever.

You also have to consider OpenGL and Mantle/Vulkan. They seem like they will upgrade standards slower and have wider backwards compatibility because they are popular with indie devs who don't have publishing deals with Xbox etc., and they are becoming more dominant because PlayStation/Nintendo is killing Xbox.

Performance-wise I just don't know how things are advancing. A 2080 could still run games at 60fps 1080p for 10-15 years, I think; things really are not advancing that fast. Like I said, an HD 6970 2GB will probably run BF6/Battlefront 2020 at 60fps in 2020. That's a crazy long lifespan and imo would mean a 2080 could probably run BF10 in 2028 at 60fps too.


Obviously some standard could change, like with SM3.0 in 2009 or DX10.1 in 2011.

But how likely would that be? I notice games now have the ability to run in many modes, and the jump to SM3.0 and 10.1 seemed more like an artificial one, like doing the work for older cards wasn't worth it, or it would just look so different that it would ruin the experience, so they didn't support them. (I think you could actually trick BF3 into running in DX9 somehow, and possibly trick Black Ops into running without SM3.0, but it would look like dog shit.)

I'm willing to do lots of hacking and config edits to keep using this level of performance as long as possible. Assuming the performance lasts long enough and the feature set doesn't fuck them over, at what point would the VRAM be a problem? Or would the aging performance drop to a degree that having half the VRAM required wouldn't cost that much extra performance?

I just haven't run into many VRAM problems in years and years, but I wonder if games will come back that "require" 8GB of VRAM to even boot, or will that not happen.

Attached: 1532998892200.png (636x511, 327K)

6 GB will not be enough in 10 years. Ten years ago, you could choose between 512 MB and 1 GB of VRAM. Even if you went with the top of the line back then and had a 1 GB card, that would be pushing it today.

I work in gamedev and I can tell you that the direction games are going in is photogrammetry pipelines coupled with smart materials and on-the-fly LOD generation. This makes it easier and faster than ever before to make good quality assets that scale well, but the downside is that it's increasingly VRAM hungry. What we'll see is a push for more VRAM in workstation cards due to machine learning and GPGPU needs, and this will trickle down to prosumer and consumer cards due to the memory chip fabs only pumping out the same dies, combined with binning etc.

So, to sum it up, 6 GB won't cut it in 2028, not by a long shot.

Ooof. Nice edge

You could get a 9600 GT with 2GB of VRAM in 2009. 95% of games don't use over 2GB atm, even at 4K, and about 5% or even less use 4GB.

What you're saying doesn't really line up with reality. Maybe you had a 256MB card in 2009, but not everyone did.

I get what you're saying though, but my bigger question is: will 1080 SLI or 2080 level performance drop so low that 6/8GB of VRAM won't matter? If you need more, will the card be obsolete before the RAM is? The reason I ask is I can get 1080 SLI level performance with 2 overclocked Titan X Maxwells that have 12GB of VRAM, but they cost 70% more for a similar performance level when OC'd.

Also, while games use 2/4GB atm, the games that use 4GB are not unplayable on 2GB cards, and I'm not sure they even lose that much fps. Like maybe 10% less or something.

Basically I'm asking: if I want to survive on 1080 SLI level performance for as long as possible, will 6GB or 12GB be better? Would the upgrade in VRAM not matter in the long run, because I'll be getting like 50fps anyway and double the VRAM would only lift my fps by 5 frames or something in 2028?

WTF are you talking about? What do binning and chip fabs have to do with gaming cards? You realise workstation GPUs have 24-48GB of VRAM and use totally different chips to even the Titans. They use like 6-12GB VRAM chips, while the gaming cards use 256MB/512MB/1GB chips which they double up, and the 2080 will use 1GB or 2GB chips. They will never ever use 6GB VRAM chips on a gaming card or any card that costs under $3000; they didn't even use those larger chips on the Titan V. What you're saying is crazy and wrong.

You realise the internet is full of threads from 2010-2012 asking if 6GB of VRAM will last long enough. It's now 2018 and 2GB of VRAM is enough; if you have 2GB, GTA V and Tomb Raider don't just stop working, they get like 10% less fps.

You can't use the rate from 2000-2010 to gauge things by; that was crazy fast advancement. Most VRAM increases over the last decade have been superficial and not needed. Like I said, you could get a 9600 GT 2GB in 2009, and it probably wouldn't run anything very well today, but it would have enough RAM for 99% of games and it has Win10 drivers.

I don't think you can use the scaling of GPU power or RAM from the 2000s to judge things in the 2020s. GPUs increased massively from 2000-2012 but have honestly been at the same level of performance for about 6 years now.

And just because VRAM went from 5MB to 2GB from 2000-2010 doesn't mean going from 2GB to 8GB from 2011-2018 means anything.

I personally think VRAM will be a bottleneck before FLOPS will in the future. Also, if you're gonna play in 720p you will want to use some heavy MSAA or even a future neural-network-based AA solution, because games in the future will be made for 4K and then scaled down on the fly, which causes a lot more aliasing than when mip maps are used, because those are downsampled in advance. Those kinds of real-time AA solutions are VRAM thirsty. Also, in the future games will be a lot more detailed, with high resolution meshes and maps, which also increases the need for AA at 720p.
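
For a sense of scale on the MSAA side, here is a rough sketch of how multisampled render-target memory grows with the sample count, assuming 4 bytes of colour plus 4 bytes of depth per sample and no render-target compression (real drivers compress, so treat these as upper bounds):

```python
# Rough MSAA render-target cost at 720p: each sample stores its own colour and
# depth, so the multisampled targets scale roughly with the sample count.
# No colour/depth compression assumed, so treat these as upper bounds.

def msaa_target_mb(width, height, samples, bytes_per_pixel=8):
    return width * height * bytes_per_pixel * samples / 1024**2

for samples in (1, 4, 8):
    print(f"{samples}x MSAA @ 720p: {msaa_target_mb(1280, 720, samples):.0f} MB")
# 1x ~7 MB, 4x ~28 MB, 8x ~56 MB -- noticeable, but small next to texture data
```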

Because no games have needed that 2-8GB jump, like literally none, while games from 2000-2010 required those increases to even boot or function.

If games continue to use just 2GB of VRAM they might even reduce the VRAM on cards, and the 3080 or whatever might go down to 4/6GB again. There is no point in Nvidia selling you stuff that won't be used, and they are not just going to increase it for marketing reasons. At some point, being able to make the card $100 cheaper will override the marketing value, and they will just post some explanation about how it's going to waste or actually slowing the card down, and they can have "smaller" but faster VRAM.

>95% of games don't use over 2gig atm even at 4k and about 5% or even less use 4gb
No fucking shit, Sherlock. Consider that 95% of games weren't released in the past 3, or even 5 years. Just because time goes by doesn't mean that a game from the 90s is suddenly going to use more than 4GB of VRAM.

In other words, a statement like this means literally nothing.

Lol no, I won't ever use AA. I like pixels; that shit is terrible for competitive gaming and only makes sense on an upscaled image on a TV, not a native-res 720p monitor.

OK, you're larping and what you're saying is bullshit. You actually think games will run at a forced minimum 4K res in the future and scale down to lower res... what you're saying is just wrong. Please stop giving your opinion and larping that you're a game dev.

GTA V was released in 2015, you idiot, and Tomb Raider was released even before that. Not a single game released in 2017 or 2018 uses over 2GB of VRAM, and the whole "4K uses more VRAM" thing is total bullshit; 4K uses like 250-500MB more VRAM than 720p. That was just bullshit Nvidia spun to sell Titans to idiots.

In the next decade all cards will converge on the same HBM(n) chips; the only difference will be the number of stacks. They will probably just bin the ones with some defective layers and use those in lower tier cards.

Also, the memory requirement in games didn't increase that much from 2008-2018 partly because of consoles, partly because of VRAM costs, and largely because the asset creation pipelines were still non-photogrammetry / structure from motion based. These days and going forward, asset creation is data driven, with motion capture, facial capture, 3D scanning and procedural generation quickly taking over, which is already pushing the VRAM limits and is gonna explode over the next decade.

>not a single game released in 2017 or 2018 uses over 2gb vram
>source: my ass

The only 2 games in existence that use 4GB of VRAM were released in 2015 and probably only use that much because they were poorly made first-gen Xbox One/PS4 games. Every single game released in 2016/2017/2018 uses 2GB or less.

So new games do not use more, and the only games that use 4GB were probably just made badly.

Well, if you conveniently shove all the games that use between 3 and 4 GB of VRAM under the rug, sure.

It's true, you idiot. Find a game from 2016/17/18 that uses over 2GB of VRAM. None exist. Go have a hunt.

They use GTA V/Tomb Raider and some other old game from before 2015 to test and show a game using 4GB, but they are all old games.

"Viewing how much RAM is used in your GPU info isn't actually how much RAM the game uses. You need to do another measurement; every game will fill that up as a buffer even if it only needs 2GB to operate optimally fps-wise, and even if you have only 1GB you will only lose like 10fps."

What games? Tomb Raider and GTA V and some random game from before 2015... they haven't released a game that uses more than 2GB in 3 years, and before that basically none used more than 2GB... wonder why?

What are all these games you speak of... oh wait, it's the justification for your Titan... "Skyrim mods use heaps of VRAM". No they don't, you dumb shit.

>Fucking bingo!

xD

Fucking brainlets. Materials (albedo, normal, displacement, occlusion, etc.) will be 4K native and scaled on the fly instead of using MIP maps, meshes will be high poly with no precomputed LODs and will be scaled on the fly (like in PUBG, for instance), etc. Basically, author the asset once in 4K and use compute to deal with scaling. This reduces cost, but increases VRAM usage and adds a little bit of compute overhead. Read some books.

Attached: 1534441422534.png (297x435, 197K)
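
For scale, rough numbers on what a precomputed mip chain costs versus keeping only the 4K base level and scaling at runtime, assuming uncompressed RGBA8 (block compression shrinks both figures by a similar factor):

```python
# Memory for one square RGBA8 texture with and without a precomputed mip chain.
# A full chain adds roughly a third on top of the base level; dropping it saves
# that third but means the GPU has to filter/scale from the 4K base at runtime.
# Uncompressed RGBA8 assumed; BC/ASTC compression shrinks both numbers alike.

def texture_mb(size, mips=True, bytes_per_texel=4):
    total, level = 0, size
    while level >= 1:
        total += level * level * bytes_per_texel
        if not mips:
            break
        level //= 2
    return total / 1024**2

print(f"4K base only: {texture_mb(4096, mips=False):.0f} MB")   # ~64 MB
print(f"4K with mips: {texture_mb(4096, mips=True):.0f} MB")    # ~85 MB
```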

All the games that use between 3-4GB of VRAM are literally 2 games, only GTA V and Rise of the Tomb Raider, both released a year after the Xbox One/PS4 launched and probably just made badly. There are literally no other 3-4GB games.

Shadow of Mordor says in the options menu it needs 6GB of VRAM for ultra, but that's just it pointing at 980 Ti / 1060 6GB cards, not the actual RAM it uses; it uses 2GB on ultra.

Lol, you're larping so much. 1080p games have used 6/8K/16K textures for a decade. Saying games will start using 4K textures shows how little you know.

Go google some stuff. You're a larping no-life, no-skill idiot who likes to feel like you know something, and your e-peen is your shitty rig.

PUBG 1080p low

Attached: maxresdefault.jpg (1280x720, 223K)

Attached: 90_441_much-vram-need-1080p-1440p-4k-aa-enabled.png (600x967, 25K)

Dude, have you even read my posts? Games fill up all the VRAM you have. That system is limited to 4.5GB because of a driver/OS thing; that's the max VRAM of the card. The game doesn't actually require that much VRAM to run at optimal fps. The reviews that measure how much VRAM a game uses don't just use that info, they do something else. Play any game and look at your VRAM usage; even on a Titan it will climb towards 12GB used. It's buffering, not what the game actually uses, and it doesn't improve fps or lighten the load, it's just how VRAM works.
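
If you want to see this for yourself rather than argue over one screenshot, a small sketch like the one below polls nvidia-smi once a second during a play session. Note that it still reports allocation (caching and buffering included), not the minimum working set a game needs for full fps, so the shape of the curve over time is the interesting part, not any single number. Assumes an Nvidia card with nvidia-smi on the PATH.

```python
# Poll nvidia-smi once a second and log the reported VRAM number over time.
# This is allocation (caching/streaming included), not the minimum working set
# a game needs for full fps, so the trend matters more than any one snapshot.
import subprocess
import time

def vram_used_mb():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.splitlines()[0])        # first GPU only

for second in range(600):                  # ~10 minutes of play
    print(f"{second:4d}s  {vram_used_mb()} MB")
    time.sleep(1)
```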

>its true you idiot find a game from 2016/17/18 that uses over 2gb of vram. nun exist. go have a hunt.
DOOM
Monster Hunter: World
Dark Souls 3
NieR: Automata
Final Fantasy XV
Agony
SOMA
Prey

I'm sure there are more, those are just the ones that I've personally played and can verify for a fact that they'll use over 2GB of VRAM.
Oh wait, but they're probably all made by retards, right?
Or they don't count because they don't use more than 2GB on the lowest settings?
Or maybe you'll come up with some other bullshit excuse for how they totally definitely absolutely not at all nuh uh don't use more than 3GB of VRAM, what's it gonna be?

Go on, keep replying to the same posts over and over again because you're so anally devastated that your shitty 2GB card can't keep up with new AAA titles on high graphics settings.

>being this fucking delusional

>1080p games have used 6/8k/16k textures for a decade

Material complexity has increased, and so have the number of materials and especially the number of UV sets. You probably don't know what a UV set even is. You also ignore my points about MIP maps and LODs. I make games for a living; what do you do?

Why don't you prove me wrong instead, you little bitch.

Attached: 1533416113867.jpg (470x470, 70K)

Like I said, the only games that use over 2GB of VRAM are from before 2015 and are launch games for the next-gen consoles, released in the first year of their life.

GTA V and Rise of the Tomb Raider in 2015, and a random game before that, Shadow of Mordor.

No game from the last 3 years has used more than 2GB of VRAM, even at 4K. The only reason those 2014/2015 games used so much is they were made badly and were launch games.

You're a literal idiot. Every game shows max VRAM use in GPU info; that's not the game using all that RAM, it's buffering. The actual game is using under 2GB of VRAM.

You could plug a 12GB Titan into those games and it would show 12GB of VRAM use after a while. None of those games use more than 2GB of VRAM.

Wrong.

LARP LARP LARP LARP. Just shut up, please. Did your mum buy you the Gears of War artbook and you took the editorials too seriously? Just shut up, you're clueless.

>uses Maya
>says he "makes games" for a living
Go back to /3/ you migger

who's trolling who

y'all retarded

I don't have a 2GB card; I'm asking if I should buy a 6GB or 12GB card. No game in the last 3 years has used more than 2GB of VRAM, and only 3 Xbox One PC ports released in the first year of the console did in 2014/2015.

That's just a fact.

Sure, games could use 4GB or 6GB or 8GB or 10GB or 12GB in the future. I'm just trying to work out when and how, and you saying games use 4GB+ is simply not true. They use 2GB at the moment, and 4K increases that by 350MB or so compared to 720p, so even if we get 8K we will probably only be using 3GB. I'm asking what will push it up next. Posting results that were recorded badly, or thinking your GPU using 100% of its RAM after playing for 10 minutes means the game needs that much, isn't helping answer my question.

I obviously want to know if 6GB is enough or if I should double it, and in what year 6GB would stop being enough, or even 12GB.

Just crying and trying to convince yourself your games use 100% of your VRAM and not just 2GB isn't getting me the answer I'm looking for. If you don't know how to predict how VRAM use will increase, please kindly fuck off and be happy with your 6 or 8GB card or whatever.

Attached: vram.png (500x410, 18K)

Attached: vram (1).png (500x410, 18K)

Why don't you just buy a 2GB card since games don't use more than 2GB then?
It was enough 10 years ago, so it should last another 10 years if you just keep larping that games don't use more than 2GB, right?

Attached: vram (3).png (683x402, 23K)

DELET THIS

Attached: 1529918994108.jpg (601x601, 65K)

It's not a larp, you idiot. You guys are posting graphs from people recording things after playing the game for 10 minutes and being surprised that their VRAM keeps climbing until it hits its limit. You're posting recordings from clueless review sites.

Attached: vram (4).png (676x402, 23K)

Attached: vram (5).png (683x402, 23K)

>uses Maya

Attached: vram (6).png (863x744, 1.15M)

>I warned you about the stairs, bro

Attached: 1530865128758.gif (265x200, 1.96M)

Are you dense? The Fury X is going up to its 4GB limit, can't you see that? Play any game for 10-15 minutes and your VRAM will max out. The only reason the GTX 1070 8GB isn't at 8GB is that they are recording it after 5 minutes or something and it doesn't have time to fill, and they are literally waiting longer on the 4K run and going OH WOW as they "wait for the VRAM to stabilise". That's not how you record VRAM use, that's scrub-tier "tech reporting".

That's not how you record VRAM use; you don't just look at it in GPU info. All games max out VRAM if you wait long enough, for fuck's sake. Guru3D is clearly a bottom-tier site of people just enabling their fantasy that their Titan is worth it, with zero tech knowledge. The German sites record VRAM properly by setting the card to not buffer frames, and no game uses over 2GB, for fuck's sake. Doing it that way means the card just uses the RAM it needs to get optimal frames and doesn't fill itself up for the hell of it. GPU 101. You guys are reading scrub-tier performance reviews.

>>uses Maya

I use Zephyr, Houdini, Zbrush, Topogun, Substance, Maya/Max, etc etc as well as UE4.


>Go back to /3/ you migger

Gamedev is both Jow Forums and /3/, migger

Attached: mick-jagger-3d-print-model-3d-model-obj-stl.jpg (878x840, 80K)

>boot up game that was launched in 2018 (8 days ago, in fact)
>VRAM usage immediately shoots up from about 500MB to 3GB after loading into a map, and that's with the game being stuck with 512x512px textures because the developers are mongoloids
Can you do the math for me here or did you fail your math classes in preschool?

Attached: vram.png (382x108, 3K)

Turn off buffering in the drivers and set it to 0, and your game will then just use the VRAM it needs to get max fps. That's how you test VRAM in games. All of you looking at GPU info and waiting for it to max out have no clue what you're doing; EVERY modern game will eventually max out VRAM if you leave it running long enough, usually 5-25 minutes. These tech review sites are just loading up the game at different resolutions, and the longer loading at high res makes the VRAM number look higher. It's not the actual required VRAM; you need to turn buffering off to measure that. And buffering doesn't add performance, it just smooths out frame pacing and micro-stutter.

The move towards real-time asset and material streaming means that instead of devs optimizing VRAM usage for a couple of scenarios (low, medium, high, ultra), games will increasingly just take advantage of all the VRAM available to them dynamically. So a couple of years from now, no matter how much VRAM you have, the game engine will use all of it.

Attached: josh-van-zuylen-cyberrunner-scene-render-9.jpg (1920x1080, 896K)
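
A hypothetical sketch of what "use all the VRAM available dynamically" can look like on the engine side: size a streaming budget from the card's reported memory and evict least-recently-used data when over budget. The class, names, and numbers are invented for illustration, not any real engine's API.

```python
# Hypothetical texture-streaming pool: size the budget from whatever VRAM the
# card reports, keep recently used textures resident, and evict the
# least-recently-used ones when over budget. Illustrative only -- not the API
# of any real engine.
from collections import OrderedDict

class StreamingPool:
    def __init__(self, total_vram_mb, reserve_mb=1024):
        # Use "all the VRAM available" minus a reserve for render targets etc.
        self.budget_mb = max(total_vram_mb - reserve_mb, 512)
        self.resident = OrderedDict()      # texture_id -> size_mb, in LRU order
        self.used_mb = 0

    def request(self, texture_id, size_mb):
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)               # mark recently used
            return
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)   # evict LRU
            self.used_mb -= evicted_mb
        self.resident[texture_id] = size_mb
        self.used_mb += size_mb

pool = StreamingPool(total_vram_mb=6144)       # a 6GB card
for frame in range(2000):
    pool.request(f"tex_{frame % 300}", size_mb=21)
print(f"resident: {pool.used_mb} MB of a {pool.budget_mb} MB budget")
```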

>looking in gpu info and waiting for it to max out
user.
I'm going to bend this in neon for you.
I booted up a game and loaded into a map.
Then I took a screenshot of the graph.
This all happened within less than 2 minutes.
I think you should get back on your meds, this is getting ridiculous.

You're larping. You first say games will downscale from 4K res, then try to cover yourself by saying textures will be 4K. Textures were 4K in 2005; they are 16K+ now, even on medium. You don't have a clue what you're talking about and just want to justify pirating creative software and buying an overpriced GPU and thinking it's optimal. Go make an ugly cube in ZBrush and feel like an industry insider. Artists wouldn't know anything about future VRAM usage; they're outsourced labour being directed by hipsters in Montreal. In the real industry, the only way you would know the future of VRAM use is if you were an engine programmer for Ubisoft or Capcom or Square or Microsoft Studios (Sony people wouldn't know, they just follow trends).

There are probably literally 60 people in the world who would know how much VRAM 2022-2025 games will use, and even they would have no clue how much would be used after that.

stop namedropping and saying nothing of worth.

Go look at some gameplay videos on YouTube with stat overlays. VRAM use literally always just caps at some amount and stays there, more or less stable, during the entire gaming session. It doesn't max out the VRAM in most games unless you have too little for it to reach the usage it needs. No game maxes out the 11GB 1080 Ti, for instance.

If you disagree, show me one (1) game that maxes out 11GB of VRAM. I challenge you.

turn off buffering and you will understand.

Shockingly Nvidia requires a lot more memory for "ultra settings". It's almost like Nvidia uses additional libraries and supports additional features.

Did you bother to notice that the level the VRAM stays at in YouTube videos is 0-500MB short of the VRAM cap of the card being used? Did you notice that? I doubt it.

Go read up on how to do VRAM requirement tests. The only way to know how much VRAM a game uses to get max fps is turning off buffering. You're watching clueless monkeys "review" stuff.

As I said, UV sets. Multi-tile maps, as well as more complex maps, increase the size of materials today by an order of magnitude compared to a decade ago. Do you even know what a material is? In the next decade, UDIM UVs will come to real-time applications. So will Alembic temporal meshes. Not that you know what any of this means. Most people who work with asset creation and/or gamedev know perfectly well which way we are heading. Most of us have stuff on our hard drives that will only start to find its way into games 5+ years from now. GDC and SIGGRAPH presentations have consistently predicted the future for years now. I also do photogrammetry btw, and have been doing it for games for about 3 years.
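
To put rough numbers on the material-size point: one PBR material with several 4K maps, and the same material spread across multiple UDIM tiles, assuming uncompressed RGBA8 with full mip chains. Real pipelines pack channels and block-compress, which shrinks these numbers, while extra maps and extra tiles multiply them back up.

```python
# Back-of-the-envelope for one PBR material: several 4K maps, optionally spread
# over multiple UDIM tiles. Uncompressed RGBA8 with full mip chains assumed;
# compression and channel packing shrink these numbers, more maps and tiles
# multiply them back up.

BYTES_PER_TEXEL = 4
MIP_FACTOR = 4 / 3                     # a full mip chain adds about a third

def map_mb(size=4096):
    return size * size * BYTES_PER_TEXEL * MIP_FACTOR / 1024**2

maps = ["albedo", "normal", "roughness", "metalness", "ao", "displacement"]
one_tile = len(maps) * map_mb()        # ~512 MB for a single-tile material
print(f"1 UDIM tile:  {one_tile:.0f} MB")
print(f"4 UDIM tiles: {4 * one_tile:.0f} MB")
```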

Also, some games have a buffering limit of 2-3 frames, and that means that yes, some games buffer up to 65/75/85% of the card's RAM depending on the setting or game, but all that extra RAM used is not increasing fps, it's just preloading frames.

Turn buffering off and play some games and you will realize what I'm talking about. The reason some old games required a set amount of VRAM to run is that they were dealing with much smaller amounts of VRAM; having only 25MB of VRAM when a game required 256MB would make the game lock up the system or run at literally 0.1fps, so they made the game refuse to boot. Now that everyone has 1-2GB of VRAM, games don't need to do that, and even if you have only 1GB of VRAM the game will only get like 10-20fps less. And people care about frame pacing and stutter now, and G-Sync and such, so that's why they leave buffering on and just max out the VRAM to a ratio of 3/5ths or 4/5ths depending on whether the game supports 1-3 frames of buffering.

You can't measure how much VRAM a game actually uses to get max fps without turning buffering off. It's just that simple.

You're unemployed, pirated 10k worth of creative software and watched some YouTube videos because you have nothing better to do and want to justify the welfare pension you spent on a Titan 3 years ago... please shut up. I'm a graduate architect; I know more about rendering, both real-time and offline, than you do.

Literally no one knows. Futureproofing is retarded, just sell your card when you want to upgrade.

The fact that you consistently avoid my technical arguments is proof to me that you don't know shit. Please just go back to playing your games.

Attached: 1533990117211.jpg (2048x2048, 943K)

I can't; they stopped making cards that support CRT output, and 980 Ti / Titan X Maxwell in 2-way SLI is probably the best performance you will ever get out of a CRT. Some guy in another thread thinks his DP-to-VGA adapter adds less than 10ms of lag, which would mean you could use that and still get the benefit of a CRT, but I think he just did a human reaction time test, clicked faster on the CRT, and assumed it wasn't getting 50ms of lag from the active converter cable. (Could be wrong though.)

Are $5 DVI-D to VGA or DP to VGA adapters really that fast? I don't see how they can be, and I want proof.

You're not making technical arguments, you're just name-dropping things you've seen in lectures and don't understand. You're not explaining why any of those buzzwords would affect performance, or by how much. Sad levels of cope.

If you can list 10 different technical terms and say you have files on your HDD that won't be in games for 5+ years, why can't you say games will use 4GB in 2020, 8GB in 2024 and 12GB in 2026?

You're just name-dropping to sound smart and providing no contribution other than "more will be good and will happen" because you have a fucking Titan.

Haha oh it's the 14 year old CS:GO "pro" who doesn't even know how monitors work. I destroyed you in the other thread, you have no idea what you're talking about. DP to VGA adaptors have minimal input lag, you've been misled.

You did a human reaction time test; results vary from 180-260ms. You just clicked faster on your CRT.

I just ran FF XV on my 4GB 980 with buffering and prerendered frames off.

At 1080p and 4K it uses 1.3GB and 1.5GB of VRAM.
With the HD texture pack,
at 1080p and 4K it uses 1.8GB and 2.1GB of VRAM.

I turned buffering and prerendered frames back on, and both resolutions maxed out my 4GB at 3850MB used. My fps was the same as with buffering off, but without buffering my frame pacing was less consistent.

More VRAM used with buffering on is like more cores showing full activity: the VRAM fills up, but it doesn't increase fps, it only makes frame delivery more consistent.

Learn2benchmark. TechPowerUp and Guru3D used to be trusted review sites in the 2000s, but their editorial staff just got 1080 Tis and don't know what the fuck they are doing. Go look at the German benchmarks; they still have a clue how to do it.

Look, I know that VRAM use will increase over the coming decade because a lot of different developments are converging that all depend on a lot of VRAM to be usable. I can't predict in what year it will happen; that will depend on the market price of DRAM and a lot of other factors that are not deterministic. I know that the main cost of making a game today is the asset pipeline, and I also know that the only way to increase graphical fidelity without increasing costs exponentially is to adopt data-driven pipelines. That change has already begun, and it will only accelerate. I could explain to you how a data-driven pipeline ultimately pushes up VRAM needs in the end-user hardware, but that would take forever to type out. The gist of it is that it's cheaper to throw data at the problem and use algorithms to decimate, scale, and filter it than to hand-craft everything and manually pre-render assets in various sizes to fit predetermined memory limits. And we haven't even talked about particle simulations, both baked and real-time, or 3D-scanned video turned into animations through deep learning and Alembic with mesh compression, which will make their appearance in the next decade and let us mitigate the uncanny valley as well as give us fluid simulations etc.

Attached: 5423452523523523523454363456345.jpg (2048x1187, 362K)
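
A hypothetical sketch of the "scale it with an algorithm instead of hand-authoring LODs" idea: pick a triangle budget from how much of the screen the mesh covers. All thresholds and numbers are made up for illustration; this is not any particular engine's method.

```python
# Hypothetical runtime LOD pick: estimate how much of the screen a mesh covers
# and scale its triangle budget from that, instead of choosing between a few
# hand-authored LOD meshes. Purely illustrative; thresholds are made up.
import math

def projected_diameter_px(radius_m, distance_m, fov_deg=90.0, screen_px=1280):
    # Rough projected diameter of the mesh's bounding sphere, in pixels.
    angular = 2.0 * math.atan2(radius_m, max(distance_m, 0.01))
    return angular / math.radians(fov_deg) * screen_px

def triangle_budget(full_res_tris, radius_m, distance_m):
    px = projected_diameter_px(radius_m, distance_m)
    # Aim for roughly one triangle per covered pixel, clamped to the source mesh.
    return int(max(64, min(full_res_tris, px * px)))

for d in (2, 10, 50, 200):
    print(f"{d:4d} m -> {triangle_budget(2_000_000, radius_m=1.0, distance_m=d):,} tris")
```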

The PS4 and Xbox One both have 7-8GB of shared VRAM, and games don't use more than 2GB with buffering off. How will the market price of VRAM affect anything? They already have 4x more than they are using. The PS4/Xbox One are already approaching end of life; if games don't start using 2-3x the VRAM they use atm, the PS5 Pro will probably have less VRAM than the PS4. What you're saying makes no fucking sense.