8800 GTX appreciation thread

>Still runs most games on medium
>Can use three-way SLI and max out most games even today
10 years later is there still any reason to upgrade?

Attached: 1527939918840.jpg (1000x409, 63K)

Other urls found in this thread:

youtube.com/watch?v=C1vVsqUr8nU

does it actually run dx11/12 games though?

like would it be able to run BF3/4/1, or WoW now that it's dropped dx9 support?

It runs games worth playing

imagine if nvidia stopped price-gouging for 10 seconds and made a card as good as the 8800 again

It's called the GTX 1080

the way it's meant to be played

apparently Crysis Warhead still gets like 20-40fps on top-end computers today. I have a feeling it was just badly made.

actually heaps of 8600 cards died randomly 2 years after use because Nvidia quality was really low at the time. forced Apple to give hundreds of thousands of Americans free MacBook repairs around 2011

oh shit, had no idea. me and a couple friends swore by the 8800 gt, card never did me wrong until it was time to upgrade

no overclocking

Even a GTX 1060 can max Crysis 1. Warhead was a bit more optimized.

I knew like 5 people who had 8600 cards that died, including myself. it was basically the second summer you had it, not sure why. Nvidia was selling 8600/9600s in bulk in that period, so not surprised quality went to shit; they were selling even more of them than they do 1080s/1060s and stuff. it was the whole "does it run Crysis" meme era.

Digital Foundry did a video of like an Intel Extreme CPU and a 1080 Ti running Warhead at like 20-30fps. it's in their "do we need i9" video, which is maybe 2 months old. I expect they didn't install dx9 or did something similarly dumb, but check it out for me and try to work it out.

They did something wrong then.
youtube.com/watch?v=C1vVsqUr8nU

they said it was part of a CPU benchmark they were making. maybe it was a particularly CPU-intensive part of the game?

Were they using SSAA or 4k?

why no love for 7800gtx? :3c

Attached: nx7800gtx+.jpg (500x263, 137K)

that only gets like 55fps in combat. I expect it was an intensive part of the game that runs at 20-30fps even on new hardware.

I'll never forget this card. I remember having some poorfag card that couldn't run BF2 maxed, and this guy had one of these and used to take screenshots maxed out all the time and I was so jealous

like I said, it was a CPU benchmark, so I'd say no. because the 7xxx GTX was PS3 bullshit architecture that no one loved. there was no reason to buy a new GPU in 2005/2006; everyone was playing WoW, what was wrong with you?

I had one but I never got to use it because my pc didn't have a pci-e slot.

>3-way SLI
>requiring 500w to get performance that's less than a single 1060 3GB
>Having to deal with SLI

Nothing wrong with keeping tech for as long as you can, but recognize when it's time to upgrade.

>>Still runs most games on medium
Who are you kidding.

I don't have the GTX one but I have an 8800 GT and the answer is no.

I miss my evga 9800gt. Fucker ran hot as shit but it was a trooper.

8800 Ultra was my wet dream

my HD 6970 2gb still runs bf1 at 80fps and pubg at 50+

you have been sold a lie.

Had the EVGA 8800 GT single-slot design card. It was a great card, but then one day it started displaying red artifacts during heavy gaming sessions, so I had to replace it. The pinnacle of my quest for "more FPS without compromising quality or resolution" in my games. Ah, good times.

I had a GeForce 3 Ti 500; with current inflation that's like a Titan V today. I killed it a year or two after getting it by rubbing Arctic Silver underneath the core by mistake, which fucked up the connections. maybe if I'd washed it in alcohol or something I could have saved it... sad times.

>PUBG
You must be 18 years or older to post on Jow Forums

I remember this card; the GT cost $200 I think. Do they even make good (aka not gimped as fuck) cards at the $200 price point anymore? That GTX 1080 that was mentioned earlier still costs like $400.

As many people have pointed out, this supports DX10 only, and SLI support for it in anything recent is basically non-existent. A much more powerful GTX 660 with a far larger amount of VRAM would still struggle to play most modern titles at medium/low at 1080p. This even includes titles like Fallout 4. You're objectively wrong.

i played warhead on my 8700k+GTX 1080 machine. I got about 110 fps at 1440p. it really rapes one of your cpu cores. my old 3470 would drop below 60 all the time.

the ps3 had a single powerpc core so it didn't really do justice to that card. sony wasn't even going to use nvidia, but had to switch late into the design process because their first choice fucked them over.

Back to dabbing on Fortnite, kid.

>the ps3 had a single powerpc core
Don't forget about the Cell supercomputer-on-a-chip.

>i played warhead on my 8700k+GTX 1080 machine. I got about 110 fps at 1440p
Probably at the beginning. There's some vehicle scenes later on in the game that drop to 30fps even on a 5ghz intel.

Attached: mpv-shot0015.jpg (1280x720, 178K)

What the fuck are you trying to show us?

you can't run crysis at 60fps

What the fuck sort of settings do you have to use to make a Titan X Pascal not be able to run Crysis at 60fps?

CPU bottleneck.

default very high settings

I remember my 8800 ultras still. Last great card that's been released. More than double the performance of the last gen.

On a 8700K and 2700X? I don't think so.

What a shitty engine.

mine still works today even though it was always running at 90°c+ for years.

How dumb and young are you? Single core performance on modern CPUs is trash.

lol amd

Attached: watchv=PcYA-H3qpTI-[17.48.150-18.09.417].webm (853x480, 2.64M)

>directx 10
>less than 1GB VRAM
when will this meme die

Who /9800gtx+/?

Still have mine at home in its original box with the price tag.

Attached: 66-1.jpg (300x300, 13K)

who takes a shirtless picture of themselves for their avatar?

Attached: Capture.png (789x200, 67K)

It's objectively better than it's ever been.

I have an HD 6970 2GB; I'm obviously not a teenager. no 10-year-old gets a card like that.

it's not the engine. Crysis has gameplay that games today don't even have. they've simplified games/maps to make them run on shitty consoles. what was the last impressive PC-exclusive game?

it was literally crysis.

ARMA and DayZ don't count because that engine is even older than Crysis, simpler, and built for military-sim stuff

PC games used to have unique gameplay; now all PC gaming does is let you run at 144/240Hz and native fps.

I wanna die

And that card is about 5x faster than the 8800 GTX being discussed in the OP

don't think it's that much faster, mate. 2007 card vs 2010. doubt it's 5x. but it does run dx11 obviously, which is a bonus.

Games in general just suck anymore. Anyone remember NOLF? Great game, a mix of everything, but it pulled it all off very well. Nowadays such a game would never see the light of day for whatever reason. Referring to the PC version btw

Attached: Advertisement_NOLF_PS2_01.jpg (3113x2058, 871K)

Couldn't find any benchmarks directly comparing the two, but you can see here that a card comparable to the 8800 GTX performs nowhere near as well as the 5970, which is not as powerful as the 6970, so yes, the performance difference is probably gigantic

Attached: 23702.png (550x600, 86K)

>she'll break your heart with a .44 slug
>carries a .22

nice

It's the same thing that happened to movies and TV, big-name productions are expensive enough to make that they have to aim at the very lowest common denominator to have the widest possible audience, otherwise they won't get funded because the risk of them not recouping their costs is too high.

and just like with movies there's an "indie" art scene, which has a trend-following problem that's almost as bad: people just follow whatever the in thing is (e.g. the pixelated faux-8-bit aesthetic) instead of what the money wants. There are a few that are good in spite of the low budgets and production values, but you have to wade through a lot of shit to find them.

You're thinking of the 1070 Ti. And even that you can technically overclock; you just can't OVERVOLT it. The 1070 Ti was voltage-locked to stop people from overclocking past 1080 clock levels.

Why would you believe random pre-launch rumors on Jow Forums, instead of reading actual fucking reviews? I have no idea where the "Pascal can't overclock" rumor got started.

It's a CPU issue. Crysis uses like 1-2 threads, Ryzen dips below 60 when there's tons of enemies on the screen as well.
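if anyone wants to sanity-check the 1-2 thread claim on their own machine, here's a rough sketch (assumes the third-party psutil package; the sample count is arbitrary) that watches per-core load while the game sits in a heavy scene:

```python
# Rough per-core load monitor -- run it in a second window while the game is in
# one of the heavy vehicle/combat scenes. Assumes the third-party psutil package.
import psutil

SAMPLES = 10  # number of one-second snapshots; arbitrary

for _ in range(SAMPLES):
    # utilisation of each logical core over a 1-second window, in percent
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busiest = max(per_core)
    average = sum(per_core) / len(per_core)
    print(f"busiest core: {busiest:5.1f}%   average of all cores: {average:5.1f}%")
    # one core pinned near 100% while the overall average stays low is the
    # classic single-thread bottleneck pattern being described in this thread
```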

Not him, but Pascal in general is so fucking locked down it's pretty upsetting. I can only overclock my 1080 Ti by like 80MHz over the out-of-the-box boost clock thanks to the voltage and power limits.

No alien Isolation for you

>GTX
>Not Ultra
Begone foul demon

Not sure if this is because it's locked down, or because it overclocks itself dynamically as far as thermals will allow.

But I have neither. I have an 8800 GT.

Voltage is still capped super low, my 1070 hits a wall at around 1.050v

who are you bullshitting moron

I had an 8800 GTS 512MB. It ran Crysis OK

Medium 800x600 30 FPS?
BF3/4 still has a DX10 fallback; if you run "Ultra" on a GTX 285 it won't be the same as on a native DX11 card, it will miss out on effects

I have 9800 GT and it can barely run Killing Floor 2 on lowest settings and shitty resolution.

That's about on par with my experience. I tried putting everything as low as I could and I still got painfully low fps and consistently came up on the bottom in score because moving and doing anything takes about twice as long.

There is no reason to use those old cards anymore. You can get a better one gifted today probably.

they're good for retro windows xp machines

gaymerfaggot

BF3/4 requires a DX10 card though. this is only a DX9 card and maybe not even SM3.0.

no, I'm thinking of the whole 10 series.

you realise you can BIOS flash a 980 Ti and make it as fast as a 1080 Ti? you can't do that to a 1080 Ti; you have to literally solder stuff onto it in super fine, complex ways to overclock it properly.

9-to-10 series was a shit upgrade if you overclocked; it was like 5% at best. 10-to-20 is going to be way bigger, like 25-50%

I'd like to bench one of these ( older drivers vs the last driver ever made ).

I had an 8600, all I had to do was bake it in the oven for a few minutes to reflow the solder and it ran like new.

That was the 980 Ti; still going strong 3 years later, still maxes out pretty much any game at 1440p.

I could only ever afford the GT version; I was too much of a poorfag.

>t. retard

fuck, I should have done that to my 9800M GT and the 8600GT in my brother's MacBook.

actually I still have a 9800M GT, maybe I should try that.

died twice in my expensive Clevo gaming laptop :( paid the price of one of those cool mini Alienware laptops in 2010 to get it fixed...

apparently the 980 Ti has 50ms lag on its VGA out signal though... into the trash it goes.

(even though you can get a 0.2ms HDMI-to-VGA adapter for $30)

>still runs most games on medium
If by games you mean The Elder Scrolls: Morrowind, then OK. My HD 5870 can unironically run most MODERN games on medium, and with CrossFire it can do it, but at 400W consumption instead.

>average processor has 4 to 6 cores now
>devs only use one
That's not a CPU bottleneck, that's lazy devs shipping garbage.
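to put rough numbers on that: if most of the frame lives on one thread, extra cores barely move anything. a back-of-the-envelope Amdahl's-law sketch (the 80% serial fraction is an assumption for illustration, not a measurement of any real engine):

```python
# Amdahl's law: maximum speedup when a fixed fraction of the work stays serial.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup with `cores` cores and `serial_fraction` serial work."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# assume 80% of the frame time sits in one unparallelised game/render thread
for cores in (1, 2, 4, 6, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.8, cores):.2f}x")
# prints roughly 1.00x, 1.11x, 1.18x, 1.20x, 1.21x --
# going from 4 to 8 cores buys almost nothing
```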

fuck off, graphics snobs

only retards play shitty games made for shitty people like you

Attached: 1535350361707.png (954x704, 489K)

Posting an anime picture doesn't make you right or make anyone think your opinion isn't garbage.

>HURRR CRYSIS FRENS DUNT RUN
The game tells you on the start screen that it runs best on a Core 2 Extreme. Period.

Mine died after caps burst. It was made by XFX.

Why didn't you just replace the caps? Simplest bit of board work you could ever do.

because if you had that skill you'd be spending your time volt-modding 1080 Tis so you could actually overclock them properly and bypass the voltage regulators, not messing with decade-old cards.

some overclocker bros fuck around with old Radeon cards from like 2011-2015 or something, but I think that's just because they get the best scores in the older benchmark rankings when pushed hard.

Why do you even give a shit about the outdated port? HDMI, DisplayPort, and even fucking DVI are all options on the 900 series.

Changing caps is not a fancy, difficult "skill". It's high school shop class tier.

VGA monitors are the only ones with 0ms lag; even the best "1ms" 240Hz LCDs have 15ms of lag in the middle of the screen and 18-20ms at the bottom.

having a VGA port with 50ms lag defeats the entire purpose of having it natively supported. might only be in SLI mode though; some guy on YouTube has a video about it. he might have used a cheap adapter, but I sort of believe he didn't.

>0ms lag
Are you implying that 20-year-old tech is literally able to compute information in zero time and we somehow forgot how to do it?

also it's DVI-I (which supports VGA natively via a passive adaptor)

the 900 series doesn't have actual VGA ports.

new GPUs have DVI-D, which doesn't work with the passive adaptors.

The 980 Ti 6GB (or 12GB with the Titan X Maxwell, same performance) is the best DVI-I card. there are 0.2ms HDMI-to-VGA adaptors for like $30 from China, but I'm not sure they can do past 85Hz. they possibly can and are fine, but I haven't tested; they're only rated 85Hz at 1024x768, but that's probably the Chinese seller not knowing you can push the output to 120Hz in the drivers via a custom resolution profile. that might not be possible over HDMI 1.4 though, maybe only 2.0, not sure.
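for what it's worth, here's a back-of-the-envelope check (the ~25% blanking overhead is an assumed round figure, not an exact CVT timing) of whether 1024x768 @ 120Hz even comes close to HDMI 1.4's pixel-clock ceiling:

```python
# Rough pixel-clock estimate for 1024x768 @ 120 Hz vs the HDMI 1.4 link limit.
# The 25% blanking overhead is an assumption, not an exact CVT/CVT-RB timing.
H_ACTIVE, V_ACTIVE, REFRESH_HZ = 1024, 768, 120
BLANKING_OVERHEAD = 1.25            # rough allowance for horizontal/vertical blanking
HDMI_1_4_MAX_PIXEL_CLOCK_MHZ = 340  # single-link HDMI 1.4 TMDS ceiling

pixel_clock_mhz = H_ACTIVE * V_ACTIVE * REFRESH_HZ * BLANKING_OVERHEAD / 1e6
print(f"required pixel clock: ~{pixel_clock_mhz:.0f} MHz")          # roughly 118 MHz
print(f"fits in HDMI 1.4: {pixel_clock_mhz < HDMI_1_4_MAX_PIXEL_CLOCK_MHZ}")
# so link bandwidth isn't the problem -- the real limit is whatever the
# adapter's DAC (hence the 85 Hz rating on the listing) will actually convert
```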

possibly means the 10 series is shit for CRT monitors. the 2080 Ti etc. has HDMI 2.0, right?

it's analogue, you madlad. in reality it's like 0.00000001ms I think, but still super fast.

I mean VGA CRT monitors; a VGA LCD is equally as shit as the fake "1ms" (15ms at the crosshair) panels we have today.

15ms is shit-all but it matters to me. I'll probably take the lag when they make a 500Hz LCD, but that might not be for a while.
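part of that mid-screen number is just scanout, by the way: the panel is drawn top to bottom over one refresh interval, so the crosshair only updates about half a frame after scanout starts. rough numbers, ignoring panel processing and pixel response (which is what the "1ms" marketing quietly leaves out):

```python
# Scanout-only latency at different refresh rates. This models nothing but the
# top-to-bottom draw of the frame; display processing and pixel response add more.
for hz in (60, 144, 240, 500):
    frame_ms = 1000.0 / hz          # time to scan out one full refresh
    mid_screen_ms = frame_ms / 2.0  # delay before the middle of the screen is drawn
    print(f"{hz:3d} Hz: full frame {frame_ms:5.2f} ms, mid-screen ~{mid_screen_ms:4.2f} ms")
# at 240 Hz that's ~2.1 ms of unavoidable scanout delay at the centre;
# a hypothetical 500 Hz panel would cut it to ~1 ms
```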