
Name a more iconic entry level GPU than the 750ti. It had amazing value. It had a good hash rate per watt. The low-profile version could instantly give any desktop PC PS4-level quality gaming. It was very quiet and ran very cool. Sipped power from the mobo.

Attached: evga_02g_p4_3753_kr_geforce_gtx750ti_2048mb_gddr5_1032845.jpg (1000x1000, 82K)

Other urls found in this thread:

youtu.be/vDcmZW7KPg0
youtu.be/7iNNUAnFjcM

You just described my 1060 3GB cept mine is faster.

>650Ti Boost
>GTX460
>6600GT
>HD4850

The 1050ti

Attached: stepaside.jpg (576x464, 64K)

Nigga I shopped for VESA Local Bus cards before there was a WWW, and you want to tell me about iconic components?

Attached: 1380869646806.jpg (217x208, 8K)

This. My 750Ti was going bad so I upgraded to the successor 1050Ti.

Radeon 7870
GTX 1060

HD 5770
still viable today, especially with the amazing linux drivers

1060 3gb

...

GTX 670
it can play Crysis, that's all that matters

Barely
Meanwhile 1070s are getting cheap and I can't think of a game today that it can't play at max settings without issue

It wasn't really an "entry level GPU", but for its price it pretty much was for anyone building a PC at the time.

Was there ever a better performance/$ than the 970, 3.5 meme included? That card was fucking incredible.

And I'm still using mine with a 1440p144 monitor.

Attached: file.png (1306x1306, 1.47M)

1070 isn't an entry level card, brainlet.

3.5 was terrible but you have a point. 90% of new builds had an i5 and 970 at the time

TNT2

FX5200
every fag and both of his dads had one at some point in time

I still remember when the 6600 GT was the best performance-per-dollar card on the market by a huge margin.

Gtx 1080
Vega 56
290
390x
Rx580
Anything below high end gpus from 2013 tier perf is shit

Heh I owned one of these
Was shit, so I got a 7900GT to replace it. Woah, 500 AUD, so expensive, I bet I'll never pay more than that for a GPU...
I think I had a TNT2 Riva turbo or some shit, wish I kept it

HD4850

except that series was absolute dogshit

8800GT/9800GT

you could crossfire those cards

it was awesome

that's half the iconic part about it

I held out with a single HD4850 until the release of the HD7850, and I didn't switch because of performance issues, I switched because of the new DirectX that was suddenly in every game.

I'm a big fan of the HD7750 because it was common for them to be single slot, so you could game on a business PC that had a weird form factor

Attached: 07152014127.jpg (4000x3000, 1.91M)

ATI Radeon 9500, back in the day you could install a modified driver and turn it into a 9500 Pro or even a 9700.

>had
>was
970 is the newest component in my pc

Attached: 053A237E-814E-4846-941D-7C28B5972E58.png (327x327, 124K)

My nigga

i know that feel

i held on to my 4870 for as long as i could

Tfw my 32mb AGP tnt2 still working 10/10

Shit had so many hours of Q3 Arena

Mine was 16mb
God to think we've had 16gb gpus for years is a mindfuck

I also have a 40mb hdd in my IBM 2121

GTX 1050 Ti

Checkmate, faggot OP

Attached: 04G-P4-6251-KR_XL_4.jpg (1200x1200, 79K)

>tfw on suicide watch right now because a 1050ti is slightly faster than my 250w 7970

price doesn't seem to matter to nvidiots

hd 4670
hd 5670

>surprised his SEVEN year old gpu is getting surpassed

Not surprised, just bummed at how things move on. I think it still holds its own even though it will heat the room.

amazing? at least with stock ubuntu 18.10 even my desktop lags. which driver are you using? fglrx doesn't work anymore and the only other choice is radeon

Can't wait for the 2050 Ti

Turing microarchitecture, which means it will do compute like a champ and have HEVC Main 4:4:4 12 hardware decoding, which Pascal does not support (only 4:2:0)

I still have a HD 5970 somewhere in my closet. I wonder if that thing still works.

>all these zoomers saying last gen cards are iconic

>Turing’s new NVDEC decoder has been updated to support decoding of HEVC 4:4:4 8/10/12-bit video streams

THANK YOU BASED NVIDIA

There's more to a GPU than raw power: there's price-to-performance, thermals, and my favorite, performance per watt. As a 55W GPU ages, it gets old and slow but still sips power; as my 7970 ages, it will become a hot potato.
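That efficiency argument is just a ratio. A minimal sketch in Python, where every FPS and wattage figure is a made-up placeholder for illustration, not a benchmark:

```python
def perf_per_watt(fps, board_power_w):
    """Average FPS divided by board power draw, in FPS per watt."""
    return fps / board_power_w

# Hypothetical numbers for a 55W card vs a 250W 7970-class card
# at the same settings -- placeholders, not measurements.
efficient = perf_per_watt(45, 55)     # small card: ~0.82 FPS/W
hot_potato = perf_per_watt(90, 250)   # big card:    0.36 FPS/W
print(efficient > hot_potato)         # the 55W card wins on efficiency
```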

Local store has basically a pallet of MSI GTX 1060 3G Aero OCs for $115 USD, still in the warranty period till Nov 2020. Basically half the price of the new cards for 1 year less warranty.
Internet cafes are upgrading and flooding the market with the GTX 1060s.

Second hand GTX 1070s are still $320+ here.

GeForce 4 Ti4200

fuggin zoomers :-DDD

I still use that HD7850 and will only switch - again - not because of performance issues, but because NVMe SSDs cannot boot in legacy mode, and the HD7850 doesn't have UEFI drivers.

However, I need to replace my SSD ASAP because it's dying, but I don't want to buy a new video card before Cyberpunk 2077 is released, so I think I will circumvent that boot issue with Clover.

Not him but what the fuck are you on about?
My fucking T500 with an ancient MOBILE ATI card works OOTB now with the built-in AMD driver.
try fedora

how's this shit even relevant to what i'm saying?

Everyone and their grandma had the Geforce 2 MX200/MX400 for Age of Empires II

Except the 750Ti was like $100, usually less.

It's relevant because some generations of GPUs gave a lot of performance for the money and were iconic as fuck, compared to now, where a GPU like the 2080 is barely even faster and costs a shit ton more than the gen that came before it.

not when it came out it wasn't, it was a reasonably priced card but nowhere near $100

750TI sucked, slower than a 650TI Boost, often not faster than vanilla 650TI.

So many retards bought it over the 650TI/Boost because of the viral marketing.

>3GB
And it's not a value GPU, actually that's the gimped GPU for retards that can't manage money.

But does it support 3dfx?

there's more to being iconic than just being a good GPU for the money

Attached: 1534645510391.jpg (300x291, 11K)

I got mines for $160 and it shreds through every game at 1080p

An average of $65 USD more for a 7~10% performance boost and 3GB of extra RAM the card can't make use of. It's a no-brainer: 1060 3G for 1080p 60FPS, or go straight to a 1070/Ti.
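Taking the figures quoted above at face value, the marginal cost of the 6G step-up works out like this (the $65 premium and 7~10% uplift come from the post; everything else is plain arithmetic):

```python
extra_cost = 65.0            # average premium for the 6GB model, per the post
uplift_pct = (7 + 10) / 2    # midpoint of the quoted 7~10% boost

dollars_per_percent = extra_cost / uplift_pct
print(f"~${dollars_per_percent:.2f} per 1% of extra performance")
```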

How the fuck so? the way it looks?

Yeah, now try putting textures on ultra. Games like Doom will run like ass.

Don't mind the underage faggot that had the 750TI as his first gaymen GPU because that's all that mommy would buy him.

>5 FPS less

OH NOES

Attached: DOOM.png (684x848, 49K)

calling a GPU iconic would imply that it left a lasting impact on the market because of its huge popularity, which isn't the case with any of the last gen cards

Sure thing, now go to an area with more than 3GB of textures around. That GPU is shit and you know that you fucked up. It will age like milk.
youtu.be/vDcmZW7KPg0

Well AMD's first GCN graphic core must've been really iconic because AMD made huge bank putting it in the PS4/Xboner.

It's iconic because it was a killer DX11 GPU for the money; Nvidia had to release the 780 with 3GB of RAM on it because the 7970 GHz was so damn good and had a 3GB frame buffer back in late 2011 when the OG 7970 came out.

>Retards posting 10 second snippets of cherry picked video.
>you fucked up
Nah the only thing I fucked up with was not buying a 1080TI for 1080P

Attached: TI.jpg (4032x3016, 3.18M)

Nice cope. Take a look at another game that uses more than 3GB at 2:15. Having 3GB in a world where even the entry level card has 4GB is just retarded.
youtu.be/7iNNUAnFjcM

They keep coming up with ever increasingly cringey box art. Amazing.

They think they can compete with 3dfx

>Entry lvl
>4GB
LOL you're so delusional it's so funny, 50.33% of the market has 3GB or less.

The MSRP for the 750Ti was $150 and iirc it was right around $100 towards the end of 2015/beginning of 2016.
I wasn't into PC building back in 2014 when it released, but I can't imagine it would have been much more than its MSRP.

>tfw average GPU on steam has 2048MB of ram
E T E R N A L

Attached: 2.jpg (1600x1200, 205K)

That's for the fucking 1GB version. By the time prices started falling near 2016, you should have been buying the GTX 950, which was released Aug 2015.

The GTX 950 at least started with 2GB @ $160 and green versions even had no PCIE 6pin.

1050ti has 4GB, so does the RX570. Both are entry level cards.

The average CPU on Steam is also an Intel HD Graphics.

>tfw the lowest end GCN 1.0 card still whoops intel HD graphics
The Intel HD 630 is pretty damn good for basic gaming imo, pretty much obsoleted anything Radeon HD6670 and below

Attached: 7750_all_1600.jpg (1133x1000, 310K)

And the Athlon 200GE/2200G/2400G with their Vega graphics just killed Intel graphics for good. AMD just needs to get their shit together and put more Ryzen CPUs in laptops, and people interested in low-end gaming will buy that shit like water.

>It scores 195 points more than an HD7750
F

I still think a single slot reference HD7750 is a good card to have around for old PCs or as a spare video adapter, because it's amazing for daily use and games decently at 720p

Attached: i-img1200x900-1538360567gfjm9e57242.jpg (1200x900, 166K)

Attached: Untitled.png (616x187, 14K)

2400G uses a Vega 11 which is even better.

nah, a 750Ti or a 1030 would be better.

8800GT/S/X

GT710

The 1030 GDDR5 is pretty damn good because you can get it in a single slot form factory which is sexy as hell, I think there are single slot 750 cards but they're rare.

Attached: 2017052509344859_big.png (1000x1000, 162K)

i bought one of these evga 750tis used and the plastic bracket the fan screws into is literally falling apart. I tried to fix it and it just shattered in multiple places, leaving the fan screwed in by only one screw.

8800GTX was hardly entry level.

>factory
factor

Bunch of cards on here aren't entry level.

Still run one of the LFF models in the kids' machine. Does a damn good job.

Why would you retards pay more than RX570 money for a worse option?

1050ti was cheaper