Can't wait for it

Attached: D2OoC9RWkAAnkR9.jpg (1200x628, 100K)

I would pay for a path traced San Andreas (compatible with GTA mods and SAMP/MTA:SA)

Reminds me of the time I was arguing with a "GeForce 210 Gamer" on SAMP about how his card sucked and how even Intel Integrated Graphics was superior to it.

OP here, I don't give a shit, but yeah, San Andreas is a very good game

I work fixing/upgrading computers at a university and I find so many computers with 6th-gen Intel Core CPUs and the fukken GT 210 that it's painful. I'm so glad they actually believe me when I tell them that piece of shit is actually bottlenecking the graphics performance when compared to the IGP.

OP here, damn, 6th-gen CPUs! Fucking really? I live in El Salvador and I'm fucking poor; my main PC and the best one I've ever had is this OptiPlex 3010 with an i3-3220 lmao. I managed to get a GTX 1050 and I'm pretty satisfied with it

Well that's how contracts with big OEMs work.

Attached: 1014085584.jpg (1000x541, 125K)

There's this Lenovo desktop in a store that costs $1600; it has an i7-8700 paired with a fucking GT 730. It's so fucking stupid.

this is your brain on /v/

Attached: 1548538598254.jpg (500x465, 36K)

No joke, I just pulled a GeForce GT 705 out of a workstation and the 3D benchmarks went from 1-5 FPS to 15-25 FPS

You can get two DP 1.4 ports and an HDMI 2.0 port off a 6th-gen Intel CPU; the GT 730 can barely do one DP 1.2

this is your brain as a lenovo marketer

If they could just make an RT card like this that only does ray tracing, to be used alongside your current card. Kinda like those PhysX cards back in the day.

All for the low price of 130 US dollars

So it's going to be just as much of a failure as those PhysX cards?

>In a shock announcement just a few moments ago, Nvidia has revealed that it has signed an agreement to acquire Ageia Technologies—the company that raised the awareness of in-game physics with the launch of the PhysX physics processing unit in 2005—for an undisclosed sum.

>undisclosed sum

Yeah, big failure that was.

I remember when I was building my first PC and thought that any GPU would be better than my iGPU.
Then I did some more research and realized that I could play Crysis on my iGPU, which the 210 decidedly cannot. Even the 730 was about as strong as the iGPU, despite costing more than my CPU.

The GT 210, GT 730, GT 1030, etc. were made to breathe life into computers without modern IGPs, and they're a godsend compared to what was available in the socket 775 era, when IGPs often had only single- or double-digit megabytes of shared memory.

They're still often used today to provide extra video outputs for stock exchange displays and other shit, dropped into PCIe x1 slots.

Why don't they make more powerful low-profile cards?
Having the fastest one be a 1050 Ti sucks the big gay.

And no, you shitters can't say anything, since you can get a 1080 in MXM size.

This looks shopp'd.

There really isn't a market or form factor that calls for a low-profile PCIe GTX 1080; all decent Mini-ITX cases, for instance, can accommodate a full-sized card. Also, bigger = easier and quieter heat dissipation.

But you can....

Why no DX 12? The GT 1030 has it.

OP here, of fucking course, idiot, I wasted 30 minutes of my life making it in Photoshop

>of fucking course, idiot, I wasted 30 minutes of my life making it in Photoshop
Who is the idiot?

OP here, no u

You know they only use them to output video to projectors, right?

3rd-gen Intel Core CPUs are quite decent, the current cut-off in my opinion. 1st and 2nd gen are getting too old, and the IGPs in those are showing their age dramatically.

that's what I thought, but the mobos had both DVI-D and HDMI ports built in. They were actually using them for the "2 VGA ports", plugged into 2 DVI displays with VGA converters. That made me extra mad.
Also they had some shitty Acer laptops for projector needs. Tossed that shit away and instructed them to get refurbished X230 memepads instead; they're shocked that they don't need to replace them every 6 months anymore.

>but the mobos had both DVI-D and HDMI ports
Those aren't compatible with DVI-I or VGA, which most projectors use.

Tech Yes City actually made a video about it

The desktops weren't being used for projectors, as I mentioned.

Then what were the desktops being used for that needed 2xDVI/VGA?

2 displays, but ironically all the monitors supported both VGA and DVI-D.

That's fucking retarded then. Probably just IT faggots trying to make some bucks.

What is it?

>dabbing on the GT 210
Great card, only idiots didn't like it - because idiots tried to use it for games (as evidenced by the OP pic) instead of just for desktop and HTPC use.

OP here, The fuck you say to me you little SHIT

did you really need 30 minutes to badly paste a fan on this and add some text?

Attached: file.png (1200x1200, 852K)

OP here, I needed to find the NVIDIA fonts and a lot of other images, so yeah, but that's because I like to waste my time

Why isn't there a market for it?
And how come not a single manufacturer is making a low-profile 1060 or 2060?
And the only PCIe x16 to MXM converter boards are $200+ for a PCB with a few connectors on them.

And for heatsinks they could just make them triple-slot or something.

Most of all, it seems like they're deliberately limiting this segment without a clear reason why.

What if, Geforce MX8000?

gaming at 15 FPS with RTX

Even small cases will fit a full-sized card with risers and whatnot; meanwhile, shit like business SFFs are limited to the 75 W of the PCIe slot, some even to a mere 25 W.

Fucking buyfags.
Does nobody on Jow Forums even make their own stuff?
The whole point of getting a smaller PCB footprint was to shrink each component enough to keep it at its absolute smallest functional state.
I am seriously looking into the STX form factor for its size, and running a graphics card over the M.2 socket.

If it were possible to buy the chips and the information on their requirements, I would just make my own card.
But I am at the mercy of the manufacturers here.

then you answered your own question
everyone buys plug-n-play shit, and companies that want small GPUs for shit like NUCs buy them in bulk
that being said, an M.2 GPU sounds adorable as fuck

I think it's strange that there isn't a small M.2 accelerator card.
I know some cards for video decoding and frame grabbing exist.
But imagine a tiny CUDA chip at around 10 W, with 128 cores and a 2 GB memory chip.
It would make it possible to add basic graphics capability to many SFF and laptop units while using the iGPU for video output.
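
Rough sketch of what that split would look like from the software side, assuming such a card simply enumerated as an ordinary CUDA device. The device index 0, the kernel, and the buffer size below are made-up illustration, not a real product; the point is that compute lands on the little accelerator while the iGPU keeps handling video output.

// Hypothetical sketch: push a trivial compute kernel to a small CUDA
// accelerator while the iGPU keeps driving the display. Device 0 and
// the 128-core / 2 GB figures from the post are assumptions.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    if (count == 0) { printf("no CUDA device found\n"); return 1; }

    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);   // assume the M.2 card shows up as device 0
    printf("using %s: %d SMs, %zu MB\n",
           prop.name, prop.multiProcessorCount, prop.totalGlobalMem >> 20);
    cudaSetDevice(0);

    const int n = 1 << 20;
    float *buf;
    cudaMalloc(&buf, n * sizeof(float));            // lives in the card's own memory
    scale<<<(n + 255) / 256, 256>>>(buf, 2.0f, n);  // compute runs here, not on the iGPU
    cudaDeviceSynchronize();
    cudaFree(buf);
    return 0;
}

No display plumbing needed at all; the card would only ever be a compute target, which is basically how headless compute cards already work.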