Is pic related a good design for a Modular Graphics Card?

Attached: Modular Graphics Card.jpg (1920x1280, 1000K)

Stop being stupid.

What does the cooling system look like?

OP's idea might be better than pic related, but that might not be much of an accomplishment.

Attached: modular-graphics-card-nvidia-radeon-design-2018-dave-delisle-davesgeekyideas-1.jpg (740x450, 162K)

You first. Retard.

I'm stupid too, care to explain why this wouldn't work?

A better use of your time would be to move the CPU to its own PCIe card(s).
I want a motherboard that's just a brainless shitload of slots.

This shit is just impractical.
GPUs are already a pretty low-margin product for the actual board makers; doing something like this would jack up prices significantly while only benefiting a small few.

So consumers wouldn't save money buying chips for a reusable board vs just buying a new board?

No, because the chips would cost as much as a new board does now.

Okay, remember Project Ara? Well, why did it fail? It was a good idea on paper, yeah, for tech-savvy people, but for the masses it was a no-go because of too many removable parts. People are morons, and morons make up 80% of the consumer base, so the idea is to make technology for the masses that is mostly plug-and-play. You don't want stupid people tampering with little pieces that can be broken through misuse and then blaming it on the company, which means the warranty process gets lengthy because you need to figure out whether it was a user-related fault or damage, and since the fucking customer is always right it would mean huge money and time losses for the company. Why make something for a niche minority market with little to no profit when you can mass-produce for the normies and cash in a lot? That's my opinion tho. Also excuse my grammar and senseless typing, I am still learning the mutt language.

Get raped and kill yourself, you retarded fucking faggot sack of nigger shit with down syndrome.

>brainlet lost time doing that shit in mspaint
Pay a visit to the doctor and get a checkup, you might legit be retarded, and that could entitle you to neetbux. I am not joking, please do visit a doctor.

Attached: Calm-down-bro.png (600x459, 133K)

>LGA VRAM
That's just asking for trouble

GPU architecture isn't that easy to modularize, and beyond a point you're really just adding a much greater level of complexity and constraints to save buying a new $10 PCB.

Balls to you sir.

Would you rather have PGA VRAM?

Maybe you're an mspaint wizard but I don't think I could make OP's image with that program.

Having it on SO-DIMMs would be best, or built onto the GPU kinda like MXM.

>Modular Graphics Card
This is retarded. The better idea would be to add a GPU socket to the motherboard and share system memory. That would actually improve performance, because right now memory copying is a major performance hit.
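To put a rough number on that copy overhead, here's a back-of-the-envelope sketch in Python. The bandwidth figures are my own assumptions (roughly PCIe 3.0 x16, dual-channel DDR4-3200, and a 256-bit GDDR6 card), not anything measured in this thread:

def transfer_ms(bytes_moved, bandwidth_gb_s):
    """Milliseconds to move `bytes_moved` bytes at `bandwidth_gb_s` GB/s."""
    return bytes_moved / (bandwidth_gb_s * 1e9) * 1e3

payload = 512 * 1024**2  # 512 MiB of textures/buffers, purely illustrative

print(f"copy over PCIe 3.0 x16 (~16 GB/s)  : {transfer_ms(payload, 16.0):6.1f} ms")
print(f"read from system DDR4 (~51.2 GB/s) : {transfer_ms(payload, 51.2):6.1f} ms")
print(f"read from local GDDR6 (~448 GB/s)  : {transfer_ms(payload, 448.0):6.1f} ms")

At 60 fps the frame budget is about 16.7 ms, so a PCIe copy of that size blows through two whole frames, which is the hit being described. The catch, as the next reply points out, is that plain DDR is still nowhere near GDDR even once the data is resident.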

Regular DDR sucks shit for GPUs.
Just look at any APU and how they scale with memory speed.

Attached: uqbd0v5r6nq21.jpg (503x281, 51K)
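For scale, the gap is mostly raw bandwidth: peak bandwidth is roughly effective transfer rate times bus width. A quick sketch, with illustrative speeds and bus widths that are my assumptions rather than anything from this thread:

def peak_gb_s(transfer_mt_s, bus_bits):
    """Theoretical peak bandwidth in GB/s: transfer rate (MT/s) x bus width (bits) / 8."""
    return transfer_mt_s * 1e6 * bus_bits / 8 / 1e9

print(f"DDR4-3200, dual channel (128-bit bus): {peak_gb_s(3200, 128):6.1f} GB/s")
print(f"GDDR6 at 14 Gbps on a 256-bit bus    : {peak_gb_s(14000, 256):6.1f} GB/s")

Roughly 51 GB/s versus 448 GB/s, which is why APUs starve on ordinary DIMMs and why consoles solder GDDR as unified memory instead.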

That's why you go GDDR for system memory, like a console does.

> LGA RAM slots
Now that's an abomination I want to see.

Bazinga!

That image is so fucking retarded my head hurts from just looking at it. The only real modular parts would be the GPU core and the cooling solution. Memory, VRMs and connectors would all be fixed stock, preferably just like on the highest-end GPU available right now.

Also, do I really have to tell you that you would have to pay much more even if you went with the lowest-end modular GPU core available?

>get angry because someone has good reasoning against your retarded, worthless idea
please just fuck off already, you're not an innovator and never will be

I'll repeat myself in this second thread.
GPUs only need standard fan sizes and mounting points, so anyone can easily replace the shitty OEM ones.

With that in mind, why even have the Cooler pre-attached to the Graphics Card in the box? Not only will the cooler be more easily replaced, but customers will also be able to put on their own thermal paste as an alternative to the included thermal pad.

>why even have the Cooler pre-attached to the Graphics Card in the box?

Because PCBs are OEM-designed, with different memory, VRM and power connector placements. It makes more sense to ship an OEM heatsink that maximizes dissipation and takes standard fans.

Less than 1% of people replace their heatsinks.

Guess the only components of the Graphics Card should be the GPU (LGA or PGA instead of BGA), and the cooler (heatsink and fans).

Why not put the GPU on a socket on the motherboard with its own heatsink bracket, instead of putting it on a PCIe card?

Attached: consider the following.gif (500x374, 500K)

why do we need swappable gpu cores? the whole point of the integrated pcb design is to work effortlessly with low latency memory. this whole design is just inefficient.

My first 3D capable graphics card (S3 Virge) had upgradeable VRAM.

ITT retards who don't understand buses

>Modular Graphics Card?

Why not HBM instead of those socketed RAM grids?

>Why not put the GPU on a socket on the motherboard with its own heatsink bracket

Power delivery

>LGA memory
Make it a fucking DIMM you retard.

Why not just give the GPU its own power connector on the motherboard, then? The CPU has one.

What is a buse?

What about the latency?

How? The GPU companies could sell the chips directly to consumers, packaged as LGA or PGA, without the added cost of things like GDDR6 and VRMs.

Is pic related a better design than that of OP?

Attached: Modular Graphics Card 2.jpg (1920x1280, 858K)

At least it will be more customizable.

>Suggesting that the GPU should be on the Motherboard.
Hasn't that been done before?

Attached: laptop graphics chip on motherboard.jpg (800x600, 131K)

Sometimes it was like that in the 80s; it won't work with today's frequencies.

Is it removable, retard?

youtube.com/watch?v=L8EWqWj2srg

>Laptop

Specifically an IBM ThinkPad T42 from 2004

Attached: WCeg4QX1A3SImZTi.large.jpg (800x600, 54K)

ITT tripfag who doesn't understand how adding a ton of resistance to your bus hurts latency and signal integrity
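For what it's worth, the damage a socket does at GDDR speeds is less about the milliohms of contact resistance and more about the extra capacitance and impedance discontinuity of the contacts. A crude lumped-RC sketch (real signal-integrity work means reflections and eye diagrams; every value below is an assumption):

def rise_time_ps(r_ohm, c_pf):
    """Approximate 10-90% rise time (ps) of a simple RC edge: 2.2 * R * C."""
    return 2.2 * r_ohm * c_pf  # ohms * picofarads gives picoseconds directly

soldered = rise_time_ps(25.0, 1.5)          # driver impedance + pad capacitance
socketed = rise_time_ps(25.0 + 0.02, 2.5)   # + ~20 mOhm contact R, + ~1 pF socket stub

bit_time = 1e12 / 14e9  # one bit at 14 Gbps GDDR6, in picoseconds

print(f"bit time at 14 Gbps : {bit_time:5.1f} ps")
print(f"soldered edge       : {soldered:5.1f} ps")
print(f"socketed edge       : {socketed:5.1f} ps")

In this toy model the socketed edge is already almost twice the bit time, which is the practical reason VRAM gets soldered millimeters from the GPU rather than dropped into sockets.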

Let me think about this...
>HBM4 GDDR7 VRAM
>3.5nm LGA GPU
>PCIe 5.0 x16
Just in time to meet the high-end market of the early-to-mid 2020s.
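For reference, the PCIe 5.0 x16 part of that wishlist works out to roughly 63 GB/s per direction. A quick sketch of the arithmetic (32 GT/s per lane and 128b/130b encoding are the published PCIe 5.0 figures; treating it as a plain multiplication is the simplification):

lanes = 16
transfers_per_s = 32e9   # PCIe 5.0: 32 GT/s per lane
encoding = 128 / 130     # 128b/130b line coding overhead

bytes_per_s = transfers_per_s * lanes * encoding / 8
print(f"PCIe 5.0 x16 ~ {bytes_per_s / 1e9:.0f} GB/s per direction")

Still far below what on-package HBM or even soldered GDDR delivers, so the expansion link stays the slow path regardless of generation.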

>lga ram
lel

Attached: cap_[Tsundere] Plastic Nee-san OAD [BDRip h264 1920x1080 10bit FLAC][8DF85A36]_00:04:39_35.gif (1920x1080, 3.01M)

>Okay, remember Project Ara? Well, why did it fail?
Because Google bought it to get rid of it

O.K. HBM3 or HBM4 it is (HBM2 pictured).
As for interchangeability, would you rather have PGA or LGA in terms of which of the two has the lower thermal resistance?

whatever makes physical contact with the VRAM.

>Socket for every memory chip
My fucking sides

>Because google bought it to get rid of it
Because it was too hard for normies to use in the first place.

Just make it the socket alone, with no VRAM. Then the board can be half the size; and the GPU must include HBM. Yeah, it'll cost $1000 for a midrange card, but that's not a problem, you guys are already fine with GeForce cards costing that much.

Or if you absolutely want modular memory, then remember that we'd need modular VRMs as well, so you need VRM sockets.

The ThinkPad where the southbridge or GPU would desolder itself from the motherboard and require you to reball it to get the laptop working again?

>just fuck my latencies up m8

Power delivery in that context means VRMs, which are much more than just an 8-pin connector. Current needs to be processed before it can be fed to a processor, and that requires MOSFETs (or power stages) and inductors, which carries both the cost of the actual parts and the cost of designing a PCB to accommodate them.
tl;dr modular daughterboards are a much smarter idea for non-essential high-amperage chips
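To make the current side of that concrete, a small sketch with assumed numbers (a ~250 W GPU and a ~1.0 V core rail are my own illustrative figures, not from the thread):

def amps(power_w, volts):
    """Current drawn at a given power and voltage: I = P / V."""
    return power_w / volts

gpu_power = 250.0  # watts, illustrative

print(f"at the 12 V connector : {amps(gpu_power, 12.0):6.1f} A")
print(f"at the ~1.0 V core    : {amps(gpu_power, 1.0):6.1f} A")

Roughly 21 A comes in over the connector, but about 250 A has to be delivered at the core, and pushing that kind of current means the MOSFETs and inductors have to sit right next to the die, which is why the VRM can't just live elsewhere on the motherboard or behind a socket.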

Actually, if future VRAM has anything resembling the wattage of current parts, you could just leave it alone or give the VRAM chips their own heatsinks like you see on some MOSFETs. Airflow will do the rest.

Because the board, in massive production numbers, costs $2. Because both the chips and the boards would multiply in complexity and ramp up R&D costs.