Jow Forums designs a Graphics Card

Current blueprints for you to expand upon
>PCIe 5.0 x16
>HBM3 VRAM (capacity to be determined)
>5nm GPU

Attached: iu-2.jpg (474x474, 15K)

Other urls found in this thread:

realworldtech.com/silvermont/8/
youtube.com/watch?v=lRNAkklbMVU

8 HDMI outputs

goy-sync

Also, I'll start on the logo

>5nm
oy vey

>PCIe
Obsolete

Integrated gpu is the future

No support for DirectX. Only OpenGL, Vulkan and most importantly, Glide

6 million cuda cores

Three (3) Whole Kilo-Bytes of V-RAM

>no DP
>no dvi-i
>no SCART
I smell /v/ in your post

What about this?
>2 x HDMI 2.2
>2 x DisplayPort 1.5
>1 x USB Type-C (USB 3.3 Spec)

>cuda
I don't think so, Jow Forums is an AMD board.

Open source drivers ONLY

Keions on the shroud and box. Libre hardware and software.

Attached: Konachan.com - 56934 akiyama_mio chibi hirasawa_yui k-on! kotobuki_tsumugi tainaka_ritsu.png (1600x1200, 773K)

>SCART
Calm down, gramps.

Make it a specialized RISC core that you can upload your own firmware onto
Or an FPGA

16GB VRAM, to flex on RAMlets and poorfags.

>promoting the consumerist cesspool
kys

It's got to have some of those special ray tracing chip things.

This with a special edition that will be announced and released 10 days later with 32GB of VRAM for the same price.

Only one shader unit running at 50GHz

Only thing i really need is drivers that aren't totally shit.

Honestly I can't wait to see what the mainstream intel gpu will be like.

Sweaty, Integrated GPUs still talk over the PCIe bus to the CPU, even when they are on the same chip.

I want RGB for my high end CRT display.

"no"

Attached: Screen-Shot-2013-09-13-at-6.32.07-PM.jpg (1050x694, 165K)

Oh yeah, and some way for custom resolutions without fucking around with the incomprehensible garbage that is xrandr
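For the record, the usual incantation isn't that bad once you've seen it once. A sketch, assuming a 1080p60 mode and an output named HDMI-1 (check yours with `xrandr -q`):

```shell
# Generate a CVT modeline for 1920x1080 @ 60 Hz (cvt ships with xorg)
cvt 1920 1080 60
# -> Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync

# Register the mode, attach it to an output, then switch to it
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode HDMI-1 "1920x1080_60.00"
xrandr --output HDMI-1 --mode "1920x1080_60.00"
```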

just one HDMI port
>if you want more you have to buy the dongle.

How do you think the "Intel HD" block talks to the rest? Via the PCIe protocol.

Blueprints so far
>PCIe 5.0 x16 (63.01 GB/s)
>HBM3 VRAM (128GB)
>5nm GPU (15360 Cuda Cores)
Ports
>2 x HDMI 2.2
>2 x DisplayPort 1.5
>1 x USB Type-C (USB 3.3 Spec)
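Sanity check on the 63.01 GB/s figure, since someone will argue about it: PCIe 5.0 is 32 GT/s per lane with 128b/130b encoding, so x16 works out to just over 63 GB/s.

```python
# Back-of-envelope check on the PCIe 5.0 x16 bandwidth figure above.
GT_PER_LANE = 32e9      # 32 GT/s per lane (PCIe 5.0)
ENCODING = 128 / 130    # 128b/130b line-code overhead
LANES = 16

bytes_per_sec = GT_PER_LANE * ENCODING * LANES / 8
print(f"{bytes_per_sec / 1e9:.2f} GB/s")  # -> 63.02 GB/s
```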

>The new SoC fabric is based on the In-die Interface (IDI), which was used in Nehalem and Westmere.

>IDI is a uni-directional, point-to-point link that connects each IP block (e.g., CPU cores + L2, graphics, system agent) to a global crossbar. The read and write channels are independently sized and governed by a credit-based flow control mechanism that delivers low latency for high utilization scenarios. Unlike the earlier FSB-based designs, transactions between IP blocks can occur out-of-order, reducing latency and improving system level performance. The system agent contains the memory controllers, which have much more extensive support for out-of-order memory transactions, compared to Saltwell.

realworldtech.com/silvermont/8/
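The "credit-based flow control" bit in that quote is simple: the sender holds a fixed number of credits matching the receiver's queue slots, spends one per transaction, and gets it back when the receiver drains the slot. A toy sketch (names are mine, not Intel's):

```python
# Toy model of credit-based flow control, the mechanism the quoted
# realworldtech text describes. A sender may only issue a transaction
# while it holds a credit; the receiver returns credits as it drains.
from collections import deque

class CreditedLink:
    def __init__(self, credits):
        self.credits = credits   # receiver queue slots the sender may use
        self.queue = deque()

    def send(self, payload):
        if self.credits == 0:
            return False         # back-pressure: sender must stall
        self.credits -= 1
        self.queue.append(payload)
        return True

    def drain(self):
        payload = self.queue.popleft()
        self.credits += 1        # credit returned to the sender
        return payload

link = CreditedLink(credits=2)
assert link.send("read A") and link.send("read B")
assert not link.send("read C")   # no credits left -> stall
link.drain()                     # receiver frees a slot
assert link.send("read C")       # sender proceeds again
```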

>Hey guys let's design a graphics card
>But i ultimately get to choose what actually goes in it
Why bother asking in the first place?

16 12G SDI outputs, dual balanced XLR for sound.

>DP 1.4 x2
>HDMI 2.1 x2
>VGA x1
>RGB x1
>S-video x1
>USB-C x4
Support 8 individual displays, 32k x 32k @144 Hz
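Quick arithmetic on that spec, assuming "32k" means 32768 pixels and plain 24 bpp with no blanking: one such display needs several terabits per second, versus roughly 77 Gbit/s for DisplayPort 2.0 after encoding.

```python
# Raw bandwidth for ONE 32k x 32k @ 144 Hz stream at 24 bpp,
# ignoring blanking intervals.
pixels = 32768 * 32768
bits_per_sec = pixels * 144 * 24
print(f"{bits_per_sec / 1e12:.1f} Tbit/s per display")  # -> 3.7 Tbit/s per display
```

Times 8 displays, that's about 30 Tbit/s out of one card.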

128MB of eDRAM

Which is not relevant at all. It's an alternative to FSB or QPI; do you think computers that used FSB didn't use PCIe?
The protocol being used is PCIe, out of sheer legacy and compatibility reasons. If you have a device based on said SoCs, open up AIDA64 or lspci: you'll see that the GPU is connected over the PCIe protocol and even has a PCIe root address.

And no windows/osx support. Good way to release a dead product.

It's as much PCI as LPC is ISA.
Also there's no PCIe protocol, it's all PCI

hybrid chip that not only does 3D graphics but also 3D audio in parallel

think 3dfx Glide + Aureal A3D

Attached: 1418672198601.jpg (1632x295, 215K)

A built-in 4" monitor with RGB backlighting.

>not doing one thing well

>do you think computers that used FSB, didn't use PCIe
Are you fucking stupid? The FSB is completely independent of the PCI/PCIe buses; it can be GTL, AGTL, QPI, GigaPlane, 60x bus, MaxBus, FlexBus, UPA, what have you. PCI (and PCIe) were designed from the ground up to be completely agnostic of the rest of the system. The PCI bus controller used to reside in the chipset, and the only reason modern CPUs have PCIe buses coming out of them is that the chipset has been integrated into the same die.
>The protocol being used is PCIe, out of sheer legacy and compatibility reasons, if you have a device based on said SoCs, open up AIDA64 or lspci, you'll see that the GPU is connected over the PCIe protocol and even see a PCIe root address.
It is presented to the system as a PCIe device for compatibility reasons, yes, but it is NOT PCIe. SATA drives present themselves as IDE devices until the AHCI driver kicks in; does that mean they are IDE even in AHCI mode? Modern PCIe graphics cards still use the same interrupts and addresses as EGA cards on an ISA bus; does that mean they are ISA EGA cards?
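For what it's worth, "presents itself as a PCI(e) device" just means exposing the standard config-space header that lspci reads, and that header layout is the same for PCI and PCIe — which is the whole compatibility story being argued here. A sketch decoding it per the PCI spec (the device ID bytes are a made-up example):

```python
# What lspci actually reads: the first bytes of a device's PCI
# configuration space. Offsets are from the PCI spec; the example
# bytes are invented for illustration.
import struct

cfg = bytes([
    0x86, 0x80,   # 0x00 vendor ID, little-endian -> 0x8086 (Intel)
    0x12, 0x9a,   # 0x02 device ID -> 0x9a12 (made-up value)
    0x00, 0x00,   # 0x04 command
    0x00, 0x00,   # 0x06 status
    0x01,         # 0x08 revision
    0x00,         # 0x09 prog-if
    0x00,         # 0x0a subclass (0x00 = VGA-compatible)
    0x03,         # 0x0b base class (0x03 = display controller)
])

vendor, device = struct.unpack_from("<HH", cfg, 0)
base_class, subclass = cfg[0x0B], cfg[0x0A]
print(f"{vendor:04x}:{device:04x} class {base_class:02x}{subclass:02x}")
# -> 8086:9a12 class 0300 (shown by lspci as "VGA compatible controller")
```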

MISD

ROPs based on the Alliance AT3D video chip:
youtube.com/watch?v=lRNAkklbMVU

only has 1 non-sHDMI output limited to 1080i, the sHDMI outputs are required for multi-monitor setups, and sHDMI monitors aren't compatible with anything else.

Nigger HBM3 is a low speed, low cost version.
Fuck off with these gay threads

iGPU will always be the future, unless we can reach 256 GB/s+ bandwidth on common DIMM DDR memory instead of the ~30 GB/s we have now

vga
s-video
composite

/thread

Vp8 hardware encoding and decoding

But what does the architecture of the GPU cores look like, and what would Jow Forums call them?

Then you can flash the 32GB BIOS onto the 16GB card and unlock the rest of the memory.

Integrated FCPGA for cycle accurate SNES and Sega Genesis hardware capabilities. With a breakout cartridge connector packed in the box.

Composite in/out
RF in
No voltage limit, if you want to burn your house down you can.

integer scaling
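Integer scaling just means nearest-neighbour upscaling by a whole factor, so every source pixel becomes an exact NxN block with zero blur. A minimal sketch on a 2x2 "image" of pixel values:

```python
# Integer scaling: each source pixel maps to an exact n x n block.
def integer_scale(img, n):
    return [[px for px in row for _ in range(n)]   # repeat each column n times
            for row in img for _ in range(n)]      # then repeat each row n times

img = [[1, 2],
       [3, 4]]
assert integer_scale(img, 2) == [[1, 1, 2, 2],
                                 [1, 1, 2, 2],
                                 [3, 3, 4, 4],
                                 [3, 3, 4, 4]]
```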

2 VGA
2 DVI
2 HDMI
2 DP
500mm * 120mm radiator connected to water block on gpu
32GB HBM3
600W TDP

>16-bit bus

Why the fuck do you need 2 vga you caveman

VGA only is the future

vga sink
wifi chip and android app for management

>dvi-i

projectilevomitingman.webm

>going backwards in tech

We started out with that. No one wants integrated graphics except laptop and phone shitters.

On-die FPGA to support new standards and accelerate fringe tasks.

This will never happen because they want you to buy new cards every year.

All drivers written entirely in Python

>dvi-i
fuck off back to 2007 nigger

LETS DO THIS

Attached: rgb-scart-to-3-rca-ryw-composite-adapter.jpg (550x637, 40K)

>SCART

Attached: 1537378127484.png (1920x1080, 40K)

>(capacity to be determined)

32Gb RAM, DONE!

My uncle works at nvidia, so I know what they will launch in 2020.
>RTX 3080 Ti RGB MAX
>PCIe 4.0 x16
>9216 Shaders
>288 Ray tracing cores + 2304 Tensor Cores
>32Gb HBM3 8192-bit
>1 DVI-I and 4 VGA
>2000 Ghz Base Core Clock and 2500 Ghz Boost Core Clock
>Intel 10nm Fab
>Hybrid cooling : Six 60mm RGB Fans and an external 1770 Watt Hailea HC chiller equipped with RGB Lights
>TDP: 1750 Watt
>Price: 29,990 USD (34,990 USD Founders Edition)

>SCART

Attached: 1555741138688.png (613x344, 250K)

My nephew works at Notebook check and xe said they are writing the article now

4GB of VRAM is a pathetic amount.

I want RGBHV BNC connectors with 750 mhz 10-bit dacs

SCART is the white man's connector, faggot.

we already had this topic.

Attached: Modular Graphics Card.jpg (1920x1280, 1000K)