>The PCB of an upcoming NVIDIA card has been leaked by a Baidu user. The photos show a new card that will sport either 8GB or 16GB of GDDR6 memory. The card has a 10-phase VRM fed through 6+8-pin power connectors (the 6-pin might be optional). For comparison, the GTX 1080 has only one 8-pin connector.
>This is not a custom board; the board model and NVIDIA logo suggest that this is a reference PCB. The new SLI fingers might be an implementation of NVLink for gaming cards.
>The GPU pinout is rather small, so this is likely the GV104 processor. The board number is PG180. The certification logos indicate that the board is final.
>KCC, ROHS, FCC, this is most likely the final production board?
literally the same pinout as the current GP104
REBRAND INTENSIFIES
Jackson Allen
inb4 it consumes more power than the 10 phases can handle
Brandon Gonzalez
Autism
Henry Bailey
Look at this housefire
Henry Fisher
>not using HBM DOA
Dominic Harris
>2080 first time hearing this one
Henry Scott
>NVLink for gaming cards
I doubt that. Nvidia has been cracking down on using gaming cards in the data center. They can't charge extra for their data center cards if the gaming cards have the same features at a lower price.
Caleb Brooks
I don't care about 12/14/16nm or GDDR old shit I'll wait for 7nm and HBM3
Owen Long
HBM is a meme
t. amd fanboi
Colton Williams
>HBM3
>I want a $2k gpu
Jonathan Cox
what an absolute mess
John Taylor
Exactly what I was thinking. With NVLink there's almost no way this is a consumer card; I bet at the next event they'll just announce new Quadros.
Tyler Miller
>DVI IS DEAD
THANK YOU BASED NVIDIA
Carson Lewis
Huh, 2 fan connectors, so the Nvidia reference cooler will have 2 fans now
Jayden Miller
Not much of a feat. My RX 480 has no DVI ports and that was released 2-3 years ago.
Dominic Lopez
ahahahahahahahahaha THIS SHIT WILL HAVE FEWER CUDA CORES THAN THE 1080 Ti ahahah, a new card slower than an old one, thank you based novidea
Colton Taylor
>They are using the $600 bridge used in their ultra-high-end Tesla and Titan V shit.
wew lad.
Ethan Lee
HBM is not a meme. It is the future of ultra-high-bandwidth applications.
The problem is that gaming hasn't been massively bandwidth-starved on modern GPUs for years. That's why the entire industry got by with GDDR5 for so long, only pushing for more exotic stuff on high-end SKUs.
Eli Rodriguez
It is a testing prototype card.
The reference design is going to be 90% of that (PGA for the GPU, removal of the testing fingers, and streamlined power circuitry)
I'm so jealous of people who can look at a PCB and understand anything. Really cool knowledge to have and I'm too lazy to figure it out.
Robert Roberts
inb4 they use doublers and don't announce it anywhere, resulting in literal housefires
Connor Anderson
>The problem is that gaming hasn't been massively bandwidth-starved on modern GPUs for years
That's bullshit though. At resolutions above 1080p, and especially at 4K, games are massively bandwidth-starved even on the cards with the widest buses on the market. In many cases at higher resolutions, overclocking the memory brings bigger gains than overclocking the core on current cards.
Cost is the sole reason Nvidia isn't using HBM2. It simply isn't worth it when only a fraction of their audience actually games at 4K. Their profit margin will be much bigger sticking with GDDR and sidestepping all the technical challenges and added costs of implementing HBM2.
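The bandwidth argument the two posts are trading is easy to put rough numbers on. A minimal sketch, where the per-pin data rates and bus widths are illustrative assumptions rather than official specs for any card in the thread:

```python
# Rough peak-bandwidth arithmetic. The per-pin rates below are assumed
# example figures, not confirmed specs for the leaked board.
def peak_bandwidth_gbs(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gb/s."""
    return bus_width_bits / 8 * rate_gbps_per_pin

# A 256-bit GDDR6 bus at 14 Gb/s per pin:
gddr6 = peak_bandwidth_gbs(256, 14)    # 448.0 GB/s

# Two 1024-bit HBM2 stacks at 2 Gb/s per pin:
hbm2 = peak_bandwidth_gbs(2048, 2)     # 512.0 GB/s

print(gddr6, hbm2)
```

The point the arithmetic makes: HBM's slow per-pin rate is offset by a vastly wider bus, so the comparison hinges on interface width, not clock speed.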
Cooper Hill
yeah anonymous I'm sure it's the laziness LOL
Jace Smith
You mean stupid, I assume.
Austin Hughes
Oh, the stupidity too. But I could probably gain a surface level knowledge of the various components if I bothered to do the research.
Ian Murphy
I have a 980Ti and I can assure you that overclocking core gives me much more performance on 4K than overclocking memory.
Justin Gutierrez
if your front end is choking, overclocking your core won't do jack shit because the massive textures will still be choking behind it
Ian Garcia
Hey faggot, the Quadro GV100 only has a PCIe slot connector, and NVLINK WORKS as a GPU-to-GPU link on Intel motherboards
FUCK OFF FAGGOT IF YOU CAN'T UNDERSTAND SIMPLE THINGS LIKE THAT
Textures in games have been stagnating since like 2013
Austin Harris
I suppose you won't know until you try.
Robert Torres
A PCIe connector plus an NVLINK GPU-to-GPU link is confirmed to work by Nvidia themselves, but of course some non-engineer faggot thinks he knows better than Nvidia
>The Quadro GV100 GPU, with 32GB of memory, scalable to 64GB with multiple Quadro GPUs using NVIDIA NVLink™ interconnect technology, is the highest-performance platform available for these applications.
>2 fan connectors I'm sure one is just a power source for the inevitable smattering of RGB LEDs.
Elijah Gonzalez
>RGB they only need green and a breathing effect
Joshua Ward
It's really weird how radio-silent Nvidia is being about 11x0/20x0. I'm bracing for disappointment (a 10-series refresh, which itself was already a refresh of the 9 series)
Nathaniel Wilson
>muh hbm
GDDR6 can be faster than HBMeme 2, you bunch of retards
>Its really weird how radio silent Nvidia is being about 11x0/20x0.
It's because Nvidia is competing with themselves right now; Vega was a massive disappointment for gaming.
GPU to CPU link doesn't work because Intel and AMD CPUs don't support it, motherfucker
But the GPU-to-GPU link does work; they bypass the CPU and chipset and talk to each other directly, hence the NVLink connector on top of the cards. Something so simple, yet a faggot like you can't even understand it
KYS FAGGOT because people don't need to deal with retards like you
Really you're paying for the license to use it, but yeah, that's probably all that thing is.
Xavier Mitchell
>GPU to CPU link doesn't work because Intel and AMD CPUs don't support it, motherfucker
no shit, as if I didn't say that already
no, it's missing quite a lot of connections. You would have known that if you bothered to count the actual lanes leaving the core. It's literally a second SLI and nothing more.
Adam Gonzalez
It's used with cards that go for over $6k each.
Jayden Sullivan
Quadro GV100 is $8,999 US dollars actually
Cooper Barnes
Who's the girl?
Camden Perez
Why don't you just use an adapter? I personally only use DP for my gaming monitor and run my other two monitors with HDMI and DVI off my Intel iGPU, mostly because 144Hz/60Hz/60Hz on a GTX 970 is broken
Christian Morris
There was no claim that 11/20 is a refresh, you dolt. It's bracing for disappointment that it MAY be just a refresh. The last new arch was in the 700 series; 900 was a refinement, and 10 was a refinement plus a die shrink.
Chase Cooper
Is this real????
600 USD for a fucking piece of passive dumb plastic and metal?
Adrian Reyes
does an adapter do 120+hz without input lag?
Levi Allen
>consumer nvlink
>will allow for perfectly scalable vidcard clusters without the need for custom profiles for every game
>basically what SLI/xfire should've been from the beginning
>consumer nvlink
Wut? No "consumer" will put down $2k to run dual GPUs. NVLink/SLI/XFire always have been and always will be for high-end gaming.
Logan Howard
>What are we gonna do AMDBros.
Do you seriously think people are shedding a tear over vapourware?
Alexander Russell
fuck off m8, my old 120Hz LCD monitor is still DVI
Hurr durr, let's move away from DVI and keep the shit version of HDMI
Robert Torres
>GPU to CPU link doesn't work because Intel and AMD CPUs don't support it, motherfucker
more like Nvidia doesn't allow them to support it
Ian Cooper
16GB of GDDR6 VRAM
sounds like it's easily going to cost $1000+
picked up a used Titan for $500, feels good man; sure it's 10% worse, but HALF the price
see y'all crackers in a few years when the GTX 3080 Ti comes out
Ethan Richardson
Ariadna Majewska
Samuel Baker
Are you retarded? It does not cost $1000
If you use 16Gb (2GB) GDDR6 modules, you only need 8 chips for 16GB of VRAM
Brody Morales
MOAR CORES on Threadripper!
Luis Rivera
>Ngreedia will now refresh their lineup every 6 to 9 months whenever they buy faster GDDR6
let's hope it will OC well just like Micron promised ('member the 20Gb/s GDDR6 test)?
Hudson Campbell
This, and it fucking annoys the shit out of me. Since I play at high res, everything still looks like shit because the textures are made at fuckin 1366x768
>GDDR6 can be faster than HBMeme 2 bunch of retarded
That argument is like claiming that 500 SATA III HDDs in RAID 0 are faster than a Samsung 970 NVMe. HBM has:
- less power consumption per pin
- more bandwidth per watt
- takes less board space
- can have a much wider interface to the CPU/GPU
What are the disadvantages? Obviously the know-how, a good packaging partner, and having proper PHY IP.
novidia is a known jew in the GPU market; they won't give you the latest and greatest. Instead they'll sell you something cheap and market it as great. Remember the laughable move with the two versions of the Titan in the 10 series?
>Here's a Titan
>one year later, here's another Titan, a less shitty one
But GDDR6 is still cheaper and can be faster than HBMeme tho...
Brayden Robinson
HBMeme requires an expensive interposer, and consumers really only care about how many FPS they get. The only current benefit of using it over GDDR6 is smaller PCBs for stacking more Teslas in the same space.
Cooper Thompson
That's why NVIDIA only puts it on high-end cards, imbecile
Christopher Diaz
>HBM is a meme!
>that's why they're only using it on the highest end cards dummy!