Nvidiots btfo

Attached: vegasquared.jpg (1420x832, 303K)

AMD btfo

Attached: 1541845437750.png (655x161, 27K)

jesus fucking christ
this is the only impressive thing about the cheese grater

>tfw apple will use an obsolete GPU and obsolete CPU in their premier $50k workstation on launch

>proprietary graphics card
>$3999 each

>best gpu in the world
>obsolete

Attached: 1550785235121.png (856x846, 85K)

>Pooga HOUSEFIRES 550W
>best

TOP KEK

>AMD GPU
>Intel CPU
Nvidiots confirmed utterly BTFO and irrelevant!

This
I know we do what we must to keep the anti-Apple jerk going,
but this is pretty impressive.
With 2 of these cards in a Mac Pro, that's 128GB of HBM.
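Napkin math behind that figure (assuming 32GB of HBM2 per Vega 20 die and two dies per Duo card, per the launch coverage):

```python
# Total HBM2 in a Mac Pro with two Vega II Duo cards.
# Assumes 32 GB of HBM2 per Vega 20 die and two dies per Duo card.
hbm_per_gpu_gb = 32
gpus_per_card = 2
cards = 2

total_hbm_gb = hbm_per_gpu_gb * gpus_per_card * cards
print(total_hbm_gb)  # 128
```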

It won't sell, so AYYMD won't make money on this either. Still, it's nice to see that Applel still hates Ngreedia's shady advertising and defective products enough to show them the middle finger.

Infinity Fabric linking two dies on a GPU seems interesting tbqh

Or maybe I'm retarded, idk

Still no CUDA replacement.

Just to be perfectly clear, it's not possible to put a high-end GPU in this, since power is delivered through a facsimile of a PCI Express slot and not through the usual PCI Express power connector.

NOOOOOOOOOOOOOOOO WOOD SCREWS MATTER

Attached: 1457365892325.png (808x805, 49K)

Sounds like a job for the Chinese to make a breakout board for some PCIe power

Nonstandard junk, enjoy not being able to upgrade ever

>Putting 475W through a PCIe slot
That's gonna char eventually.
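For scale, rough math on the current involved (the 475W figure is the card's quoted power draw from the thread; the 12V rail is the standard GPU supply voltage):

```python
# Current needed to deliver 475 W over a 12 V rail.
power_w = 475.0
rail_v = 12.0

current_a = power_w / rail_v
print(round(current_a, 1))  # 39.6 A through the connector
```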

Module design looks like it was ripped off of the Razer modular PC concept.

Nothing wrong with pulling that amount of power through an edge connector.
It's not a normal PCIe connector.

AAAAAAAAAAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

Attached: 1551474076187.jpg (500x375, 104K)

I honestly don't get this. Apple must have some insane blood pact with Intel. AMD could have supplied them with 64c TRs with PCIe 4.0.

What are you even trying to say, Pajeet? Vega sucks down as much power as any GPU on the market. If the power solution can power that, it can power literally anything. Or are you just retarded and assumed that this is 580X-based, since the base model is?

Apple are basically the beta testers for Intel. Lightning connector was a beta for USB C.

He's saying you can't whack any high end GPU in it because of the custom power delivery method.

So is the second "PCIe" connector just for power? Because if it is, and they're sticking to standards for data transfer, it would actually be trivial to make an adapter for these, especially with how common PCIe risers are these days. The most annoying part is you'd have to use one of those meme cases that have a separate area for GPUs. The most difficult part would be drivers, assuming these need special drivers to work properly. Not that anyone who actually needs these GPUs would ever use such a hacked-together solution, but it's a fun thought.

Wait, AMD GPU + Intel CPU... this is probably the worst combination of hardware you can get.

It's more of a concern that companies aren't gonna make the custom boards for these MPX modules, leaving the new Mac Pro in the same position as the trash can, where it just doesn't receive upgrades.

Nvidia still seems to be serious about GPUs for Mac
Nvidia can always make new GPUs for the Mac Pro, they already have the drivers

its just two normal GPUs on a single card

I want to be clear, I was talking about making an adaptor to use these cards in a standard PC.

Apple doesn't use inferior and poorfag products like AMD, the only reason they're not sporting Quadros is because of past grievances and problems with Nvidia.

They're dropping those housefire Vegas if the new Intel GPUs have similar or better performance.

Wonder how Bootcamp will fare.

Nvidia are straight up dicks to work with. AMD will make you a custom anything.

Also, it seems like it would be fairly trivial to run a normal GPU in Apple's meme socket, as long as that's a standard PCIe 3.0 x16 socket and the second one is strictly power. A normal GPU would clear the power socket entirely. So the only difficult parts are getting standard 4(+2/4) power cables to power it, which is going to mean a second PSU unless Apple has defied all odds and put the ports on their PSU, and drivers/software, which should only be a serious problem if Apple goes out of their way to make other GPUs not work.

Apple never really went out of their way to make other GPUs not work.
macOS has out-of-the-box support (or near out-of-the-box) for lots of GPUs that were never found in a Mac, from both AMD and Nvidia.
My R9 380, which as far as I've seen has never been inside a Mac, has full acceleration under macOS with no modifications.

A second PSU is not a feasible work-around.

>intel cpu and amd gpu
The most fucking retarded combo LOL

>still no ROCm

AMDrones btfo.

>shitel pozzed housefires
WHAT WERE THEY THINKING?

I know. I'm just talking from a purely technical standpoint.

>Lightning connector was a beta for USB C.
come on, this is pure BS.
TB was Intel's desperate move to get more influence on the connector market.
They took PCIe and moved it outside.
TB has the most horrible vulnerabilities, the DMA attacks.
They tried for 3 iterations to fix this shit, they broke compatibility a couple of times, and they gave up because a TB license plus the chips had a BOM of almost $100 in the early days, and no one wanted to add $100 to the final product just because Intel wanted some easy money.
They announced that from 2019 TB would be royalty-free, and the guys at USB-IF took the TB standard and cooked it into the next version.
Apple has adopted the TB shit and thus they are forced to use Intel CPUs.
They could break free from Intel as soon as someone else provides a TB-compatible chip.

>infinity fabric link
When is based Wang going to make chiplet GPUs a reality for everyday tasks?

Attached: ganbare JC Staff.webm (1280x720, 2.76M)

Thunderbolt is royalty free
New AMD AM4 X570 motherboards have thunderbolt

>intel cpu
Does it come with all the mitigations applied?

Attached: 1558816973358.jpg (960x717, 187K)

If they can apply infinity fabric on GPUs then things will get really interesting. Think Radeon RX 5800 x2

I didn't say Thunderbolt, I said USB C, meaning Lightning was a beta test for power delivery on a reversible connector.

>two gcn shit housefire
>not cuda
>apple

No thanks

you forgot
>cooled by your imagination

>still slower than 2080ti
>cant even do cuda/tensors making irrelevant for professionals

Attached: f8174cca00c030759c0c951ffe1cd8e481c4a0e7ec679d25712f54ccd8310b3e.jpg (522x422, 54K)

>don't need firefighter squad on standby

>Radeon 7 is faster per core in professional workloads
It's purely for that; it's the only thing Vega is exceptional at.

Are you retarded? They use Intel's Thunderbolt 3 chips for Thunderbolt support; no third party has a TB3 controller.

ark.intel.com/content/www/us/en/ark/products/codename/84976/titan-ridge.html

They're called contracts.
Also, Apple depends on TB and also depends on CPU technologies like Intel QuickSync.

He's talking about a future that isn't here yet.

>Wonder how Bootcamp will fare
Don't worry, they'll gimp it somehow. Gotta make macOS look good somehow.

>Are you retarded?
Are you? Several AMD x570 boards have TB3 support.

>Nvidia are straight up dicks to work with
Fun fact: Nvidia tried to license Kepler to Apple and other companies. Nobody accepted the deal. Nvidia sued the other companies in retaliation over "Kepler-related patents". Apple, being a giant company, wasn't part of the suit, but it was obvious that Nvidia was threatening it too. Since then Apple has cut all its relationships with Nvidia.
Thank Nvidia for your lack of support on macOS. They can't sue everyone in a tantrum and come out clean.

Next cycle. He has to polish the shit that rajesh left behind in the meantime

Show one then.

What do apple users need Nvidia for anyway? Facebook isn't that graphics intensive

>Since then Apple cut all their relationships with Nvidia.
Never mind the faulty chips that Nvidia sent out during the 9800gt/650m era as well.

Nvidia in general is a bunch of elitist dickholes

Hey guys, we made a NEW FEATURE literally nobody asked for! Also it makes GPUs cost twice as much! Pay the tax, you cuck!

pcmag.com/feature/368729/here-s-all-the-amd-x570-motherboards-we-saw-at-computex-2019/15

The MPX module is just a standard PCIe connector with power delivery that makes it cable-free. Creating an adapter for those would be trivial.

Are you asking seriously or shitposting?
This could be a factor, but AMD also had faulty chips during 2011 and they still supply Apple.

Are the connectors on the top of the card infinity fabric links to the second card? Interesting

I understand that the infinity fabric is internal.

>He has to polish the shit that rajesh left behind in the meantime
Raja unironically did nothing wrong.

Just a heads up: this will not work well on PCIe 3.0, as the 1080 Ti, 2080, and 2080 Ti are already starting to push its limits with a single GPU at x16. Even in SLI, not all data goes through the SLI bridges. So dual GPUs in a single PCIe slot would surely max out PCIe 3.0, whether they get 8 lanes per GPU or share the whole x16 link between them.

Attached: C39767F5-A9B3-4200-90EF-AC3AE34B540D.gif (399x152, 135K)
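The bandwidth numbers behind that concern, using the standard PCIe 3.0 figures (8 GT/s per lane with 128b/130b encoding):

```python
# PCIe 3.0 per-direction bandwidth: 8 GT/s per lane, 128b/130b encoding.
GT_PER_S = 8e9
ENCODING = 128 / 130

lane_gb_s = GT_PER_S * ENCODING / 8 / 1e9  # ~0.985 GB/s per lane, one direction
x16_gb_s = 16 * lane_gb_s                  # ~15.75 GB/s for a full slot
x8_gb_s = 8 * lane_gb_s                    # ~7.88 GB/s per GPU if the slot is split

print(f"x16: {x16_gb_s:.2f} GB/s, x8 per GPU: {x8_gb_s:.2f} GB/s")
```

So however the two dies share the slot, each sees at most half a x16's worth of host bandwidth when both are loaded.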

I wonder what those will be used for then; CrossFire has been deprecated since the Hawaii cards. Is there enough bandwidth over PCIe 3.0 x16 for these cards to communicate with each other?

See

Dual cards work fine because they have a controller chip to feed them correctly; CrossFire/SLI is totally dependent on the driver/developer to work right, and things are a mess.

Infinity Fabric exists between the GPU dies precisely to avoid having to go over a PCIe bus.
If Infinity Fabric is meant to go across cards (which according to Tom's Hardware appears to be the case), it would only make sense that they would use those connectors at the bottom.
tomshardware.com/news/amd-radeon-pro-vega-ii-7nm-gpus-apple-specs,39571.html

Sorry, CrossFire bridges have been deprecated since Hawaii, and all CrossFire communication takes place over PCIe bandwidth instead.
I guess we'd have to look at how the Infinity Fabric links these chips together on the same card. It looks like they are still using a PLX chip like the 295X2 / 390 X2 cards do.

>It looks like they are still using a plx chip like the 295x2 / 390 x2 cards do.
Infinity Fabric links the GPU dies directly, not over PCIe.
AMD made a note of how IF is faster than what can be achieved with PCIe.
videocardz.com/80956/amd-announces-radeon-pro-vega-ii-duo-a-dual-vega-20-graphics-card

Attached: IF.png (892x241, 51K)
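Putting AMD's claim in numbers (the 84 GB/s Infinity Fabric Link figure is what the launch coverage quoted; treat it as a vendor claim, not a measurement):

```python
# Infinity Fabric Link vs. PCIe 3.0 x16, one-direction bandwidth.
IF_LINK_GB_S = 84.0                        # quoted figure for the Vega II Duo's die-to-die link
PCIE3_X16_GB_S = 16 * 8 * (128 / 130) / 8  # lanes * GT/s * 128b/130b encoding / bits-per-byte

speedup = IF_LINK_GB_S / PCIE3_X16_GB_S
print(f"{speedup:.1f}x PCIe 3.0 x16")  # ~5.3x, consistent with AMD's "~5x PCIe" framing
```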

Fucking neat. A shame apple is the first to adopt it however.

Does macOS need them?
Intel is superior in shit like Premiere Pro because they work with Adobe. If AMD ever bothers to work with these companies and optimize their code, we'll start to see AMD CPUs in these sorts of PCs.

Anything that uses javascript needs patching.

Remember when Nvidia wouldn't let Sony see the schematics for the 7600GT custom SKU in the PS3, and then wanted to charge them for each version of PSGL developed? Similar stuff happened with the OG Xbox.

Remember when Apple wanted a 5k display, but Nvidia wouldn't produce the timing controller, so Apple built their own and went to AMD with it?

Nvidia are just dicks to work with.

Literally everyone except the sheep hate Nvidia.

The CPU link is still over that PLX, so it's x8 max if both cards are loaded at the same time.

>its x8 max if both cards are loaded at the same time
What? Why? Surely Xeon has the PCIe lanes to do two x16 connections?

This shit reminds me of VLB

Attached: 317_diamond_stealth_24_vlb_rev.c4_top_hq.jpg (1600x613, 311K)

Intel has 6 memory channels, lower latency, and better optimization for software in general (like Adobe). AMD knows very well that they have to start working with software companies if they want to break into the production world.

It's more to do with the fact that a single slot only supports x16 lanes, not x32. So you have to split the lanes/bandwidth between two GPUs somehow, which is where the PLX chips come in on cards that have done this before. What's new on this card is Infinity Fabric, so look at the IF posts above.

its basically a vesa local bus, but now with apple vesa is sold separately

>between two GPU
Oh, gotcha. Forgot for a minute these are essentially two GPUs sharing a x16 connection. It's somewhat interesting to me that they didn't go ultra meme and either do a bizarre custom x24/30/32 slot (I'm not sure how many PCIe lanes these Xeon parts will have available to expansion cards, I think they have 64 total, but I'm sure how many of those are reserved for the chipset) or just had a second connector protrude to a second x16 slot. x16 per card is pretty limiting, especially since the choice to go Intel means no PCIe gen4.
The software, and, more importantly, whatever partnership Apple has with Intel, are really the biggest blockers. AMD probably has Epyc 2 SKUs that would work better in this from a technical perspective, but it'd require significant work to get macOS and creative software to perform optimally across all the cores.

It's basically an Instinct MI60 on a stick. The only interesting thing about it is the IF connecting the Vega 20 chips.

This might be a sign that AMD RTG is seriously going the MCM route with post-Navi stuff.

>Oh, gotcha. Forgot for a minute these are essentially two GPUs sharing a x16 connection.
It's just occurred to me that the second connection could be a full x16 connection, and the chip on the board is for Thunderbolt 3 support, seeing as AMD doesn't actually have native Thunderbolt 3 support, from my understanding.

Attached: Untitled-1.png (1420x832, 943K)

nvidia btfo

Attached: 68d2-rtx2080-pcb.jpg (1334x812, 258K)

>the second connection is a full x16
Uh, based on what, exactly? The second connection is just power. You can't pump 475 watts through pins that small unless you use a ton of them. I'm pretty sure they explicitly said it was strictly for power, since they didn't want to use traditional 4+(2/4) power connectors.

That's probably what the bigger single tab is for: a large 12V current. There is no reason to have that many pins for power delivery, especially only for 12V. See power supplies in servers.
>I'm pretty sure they explicitly said it was strictly for power
Source? I'm genuinely curious about this card now.

Attached: hp-dps-1200fb-netzteil-umbauen_power_supply_hack_resistor_soldered.jpg (2000x1331, 710K)

No. The front tab of the connector is for power, like on standard connectors.
The pin width is non-standard, too.

There is no power over the back-side pins in a standard slot. The front of the slot has been plated into a huge pin to carry 400-odd watts, and the rest must be communication, or they would use more suitable pin sizes.

Attached: pcie-slot.gif.69b588023714083f9aacd86329716680.gif (679x760, 69K)
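Rough pin-count math behind that argument. The ~1.1 A per-contact figure is an assumed rating for standard-sized PCIe edge fingers, not anything Apple published:

```python
import math

# How many standard-sized edge-connector pins 475 W at 12 V would need.
POWER_W = 475.0
RAIL_V = 12.0
AMPS_PER_PIN = 1.1  # assumed per-contact rating for standard edge fingers

current_a = POWER_W / RAIL_V                # ~39.6 A total
pins = math.ceil(current_a / AMPS_PER_PIN)  # contacts needed on +12 V alone, before grounds
print(pins)  # 36
```

Dozens of contacts each for +12V and ground would eat most of the connector, which is why one big plated tab makes more sense than small pins.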

>We put two gpus on a single card
>It takes up 4 slots of space
>"Ii-i-i-its the words m-m--mm-ost p-p-p-p--powerful card goys"

>4 slots
Holy hell I can't wait to see the cooler, I hope it's as convoluted as the water cooled g5's

Big heat sink fins to make up for the lack of a dedicated fan.

Hopefully they at least use a fucktonne of copper, but probably not.

For the base and heatpipes.

All the Thunderbolt connections use PCIe lanes, so the MPX connector likely has PCIe 3.0 x16 as well as the 16 lanes provided by the standard PCIe connector.

Now I'm really curious. All I could find is that it has PCIe, DisplayPort, and Thunderbolt. So is it just a retarded 3.0 x16 connection in a non-standard form so MPX stays Mac Pro exclusive, with DP and TB just going over PCIe? In other words, if I wanted to make my own MPX card for some retarded reason, would I have 32, or more, PCIe lanes for whatever I want? Or are some dedicated to DP and TB that can't be used for anything else? How many PCIe lanes do the GPUs they're shipping actually use?
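For reference, what Thunderbolt itself consumes: per Intel's published Titan Ridge specs (the ARK page linked earlier), each TB3 controller sits on a PCIe 3.0 x4 uplink and exposes a 40 Gb/s link that multiplexes PCIe data and DisplayPort:

```python
# Thunderbolt 3 controller vs. its PCIe uplink (Titan Ridge figures from Intel ARK).
TB3_LINK_GB_S = 40 / 8                 # 40 Gb/s TB3 link -> 5 GB/s raw, PCIe + DP muxed
PCIE3_LANE_GB_S = 8 * (128 / 130) / 8  # ~0.985 GB/s per PCIe 3.0 lane, one direction
UPLINK_LANES = 4                       # Titan Ridge upstream port is PCIe 3.0 x4

uplink_gb_s = UPLINK_LANES * PCIE3_LANE_GB_S  # ~3.94 GB/s of PCIe data per controller
print(f"TB3 link: {TB3_LINK_GB_S:.1f} GB/s, PCIe uplink: {uplink_gb_s:.2f} GB/s")
```

So each TB3 controller on the card only needs 4 lanes of budget; the rest of the slot's lanes are free for the GPU.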

DisplayPort always has to come from a GPU.