Nvidia sabotaging Kaby Lake-G release

>We've spoken to three independent and reliable sources close to Notebookcheck and they have all suggested the same reasoning - Nvidia is strongly responsible for keeping Kaby Lake-G from proliferating. Factor in the loud rumors about the anti-competitive terms of Nvidia's GPP, the rumors of HP and Dell keeping their distance from the program, and AMD's own VP acknowledging the leaks and they all strongly point to Nvidia putting a tight lid on the Kaby Lake-G platform.

Attached: 600px_kaby_lake_g_with_amd_radeon_package_1.png (600x500, 175K)

Forgot source.

notebookcheck.net/Where-are-all-the-Kaby-Lake-G-laptops-Nvidia-s-GeForce-Partner-Program-may-be-to-blame.300748.0.html

when is someone going to step in and stop this bullshit?

Based Huang fighting both the jews and pajeets.

It's absolutely revolting because it's not the only thing nvidia is doing that is fucked up at the moment. They're in a position similar to the one intel was in after sandy bridge, but they're a competent company and won't blow all the money giving people free tablet CPUs and buying McAfee.
Those kaby lake g chips are so awesome that I want one on their technical merit alone. I can't imagine any possible nvidia configuration in that price range being able to compete.
Nvidia pushed the GPP as part of this too; it had little to do with AMD cards posing a threat on desktop.
Not to mention they utterly killed GPU overclocking with pascal and pretty much just give you the illusion of having control over your card. They're limited so low voltage-wise that a supremely binned card might give you one extra frequency bin at most. Then they put logic in gpu boost 3.0 to stop hard-mod vcore control (see buildzoid's 1070 videos).
I'm fairly certain they're doing everything in their power to keep gpu prices high post mining craze so their $800 1180 isn't such a shock.
Finally they want to push everyone over to geforce now.
I can't stand this shit.

Don't buy Nvidia then :^)
-t. RX 580 owner

I personally own a 980ti. Looks like I won't have a chance to upgrade to another card that I can actually control the voltage and power limits on until big navi.
Nvidia's shit is locked down like a fucking verizon iphone.

>To date, there have only been 4 major products announced with Kaby Lake-G: The Dell XPS 15 9575, 2018 HP Spectre x360 15, Intel Hades Canyon NUC, and the Chuwi HiGame mini PC. Two of these are not even laptops, the HP and Chuwi systems are not yet shipping, and the NUC is solely an Intel product. This leaves HP and Dell as the only two notable manufacturers with overt Kaby Lake-G plans who also happen to be allegedly backing away from Nvidia GPP. Other major manufacturers like MSI, Zotac, Gigabyte, Asus, Lenovo, Acer, and others have been oddly silent about the processor series. For a product born from an inconceivable partnership between two of the largest PC rivals in history, Kaby Lake-G should have received more attention or at least comments from OEMs everywhere.
This makes me upset. AMD and Intel finally working together on something and it may fail because of Nvidia faggotry. This will just indicate to both companies that they shouldn't work together.

>big navi
so literally never ever?

Can't tell if you're talking about navi being glued together smaller GPUs or RTG's incompetence in general.

Can't wait to ditch this nigger garbage 970 and buy an AMD to finally use FreeSync which I have but can't use. Fuck you niggers at jewvidia. Also can't wait to no longer have DPC spikes that the nvidia drivers have.

Attached: 4L_3C1oEESw.jpg (2592x1728, 763K)

Grass is always greener etc

Basically, we're fucked unless we sell out to the newer Nvidia cards.
I like AMD's new CPUs a lot but damn their GPU division is pretty pathetic.

If it were just Intel implementing an AMD gpu I wouldn't care so much. It's the fact that it's such a good implementation of that GPU in such a unique product that is pissing me off. Nvidia gaming laptops are fucking trash, they fall apart.
This chip can provide enough performance for 1080p gayman in what is basically an ultrabook form factor.
Be happy you at least have maxwell. You can overclock the fuck out of a 970 if you have cooling for it. Good news is AMD's next midrange GPU should be around 1080 performance which will be a sick upgrade for you.

The grass is indeed greener when the driver uses HAL.dll instead of making its own against Windows' standards.
also,
>$200 for variable refresh rate
up yours

Nvidia even uses freesync in their laptops and brands it as gsync. There was a point in time when you could convince their driver panel from last decade that your monitor was a panel in a laptop and it would enable freesync.
Pretty sure they sent cease and desist letters to Wendell for posting a video about it and then patched it right away.

>may fail because of Nvidia faggotry
if you have an electric screwdriver at home, pin/screw the following very tight to your brain:
the first gpu at 7nm will be amd.
GF's 7nm is the one with the most potential out there and maybe the only one that is totally focused on high-performance big chips.
novidia is going to milk the TSMC 16nm cow as hard as they can because TSMC 16nm > GF 14nm (their "12nm" is a tweaked 16nm).

2019 is going to hurt nvidia a lot.
They already have a lot of trouble with their custom designs, as it's their fault that the nintendo switch got an unpatchable security hole.

>Nvidia even uses freesync in their laptops and brands it as gsync.
that's because laptops use eDP, and variable refresh rate is mandatory according to the spec/standard.
There was a leaked driver that exposed a non-"G-sync" marketed laptop as capable of enabling g-sync.

but lisa already said that AMD's gpus are going to be built on TSMC's 7nm, not GF's

honestly we don't know who is doing what, just that 7nm is the thing they are using.

nope.
all are either 12nm GF or 7nm GF.
The 400/500 series are on Samsung 14nm
and
all novidia gpus are on 16/12nm TSMC.

nvidia doesn't have the capacity to go against intel

pretty sure it's apple's fault that we aren't seeing much

you can pass the nvidia gpu through to the lowest end amd one, and that one can use freesync, and it will work.

jerryrigging at its finest, but it will work.

tomshardware.com/news/amd-earnings-call-tsmc-7nm-gpu,36957.html

Remember when Intel was doing this to AMD? This is hilarious if true.

Brainlet here, what exactly did they do? The OP sounds pretty vague.

Nvidia, through their partnership programs with major OEMs, is essentially forcing those OEMs not to build Kaby Lake-G laptops. Not a word has come out of MSI, Zotac, Gigabyte, Asus, Lenovo, or Acer about their own KL-G laptops. The only two laptops using it that have released are from Dell and HP, both of whom are trying to distance themselves from the program.

Nvidia's GPP puts manufacturers at a disadvantage if they choose to sell AMD products alongside Nvidia's. They must choose to sell Nvidia-only versions of products if they want to stay in the business, which is a borderline illegal practice.
for instance, MSI dropped the gaming-X series of AMD cards, leaving only Nvidia ones

Why would anyone do that? It's an open market, that's how it works.

You think any of those companies cares about performance or their users?
No, they care about money and making sure they make the most.

Attached: Laughing Whore.jpg (762x900, 161K)

So literally bribing? Why don't Intel and AMD do the same to these partners? Together they must have more money than Nvidia, right?

>intel
irrelevant in GPU market
>AMD
way, WAY less market share than Nvidia

>AMD's own VP acknowledging the leaks
Wait, what does this have to do with anything?

Intel dare not risk AMD getting any serious leverage - the cpu market still massively dwarfs the gpu market. Then again, AVX2 basically exists so Intel can pick a fight with Nvidia anyway.

Remember: Intel locked Nvidia out of the chipset market years ago.

kaby lake-g is the intel laptop sku with vega graphics?

is it out already?

nvidia is intel's buddy in the laptop market, the intel+nvidia combo is very common there. why wouldn't intel bite nvidia's ass if they pulled a shit move like this on them?

it's out, but only in NUCs, which is a shame since it's ideal for laptops

>open market
>end game is stagnation by eliminating any sort of competition

Attached: Laughing Whores.gif (400x400, 117K)

>AMDtard bunker thread
>"waaaaaaah why is the biggest GPU maker bullying our favorite company waaaaaaah"
>AMD should've gotten gud with their GPUs/CPUs instead of sucking
>AMDtards take it up the ass once again
kek

I'm in the same boat and should have upgraded to Vega 56 when it came out, but I was too short-sighted. To be quite frankly honest familia, I would have bought the Vega Frontier if it had SR-IOV.
Maybe they might glue a few 7nm Vegas together and call it a day. The Vega on intel's chip is such a good GPU, they should have released that standalone.

>literally bribing the manufacturers not to sell competitor's products
if you see nothing wrong with that, regardless of your brand preference, you're a mouthbreathing retard

For you unironic retards, this is everything a free market shouldn't be.

Why don't intel and AMD sue then? As long as they can get a single whistleblower from any of those companies willing to torpedo his career and leak official communication on the program, they've won.

you're a special kind of retard, aren't you?
by the time settlements would be made, the chip would already be irrelevant, considering legal actions take time

There are no plans for an APU as powerful GPU-wise as Kaby Lake-G; Nvidia is effectively killing the best portable gaming chip ever due to their faggotry

it makes me happy to know that nvidia will never be able to produce an x86 soc. amd could make one even without intel, kek

nvidiots on suicidewatch.

tfw no 7nm GF

That's how it works, it's easier to try your best to destroy the competition than to improve your own products.

>No, they care about money and making sure they make the most.
Humanity will never learn, we're bound to become prisoners of our deadly sins.

>never be able to produce an x86 soc
Not true

x86 patents wont last forever :)

>our nvidia x86 was made with love and node.js from the ground up :)
not happening.

Isn't this largely responsible for the lack of AMD GPUs in notebooks full stop?

Not that AMD's mobile GPU options have been particularly competent lately

They already planned to use their denver core

>Also according to Demerjian, Project Denver was originally intended to support both ARM and x86 code using code morphing technology from Transmeta, but was changed to the ARMv8-A 64-bit instruction set because Nvidia could not obtain a license to Intel's patents.

All I see is
>Nvidia could not

>Not that AMD's mobile GPU options have been particularly competent lately

not even in the low-mid range? they can always price them a little bit lower than nvidia.

>preventing others from competing
>open market
Are you fucking retarded?

AMD can compete in every price sector but there's a huge problem and that is size. Nvidia's dies are much smaller and they're still fucking up AMD. The margins must be huge.

Nvidia can't into hbm, intel partnership in 2018, x86, android market (they lost that because no kernel source led to no updates - how incredibly stupid), privacy conscious drivers, linux drivers, macos drivers (they have to manually be installed - usually you dont install drivers in osx), reasonable power consumption, future proofing due to big memory buses, low pricepoints, no microstuttering (remember the dual gpu fiascos? i do), they can't handle competition like a real man that doesn't hide behind leather jackets, they hype vaporware (woodscrews), scamming customers (3.5gb) and throw away all the single moms money for ads in videogames. ps3s died en masse due to overheating nvidiachips, they sell identical cards but slap a different model number on it to enable 10bit moneyfarming, they overcharge for cuda licenses.

and now they're even hindering humanity's progress.
could they be more evil?

Attached: 1509475258328.jpg (750x500, 77K)

capitalism

Sounds good to me. Nvidia has nothing to gain by teaching Intel how to make competent GPU features and everything to lose. They're both ultra jews and we all know that as soon as Intel catches up, they'll drop Nvidia off to die.

Obviously this whole industry is fucked up but I can see why Nvidia might try to screw Intel first.

ok, but speaking about real performance and power consumption, how's amd's mid-range discrete laptop graphics lineup?

i haven't been following much lately, and laptop graphics is generally a mess because it's not really covered by tech sites

The aristocrats.

Polaris is fine and outright beats the 1060, but we're back to the main point of efficiency and die size. The size delta isn't so big, but the same can't be said about the efficiency delta.

This is literally the free market at work

wonder of wonders the free market is anti-competitive and seeks to squash their opponents in any way possible, not just through competition in the marketplace itself

I honestly wonder how long it'll be before corporations start hiring hitmen to off each others' talent or executives

the only hope is the EU, and it will take YEARS, if a case is ever even brought against nvidia's bullshit in the first place

It was run by a poojeet with a management team that wanted to sell the division to Intel and did everything they could to fuck AMD over (ironically they helped save AMD by pure accident, thanks to compute being useful for mining and bitcoin coming out of left field)

AMD upper management only finally clued the fuck into this with Vega and now the entire upper management team of the RTG division has 'quit' (fired) and been replaced with zen members and some old RTG blood who are leaving qualcomm now.

AMD's not in a good position with their GPUs, but they're in a better position than a year ago now that they've rid themselves of literal internal saboteurs.

>Nvidia can't into hbm,
stop read here

Attached: amdfag.jpg (8624x3896, 2.33M)

SR-IOV on a consumer gpu = instant buy for me

goodbye windows, hello linux and virtual machines
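To make the SR-IOV wish concrete: no consumer Radeon actually exposes SR-IOV today (only the MxGPU/FirePro server parts do), but if one did, handing a virtual function to a guest under libvirt would look roughly like this sketch (the PCI address is invented for illustration):

```xml
<!-- Hypothetical libvirt <hostdev> entry: assigns one SR-IOV virtual
     function of a GPU (PCI address invented for illustration) to a guest -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x1'/>
  </source>
</hostdev>
```

The host would first carve the card into VFs via the vendor's PF driver, after which each VF shows up as its own PCI device that a guest drives with a normal driver - no full-card passthrough or driver-unbinding dance needed.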

It's like these companies hate my money

Are you? That is literally what happens to EVERY "open market" that isn't kept that way by a government.

>ps3s died en masse due to overheating nvidiachips
ps3s died because of shitty, cracking solder joints. dies were fine.
xbox had the same problem.

I hate nvidia but this was funny, even though Huang is a gook

Blade Runner-style corporate-owned dystopian society WHEN?

Dubs of truth right here.

^^^^^

This is the real reason behind the GPP.

Nvidia wants to stop AMD and Intel iGPU solutions from eating away at their mid-range discrete market (a.k.a. the lion's share of the discrete market)

They are trying to fight hard against the inevitable decline of discrete GPUs in mainstream rigs and systems.

They will lose, there are rumors AMD and Intel are reducing the number of PCIe lanes on mobile chips. All the lanes will be taken up by the other peripherals like NIC, WLAN, USB, Camera etc. There will be no lanes left for a discrete GPU.

This will destroy Nvidia on the mobile market

literally living it as we speak

Yeah, not really, not yet.

I hate Intel with a passion and have been using AMD all my life. But this is one instance where I hope that Intel just deletes Nvidia with illegal shit like prohibiting Nvidia GPUs from working on Intel motherboards or some other illegal shit

fuck nvidia and intel, but in this fight, i am on intel's side

>Nvidia's dies are much smaller and they're still fucking up AMD.
What?

Look at Vega's 484mm2 vs the 1080's 314mm2 and tell me that isn't a loss for AMD. And Vega even needs HBM2 on top of that!

It would be amazing, but I don't think it's happening anytime soon. Though with how they're giving stuff like SenseMI to consumers, they might give it to consumers one day. Maybe 7nm will ship with it? Please, I would unironically pay frontier prices for it.

What's that sound off in the distance? It sounds a lot like a bunch of people lawyering up.

Vega 64 is approximately a match for an aftermarket 1080, since Pascal's GPU Boost pushes clocks past the rated boost when cooled decently.
The 1080 die is around 300mm^2. The Vega die is 500mm^2, not counting the HBM2 stacks. And the 1080 is on 16nm, not 14nm. AMD is at a huge disadvantage in die AND transistor efficiency. And to make matters worse, Fury and Vega hit the max reticle size for their interposers IIRC, so unless they remove the HBM dies they can't go bigger. And 4096 shaders is the max for GCN.
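The die-efficiency point above is easy to put in numbers. A quick sanity check using the commonly cited public figures (GP104/GTX 1080: ~7.2B transistors on ~314 mm²; Vega 10: ~12.5B on ~486 mm² - the die sizes quoted in this thread vary by a few mm²), taking rough gaming parity between the two cards as the thread's premise rather than a measured fact:

```python
# Rough silicon-cost comparison between Vega 10 (Vega 64) and GP104 (GTX 1080).
# Figures are commonly cited public specs; performance parity is assumed
# from the thread, not measured here.
gp104 = {"die_mm2": 314, "transistors_b": 7.2}    # GTX 1080
vega10 = {"die_mm2": 486, "transistors_b": 12.5}  # Vega 64

area_ratio = vega10["die_mm2"] / gp104["die_mm2"]              # ~1.55x the area
xtor_ratio = vega10["transistors_b"] / gp104["transistors_b"]  # ~1.74x the transistors

print(f"Vega 64 spends {area_ratio:.2f}x the die area and "
      f"{xtor_ratio:.2f}x the transistors for roughly GTX 1080 performance")
```

Half again the die area is exactly why the margin gap is "huge": a bigger die means fewer candidates per wafer and lower yields, before HBM2 and interposer costs are even counted.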

Good thing they're actually building a research and development dream team then.

That said, I really don't see the problem with Navi being composed of chiplets. The savings incurred by not going with a monolithic design are pretty huge.

>dream team
>new uarch in 2020 meaning it's still Raja's baby

Yeah ok

>Until big Navi.

AMD confirmed they're not aiming to release high end chips, but a 1080-like card for $300.
It means that they just want to release cheap, not top-of-the-class cards and consolidate as the budget shitter gpu seller, appealing to both miners (who will be able to buy them en masse) and casual gamers.

FUCK THIS LIFE, I TRUSTED YOU AMD.

I rly hope this is true...
All hope seems lost really.

Nah. The only actual fact is that Intel got BTFO by the mining boom, since AMD has a GPU lineup and they don't. The only things miners buy from them are the least expensive chipsets suitable for infinite PCIe multiplexers and trash-bin Celeron CPUs. They can't even inflate prices and blame miners. Core i9 only gets a few enthusiasts' attention, Core i7 only gets Intel fanboys, and Core i5/i3 are the only ones that make some profit because of OEMs. Xeons were the only big profitable thing they had, then came EPYC and took a good portion of their market share. About damn time they started giving up on trying to keep a monopoly.

Now, comes nVidia and pushes out a "program" that's probably going to be taken as anti-trust and whatever by every commercial organization out there. You just wait until some 1st world government gets butthurt because they got tied to nVidia for the foreseeable future and AMD/Intel start doing everything they ever needed in GPU terms for max half the cost.

>Implying Intel will catch up
kek

Except he did nothing wrong, compute is the future of GPUs, and for the first time since pre-ATi Stream AMD became extremely competitive with Nvidia on workstation applications
Raja did pretty good despite the anemic budget RTG had
High end gaming is dead, 4K has no perceivable advantages at any normal screen size and viewing distance, and VR is still a failure
There's no need for anything past 1080 performance on desktops

>SPECULATION

But there is, user.
I play at 1440p, which does have its advantages and is the new niche. I don't care about 4k, but I do care about reaching 144Hz+ on ultra settings at 1440p, and in the latest titles that's not even possible with a Ti in a lot of cases.
Also, you're basically neglecting the fact that videogame graphics are still far from reaching a 99% realistic view. Real-time ray tracing is just being born at the moment, and that's going to push GPU development even further.
Not even talking about VR, when virtual reality will eventually become the future of media entertainment if they can ever reach realistic graphics.
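For scale, the jump being asked for here is easy to quantify: raw pixel throughput scales with resolution × refresh rate. This is only a first-order approximation (real frame cost doesn't scale perfectly linearly with pixel count), but it shows why 1440p144 is so much harder than the 1080p60 most cards are marketed against:

```python
# First-order pixel-throughput comparison: 1440p@144Hz vs a 1080p@60Hz
# baseline. Real workloads don't scale perfectly linearly with pixel count,
# so treat this as a rough lower bound on the extra GPU grunt needed.
def pixels_per_second(width, height, hz):
    return width * height * hz

target = pixels_per_second(2560, 1440, 144)   # 1440p at 144 Hz
baseline = pixels_per_second(1920, 1080, 60)  # 1080p at 60 Hz
ratio = target / baseline
print(f"1440p144 pushes {ratio:.2f}x the pixels of 1080p60")  # ~4.27x
```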

i7s are still the best selling processors (speaking about Coffee Lake too). And they're still the best for purely gaming reasons due to a higher single core frequency.

this

fuck raja koduri

But muh 1440p

Ray tracing has been done since the 80's, illiterate /v/edditor
144+ Hz gaming is a niche of a niche, the vast majority of consumers are fine playing at 30 fps
Even for the photorealism meme, ray tracing barely improves perceivable quality over traditional shaders

>fired
>gets an entire division made just for him at chipzilla
o, i am laffin

I've been out of the GPU loop for quite a while. Why are we comparing the Vega 64 to a 1080 instead of a 1080 Ti, where the die size comparison becomes 486mm2 vs 471mm2? Furthermore, the transistor count would be 12.5B vs 12B despite the smaller process size.
Looking at the GFLOPS, this seems like the comparison that should be made, with Vega blowing the Ti out of the water when it comes to double precision. Is Vega's architecture not conducive to gaming, such that a comparison to the 1080 is more appropriate? Or is it a matter of the games and compilers being optimized for Nvidia and not taking advantage of Vega?
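The GFLOPS question answers itself from the published specs: theoretical throughput is 2 ops (one FMA) × shader count × clock, and the two chips differ in FP64 rate (Vega 10 runs FP64 at 1/16 of FP32, GP102 at 1/32). The clocks below are the reference boost values, so treat the absolute numbers as ballpark:

```python
# Theoretical FP32/FP64 throughput from published shader counts and
# reference boost clocks (real sustained clocks vary, so these are ballpark).
def tflops(shaders, clock_ghz, fp64_rate):
    fp32 = 2 * shaders * clock_ghz / 1000.0  # 2 ops per FMA
    return fp32, fp32 * fp64_rate

vega64_fp32, vega64_fp64 = tflops(4096, 1.546, 1 / 16)  # Vega 64
ti_fp32, ti_fp64 = tflops(3584, 1.582, 1 / 32)          # GTX 1080 Ti

print(f"Vega 64: {vega64_fp32:.1f} TFLOPS FP32, {vega64_fp64:.2f} TFLOPS FP64")
print(f"1080 Ti: {ti_fp32:.1f} TFLOPS FP32, {ti_fp64:.2f} TFLOPS FP64")
```

On paper Vega 64 edges out the Ti in FP32 and roughly doubles it in FP64, so the gap you see in games comes down to how well GCN keeps those shaders fed (scheduling, geometry throughput, driver optimization), not raw ALU count.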

Unless you're playing the latest and greatest games and trying like hell to max everything out w/ the highest res your display can output @ 100 fps+, why should you even give a flying fuck about what ATI/Nvidia/AMD/Intel may do? If your current setup can run your shit fine, don't worry about it, just keep right on going. If the company(s) fuck up, hey, at least the stock price will be cheap to buy.

Mostly the hardware scheduler, nvidia does the scheduling in their drivers. This has positives and negatives, but comparing die sizes is kind of pointless in this case.

>Why are we comparing the Vega 64 to a 1080 instead of a 1080 Ti
Because it's roughly equivalent to the 1080. The Ti kinda shits over it everywhere except in two or three literally-what games.

>muh games
No one cares brainlet