Do you think this will deliver, Jow Forums?

Attached: Screenshot_20180331-171033.jpg (1080x1129, 401K)

Is this a desktop CPU?

Yep

ref:
ark.intel.com/products/130411/Intel-Core-i7-8705G-Processor-with-Radeon-RX-Vega-M-GL-graphics-8M-Cache-up-to-4_10-GHz

doesn't seem like it sadly, wish it was though. it's about time to upgrade (i5 4430 here)

Of course it's a Desktop CPU

>Intel + AMD Radeon
This confuses the shill

This. Who the fuck is on suicide watch now?

Attached: 1520551508256.jpg (512x498, 30K)

Intel graphics division, since it means they simply gave up.

notebookcheck.net/Intel-Core-i7-8705G-SoC.279034.0.html
hexus.net/tech/news/laptop/113927-hp-spectre-x360-15-intel-core-i7-8705g-soc-launched/

who's the subhuman now

The 8706G looks interesting, as the Vega GPU is on a discrete PCIe bus, and it supports VT-d, which would allow you to run a virtual machine on a laptop with GPU passthrough.
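For the passthrough anons: a minimal sketch (Python, assuming Linux with intel_iommu=on on the kernel command line) that lists the IOMMU groups, since the Vega M has to sit in its own group before you can hand it to vfio-pci. The 1002:694E id in the comment is my assumption for the Vega M GL; check lspci -nn on an actual unit.

#!/usr/bin/env python3
# Sketch: enumerate IOMMU groups to gauge passthrough feasibility.
# Assumes Linux with VT-d enabled (intel_iommu=on on the kernel cmdline).
from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
if not groups.exists():
    raise SystemExit("no IOMMU groups - VT-d disabled or intel_iommu=on missing")

for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
    for dev in (group / "devices").iterdir():
        # uevent carries PCI_ID=vendor:device, e.g. 1002:694E (assumed Vega M GL)
        uevent = (dev / "uevent").read_text()
        pci_id = next((line.split("=")[1] for line in uevent.splitlines()
                       if line.startswith("PCI_ID")), "?")
        print(f"group {group.name}: {dev.name} [{pci_id}]")

If the Vega shows up alone in its group, passthrough has a real shot; if it shares a group with a root port and half the chipset, enjoy your ACS override patches.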

Nvidia.

Attached: NvidiiaFuckYou.png (1920x1080, 1.1M)

AMD, since they need to sell shit to their competition to even stay alive.

Man, AMD has been bullying these guys non-stop. They barely got sloppy seconds from Nintendo, selling them old dusty X1s nobody wanted after Sony and Microsoft laughed at them. Now their flagship graphics cards can't even do math anymore.

Attached: 1522101285643.png (752x720, 1.19M)

Big if true.

Attached: 1521193635932.jpg (2048x1152, 432K)

lol what

Attached: 1482574053190.jpg (267x323, 7K)

hothardware.com/news/nvidia-titan-v-gpus-flunking-basic-math-scientific-simulations
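For the anons asking how a GPU can "flunk math": the article describes identical simulation runs returning different numbers. One mundane ingredient (not necessarily the whole Titan V story, which was suspected to be an actual hardware fault) is that floating-point addition isn't associative, so any change in accumulation order changes the sum. A quick CPU-side Python illustration:

# Sketch: float addition is not associative, so reduction order matters.
import random

vals = [random.uniform(-1e12, 1e12) for _ in range(100_000)]

a = sum(vals)             # one accumulation order
shuffled = vals[:]
random.shuffle(shuffled)  # stand-in for a different GPU thread schedule
b = sum(shuffled)

print(a, b, a - b)        # the difference is typically nonzero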

>the absolute state of AMD shills
Nvidia has 70%+ of the desktop market, dominates the professional market, and everything mobile that isn't just Intel HD graphics is Nvidia. AMD makes basically nothing on those $300 consoles, yet it's probably the only thing keeping their graphics division alive. Their current situation is not even funny.

How hard were you sweating when you wrote that? Was your boss standing right next to you? You sounded tense lmao.

>How hard were you sweating when you wrote that? Was your boss standing right next to you? You sounded tense lmao.

Attached: 1481838239393.jpg (346x450, 22K)

>that damage control
lmao

Attached: eye dog.jpg (1536x1536, 316K)

What else would you expect from an AMD shill

Attached: 1519737802493.jpg (300x247, 8K)

Attached: 2288FFE0-CEB4-42E2-8079-F02A8DFD5190.jpg (4032x3024, 3.11M)

nice watch senpai

Fukken gay. When can I get my Threadripper with Intel graphics?

It is the beginning of the end of discrete GPUs for the masses.

Nvidia is absolutely horrified by it. The recent GPP has nothing to do with undercutting AMD in the discrete GPU market (they have been dominating since second-generation Kepler).

Nvidia wants to keep its mid-range shit relevant in the mindshare of the masses. They know that once the normies and kiddie gamers see a simple iGPU solution can handle their gaming needs, they no longer have any need for a discrete GPU. Mid-range GPUs make up the lion's share of gaming-market revenue.

That juicy revenue from Nvidia's gaming division will start to dry up. High-end discrete GPUs have never been "big money". They barely cover their massive R&D costs. That's why their high-end stuff has been expanding beyond silly gaming.

How powerful are these graphics tho? 750ti tier?

all of us since the cia niggers have managed to unite intel and amd so they can put their differences aside and fuck all lolifags once and for all

Attached: angerydragonloli.jpg (775x960, 78K)

>The performance of the Vega M GL should be between the Nvidia GeForce GTX 1050 and 1050 Ti according to benchmarks from Intel.
According to the first link in the post above.

shit that's good

This is another of the reasons Nvidia is strong-arming partners into the GPP. They are worried.

It's pretty simple: Nvidia forced Intel to make an odd play.

Intel figures whatever minuscule mindshare Radeon will gain is worth it, even if it takes a tiny bit of sales from Nvidia.

Also, AMD got cucked: the driver software is Intel-branded, so the mindshare gain is hindered by that, since most people are brainlets about semiconductor companies.

I wish they'd socket the damn thing so I could put it in a real PC. It's everything AMD's APUs are not.

>socket the damn thing
What sort of cooler would you put on it? It would also have to be a FUGHEUG socket because it's literally 3 dies on one PCB in a row.

msi on extended suicide watch

This is true. But whatever. There have been stupid looking coolers for as long as I've been looking at PCs. Maybe just steal the dynatron sideways box design or something. Shouldn't be any harder than cooling a Vega [cue Gamers Nexus field day].

Aren't AMD pro cards better than Nvidia's?
How is AMD doing in the neural network market instead?

Selling every single GPU that they can produce, so, pretty good.

>Closest I can get to a proper price is the Intel barebone at $800
Which means its value is in the $500-600 range. A bit too much for a quad-core CPU and a GPU between a 1050 Ti and a 1060.

>gt 1030 level performance

No thanks

>gt 1030
No, this is the Vega 20/24 variant, which is between the 1050 Ti and the 1060.

how many Nvidia GPUs are in Macs?

0. They are all dead now.

this.
Holy shit they dropped the ball with their drivers.

mine still works

but shuts down every time i launch a graphics-intensive app :^)

I don't understand why the cache size is the same as 5 years ago.
What do 2018 processors do better than 2013 ones?

This is gonna be a clusterfuck to work with

>Intel graphics division
This was never for our market though. Intel graphics was for the corporate market, where YouTube, Netflix, or TelePresence would be the most graphically demanding ask. For the price it still is the best enterprise option.

Intel graphics play in a completely different price and performance category, even Iris.

I think most of Nvidia's revenue comes from shitcoin miners now.

>run a virtual machine on a laptop with GPU passthrough

Wait, aren't those desktop CPUs?

The main problem is that Nvidia cucked developers, so basically all ML libraries (Theano, PyTorch, TensorFlow) are CUDA-accelerated.
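To see the lock-in concretely, here's the device-selection dance those frameworks make you do; a minimal PyTorch sketch where any box without an Nvidia card silently falls back to the CPU path (AMD's ROCm ports were still experimental at this point, as far as I know):

# Sketch: mainstream frameworks treat CUDA as the first-class accelerator;
# without an Nvidia GPU you quietly get the CPU fallback.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.randn(1024, 1024, device=device)
y = x @ x  # dispatched to cuBLAS on Nvidia hardware, CPU BLAS otherwise
print(device, y.shape)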

((( )))
>Man AMD has been bullying these guys non-stop.
nice oy vey there faggot, everybody knows nvidia can't sell their over-priced shit on consoles or any other market than gaymin' and (((ML))); even Apple doesn't want their fucking shit

Is there info on more laptops with this?

Man, if they deliver laptops with close to 1050 Ti performance at a 65W TDP for the full fucking system, I'm getting it.

Fucking finally, laptops that can play games without getting hot or loud as hell itself.

CPU is good (it's basically a Core i7-6700 non-K)
The Vega Co-Processor is going to be terrible for gaming, since it's not designed for gaming.
You won't find these in reputable-brand gaming laptops at all, only workstations and mobile workstations.

Only good goys buy Intel wholesale.

Attached: th.jpg (474x474, 35K)

It will bring fire and fury and 4-digit TDPs.

Attached: 1494307454511.jpg (553x936, 195K)

>implying

>implying implications

Attached: 1500279400844.gif (280x158, 1.43M)

I really don't want to upgrade to LGA 1151; I just got 1150 in 2016. I'll keep this 4690K for a while.

Matrox?
MATROX BTFO LEAFS ON SUICIDE WATCH!!!!

>Aren't AMD pro cards better than Nvidia's?
Short answer: no.

Vega is AMD's gaming line. Workstations need Radeon Pro.
High-end gaming laptops will continue to use discrete GPUs since GTX 1060 and up are still noticeably faster. Midrange ones may very well switch to those.

>everybody knows nvidia can't sell their over-priced shit on consoles or any other market than gaymi
Everyone knows that Nvidia is in a position where they can refuse the consoles and Apple, since Nvidia believes those aren't worth the trouble.
Consoles lose money on hardware, so they purchase the cheapest available.
Apple is very much like the consoles in that they purchase from the cheapest, but Apple profits from its hardware, so it rebadges and charges a premium.

Nvidia is actually moving away from GPUs. Their GPUs have amassed them enough wealth to branch out into different markets. They can see this APU future coming; it's basically AMD's plan. Nvidia will be fine though. Not all of its GPU revenue goes into R&D for their consumer cards; it goes into all their other prospects.

Bigger cache just slows down the compute process, forcing them to speed up something else in the CPU. As of now, CPUs can't really harness larger cache sizes.
>first Vega is marketed as a workstation card with gaming capabilities
>Vega flops and everyone says it's just for workstations
>now Vega is a gaming line
AMD's marketing has really confused everyone. But they put a lot of useless shit in Vega that isn't needed for gaming.

AMD does not have a gaming line of products. They only sell graphical accelerators and co-processors, such as the entire Vega and Polaris product line.

>Bigger cache just slows down the compute process, forcing them to speed up something else in the CPU. As of now, CPUs can't really harness larger cache sizes.
Great shilling Chaim, but the real reason Intel doesn't raise cache sizes is that it would make the CPU run even hotter and the die even bigger - thus killing what little yields they have.
It's not even speculation; this exact same thing happened with Itanium some 15 years ago. The only way it could compete was with mountains of cache that made it hotter and more expensive.
That being said, raising the amount of cache almost always has a direct positive impact on performance, excluding cases when some retard upsets the balance of the cache structure like with Bulldozer.
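If anyone wants to see the cache effect on their own machine, a rough sketch in Python (interpreter overhead mutes the gap, but it should still show): random reads over a working set that fits in cache vs one that spills to DRAM. The sizes are illustrative guesses; tune them to your cache hierarchy.

# Sketch: time random reads over small vs large working sets.
# A cache-resident array should probe faster than a DRAM-bound one.
import array, random, time

def probe(n_elems, iters=2_000_000):
    data = array.array("q", range(n_elems))  # 8 bytes per element
    idx = [random.randrange(n_elems) for _ in range(iters)]
    t0 = time.perf_counter()
    total = 0
    for i in idx:
        total += data[i]
    return time.perf_counter() - t0

small = probe(4_096)        # ~32 KB, sits comfortably in L1/L2
large = probe(33_554_432)   # ~256 MB, guaranteed cache misses
print(f"cache-resident: {small:.2f}s  DRAM-bound: {large:.2f}s")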

So you're basically saying Intel can't increase cache sizes because then they'd have to change the architecture? Basically what I said.

Attached: 6bd83b6a28b3bf7da31d10f6f4fe2e2c.jpg (2048x1024, 651K)

Like the Artist Formerly Known as Prince?

Attached: image_2018-03-31_14-12-2.jpg (1440x2112, 136K)

They could raise them now - in fact they sort of did with the eDRAM for Iris graphics, again at a cost to thermals - but they would be shooting themselves in the foot. Their 10nm process is a disaster in its current form, and spoiling the little yields further with a larger die would be a complete armageddon. They could lower the cache size again, but that would make for a further performance loss and confused looks as to why a simple die shrink has less cache.
So yes, probably only in a new arch. I mean, historically the cache size wasn't really a problem because until now Intel only competed with themselves.

>Like the Artist Formerly Known as Prince?
Dead? I don't get it.
Intel claimed that the future is more cores, so I'm pretty certain they've already started on a new architecture with smaller but more numerous dies. They're just gonna copy AMD's model now.
I'm interested to see what they pull out their ass

better stack up on cheese pizza, pedo. with time they'll train their eyy ayys with their own collections and then ure fuggd.

Attached: tumblr_p5eblz7LWi1tdj66po1_250.png (250x278, 65K)

IDGAF how much hate I'll get for this: if they make a MacBook with this, I will finally sell my current one and buy the new one (paying about the same as a shitty Walmart laptop after the money I'd get back, since Macs hold value like crazy).

>worst CPU + worst GPU
Is this a joke?

What watch is that?

I'm sure it'll make a great portable heater.

This is a preview of what AMD will release with Zen 2. Current mobile APUs are pretty good, but Vega/Navi + Zen 2 is going to be a killer.

The performance will be pretty neat, but unfortunately it'll only be used on custom mobos and in small form factors, not in something you can easily build. And it's aimed at a premium price point, so it's definitely not a way to save money.

Why the fuck does Intel get Vega 20 yet Ryzen+ APUs only get 8/11?

AMD could have done so fucking well with a 2600G with even 1050-level performance, especially with GPU prices/availability the way they are now.

Different market, different price points. These are going in $800+ NUCs and $1000+ laptops/all-in-ones. I bet Apple will have a go at them at some point in a MacBook.

nice blog, mactoodler

I don't understand this Mac meme. What makes it so special?

It's all about scale. The Vega architecture is really power efficient in terms of performance per watt, but that goes right out the window with Vega 56/64 since it doesn't scale up well.

>I don't understand this Mac meme. What makes it so special?
Americans love shiny things

this fucking SUCKS

Intel iGPUs literally never die.

AMD and Nvidia laptop GPUs always die within 3-5 years. It's always some "microscopic soldier" bullshit that kills these GPUs extremely fast, but conveniently just after the warranty expires.

Now laptops with Intel iGPUs will have the same failure rate as those with AMD's.

When Great Depression 2.0 and the proceeding Great Revolt happen in 2023, we should make planned obsolescence illegal and punishable by hanging the entire board of directors, and nationalize Intel.

Attached: FFFFUUU.png (256x256, 30K)

>FUD
What makes you think Vega mobile won't last long? Perhaps it's because discrete GPUs are subject to heavy work and thermal cycling that causes the solder to come loose, among other problems.
What if, because of low temps and power draw, Vega mobile actually keeps running longer?

No, nvidia and AMD GPUs don't die either.
Only the reference PCB dies because they use underspecced memory chokes and shit tier tantalum capacitors on them. These components can be replaced to fix the card. Or you could just buy a card with an aftermarket PCB designed by someone more competent/less penny pinching.

>aftermarket PCB designed by someone more competent/less penny pinching.

Is there anyone who tears down and tests board partner cards to see who brings out the best design?

Who currently makes the best? I hear MSI isn't doing so great.

You'll have to look hard to find a PCB more shit than nvidia reference. I mean, you literally can't find a PCB more shit for the same GPU unless there's some special OEM version that is even more stripped down.

>Why the fuck does Intel get Vega 20 yet Ryzen+ APUs only get 8/11?
Because that would make the APU die around 2 to 3 times larger than it is now, which would pretty much kill yields and profits.
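Back-of-envelope on why die size murders yields, using the standard Poisson yield model (yield = exp(-area x defect density)). The defect density and base die size below are purely hypothetical; the point is the exponential falloff as the die doubles or triples:

# Sketch: Poisson yield model, yield = exp(-A * D0).
# D0 and the base die area are assumed, illustrative numbers only.
import math

D0 = 0.25    # assumed defect density, defects per cm^2
base = 2.0   # assumed base APU die area in cm^2 (~200 mm^2)
for scale in (1, 2, 3):
    area = base * scale
    print(f"{scale}x die ({area:.0f} cm^2): yield ~ {math.exp(-area * D0):.0%}")

So a die 3x the size isn't 3x as costly to yield, it's exponentially worse, which is the whole reason AMD keeps the APU's iGPU small.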

For CGI and animation, yes. For Machine learning, no.

Buildzoid

>microscopic soldier

Yes, there are soldiers the size of nanites that lay siege to the cores, it can only be sustained for so long.

Good to see Intel integrating mining technologies in their CPUs

Their quality hasn't been so great in recent years. I've been a Mac user for over 15 years, maybe longer. I am replacing an MBP that failed within 5 years and was optioned out to the moon. Apple can't seem to pinpoint the issue. They've looked at it at least three times, and I've torn it apart. Sometimes it starts, and sometimes it doesn't. Apparently, they want $600 for a logic board replacement.

I am going Linux/Windows 10 in a dual-boot machine with the Intel Core i7-8705G. I hope my luck is better this time around and I get a machine that lasts me longer than 5 years, because hardware is really fucking expensive right now, and this thing cost me a fucking fortune.

I probably could've hobbled along till a new MBP release in June, but after my latest iPhone, which has been replaced several times because the internals keep failing, and a $2500 MBP that is about to completely die, I can't give them any more money.

this. nvidia have seen the writing on the wall in terms of chasing high-end GPUs - the launch pricing and negligible real-life performance difference between the 1060 6GB and the 1080 reflected this.
it's why nvidia has been pushing deep learning and self-driving cars so hard - that's where the real money is.

What the fucking fuck?
>Code Name Products formerly Kaby Lake G
This reminds me of that musician Prince, who changed his name to... some symbol, and people were saying things like "The man formerly known as Prince is looking for his car".

But Intel is designing new dedicated GPUs that could replace the Radeon part in future versions.

Looks promising:
tomshardware.com/reviews/intel-hades-canyon-nuc-vr,5536.html

digitaltrends.com/computing/hp-refreshes-spectre-x360-15-adds-intel-envy-x2/

AMD is very likely really happy with this. Intel iGPUs are the most common GPUs out there, so this will give them a huge bump in the percentage of installations using AMD graphics. It also gives them a foot in the laptop market, and that's a big market they are basically not in anymore - it's all Intel + Nvidia graphics. Even a small slice of that market would be a big improvement over zero.

These Intel G chips will be just fine for gaming for the vast majority of people. They will probably perform on par with the RX 560/Nvidia's 1050 and be a step up from AMD's own APUs. They will run most games at 1080p with acceptable framerates.
>You won't find these in reputable-brand gaming laptops at all
That's true; Nvidia twisted the arms of all their partners and forced them to sign the GPP, which does mean that nobody will brand laptops with these chips in them as "gaming". Not until some large governmental body such as the EU steps in and stops Nvidia's blatantly criminal activity. It really is clear that the GPP violates several EU laws. It may be legal elsewhere.