AMD on suicide watch, HAHAHAHAAHAHAHA

How will AMDrones cope with everything being taken away from them. More shilling and autistic crusades against Intel and Nvidia?

theregister.co.uk/2018/06/13/intel_gpus_2020/

Attached: 1518993988462.png (1150x601, 1.18M)

Go back to /v/.

says the AMDrone

Will they finally have drivers though?
Because that's probably the biggest advantage AMD and Nvidia currently have.

Go back, /v/ermin. You hit the 0.5 daily shekel deposit limit already.

>How will AMDrones cope with everything being taken away from them.
how?
I've always wondered.
We haven't recovered even from the 1998 pwnage
1/2

Attached: how_amd_is_gonna_recover.jpg (1600x1200, 383K)

intlel doesn't have qute waifus

Attached: 1525871113767.jpg (853x1280, 164K)

hwomnst are we gonna recover.
The second hit came in 2009
1.9/2

Attached: howmnst.jpg (3500x2394, 681K)

in case you didn't realize...
Intel is trying once every decade to enter the GPU market.
My sides haven't recovered from the last 2 attempts.
2/2

finally, someone will be worse than AMD at making GPUs

Give me real time ray-tracing for cheap and I'll switch to whatever brand.

I mean, Intel has proven over and over again that they are utter shit as far as GPUs are concerned. Even the AMD iGPUs are better than the Intel ones. Maybe it's different now and they'll actually deliver. But it's certainly not a given.

>Even the AMD iGPU are better than the Intel ones.
this is supposed to be some kind of low bar?
The amd igpu is a full-fledged chip from a dGPU with a DDR3 or DDR4 controller.
It has ALL the rendering features of a dGPU, it has the quality support of a dGPU, and the best AMD iGPU out there competes with a $100 card which has a higher TDP, higher power consumption, and its own dedicated VRAM.

Traditionally AMD APUs have always been competitive with $100 dGPUs

t. kaveri 7850k owner.

how many MHash/s

epic post man xD

Intel has been saying they will release a GPU for 20 years.

>intel_gpus_2020
I swear that this reminds of something.
...hmmm but what?

Attached: 4bb.jpg (600x469, 23K)

why would AMD be on suicide watch? intel has no history of making decent GPUs, they can't even get 10nm right, and the best they could come up with was Raja Koduri who absolutely scuttled Vega and Polaris potential

>intel
>raja pajeet
>super gpoo power by 2020
you literally cant make this shit up (unless you are a street shiter)

More competition? Good.

My RX 56 will hold up for 1080p anyway.

man after TR2 annc the amount of annoying macacos has gone almost as when amd released the first gen ryzen

its really pathetic

I love AMD but love competition even more. I hope Intels attempt at the GPU market is successful.

Why is the AMDrone calling everyone a /v/tard, when the AMDrone is a /v/tard himself? Not enough designated streets to shit up, Pajeet? Or enough dogs to cook, Chink? Come on buddy, I hope you get paid for shilling AMD and attacking Nvidia and Intel by your Maoist leaders or Street Shitter leaders. Maybe go for a call and your "mommy" Lisa Su will show up.

>he thinks AMD drones will dislike this
No, it will only mean even cheaper GPUs but with even better performance from AMD.

>No, it will only mean even cheaper GPUs but with even better performance from AMD.

The poor AMDrone unironically believes his RX 580 is better than a 1080 TI. Holy shit, my sides. The amount of delusional mindless drivel coming from you!

>Will they finally have drivers though?
uhh... the best drivers are actually Intel's.

>implying intel will catch up to AMD/NVIDIA in the GPU department in a mere 3 years
I'd be surprised if their GPU even matches the rx560

fpbp

who gives a fuck about graphic cards if you're not a virgin faggot playing games as a (((girl)))

so Intel has finally given up making CPUs?

AMDfag here.

I'm genuinely interested what Intel can pull off now that Raja Koduri has unlimited resources to develop a GPU.

I mean, shilling and trolling aside, Intel may have their name on it but it was designed by the guy who was running Radeon Technologies until recently so it's kind of vicariously an AMD product.

Hey, if they can shake up the AMD / Nvidia two-party system I'm all in.

Those who use them for fluid dynamics calculations?

Fuck you, though. Larrabee/Xeon Phi is a really good idea architecture-wise, and I'm really hoping it will succeed. We don't need ugly hodgepodge architectures with completely dissimilar ISAs and completely dissimilar memory management structures for different parts of programs. If both CPU- and GPU-like tasks can be done under one, simple OS image, that's a good thing for everyone.

Attached: linus-4.jpg (620x299, 21K)

NVidiots are concerned also, idiot

Attached: file.jpg (1150x601, 61K)

>Larrabee/Xeon Phi is a really good idea architecture-wise
bullshit.
You have no idea what a nightmare it is.
I had to write some fucking convolutions on that damn thing... and guess what? it fucking runs its own linux distro in order to be functional.
It's an absolute abomination.

>If both CPU- and GPU-like tasks can be done under one, simple OS image, that's a good thing for everyone
Intel has 0 (that's zero) tradition in being agile. You sneezed? Daymn, you just lost compatibility. Farted? Snap, you just lost part of the extension ISA... and so on.
If Xeon Phi was so nice then it wouldn't have been kill.
Intel lost a lot of capital investing into teaching people how to use their clusterfuck.

You only need one thing: good OSS GPU drivers and a wide range of supported CPUs to cooperate with... hint hint, HSA.
Thus far the best OSS GPU drivers out there are AMD's, and they lack 2 key features: 1) multi-GPU support, 2) GPGPU optimizations.
Xeon Phi is proprietary as fuck and doesn't run on most consumer motherboards due to huge BARs (though huge BARs are great in some applications, e.g. FPGA accelerators), and they fucking failed to make a competent Xeon Phi product due to the various limitations they set.
No, intel, I don't want a 16GB BAR1 directly mapped to my CPU. Teslas/Quadros and Firepros are doing just fine with smaller BARs.

Both AMD and novidia architectures are cleaner designs than any CISC mofo CPU that intel has designed, because they are RISC... reduced.

>Make low-effort shitpost
>Get called out
>F-Fucking AMDrones!!!

A thread died for this

Attached: 1362671107409.jpg (1000x993, 606K)

Great, now you can have vulnerabilities in your GPU.

Attached: smug looking anime girls with condenscending looks on their face.png (181x220, 27K)

>Brian Krzanich debuts intel’s newest dGPU

Attached: FCEC3A56-5487-49FD-9795-4602B35B2C5A.jpg (640x860, 87K)

I don't play many games, but when I do, I play as a cute girl

Attached: 1519623461725.jpg (600x573, 93K)

Lol no, Intel iGPU drivers suck on KDE worse than Nvidia despite being open source.

>KDE
Don't use a hobby DE and expect it to work.

The difference is this will be an actual GPU and not an x86 manycore processor. Hopefully Raja doesn't fuck this one up

bruh you should have gotten the 3dfx voodoo 3 what were you thinking

>You have no idea what a nightmare it is.
I wouldn't know if Intel's specific implementation is good or not, since I don't have access to it. What I was talking about was the general architecture of only using one, coherent, processor architecture for all problems.

If you do have access to it, though, I'm very interested in hearing more about it. The problems you mentioned only seemed hardware-related, which I'm sure matters, but it doesn't sound like something that should matter to the software side of things.

>You sneezed? Daymn, you just lost compatibility. Farted? Snap, you just lost part of the extension ISA... and so on.
I'm sure, but that's just as true for GPUs, the only difference being you're literally forced to go through a JIT compiler since they don't even have any intention of keeping the ISA stable.

>CISC mofo CPU
Meh, it's not like x86 is nice or anything and I'd much rather use RISC-V like anyone else, but it's not like there's any practical difference in actual capabilities.

>implying you don't already have
I'm just waiting for vulnerabilities in both hardware and driver being exposed via WebGL.

This, I am also an AMDfag and I am looking forward to seeing what it will bring. But there is just no telling right now; how about we start shitposting when we see some early results? This is pointless

>$200 gpu vs $900 gpu
is the 1080 ti 4x better?

The jews will deal with leather jacket man. Cheap GPUs sold below cost will come but only if you reply to this post with shalom.

>intel
>GPU

How about Intel tries making a CPU that isn't insecure trash before 2020?

This will be fun

t.Nvidia owner

reported for unnecessary shitposting

fuck off

How could anyone be upset about more competition? Fucking retarded fanboys.

shalom
2020 will be the year of the intel dGPU
and NOT of the linux desktop
thanks goyim

They're really not and they rarely get any updates.
My d2500 has no Linux support and Intel HD405 supports vulkan and dx12 but never got the drivers for it.

so is this gunna be like the ps3 which was gunna use the cpu for the gpu as well?

Wow intel sure on the back foot playing catchup. Good luck catching:

>7, no wait 10nm
>corelet arch, 28? lol
>SHPC (security flaws per cycle)
>*new* tail end of crypto mining

>intel
lol can't wait to see what major security bugs there are.

Larrabee was an actual GPU too...

Just in time for AMD loaded next-gen consoles and 4K APUs, and missing out on the cryptocurrency fad.

Move over jew, it's Dr Su.

Attached: 210px-AMD_CEO_Lisa_Su_20130415_cropped.jpg (210x247, 12K)

>More competition? Good.
this, 3rd player will shake things up, let's hope they won't fuck it up
but considering their fabs got no idea how to make GPU chips it may not end well

> More shilling and autistic crusades against Intel and Nvidia?
Yes.

does this mean their igpus won't be total shit in the future?

No.
their iGPUs are actually pretty good right now - what's holding them back is
- They're tiny as fuck
- Shit drivers

To further this- gpu.userbenchmark.com/Compare/Intel-Iris-Pro-580-Mobile-Skylake-vs-AMD-RX-Vega-8-Ryzen-iGPU/m132950vsm441833
>userbenchmark
No better comparison exists, since apparently everyone is either too scared or being paid off by someone to do a direct benchmark between Iris Pro 580 (576:72:9 shaders:tmu:rop @1000MHz) and RX Vega 8 (512:32:16 shaders:tmu:rop @1100MHz)

Intel could be competitive with AMD in graphics RIGHT NOW if they actually put some decent drivers out.
But they haven't, and I honestly don't think that situation will change.

>PS5 Zen/Navi
>XB2 Intel/Intel GPU
>Switch2 Tegra

Dis gon b gud

Attached: 1527913058712.jpg (600x600, 56K)

>Anyone
>using Intelel
No.

With 28 cores on the horizon why are GPUs even necessary? Bring back software rendering!

IT'S NOT FAIR
I CAN'T BUY NVIDIA WITH MY FOODSTAMPS

Attached: 1527277563600.png (882x758, 316K)

Consoles are trying to reduce power consumption, not increase it to the 5kilowatt level...

>buying an InCel inside GPU
kek
they'll probably be workstation only anyway

GPUs have thousands of cores.

wow finally someone's gonna have worse drivers than AMD

>16B company vs 250B company
we'll see

Attached: 1519838354970.jpg (663x600, 87K)

Intel has horrible OpenGL drivers on Windows by the way.
The bad rap OpenGL gets comes from them.
They also can't cope with hotplugging second monitors on every laptop I've used.
When I see bug reports about intel graphics I am inclined to ignore them; after a month or so of keeping them waiting, I'll close the issue without a fix.
>doing anything at all with an intel igpu
You deserve what you get.

>Firstly, they’re selling like hot cakes and demand is so high they can sometimes be hard to find. Intel would be mad not to offer buyers another source given AMD and NVIDIA can hardly shove their kit out of a fab fast enough to keep up with demand.

And with that the article is trash

Intel is just going to sell miner GPUs with buggy game support and cash in on retards

Intel's GPUs have always had worse drivers than ATI/AMD at their worst. A 3rd company is great and will only help reduce Nvidia's monopoly.

Imagine the shitty pajeet rootkit of a driver they'll try and foist on incucks.

It's weird how so few people on Jow Forums are aware of Iris Pro. Intel's made ok iGPUs before, they just never really advertised it. It's weird because people here constantly praise AMD's APUs despite the PS4, PS4 Pro and Xbox One X being drastically stronger than any APU AMD has on the market. AMD purposefully holds back their desktop APUs so they can sell more GPUs, yet Jow Forums claims that APUs are the future. Now Intel wants a piece of the dGPU pie and Jow Forums will STILL claim that dGPUs are dying. It's bizarre, this board has been making incorrect predictions about integrated graphics since Llano.

afaik there is no inherent difference in the manufacture of cpus and gpus.

GPUs need a bigger die but are less complex

GPU SUPERPOWER BY TWENTY TWENTY

do we need 1771 watt cooler for this ?

1776 actually.

AMD aka Advanced Micro Dongs run by a chiny chanky chong with small marketshare

>Implying AMD+nVidia isn't better than nVidia
Face it, Leatherjacketman and his family won.
Did you forget Lisa Su is Jensen's niece...

Tsukumo-tan isn't an AMD waifu.

>U jelly, Intel fag?
I'm coping just fine you kike shill

Attached: Screenshot_20180616-082336~2.png (720x647, 43K)

>$9,440
meh

This is a good thing; competition forces the other companies to produce something better and at a lower cost to the consumer. The more competition the better. Who would complain about this?

Intel will begin to do GPUs in 14nm++++++++++++++++ that will make Nvidia and Vega look like winters in Minnesota, and plagued with bugs that will put it at 4004 levels of security.
That shit will be good.

We need Nvidia in the CPU business.

Not even intel uses intel GPUs, who are they trying to fool?

Attached: hmm.png (705x449, 220K)

so they will keep selling AMD GPUs rebranded as Intel GPUs?

>what is OpenCL

>what's cuda?

>- Shit drivers
then why do they run just as shitty on linux?

The fan blades on these coolers can be made thinner

>Raja Koduri who absolutely scuttled Vega and Polaris potential
No he didn't. AMD took half his team away and put it on Navi to make GPUs for the PS5.

Rumour is part of the reason Raja left AMD was because they were pulling resources away from Vega to work on Navi which they were developing from the ground up with MS and Sony.

This. Also, I wonder how many backdoors and security issues these things will have. Computing, as we currently know it, will be dead before I trust or purchase another Intel product.