>99% of the dedicated GPU market is owned by Nvidia (66%) and AMD (33%)
>Intel is going to release dedicated GPU units in 2020, offsetting the market
Anybody else hyped that there will finally be competition for this dual-monopoly? Prices will be driven down, technology will vastly improve because of competition and gaymers will rejoice.
>man behind Evergreen and GCN
>only got handicapped because Lisa forced him to focus on Navi, which could be a big hit, and was entrusted with next-gen
AMD is fucked because he actually has money and buttbuddies at Intel
Josiah Cook
YOU MUST POO ON IT
Dominic Gonzalez
>implying intel wants to compete in the first place
they'll go for the workspace meme and hope their name is enough for them to be adopted in the professional environment, no gaymen cards
Ayden Brown
Holy shit
Jace Bailey
Intel CPUs are botnet housefires now and AMD CPUs are pretty good. Intel graphics are mediocre while AMD's recent integrated graphics are good enough to replace dGPUs for most tasks. Why would I care about Intel dGPUs? Why would I care about noVideo shit that won't work with Linux? Seems like AMD is the only sensible choice as of late. I'm going to be putting an AMD card in my Power9 workstation.
Owen Lopez
That makes no sense. Intel has historically had more poos
Julian Roberts
>Nvidia fans currently tend to stick with intel
>Intel releases GPUs
>Now Nvidia and Intel are in a GPU fanboy war
>Nvidia fanboys start to hate Intel
>Buy AMD CPUs
16th dimensional shuffleboard
>Nvidia fans currently tend to stick with intel
No they don't.
Ayden Clark
>dual-monopoly Nigger, it's called a duopoly.
Landon Green
They want to get into the AI hardware sector. And sell us their scraps.
John Scott
I want an Apple GPU
Jace Sanchez
They most certainly do
Angel Rivera
This. Ironic, isn't it? I don't mind their hypocrisy though. Momma Su is Queen.
milk > poo
Gabriel Hernandez
check out this shill lmao better pay twice the price for an AMD card and half the perf top lel
Jack Foster
>all of the poo memes stop
Only if Intel uses their ME backdoor to stop everyone from posting them.
Jaxson Watson
reddit post
Aiden Martinez
Do you live under a rock?
Nathaniel Rivera
wow, like what they tried to do in the early 2010s but never released?
plz try pajeet
Brody Moore
I've heard something about PowerVR wanting to re-launch in the desktop graphics space, mainly because they are afraid Apple are going to drop them for an internally designed GPU.
Adam Phillips
Oh wait, I'm a slowpoke. Apple dropping PowerVR caused their stock price to crash so hard a Chinese company bought them out.
Lucas Roberts
raja will inevitably poo in it so i'm not holding my breath (but am holding my nose)
Xavier Perry
raja wasn't behind gcn, user. it's not a secret that sony was throwing money and designs at amd for it...
intel and samsung. although i like amd cards for their open-sourceness, intel actually has better drivers, hands down
Elijah Sullivan
I don't think they will try that again. :^)
Blake Rogers
>people who buy high-performance components tend to pair them with other high-performance components
of course they wouldn't buy amd dogshit
Justin Hughes
>Intel
>Prices will be driven down
hearty kek.
Daniel Howard
You're going on rumors from 2 years ago, user. Apple already dropped Imagination, and their stock price fell like a rock. They're fucked and have no money to do anything.
Grayson Ortiz
/thread
Dylan Smith
intel can barely release "new" CPUs, and now they're going to break the duopoly?
>Anybody else hyped
Cameron Brooks
You now remember the days of nForce chipsets and when the go-to combo was nVidia / AMD 64
Noah Lewis
Based, I'm gonna build a dual system with Intel and NVIDIA only. Better keep out the poo in the loo (AMD)
Samuel Edwards
It’s weird how the shills show up at a regular time every day.
This is wrong. Sony had nothing to do with GCN; they only affected Navi.
Liam Rivera
That makes nvidia's drivers the worst on the market. Wow.
Austin Foster
>milk > poo
why do I laugh at this shit
Jackson Mitchell
Although AMD is paying us well, if we aren't up shilling by 8AM (whatever our local time zone happens to be) they send the storm troopers to break down our doors and raep us with stun batons. It also sucks having to wear tracking collars and head cameras, but making money in the shill game isn't as easy as it used to be.
Matthew Ortiz
> >man behind Evergreen and GCN
> >only got handicapped because Lisa forced him to focus on Navi
That doesn't make any sense.
>some man did good stuff
>told to make some more good stuff
>???
>waah they handicapped him
Also he made some big steaming pile of shit known as Vega (that card without primitive shaders). He didn't live up to the expectations for him; maybe he could do it at Intel.
Liam Hall
nforce 4 sli master race! Socket 939 goodness. San Diego core Athlon 64 3700+ overclocked to 2.6GHz, dual 7900 GT's in SLi, 4GB Kingston Hyper-X. *sipps* those were the days.
Jackson Bennett
>dual-monopoly It's called a duopoly or an oligopoly.
And yes, fuck the duopoly. But they are ALL American, which explains why they aren't hit by antitrust laws. No American gives a shit that the market is abysmal, because all three companies responsible are American.
that's the point, raja wanted to focus on making vega work, but lisa forced him to give more attention to navi instead
Julian Wood
Why isn't your shitty country making any GPUs?
Christian Jackson
>newfags too young to remember Larrabee
It was inferior to Nvidia and Radeon offerings at the time.
Even if they got AyyMD's poo designer, it'll be slightly worse than or equal to Pooega.
Juan Turner
Because all of the GPU intellectual property is trapped in the hell called 'United States copyright law'
Caleb Rivera
It doesn't need to be as good. Look at Vega and Polaris. You just need to price it right.
Ryder Johnson
Intel has most of the gpu market so it would be really really bad if they got all of it.
Noah Carter
Why didn't your country just come up with it first then?
Connor Lopez
>Intel is going to release dedicated GPU units in 2020, offsetting the market
See DRAM makers' price fixing to get an idea of how bad things can turn.
Aaron Garcia
The discrete GPU market is about to undergo massive demand destruction in the 2020s, just like what happened to discrete audio cards in the 2000s.
AMD RTG is going to make decent iGPU and semi-integrated GPU solutions that make mid-tier GPUs obsolete. Intel is getting off its arse and is going to match them.
Nvidia is practically locked out, and they are so scared of losing marketshare that they attempted a preemptive marketing/strong-arming BS move with GPP before whistle-blowers caught them red-handed.
Discrete GPUs are going to end up being niche, just as discrete audio cards are today.
Ryder Edwards
>told to put more resources on one project than the other
>HURR HOW HANDICAP?
They handicapped his work on Vega.
Sony had nothing to do with GCN gen 1. They started working with AMD after the first generation which is plainly visible because Sony's Mark Cerny played a large role in the expansion of ACE. Raja left AMD about 2 years before GCN released. GPUs take like 5 years to bring to production with the last 2 years being sampling and bugfixes, so Raja was definitely in charge of GCN.
Jackson Martin
Shills have been saying this since Llano, APUs are still worthless garbage. The ONLY way integrated graphics are ever good is if they use high bandwidth memory like GDDR5 in the PS4 or Xbox One X, but desktops obviously don't use this for system RAM due to latency constraints. Even DDR5 won't be good enough to make desktop APUs not shit.
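The bandwidth gap behind this argument can be put in rough numbers. A minimal C sketch, using the standard spec figures for dual-channel DDR4-3200 and the PS4's GDDR5 (`peak_gbs` is just a made-up helper name, not any real API):

```c
/* Peak theoretical memory bandwidth in GB/s:
   (bus width in bits / 8) bytes per transfer * transfer rate in GT/s. */
double peak_gbs(int bus_width_bits, double transfer_rate_gts) {
    return bus_width_bits / 8.0 * transfer_rate_gts;
}

/* dual-channel DDR4-3200 (2 x 64-bit @ 3.2 GT/s): peak_gbs(128, 3.2) -> 51.2 GB/s
   PS4's 256-bit GDDR5 @ 5.5 GT/s:                 peak_gbs(256, 5.5) -> 176.0 GB/s */
```

So a desktop APU gets roughly a third of the memory bandwidth the 2013 console enjoys, and the CPU cores still have to share it.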
Charles Thompson
I think you autists are missing the big point here. It doesn't matter if intel will release shit dedicated GPU products. What matters is that they'll break the duopoly or at least force Nvidia and AMD to reduce prices to compete assuming that Intel will undercut both. Hopefully they do.
Asher Wood
>AMD RTG is going to make decent iGPU and semi-integrated GPU solutions that make mid-tier GPU obsolete. Intel is getting off its arse and is going to match them.
That's nonsense. That's basically a pseudo-console... dGPUs are going to last for a while; the tech to replace them is still strangled at infancy at the moment!
Carter Roberts
/thread
Bentley Peterson
Nope, it is reality. The masses don't care about meme-tier gayming nonsense like 4K gayming, AF/AA + EXTREME DETAILS!, or 200FPS.
It is the consequence of miniaturization. Almost everything on a modern motherboard used to be on separate discrete cards. Discrete GPUs are the next victims.
It will not be long until iGPU solutions start yielding 1080p/60-100FPS gayming in the majority of stuff. The masses will have little to no reason to get a mid-range discrete GPU. Protip: mid-range discrete GPUs make up the lion's share of revenue in the entire discrete GPU market.
Ubiquitous iGPU already killed off the basic 2D/3D discrete GPU market. The only units that exist are either old hardware units or overpriced low-end Quadro/FirePro SKUs.
Angel Johnson
Intel makes this announcement every year or two. Nothing ever comes of their R&D in the consumer market. Larrabee ended up becoming Xeon Phi which apparently is only used for very specialized academic applications, render houses, governments, "deep learning", etc. Whatever R&D goes into the next iteration is just going to feed that market since Intel already has customers there.
Hunter Edwards
What do you mean by "the masses"? Average Joe consumer never cared about discrete GPUs. Today they do all their computing on mobile devices and gaming on consoles. Discrete GPU cards are, and always have been, a hobbyist market, and hobbyists want those periodic upgrades. The periodic upgrades are still giving something better every time, which didn't really happen with sound cards and I/O port cards.
Bentley Wilson
This man will release nothing good.
Jonathan Adams
>It will not be long until iGPU solution start yielding 1080p/60-100FPS gayming in majority of stuff
Are you aware that the 2200G and 2400G can barely match the Xbox One console, which came out in 2013? That Vega 8 is less powerful than the meme 750 Ti, which is widely considered the bare minimum for 1080p gaming? Integrated graphics are a dead end. Intel wants to make dGPUs, the Intel/Vega Frankenstein SoC is actually a CPU + dGPU connected together, and laptops are moving towards dGPUs in midrange and high-end configurations. The industry is literally moving in the OPPOSITE direction of what you claim.
Hudson Garcia
Nope, it has been moving towards integrated GPUs since Clarkdale/Sandy Bridge.
Integrated audio didn't take over discrete audio overnight. It took years, but it eventually happened.
The pieces for the next generation of integrated graphics are already in place. 2020-2021 is the beginning of the end.
By the time 2025 rolls around, mid-range discrete GPUs are going to be difficult to come by and to justify. Discrete GPUs will be high-end niche-tier stuff for those willing to go the extra mile.
Matthew Lopez
Actually, discrete GPUs are mostly bought by kiddies (high-school/college age). Hobbyists have always been a small but vocal minority.
Even most of those hobbyists will eventually find it difficult to justify those "periodic" upgrades if they continue to give smaller and smaller returns. They'll just say "fuck it" when an integrated GPU can take care of their needs with none of the hassle and strings attached.
I have seen the consequences of miniaturization on the PC front since the 1980s.
There's nothing special about discrete GPUs that makes them immune to it.
Christopher Thompson
Death to GPUs and heterogeneous architectures, be they from nVidia, AMD, Intel, ARM or anywhere else. All hail RISC-V and the vector extension which will bring both 3D rendering and professional vector processing back into unified, homogeneous software.
>all-in-one cpu
It doesn't have to be all-in-one. There's no intrinsic problem in having two CPUs in a system that are homogeneous in architecture but different in performance attributes. The point just being that they can both run the same operating system image, using the same paging structures. You'd basically just ask for your rendering threads to be scheduled on the CPU that performs vector operations better.
>How amazing is this extension?
It is pretty amazing, though. The key points being:
1. The vector unit has configurable registers, allowing more advanced implementations to effectively give wider vectors if a program requests fewer registers. If some rendering algorithm only needs five or six registers, it might get 128-wide vectors since there's room for it in the vector register file.
2. It has a vector-length register, similar to Cray systems, meaning there's none of the traditional SIMD clean-up ugliness for uneven elements. Even if you just loop over two elements, you still gain from using the vector instructions for it.
I do believe the vector extension may actually become the killer feature for driving RISC-V adoption. For professional applications (eg. things using CUDA today) it's potentially much better than GPUs, since it is fundamentally a CPU, so it can do all the standard nice scalar flow control while not losing anything in the vector department, and from there it'll likely trickle down to 3D graphics. The people working on it really seem to know what they're doing.
Cooper Ortiz
GPUs are a Trojan horse. GPUs began as highly parallel compute units plus ASICs for graphics operations, with high-bandwidth memories.
Nvidia infiltrated engineering and science universities with CUDA: hey, this gaming graphics card could run compute loads too.
Nvidia wants the laptop discrete market to become its biggest by sales by 2020.
Intel is using the dedicated Intel + Vega processor, and will replace Vega with its own GPU to attack Nvidia's markets.