Why are integrated graphics so shit still...

Why are integrated graphics still so shit? Everyone hypes up the 2200G/2400G, but these can only barely compete with the GT 1030/RX 550, the most entry-level of entry-level dGPUs. I thought the 2400G would be at least RX 560 level, but no way; they still can't even beat the 750 Ti. Intel graphics are an even bigger disaster outside of Iris Pro, which nothing uses.
>Hurr but what about the 8809G?
Vega M is a dGPU connected to an Intel CPU over EMIB; it's not on the same die.

It's funny: AMD hyped up their "Fusion" products so much in 2011, yet ever since Llano we've seen remarkably little progress in on-die graphics over the past 7 years. Don't forget the terrible drivers as well.

Attached: AMD-RYZEN-3-2200G-Processor-1000px-2-v1.jpg (1000x1000, 51K)


>Why are integrated graphics so shit still? Everyone hypes up the 2200G/2400G but these can only barely compete with the GT 1030/RX 550, the most entry level of entry level dGPUs
Combined with a competent CPU, that's extremely good value in its own right. You get Xbox One S-class performance without needing a dGPU. Intel's iGPU, on the other hand, is only strong enough to drive an office monitor, but it does what it needs to.

Are you retarded? Have you even seen Intel HD Graphics?

>why is 90mm² of GPU silicon as fast as a competitor's 90mm² of GPU silicon


I dunno lol

>Are you retarded? Have you even seen Intel HD Graphics?
Yes, I've used Intel graphics; it's shit. It can barely run CS:GO at 720p. Don't even try playing modern AAA games.

For what they're worth, and given that at their price point you're basically getting decent (relative to Intel) integrated graphics for free, what is there to actually complain about?

And in your post you say they "barely" compete with the 1030/550, which means you admit they do compete.
You're expecting the world of a competent GPU that has to share its power budget and memory bandwidth with the CPU it's conjoined with, while its competitors have no such shackles.

How much will RAM speed affect its performance? I'm thinking of a budget Ryzen 5 2400G build just for some basic games and console emulation.

Noticeably. The CPU portion will also benefit, and the faster RAM will carry over if you later add a dGPU.
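Some rough numbers on why RAM speed matters so much for an APU (a sketch; these are theoretical peak figures assuming dual-channel DDR4 with 64-bit channels, and real-world gains won't scale perfectly linearly):

```python
# Theoretical peak bandwidth for dual-channel DDR4 at common speeds.
# bandwidth (GB/s) = transfer rate (MT/s) * 8 bytes per transfer * channels / 1000
def ddr4_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000

for speed in (2133, 2400, 2933, 3200):
    print(f"DDR4-{speed}: {ddr4_bandwidth_gbs(speed):.1f} GB/s")
# DDR4-2133 -> DDR4-3200 is ~50% more peak bandwidth, and on an APU
# that single pool is shared between the CPU cores and the iGPU.
```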

>thinks he's ever going to get a free video adapter that rivals a 1080 Ti and takes up no space

If you paid any attention to Threadripper and Epyc you'd know: two smaller dies are cheaper to make than one big one.

Then there's power: a 65W CPU + 75W GPU (like a 560) would make one hot processor that needs a large cooler, while still being low end.

There's also memory: DDR4 is much slower than GDDR5, AND it's shared with the CPU. It'd bottleneck anything much faster.

>seen remarkably little progress from on-die graphics over the past 7 years
Intel graphics have gotten about 4x better from Sandy Bridge to Coffee Lake.

The 2400G does exactly what it's supposed to: offer something cheap that kids can play Fortnite on. You're just entitled.

Intel UHD Graphics 620: CS:GO ultra 1920x1080 37.1 fps
It's like 90fps at "competitive" settings

all ryzens are affected by ram speed

Maybe one day, but not for a long time.

The reason the Ryzen APUs are so hyped is that they offer entry-level discrete GPU performance, and they're especially worth it considering the GT 1030 is around $70-$80 while the Ryzen 3 2200G is $99.

Can't beat physics. With a 35W thermal budget, you're going to get 35W worth of performance. If AMD could do better, their discrete graphics would be correspondingly better as well.

And of course, current integrated GPUs have a big bottleneck in the form of DDR4. So even if you made a 200W CPU+iGPU, the GPU performance would still suck because you'd still be limited to at most 8 channels of DDR4 - substantially less than even a RX580.
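The DDR4 bottleneck is easy to put numbers on (a sketch using theoretical peaks; assumed specs: dual-channel DDR4-3200, and an RX 580 with a 256-bit GDDR5 bus at 8 Gbps per pin):

```python
# Peak memory bandwidth: system DDR4 vs. a midrange dGPU's GDDR5.
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    # aggregate Gbit/s across the whole bus, divided by 8 bits per byte
    return bus_width_bits * gbps_per_pin / 8

ddr4 = bandwidth_gbs(128, 3.2)    # dual-channel DDR4-3200: 2 x 64-bit
gddr5 = bandwidth_gbs(256, 8.0)   # RX 580: 256-bit GDDR5 @ 8 Gbps
print(f"DDR4-3200 dual channel: {ddr4:.1f} GB/s")   # ~51 GB/s
print(f"RX 580 GDDR5:           {gddr5:.1f} GB/s")  # 256 GB/s
# Even 8 channels of DDR4-3200 (~205 GB/s) stays below the RX 580,
# and the iGPU has to share its pool with the CPU on top of that.
```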

Within these confines, Vega 8/Vega 11 perform very well.

You mean like, MAYBE some day a free integrated unit will perform as well as a current top-of-the-line video card? Or do you mean one day a free integrated unit will rival a 1080 Ti?

Guys you're posting wrong, we're supposed to agree with OP's shitty narrative.

Fuck AMD, novideo integrated when?

All CPUs are, and it depends on what you're doing.

What's the difference either way? I just meant it in the overall scheme of improvements and efficiency; maybe one day.

May not be tomorrow, next month, or next year. Maybe in 10 years if we're lucky who knows. Might not even need GPUs at that point.

The fact is that even if one day you get onboard video as good as a 1080 Ti, by that time a real video card will be at, like, a 2160 Ti, and that 1080 Ti equivalent won't run shit because the newest games are designed for the fastest hardware. You'd still be disappointed, because by then you'd be asking for the onboard equivalent of whatever is fucking out at the moment.

Standard TDPs are still targeted so as not to have an inordinately high power draw on the socket, and there still isn't enough bandwidth provided to the IGP for substantial performance.
If you want a powerful IGP you need a ton of memory bandwidth; pretty much only HBM provides enough, but it's expensive.
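For scale, here's roughly how many DDR4 channels one HBM stack replaces (a sketch with assumed pin speeds: HBM2 at 2 Gbps over a 1024-bit bus per stack, DDR4-3200 over 64-bit channels; theoretical peaks only):

```python
# Theoretical peak bandwidth per HBM2 stack vs. per DDR4 channel.
HBM2_STACK_GBS = 1024 * 2.0 / 8   # 1024-bit bus @ 2 Gbps/pin -> 256 GB/s
DDR4_CHANNEL_GBS = 64 * 3.2 / 8   # 64-bit channel @ 3.2 Gbps -> 25.6 GB/s

channels_needed = HBM2_STACK_GBS / DDR4_CHANNEL_GBS
print(f"One HBM2 stack ~= {channels_needed:.0f} channels of DDR4-3200")
# which is why "just add more DDR4 channels" never catches up to HBM
# on a mainstream socketed platform
```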

That Intel chip with the Vega GPU on package is something AMD could make themselves with their own CPU, but Intel had the advantage of making it a BGA part. They got to design the interface and power delivery around that one part. AMD doesn't have the cash to throw around at random projects; they stick with single designs they can fit into as many market segments as possible.

:,^)

>Jow Forums constantly shills APUs and claims Nvidia will go out of business by 2020 because iGPUs will make dGPUs pointless
I'm just pointing out there are a LOT of limitations to APUs. Like I said, they can't really game at 1080p besides CS:GO and Fortnite, and if you've ever used them you know the drivers suck. I had an FM2 board and it was complete garbage: no drivers, and it was dropped like a year after it came out because the graphics weren't GCN. Vega had a lot of issues too: extremely poor Linux launch support, no Windows 7/8.1 support, and different drivers from the regular dGPUs that only come out every 3 months. The 2200G only seems reasonable because of mining prices; from an "objective" POV it's a shit experience.

Because they are integrated. These aren't meant for performance crowns; that's what dGPUs are for.

These are for SFF/low power systems or users who won't game on it.

The integrated Vegas are intended for gaming.

Are you seriously expecting real performance from the two hottest PC components put in one package?

>why does a non-dedicated device perform worse than a dedicated device with its own, much faster memory
Why the hell do you think? VRAM is multiple times faster than RAM and GPUs have their own processors for graphical calculations. You can't fit that into a CPU and you can't use VRAM instead of RAM without having cooling issues and needing twice as much power.

light gaming like league of legends, but no aaa stuff

This will happen, just probably not for another 8 years or so depending on how well Raja does.

What the fuck are you talking about? I get over 160 fps with low graphics detail at 720p. i7-6700K iGPU.

Buildzoid did a nice video testing the impact of RAM speed and timings on the Ryzen iGPU. IIRC tighter timings can also give you a boost, so try to work with those as well.
youtu.be/ijSSeXzwRXY

More like Whozoid