Why are graphics cards making so much progress but CPUs are stagnating?

Why are graphics cards making so much progress but CPUs are stagnating?
Throwing more cores at it doesn't fix the problem if your software is an unoptimized piece of shit, and SMT is still limited to 1 extra thread per core.

Attached: Sister.jpg (2112x2816, 1.31M)

Other urls found in this thread:

it.slashdot.org/story/18/05/21/2140241/google-and-microsoft-disclose-new-cpu-flaw-and-the-fix-can-slow-machines-down
technologyreview.com/s/400710/the-end-of-moores-law/

because GPUs can hit the market barely working, while CPUs had better work 99% of the time or everyone will fucking flip their shit.
for example, the Bulldozer architecture was merely "ok" for several years and everyone hated it. Nvidia throws textures out and their cards don't even have as much VRAM as advertised, yet nobody gives a fuck

There is a limit to how much one can do on a decades-old architecture. As soon as x86_64 is replaced you will see some decent returns, but seeing as they're still shrinking the die today on the same 64-bit architecture, no one is willing to fund the development of its successor. It's easier to laud 5% gains as groundbreaking and use marketing spin to milk as many shekels as possible right now.

cute whore

Isn't it possible to just create a new arch with AMD64 compatibility? Plus I thought they weren't even really x86 internally anymore, but some weird RISC/CISC hybrid.

you would need to dedicate a whole bunch of logic real estate to something that might go unused

It's kind of weird to think about how much skin it's normal for girls to expose. Their underwear exposes most of their ass and legs. Skirts are inherently lewd, not even sure how they became a thing. They wear tighter shorts and pants. I'm not just talking about how modern women will expose more skin than in the old days, I'm talking about how their clothes are just made to be fundamentally different, and they've kind of always exposed more. Even if you go back from skirts to dresses, it's weird that it'd be open on the bottom. I wonder if men's clothes will ever approach this point. It's hard to wear some of the same tight bottoms because of the bulge men have to deal with. You can't have things as close-fitting.

cool

Incel spotted

More like vcel. I had sex like 9 years ago. It's obviously nice, but I think it's overrated. I'd rather focus on my studies at this point. Getting a gf is still an eventual goal. I definitely want to at least have kids. I'd say that's more important than even getting a wife.

>it.slashdot.org/story/18/05/21/2140241/google-and-microsoft-disclose-new-cpu-flaw-and-the-fix-can-slow-machines-down

CPUs are on the brink of destruction without evolution

>because of the bulge men have to deal with.
The equivalent of men's yoga pants. Live proud

>bulge
OwO

Kilts
Also this is like an incognito feminist post. Weird

>he says, as he slides a fresh mag into his rifle outside the homecoming dance

some women have a bulge too

Then why aren't there higher performing archs than x86 right now?

Attached: 1526965236884.jpg (250x229, 19K)

> i can only fit size 77 jeans and XXXL shirts, and that makes me mad.

If you spent more time on yourself, you'd have no reason to compare yourself to others. Sad.

I actually don't like feminism, so feel free to critique my autistic rambling. Think of it more as someone who has a mild interest in traps and has started to think more about women's clothing recently.

Kilts are a good point, actually. A very localized piece of clothing, but I've heard of them without ever being in Scotland, so they're worth mentioning.

I wear 36" waist pants (I don't own any jeans right now, but I have some thick canvas work pants that are similar to jeans) and medium or large shirts depending on the brand. I think I'm kind of in-between sizes right now, because a lot of my medium shirts feel tighter than I want, but my large shirts feels a bit too loose. I should obv just lose some weight, but I'm not sure where to start.

>and SMT is still limited to 1 extra thread per core.
POWER9 has like 4 or 8 way SMT or some shit

But more threads per core means the core needs more execution resources to feed them (Intel and AMD supporting a second thread is just a way to make use of the leftover execution slots the primary thread isn't using)

Compatibility, and lack of funding. Why create an amazing new chip architecture when nothing will run on it? Look at how far ARM chips have come in the last decade, but still no one uses them for laptops despite their clear power advantages because nothing would run on them.

SMT isn't limited to 1 extra thread per core. there are CPUs with 8+ threads per core
that doesn't help though cuz software is shit
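If you want to check how many hardware threads each core on your own box exposes, here's a minimal Python sketch (assumes Linux and the standard sysfs topology files; the grouping and printout are just for illustration):

# List the hardware threads (SMT siblings) sharing each physical core on Linux.
# On a typical x86 part you'll see 2 per core; a POWER9 box in SMT4/SMT8 mode
# would show 4 or 8.
import glob

siblings = set()
for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list"):
    with open(path) as f:
        siblings.add(f.read().strip())

for i, threads in enumerate(sorted(siblings)):
    print(f"core {i}: hardware threads {threads}")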

Didn't power8 have 12 cores with 8 threads per core or something?

>but still no one uses them for laptops despite their clear power advantages because nothing would run on them.
Microsoft is working with qualcomm on some right now. They've had ARM64 support in VS since 2017 too

Because adding more cores doesn't work for CPUs but works perfectly for GPUs.

so if a process is using a cpu thread and it only needs 40% of that cpu it can't use more? is that what you're saying? rofl

Graphics cards making progress? Aren't most of the ones on the market 2-3 years old? The latest AMD series doesn't count because it performs the same as current Nvidia cards but with higher power consumption

What's with this reddit tier humor?

In that case why don't we have some kind of 500w GPU that rapes everything on the market?
Just add more cores.

You do realize men in many parts of europe were wearing skirts on a daily basis barely 200 years ago?
The west is to blame for making men's clothing tight and shitty.

there's a reason nobody responded to it

you have to fit them cores on a die, genious

R9 390 was a nice step up from 290.

Attached: images (9).jpg (286x514, 23K)

Is it really that simple? I was gonna say something about parallel processing.
So there's obviously some limit on latency then?

Anyone know best introductory books on processor architecture?

>genious
hmm
also just make the die bigger
wew lad

true, but i didn't realize it was summer already

So make the die bigger.

There's a lot more to it than just that

>just make the die bigger
>just make the gpu bigger
don't you realize that exactly this has been happening for the past 10 years?

Then what's the problem?
Just make it even bigger and destroy your competition. Who cares if it uses an extra couple of watts if it's 2x more performant.

>nothing would run on them
Lots of software runs on ARM just fine. If it weren't for the proprietary software locked to x86 you could use ARM CPUs in laptops without problems. And it's slowly happening. Battery life matters more in many situations than top performance.

Getting raped by your uncle doesn't count, user.

yeah i don't really follow power stuff but lots of SMT has been the norm on them for a long time i think

the cores end up crazy wide to support it though

go read up on how they're structured internally

it will make sense then

I figured. Like what?

DESU I wish they'd release an 11 inch laptop with a snapdragon 636 instead of the 835, and running Linux instead of windows.
Add a 55 Wh battery in there and you've got yourself something that'll never run out of charge.

Because silicon yields are a thing. The bigger the die area, the fewer dies you can make per wafer and the bigger the odds that each individual die on the wafer will have a defect.

Attached: Wafer_die's_yield_model_(10-20-40mm)_-_Version_2_-_EN.png (3000x1000, 163K)
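To put rough numbers on it, here's a toy version of the classic Poisson yield model (good-die fraction ≈ exp(-die area × defect density)). The defect density is a made-up illustrative figure, not real foundry data; the die sizes match the 10/20/40 mm squares in that wafer diagram:

# Toy Poisson yield model: yield = exp(-die_area * defect_density).
# Defect density is a made-up illustrative number, not real foundry data.
import math

defect_density = 0.1              # defects per cm^2 (assumed)
wafer_diameter_mm = 300           # standard 300 mm wafer
wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2

for die_side_mm in (10, 20, 40):
    die_area_mm2 = die_side_mm ** 2
    dies_per_wafer = int(wafer_area_mm2 // die_area_mm2)      # ignores edge losses
    good_fraction = math.exp(-(die_area_mm2 / 100) * defect_density)
    print(f"{die_side_mm} mm die: ~{dies_per_wafer} dies/wafer, "
          f"yield {good_fraction:.0%}, ~{int(dies_per_wafer * good_fraction)} good dies")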

There are a few reasons why bigger dies are bad
>yields, like that user mentioned
That's a big one
>power consumption
Doesn't matter too much in itself, but directly influences (rough scaling sketch below)
>heat generation
Which limits the clock speed, especially on air coolers.

That's why Nvidia's power savings with Kepler and onwards were touted so much.
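Rough sketch of that power point: dynamic power scales roughly as P ≈ activity × C × V² × f, and switched capacitance C grows with the amount of active logic, so a die twice as big at the same voltage and clock burns roughly twice the power, which is exactly the heat budget that then caps your clocks. All numbers below are made up for illustration:

# Rough CMOS dynamic power scaling: P ~ activity * capacitance * V^2 * f.
# All values are made-up illustrative numbers, not measurements of any real chip.
def dynamic_power_w(rel_area, voltage_v, freq_ghz, base_cap_nf=350.0, activity=0.2):
    cap_f = base_cap_nf * 1e-9 * rel_area       # switched capacitance grows with die area
    return activity * cap_f * voltage_v ** 2 * freq_ghz * 1e9

for rel_area in (1.0, 2.0):
    print(f"{rel_area:.0f}x die area: ~{dynamic_power_w(rel_area, 1.1, 1.8):.0f} W")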

video gayms

>Throwing more cores at it doesn't fix the problem if your software is an unoptimized piece of shit, and SMT is still limited to 1 extra thread per core.
That's pretty much what GPUs are doing as well. The difference is that graphics processing is inherently parallel.
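A toy way to see the difference: a per-pixel operation has no dependencies between elements, so extra cores scale it almost linearly, while a serial recurrence forces each step to wait for the previous one no matter how many cores you have. Function names and numbers below are made up for illustration:

# Contrast an embarrassingly parallel workload (per-pixel shading) with an
# inherently serial one (a recurrence). Purely illustrative.
from multiprocessing import Pool

def shade(pixel):
    # Each output depends only on its own input: the shape of work GPUs love.
    return min(255, int(pixel * 1.2) + 10)

def running_state(values, seed=0):
    # Each step needs the previous result: extra cores can't help here.
    state, out = seed, []
    for v in values:
        state = (state * 31 + v) % 255
        out.append(state)
    return out

if __name__ == "__main__":
    frame = list(range(256)) * 1000           # fake single-channel "image"
    with Pool() as pool:
        shaded = pool.map(shade, frame)       # scales with core count
    history = running_state(frame)            # stuck on one core regardless
    print(shaded[:5], history[:5])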

NVIDIA x86 CPU when?
or
OS running solely on (NVIDIA) GPU when?

Attached: Jensen_Huang.jpg (1352x1040, 797K)

Why would you want that? Nvidia is fucking garbage.
>NVIDIA x86 CPU when
Literally never. Intel and AMD would never license their IP for them to do it.

More like AMD x86_64 CPU when?
We need more competition, the market's getting stale and Nvidia doesn't seem as reliable.

Literally all we need is some RISC architecture with some small x86_64 emulator/translator bolted on. It doesn't even need to be that accurate; just enough to run basic applications.

Attached: 22676-23524-21067.jpg (250x250, 29K)
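In toy form, that translator idea looks something like the sketch below: a front end that maps each incoming x86-style instruction onto one or more simple RISC-like micro-ops, which is roughly what the decoders inside current x86 cores already do in hardware. The instruction strings and micro-op names here are completely made up; real decode works on binary encodings and is vastly more involved:

# Toy sketch of an x86-to-RISC-ish translator front end. The "instructions"
# and micro-op names are invented for illustration only.
def translate(insn):
    op, _, rest = insn.partition(" ")
    args = [a.strip() for a in rest.split(",")]
    if op == "add" and args[1].startswith("["):           # add reg, [mem]
        addr = args[1].strip("[]")
        return [f"load  t0, {addr}",                       # memory operand split into its own micro-op
                f"add   {args[0]}, {args[0]}, t0"]
    if op == "mov":
        return [f"move  {args[0]}, {args[1]}"]
    raise NotImplementedError(f"toy decoder can't handle: {insn}")

for insn in ("mov eax, 5", "add eax, [rbx]"):
    print(insn, "->", translate(insn))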

>AMD x86_64 CPU when?
Wat? x86-64 IS AMD.

>Then why aren't there higher performing archs than x86 right now?
Power9 pretty much DOES outperform x86.

Attached: 1525235778187.png (739x662, 286K)

NVIDIA is the best technology company. Literally the only one delivering significant improvements over generations.
AMD did well with Ryzen (first release), though the second release adds nothing new.

>AMD x86_64
Of course.

>NVIDIA is the best technology company
Maybe if you're a literal cuck. Nvidia can go fuck themselves with their horrible business practices, locked down cards, and extreme hostility towards open source software.

>NVIDIA is the best technology company.
Their CPUs suck, why do you think they would do any better in AMD64?

>muh open sores
Idgaf. Top notch performance, longevity, excellent driver support. No other company matches it. Maybe Realtek in their own segment. Not many manufacturers provide driver support for a decade, mind you.

Aren't those Tegras ARM? I said an x86 CPU from Nvidia.

>Literally all we need is some RISC architecture with some small x86_64 emulator/translator bolted on
Like current x86 CPUs, right?

Why do you think they can make a good x86 CPU when they can't even make a good ARM CPU, which is much simpler? Their GPUs also suck balls from a hardware standpoint; everything is getting carried by drivers and vendor lock-in. The last time they released an actual hardware GPU, we got a fucking GTX 480.

>excellent driver support
TOP KEK
Even AMD has been beating them with the drivers since Omega. Missing features to fake performance, planned obsolescence, short and shit support, and let's not even mention the housefire drivers meant to kill cards.

GPUs are literally incapable of non-parallel tasks. For a long while you couldn't even do loops and if/else in GPGPU applications. GPUs are built for specific tasks that CAN be solved by throwing more cores at them.

>longevity
Is that why every AMD card performs better than its NVidia counterpart a year after release?

Those Tegras weren't their main focus. And for what they were, they were competitive enough.

>Their GPUs also sucks balls from hardware stand-point
Care to elaborate?

AMD can barely support their GPUs for a few years with driver updates. The 4000 series got axed years before NVIDIA stopped supporting their 9000 series in 2017 (that's almost a decade of support). Not to mention limited OS support on the AMD side: unless you run Winblows 10, kiss your drivers goodbye.

I don't see VEGA64 outperforming the 1080. At least in 9 out of 10 cases VEGA loses.

>Care to elaborate?
It's just a bunch of CUDA cores managed by the driver almost entirely in software.

Do they work well or not?

So no reason, then? That's what I thought.

>sister.jpg

Attached: 1526886845729.png (1027x563, 571K)

They work well in situations that Nvidia has implemented in its drivers.

From a pure hardware point of view, they're significantly weaker than AMD's GPUs.

I said the reason. It's the only company bringing significant improvements over generations.
AMD's Ryzen was a one-hit wonder in a period of 5 years... or even longer?
Intel has done nothing since Sandy Bridge except add two more cores, and only when threatened by the aforementioned AMD CPU.

CPUs hit the performance limit faster because there is more demand for them. If Intel had had a GPU division, we probably would have had the graphics we have now back in the mid-2000s

So how do you improve CPUs if it's not by adding more cores?

So they work. In which situations do they fail to deliver, more precisely?

I'm sorry. I see nothing from other companies consistently outperforming the 1080, let alone anything better from Nvidia.

>It's the only company bringing significant improvements over generations.
You misspelt TSMC. The only reason Pascal had a huge jump over Maxwell is that TSMC's 16nm was an insanely good node, which allowed them to clock chips higher, unlike GloFo's 14nm turd. Clock for clock, Pascal has exactly the same performance as Maxwell.
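"Clock for clock" just means normalizing whatever benchmark score you have by the clock the chip actually ran at; if two architectures land in the same place after that, the uplift came from frequency (i.e. the process), not the design. The numbers below are placeholders, not measured results:

# Normalize a benchmark score by clock speed to compare architectures
# "clock for clock". Scores and clocks are placeholder numbers.
cards = {
    "older_arch": {"score": 100, "clock_mhz": 1200},
    "newer_arch": {"score": 140, "clock_mhz": 1700},
}

for name, c in cards.items():
    print(f"{name}: {c['score'] / c['clock_mhz']:.4f} points per MHz")
# Nearly identical points-per-MHz means the raw gain came from clock speed alone.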

We've reached the end of Moore's law IMO. Transistors are at the limit of size now. In my non-expert opinion (since no one linked to good architecture books (hint hint give me a good fucking architecture book)) we only have quantum computing to turn to from here on out.

>I said the reason. It's the only company bringing significant improvements over generations.
They aren't though. AMD, IBM, various ARM vendors, etc. all do that. Intel is pretty much the only one that DOESN'T.

I thought it was funny

They just manufacture the chip to Nvidia's requirements, based on Nvidia's R&D.
Coca-Cola doesn't make its bottle plastic by itself either. Are you going to attribute their success to their plastic manufacturer?

>In which situations do they fail to deliver, more precisely?
Anything with innately good multithreading on a sub-5GHz CPU, since Nvidia's drivers are basically single-threaded. Vulkan, for example. Real Vulkan, not fucking DX11 wrappers.

>AMD can barely support their GPUs for a few years
>source: my ass, plus some shit that happened 10 years ago
Have you been living under a rock for the past 4 years, Rajesh?

t. retard

AMD only did it once in the past 5 years with AMD Ryzen. Maybe an even longer period. And they've stagnated: the Ryzen 2*** series are nothing but rebadged 1*** chips. An even smaller difference than Sandy Bridge to Ivy Bridge.

>Throwing more cores at it doesn't fix the problem if your software is an unoptimized piece of shit, and SMT is still limited to 1 extra thread per core.

you don't understand the perks of multiple cores and don't have a basic overview of the architectures => shit thread

>the Ryzen 2*** series are nothing but rebadged 1*** chips
cool story bro

>AMD only did it once in the past 5 years with AMD Ryzen.
Plus every time they've made new GPUs (and no, rebadges like the 500 series don't count as "new"), just like Nvidia.

I used AMD cards in the past. Driver support is lackluster. You can barely, if at all, run their latest GPUs on Windows 7 or 8.1.
You can run the latest NVIDIA cards on Windows 7 just fine.

Thanks for admitting you are.

Well just one short sentence is more than enough. You have some issues bruddah.

Like what? Fury? VEGA? Seriously?

>You can barely, if at all, run their latest GPUs on Windows 7 or 8.1.
Stop lying, fucking shill.

Are you implying that Fury didn't perform significantly better than 290x, or that vega doesn't perform significantly better than Fury?

Looks like Jow Forums is stuck in the 90s.
> technologyreview.com/s/400710/the-end-of-moores-law/
People were talking about the end of Moore's law in 2000. By 2005 everyone knew it was long dead and that computing power would only see small incremental improvements outside of increased parallelization.

Attached: 1270831091835.jpg (344x358, 50K)

Moore's law doesn't say anything about performance, and it didn't really die until around 2013.
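For reference, what the law actually states is that transistor count on a chip roughly doubles every two years; performance is a side effect at best. A trivial projection sketch (the starting count and year are arbitrary round numbers, not a real chip):

# Moore's law is about transistor count, not performance: roughly a doubling
# every two years. Baseline is an arbitrary round number for illustration.
start_year, start_transistors = 2000, 40_000_000

for year in range(start_year, 2013, 4):
    projected = start_transistors * 2 ** ((year - start_year) / 2)
    print(f"{year}: ~{projected / 1e6:,.0f} million transistors")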

Ooh, right when multi-core processors started to become a thing. I think I'll read your article, user

I'm not. All day I see people unable to use AMD's latest GPUs with those operating systems. Here on Jow Forums.

That's funny, because I have yet to see a single such post, here on Jow Forums.

Do you think there is no correlation between transistor count and performance? Half the time the phrase was used, it was describing performance increases. Don't believe me? Read any magazine article from the last 20 years that references it.
And it totally died before 2013, despite the odd article popping up every few years for the last decade saying "oh, this time it has really died."

Attached: butter_me_up_sunshine.jpg (500x670, 83K)
