Zoomer here.
How did people react to the original multi core processors?
Were they immediately well received or seen as more of a gimmick?

Attached: intel-core-2-duo-e8400-3-0-ghz-6mb-1333-mhz-socket-775-original-imaefzr37z5jnng6.jpg (832x760, 54K)

gimmick

>Holy crap Lois! It's like 2 CPUs for the price of 1!!!

Attached: 1554170985946.png (1000x1000, 125K)

They're still mostly a gimmick today. All the actually interesting shit cannot be parallelized well.

Had an HP-Compaq with that; it still works, but I gave it away because it overheats even with new paste and a cleaning.

It was common for enthusiasts to run dual CPU setups, so people were happy they could get two in one.

AMD was revolutionary back then: first multi-core, first to break the 1GHz barrier, first onboard memory controller, first 64-bit x86 processor.
Jim Keller is the man.

Back then I was a gaymer and in that community it was seen as a gimmick.
You should choose the right hardware for the job. The thing is, I don't even care if multithreading is a gimmick 99% of the time, I made an informed decision and bought my CPU for a very specific purpose. I am very happy with it.

i don't want to sound like a boomer or anything
but the lack of cows in this thread is shameful.

I remember the original jump to x64; my father was going crazy over how he was getting double the power for cheap.

People did not know exactly what to do with them and thought it was complicated.

>memory controller onboard
That shit still amazes me to this day. Intel didn't have an on-chip IMC until 2008 with the Bloomfield i7, and AMD had it in 2003.

Attached: 1523433903969.jpg (1038x1000, 140K)

Originally they were absolutely dominated by AMD, and Intel dual cores were for retarded fuckheads and shills. In a couple years Intel took the lead back. It was the golden age for AMD. Ryzen hopes to bring back that magic by dominating i9 but we will see. It won't be the same as before, even if it is better on paper.

Depends on who you asked. It was mostly a gimmick to gamers and other home users who were running a handful of single-threaded applications at once. In the high-end it was a great way to reduce power consumption and increase density in multi-processor systems.
AMD fanboys are the mactards of computer hardware. 64-bit and multi-core existed in high-end machines for years prior to AMD finally bringing them over to the consumer realm.

This

>multi-core existed in high-end machines for years
Of course expensive-ass servers and dual processors existed prior to AMD bringing it to the masses, but the point is that AMD brought it to us for less money when Intel refused to innovate.

Wrong

>coping this hard
There is a big difference between processors with cooling the size of a fucking bean bag chair and ones you can fit in a laptop or a cheap desktop. Any retard can make the former, which is why so many retards were making them in the 80s.
Let's not pretend Intel was dominant with the dual cores by downplaying AMD's achievements.

Who said anything about Intel? Early multi-core processors were mostly pioneered by IBM, HP and Sun, while credit for 64-bit goes to MIPS and DEC. I'm just reminding you that AMD did not literally invent modern computing just because they shit out a cheap dual-core chip at the same time everyone else was. Even Apple was going dual-core in 2005.

A lot of people said you wouldn't need a Core2Quad back in the XP days; iirc Windows 98 and 95 could only use 1 core or something.

They're innovators, user, you don't get it. It's like how Apple takes ideas and technologies that already exist and packages them in such a way that you think, "Why hasn't anyone done this?"

I remember when the q6600 came around and Jow Forums was flooded with cow2beef pictures

>mfw e8400fags got eternally btfo in the end

Attached: 1294374058624.png (326x383, 10K)

Single core fags will always be btfo. Intel pretty much died tonight with the announcement of a midrange 8 core that beats the 9900K, and that 12 core part, which I don't mind saying is a steal at $500 because Intel wants $1100 for theirs.

Not really. Mine served me nearly a decade gaming. Worse performance but not in all games and not by much

Not sure what GPU you had, but a modern game like GTA V slams a Q6600 and the bottlenecking is crazy even on a midrange GCN card. The game is not playable on Wolfdales, and it came out in 2015.

Except they really didn't. Of the major desktop and server architectures that were still being actively developed at the time, x86 was the *last* architecture to get on the multi-core bandwagon. IBM was doing it since 2001. HP, Sun and even Itanium since 2004. 64-bit computing of course had already long been a thing since the early '90s, AMD and Intel only finally bothered to glue it on to x86 when Itanium flopped and consumer systems began to approach the 4 GiB memory barrier.

Of course, AMD does parallel with Apple in that they have done a great job of deluding their fanboys into believing they were revolutionaries for merely following market trends in a way that best pandered to their customer base, but at least Apple put their own original spin on the more complex ideas they took. You can't really do that with fundamental concepts like these.

That was a different era in gaming, and I actually did beat it on that cpu and 8800gtx on low. I had it at 4GHz though.

How is life in eastern Slovakia/Bengladesh/Mexico/Philippines

>that cpu and 8800gtx
Damn, I would've given my nuts for an E8400 rig with an 8800GTX. A 4GHz Wolfdale core was as good as a Bloomfield core without hyperthreading when it came to IPC, and it hit the magic 4GHz at the time.

I don't own it anymore. The video card fan went kaput and then the motherboard

How is packaging a 2nd core different from what the i486 did with "upgradability"?

Basically it is on one chip versus an expansion slot

youtu.be/FTELiuQv0Ts

>the absolute state of zoomers

Intel was pretty much always a huge company that never innovated unless it had to.

Meanwhile most of AMD's "innovation" is from an ex-DEC engineer, kek. Fuck x86 kikes.

Most of the work done on most CPUs is small processes that do something and then exit. Multiple cores help tremendously with this. Very few computer users are running tasks which can’t be parallelized.
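Quick sketch of that point, if you want to see it in code. This is a minimal illustration under assumed conditions: `crunch()` is a hypothetical stand-in for "a small CPU-bound job that runs and exits", not any real workload.

```python
# Sketch: many small independent CPU-bound jobs spread across cores.
# crunch() is a made-up stand-in, not a real workload.
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [10_000] * 8
    sequential = [crunch(n) for n in jobs]      # one core, one job at a time
    with ProcessPoolExecutor() as pool:         # one worker per core by default
        parallel = list(pool.map(crunch, jobs))
    # Same answers either way; the parallel run just finishes sooner
    # when there are spare cores to absorb the independent jobs.
    assert parallel == sequential
```

Embarrassingly parallel piles of small jobs like this scale almost linearly with core count; it's the one big serial task that extra cores don't help with.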

Intel bent the knee and hired the same fellow you mention because their Pajeets and Chaims just can’t innovate.

>smaller chip
>cheaper consumer oriented chip
>pretty much brought multicore and 64bit arch to the masses

>not innovation
Before AMD, most people hadn't heard of any of this. IBM, Sun, HP... all failed.
I know you don't want to give AMD any credit, but the reality is that AMD brought all of these things to the mainstream.

Technology is more than consumer garbage and gaming accessories where having the biggest marketing machine and selling the most == being the first, or the best.

AMD brought 64-bit and multi-core to the desktop PC mainstream by virtue of being the first of the big consumer technology juggernauts that bothered to care about it once it became clear their target market could benefit from it, just as the likes of MIPS and IBM did when they pioneered those technologies in the lower volume high-end years before.

AMD didn't succeed where others failed, they just took what they did successfully for their own users and sold them to a larger market in line with industry expectations. As I said, x86 was the last major desktop architecture to adopt multi-core and 64-bit, there is nothing innovative about that unless you restrict yourself squarely to the x86 bubble and ignore everything else outside of it, something /v/ denizens tend to do while ironically calling themselves tech savvy.

A gimmick, but an impressive one. I remember talking about it with my buds and arguing over whether it was better despite the lower clock speeds. Remember that this was after the P4 era, when GHz was everything.

Most people did not understand what threads or cores were (nor what they implied in terms of real-world impact). What most people believed was that a dual core was essentially 2 CPUs glued together.

By the time multicore processors were fairly common, the argument shifted to whether programs actually used those other cores, and whether it was better to buy a powerful dual core (E8400) as opposed to an entry-level quad (Q6600).
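That dual-vs-quad argument boils down to how parallel your workload actually is. A back-of-the-envelope sketch with a naive Amdahl-style model: the clocks are the chips' spec numbers, but the parallel fractions are assumed examples, since real games vary wildly.

```python
# Naive model: effective speed is proportional to
#   clock / ((1 - p) + p / cores)
# where p is the fraction of the work that can run in parallel.
# Real software is messier; this is just the shape of the argument.

def relative_speed(clock_ghz, cores, p):
    return clock_ghz / ((1 - p) + p / cores)

def e8400(p):
    return relative_speed(3.0, 2, p)   # fast dual core

def q6600(p):
    return relative_speed(2.4, 4, p)   # slower quad core

# With these clocks the crossover sits around p = 0.57:
# mostly-serial games favor the E8400, well-threaded ones the Q6600.
for p in (0.2, 0.5, 0.8, 0.95):
    winner = "Q6600" if q6600(p) > e8400(p) else "E8400"
    print(f"parallel fraction {p:.2f}: {winner} wins")
```

Which is pretty much how it played out: the E8400 won at launch, the Q6600 aged better as software got more threaded.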

As I already said, it seemed like a meme.
I understood more when I started programming.
If you are into tech, learn to code a bit. You don't have to be a pro; a little bit is fun.

when these cpus were released, nearly nothing was using them. it took a while

I had my gentoo phase back in the day and Athlon X2 were seen as the holy grail.

The earliest dual CPU x86 setups I saw were the double socket Pentium 3s. Most people who used those said they were pointless (remember, the gaming OS at the time was Win98SE, which didn't even support dual CPUs). On Win2k you could use both CPUs, but gaming was limited, and the only way to use the extra CPU was running 2 apps at the same time, so it was useless for gaming.

First *mainstream* dual core cpu was the Athlon 64 x2, which was iirc 2 chips in a MCM module. It worked fine under WinXP, but very few games supported it. However, by that time things like FRAPS were common, and you had Pentium 4s with Hyperthreading, so having two cores wasn't that rare, and more and more things started supporting it.

I remember jumping from A64 to A64X2 in... 2005? 2006? I don't remember. The main advantages were faster video decoders for HD rips (at the time they were HDTV rips from MPEG2 TS files encoded into h264 or xvid), and some emulators supported it (PCSX2 straight up doubled in framerate and it was playable at full speed for the first time).

The biggest advantage for me was, when a dodgy app decided to hang up in an infinite loop, it did not make the entire OS unresponsive. The OS could run on the other core and I could alt-tab and nuke it in the task manager. This might seem insignificant, but when a fullscreen game did this on a single core machine, you had to either wait 15 minutes until it managed to open the task manager, or do a hardware reset. So that alone made it worth it for me, and I was playing a lot of PS2 emulated titles like Disgaea and Gradius V, and watching HD movies, so the Athlon64 X2 was fucking awesome at the time. For people playing GTA, FIFA and NFS, probably not so much.

Then the Core 2 increased performance a shitload over that, and the Q6600 doubled the core count. AMD's response was buggy and slow, and it wasn't until a respin that it beat the Core 2 thanks to higher clocks and more cores, but by that time Intel had i7s and i5s out.

Yeah, that was the time when this was made on Jow Forums.

There's also a CUDA-optimized version that gives you a steak burned to coal in 0.2 seconds.

Attached: 1208553523125.png (1356x1539, 354K)

It's still a gimmick.

>First *mainstream* dual core cpu was the Athlon 64 x2, which was iirc 2 chips in a MCM module.
Athlon 64 x2 was monolithic. Pentium Ds were the MCM dual cores.

So why does AMD own the patents to 64bit tech?

>Technology is more than consumer garbage
Oh, sorry. I didn't know we were talking from some perspective other than the consumer's.
Tell me about all these data centers and mainframes you own. :^)
Btw, stopped reading right there

They were? My bad then. All I remember is being stuck in Socket 939 during the DDR->DDR2 transition, then trying and failing to find an Opteron for it (the S939 Opterons were rebranded FX chips). Then when Vista came out, I had to swap out the motherboard in a hurry, because I used a S939 dual core with a Radeon card with an nForce3 chipset, and Nvidia DID NOT release Vista drivers for the nForce3. You could get it sort of stable with XP drivers, but that only worked if you used a Geforce card - with Radeons, it was unusable.
With a cheap VIA chipset board (remember those?) it worked fine even on Windows 7.

So basically, the most popular high-end PCs from 2003-7 just did not work under a new OS due to Nvidia being cunts. This fucked over not just me but most people I know who had high end PCs. I haven't bought nvidia cards ever since.

AMD had the better dual-core system from the beginning. Intel had their D-furnaces with TDPs above 90W.

Attached: 1558899455041.jpg (720x1348, 188K)

They don't. They have patents on their specific means of extending x86 registers to 64 bits while maintaining compatibility with older 32-bit designs, but they did not invent 64-bit microprocessors or the multi-core concept. I don't get why this is difficult to understand.
>I'm ignorant so it doesn't count!
Just stop posting, you're a fucking idiot.

(You)

real brainlet hours

I don't have a picture if ur mum on my computer, sorry.

i didn't really pay it much attention cause i was busy fucking bitches. i did know of their existence though.

>somehow non consumer computers are more relevant now
How, exactly? Dumb ass nigger

>yfw still using the Q6600 today

Feels good man. Still runs everything I need it to. Only a retard needlessly upgrades often

Gimmick for sure.

We had hyperthreading and the only thing people cared about was single core performance for games.

So basically the same as now.

Did Apple invent the GUI and the Mouse because they were the first company to advertise it to average joes? How is this so difficult for you to comprehend? It doesn't matter.

>>somehow the computers that are actually doing the work to support my epic gamer toys and internet connected fashion accessories are more relevant now
somehow......

Attached: 1555647611661.jpg (224x250, 7K)

Innovation and invention aren't the same thing, dumb faggot.
Innovation means the introduction of something new.
And AMD literally introduced multi-core computing and 64bit arch to consumers.

They were received with the ultimate in retardation. Well, more accurately, it was first the single core CPUs with hyperthreading that were reported as if they were two separate CPUs, then the true dual core CPUs were reported the same way. The other bit of retardation was how people would add the clock speeds of the cores together. Suddenly a 1GHz dual core CPU was the same as a 2GHz CPU to idiot consumers.

>How did people react to the original multi core processors?
The first two cores were an immediate improvement for general QoL; anyone who disagreed was doing so out of budget concerns or ignorance. Granted, it was like 3-5 times the price for a CPU that wouldn't clock as high and needed aftermarket cooling.
Quad cores, when they first appeared, were like 8+ cores now: seen more as a niche product most people couldn't use properly than anything else. But most reviews still underlined that they would certainly end up being useful down the line, and sure enough, Q6600s are mostly usable today whereas dual core Core 2s are really starting to become hard to use.

Fucking this. Exactly this. And the "2 cores, each running at 2GHz? I HAVE 4GHZ OF PROCESSING POWER NOOBS" crowd.
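The "add the clocks together" math fails because of the serial part of the work, and Amdahl's law is the standard way to put a number on it. Quick sketch; the 70% parallel fraction below is an assumed example, not a measurement of anything:

```python
# Amdahl's law: overall speedup on n cores when only a fraction p
# of the work can be parallelized.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A workload that's 70% parallel on a dual core gets ~1.54x,
# nowhere near the 2x that clock-adding implies.
print(amdahl_speedup(0.7, 2))
# Even with perfect parallelism (p = 1.0) you only ever reach n times.
print(amdahl_speedup(1.0, 2))
```

So two 2GHz cores were never "4GHz of processing power"; they were 2GHz plus however much of your workload could actually run on the second core.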

I didn't even mention the application of computers.
And it was obviously implied in this place (Jow Forums), dumb fuck.
Post pictures of your data centers or mainframes. Let me guess, you have none, but you most definitely have a desktop or a laptop.
What a fucking faggot

>I didn't even mention the application of computers.
lol. seriously just shut the fuck up retard.

Attached: 1542186045917.png (900x729, 129K)

I dunno, my dude. My first dual core CPU wasn't that impressive. I went from a P4 to a C2D, and the improvement was not nearly as drastic as you imply. Maybe it's because of the hyperthreading of the P4, but I was really underwhelmed considering how much I spent on my first C2D.

They were likened to dual processor systems. The Pentium D was Intel's first offering for consumers, basically two Pentium 4s in one package, and it was fucking horrible. A power hog that couldn't be cooled properly, sound familiar? The Core 2 Duo came along about a year later and was a HUGE improvement.

I believe AMD's first was the Athlon 64 X2 - a pretty good chip. I used one for a few years and was perfectly happy with it; the Core 2 Duo was better though.

Back then you could get away with using a single core computer for everything. How things have changed.

Is the concept even still relevant to how multi-core and multi-threaded applications are handled these days?

Attached: 1489181606975.jpg (2752x1334, 625K)

>Innovation means introductory to something new
No, innovation just means something new. Multi-core and 64-bit were not new technologies in 2005, not even necessarily to consumers; see MIPS as an example of a 64-bit architecture widely used in low-cost embedded systems even before the days of the Athlon. Embedded is also another great place to find asymmetric multi-core designs as well, especially combined CPU/DSP chipsets like the OMAP that were of similar complexity.

Maybe it's you who should shut up? Not because you're wrong, but because this fairy farming cocktugger isn't going to stop being fucking retarded, and it's only going to end when you let it go and realize you're right, and that continuing to argue is like winning the Special Olympics.

>implying a programmer, a graphics designer, or an engineer using a computer at home isn't a consumer
You're retarded, aren't you?

merriam-webster.com/dictionary/innovation

> : the introduction of something new
> : a new idea, method, or device : novelty
Where's the reintroduction of something old to a new market, here? I don't see it.

new to the market senpai.

It's like saying "X brought innovation to our office by implementing X technology!"

don't worry, I only ever saved like three cancer wojaks
>but what about the 2% of these professions that work from home
see pic also please learn english

Attached: 1553825752470.jpg (971x565, 87K)

Well, my first experience with one was when the local cybercafe replaced one of its dead computers (3500+) with a dual core capable box (4400 X2).
The owner was running all sorts of spyware shit (including something that would basically reset the computer to its original install at every reboot) so we couldn't do illegal shit on his computers, so the single cores all choked like hell if you did much of anything, whereas the lone dualcore would still do just fine.
That, and I was very much into ripping DVDs to make the most of the DVD renting store, so dualcore meant I could actually do shit while the movie was encoding. But I've always had an issue with my behavior leading to having an absolute clusterfuck of stuff open at all times, so moar cores benefited me more than most, I'd assume. (Also why I'm most likely gonna pull the trigger on the 3900X once it's been out for a little while, just to avoid some nasty surprises.)

Sounds more like mental gymnastics to me. Just accept that your faceless corporation of the month is not God.

>t.
What are you even talking about? You moved the goalposts beyond comprehension.

Attached: 1540396635455.jpg (1099x605, 79K)

I'm not even the same guy ;_;
I have a Xeon E-2174G since the jews at dell didn't have a ryzen option.... I actually bluescreen once each day when doing heavy MEMORY ops... hold me

who are you quoting? learn english.

Based retard.

That sucks. EPYC doesn't seem to have gotten very far yet, but Intel probably won't cease being a circus of utter incompetence for quite a while so you'll hopefully have some better options by your next upgrade.

>Originally they were absolutely dominated by AMD, and Intel dual cores were for retarded fuckheads and shills.
Stop lying. AMD ruled the Pentium 4 era, just before the dual cores. Intel almost killed AMD with Conroe IPC and AMD played second fiddle until Ryzen.

Are you posting from 2014?

>not posting the first dual core
truly a zoomer

Attached: amdx2_3800.jpg (550x278, 30K)

that doesn’t look like POWER4

>Pentium D dun exist guyse

AMD dominated power wise, so Intel did what Intel does best, and bribed OEMs to only use Intel CPUs.

Mostly seen as a gimmick. I made the mistake of buying a single core processor (Athlon64 3500+) and regretted it within a year.
When the Core2Quads came out I upgraded straight to one (Q6600) while people were calling quad cores gimmicks and all you needed was dual core.
Zero regrets. It was a great processor.

It was a godsend for the normal desktop world. Multitasking became a reality for the masses. You no longer had to worry about making coasters of your CD-R/DVD±R burns. Before that, you had to invest in HEDT-tier platforms if you wanted SMP support.

Wrong, SMP was niche back then even among the enthusiast crowd. You had to get HEDT-tier boards to get SMP support.

>How did people react to the original multi core processors?
I liked those PowerPCs, but thought they were too expensive and a hoax.

But once I had a dual CPU G4 setup @ 800MHz with tiny caches (coming from a single core 1.4GHz with the biggest caches available), I didn't hesitate any longer, went balls deep into the multicore meme, and bought an Intel C2D asap. Then I upgraded to 4 cores asap, one to three years later; I don't recall the exact dates.

Immediately a mediocre reaction ("looks like they ran out of ideas"), then 100% want.

>How did people react
Well, the pajeet from anandtech reacted as follows:
> single core is enough for everyone. Buy intel.
Then the other inteltards reacted as follows:
> single core is enough for everyone. Check the pajeet tech dot com
Then everyone else started buying multicore cpus, a.k.a. athlon x2s, and intel reacted as follows:
> dell,hp,lenovo stop using amd's products.
And 2 years later intel managed to glue together 2 p4s and call it pentium D.

Nah, any overpriced facebook machine did it. Plug it in and use your one button mouse with many CPUs!

You missed that completely, didn't you? I've noticed it's often like that: the Windows-only users have no idea what's going on in the world outside of Redmond, whilst Linux/BSD/Mac users have to dual boot for some rare niche software, which in turn opens their eyes to things they hadn't yet seen in their native ecosystem.

Somebody should split your head open by force if you have the audacity to call other people wrong when you're wrong yourself. Let's hope some real human bean approaches you soon.

Disable AA (it'll look like GTA 4, and nobody cried about GTA 4) and it'll run on any shitbox. It's one of the most optimized games next to Dota 2. You're full of shit, mate. GTA V even runs on fucking playstations from 15 years ago.

Idiot.

>zoomer/latecomer detected

> unironically uses facebook machine as example when SMP hardware and platforms predates failbook by a few decades

> implying that SMP world wasn't enterprise/HEDT-only until the mid-2000s

People didn't think about it much back then.
All they cared about was the massive boost in single threaded performance that this chip delivered.
And the fact that it was so fucking good at caching things, it could fool all the memory speed testing programs.
But I imagine MANY people did enjoy the fact that the dual core chip was actually able to play audio without all sorts of awful crackling when using the integrated audio chip.

Get a single core computer then; you will LOVE how Windows just goes and steals the CPU, and your sound breaks, your downloads get slow, shit stutters...

Phasing out my C2D desktops this year, but I'll keep the laptops going; they're far more useful than some think. They can really do any task smoothly if you have a good system on them, and yes, Windows 7 is one if you have 4 GB of RAM and an SSD.