Intel officially kills Itanium

Is this the biggest money and time sink project in modern computing history?

Attached: badge-itanium-800x800.png (800x800, 337K)

what is it?

All intels are time sinks until meltdown is fixed at the hardware level (still multiple years away)

What about Larrabee?

Elaborate

If they could just get it to work, it would be a very fucking good alternative to the multicore hell we're in.

In practice, I think the Cell chip was worse.

Come on, just because it had every mistake the Pentium 4 had, plus a shitton of new mistakes tossed on top, many of which compounded the Pentium 4 mistakes to get even worse, doesn't mean it's a bad chip. Just a really horrid one.

this. 64-bit x86 should never have existed. It's a kludge over outdated infrastructure. Itanium, or rather VLIW in general, is still the direction forward.

x64 was fine.
But we need an Itanium that actually works.

Mill will save us

Not even remotely close. Itanium was supposed to be Intel's DEC Alpha killer, but that market (big irons) was starting to die out by the time it reached commercial channels.

Nah, IA64 was killed because big irons were going out of style. HPC shifted towards clusters and GPGPUs and other ASICs killed whatever niche Itatium hoped to survive in.

The whole thing was designed to take advantage of explicitly parallel instruction computing (EPIC). The idea, basically, was that you could execute workloads in parallel by having the compiler determine which code chunks could be executed simultaneously and produce assembly code that fires the instructions for those tasks at the same time. Turns out it's near impossible to determine at compile time which code chunks will actually run simultaneously once the program executes out in the wild, and EPIC produced negative performance gains, just as most of CS academia warned while EPIC was in development.

The first generations of Itanium had an enormous amount of engineering and die space dedicated to EPIC, and subsequent generations were still saddled not only with instructions specific to EPIC but with an entire architecture carrying VLIW overhead, a design decision made to accommodate EPIC.

Itanium is basically what happens when a multi-billion dollar product is developed on wishful thinking.
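
To make the bet concrete, here's a hand-wavy sketch in C++ (my own illustration, not compiler output): the compiler had to prove independence statically and bake the schedule into the binary, so any runtime surprise just stalls the in-order core.

```cpp
// Three operations with no data dependences between them: an IA-64
// compiler could pack these into one instruction group so they all
// issue in the same cycle.
long epic_friendly(long a, long b, const long *mem) {
    long c = a + 1;   // independent
    long d = b + 2;   // independent
    long e = mem[0];  // independent, but its latency (cache hit or miss)
                      // is unknowable at compile time
    // This add depends on all three, so it sits in the next group after
    // an explicit "stop". If mem[0] misses in cache, the static schedule
    // can't reorder around it the way an OoO core would; everything waits.
    return c + d + e;
}
```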

do you know anything about the enterprise world or is shitposting nonsense like a sad shill the best we're going to see?
> im totally not a broke amd faggot running my setup off a $100 cpu
get fucked, amd.

Just goes to show how retarded Jow Forums is

You are the same idiots who thought Intel's dedicated gfx project was a failure

Itanium taught them so much about 64-bit. The product wasn't a massive success, but that wasn't the point. You people don't know the first thing about the influence it had on their implementation of 64-bit after AMD brought x64 out.

pathetic

do not reply

>Itanium

That is a name I have not heard in a very long time.

I wasn't even aware the Itanic was still around.

Reminder that AMD once kicked Intel's dick so hard it was years before they could stand up straight again.

Attached: sledgehammer.jpg (640x480, 44K)

Stay pozzed, niggerfaggot.

x64 was AMD's hacky 64-bit extension of x86, cross-licensed back to Intel after the Itanic sank.
IA64 was an entirely different, native 64-bit architecture, which, in retrospect, would have been preferable to x64

>You people don't know the first thing about the influence it had on their implementation of 64-bit after AMD brought x64 out.

oh wow, at last I see how an in-order VLIW design with predicated execution and basically no AGUs was secretly the inspiration for OoO designs, HOUSEFIRE hyperpipelining, then rolling back to the Pentium Pro arch with Core and then adding uop caching with SB.
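
For anyone lost in the jargon, predicated execution means instructions carry a guard predicate instead of hiding behind a branch. A rough C++ rendering of the difference (my sketch, not actual IA-64 codegen):

```cpp
// Branchy form: the front end must *predict* which arm executes,
// and a misprediction costs a pipeline flush.
long max_branchy(long a, long b) {
    if (a > b) return a;
    return b;
}

// Predicated form: evaluate the condition, let both values exist,
// then select. On IA-64 a compare sets a predicate pair (p1, p2)
// and each arm's instructions are guarded by (p1)/(p2); the closest
// x86 relative is cmov.
long max_predicated(long a, long b) {
    bool p = a > b;
    return p ? a : b;  // data dependence only, no control dependence
}
```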

They still haven't fixed it? So they're releasing vulnerable chips and people still get them?

Attached: 1323925713.jpg (600x600, 184K)

>Ceramic base
So comfy...
Makes me sad

they still make these? Last I remember only HP servers had them

They were probably forced to by a contract they signed with HP a long, long time ago.

What the fuck, they are still making Itanium chips? I thought they stopped ten years ago.

>which, in retrospect, would have been preferable to x64
Dunno, VLIW is kinda awkward to deal with.

>hacky
The only extension that allowed for simultaneous 64/32-bit execution with no perf hit is "hacky"?

kek, you sound like a butthurt Intel shill, seriously. Itanium was shit because 32-bit software was still prevalent literally everywhere and intfail expected everyone and their mothers to suddenly drop x86 because muh new arch. Meanwhile based AMD saved the day and Intel was forced to cling to the contract it made with HP to keep making this giant turd.

the entire reason AMD has chips called EPYC? fucking idiot
leave

Weird flex considering VLIW sidesteps Meltdown and Spectre but ok.

of course it's a failure
they couldn't even type the name properly

You are literally an idiot.

VLIW is just too awkward to use for general computing.

Sidesteps cost-effectiveness and usability too, but who cares about that? Scary-sounding vulnerabilities nobody's actually done anything significant with will always generate more clicks and ad revenue.

>native 64-bit architecture
I don't get this buzzphrase; it's just register width. x86-64 is no less "natively" 64-bit than MIPS, SPARC, POWER, PA-RISC, or any other formerly 32-bit desktop architecture, none of which I've ever heard remotely similar complaints about outside of maybe DEC Alpha/Itanium marketing material.

Have any of you ever even written a line of assembly outside of college courses?

Attached: 1539418757017.png (200x200, 55K)

Fucking finally.
HP should have continued developing PA-RISC instead of giving money to the Intel cons.

>t. doesn't know shit about CPU architecture
>Itanium, or rather VLIW in general, is still the direction forward.
No, it isn't. VLIW sucks for the general-purpose computation that CPUs are used for.
>Itanium was supposed to be Intel's DEC Alpha killer
It ended up being just that. Compaq bought DEC and killed off Alpha because they fell for the Itanium meme.
>but that market (big irons) was starting to die out by the time it reached commercial channels
False. What happened was most of the competition was voluntarily killed off because it was anticipated Itanium would be the be-all end-all CPU arch. AMD64 would have had a significantly harder time competing with Alpha, PA-RISC, or even MIPS if they still existed, mainly because the CPU isn't everything and the traditional UNIX servers had a much wider hardware and software feature range.
>HPC shifted towards clusters and GPGPUs and other ASICs killed whatever niche Itatium hoped to survive in.
That's a completely different decade, buddy.
>Just goes to show how retarded Jow Forums is
And you are the biggest retard of them all, you fucking tripfag.
>Itanium taught them so much about 64-bit.
It's a register size, not some kind of sacred knowledge.
The only thing salvaged from Itanium that's still used to this day is the set of compiler calling conventions, the "Itanium C++ ABI" that GCC and Clang follow everywhere (quick demo at the end of this post).
>You people don't know the first thing about the influence it had on their implementation of 64-bit after AMD brought x64 out.
The answer is "absolutely none". Unless you count "let's not produce a failure like this again" as influence.
>pathetic
Yes.
>IA64 was an entirely different, native 64-bit architecture
R E T A R D
>VLIW sidesteps Meltdown and Spectre
It doesn't. Meltdown and Spectre are unrelated to the instruction set.
With Poulson, Intel included rudimentary speculative execution, so Itanium is now vulnerable to Spectre too.
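
Since someone will ask what "calling conventions" means in practice: the name-mangling and exception-handling scheme GCC and Clang use on basically every target today is literally called the Itanium C++ ABI. Minimal demo (compile with g++ or clang++):

```cpp
#include <cxxabi.h>  // demangler interface defined by the Itanium C++ ABI
#include <cstdio>
#include <cstdlib>

int main() {
    // A name mangled per the Itanium C++ ABI, the scheme x86-64
    // GCC/Clang still follow long after the hardware itself died.
    const char *mangled = "_ZNSt6vectorIiSaIiEE9push_backERKi";
    int status = 0;
    char *demangled = abi::__cxa_demangle(mangled, nullptr, nullptr, &status);
    if (status == 0) {
        // prints: std::vector<int, std::allocator<int> >::push_back(int const&)
        printf("%s\n", demangled);
        free(demangled);  // __cxa_demangle returns a malloc'd buffer
    }
    return 0;
}
```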

X86 "the biggest money and time sink project in the modern computing history" we need to move on.. Their was a time when it was moving in the right direction, now it's not.

Attached: 1529318497710.jpg (445x344, 64K)

Yes, the cattle are supposed to mitigate it with microcode patches that still decrease performance as a result.

>do not reply
Make me, nigger tripfaggot.

I remember actually pushing this for our SQL servers back when it was reasonably new and looked promising. Sure glad I didn't get anyone to listen to me.

It was pretty well dead after Oracle backflipped on it being their preferred CPU.

I'm so happy this corrupt piece of shit of a company will die in my lifetime.

Attached: 1532756867725.jpg (988x854, 164K)

haha you stupid fag, X86 will still be here in 2030.

Itanium (IA64) was conceived when big irons were a big deal (1990s), and it was geared towards that market. By the time it came into the marketplace, big irons were being eaten alive by cluster computing and most of the big iron players were either dead or dying. Intel spent all of that time and capital on a project that ended up being for naught. IA64 fell into the same pitfalls as all the other exotic architectures of the 1980s. IA64 lived on life support throughout the 2000s only through vendor contracts with its early adopters. The emergence of ASICs/GPGPUs in the late 2000s into the 2010s was the final nail in the coffin. Intel officially killed it today because the long-term service contracts finally expired and it wants to get on the ASIC/GPGPU train that is running hot in the HPC world.

It was dead before that. There was nobody outside of its early adopters (HP) that wanted it.

The potential performance benefits weren't worth the massive headaches of transitioning to it. The hardware- and software-level emulation for x86 was a complete joke.

IA64 lived almost its entire life in a coma in the ICU. Intel finally pulled the plug as soon as the long-term service contracts for IA64's early adopters expired and they moved on to ASICs/GPGPUs for their HPC needs.

>AMD saved the day
>kept x86 alive for another century
Retard

What is the advantage of x64 over x86-64?

>It doesn't. Meltdown and Spectre are unrelated to the instruction set.

True.

>With Poulson, Intel included rudimentary speculative execution, so Itanium is now vulnerable to Spectre too.

Theoretically. I don't think anyone has demonstrated an attack, let alone suggested a viable attack path.

They are the same, just different names for the AMD64 instruction set.

>Theoretically. I don't think anyone has demonstrated an attack, let alone suggested a viable attack path.
It is so obscure that there probably aren't any researchers even working on it.

>until meltdown is fixed
you mean Meltdown as a security issue or meltdown as a thermal issue?

6/10

Attached: IMG_20190202_210637.jpg (1200x846, 140K)

Attached: Screen Shot 2019-02-02 at 13.47.52.png (980x780, 528K)

>this dude will eventually make a full doujin
I can hardly wait

Attached: anime girl comfy in bed.jpg (846x476, 71K)

your precious 7 wouldn't exist without Veesta

Gnome

>Oh no, x86 is bad! BAAAD!!
What is wrong with you RISC faggots?

>40 years of tacked on cruft is good
>let's just keep doing it instead of starting fresh

>Have any of you ever even written a line of assembly outside of college courses?

no, lol?

>erasing 40 years of software, progress, experience and documentation on a proven architecture because of some meaningless buzzwords
Can you even articulate what this "cruft" is, why it's bad and why it means anything?

>What happened was most of the competition was voluntarily killed off because it was anticipated Itanium would be the be-all end-all CPU arch.
The only company that really did this to any detrimental degree was SGI, which was already on the way to death regardless of what shitty architecture they were going to get there with. Itanium replaced much of x86's higher-end competition, yes, but it didn't kill the platforms that were built around them. They did that on their own with marketing missteps, external economic factors like the dotcom bubble, horrible hardware prices, taking advantage of vendor-locked captive audiences with ridiculous software licensing, and a general inability to compete with cheaper commodity systems that, even if boring, still did 95% of what you once needed a big iron system for by the early-mid 2000s.
>AMD64 would have had a significantly harder time competing with Alpha, PA-RISC, or even MIPS if they still existed
Quite the contrary would have happened. Just look at how AMD64 pretty much gutted the already tiny market for SPARC, POWER and Itanium workstations and entry-level servers practically overnight and relegated them to the same high-end niche or legacy-backburner status Alpha, PA-RISC and MIPS had already occupied for years. People who make statements like these just don't understand these architectures or their place in computing: they have never been able to compete with x86 in its entry-level homeland, and AMD64 only allowed x86 to grow even further.

True.

I have, and calling AMD64 "native 64-bit" is a bit dishonest; it's clearly shoehorned into a 32-bit architecture that itself shows many signs of originally being a 16-bit architecture.
Example: the encoding of PC-relative addressing is a hack
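
Concretely (my own demo; assumes GCC or Clang on x86-64 Linux, where the symbol is plain `value`):

```cpp
#include <cstdio>

int value = 42;  // global, so the inline asm can name the symbol directly

int main() {
    const int *p;
    // In 64-bit mode this lea encodes its operand as rip + disp32 by
    // recycling the ModRM pattern (mod=00, r/m=101) that in 32-bit mode
    // meant a plain absolute disp32: the same bits got a new meaning
    // instead of a cleanly added addressing mode.
    asm("lea value(%%rip), %0" : "=r"(p));
    printf("%d\n", *p);
    return 0;
}
```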

I doubt (s)he can, but I do:
Artifacts remaining from being a simple microcode engine, like not setting flags when shifting by zero.
Redundant instruction encodings.
AH to DH registers.
Partial register updates for low (AL) and high (AH) or even word (AX) writes.

I could continue, want me to? (Quick demo of the partial-register one below.)
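
Since the partial-register one is the least obvious, a quick inline-asm demo (assumes GCC or Clang on x86-64) of how the same register merges differently depending on write width:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    uint64_t r = ~0ull;  // held in RAX via the "+a" constraint below
    // An 8-bit write to AL merges: bits 8..63 of RAX survive, so the
    // core must track the partial update (AH and AX writes behave alike).
    asm("movb $0, %%al" : "+a"(r));
    printf("after mov al, 0:  %016llx\n", (unsigned long long)r);
    // A 32-bit write to EAX zero-extends: the upper 32 bits are cleared.
    // Same architectural register, two different merge rules.
    asm("movl $0, %%eax" : "+a"(r));
    printf("after mov eax, 0: %016llx\n", (unsigned long long)r);
    return 0;
}
```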

At worst these look like minor nuisances, or otherwise "who-cares" things like partial registers and redundancy to ensure compatibility. A processor exists to be used, not admired.

But at least it's a far better answer than the last time I had this argument where a retard told me we should throw everything out because the processor has to spend a few microseconds switching out of real mode during the boot process.