Emotion Engine

Haven't you noticed how weird is ps2's processor(emotion engine)?

Attached: PhotoPictureResizer_190630_015924388_crop_1080x1088.jpg (1080x1088, 131K)

Other urls found in this thread:

en.m.wikipedia.org/wiki/Emotion_Engine
en.wikipedia.org/wiki/AVX-512
en.m.wikipedia.org/wiki/Elbrus_2000
iryoku.com/stare-into-the-future
old-releases.ubuntu.com/releases/xubuntu/ports/releases/10.04/release/
en.wikipedia.org/wiki/Yellow_Dog_Linux
twitter.com/FioraAeterna/status/1126984875041419264
youtu.be/IehwV2K60r8
youtube.comhttps://youtu.be/IehwV2K60r8

Yeah, it's totally weird haha

No. How? Would you care to explain? I just know it has 128-bit registers, but I still don't know what that means exactly.

en.m.wikipedia.org/wiki/Emotion_Engine, see the "Description" section

I still don't understand shit from this.

But from what I know, it's from the times when Sony used to make quality things. Designing such a CPU was not the cheap cashgrab this company is known for nowadays.

Also, I'm more amazed that most of the games in the PS1-PS2 era had custom engines. I can't even imagine where I would start making an engine. I tried to program some shit in pure DirectX and it took me thousands of lines of code, mostly from tutorials, to display a simple rotating box. I'm making a shitty mobile game in Unity, and even that is a pretty high-end ride for me, thinking of how they were making games back then.

woah, that's pretty weird O.o
lol xD

I don't know if you are being sarcastic, but seriously, look: en.m.wikipedia.org/wiki/Emotion_Engine, the "Description" section

Look up the Sony GSCube, OP. Sony were gonna dominate with the EE/GS until (((they))) shut it all down. Even today those have features modern graphics cards can't even hope to get.

Yeah, it's all weird, but still weaker than the basic Coppermine in the Xbox.

Attached: ps2.png (850x665, 70K)

Attached: ps22.png (851x370, 43K)

Source?

The processor has instructions that can operate on multiple pieces of data at once.

Where normally you would do something like
a = a + _something_
b = b + _something_else_
c = c + _whatever_the_fuck_
d = d + _you_get_the_gist_

and the processor would do it all sequentially,
the Emotion Engine allowed performing all of this in a single operation. It could operate on 128 bits at once, so it could compute four 32-bit values, eight 16-bit ones, or sixteen 8-bit ones simultaneously. In theory, that provides up to a 16x speed improvement in some cases, which is obviously huge.
That's much faster, but it requires the programmers to write and optimize their code specifically so that it can actually make use of this. Mastering this was the key to making good-looking and impressive games on the system. As developers got the hang of this and other aspects of the system, games became noticeably better looking and more complex, with more objects on screen at once.
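To make that concrete, here's a minimal sketch in C using SSE intrinsics, a PC-side analogue of the same idea (this is not the EE's actual MIPS/VU instruction set, just an illustration of 128-bit SIMD):

#include <stdio.h>
#include <emmintrin.h> /* SSE intrinsics; a stand-in for the EE's 128-bit ops */

int main(void) {
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
    float r[4];

    /* Scalar version: four separate adds, executed one after another */
    for (int i = 0; i < 4; i++)
        r[i] = a[i] + b[i];

    /* SIMD version: one 128-bit add computes all four lanes at once */
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(r, _mm_add_ps(va, vb));

    printf("%.0f %.0f %.0f %.0f\n", r[0], r[1], r[2], r[3]);
    return 0;
}

The scalar loop and the three intrinsics produce the same result; the difference is that the vector version issues a single add instruction for all four elements.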

>Sony GSCube
>Although the GSCube had good rendering capability, they had a major bottleneck in connecting to external computers to transfer content
Like what Sony later did with the PS3?
>According to some sources, they were all sent back to Sony in Japan and were subsequently dismantled.
>Although the GSCube had good rendering capability, they had a major bottleneck in connecting to external computers to transfer content.
Where's the conspiracy?

I think that ps2 was actually made in a very smart way. Yeah, it was bizarre and very weak, but with intelligent programmers you could use these weird features to make awesome things. Shadow of the Colossus is an excellent example of that.

Damn it, I meant this part...
>The GScube was a hardware tool released by Sony intended for use in CGI production houses consisting of a custom variant of sixteen PlayStation 2 motherboards running in parallel.
Didn't multiple companies do this with the PS3?

It's a bit like the Atari Jaguar in that it's not exactly built like you'd expect from a PC. Stuff is more programmable and interconnected, but it also lacks features that you'd find on a normal GPU. The result: programming certain stuff like "shaders" in software was possible, but very annoying compared to the GCN or Xbox.

Wow, why isn't Intel investing in a 4096-bit processor?

This way they would have 64x more speed compared to current 64-bit processors.

I think we're up to 512 now
en.wikipedia.org/wiki/AVX-512
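For a sense of what that looks like in code, here's a minimal sketch in C with AVX-512 intrinsics (assumes an AVX-512F-capable CPU and compiling with something like gcc -mavx512f); one 512-bit register holds sixteen 32-bit floats:

#include <stdio.h>
#include <immintrin.h>

int main(void) {
    float a[16], b[16], r[16];
    for (int i = 0; i < 16; i++) { a[i] = (float)i; b[i] = 100.0f; }

    /* One 512-bit add: sixteen 32-bit float additions in a single instruction */
    __m512 va = _mm512_loadu_ps(a);
    __m512 vb = _mm512_loadu_ps(b);
    _mm512_storeu_ps(r, _mm512_add_ps(va, vb));

    printf("%.0f ... %.0f\n", r[0], r[15]);
    return 0;
}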

PS3 uses pretty much off-the-shelf nVidia graphics, and the Cell is just a poor IBM PowerPC impl. with some extensions like VMX bolted on.
People only got excited about it because multiple cores weren't really much of a thing back then, especially not for under $1000. That's why you had places like the US Air Force buying them up to make clusters with.

And besides, we kinda have that thanks to modern GPUs and compute shaders, which can run circles around conventional CPUs, even when using SIMD instructions.
It's just that not every task can be parallelized like that. Often you need to do things sequentially because values depend on previous computations.
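A toy example of that limitation in C: a running (prefix) sum, where each iteration needs the previous result, so the elements can't simply be computed as independent SIMD lanes (not without restructuring the algorithm into a parallel scan):

#include <stdio.h>

int main(void) {
    float x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float prefix[8];

    /* Loop-carried dependency: prefix[i] needs prefix[i-1],
       so a naive 8-wide vector add can't produce all lanes at once */
    prefix[0] = x[0];
    for (int i = 1; i < 8; i++)
        prefix[i] = prefix[i - 1] + x[i];

    for (int i = 0; i < 8; i++)
        printf("%.0f ", prefix[i]);
    printf("\n");
    return 0;
}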

its processor architecture is really weird. it has a main MIPS processor, but also two vector co-processors that chew through like four values at a time in a 128-bit register. it's kinda like hyperthreading, but there aren't two logical cores; programmers could just take advantage of that massive data throughput

while the ps2 was technically the weakest out of that generation of consoles and the least graphically impressive, its floating point units were general purpose, not a GPU. that, plus linux, meant that they were being used as clusters for a while. that was pretty cool

gotta keep in mind that the ps2 was developed right smack in the middle between what you'd call early, mostly-unaccelerated 3D and modern accelerated 3D with shaders
they didn't invent modern graphics, but rather just made the fastest ever example of "classical" 3D acceleration
like a go-kart with an F1 engine

How come the jump between ps2>ps3>ps4 hardware wasn't half as big as the jump from ps4>ps5? The ps5 genuinely looks beastly. Proper 4k 60fps+ isn't a pipe dream now.

retard you'll be lucky to get 1080/60

Reminds me of the current Russian CPU
en.m.wikipedia.org/wiki/Elbrus_2000

Step aside chiplets

Attached: AD6FD7E7-0315-404E-ADFD-1F2E7A7D1D99.png (409x367, 4K)

Reminder that PS2 is the only WMD-certified console

Attached: file.png (845x364, 48K)

Oh Saddam, you rascal!

>Saddam stockpiles PS2s
>Sony suddenly recalls all their mega-powerful GSCube boxes, cancels all work on GS/EE and switches to the underwhelming Cell
Makes you think

you call that a chipset?

Attached: proxy.duckduckgo.com.jpg (2500x2300, 1.95M)

You can literally launch missiles with them I can’t blame him

it runs on (sjw) tears

How's that?

ITT: Zoomers discover SIMD for the first time

Attached: 4885803422_1c320f9c9d_b.jpg (1024x912, 141K)

>I'm more amazed that most of the games in the PS1-PS2 era had custom engines
What happened? Unreal took over the industry? Even Japanese devs are into it nowadays

PS2: vector processor with a graphics accelerator.
PS3: vector processor Cell, oops, Sony adds a last-minute NVIDIA GPU with a shitty CPU-GPU interconnect.
PS4: piece-of-shit mobile-fail AMD processor and a shitty GPU, but 8GB RAM and easy programming; a lot of amazing papers were running on Q6600 Intel CPUs, and programmers just facepalmed because of the shitty APU.

PS5:
Monster Ryzen CPU with more cache than a $10000 Intel server CPU.
GPU with a new arch and new geometry shaders, plus raytracing and a massive SSD.

Pic related was published as a next-generation character render at GDC 2013, and then the shitty APU gen came.
iryoku.com/stare-into-the-future

Attached: B98BA065-6D39-400E-A931-38EF4A038919.jpg (2448x1377, 340K)

Video games became too expensive and complicated, and prebuilt engines save a lot of time. Western devs were already used to this, but Japan wasn't, which is why many Japanese studios had such a hard time adjusting during the PS360 era.

I know, right? haha

i fucking hate photogrammetry.

High quality but strategically stupid.

The EE and Cell CPUs were overly esoteric for gaming applications.

Nintendo basically just had IBM make a PowerPC CPU and ATI make a GPU... simple and easy.

MS basically just raided a Best Buy™ for the first Xbox.

read a book you dumb zoomer fuck, this isn't new or """weird"""

You have aspergers.

he's right though

only zoomers don't know about RISC

That 4MB of video memory is pretty interesting in that it had a peak bandwidth of nearly 50GB/s (10GB/s for textures and 40GB/s for the framebuffer), which is a crazy amount of bandwidth per pixel. Most games ran at 640x480, or 720x480 in 16:9; that is more than 2KB per pixel at 60Hz.

Compare that to hardware today and you see why it is impressive. To have an equivalent amount of bandwidth per pixel while rendering at 1080p @ 60Hz would require VRAM with a bandwidth of around 250GB/s.
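Rough back-of-the-envelope math behind those numbers, as a C sketch (assumes the ~48GB/s combined eDRAM figure; real-world utilization obviously varied):

#include <stdio.h>

int main(void) {
    double bw_ps2 = 48e9;                /* approx. GS eDRAM bandwidth, bytes/s */
    double px_ps2 = 640.0 * 480 * 60;    /* pixels per second at 640x480 @ 60Hz */
    printf("PS2: ~%.0f bytes per pixel\n", bw_ps2 / px_ps2);  /* ~2600, i.e. >2KB */

    double px_1080 = 1920.0 * 1080 * 60; /* pixels per second at 1080p @ 60Hz */
    printf("1080p equivalent: ~%.0f GB/s\n", 2048.0 * px_1080 / 1e9); /* ~255 */
    return 0;
}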

There is a reason why the heat-wave effect in Gran Turismo 3 & 4, AC4-0, and other framebuffer effects looked so good on the PS2.

They'd better make photorealistic games with 4K by default

Hey little bro

Attached: ABD6ABA3-093F-4D2A-ADAB-90CE156CDF74.jpg (4032x3024, 999K)

>PS3 uses pretty much off-the-shelf nVidia graphics, and the Cell is just a poor IBM PowerPC impl. with some extensions like VMX bolted on.
Correct. That is why you could even install the PowerPC port of Xubuntu on it.
Technically you still can.
old-releases.ubuntu.com/releases/xubuntu/ports/releases/10.04/release/

Back then they actually used interesting hardware on consoles.

>m.
kill yourself, phoneposting faggot

I know, right? Now everything is boring old x86. All the attempts to get PowerPC, SPARC, and RISC-V into consumer products have failed, so now it's x86 from here to eternity.

Attached: G1.jpg (362x220, 13K)

The ease of using an already-made engine gave us Deus Ex. Nips have never made a ludo on that scale.

This. I know nothing about coding besides taking a high school class on Visual Basic, and even I knew that.

Attached: n64.png (849x458, 48K)

And finally the PS3

Attached: ppe.png (817x1372, 120K)

re
tard

Let's see how many I can remember and look up.
PowerPC lives on in POWER9. POWER10 is supposed to be incoming, too.
Oracle tried to kill Sparc but since previous licenses are still valid, Fujitsu is designing new chips.
RISC V hasn't failed; it's so new it isn't even out yet.
ARM still thrives.
Motorola 68000 line died in the 90s, dropped for PowerPC.
HP's PA-RISC died in the 90s, dropped for fucking Itanium.
DEC Alpha dead in the late 90s.
MIPS almost died but is trying to come back. New owners; SGI is still dead I think.
Z80 line only lives embedded and in graphing calculators.
6502 line only lives embedded.
AMD64/SSE2 patents are from 2000, should expire within a few months.
LISP machines long dead.
Obscure Japanese PCs not already covered: probably dead since the late 90s.
NS32032 line (Oberon originally ran on this!) long dead.
Transmeta dead in the 2000s but only made x86 translators.
Cyrix and Winchip bought by VIA in the late 90s, only their Chinese affiliate still makes x86 chips, only for China.
Elbrus makes Russian Itaniums for Russia.

Nothing else comes to mind except a bunch of microcontrollers.

MIPS lives on in cheap routers.
There's also SuperH, but I don't know how it's doing.

People like to tout the big numbers of certain hardware, but I'm more inspired by devs taking advantage of hardware limitations to "cheat" features. For example, MGS2 used alpha-blended, mirrored geometry to simulate reflections, taking advantage of the high polygon count of the Graphics Synthesizer despite it not having shaders. A PC can be equipped with the best components available, but it'll still be running generic code designed for everyone.
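The trick itself is conceptually simple: draw the scene a second time, mirrored across the reflective plane, and blend it in with alpha. A minimal sketch of the core transform in C (a hypothetical illustration; the actual MGS2 implementation is not public):

#include <stdio.h>

/* Reflect a vertex across the plane y = 0 by negating its Y coordinate.
   Rendering the mirrored copy "under" the floor with alpha blending fakes
   a reflection with raw polygons, no shaders required (you also need to
   flip the triangle winding so backface culling still works). */
static void mirror_y(const float in[3], float out[3]) {
    out[0] = in[0];
    out[1] = -in[1]; /* flip across the floor plane */
    out[2] = in[2];
}

int main(void) {
    float v[3] = {1.0f, 2.0f, 3.0f}, m[3];
    mirror_y(v, m);
    printf("mirrored vertex: %.1f %.1f %.1f\n", m[0], m[1], m[2]);
    return 0;
}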

There is an open implementation of SuperH called J-Core: J2 for SH-2 and J4 for SH-4. Also, SuperH is used in Casio's ClassPad line of graphing calculators.

Me too

Why don’t they open source it so I can clone it on fpga?

PIII at something like 700MHz and a GeForce3, right?

You can buy m68k microcontrollers and development boards; they clock up to 266 MHz.

>PIII
Not even that. A mobile Celeron. You could swap it with a mobile P!!! if you had a BGA rework station. Same with the RAM: some boards had free spots for soldering more chips and doubling the RAM.

Why the fuck would you do that

Learn English you pajeet fuck

all i know is that development of the Cell was hellish because Ken Kutaragi was a weirdo; he wanted a specific number of SPEs because 'symmetry is beautiful'.
ofc some of them ended up getting disabled anyways.

>mobile wikipedia

phoneposters BEGONE

What's the problem? Isn't "Haven't you noticed how weird ps2's processor(emotion engine) is?" correct? I'm from Spain and english isn't my first language.

it is not that different from the cell, just on a smaller scale

the 8 SPEs on the cell were like the two vector units on the Emotion Engine

I thought the Cell comprised 1 PPU + 7 SPUs?

>phone posting

>being such a brainlet that you use a pc (which uses much more energy than a phone) to post.

The actual Cell has 8, the PS3's version has 7 to increase yields

whatever, i always forget the right number. the point was that the smaller geometric engines on the cell, connected to the main powerpc core by a fast lane, were similar in design to the main mips core on the EE, which was paired with two small vector units connected to it with very fast bandwidth

All the cool kids had Yellow Dog. Because fuck Debian.

en.wikipedia.org/wiki/Yellow_Dog_Linux

Attached: yellowdog.jpg (238x250, 5K)

He didn't call it weird.

Also, it was new in the mass market; it had been used in HPC to a degree back in the day, and we are talking 10 years ago.

>HP's PA-RISC died in the 90s, dropped for fucking Itanium.
Tragic. I still have one workstation and one big server with this CPU.

>6502 line only lives embedded.
200 million such CPUs are made every year, used in everything from mice and Tamagotchis to pacemakers. It has a VERY loyal following.

>What's the problem? Isn't "Haven't you noticed how weird ps2's processor(emotion engine) is?"
>correct?

Yes, this sentence you typed is correct. Now read the OP's post.

>Haven't you noticed how weird is ps2's processor(emotion engine)?

You mean CISC

sounds like cancer

The Reality Coprocessor was a beast hampered by its measly 4+4K RAM architecture.
That and the paranoid fucks at Nintendo, who thought they knew better and made it pretty much impossible to program.

dum frogposter
RV is so new almost nobody has made a real chip yet (SiFive is low-volume, expensive prototype hardware), give it a few years
the chinks are already looking at using it in cheap android phones because they got trump'd out of good arm cores

Sorry, it was an unconscious mistake (I mean that I was trying to write the sentence as "Haven't you noticed how weird ps2's processor (emotion engine) is?" but somehow I placed "is" in the wrong place; in Spanish we call that type of error a "gazapo"). What's wrong with the other two sentences?

twitter.com/FioraAeterna/status/1126984875041419264

fuck you retard these are vector instructions faggot kill yourself nigger

Based haha poster
haha

I think not all games ran at 100% native resolution, e.g. GTA San Andreas (i think it uses another render mode or something like that). Although i agree ps2 hardware had great capabilities and games looked awesome for the time.

>Even Japanese devs are into it nowadays
Both of the most prominent attempts for the slants to have their own engine in recent memory, FOX and Crystal, ended in development hell to the point where both directors were taken off the projects.

just program in assembly and call fork() every 4 instructions

Attached: Kazuma.jpg (500x500, 51K)

>I'm more inspired by devs taking advantage of hardware limitations to "cheat" features
This!
This is what's most interesting in retro games: the work the developers had to do to extract every little bit of performance.
This YouTube channel from the Sonic 3D lead developer is very interesting. He explains how he pulled off such graphical trickery on the otherwise really limited hardware of the Mega Drive/Genesis.
youtu.be/IehwV2K60r8

>Chinese affiliate still makes x86 chips, only for China.
China also has their own CPUs, which they use in their supercomputers.

>youtube.comhttps://youtu.be/IehwV2K60r8
my browser crashed when i clicked on that

Most PS2 games render at something like 540x540, simply because there was not enough room to store the framebuffer and textures, and devs had to resort to tricks to get everything to fit. The PS2 also lacked native texture compression, which exacerbated this, so devs had to devise their own schemes or be crafty about textures (more common).

Additional under-utilization came from the VUs simply being too hard for many to program, so many devs outright ignored them as best they could and left performance on the table. Mostly JP devs, from what I understand. The PS2 had a lot of hardware bugs as well, which added to the problem, and since the system software was in ROM, many of the bugs couldn't be fixed without a hardware revision.

Also, the PS2 had a proper scaler that could even output HD, but it was busted and thus never made available, and it was eventually removed (iirc) from later revisions. I think some group somewhere managed to enable it on cfw'd systems.

The PS2 would have benefited from another 6 months of baking, but then Sony wouldn't have had their market share advantage (being alone on the market for around a year was pretty huge).
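To see why the 4MB was so tight, here's a rough budget sketch in C (assumes a 640x448 output with 32-bit color and a 32-bit Z buffer; exact formats and resolutions varied per game):

#include <stdio.h>

int main(void) {
    double mb    = 1024.0 * 1024.0;
    double front = 640.0 * 448 * 4;  /* display buffer, 32bpp */
    double back  = 640.0 * 448 * 4;  /* draw buffer */
    double zbuf  = 640.0 * 448 * 4;  /* Z buffer */
    double used  = front + back + zbuf;
    printf("buffers: %.2f MB of 4 MB used, %.2f MB left for textures\n",
           used / mb, 4.0 - used / mb);
    return 0;
}

Under those assumptions the buffers alone eat over 3MB, leaving well under 1MB for every texture visible in a frame, which is why games dropped to 16-bit buffers, lower resolutions, or aggressive texture streaming.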

>photorealistic
if i want to see that shit i'll go outside

>All that r&d just to make ugly people
(((photorealism)))

Gamers are retarded. I highly doubt that. You need the highest-end graphics cards to maintain 4K at 60 fps, and Moore's law is dead. Computers are not improving at anywhere close to the same speed they used to. It's probably the smallest technological leap we've had between console generations. You're probably going to get 2K with dips to 20 frames per second in most games.

sounds hellish to me.

Attached: 1428408826303.gif (1024x1024, 959K)

That gif always amuses me when we look at the state of emulation for the PS1 and N64 today: the limitations of the PS1 have almost entirely been eliminated through emulation enhancements, while the N64 remains... challenging to correctly emulate and enhance.