It's 2018 and we're still using 64-bit CPUs with no sign of 128-bit in the near future

>it's 2018 and we're still using 64-bit CPUs with no sign of 128-bit in the near future

Attached: 1340707.jpg (1200x630, 86K)


Do you even understand computer architecture enough to know what benefits you'd receive?

By the way, we aren't even running true 64 bit you know, it's extended 32 bit marketed as 64 bit.

There are many commonly used algorithms that use 128-bit numbers.

>tfw it's 2018 and there are no 2300000TB RAM desktops

Attached: 1523356328318.png (671x519, 146K)

oh which include?

128-bit quicksort.

Which you can handle just fine with SIMD.

Pretty sure the PS2 was 128 bit.

That's the GPU bus width

Jow Forums' level is going downhill, and without brakes

>i-it doesnt count

64-bit CPUs are actually just 48-bit, since the AMD64 architecture leaves the remaining 16 bits of pointers unused.

en.wikipedia.org/wiki/128-bit

There would be literally no benefit right now

Gaming...

How does wider bit width on the CPU equate to an improvement in games?

Games would not benefit from a move to 128 bit

What's the point of that post?

Ability to use 128 significant figures seems pretty useful since gaming relies on mathematics.

Give me one game that would see any benefit from it.

you trollin'? that's...not how processors work at all.

besides, most mathematics are done related to graphics, which are done on the GPU which has higher bit depth than your CPU.

Here's what's up: you are a fucking retard who doesn't know shit about what he's talking about.

Mario 128 when?

What the fuck are you on about? Also, do you think GPUs are 64bit?

Explain how I'm mistaken then instead of abusing me. I thought N-bit means you have N spaces for digits at a time so an 8 bit computer's maximum number is 99,999,999 and so on.

I'm honestly not sure. Just got into the PCMR community recently so learning as I go.

The thing is that it is so easy to tell that any technical explanation would be wasted on you. Just stop posting man

>I thought N-bit means you have N spaces for digits at a time
No, it's just the width of the address bus

Bit width is in relation to binary length. 8 bits in binary, 1111 1111, gets you 256 distinct values. This doesn't even begin to explain floating point, which is what your vidya would require.

all that larger registers do is make more space addressable in memory. you can use multiple registers together to represent numbers that are more than 64 bits

So the maximum number that an 8 bit NES could use was 1111111? Shit. I'm starting to understand why people call those guys geniuses.

No you fucking retard. The maximum binary number was 1111 1111. Which is 256 = (2^8).

256, they could only count to 256.

>256, they could only count to 256.
255 actually

You have to go back.

>Just got into the PCMR community recently
literal cancer and not related to technology

>PCMR community
Kindly leave

Only if you start at 0 instead of 1, there's enough entropy for 256 distinct values.

0-255 is 256 values dingus, learn base arithmetic.

>Only if you start at 0 instead of 1
starting at 1 is not a thing in cs

> we aren't even running true 64 bit you know
Eeh? No, "we" are generally running true 64bit - from Android through Linux to Windows.

On Windows you might still have a fair share of 32bit software that ALSO runs, but that doesn't really make it "extended" 32bit - you get access to "true" 64bit registers, true 64bit operations and so on in hardware.

please see

He's a brainlet that thinks addressable memory is register width.
fwiw, we're still only using 48bit max for addressable memory.

Uh, that's just the addressable space in RAM?

But xx-bit is not indicating that - it indicates how many bits you can generally do operations on in your processor's registers. Which is 64 [or more] on CPUs used with all common end user operating systems.

>0000 0000 should mean 1
Nice recovery retard.

>Implying we even fully utilize 64bit processors to begin with
We're not using 128bit because we don't need it, and we would not even be using it to its full potential anyway. Do you even understand the difference between 32bit and 64bit? The average consumer could most likely get by with a 32bit CPU as long as their kernel supports PAE for >4GB RAM, but since no one has made 32bit CPUs for desktop machines in almost 13 years, that would be a very rare case for a normie

Until we reach 8TB RAM, 64bit processors are fine.

Sorry I was being pedantic that 1-256 === 0-255. I don't disagree with you.

Agreed but I was speaking with a moron.

No. Not 128. 2^128.

340,282,366,920,938,463,463,374,607,431,768,211,456 bytes.

>brainlets can only imagine binary as a type of decimal instead of raw entropy

They require the same entropy. (-128) to 127 can also be represented with a byte and is arguably more useful.

>raw entropy

We're talking about computers that use established conventions here.

lmao
Not only could it emulate it with very little impact using multiple registers or such, if it was the full CPU, it would likely be detrimental to everything including the game.

Bigger numbers != always better.
There comes a point where you get diminishing or even negative returns.
CPUs would need to be more like GPUs for any benefit of higher bit-ranges. There's no need for that, it's the reason GPUs exist! (and FPGAs and general family of programmable accelerators these days)

128 is fucking overkill.
Hell, 64 is fucking overkill for most shit.
Same reason that disaster of IPv6 addresses is hated by everyone: overkill.
But hey, IOT!!!!! Everything will be connected!!!
Fuck that future. Hope it gets run over.

Obviously various co-processors on modern CPU [SSE, AVX, ...] as well as various GPU bits and pieces are actually operating with 128bit or more due to practical requirements.

Universal use of a minimum of 128bit is too much, but the number of instances where 128bit or more is actually needed/desired in co-processor hardware has increased a lot recently. You likely have significant capability to process operations on 128bit or more in pure hardware in your machine, too.

You are so fucking wrong it's unbelievable. Why would you post here knowing literally nothing about the absolute basics of computing? I'd do a whole write-up on exactly why you're wrong, but other anons already handled that. Besides, I don't actually believe you're capable of learning given your mindset of spewing shit without any actual knowledge on the topic.

Just fucking leave.

128-bit CPUs would be able to address more memory than there are stars in this galaxy... not happening.

So computers have no negative numbers
Ok got it thanks senpai

What a fucking retard

> this galaxy
Never mind addressing the universe rather than just the galaxy.

I think we're creating porn a lot faster than the galaxy has created stars in the last billion years, and that's a thing we are going to address completely and efficiently if possible.

I think 128 bit architecture uses 128 bits which is not 300 kabazillion bytes.

RISC-V, bitches

Data encryption

That's pretty bold, but I'll give you a (You) especially considering the number of replies you got.

RISC-V has the capability but no one cares for now.

Not enough to justify using 128 bit yet.

The main justification for 64 bit was RAM starvation. 32 bit was limited to 4GB.

Starcraft
GTA

>tfw still fizzbuzzing with only 64 bits

No I mean computers don't start at 1, fucking retard.

Nope

48-bit addresses for physical address space,
64-bit addresses for virtual address space.

>2^48 bytes = 256 TiB
Does that mean 128-bit CPUs could actually make sense in a few decades (assuming amd64 stays relevant)?

>they could only count to 256
You can't blame other people for misunderstanding your statement when it is vague like that. When you talk about a computer that "counts", people assume you mean "counts using one of the usual data types, like unsigned 8-bit integers", and not "counts up from one like a human would".

For what purpose? We can already address more memory with 64-bit than we can stick in a single home PC. Buying brand new 128-bit hardware has no real world advantages.

Fucking idiot. Please be trolling.

Attached: 1521829282683.jpg (657x527, 38K)

Only if you're a fucking moron and think that memory is going to continue growing exponentially.

It's 2018 and PCs still can't do 128-bit like the PS2 from the 1990s. Sad!

Attached: 64bit.jpg (759x509, 50K)

I don't think that, but it doesn't seem to be out of reach when computers with terabytes of RAM are a thing right now.

How do you have less physical space than virtual space? How is that even possible?

Wrong, Windows XP was 64bit in 2001.

gaymen must go back

What do you think 128 bits mean?

>he has never heard of two-complement

128 bits = 16 bytes.

We haven't finished the useless migration to 64bits because of code monkeys, and you want to create more chaos with 128bits.

>being called a two would be a compliment for him

It will be done in 10 years, and everything 32-bit will be considered hopeless legacy 20 years from now. Y2K38 will put an end to this. However, I think migration to 128bit is important. Imagine people migrated to 64bit, 290 billion years pass, many aliens use our software and BAM, it's important to upgrade again because of the same error. I'd say let's make the jump now, otherwise aliens may feel our software is inferior.

> 0-256
More like
-128 through 128

/g

God, I really hope there doesn't come a day when you need 8TB of RAM in your normal home computer to do shit. Software in general is a bloated shit sack as it is now. Software should be streamlined and functional vs. flashy/useless shit, but that's just my opinion.

That day will come. All software will be done in interpreted text files containing usual human speech, describing its features. A special NN compiler will run through them at boot and "compile" them into programs.
Journalists will take the place of Pajeets, and poets will be like Haskell lunatics.

That would be 257 numbers (there's a zero in the middle). The actual signed 8 bit range is -128 to 127.

League of Legends

the chad tongue in cheek joker

the virgin suicides

le basedboy face

imagine 128 bit opcodes
relative jumps could be millions of yottabytes away

>gaming consoles were 64 bit 21 years ago

pcucks btfo again & forever

>8MiB
>64 bit pointers
I'm guessing it ran with 32 bit pointers? Otherwise that's a lot of wasted memory

Games don't even use 64bit for anything but addressing.
Everything, EVERYTHING is done with 32bit floats and ints.

why do you need 128 bit numbers?

even 90% of systems right now aren't using 32bit at its full capacity; hell, even 16bit

No one used the 64bit precision of the CPU.
But the console also used 8 bits (cartridge interface), 9 bits (memory interface), 16 bits (RCP) and 128 bits (SIMD packs of 16bit numbers)

Not everything in your address space needs to be in physical memory at once. If you try to access something that isn't in physical memory currently, it will simply generate a page fault and the OS will page it in from disk.

what was sizeof(void*)?

That probably varies with the compiler used back then.

>2018
>wanting a 128-bit CPU
>can't even maximize the potential of current 32 or 64-bit CPUs

16-bit is all anyone ever really needs.