The absolute state of hardware

We have free and open-source software thanks to autistic fucks like Stallman, but what about hardware?

Why isn't there some kind of alternative to chips and processors that doesn't require cutting-edge factory equipment to manufacture? Who the fuck needs 16 cores running at 4 GHz or something when all you do is browse Jow Forums and type shit?

I mean, it's like a fucking radio. Anyone can make a fucking radio. You need a factory to make a good one, but it's not like there are only 3 companies in the world that can make consumer-available radios.

I mean, why aren't there a bunch of smaller companies that make cheap but slow hardware? I don't care if it's slow, as long as it is well documented, doesn't have anal probes built into the microcode and car run Doom.

>all you do is brose Jow Forums and type shit?
>People have the same needs as I do

I'm curious how much OP would spend on this hypothetical cheap but slow hardware.

I'm not saying EVERYONE has the same needs, but quite a few people don't need a bleeding-edge gaming PC. Same reason why the Raspberry Pi has a niche.

Take a look at RISC-V.

On "cheap": under 75 bucks.

I did, and I hope they will succeed.

But the issue is with practically every component of a PC.

Get a Raspberry Pi?
Or a used laptop or some old business PC.

I do use an old ThinkPad, and I occasionally use a Raspberry Pi for certain tasks. But they are not completely free (I'm not talking about cost here), and old hardware is always in limited supply.

EOMA68

Hardware would need to cap out before open source would really make sense. There is too much innovation in hardware to commit to a single architecture and fix all the inevitable shittiness that comes with open sauce.

Software capped out in the early 2000s and has been stagnant since, with modest feature improvements that mostly just take advantage of better hardware, and far less innovation. Since software hasn't changed much in almost 20 years, the open sauce community has figured out how to do most things for free.

Cap out for which tasks?

Besides the obvious CPU and GPU there is a lot of other I/O.

Just think how many connectors there are and each one has a controller and is only popular on the market for maybe 10 years before it is discarded and replaced with something objectively better.

Of the I/O that could be converted to open source, the best candidates I can think of would be the sound card and Ethernet adapter.

>sound card and Ethernet adapter
Not would, but should. Literally no reason to keep them proprietary.

>EOMA68
If it ever gets finished.

Also, the name just rolls off the tongue.

This. RISC-V is the only open architecture that has a chance, considering it has the backing of big tech corporations.

> what is a Raspberry Pi

You can build your own slow computer from scratch using off-the-shelf components.

If you don't want to use an off-the-shelf microprocessor, you can still build your own non-integrated processor, you humongous faggot.

>Why isn't there some kind of alternative to chips and processors
cause you can't fucking carve them from a banana
they require actual precise engineering and that is hard to design

>they require actual precise engineering and that is hard to design

But not everyone needs cutting edge. 20 years is a reasonable time to figure out an open standard for some cheap 1 GHz single-core CPUs, for example.

No it isn't, you dumb fuck.

You are just as ignorant as the woman who suggested that physics was sexist because physicists had completely solved the dynamics of rigid bodies but hadn't solved fluid mechanics. She just saw two categories and assumed they were of equal difficulty.

This is you making exactly the same category error. You are stupid and should feel embarrassed.

RISC-V is making gains. The problem nowadays is that specialized tasks have specialized IP blocks. Everything is hardware-accelerated. The CPU is just one part of the chip, with 60 other IP blocks on the SoC. I don't know about desktop; why do you need a powerful CPU when you can get a good GPU for gaming?

>why do you need a powerful CPU
I don't. A Core 2 Duo is enough for my computing needs.

>good GPU for gaming
Gaming is a niche. Not everyone needs a powerful GPU.

have fun building this in your garage

>Why isn't there some kind of alternative to chips and processors that doesn't require cutting-edge factory equipment to manufacture?
There is, it's the human mind. You could produce one if you weren't an incel

Look at the tripfag OD on soya.

Closest thing would be a CPU implemented in an FPGA, and even that would be an order of magnitude slower than current consumer CPUs.

You can make your own chips with raw metal and chemicals available at the corner store.
There are tutorials on YouTube.
Anybody could make a PC from scratch at home if they wanted to.
I think the main issue stems from the fact that chip design is actually more complex than rocket science.

Go on then, sit down and make your own processor in your garage with home litho. I dare you. Livestream that shit. Base it on RISC-V or something to make your life easier.

Why are you wasting your life shitposting on a Mongolian basket-weaving board when you could be actually doing and creating what you don't think exists?

Don't let your dreams be dreams, user. Just do it.

It would be really interesting if you could pull it off, and equally amusing to watch you fail.

You won't be able to do 1 GHz, but a few MHz might be within your grasp.

>Go on then, sit down and make your own processor in your garage
This is the kind of stuff you'll be making
3.14.by/en/read/homemade-cpus

>Go on then, sit down and make your own processor in your garage with home litho

Why would I do that? If I want something like a fork or a piece of furniture, I go and buy one. Nothing is preventing me from looking up guides on how to make those objects myself, but it's not practical.

The difference is, I can buy forks and furniture from thousands of manufacturers, and not just the Intel sofa, the AMD sofa and the ARM sofa. And in theory I can start a company that produces forks and furniture, because the blueprints for manufacturing sofas are not a fucking patented corporate secret.

Nobody mentioned the Novena Kosagi project?
kosagi.com/w/index.php?title=Novena_Main_Page

Turns out when you ask people to pay >1000 bucks for a low-end machine, you won't sell much.

Sorry, but in chip production volume is the only thing that counts for price.

You can either use an FPGA or bake chips yourself.
Honestly, all we need to begin making MOS ICs is:
- a photoresist and the substance used to develop it (I wonder if the stuff used to etch PCBs would work)
- some kind of N dopant that can be spin-coated (probably phosphorus)
- a vacuum chamber and a turbomolecular pump for sputtering or PVD of aluminum (I wonder if that would work with a simpler mechanical pump)
- some kind of optical setup to do photolithography and alignment (or, if you're willing to experiment and can get a high vacuum, e-beam lithography with some kind of sensing)
- a heat source that can raise the wafer to about 1000 °C
- some acids/etchants (not sure which ones are strictly necessary)

But making furniture sounds kinda boring, making CPUs sounds friggin cool.

>Base it on RISC-V or something to make your life easier.
Hahahaha you don't know what you're talking about pal.

Wrong. I can't buy a turbomolecular pump from the corner store.
Design is pretty easy with modern tools, manufacturing is hard.

hackaday.com/2017/02/25/the-fab-lab-next-door-diy-semiconductors/

Yeah, I've seen his videos, but his family is pretty wealthy from what I've seen. He probably dropped like 150k on his lab.
I suspect it can be done for much cheaper, and TBQH I think his process is not good enough for making a CPU, because he doesn't seem to have any kind of alignment system between the different lithography layers.

>doesn't require cutting-edge factory equipment to manufacture?
Because all IC manufacturing needs such equipment.

>doesn't know about the cheap cutting-edge equipment sold on Chinese Alibaba

No one will make a design for a manufacturing process that doesn't exist.

>car run Doom
youtu.be/NRMpNA86e8Q

Literally this. All you need to make a computer is a bunch of transistors.

>thinks hardware able to be made in large enough numbers to be on Alibaba is "cutting edge"
Cutting edge for plebs, maybe, but actual cutting-edge manufacturing equipment is deeply housed in the likes of TSMC or Intel, with some stuff only existing in numbers you could count on one hand.

>Who the fuck needs 16 cores running at 4 GHz or something when all you do is browse Jow Forums and type shit?
JavaScript web developers have sadly made web browsing a processor-heavy task.

But you can make your own processors; you just need contacts at TSMC/GloFo.

>implying RISC-V chips won't have fuckloads of backdoors in them after the big manufacturers EEE the standard

Not all IC manufacturing requires Intel-level technology.

Masks for lithography are a pain in the ass; they mess up economies of scale and flexibility.

so a phone?

Not to mention GPUs are mostly for graphics and some physics. It's rarely clear which tasks are suitable for the GPU, and the threading model GPUs use is obtuse and counter-intuitive. Writing GPGPU kernels is a highly specialized skill set with an obscene number of pitfalls that won't make any goddamn sense to most programmers.

GPUs aren't going to stop games from being CPU bound for a long, long time.

I never understood all the shit about warps and stuff. It's like they try to intentionally obfuscate how it works.

Warps and wavefronts, as concepts, aren't that difficult. Here's the gist of what's happening:

Your GPU has processors called SIMDs, which are most efficient when every thread is doing the exact same thing at the exact same time, although they will almost certainly be operating on different data. SIMDs are designed to do this whenever possible. This is why you have to be careful when branching in your kernels: if a thread gets out of step with its SIMD brothers, everything comes grinding to a halt, because the SIMD has to pay special attention to that thread until it can hopefully resync with its brothers. This desync is called divergence. The resync is called convergence.

Warps and wavefronts more or less refer to groups of these SIMD brothers, although there are some technicalities about them that I don't really understand. For the most part, what I've explained is sufficient to understand what's happening. Warps are Nvidia's version of this idea, and they are ideally 32 threads wide. Wavefronts are AMD's version of this idea, and they are generally 64 threads wide. You want to fill these as close to the brim as you can for best performance.
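
To make the "fill them to the brim" point concrete, here's a minimal CUDA sketch. The kernel, the array size, and the per-thread work are all made up for illustration; the only point is picking a block size that's a multiple of the warp width so no warp launches partially filled:

#include <cuda_runtime.h>

__global__ void do_work(float *data, int n)   // stand-in kernel for illustration
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;               // trivial per-thread job
}

int main()
{
    int n = 1 << 20;                          // say, a million elements
    float *d;
    cudaMalloc(&d, n * sizeof(float));

    // 256 = 8 full warps of 32 threads (and 4 full 64-wide wavefronts),
    // so neither vendor's thread groups launch half-empty.
    int block = 256;
    int grid  = (n + block - 1) / block;      // round up to cover all n
    do_work<<<grid, block>>>(d, n);
    cudaDeviceSynchronize();
    cudaFree(d);
    return 0;
}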

One of the big reasons why GPGPU programming is so difficult is you have to be incredibly careful about branching. If you can guarantee SIMD brothers will follow the same execution path through your kernel, then branching is fine. If you can guarantee your SIMD brothers will only rarely diverge, then branching is probably fine.

Note that even something as innocuous as using && or || can cause divergence, because those operators short-circuit: if the first operand already determines how the Boolean expression will evaluate, they won't even bother doing the full check. If even one of your threads falls foul of this, then you've got divergence. In their place you can use the & and | bitwise operators.
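
Here's what that looks like in a kernel (a sketch with made-up names, launched like the one above). The commented-out line short-circuits, so some threads in a warp skip the second test while their brothers evaluate it; the bitwise version makes every thread do both comparisons, keeping the warp in lockstep up to the final select:

// Hypothetical kernel; only the shape of the condition matters.
__global__ void in_range(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float x = in[i];

    // Divergence-prone: && skips the second comparison whenever the
    // first one is false.
    // out[i] = (x > 0.0f && x < 1.0f) ? x : 0.0f;

    // Warp-friendly: & evaluates both comparisons in every thread.
    out[i] = ((x > 0.0f) & (x < 1.0f)) ? x : 0.0f;
}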

In cases where you absolutely have to have wild branching, you might need to do some fuckery like this to prevent divergence: quora.com/Write-a-c-program-to-print-the-bigger-number-between-two-numbers-without-using-conditional-operators
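
The trick in that link boils down to something like this (a hypothetical helper, not anyone's production code): the comparison still happens, but its 0-or-1 result picks the answer arithmetically instead of through a conditional operator.

// Hypothetical helper for illustration.
__host__ __device__ int branchless_max(int a, int b)
{
    int gt = (b > a);             // 0 or 1, and no branch taken
    return a * (1 - gt) + b * gt; // selects whichever operand won
}

So branchless_max(3, 7) returns 7 without an if, else, or ?: anywhere, which is exactly what you want when every thread in a warp has to take the same path.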

GPGPU programming is a whole other beast.

You don't need a turbo pump to make a PCB with chemicals. I haven't looked into it, but it definitely seems possible.

Also, OP is a retard and should buy a Raspberry Pi.

I'm talking about making a CPU die.

Shit yeah that isn't going to happen without a fab I don't think

for some reason I read his post about "chips" and pictured simple PCBs

I mean, aside from some trim along the edges, RISC-V is basically production ready. For some tens of thousands you can get used chip fabrication equipment. Setting up a clean room to minimize defects in the chips during production is extremely expensive. From there you just package the chips up nice and neat and design a socket or BGA layout. The hardest part would be designing a motherboard. All the various controllers and stuff could also be built in your small-scale fabrication facility, but that takes more R&D. There's also some need for boot firmware. I'd go with something a bit like coreboot, where the low-level boot firmware comes up first and then boots a payload: a kernel, or another bootloader from disk like GRUB2.

If a bunch of NEETs were able to pull off a somewhat usable clean room and equipment in a warehouse, I'd be shocked if they could make anything better than a 50 MHz RISC core with anything better than a 50% failure rate during or after fabrication.

Things have already hit a wall. Literally any machine from 2008 or so still works fine for shitposting and spreadsheets, especially in desktops. The same couldn't be said in 2008 about a 1998 PC. Now is the perfect time to push for open-source hardware.

Shitberry Pis use a GPU that requires non-free blobs and the way they boot is fucking dumb.

I hate to break it to you, but it's the absolute state of software. Form does not dictate function; it's the other way around. Hardware has to keep up with software that grows ever more bloated and complex.
Bill Gates once said that if car technology had advanced like computers did, we'd all be driving the equivalent of Ferraris that cost $100 and run on a thimble of fuel, or words to that effect. But he was talking about hardware, not software.
Think about it: when game developers come up with a graphical optimization, it means fewer lines of code, less calculating, faster performance on older hardware. With operating systems and just about any other kind of software, the opposite happens. They get bigger and more bloated with little or no thought to optimization. We need bigger hard drives. Why? Because Windows (or name your OS) is 10 times bigger than it was 10 years ago.

Lots of people are mentioning the Raspberry Pi. What can you do on a Windows PC with a 10 GB operating system and no graphics card that you can't do on a Raspberry Pi?

Bit of a minor point, but optimizations do not necessarily mean fewer lines of code or fewer calculations. You could have a situation, for example, where your program's threads are running tasks in a sub-optimal order, with the critical path repeatedly interrupted by stuff that could feasibly happen at any time. Reorganize when things happen, and you could wind up with more CPU time that you could fill with even more calculations, and still have an overall performance gain.

>RISC-V is basically production ready
Why are there so many retarded RISC-V shills on Jow Forums? Get it into your fucking heads: there is NO existing RISC-V core design that's open source.
The only difference between it and x86 is that the instruction set itself is not patented.
>The hardest part would be designing a motherboard
No, it wouldn't, unless you mean an accelerated GPU. But that's not really necessary even today, except for gaymes or CAD.
>For some tens of thousands you can get used chip fabrication equipment
Doubt it. You will need a scanning electron microscope if you want to make 386-tier CPUs, especially if you want to minimize defects as you said. That alone would run somewhere into the six figures. And I don't even know that you can get used photolithography equipment.
>I'd be shocked if they could make anything better than a 50 MHz RISC core with anything better than a 50% failure rate during or after fabrication.
That would probably be extremely impressive even for seasoned professionals without at least seven figures' worth of equipment.
>Things have already hit a wall. Literally any machine from 2008 or so still works fine for shitposting and spreadsheets, especially in desktops. The same couldn't be said in 2008 about a 1998 PC.
Not really. The difference between a PII and a Core 2 Duo isn't that big.
But yeah, the bottleneck currently is HD video (excluding games); everything else can more or less be done on 2005-tier hardware.

What Andy giveth, Bill taketh away.

Moore's law will end once we get to 5-3nm nodes in like 3 years. Thus, our computers and phones will be the most powerful they can be, unless we come up with something revolutionary like optical computers. The smaller hardware companies will catch up, and some of them will produce free hardware.

I know how they work in principle (a few real CPU-like cores, except with less memory-management stuff and a shitton of ALUs per core, i.e. very long machine-code instructions), but I wish they were a little more open about how their drivers/firmware magically convert these "warps" into these locked-in series of instructions, and why they need to be some arbitrary size n, and all that stuff.
It feels too abstract otherwise, like they're making up arbitrary magic rules that you're supposed to follow for reasons you aren't supposed to understand.

>what is graphene: the post

brb hacking my newborn as a spare SSD to download my porn in

*turns into a trannie*
heh nothing personnel dad

>in like 3 years
it's gonna be at least 15 years

graphene is a meme

Samsung announced 3nm nodes by 2021. Intel announced 5nm nodes by 2020.

>Moore's law will end in 3 years
Does it give you no pause at all to consider that people have been saying things like this for decades?

If you knew what the breakthrough was going to be that will enable the next generation of technology, you could make it now. It's the nature of breakthroughs that you can't see them coming. And yet, there's no reason to think that they'll stop any time soon.

Why is it a meme? It has much better electrical characteristics than traditional FETs.
All that needs to happen is for somebody to find a self-aligning process for graphene FETs at a reasonable integration scale.

That's what the Cambridge undergrads are doing this year. Just the base I instruction set, of course, and if you were doing it in your garage then wow, it'd probably be an 8-bit equivalent with like 4 registers, but the instruction decoder is much less of a headache compared to something complex like a Z80 or 6502.

There are several. Example: github.com/ucb-bar/riscv-boom (don't actually use this one; its out-of-order execution is affected by Spectre, but it's useful for research)

The patents on x86 itself have LONG since expired; you could do x87 and MMX as well. Base XMM/SSE has just expired, but SSE2 is still active, and the x86-64 patents expire in 2025.

You will inevitably reach the physical limit where quantum effects like quantum tunneling start happening. It will happen past 3nm nodes. Computer hardware will become a mature technology like cars and guns, where we've reached 90% of the best we can do.

Ok, I admit that graphene looks promising for future computing.

>You will inevitably reach the physical limit where quantum effects like quantum tunneling start happening. It will happen past 3nm nodes. Computer hardware will become a mature technology like cars and guns, where we've reached 90% of the best we can do.
Do you think that people have been making claims like yours for decades without thinking they had good reasons to make them? It has always seemed like there was some insurmountable problem just years away. It has never been true. There is no reason to think that this time will be different. You cannot see breakthroughs coming. You can expect them to happen anyway.

"Moore's law will end" is the Jow Forums version of "the end is nigh".

Hardware is too expensive, you dumb nigger. Do you have a couple of billion dollars lying around for a micro fab + R&D?

No one is seriously considering graphene as a transistor material for digital circuits. Graphene is quasi-metallic (it has no bandgap) and cannot act as a switch like a typical semiconductor.

How is RISC-V easier to decode than a fucking 6502? The 6502's instruction set was optimized for a simple, low-transistor-count design (an explicit design goal of the chip, so it could be incorporated for a couple bucks into things like the NES), and that was 1975's idea of "simple".

Yeah, I wasn't clear enough. I'm not talking about 2D graphene; I'm talking about semiconducting carbon nanotubes. youtube.com/watch?v=RlD2Tp3ula8

>the woman who suggested that physics was sexist because physicists had completely solved the dynamics of rigid bodies but hadn't solved fluid mechanics
Who?

Luce Irigaray
He's referring to the book "Fashionable Nonsense"

Germanium is superior to silicon. But what will we do with all this sand?

ur mum

Carbon nanotubes (CNTs) are a very different beast than graphene. IRL semiconductor people are highly polarized about the viability of CNTs. Some people absolutely hate that shit; others are pushing pretty hard to use CNTs either for scaled nodes or for 3D chips. DARPA is investing pretty heavily in CNTs right now (currently $61 mil out of their 2018 $1.5 bil electronics initiative):

spectrum.ieee.org/tech-talk/semiconductors/design/darpa-picks-its-first-set-of-winners-in-electronics-resurgence-initiative

The biggest problem with using new materials to replace silicon is $$$. People often forget that silicon has had constant investment in process technology and research since the ~1970s. It's very difficult for shitlet academic meme materials to compete with decades of silicon process optimization.

The DARPA POSH initiative (part of ERI) is also trying to help spur open-source hardware designs. I think integrated circuit design will eventually become more like PCB design, with low-cost older-technology fabs (using 90nm or 45nm nodes) able to shit out cheap custom designs for people who don't need high performance.

Anyone who thinks you can buy equipment and fab usable CMOS chips in their garage is a fucking retard who has never actually done semiconductor processing.

t. semiconductor fag here

RISC-V, OpenPOWER, OpenRISC

What about NMOS chips?

nigger there's a reason we switched away from germanium

>You need a factory to make a good one, but it's not like there are only 3 companies in the world that can make consumer-available radios.
There are literally only 3 chips available (MPEG jew licensing) that decode DAB+ (digital radio), so either you're trolling HARD or made a funny mistake.

I didn't know that, actually.

They say they will ship on the 31st of October if you pre-order, so I guess it is finished.

hackaday.com/2018/09/17/a-1-linux-capable-hand-solderable-processor/

Here's an open-source board for it:

github.com/petit-miner/Blueberry-PI

Digital was a mistake.

Definitely. It's never going to be a thing, I say.
People have tried to love it, but it just doesn't work. Instead of the mild static noise and mono you get with analog, you get complete silence with bad reception.

Digital is a solution to a problem that never existed, because 40 kbit/s HE-AAC v2 is still far from hi-fi quality. There are no upsides.

>Intel announced
Into the trash.

>Has never heard of Zuse or relay logic or even mechanics and Babbage.
Uh-huh.

>have fun building this in your garage
I know.
boingboing.net/2017/12/26/high-school-student-built-chip.html
Did you?

They can announce it, but until it's actually released it doesn't matter. Didn't Intel announce 10nm for like 2015 or something?