Theoretically speaking, could someone use RISC-V to make a free as in freedom GPU?

Attached: 55.png (213x237, 8K)

Yes, it's just a matter of extending RISC-V with SIMD and similar extensions.
Some company could theoretically extend RISC-V for massive parallelism.
The only issue is that certain operations, like texturing, can only be done with good performance using dedicated ASIC units, so they will have to slap some extra fixed function hardware on top of it.
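To give an idea of what that looks like from software, here's a minimal sketch in C, assuming a toolchain that ships the ratified RVV v1.0 intrinsics (riscv_vector.h) — the function name is made up. One vector instruction does the work of a whole row of scalar lanes, which is the same trick a GPU's ALU array pulls:

// y += a * x over n floats, using however many lanes the hardware gives us.
#include <riscv_vector.h>
#include <stddef.h>

void saxpy_rvv(size_t n, float a, const float *x, float *y) {
    while (n > 0) {
        size_t vl = __riscv_vsetvl_e32m8(n);            // how many lanes this pass?
        vfloat32m8_t vx = __riscv_vle32_v_f32m8(x, vl); // load vl elements of x
        vfloat32m8_t vy = __riscv_vle32_v_f32m8(y, vl); // load vl elements of y
        vy = __riscv_vfmacc_vf_f32m8(vy, a, vx, vl);    // vy += a * vx, all lanes at once
        __riscv_vse32_v_f32m8(y, vy, vl);               // store back
        n -= vl; x += vl; y += vl;
    }
}

The same binary runs unchanged whether the part has 128-bit or 4096-bit vector units, which is exactly the kind of scalability you'd want for a GPU-like design.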

You’re gonna want to think smaller, OP. It’d take a village to make a decent GPU. If you want to do it for learning purposes, though, I’d suggest reading some literature and trying to make something cutting edge. You could probably fuck around with some ray tracing ASICs and AMD might hire you.

Nvidia, AMD, Intel, Qualcomm, Samsung, and Imagination own thousands of patents covering GPUs, and only Nvidia and AMD hold the vital ones.

Specialized graphics functions are very hard.

Checked
No it wouldn't
It would just have less performance per dollar
Heck, you could use an entire wafer for your gpu if you want, only problem is your gpu would cost about 90k

bump for free hardware

do those patents control "bread-and-butter" GPU technologies?

>90k
how so? are fabrication processes patented too?

But it would though. For the software drivers alone it's going to take a huge team of people. For designing a chip, the easiest part is the digital logic, which I’m assuming you’re focusing on. Enjoy wasting your time, I tried to give you good advice.

what's so hard about writing software drivers, especially if there's only one chip to tackle

ganoo/loonix community writes free replacement GPU drivers all the time

costs for a single bleeding edge wafer for, say, intel are probably around 30k-50k
if you are a new business then r&d is gonna be a much bigger portion of costs, and you get less economies of scale. so yeah, probably gonna be selling those wafers at 100k if you wanna break even

what if I'm a research project for a university with intentions to give away the finished design under BSD/apache

I'm pretty sure there are OpenCL implementations of OpenGL. and I'm sure there are OpenCL compilers for LLVM intermediate code. then you roll your own backend. bam, you got the whole graphics pipeline solved.
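the vertex stage of that GL-over-CL layer would be a kernel shaped roughly like this — a sketch in OpenCL C (a C dialect); the kernel name and the column-major mvp layout are my own assumptions:

// Hypothetical vertex stage: multiply every vertex by a 4x4
// model-view-projection matrix, one work-item per vertex.
__kernel void transform_vertices(__global const float4 *in,
                                 __global float4 *out,
                                 __constant float *mvp)
{
    size_t i = get_global_id(0);
    float4 v = in[i];
    out[i] = (float4)(
        mvp[0]*v.x + mvp[4]*v.y + mvp[8]*v.z  + mvp[12]*v.w,
        mvp[1]*v.x + mvp[5]*v.y + mvp[9]*v.z  + mvp[13]*v.w,
        mvp[2]*v.x + mvp[6]*v.y + mvp[10]*v.z + mvp[14]*v.w,
        mvp[3]*v.x + mvp[7]*v.y + mvp[11]*v.z + mvp[15]*v.w);
}

you'd still need a rasterizer kernel after it, which is where the fixed-function hardware mentioned earlier usually earns its keep.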

You have to support most versions of OpenGL, DirectX, etc. It has to be performant. Every step in the process has to be a rock-solid piece of software. It's just so much work. The reason manufacturing a GPU is so expensive is because it is: it's a big engineering feat that takes a lot of people. It's like building your own operating system. You don’t want to end up like our lord and saviour Terry A. Davis, right?

so? you could lower the costs of r&d but the university isn't gonna give you money to fab GPUs for random people, the actual device is still gonna cost at least enough to cover the costs of fabbing+management
and academics can't ever do anything right, so the GPU's gonna suck and nobody is gonna buy it, which means less economies of scale. heck, you're probably gonna have to make do with an FPGA because there aren't gonna be enough orders to justify an ASIC

what about vulkan and DXVK

it will come... RISC-V can spawn 256 cores easily, even more, pissing on DX and all the other stupid APIs just to draw a fucking triangle.

I remember when Voodoo came out, the API was so clear and good that all the vendors were shitting their pants. The only problem is the resistance from the big guys.

I am pretty sure something RISC-based CAN do it, and it is inevitable.

>Directx
well... if you are making an open source gpu you probably aren't aiming for the goymer market. in any case fast-ish translation from directx to opengl is possible and being improved

I could beg people for money on kickstarter

novidya is actually the biggest player in the riscv alliance I think

why tho

then do it. you probably could make an fpga prototype with a single (good) hardware engineer in less than two years, and a basic graphics pipeline in another year, so you need about 300k to get started if you hire internationally

why what? they see real massive multithreading is the future, instead of the current SIMD approach, so they're developing small, lean cores for their GPUs

yes, CUDA cores aren't actually that fast individually; the power comes from the number of them, easy math

instead of chasing MHz to gain a measly 1-3% increase, one more core DOUBLES everything

I believe GPU manufacturers have plans to use RISC-V chips as schedulers.

problem is CUDA cores aren't actually cores, they're just ALUs inside a core. right now your gpu only has like 5 to 15 actual cores
riscv can change that, giving us like 100 actual cores per gpu or something like that
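a toy model of it in plain C, with illustrative numbers (the 32-lane "warp" matches what novidya does, but the names are made up):

#include <stddef.h>
#define WARP 32  // one instruction stream drives 32 ALUs ("CUDA cores")

void warp_fma(size_t n, float a, const float *x, float *y) {
    for (size_t i = 0; i < n; i += WARP) {         // ONE fetch/decode per group...
        for (int lane = 0; lane < WARP; lane++) {  // ...replicated across 32 lanes in lockstep
            if (i + lane < n)                      // lanes past the end are masked off
                y[i + lane] += a * x[i + lane];
        }
    }
}

the inner loop is what the hardware does in a single cycle across its lanes; the "core count" people quote is really the lane count.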

Seems like a cool idea. Took me a while to wrap my head around it. I’m not exactly sure how it would work though. So you need to write a library that targets your GPU using OpenCL, and that’s it, right? I’m not sure how efficient that’d be, but sounds like it would work.

I'm assuming being part of an open hardware alliance probably means they are expected to contribute something back to it. why would they do that if they could just take the ISA for free and do their own thing in-house

why just scheduling? riscv was built with extensibility in mind from the ground up, so they can add long as fuck SIMD instructions just fine

Don't forget masks cost up to $1m each for high-performance semiconductor device manufacturing.

>I’m not sure how efficient that’d be
not very efficient. if you actually want drivers+firmware+microcode that performs at novidya levels you are looking at spending like minimum 20 million, probably much more just on software developers
drivers are actually a meme
all the magic happens in the encrypted firmware running on the GPU itself

I doubt that, dude. a real open gfx API is not as complicated as you think; single devs have written example APIs on top of a framebuffer just to show how things work. you overcomplicate everything

oh, yeah. they are hoping the open source community can build them a good optimizing gcc/LLVM RISC-V backend for free
the only thing they contribute is the instruction set and the shilling of said ISA to open source developers
good point. in any case he'd want to start with an fpga. I mean, this has actually been done (the fpga thing, or maybe they made a low integration ASIC, don't remember) piggybacking on an avionics commercial project, and only like 100 people showed up to buy it. there aren't enough freetards willing to shell out the big bucks to make an open source GPU happen

noob question: could an array of RISC-V boards connected in parallel process graphics at an OK level? like how SLI and crossfire do for graphics cards

>framebuffer
kek
if all you want to do is display shit on the screen, sure, an EE graduate can do it within a month
we're talking about 3d acceleration here

>3d acceleration

theory is the same, transforming vertices by perspective is simple math
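literally this much math — a plain C sketch, with the struct names and focal length being my own illustration:

typedef struct { float x, y, z; } vec3;
typedef struct { float x, y; } vec2;

// Project a camera-space point onto the screen plane at distance f.
// The whole "trick" is one divide: farther away (bigger z) means smaller.
vec2 project(vec3 v, float f) {
    vec2 p = { f * v.x / v.z, f * v.y / v.z };
    return p;
}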

Yes, but not at the performance of a SIMD processor (which is what current GPUs are). because graphics are very well suited to a SIMD approach, conventional multithreading is worse in performance-per-dollar terms: you'd be wasting die size on unneeded instruction fetch/decode logic.
but sure, if you can feed them enough memory bandwidth there's no reason you couldn't play GTA V at 500 fps using only RISC-V cores. with SIMD you have the advantage that you only have to load the data for each ALU; with MIMD (a conventional CPU) you also have to fetch instructions into each core, which cuts efficiency
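back-of-envelope, with illustrative numbers: say instructions are 4 bytes and you're doing y += a*x on 32 floats. a 32-lane SIMD unit fetches one FMA instruction for all 32 results, about 0.125 bytes of instruction traffic per result; 32 scalar cores each fetch their own FMA plus loop and branch instructions, 4+ bytes per result, roughly 32x the instruction traffic. the data traffic is identical either way, so the MIMD overhead is pure loss, in both bandwidth and the fetch/decode silicon parked next to every ALU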

key word being "efficiency"
like I said, sure, you can accelerate graphics by reusing existing software. but there's gonna be a lot of abstraction layers so it's gonna be shit, and modern gpus still have a lot of fixed-function logic in them. to implement openGL efficiently, close to the hardware, plus add support for whatever fixed-function logic you choose to put in the hardware, you're gonna need a lot of manpower.
if implementing a graphics api were so easy, amd's opengl implementation on windows wouldn't be shit compared to directx and amdgpu

Remember that it takes a minimum of $3k and 3-9 months to have a small IC produced on a 1um or 90nm process. Not to mention you can't diagnose jack shit there.
So you would pretty much fork over close to a million, at least, to produce one wafer on an up-to-date process at GloFo or TSMC.

nobody cares about opengl, there is nothing optimized in the video card for it, you really are naive

>nobody cares about opengl
>there is nothing optimized in video card for it
ok rajeesh, I'm sure you are quite the expert on the subject

STOP USING POLYGONS
youtu.be/5AvCxa9Y9NU

Attached: hqdefault.jpg (480x360, 23K)

I want a RISC-V desktop system. Don't give a shit about free as in commienism GPU

check the opengl docs, fucktard. the opengl spec is implemented through ICD drivers, and vendors are making efforts to adapt their hardware to the spec via the ICD

what do you think mobile games on mobile GPUs run on, pajeet? directx?

>don't give a shit about freedom
what's the point then, just be an intel/amd slave and buy their proprietary shit like a good goy

>micropajeet unintelligible babble
ok rajeesh

your proprietary GPU has access to the system's RAM and network with the OS being none the wiser, with a massive amount of onboard ram and a very strong GPGPU, and an encrypted firmware running on a secret, proprietary, non-documented arch.
if you're worried about the ME or spectre/meltdown, you should be even more worried about your GPU.

Can you point me in the direction of one of these patents? I have looked through Nvidia's patents in the past looking for the fabled GPU patents and come up completely empty (there were some but they were all for fringe functionality like SLI, but none for actual vital stuff).

it doesn't matter what the patents actually are for. just look into SGI's patent trolling
you could get sued over a patent on a "three-dimensional image generation system" for all you know. but then you gotta have the lawyer fees to withstand that (or pay them off to stop bothering you if it's cheaper)

did this meme take off?

I didn't even see any patents for vague things either, it was all fairly specific.

>overdramatic voice acting for normalfaggots
>gta v graphics
>real world
pick one and only one

I love how elaborate these trolls are. They’ve been scamming VCs for literally a decade. Their newest project has absolutely nothing to do with this rendering technique. Great company.

>what is baking in textures
>what is LOD scaling
>not actually not using polygons
disgusting attempt at marketing a meme technology. don't fall for the meme varg

they even got a gorillion shekels from the australian government, lol

>Their newest project has absolutely nothing to do with this rendering technique.
??? are you talking about their scanning tech? guess those VCs wanted some ROI; their latest projects are in holograms, using the voxel technology for those purposes

Attached: Screenshot_20181003-135211.png (1080x1920, 134K)

dumb phoneposter

And those island monkeys think they’re better than us.

Stop shilling their shit. Their latest technology is fuckin' vaporware: a holographic air hockey table.

anandtech.com/show/11101/amd-files-patent-complaint-against-mediatek-lg-vizio
And AMD won; Nvidia holds even more patents.
They're extremely vague yet cover vital parts of a GPU: GPUs have special hardware functions for textures (loading, decompression, processing), polygon processing, and shader programs.

they're apparently using it for static point cloud capture of locations for real estate demos and heritage site reconstructions and shieet. I'm not sure any of that is true, sounds like a huge money laundering scheme to me

where my hardware niggers at? bump

Attached: sam zeloof's chip.jpg (800x450, 68K)

embedded/ee peasant here desu

Is this yet another Jow Forums project that will never see the light of day?

do you have that list of Jow Forums projects that never happened?

That AMPRNet-esque thing, those browsers, various different pieces of software...

no. quads say the project will be successfully completed

Attached: 1546895327927.jpg (480x360, 36K)

has this place made anything other than tox nyaa and clover"""""""""OS"""""""""

>sifive risc-v devkit costs 60 dollars
I'm too poor to buy it

Attached: 9gag_2.png (232x254, 50K)

dev.sifive.com/freedom-soc/evaluate/fpga/ says 120 dollerinos

it's the same difference for me

meaning?

well, faggot, do you have a couple million around to get us started?

can't buy it anyway, not for a couple of months at least

>well, faggot, do you have a couple million around to get us started?
Crowdfund, you mong?

Not really, no.

what about those tons of booru scrapers and hentai userscripts

I don't think that is really on par with starting up our own "open source" hardware manufacturing project.

is this even a project? how are we gonna communicate with OP

Let's crowdfund some NEETs who took an undergraduate digital logic design class and now somehow think they're capable of designing and manufacturing a GPU and the software to run it. I've got a better idea: let's just dig a big pit and throw money into it, and save ourselves some fucking time.

IRC? We will need to crowdfund this though, to get enough start-up capital.

>undergraduate digital logic design class
Aren't those the same sort of people who made modern computing what it is today?

and who's gonna be doing the design part exactly? OP doesn't seem very capable to me, and I need to know what's happening before I throw money at it

>freedom
I have the freedom not to buy shit so fuck off foot cheese

I'm a bioinformatician/genetic engineer. I can handle the programming side of things, but I'm no EE/CE.

>your proprietary GPU has access to the system's RAM and network with the OS being none the wiser, with a massive amount of onboard ram and a very strong GPGPU, and an encrypted firmware running on a secret, proprietary, non-documented arch.
doesn't matter to me. I want a RISC-V machine to play around with. don't care if aymd or nvidia read my porn history

>bioinformatician
lol meme

what kind of programming work do you do, user? can you do GFX APIs?

Fuck you, bio data is pretty important when trying to solve issues in things like genetic engineering. You hurt my fee-fees.

I can, but I would probably need to leave my research position to have enough free time.

I'm a NEET college flunkout who knows some C/C++ and is trying to learn embedded systems

I'm a math flunkout, so don't feel too bad.

Maybe, but there's a difference between being a monomaniac developing a product in an emerging field and being a ho-hum college graduate who spends his time browsing Jow Forums. Don't get me wrong, there's nothing wrong with being curious, but trying to break into the GPU market is just not a feasible task right now, especially for the average person on this board. I think most of the founders of semiconductor companies had PhDs anyway, not that that matters much.

I thought we were doing this for the freedom, not for the dollar?

Attached: images.png (399x368, 8K)

We'll all make it... some day!

Attached: drunk_wojak.png (1000x1000, 119K)

yeah, breaking into the commercial market shouldn't be a goal. if some anons can come up with a prototype that can be replicated or a design that can serve as the basis for future open source hardware efforts, this will be a success

I swear Stallman said something about "open source" hardware? I mean, besides the refurbished Librebooted Thinkpads. If memory serves, he mentioned this in a relatively recent interview.

I'm an android app developer and former webdev. I know it's a pajeet-tier job but can I help?

Dude, this would be to fill a hypothetical (not existing in reality) freetard market. If you actually developed a competitive GPU and open sourced it, the chinks would be LEGALLY stealing your design in less than a week, and producing it for 50% less.
I mean, we """could""" produce an FPGA thing with almost no money except a dev board, or with even less than that if you count software emulation options. The problem, just like with every Jow Forums originated project, is that nobody is gonna spend their time doing it.
It was a rhetorical question you dipshit.
I am not going to crowdfund anything, because I'm not a kickstarter scam artist, and there's no fucking market for this thing if we are being honest about what's to be expected.

Attached: 30yo doomer.jpg (827x835, 123K)

Oh gosh, Jow Forums's flagship GPU project as spearheaded by an android app developer, a glorified biologist and a college dropout with C/C++ experience.
This is going to go well!

IRC is not persistent and fizzles out a lot. someone make a GoyHub repo, we can collaborate to write a spec/manifesto first and discuss it on the issue tracker

pls no bully

Attached: 1521820062510.png (645x773, 140K)

WHAT the FUCK, that meme is a scarily accurate description of me