If GPUs are so powerful, how come I can't run an OS off of one?

Attached: 518866-nvidia-geforce-rtx-2080-founders-edition-7.jpg (810x456, 90K)

>if my car's engine is so powerful why can't I drive with just the engine?
Also if you bought a 2080 instead of a 2070 super you're beyond retarded

>if Lamborghinis are so fast how come we don't use them to go to space

Price-to-performance, Lambos, ESPECIALLY older ones, are slow as fuck. It's mostly Porsche, Ferrari, and McLaren putting out the fastest street-legal cars at the moment

If OP creates a thread how come he is always a faggot?

actually a pretty good analogy
>Price/performance
Chevy is the clear winner here with the Corvette and Camaro

I won it in a raffle.

By supercar standards, though the C7 does still qualify. Kind of proud of Chevy for really turning its shit around lately and giving us an affordable, high-quality supercar, along with the Camaro getting a lot better. Still wouldn't buy the latter because IT HAS ZERO FUCKING WINDOWS. The 7-speed on the Corvette is also excessive for a manual, but it's more for dodging emissions taxes than anything
Fair enough

you probably could actually, but no one's really bothered to try it yet.
You'd need a small supporting CPU with just enough power to feed the GPU with OpenCL/CUDA work, and then build an entire OS inside one of those frameworks.
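
A rough sketch of that division of labour, as a pure-Python toy (no real GPU involved; `ToyGPU`, its thread pool, and `host_program` are made-up stand-ins for illustration, not any actual OpenCL/CUDA API):

```python
# Toy model of the CPU-host / GPU-device split described above.
# The "device" only runs data-parallel kernels (one function applied to
# every element independently); all control flow, scheduling, and I/O
# stay on the small host CPU, just like a real OpenCL/CUDA setup.
from concurrent.futures import ThreadPoolExecutor

class ToyGPU:
    """Stand-in for a GPU: applies a kernel to a whole buffer in parallel."""
    def __init__(self, lanes=8):
        self.pool = ThreadPoolExecutor(max_workers=lanes)

    def launch(self, kernel, buffer):
        # Every element is processed independently: no branching between
        # lanes, no syscalls, no interrupts -- exactly what GPUs are for.
        return list(self.pool.map(kernel, buffer))

def host_program():
    """The supporting CPU: decides *what* to run and feeds the device."""
    gpu = ToyGPU()
    data = list(range(16))
    squared = gpu.launch(lambda x: x * x, data)      # kernel launch 1
    doubled = gpu.launch(lambda x: 2 * x, squared)   # kernel launch 2
    return doubled

print(host_program())
```

The point of the sketch: everything "OS-like" (deciding what to launch next, moving data around) lives on the host, which is why you can't just delete the CPU from the picture.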

I have an STI that could beat a Lambo any day. Lambos can't corner worth shit.

Lol

>he thinks his shitbox can corner better than a [modern] Lamborghini because he has a STAGE 1.5 CAI BRO
user pls.

Attached: 345_p11_l.jpg (1920x1280, 953K)

They aren't meant for general-purpose compute; they really only excel at vector-based calculations.

super cars can't corner and would lose to an STI. the only thing they are good at is going straight.

>super cars can't corner and would lose to an STI. the only thing they are good at is going straight.
you're a full-blown retard
even if you ignore the much larger and stickier tires that they have, any modern Lamborghini is much lighter and has a more central weight distribution than your STI, not to mention smarter electronic LSDs
t. regional autocross champ with an E30 that pulls more g's than your vapemobile

that's wrong.
Lamborghinis are built on Audi platforms now anyway, and they handle pretty well. A Ferrari of any type would decimate even the fastest Impreza. They're not even the same kind of car at all.

you might beat a retard in a Lambo with an STI, but against a driver who knows what he's doing you lose, sorry kid.

lol literally the only area your car can beat a lambo in is in a straight line if you throw a bunch of boost at it

> if aspergers isn't the same as retarded why do they always say the stupidest shit

Attached: markypoo.png (274x428, 202K)

nurburgringlaptimes.com/lap-times-top-100/
Not a single Subaru in the list. Third best time by Lambo. Accept facts.

Next-generation Nvidia GPUs will use RISC-V cores, probably running a "richer" operating system. The older Falcon cores could only run a very basic operating system, though I'm not sure which. Not sure if Turing is using them already, but probably not.

see riscv.org/wp-content/uploads/2017/05/Tue1345pm-NVIDIA-Sijstermans.pdf

all those cars suck at 90 degree angles

Post a pic of your car.

this is for cars and shit: a basic processor runs the show, with a GPU alongside to compute the logic from the sensors

Attached: 9404062a8678a9fa146c7233e10084bb3a71c28d8fed531f75ee4adfac95f24c_1.jpg (640x729, 54K)

no, Falcon has been inside all Nvidia GPUs since 2010. It runs its own operating system and handles all sorts of shit:
envytools.readthedocs.io/en/latest/hw/falcon/intro.html

Is the extra $200 for the 2080 super that stupid?

My house is full of Nvidia shields and my 1440p 144hz monitor uses gsync so I'm sticking with Nvidia. I want to use super sampling on htc vive pro and be relatively future proof.

a stock photo? neat.

They're not made to talk to peripherals, run arbitrary instruction streams, or schedule and cache data. They're optimised for quickly solving vector algebra problems.
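
The kind of problem they do shine at looks like this: each output element depends only on the matching input elements, so thousands of threads can each grab one index and work independently. Pure-Python illustration of SAXPY, the classic vector-algebra kernel (a real GPU would run this as one kernel launch over the whole array, not a Python loop):

```python
# SAXPY ("a times x plus y"): the textbook GPU-friendly kernel.
# Output element i touches only x[i] and y[i], so the whole loop can be
# split across as many parallel threads as there are elements.
def saxpy(a, x, y):
    return [a * xi + yi for xi, yi in zip(x, y)]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
print(saxpy(2.0, x, y))  # -> [12.0, 24.0, 36.0, 48.0]
```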

Different task, different specialist, just like in real life.

Attached: we need to talk.jpg (466x438, 43K)

why did this turn into a car thread

CPUs are like a small but good team of executives: well-rounded, fairly efficient, and capable of coordinating a large team. They can do some grunt work, but they'd rather delegate it to somebody else.

GPUs are like a sweatshop: they're good at doing what the CPU tells them to do, churning out tens of millions of RGB vectors 120 times a second. Some specially designed algorithms can make them behave like a hive mind (convnets and O(log² n) bitonic sort come to mind), but for complicated executive tasks it's much more efficient for the CPU to do it itself.
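
The bitonic sort mentioned above is a good example of that hive-mind behaviour: it runs a fixed O(log² n) sequence of stages, and within each stage every compare-and-swap touches a disjoint pair of elements, so a GPU can do all of them simultaneously. A sequential pure-Python sketch of the same sorting network (a GPU version would run each inner `for i` loop as one parallel kernel launch):

```python
# Bitonic sort for power-of-two-sized inputs. The j/k loops define
# O(log^2 n) stages; within a stage, each index pair (i, i^j) is
# independent, which is what makes the network GPU-friendly.
def bitonic_sort(values):
    a = list(values)
    n = len(a)
    assert n > 0 and n & (n - 1) == 0, "length must be a power of two"
    k = 2
    while k <= n:
        j = k // 2
        while j >= 1:
            # On a GPU this whole loop would be one parallel step:
            # every compare-and-swap touches a disjoint pair.
            for i in range(n):
                partner = i ^ j
                if partner > i:
                    ascending = (i & k) == 0
                    if (a[i] > a[partner]) == ascending:
                        a[i], a[partner] = a[partner], a[i]
            j //= 2
        k *= 2
    return a

print(bitonic_sort([7, 3, 9, 1, 6, 2, 8, 5]))  # -> [1, 2, 3, 5, 6, 7, 8, 9]
```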

Projection, the post.

fave analogy reply so far

Attached: 1564312165892.jpg (450x632, 112K)