4 hours until NVIDIA wants to sell you $200 cards for $1,200

Are you ready to be robbed?

Attached: file.png (800x500, 616K)

+ AMD isn't better either. They patent-hog just as hard, not letting anyone in.

The only reason this is allowed is that they are both American.

Aside from power efficiency, each generation is roughly the same as the previous one, with maybe a 30% change in overall performance, and that mostly due to software. NVIDIA playing this way with prices is an obvious way of showing that consumers really don't need more than what they already have, or the older GPUs, and that the newer ones are just a silly niche for stupid people who throw money at literally anything.


It's not bad for a company to seek more money from stupid people who don't understand specs, so they can make better products in the future.


AMD does the same in both the GPU and CPU fields. Don't be stupid; just get a 980 Ti if you have the power to run it.

>They patent hog equally not letting anyone in.
What?
Even Samsung is making their own GPUs.
AMD is just the only non-NV company retarded enough to try making dGPUs.

Don't be stupid, all 3, Intel, AMD and NVIDIA are patent hoggers.

Did you know you still can't make an x86_64 CPU? It requires SSE2, and those patents haven't expired yet (and they keep filing new ones with every release).

The true reason this is allowed to continue is that they are all American. Trump would screech if at least one was Asian.

GPUs are similar. Intel paid rent to AMD and NVIDIA to be allowed to ship iGPUs (maybe they'll make dGPUs in 2020 when some patents expire, dunno). But when those companies make deals with each other they usually cross-license to keep the duopoly going, so I doubt money changes hands.

The true problem is that there is no choice. It's a duopoly of patent hoggers. You literally can't make PC CPUs or GPUs right now even if you had the money, because everything is patented and the rent you'd have to pay would usually be prohibitive.

Even Intel themselves weren't allowed to make iGPUs until they'd made deals with both NVIDIA and AMD.

>Don't be stupid, all 3, Intel, AMD and NVIDIA are patent hoggers.
How the fuck is this related to GPU patents?
NV, AMD, Intel, ImgTec, Apple, Samsung, Qualcomm and ARM are ALL making GPUs.
It just requires a lot of effort.

>You literally can't make PC CPUs or GPUs right now even if you had the money, because everything is patented and the rent you'd have to pay would usually be prohibitive.
You can.
Why would you?
It's expensive and you're entering an already very competitive market.
Suits don't like gambling.

??? I can still buy older hardware for cheaper prices. It just runs on a lot more power. Performance is comparable for consumer-grade hardware.

The norm that consumers need power efficiency and "muh cores" started with companies testing how stupid consumers can be, buying things they don't need.

But that really hasn't changed the available choices, at least for me. I don't know what kind of argument you're trying to make, but the GTX 10 and 20 series are really not needed by the average consumer. 60% of the market doesn't even need an off-mobo GPU to be satisfied.

Seething AMDrone.

>$200 cards

How is literally the fastest card available a $200 card

I'll just wait for the 2050.

>be robbed
how is willingly pissing money away == being robbed?

>chamfered rectangle
>requires a lot of effort
oh come on now

PC tech isn't the same as mobile. The patents don't just cover the chip itself; they cover the entire card design.

It's not expensive though, it's prohibitive because of the constant spam of patents. The patent system wasn't meant to be abused like that; it was about building a product line and getting a couple of decades of freedom. These patent hoggers keep filing patents every week in case something sticks, pushing the effective expiration another 20 years into the future.

Intel themselves couldn't make an iGPU without making deals with both AMD and NVIDIA, and in those cases they usually cross-license, cartel-style, to keep the duopoly going, so I doubt money changes hands.

It was revealed that their profit margin is about 1,500%.

Patent hogging is lucrative.

What is everyone's fucking problem with tech getting expensive? They literally put out hardware releases more often than sports game releases nowadays.

Newest is bound to be expensive. And stupid people are bound to buy it.

It's better when companies take all that money from the stupid and invest it in something better: their horrendous software.

Engineering is expensive.
>PC tech isn't the same as mobile
A GPU is a GPU.
Like, the concept of unified shaders is the same for everyone.
>It's not expensive though
It is.
Are you genuinely retarded?

>manufacturing integrated chips is not expensive
have you ever wondered why chip manufacturers are either colossi or don't exist at all, and why there's no medium-small silicon manufacturer?

Yes. And the reason is that people will not buy chips from a manufacturer they don't know. Because of software support.

The tools we use are built around the chip architecture. If you are not building the software of your system from scratch, you might as well go with the big boys.

I meant it as "it's not just expensive", but even so, it's mainly the patent blockade, or the need to pay rent to the hoggers, that makes it prohibitive.

Intel is the best proof. With that colossal pile of money, and ALREADY having foundries of their own, they still had to make deals with both NVIDIA and AMD to be allowed to have iGPUs.

>nvishit

Attached: 970.jpg (625x352, 72K)

>I meant it as "it's not just expensive", but even so, it's mainly the patent blockade, or the need to pay rent to the hoggers, that makes it prohibitive.
Making a dGPU is VERY expensive for little gain.

NVIDIA's profit margin is around 1,500%. Don't be fooled into thinking they're a charity.

Main thread here

How the fuck is this related to new entrants?

You said "for little gain". 1,500% is not little gain.

You just can't do it, you are patent hogged.

ycharts.com/companies/NVDA/profit_margin

NVIDIA's profit margin has been at the same level as Apple's since October 2016.

That's the whole company, not just GPUs.
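For what it's worth, 1,500% only makes sense as a markup (profit over cost), not as a margin (profit over revenue), which tops out at 100%. A quick sanity check using the made-up numbers from the OP's title ($200 cost, $1,200 price):

# Back-of-the-envelope with the OP's hypothetical numbers:
# a card that costs $200 to build and sells for $1,200.
cost, price = 200.0, 1200.0
profit = price - cost

markup = profit / cost    # profit relative to cost
margin = profit / price   # profit relative to revenue

print(f"markup: {markup:.0%}")   # 500%
print(f"margin: {margin:.1%}")   # 83.3%

Whole-company net margin (what ycharts reports) is a much smaller number than either, since R&D, software and everything else comes out of it.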

Support and legacy certainly have an impact on a product, but that doesn't stop companies from existing, and whenever there's a successful product you're bound to see clones popping up. No one sane would attempt to create a new standard or reinvent the wheel.
That was the case in the early days, when x86 clones were fairly common; of all the manufacturers producing clones, however, only one is still standing today.

Patents and monopolistic behaviour are the true killer of free market.

>You said "for little gain". 1,500% is not little gain.
You need to capture share first.
It's no gain when you're selling nothing, you drooling brain-damaged moron.
There's an obvious reason why no one has been trying to enter the dGPU market.

Welp, I see no market death.

Current "modern" hardware is simply too much for normal businesses and consumers, so it might as well be expensive

Heh, back in the day I spent $400 on a GeForce 6800 GT and $300 on an Opteron 185. The RAM cost $200 (DDR, 4 x 512 MB sticks). The sound card and speakers combined were around $200 (Creative brand). The mobo and HDD cost $100 each. The monitor cost $300 (Samsung SyncMaster 17). The cheapest parts of the whole build were the DVD-ROM drive, floppy drive, mouse and keyboard. The case cost $50. The PSU cost $100.

So all in all I spent $1,550 just to build a single gaming PC. This was in 2004/05.
Oh, I still have this system today. In storage, but still fully operational.

I'd rather pay extra for performance I don't need than be an AMD fanboy, thanks.

Attached: 13102694583.jpg (425x301, 90K)

sauce?

I think it is from Kisaragi Gunma. Name of the cartoon should be Kozue Panic.

Learn to reverse image search retard.

I did you retard. It's just a nice meme when you waste replies for telling me :^)

I'm ready senpai

So the card I want to buy will be around $800 MSRP.
How much would this be in £? Give or take £700?

I'm guessing around 800 GBP, considering taxes and distributors and such taking their share.

A 30% change in performance on my workstation means that my simulations can get done two months sooner
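Back-of-the-envelope on that (the run length here is my assumption, not a real number):

# If throughput goes up 30%, wall-clock time shrinks to 1/1.3 of what it was.
old_runtime_months = 8.7    # assumed length of the current run, just for illustration
speedup = 1.30              # "30% change in performance"

new_runtime_months = old_runtime_months / speedup
saved_months = old_runtime_months - new_runtime_months

print(f"{saved_months:.1f} months saved")   # ~2.0 months on a run of that length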

If the RTX 2070 is actually going for $400 that's an easy buy.

Equates to about $2k based on CPI since 2004 - gaming PCs have always been expensive.
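Roughly how $1,550 in 2004/05 becomes ~$2k today, using approximate annual-average CPI-U values (my figures, rounded):

# Approximate U.S. CPI-U annual averages; exact values differ slightly by source.
cpi_2004 = 188.9
cpi_2018 = 251.1

build_2004 = 1550
build_in_2018_dollars = build_2004 * cpi_2018 / cpi_2004

print(f"${build_in_2018_dollars:,.0f}")   # roughly $2,060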

See? THIS is the audience those prices are aimed at.

This audience is not "anyone", and if you claim you're a hobbyist doing such work, you're stupid enough not to be worth any of that tech.

But oh well, stupid or real professional, everyone that does such work WILL pay the price.

Yeah, it's pretty brilliantly timed as well. Incentivizes people to finally ditch their radeon hd 6700 and upgrade to the 21st century.

Conversion is not 1:1 even with VAT.
Launch price of a 1080 was $700/£619.

>workstation
>two months
???
Explain

Sauce? I'm interested in this kind of stuff.

They are probably a hobbyist that just created a "powerful" machine to "experiment" simulating onions plantations.

Like the ones creating home "mining rigs"


Turing will not improve much for them though. It only revolutionizes rendering.

I perform molecular dynamics simulations of large peptide systems (professionally).

Any idea how Turing hardware will speed this up better than Volta?


Or are you the typical scientist who uses Python and R just because there are more ready-made solution APIs in the field? Because I've met a lot of those over the years who read up on specs knowing absolutely nothing about how they'll benefit.

AMD
1. Creates free drivers
2. Develops free standards and open-source software for various GPU-related technologies like tessellation, sync, hair rendering, GPU-accelerated physics, OpenCL, etc.
3. Doesn't have gimpworks
AMD literally did nothing wrong. To make a GPU you need to support graphics APIs, which are royalty-free (yeah, like DX12 and Vulkan, which were also developed with help from AMD), and to compete in a specific market (like gaming) you need the related technologies (which NVIDIA won't let you license under any realistic circumstances, but which you don't even need to ask AMD for).
You know, you fucking piss me off. Go kill yourself or something.

Also Tesla

You can't rob me if I don't have money.

Attached: 1534252843628.jpg (552x557, 26K)

You seem upset.

>You know, you fucking piss me off.
No shit faggot.

Nah. I'm perfectly content being robbed by Apple. At least there are poorfags out there willing to buy my old Mac for 80% of original retail so my next Mac only costs me like $300-$500. If not for the Tyrones and Bonquishas of the world, I'd have to pay much closer to full price so bless their little impoverished hearts.

kek, what are you doing in here DeShaun

I doubt it will have any impact in the short term. The MD software will need to be rewritten to take advantage of any new CUDA math as part of the Turing arch, which takes forever in academia. I also don't know enough about how VR routines could mimic interatomic forces.

The biggest speed increases come from more CUDA cores, since you can increase patch sizes within your periodic boundary conditions, and of course clock speed. MD doesn't use much memory, and I'm not certain how memory speed increases affect it, since all the GPU-accelerated development of the past few years has been on GDDR5. My guess is that it's negligible.
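If anyone's wondering what "patch sizes within periodic boundary conditions" means: the simulation box wraps around on itself, pair forces use the nearest periodic image of each neighbour, and the box gets carved into patches that can be spread across more CUDA cores. A toy NumPy sketch of the minimum-image distance (just the idea, nothing like the real MD codes):

import numpy as np

def minimum_image_displacements(positions, box):
    """Pairwise displacement vectors under periodic boundary conditions.

    positions: (N, 3) array of atom coordinates
    box:       (3,) array of orthorhombic box edge lengths
    """
    # Raw displacements between every pair of atoms.
    d = positions[:, None, :] - positions[None, :, :]
    # Minimum-image convention: shift each component to the nearest periodic image.
    d -= box * np.round(d / box)
    return d

# Two atoms near opposite faces of a 10 Å box: the image across the wall is closer.
box = np.array([10.0, 10.0, 10.0])
pos = np.array([[0.5, 5.0, 5.0],
                [9.5, 5.0, 5.0]])
d = minimum_image_displacements(pos, box)
print(np.linalg.norm(d[0, 1]))   # 1.0, not 9.0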

Based

>expecting proof from self educated neet on Jow Forums
So naive.

> Did you know you still can't make an x86_64 CPU? It requires SSE2, and those patents haven't expired yet (and they keep filing new ones with every release)

AMD has to use x86-64 as a bargaining chip to make Intel grant them licenses back.

Seeing these posts after each other made me lol.

Should I replace my 960? My computer has been randomly shutting off lately and I feel like the GPU is dying ;__;

Attached: IMG_1298.gif (335x237, 1.81M)

Fake and gay

>peptide
hand soap that binds to polypeptides when

SSE2 isn't obligatory to implement AMD64, but the lack of non-SSE2 AMD64 processors has made it a de facto standard for AMD64 software.
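Easy enough to see for yourself on Linux (this just reads the kernel's reported CPU flags; "lm" is the long-mode/64-bit flag):

# Linux-only check: every x86_64 CPU shipped so far reports both flags,
# which is why toolchains treat SSE2 as part of the AMD64 baseline anyway
# (the x86-64 ABI even passes floating-point arguments in SSE registers).
with open("/proc/cpuinfo") as f:
    flags = next(line for line in f if line.startswith("flags")).split()

print("64-bit (long mode):", "lm" in flags)
print("SSE2:", "sse2" in flags)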