NVIDIA shills called out

youtu.be/tu7pxJXBBn8

Hahahaha he has the balls to call them out to their face

Attached: PartialScreenshot_20180825-061959.png (1440x1108, 1.11M)

Other urls found in this thread:

twitter.com/geekinchief/status/1032652101833383937
youtube.com/watch?v=QI9sMfWmCsk&feature=youtu.be&t=17m4s
tomshardware.co.uk/forum/id-3771715/buy-nvidia-rtx-gpus-worth-money/page-5.html
twitter.com/NSFWRedditVideo

>naming the jew

I'm worried they will take his channel down now

you're fast op lol, was just about to post this

No, I won't fucking buy it.

tldr?

>>/v/

Spotted the kike. Fuck straight off into your green oven nvidiot.

Jow Forums is always right.

Attached: kiked.png (566x206, 14K)

Excuse me, what the fuck?

editor in chief of Tom's Hardware on twitter

Attached: 1518676006806.png (579x478, 150K)

HAHAHAHAHAHAHHA THE KIKE ACTUALLY SAID KVETCH HAHAHAHAHAHAHAHAHAHHA

I'd smoke weed with him, he seems pretty cool.

>it's a real tweet

Attached: 1533671532651.jpg (568x570, 113K)

This truly is the best timeline

Jow Forums is never wrong

GamersNexus are motherfucking OG as hell.

Attached: 3b0.jpg (490x497, 63K)

Cuckshardware is getting called out pretty hard on this

Attached: shilling.jpg (1545x1398, 353K)

Dear lord, this timeline.

>the more gpu you buy the more money you save

Hope that Tom's guy kills himself

> poor people who won't experience godtracing in September ITT

Remember when people accused steve of being an intel/nvidia shill LOL

Jensen was a genius if he thought of it, but it was probably some marketing guy

I'm really tired of nVidia's price gouging. PC gaming doesn't justify these numbers. GPGPU doesn't exist for average consumers.

Attached: 1475331773963.png (552x543, 358K)

Sucks that the poorfags won't get to experience slightly nicer reflections at silky smooth 40fps, unlike us rich master race goyim, huh?

AMD shills don't like it when people praise Intel/Nvidia.

At the same time they defend known AMD shills like AdoredTV and Hardware Unboxed (although they called him an Intel shill too when the 8700k was released).

Why are nvidiashills so stupid?

Attached: Screenshot_20180824-225100_1.png (720x470, 64K)

Seriously. FUCK YOU KIKE, I will buy used flagship GPUs from now on. Suck on it

shut up noob

>twitter.com/geekinchief/status/1032652101833383937
>oy vey goyim you don't really need 32 cores, do you? That's right goy, no, you don't.
The absolute state of this merchant.

what a whiny know-it-all cunt.
stop white knighting for a bunch of spoiled rich brotards who have 2 grand to drop on an untested video card and think about the poor schmuck who has to sell these fucking things for a living.

That guy has to report back to hong kong and pay out the ass for his visa so the chinese government doesn't steal 99 cents out of every dollar he makes.

That nigga's got kids to feed while you smoke pot in your parents' basement.

Kike hustle economy in full swing. Buy this sawdust now and maybe we'll ship you a sausage later, no promises

Almost pre-ordered it after Tom's review, but my lack of finances intervened.

Actually a valid question.

HEDTs are niche products and most consumers are gamers. There's a certain point where cores literally don't matter anymore and you're gimped by clock speed, IPC and whatever GPU you're running.

8-10 cores are ideal in 2018-2020.

AMD makes a genuinely competitive product. I've got no bias for AMD, I'm probably going to go fishing for 1080ti when I buy a new ryzen board.

New boards are great. I loved my CrossFire board but it was a total mess: only two of the PCIe slots were full x16, it had only one x2 slot for network and sound cards, and 4-way CrossFire/SLI on that thing was a joke meme. FFS, it still had a standard PCI slot, and the Bulldozer and Vishera chips, while fast as all get-out, had stability issues and were sold as-is when overclocking them to 4.2 GHz was already pushing the limit of consumer standards. The new Ryzen RAM and CPUs are much more stable and take less time to work up to full speed, they hang less often, and (I'm assuming) the approach of moving the northbridge and southbridge onto the CPU die has been streamlined and is much more efficient than old-style processor/mobo splits.

the funny part is that both tomshardware and anandtech are owned by the same group...

What the fuck are you talking about.

agreed, I think 16 cores would be sweet but 32 is just unnecessary, even for an HEDT. Now... I wouldn't mind 32 cores if AMD allowed you to use EPYC mobos, but they would never do that.

The only problem is I really can't afford it. I don't have 800-1200 bucks to drop on a gaming desktop right now, especially if I buy on credit. And you know, that's just buying wholesale and putting it together yourself; trying to sell the things for 2000-3000 bucks in this economy is a tough sell. People are making more money, but they're wary after the credit crunch a few years back and are starting to make smarter investment decisions.

You'll always be able to find some dumb rich kid who is willing to drop 5 grand to have the best PC money can buy, but right now the money is in network storage and server solutions. A lot of websites have been around a long time and have been using private domain name hosting. Those websites have been growing, and they are starting to migrate to their own managed server systems; many open source distributors are starting to see more traffic and can't afford to rent space on someone else's bandwidth: they need their own.

Attached: 1518907576859.png (1280x1280, 651K)

32 cores are necessary FOR developers and actual content creators, like professional car makers, architects, engineers and the like.

Not meme Twitch streamers and Jewtubers who render 4k.

Really for the amount of money a kid could spend on a gaming PC, he could buy a 300 dollar laptop from walmart and spend the rest on a dirt bike or motorcycle, or as a down payment on a new car or worktruck.

Right now the CS field is so oversaturated that you will literally make more money working in one of the trades; finish carpenters and cabinet makers have higher average starting pay than an entry-level CS job.

(((they))) are the cancer of western world. where's Hitler when you need him?

AMD literally markets the 2990WX as a WX, a fucking workstation CPU.

It's NOT an HEDT part; that title goes to the 2950X.

I would unironically buy this as an ironic product

Our eyes aren't real because there's no ray tracing in real life.

>Intel stops paying them
>Nvidia immediately takes their place

The conclusion to Vega's failure. Pottery in a sense.

The 2990WX suffers from memory bandwidth issues:
youtube.com/watch?v=QI9sMfWmCsk&feature=youtu.be&t=17m4s

Maybe it's because AMD tried to put an EPYC CPU on a Threadripper mobo.
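
if you want to sanity check that topology yourself on Linux, here's a rough sketch that just reads sysfs and prints how much RAM each NUMA node actually has attached (the path is an assumption about a standard Linux install; on a 2990WX two of the four nodes should show ~0 because those dies have no memory channels of their own):
```
# rough sketch (Linux only): print each NUMA node's locally attached memory.
# The sysfs paths are an assumption about a standard Linux layout; on a
# 2990WX two of the four nodes should report ~0 since those dies have no
# memory channels of their own.
import glob
import re

for path in sorted(glob.glob("/sys/devices/system/node/node*/meminfo")):
    node = re.search(r"node(\d+)", path).group(1)
    with open(path) as f:
        meminfo = f.read()
    mem_kb = int(re.search(r"MemTotal:\s+(\d+)\s*kB", meminfo).group(1))
    print(f"node {node}: {mem_kb / 1024 / 1024:.1f} GiB local memory")
```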

TURN OFF THAT BRAIN GOYIM

BUY BUY BUY

Now they are in full damage control mode and are deleting comments calling him a shill.

are they really?

The thing had over 300 comments when I commented on it; now it has 8.

wtf? how long till they delete the article

Now it has 0

>he's too dumb to buy a card, return it in 3 months, buy a new card, return it in 3 months and repeat

can't wait till they try to ban Twitter accounts or something

Editor In Chief

Hi All,

Thanks for all your feedback. I really appreciate the passionate discussion. I wanted to say a few things here.

First, this story and other articles labeled "opinion," do not represent the opinion of Tom's Hardware as a whole or our staff and writers as a whole. They represent only the writer's view.

As many of you noticed (and I linked to), we published an article with exactly the opposite view (wait to buy) by Derek Forrest earlier this week. My goal was to provide a counterpoint so that our readers can read both arguments and decide for themselves.

On top of that, I want to offer some context for my view. In short, what I'm saying is that, if you need a video card this fall, you should get the latest technology. Yes, it helps to read expert reviews like the one we will publish closer to the release date. However, let's keep in mind that, with driver updates, these cards could be much more powerful in 6 months than they are when we test them.

If you already own a powerful GPU (like a GTX 1080) and can bear to wait a few more months, then by all means, delay your purchase. However, if you were already planning to upgrade from an old card or are building a new system from scratch in 2018, then (I think) it would be a mistake to spend good money on a high-end 10-series card like the GTX 1080 Ti. Yes, prices are dropping, but if you invest $600 or $650 in last-gen technology and then you want to catch up and get ray tracing support in 2019, you'll be spending quite a bit to upgrade the second time, even if the price of the RTX cards has dropped by that time.

So, even if you buy an RTX 2070, it's more future-proof than a GTX 1080 Ti.

His damage control post

link?

tomshardware.co.uk/forum/id-3771715/buy-nvidia-rtx-gpus-worth-money/page-5.html
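
his whole "you'll pay twice" pitch is just two purchase paths; a quick sketch of it below, where only the $650 figure comes from his post and every other number is a made-up placeholder, so plug in your own:
```
# sketch of the "upgrade twice" argument from the post above; only the $650
# figure is his, the rest are hypothetical placeholders, not real prices
pascal_now    = 650   # GTX 1080 Ti this fall (his $600-650 figure)
rtx_in_2019   = 700   # hypothetical price of an RTX card bought later
pascal_resale = 300   # hypothetical resale value of the 1080 Ti at that point
rtx_2070_now  = 600   # hypothetical RTX 2070 price today

path_a = pascal_now + rtx_in_2019 - pascal_resale   # buy Pascal now, RTX later
path_b = rtx_2070_now                               # buy an RTX 2070 now
print(f"Pascal now, RTX later: ${path_a}")          # $1050 with these numbers
print(f"RTX 2070 now:          ${path_b}")          # $600 with these numbers
```
whether the first path actually ends up worse depends entirely on the placeholder numbers, which is exactly the part the article hand-waves.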

bunch of based and red pilled motherfuckers, proud of them

Attached: desuyo.jpg (1024x576, 99K)

Attached: DlZkaO2W4AAhC6C.jpg (1200x828, 101K)

You know for a fact that this is Nvidia throwing money at (((journalists))) to counter the negative impact of the 208x line.

mega kek

wow he's getting fired

>out of touch corporate climbing boomer in dying journalism industry who exclusively uses a macbook to write articles pandering to advertisers and/or update his linkedin
his former companies include: senior editor of about.com / editor at (((national jeweler magazine))) / internet consultant at a bank
pic related, front page of his last job, a defunct geek culture site

Attached: 1516821811283.png (943x564, 366K)

>to counter the negative impact of the 208x line.
is there negative impact? the cards don't even exist yet and there are no benchmarks

Oh come on, when Nvidia invests in shilling this hard you know something is wrong with the card.

It's the first time new gpus don't have a better price/performance ratio but a worse one. This is p fucking bad.

>the opportunity cost of sticking with your previous gen expensive hardware is much higher than the cost of our super expensive new hardware, goy

Todd's Hardware?

Attached: a lack of preorders is bothering todd.jpg (600x450, 39K)

IT'S FUCKING REAL HAHAHAHAHAHAHAHA

Attached: space jew.jpg (1414x1469, 366K)

Of all the words of tongue and pen;
The saddest are;
Jow Forums was right again;

The fact that they're showing off a new line without backing it up with verified benchmarks means people will assume the worst. The prices are absurd for non-professionals so the burden of proof is on Nvidia to justify them. They haven't, so negativity ensues.

not the first time. this graph shows the performance increases between generations of Nvidia GPUs, and pcmasterrace has been more than happy to shell out cash just to have an Nvidia card, despite poor performance increases.

Attached: x80-genj9ard.png (2559x1398, 491K)

>mountain dew
>cooling your GPU with sugar
>which carbonizes
>which massively reduces heat transfer

I mean, I know it's shitposting, but this is so brain dead it's an insult to shitposting on Jow Forums.

so where in your chart is a next gen performing worse than the previous gen?
is it words you don't understand, or graphs?

>news
>opinion
I can no longer tell which is opinion and which is news. I fucking hate current journalists and writers.

I wasn't talking about performance, but about the price/performance ratio. Usually you get 10-20% more performance for the same money in a new gen. Now you pay 50% more for 20% more performance. Do you get how this is pretty bad?
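To spell the ratio out (same 20%/50% numbers, just normalized):
```
# normalized check of the "50% more money for 20% more performance" claim
old_perf, old_price = 1.00, 1.00      # last gen as the baseline
new_perf, new_price = 1.20, 1.50      # +20% performance, +50% price

print(f"last gen: {old_perf / old_price:.2f} perf per dollar")  # 1.00
print(f"new gen:  {new_perf / new_price:.2f} perf per dollar")  # 0.80, ~20% worse
```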

>Maybe its because AMD tries to put an EPYC cpu on a threadripper mobo.

No faggot, it's because their yields per wafer are so insanely good that they're not entirely sure what to do with the extra dies. So they're basically doing whatever the fuck they want and still making crazy amounts of money. Zen is literally the greatest CPU architecture created this decade.

It's a modular part, it's small as fuck, and it brings IPC to within 7% of the 8700K. It only loses because the 8700K can OC to 5GHz and Zen can't, which will be fixed next year with the 7nm process. And because it's small as fuck, at 12nm AMD gets around 82-85% yield per wafer, which comes out to roughly 380 good dies per wafer. Threadripper wasn't planned, at fucking all.
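Rough napkin math on the dies-per-wafer bit (the 300 mm wafer, the ~213 mm^2 Zeppelin die size and the edge-loss term are my assumptions, not AMD's numbers, so treat the output as a ballpark):
```
# ballpark dies-per-wafer estimate; wafer size, die area and the edge-loss
# term are assumptions, not AMD's figures
import math

wafer_diameter = 300.0   # mm, standard wafer size (assumption)
die_area = 213.0         # mm^2, approximate Zeppelin die (assumption)
yield_rate = 0.85        # upper end of the 82-85% quoted above

radius = wafer_diameter / 2
gross = (math.pi * radius**2) / die_area \
        - (math.pi * wafer_diameter) / math.sqrt(2 * die_area)
print(f"~{gross:.0f} gross dies, ~{gross * yield_rate:.0f} good dies per wafer")
```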

Threadripper was AMD engineers going "hey, what happens if we do this?" And they realized they could experiment on creating a new market segment because they had insane yields. 82-85% yield per wafer is pretty insane. 7nm Zen2 will likely bump that yield up to 85-88% which is damn near perfect yields. If AMD ends up figuring out how to do 12c/24t for their upper class of desktop CPUs, you can bet your ass that they'll create new lineups within the current market segments:

Currently we've got:
>32c/64t | 2990WX | TR+
>24c/48t | 2970WX | TR+
>16c/32t | 2950X | TR+
>8c/16t | 2700X | Zen+
>6c/12t | 2600X | Zen+
>4c/8t | 2500X | Zen+
>4c/8t | 2400G | Zen+

Zen2 will bring:
>32c/64t | 3990WX | TR2
>24c/48t | 3990X | TR2
>16c/32t | 3950X | TR2
>12c/24t | 37/800X | Zen2
>8c/16t | 3600X | Zen2
>4c/8t | 3500X | Zen2
>2c/4t | 33/400G | Zen2

Zen3 will bring:
>64c/128t | 4990WX | TR3
>48c/96t | 4990X | TR3
>32c/64t | 4950X | TR3
>24c/48t | 47/800X | Zen3
>16c/32t | 4600X | Zen3
>8c/16t | 4500X | Zen3
>4c/8t | 43/400X | Zen3

And so on. Because yields are crazy high and will continue to be crazy high all the way till Zen5+, after which AMD intends to introduce a new codename uArch

I just watched the 20 minute video. Holy fuck.

Attached: 1531254465451.jpg (125x125, 2K)

Tom's Hardware's Editor-in-Jew made a parody of a crazy cartoon salesman/marketer, except it wasn't a parody and he is genuinely mentally ill.

It's either news or advertisement. Best take it for what it is, and it's mostly just some asshole shilling for that dick succin $$$.

that's only for Windows 10 you fucknut. No performance issues on Linux.

It's not AMD's fault that the scheduler of Windows 10 is outdated shit
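On Linux you can also just pin a job to one node's cores yourself if you don't want to trust any scheduler; a rough sketch below, where the sysfs path and the choice of node 0 are assumptions about a standard setup:
```
# rough sketch (Linux only): pin the current process to the CPUs of one NUMA
# node so threads can't get bounced across dies. The sysfs path and node 0
# are assumptions; adjust for your own topology.
import os

with open("/sys/devices/system/node/node0/cpulist") as f:
    cpulist = f.read().strip()            # e.g. "0-7,32-39"

cpus = set()
for part in cpulist.split(","):
    if "-" in part:
        lo, hi = map(int, part.split("-"))
        cpus.update(range(lo, hi + 1))
    else:
        cpus.add(int(part))

os.sched_setaffinity(0, cpus)             # 0 means the current process
print(f"pinned to node0 CPUs: {sorted(cpus)}")
```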

about time they finally admit it. fuck them.

no fucking way! NO! what timeline is this?
all we need is intel telling the stupid goyim the "doesn't matter" list in a tweet

>calling his shilling out
>even making fun of this shilling
>he even used the Stefan Molyneux argumentation
He is /ourguy/

>do you need new tech with 32 cores for decent $? no
>do you need new tech with rtx for gigashekels? yes

THESE FUCKING KIKES!

I really wish they would. That would make this timeline that much more interesting.

The article and the video are insane. Wtf was toms hardware thinking? Or did they let his 15-year-old son write the essay? This whole nvidia shilling is starting to smell. If enough dumb sheep buy their marketing shit, PC gaming will die within 5 years because there is no competition and no innovation. Nvidia isn't exactly known as the good guy in the industry. They are the best, but damn they are mean sometimes.

fucking antisemite

unsubscribed and notified ADL

You do realize that it's even worse for AMD, right?

>Wtf was toms hardware thinking?

Attached: PICKITUP.jpg (495x528, 75K)

>oy vey goyim pay twice as much for our 30 percent gains that are expected between new gens ESPECIALLY WHEN THE GPUS HAVE BEEN OUT FOR FUCKING YEARS

Attached: 8748574BAF3140069B691DDF54E79049.jpg (690x720, 50K)

Hi Steven,
Can you stop posting your blog on Jow Forums? We're trying to discuss distro ricing. We don't need 20 minute videos of tech drama about toms-shillware and jewVidia.
You could make your point in 2 minutes and this video is 10x longer than that.
Bye.

Intel and AMD need to adopt a new PCIe lane standard that won't accept old ones, and make only Nvidia pay for its use.

His chart shows the long term average is more like a 50% gain and 20% has only applied to the last few generations. That makes the 20 series look even worse by comparison.

the last time you saw nvidia shilling this hard was around the 5xxx cards,
and we all know how that turned out after it was discovered that nvidia was passing off 2D textures as 3D to double their performance.

what the fuck am I even going to do with all these cores

Attached: sadf.png (498x370, 100K)

So in short, buy a 2000 series card because if you buy a 1000 series you know Nvidia is going to cripple its performance over the next few months anyway.