Which would you choose, AMD or Intel/Nvidia?

Attached: ryzen-3-1200-gaming-pc-82-100731104-large.jpg (700x466, 64K)

Power9

An AMD processor with an NVIDIA GPU is the only decent choice for computing.

>Would you consider using a normal CPU or one riddled with crippling irreparable hardware faults which gives everyone root access to your system?

This

This

Attached: 1539021542955.gif (149x369, 106K)

Literally just finished upgrading my mobo and cpu. I'm honestly pretty satisfied with my Ryzen 5. It's surprisingly fast and multitasks like mad. That said, if you're planning on buying Ryzen, expect to buy a 3rd party cooler.

AMD hardware is the only sensible choice.

I'd probably pick Intel/Nvidia unless something better comes along.

I have AMD + Nvidia right now but will go full AMD as soon as they release a competitive GPU, because I want a stable system.

AMD/Nvidia

Nice memeing. Intel's coolers were never even close to being usable, yet I am satisfied with AMD's stock cooler.
Nothing wrong with having a 3rd party cooler, but stop confusing people. You are making it look like 3rd party coolers are necessary for AMD but not Intel, when in reality it's the opposite.

mah nigga

Attached: mah nigga.png (927x426, 58K)

>crippling irreparable hardware faults which gives everyone root access to your system?
KEK.

agree

>Yet I am satisfied with amd's stock cooler.
I bought a Ryzen 2600X. That particular model comes with an undersized cooler. I'd say, as long as you don't buy the X variants, you can probably get along fine with stock, but otherwise you'll absolutely want a better cooling solution.

I bought a 1600 and it came with the same cooler, and I run it overclocked.

I have the same as you, but I regret going with the 1800X. The 1700 is the same chip.

AMD+NVidia

95W vs 65W with the same cooler? Of course your 1600 is fine and my 2600X runs warm.

Intel/Nvidia is the actual patrician choice; it never fails and it stays strong for years.

AMD fanboys just pick AMD/Nvidia because AMD doesn't have a GPU worth buying, only the most hardcore of fanboys are buying Vega

This is the only correct choice

Why is Nvidia even lumped together with Intel, marketeer-kun?

How strong is it after applying all the patches for the hardware errors?

I applied all security patches and my machine is still fast

I don't do enterprise stuff so I'm not affected at all, I'm too insignificant to be hacked, and my machine has nothing to hide.

Try again AMD shill

>I'm insignificant
Indeed.
>nothing to hide
Model citizen.

unironically this.

Is the 2080 going to be worth it, or should I get a 1080ti during Black Friday? I have a G-Sync 144Hz monitor.

I'm pretty sure an overclock goes far beyond 95W.

Not a 1600. Not even close.

If you can get the 1080ti for cheaper go for it

There are no raytracing games and raytracing lowers your FPS in exchange for meme lighting and shadows

11GB VRAM is nothing to laugh at, you can play heavily modded games with that setup as opposed to just 8GB

The 1080ti should be faster for 1080p. No current gen cards will ever run (((raytracing))) at an acceptable framerate. The only reason to go for the 2080 would be DLSS, if you are fine with artifacts instead of antialiasing.
It is possible novideo will come up with a way to utilize the rtx bullshit partially (like some kind of 3d shader) to jew everyone into buying newer cards. I never really looked into this (((raytracing))) thing, but to me it actually looks like shaders. Raytracing is supposed to be an alternative to rasterization, but in the case of novideo's (((raytracing))) rasterization is still there. So if you disregard the performance and price, you will be safer with the 2080.
Honestly I decided to boycott this bullshit and am not going to buy either the 1080ti or the 2080. Fuck novideo.

Ty, probably going to get the 1080ti; don't really hear anyone talking bad about it. Really want that Asus Strix 11GB.

There is no difference between 1600 and 1600X when overclocked.

Depends on my budget.

If I have a $400 budget for GPU/CPU combined, I'd go for a 2600X + whichever is cheaper out of the 1060 6GB/580 8GB.

If my GPU budget alone is $400, I'd go for a 1070ti/1080 or Vega 56/64, if either hits that price range.

Anything higher and it's Nvidia with the 1080ti, and maybe either a 2700X or 8700K for CPU. 9900K for CPU and 2080 ti for GPU if I'm making $1000 per day and have cash to wipe my ass with.

The difference between the base and X variants is binning. Essentially, the X variants are factory overclocked. Ryzen is also known for having pretty slim overclocking headroom compared to the previous architecture, since the chips are designed to boost situationally anyway. This means that even with an impressive overclock you won't be pulling X-variant numbers, and unless you're setting out to destroy your CPU you're not going over a 50% wattage bump even when overclocking.

If the Vega 56 is $425 or less, it is preferable to a 1070ti.

2700x with 1080ti here, can confirm.

Cringe9

Attached: chrome_2018-08-04_02-32-39.png (595x328, 18K)

>Let me pick one of a few results it wasn't competitive in. Important things. Like how it'll handle encoding my anime rips.

>The results for the most part are competitive with current AMD EPYC and Intel Xeon hardware, even more so if considering it's a fully open-source/libre system without any FSP/ME/PSP to worry about for those concerned about that

AMD CPU and Nvidia GPU.
This might change as new parts and products are released, as always.

>tfw vega64 in the mail

Attached: murasame 1screen v5.png (1179x811, 280K)

Intel and nvidia. I don't multitask.

I've got a Ryzen desktop at home that the Wife uses basically just for PS2 emulation and playing random games like The Observer or Killing Floor. I've played Deus Ex Mankind Divided on it without any issues. Can't say I'd really consider an intel CPU over this lowly R7 1700. This basically sips power and provides enough performance that neither of us is wanting for any more with these casual workloads.
I originally bought it because I was rendering some video on the side, doing a tiny bit of editing work for extra cash while I toil away as an AutoCAD slave. I ended up purchasing a Threadripper machine to take the place of that for real work though.

On the GPU front AMD is pretty far behind Nvidia in ultimate performance, but that's realistically not something most people, myself included, run into as an issue. We've got a fancy new 4K Samsung TV, the newest series they offer, and across our narrow living room it still looks fine when playing stuff in full screen 1920x1080. The Ryzen 1700 system has a GTX 1060 in it, which has plenty of power for our needs. Though ironically I've had more driver issues with this card than any GPU in recent history. Had to exchange the thing three times. There was a persistent bug in the drivers that lasted for over a year; I contacted Nvidia directly and they just stopped responding to me.
The drivers on the disc for the GPU worked fine, and for a while all the new updates they issued worked fine, then all of a sudden I couldn't use the full color space on my TV. It needed to be 4:2:2, or it would default back to that within a couple of minutes. It caused the backlight levels to change every time I opened anything full screen. It didn't play well with my Dell Ultrasharp monitor either.

Despite AMD losing the performance crown pretty handily, I'm still willing to give their GPUs a chance for my next purchase. I've never had any particular love for Nvidia, going back to my 5700le.

Intel, because multithreading for modded Minecraft and emulators is crap.

...I had ~5 captchas? What is going on?

At the moment this is true.
Can't wait for the day when we can ditch Jewvidia as well though.

They either know you're a good goy who answers captchas with a high confidence interval, or they think you're a bad goy, so they're sending more your way to prove you aren't a bot trying to game their system.
They already know the answer to the captcha when they send it to you, within a certain range of certainty. They do still use aggregated user answers vs their own machine predictions for validation. However, if they know you're a reliable user, they'll send you captchas that they're quite certain their machine has made good predictions on, so you can give human confirmation on top of that.
Captcha isn't blind guessing. All users aren't equal. Users are filtered into different pools so they can maximize efficacy.
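Purely as a toy sketch of what that pooling/weighting could look like (my own guess at the logic, made-up thresholds and weights, nothing Google has actually published):

def resolve_captcha(model_label, model_conf, votes, trust):
    # votes: list of (user_id, answer); trust: per-user reliability weight (0..1)
    if model_conf >= 0.95:
        # trusted users mostly just confirm answers the machine is already sure of
        return model_label
    # otherwise aggregate the human answers, weighted by each user's track record
    scores = {}
    for user, answer in votes:
        scores[answer] = scores.get(answer, 0.0) + trust.get(user, 0.1)
    return max(scores, key=scores.get)  # this aggregate becomes the new baseline

# made-up example: an uncertain prediction gets overridden by reliable users
print(resolve_captcha("bus", 0.6,
                      [("a", "bicycle"), ("b", "bicycle"), ("c", "bus")],
                      {"a": 0.9, "b": 0.8, "c": 0.2}))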

I won't say it's the only way, but it has great price to performance. Not to mention that AMD will support the Ryzen socket till 2020, so mobos will be compatible with future CPUs. Plus, the stock coolers actually work. I'm really excited to see what AMD does with Ryzen 2nd gen.

Deep Learning/Big Data was a mistake.
Do they sometimes let one or two mistakes through? Because it feels like sometimes I make a mistake, but it goes through anyway.

Attached: MLkW8sktQ853xMiz8DPAqf.jpg (603x393, 16K)

If their machine prediction is low enough confidence then they'll take user input. Though if they get enough aggregated user input on a single sample they'll use that average as the new baseline for the correct answer.

I'm sure at some point they have an engineer come in and do a sanity check on the data, but that seems to be infrequent. An example would be "select all images with bicycles." The issue they have, as with every captcha, is that people start selecting things that they think are going to be correct answers instead of what is the direct information in the image being requested. I.e. people will choose not to select squares that contain a small fraction of a bike's tire if they think it's insignificant enough, but the machine will eventually need to recognize the information in that square regardless. So people start omitting certain squares even if they should click them, because heuristically you can tell the machine is likely to miss that as well.
Engineers readjust things at least once a month to try to correct for this.

I had a long discussion with some Google fag on Twitter about it a while back. It was pretty interesting. I'd say they're all objectively evil people who are too dumb to realize it.

samefag

>stock clocks
OC that shit nigga.
>inb4 OP is retarded
Start with a fixed OC to get a feel for everything. Then figure out the P-State hex values so it only OCs the boost clock.
>why only OC the boost?
You only really need the OC when under load. OCing the boost only will give you the performance gains but with the processor returning to a lower clock when idle. This reduces power consumption, overall heat, and wear/tear on the die. But it's harder to pull off than the fixed core ratio because you need to be able to mess with the p-states.
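If the P-state hex soup puts you off, here's a rough Python sketch of how a Zen/Zen+ P-state value maps to clock and voltage, assuming the FID/DID/VID bit layout community tools like ZenStates use; the example value is made up, and poking the real MSRs is at your own risk:

def decode_pstate(value):
    # field layout assumed from community Zen tools; verify against your own CPU docs
    fid = value & 0xFF            # frequency ID
    did = (value >> 8) & 0x3F     # frequency divisor ID
    vid = (value >> 14) & 0xFF    # voltage ID
    freq_mhz = fid / did * 200    # core clock = FID / DID * 200 MHz
    volts = 1.55 - vid * 0.00625  # SVI2: 6.25 mV per VID step down from 1.55 V
    return freq_mhz, volts

# hypothetical P0 entry: FID=0x90, DID=0x8, VID=0x20 -> (3600.0, 1.35)
print(decode_pstate((0x20 << 14) | (0x8 << 8) | 0x90))

Bumping the boost clock in a P-state OC basically means raising the FID (and usually the VID) in that encoding instead of dragging a multiplier slider.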

Intel if you need to perform heavy graphics tasks in record time (since most of those programs can combine the Intel iGPU with a regular graphics card for max performance). Intel also has better 16 bit legacy support if you need that (good for retro gaming).

AMD for 95% of today's computing (aka literally everything else).
>inb4 amd can do windows 9x for retro games
Not really. The 1000 series has only minimal 16 bit support. Not sure if the 2000 series added this back in, but I doubt it since it's super niche at this point.

Nvidia goes on both, unfortunately. Call me back when GPUs are actually competitive again.

An 8 core, 16 thread CPU that's within 12% of the most expensive Intel consumer processor, for half the price.

Promised support for the socket for a number of years, with 7nm Zen 2 just around the corner.

Does any more really need to be said?

Oh yeah, forgot to mention hackintoshes. Intel no matter what on that one.
>inb4 amd kernel
Yes, a kernel exists for Ryzen Hackintosh builds. But it's pretty much like those iATkOS builds from 8-10 years ago. So tightly bundled and integrated that someone could have slipped a trojan in there without anyone knowing. I'd rather just run the stock kernel with some kexts on an Intel chip.

If you want the best hardware and don't care about money, get Intel/Nvidia.

If you're on a budget and want best performance/price, get AMD/Nvidia.

If you're a moron who gets cold easily, get an AMD GPU.

Depends on how much cheaper the 1080TI is. If it's just €100 or so I'd get the newer 2080. But if the price difference is bigger then I'd reconsider or wait for Nvidia's next gen.

AMD CPU, and I would choose Nvidia except for the fact that their drivers suck ass on Linux. They wouldn't even acknowledge the fact that there were issues with tearing for several years since Kepler hit. Wanted to rip my nuts off trying to get the tearing to stop. Bunch of fucking Jews.

I recently got a 2600+RX 580, no regrets so far.

This. R5 1600 + RX 580 here, very happy with it.

Nope. Just upgraded from a 2600k to 9900k and GTX 980TI to 2080TI.

retard

Attached: nope.png (471x305, 16K)

And then mom woke you up

Depends on where you live; a 580 8GB costs less than a 1060 6GB in some places.

>patrician choice
it's literally the basic bitch choice, the default for anyone seeking a PC; people who know nothing at all about hardware start off with that preference

> only the most hardcore of fanboys are buying Vega
Actually, considering that a Vega 56 with a 64-level overclock is basically the same as a 64, and the 64 sits between the 1080 and 1080Ti, I'd buy one after the 590 benchmarks come out and IF I end up playing PC games more. Then I may purchase a 56; otherwise, I'm not a consumerist.

Intel/Nvidia

rekt
e
k
t