WARNING! THIS IS NOT A DRILL! I REPEAT - THIS IS *NOT* A DRILL!!!11

wccftech.com/exclusive-amds-7nm-navi-gpu-will-launch-roughly-a-month-after-ryzen-3000/
Navi is releasing in AUGUST, exactly a month after Zen 2's official sales start! It was previously known that Navi would come sometime by October, but now it's CONFIRMED that it's coming out several months earlier than that! Zen 2 THIRDRIPPER (officially planned name inside AMD) is still scheduled for an October release, though.

Attached: 1551790429102.jpg (1280x720, 131K)

Other urls found in this thread:

youtube.com/watch?v=seKaU-qQuts
youtube.com/watch?v=we2oePtTGMM
hothardware.com/news/amd-rumored-launch-7nm-navi-gpu-month-after-ryzen-3000
wccftech.com/exclusive-amds-7nm-navi-gpu-will-launch-roughly-a-month-after-ryzen-3000/
images2.imagebam.com/3f/26/60/c63a8f1150545954.gif
twitter.com/SFWRedditVideos

nobody care intel 10nm will destroy amd in 2025

Unironically Intel 3D probably will, while AMD will repeat Zen levels of ownage with their own 3D chip.

>in 2048
Fixed your mistake

>Navi

Attached: 1503591183975.gif (758x866, 479K)

What? Be thankful it's not blue.

|
|>
|
|
|

|
|>
|
|3
|
|

A nose is more than enough.

youtube.com/watch?v=seKaU-qQuts

Attached: nenethink.png (233x311, 28K)

>wccftech
>confirmed
Yeeeeeah... that's a no for me

I wonder how many + signs they'll have to add to 14nm by 2025

When they put the "exclusive" tag in an article's title, they don't use rumors, only confirmed insider info. It's pretty much 100% legit.

Attached: Raja Poochitecture.jpg (1480x833, 103K)

kek
i remember my friend told me to wait for 10nm in 2017 when i bought a 1600 and intel had already delayed 10nm a couple years
im convinced we'll never see it on a desktop chip now lol

>before early August
wew good thing i didnt buy a 1660ti lol

and with 7nm in 2100

1080 Ti performance for $250 confirmed!

ayy em dee needs to do *A LOT* if they wanna impress me. I'm done being excited about AMD GPU announcements only to be disappointed again and again.
I expect nothing of this.

as nvidia is absolute shit with drivers, im interested if mid range navi will be able to do 1440p 60fps in most games, would be good company for my intel cpu

Given that the Ryzen 3000 CPUs up to 8 cores are rumored to launch in July (7th) and their 12 and 16 cores in August, it would make sense from a business standpoint to pair their high end CPUs with their new GPUs. A metric fuckton of people are going to upgrade, so having a GPU to go along with that should be nice, at least in theory.

>be able to do 1440p 60fps in most games
Even RX 590 can already do that. And Vega 56 even more solidly.

just imagine how powerful intel 5nm will be in 2150

Attached: 1539299904198.gif (320x384, 2.23M)

youtube.com/watch?v=we2oePtTGMM

OH NO NO NO

ayymd porfags can't game
ayymd eternally btfo

Attached: Screenshot 2019-03-13 19-15-34.png (1920x1080, 565K)

But if I go for AMD GPUs how will I make use of CUDA/cudnn?

Attached: 1551945858457.png (1338x1181, 69K)

>CUDA
You don't, you use ROCm like a good boy. CUDA is garbage.

>july 7th
thats a sunday
who the christ launches a product on a fucking sunday

So the VII was just released for the sake of releasing something?

prove that you use CUDA enough to justify buying a gpu for it

s-sometimes i waifu2x things locally
also one day i would like to make an osrs using opencv

osrs bot*

>nitpicking
try harder amdjeet

>in 2025
2030? What do you mean by 2035?

Slower RAM on the Intel system makes this a lousy test.

Intel crap can only do 2666mhz

machine learning. not me

Imagine using a computer to play games

>Intel crap can only do 2666mhz
>Intel - Inventor of XMP
>Most people run 3000-3600MHz DDR4
Imagine being this retarded that you think you can't run RAM past 2666MHz.
AMDumb

It's only gonna be around Vega 56 - 7 tier and won't be out in any sort of reasonable volume or price until next year.

>waifu2x
You can use your CPU, more cores = faster.
There's an OpenCL implementation of waifu2x that can use AyyMD GPUs

opencl produces terrible image quality, see for yourself. If I wanted to upscale over 30000 images, using a CPU would take over 10 days. With an Nvidia GPU using CUDA, less than 12 hours, even faster if you've got an RTX Titan
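Back-of-envelope check of the claimed speedup (a sketch: the 30,000-image, 10-day and 12-hour figures are the poster's claims, everything else is plain arithmetic):

```python
# Throughput implied by the batch times claimed above -- rumor-thread
# figures, not benchmarks.
images = 30_000
cpu_days = 10    # claimed CPU batch time
gpu_hours = 12   # claimed CUDA batch time

cpu_rate = images / (cpu_days * 24 * 3600)  # images per second on CPU
gpu_rate = images / (gpu_hours * 3600)      # images per second on GPU

speedup = gpu_rate / cpu_rate  # = (cpu_days * 24) / gpu_hours
print(f"CPU: {cpu_rate:.3f} img/s, GPU: {gpu_rate:.3f} img/s, ~{speedup:.0f}x")
```

So the claim amounts to roughly a 20x speedup, which is at least internally consistent.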

???

>Can't save money for 5 months at most if true
>Can't resell a lightly used card at a profit

hothardware.com/news/amd-rumored-launch-7nm-navi-gpu-month-after-ryzen-3000
wccftech.com/exclusive-amds-7nm-navi-gpu-will-launch-roughly-a-month-after-ryzen-3000/

|
\
\
\
\
\
\
\
\
\
_________\
|
|

anything higher than spec is an overclock.

imagine being at computers

>Still GCN
Fresh outta hecks. Only the following gen brings a new arch, and only then will we see if they can look nvidia in the eye.

Intel crap can only do 2666mhz

It's Zen+ equivalent for AMD GPUs. A minor, intermediary update.

U seem MA

i hope amd does well, but historically this company fucks up often.

>nvidia is absolute shit with drivers
>interested in AMD
LMAO

>this company fucks up often
How is it there, in 2012?

>still no cuda support

See .

Unless Navi switched to chiplets it's gonna suck, don't even try arguing. Zen 2 seems very promising though.

What's the current state of vulkan compute by the way?

Attached: bda9488eea1214d29b5091e7882b816e098c9ed281808e3ce881970be0024bda.jpg (1041x1238, 294K)

X4 is one of the very first fully Vulkan-native titles and it eats up as many cores & threads as you can throw at it, because there's a fuckton of persistent background content that keeps running in real time regardless of you. Star Citizen is much the same. The absolute majority of modern emulators have also moved/are moving to Vulkan as their main API of choice. Vulkan is here to stay and it's the future, while DX12 is dead-on-arrival stillborn garbage. And OpenGL is pretty much dead in the water.

I asked specifically about Compute, user.

How is constant hammering of all available cores & threads with background instances/physics/models/scripts not heavy computational work? It differs little from farting @ home, MySQL, AI imaging, or other "serious" shit.

You don't even know what I asked about then.

Nice non-argumentation, kid. Fuck off if you're unable to digest the information provided, then.

s-someone told me yesterday that the 2600x would be no better at 1440p 60fps than my 2600. I would love to have those extra frames desu

>2600X would be no better at 1440p 60FPS than my 2600
Dude, 2600 is, like:

Attached: 587345j45n498.jpg (1068x930, 401K)

Mount stupid.

I meant r5 2600x vs r5 2600..

Sorry, I didn't read it properly and assumed it was the 2600X because it's clocked at 4.2GHz. That's what you get from the 2600X though, right? So I'm guessing it would be the same.

>heh, I called him a kid, that’ll show him
>said the 14 year old

It's pretty much the same stone, just slightly better binning and thus higher clocks/OC potential.

>That's what you get from the 2600X though right?
2600 won't go to stable solid 4.3GHz under FCWCL, but it can pretty much do 4.2GHz on top tier (Noctua, Cryorig, etc) air or Swiftech AIO. 2600X simply guarantees the clocks due to better binning, so it can technically pull off 4.3GHz on top tier FCWCL. They both can Turbo to 4.2GHz, however. This is pretty much the highest point at which 12nm can get, without LN2/chillers. Gen1 Zen was able to 4.2GHz only with absolute golden lottery and under FCWCL, so 12nm Zen+ is a slight upgrade from that. However, what Zen+ definitely improved a lot on in comparison to gen1, is CCX cross-interconnect latency. Zen+ is a very solid gaming platform (hell, 2700X is already in a +/- 5% territory off the 8700K in gaymen performance, which is margin of error, while being way more efficient and quite a bit cheaper than Inturd's overpriced garbage, and while having full backward and forward compatibility on chipset/socket side). And Zen 2 will finally fully close any remaining gaps and will utterly BTFO Inturd on all fronts & fields.

Attached: 7567567568.jpg (1071x885, 366K)

Kys, retard. I was building PC configs completely from scratch before you were even born.

NOOOOOOOOOOOOOOOOOOOOOOO

images2.imagebam.com/3f/26/60/c63a8f1150545954.gif

Attached: 1543532926643.jpg (184x184, 34K)

Everyone already knew that.

Why is AMD stock dropping so much then? It's already -2% today.

STFU INCEL, it’s normal if it drops for one year straight!!!

using completely different ram speeds for each test????

kek

2933 is Zen+'s minimum spec (gen1 Zen had it at 2133, so they've upped it quite a bit).

SR and RR are 2666 JEDEC.
PR is 2933 JEDEC.
Matisse is 3200 JEDEC.
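The codename-to-spec list above can be sketched as a quick lookup, which also captures the "anything above spec is an overclock" point from earlier in the thread (speeds are the poster's rumor-era figures, not an official table; the function name is made up for illustration):

```python
# Claimed maximum JEDEC DDR4 speeds per AMD codename, as posted above.
JEDEC_DDR4_SPEC = {
    "Summit Ridge": 2666,    # gen1 Zen CPUs
    "Raven Ridge": 2666,     # gen1 Zen APUs
    "Pinnacle Ridge": 2933,  # Zen+ CPUs
    "Matisse": 3200,         # Zen 2 CPUs
}

def is_memory_overclock(codename: str, ram_mhz: int) -> bool:
    """Anything above the JEDEC spec counts as a memory overclock."""
    return ram_mhz > JEDEC_DDR4_SPEC[codename]
```

So by this table, 3200MHz RAM is an overclock on Pinnacle Ridge but stock-spec on Matisse.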

yes, amd just works on linux and macos

>SR and RR
That's an APU. They have way lower specs. Hell, the 2400G can't even run GPUs at full x16 PCIe, due to a lack of lanes.

Summit Ridge isn't a fucking APU.
>They have way lesser specs
The IMC and DDR PHYs stayed the same.

it was just a lain joke, chill

the rumor seems to be that all of the chips that can reach decent clocks get turned into 2600x

They'd better deliver 1080p gaming at 500 dollars

You can already game at 1080p maxed for ~$250, thanks to Zen APUs.

Get off my board

Daily reminder

Attached: 1460411916325.png (1100x1002, 729K)

I've been here before you've even been born, you little sack of shit.

That's bullshit and I don't believe it

U seem MA

Stay in 2015, then.

Intel is still in 2012 though, at least in terms of IPC

WTF THIS CAN'T BE HAPPENING