Literally the only thing at E3 worth watching

Literally the only thing at E3 worth watching.

Attached: AMD-Next-Horizon-Gaming-Event-Feature.png (1594x765, 1.32M)

Other urls found in this thread:

youtube.com/watch?v=ZGj-8mFmyPI

>cyberpunk 2077
ffs I've been waiting for it since 2013 and I don't even play games anymore.

but they added Keanu

I mean it's shaping up to be a great game but for me the sci-fi futuristic setting does nothing. Low medieval fantasy is always best

Cyberpunk looks pretty good desu. Hope they don't slap gimpworks and gayhairshit on at the last moment (well, they very likely will)

my erection for Lisa Su is infinite
she will release 2080 GTX -tier Navi card and 16 core ryzen 2

Attached: 1558920002589.png (1260x709, 781K)

and? It neither bothers me nor excites me

It's CD Projekt Red, I trust them. But I don't feel like playing games anymore; I just lack the free time I had earlier.

when does it start?

In a bit more than 12 hours.

at what time is the AMD presentation?

Attached: 1558727546800.jpg (1021x503, 168K)

Literally slower than NVIDIA midrange

lmao

>she will release 2080 GTX -tier Navi card
nope, not going to happen, unless they cherrypick games
>16 core ryzen 2
now that one was pretty obvious and it's pretty much confirmed at this point

Attached: AMD-Ryzen-9-3950X-16-core-CPU.jpg (1569x473, 127K)

> 2070
> midrange
lmao

it might not even come close to 2070 performance
the 2070 is midrange, the 2080 is high range, and the 2080 Ti is enthusiast range

anything below that is low end unless you are a third worlder or play 1080p @ 60fps or something

>it might not even come close to 2070 performance
it's actually 10% faster than the 2070.
>2070 is midrange
2060 is midrange... The 2070 is $480 ffs.

>it's
>source: my ass

480 is cheap for a gpu in 2019 retard

the 2080 ti is now €1400 so 480 is a steal

If you wanted to be realistic about the hardware people actually have, a 1060 is mid range, a 1070 is high-mid range, a 1080 Ti is high range, and anything above that is in the enthusiast range

>gaming
Go back to

>All RTX are for enthusiast

fuck off

Somebody's getting robbed all right.

Pretty sure a 1080ti is more powerful than a 2070 jackass

>Yeah! No gaming talk! This board is for circle jerking about python and linux!
Fucking off yourself, my man.

youtube.com/watch?v=ZGj-8mFmyPI

>it has more gb so it's more powerful!

wrong, there is also such a thing as speed, and GDDR6 is far superior and more future proof

besides you can't find new 1080 Tis anymore

This, RTX is for retards.

>all that microstutter on the RTX
This can't be real.

Going by your logic the radeon 7 is the best gpu on the market

>one game
i can find you a game where a gtx 1660 is faster than a 1070, or a game where a 1060 is faster than a rx590, or a game where a vega 56 beats the 1080
the only good way to compare performance is to take games that weren't really optimized for either nvidia or amd and average across them
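Rough sketch of how that averaging works if you want to do it yourself (the games and FPS numbers here are made up for illustration, not real benchmark results):

from math import prod

# Hypothetical average-FPS results for two cards across a few games.
# Names and numbers are invented purely to show the method.
fps = {
    "game_a": {"card_x": 88, "card_y": 95},
    "game_b": {"card_x": 142, "card_y": 131},
    "game_c": {"card_x": 60, "card_y": 57},
}

# Per-game ratio of card_x to card_y, then the geometric mean across games.
# The geomean keeps one heavily-optimized outlier title from skewing the result.
ratios = [g["card_x"] / g["card_y"] for g in fps.values()]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"card_x runs at {geomean:.1%} of card_y's speed on average")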

Without the existence of goyworks and bribes, it would be.

>480 is cheap for a gpu in 2019 retard
no it's not.
consumer chips should never pass the 500mm2 mark, as they become exponentially expensive.
novidia couldn't compete with the TeraScale gpus and repurposed fat dies as consumer gpus. do you remember the fire hazard called the 280? that thing was 500mm2++ and competed with a sub 300mm2 4870.
from then on, novidia competed with quadro cards positioned as "higher end".
even their quadro cards were lagging behind in terms of gpgpu support and ofc 3D graphics api support. DC, DX, opencl, opengl... and even with a bigger die, they had way less computational power most of the time.

so, expensive cards are novidia's way of "competing" in the consumer market.
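Back-of-envelope for why big dies get stupidly expensive, using a simple Poisson yield model (wafer cost, defect density, and the dies-per-wafer estimate below are assumed round numbers, not anyone's actual figures):

from math import exp, pi

WAFER_COST = 6000.0  # assumed $ per 300mm wafer, purely illustrative
DEFECT_D0 = 0.002    # assumed killer defects per mm^2
WAFER_DIAM = 300.0   # wafer diameter in mm

def cost_per_good_die(area_mm2):
    # Crude dies-per-wafer estimate: wafer area over die area minus an
    # edge-loss term; ignores scribe lines and reticle limits.
    wafer_area = pi * (WAFER_DIAM / 2) ** 2
    dies = wafer_area / area_mm2 - pi * WAFER_DIAM / (2 * area_mm2) ** 0.5
    # Poisson yield: probability a die catches zero killer defects.
    yield_frac = exp(-area_mm2 * DEFECT_D0)
    return WAFER_COST / (dies * yield_frac)

for area in (300, 500, 770):
    print(f"{area} mm^2 -> ~${cost_per_good_die(area):.0f} per good die")

Fewer dies per wafer times worse yield is why the cost curve goes superlinear past ~500mm2.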

>780 cost like 400 when it was top of the line
>now a 2080, which is the current top of the line, costs double that

if they have something that competes with 1660 perf per watt I will buy it

>5700, slower than the 2070 outside of AMD Vulkan demos, no Ray Tracing
>$499

JUST

The 780 was also a higher end chip, the 2080 is literally the mid tier one.

x70 has never been midrange, retard. Go kill yourself.

nope, only the 780 Ti or 2080 Ti are better

so the *80s are usually high end cards, the *70s mid range, and the *60s low range

yeah they are fucking retard

LITERALLY EVERY FUCKING GENERATION

*80ti = best
*80 = good
*70 = middle
*60 = cheapest/lowest

780/780ti/titan 560mm2
got fucked in the ass by tahiti, 350mm2

780 vs 2080 is 560mm2 vs 770mm2
your money is wasted on extra mm2 for fake ray tracing and useless upscaling called dlss. that's why those cards are way more expensive.
novidia never learnt how to engineer their gpus. they released products at ridiculous costs and sizes, underperforming most of the time and being fire hazards, and they fixed everything with s/w and marketing.

ok you should make gpu's then

retard

I'm not talking about that. The gpu in the 1080 was the GP104, which is the mid tier variant that would have gone into *60 cards a decade ago. All the high end dies like the 100 or 102 go into the corporate cards.

Same as you. Barely play half an hour a day and will build a new PC for this one

he is right you know.
*80 started as the top notch card
*70 was high end.
during the gtx 200 days, novidia got shoah'd by the hd 4000 series and pushed the 285 and 290 onto the market.
actually, my job is to design digital circuits.
I'd probably end up making something like tahiti, the swiss army knife of GPUs.

Why the fuck did they use that die space on the special RT cores instead of more general purpose cores that can trace or rasterize depending on demand?

Yeah no fuck off faggot retard nigger
80ti best, 80 & 70ti high, 70 high-mid, 60 midrange, 50 is low range

70 was where enthusiast range started, you child.

>now that one was pretty obvious and it's pretty much confirmed at this point
I doubt they'd hold it back from Computex just to announce it a mere two weeks later.

*70ti has never been a thing before the 1070ti series zoomer

Not him but aren't you forgetting the *50? 750ti, 950, 1050(ti). That should be entry level instead of *60. 760, 960, 1060, and so on are typically low-mid to mid range. *70 has always been way too pricey to be considered just "mid" range.

Yup, cyberpunk and vtm 2 have me hard as a rock.

the "RT" cores are in fact accelerators for meme learning and meme intelligence... which means that they are fp16 and fp18 cores mostly.
so, last year when they didn't have any new feature to present, they just claimed that they managed to create RT accelerators.
6 months prior to that, the latest movie to use RT in their CGI used a $100.000 AMD GPU supercomputer... yet novidia came and said that they rendered real time complex scenes from star wars.
what's the verdict?
A 22 year old game with ray tracing barely runs at 1080p with those "RT" "accelerators.
They tricked their customers into buying RT, but instead they bought another way of faking light... you know, approximate light behavior and fix it with post processing effects. they BS'd their customers with "meme learning algorithms" to denoise the image. Which is bullshit. RT implementations don't need fixing, RT doesn't produce noise.
It was a marketing stunt to sell tensor cores as "RT" "meme learning" and "DLSS" and it seems that they are getting away with it.
imo this is worse than the 3.5GB 970 fiasco.

>105W
literally criminal.

I wish Nvidia's next 7nm cards in 2020 had a variant that left out the meme cores for more actual cores, but it seems like they're doubling down on their shitty meme nobody asked for.

Attached: 200% MAD.jpg (600x884, 52K)

2070 for $300, come on! AMD don't waste the opportunity!

The rumor is $300 2060 and 2070 being likely to stay the same price. I wouldn't get my hopes up. Plus they would somehow have to fit 1660, 1660 ti, 2060, and 2060(super) under that. While still upholding their jew standards.

AMD is going to talk about a ray tracing variant at E3.
Here is what is going to happen in this mid console generation window: Nvidia doubles down on RTX memes, AMD announces ray tracing without ray tracing hardware (in the form of an open API or in-engine integration like the Crytek demo), Nvidia keeps bribing devs on PC, 2 years later Nvidia gives up and goes open source with RTX.
HBAO, G-Sync, HairWorks, PhysX. The cycle continues.

and they are going to lose again. they must be aggressive like with ryzen or market share will never change.

why? you had a kid?

still fucking hours away

I wish. I knew when Nvidia marked up the RTX line that AMD would rise to meet them there

but even your nshittia cards cannot push a constant, stable 144+ fps in AAA titles at 1080p, let alone 1440p
80 fps and 75 fps are both in the same bracket. But faggots like you will still say "hurr, it's 6% faster!" LMAO

There are some things only achievable with RT, very few things actually

The cool shadows, mirrors, refraction and off screen reflections.
Those are cool effects and RT indeed is the future, just not the present though.

We made it anon, gotta make sure we don't die until after the 2020 release.

b-but this is real ray tracing. REAL! This is completely different from those other lights! Also completely different from those quasi ray tracing mods that are coming out for games. Shut up you deluded AMDrone! RTX is REAL TIME RAY TRACING. REAL! and REAL TIME! It's in the fucking name!

is 4.7 GHz actual boost
or that short burst on single core crap intel is pushing?

Reminder that ray tracing only gives you realistic shadows and who the fuck gives a shit about shadows.

Absolutely no fucking way. They have no reason to undercut by that much. Even 100 will do, but that’s still expensive. Only reason I’m waiting for these cards is to get used v56 for dirt cheap. Not spending more than AUD$400 for a gpu (running rx480 on 1440p atm).

> real and ray with caps.
>must be true
game graphics have been approximating light behavior because they don't have the horsepower to calculate photon trajectories.
novidia just released another algorithm that simulates light behavior, just like every other algorithm used thus far in every game.
novidia can't even render tetris with ray tracing.
we are far from _real_ ray tracing.
meme tracing, though, is a thing and you can purchase it for $1200.
we gotta thank the idiots who pay to be beta testers and paycucks on those memes now, so as to fund the r&d for when the real thing becomes s/w and h/w possible in 20-30 years.
thank you paycuck for paying for fake ray tracing technology.

Wanna buy my refurb 56 ref (Samsung memory) for 400 dollararoos

Slap it on gumtree and wait a couple of weeks, who knows, I might pick up yours :^)

Cyberpunk looked nice though. There wasn't really anything else. Microsoft talking about their new console didn't interest me at all.

everyone, did you play gothic 2 with dx11 mod?

that's because everyone knew what was going in them, since AMD tech is an open book at this point

>Got the fuck away from W10 not too long ago
>Feel the urge to get back there for some stupid fucking games

This is horrible

you could try to do that passthrough meme

When is their stream?

That's more of a chore than that

>AMD is going to talk about a ray tracing variant at E3.
>Here is what is going to happen in this mid console generation window: Nvidia doubles down on RTX memes, AMD announces ray tracing without ray tracing hardware (in the form of an open API or in-engine integration like the Crytek demo), Nvidia keeps bribing devs on PC, 2 years later Nvidia gives up and goes open source with RTX.
>HBAO, G-Sync, HairWorks, PhysX. The cycle continues.
Nvidia will probably push for fully raytraced games rather than just raster engines with a few raytraced effects added on. They have several issues though: they would need GPUs made almost entirely of their RT ASICs, and everything RTX related is proprietary, so no one will make fully raytraced game engines that only work on nvidia hardware.

If they weren't so greedy and shortsighted they would have developed some sort of open specification and API for raytracing ASICs; other companies would incorporate it, but Nvidia would have a big lead and influence over the spec.

>fully raytraced
kek. They can't even "fully" run their quasi-raytracing gimmick with their high end rtx cards, and here you are, hoping that they'll push for fully raytraced gaymes.

Can't wait for the next 300w turd plopped out by AMD.

I hope AMD puts its foot down this gen and pushes nvidia out with their console dominance, because all those nvidia meme technologies waste resources and gimp performance without making anything look better.
AMD should've done it when they got a foothold in the console market; all console graphics are defined by their hardware.
I say this mostly because I hate nvidia's hair tech so much, it looks way uglier than AMD's, which runs fine on consoles in recent games, but I have to deal with hairworks shit in the PC port.

Attached: 20190607123629_1.jpg (2560x1440, 462K)

>nvidia
>console dominance
lol what?

>AMD pushes nvidia out with their console dominance
did I fuck up syntax? probably did.

Dude I hate to say it but you got roasted by

dual boot is your friend.

Link and time?

Sadly the novideo gimp train runs on bribery and AMD has neither the money nor the connections to pull it off. I mean, even if AMD started their own gimp train we wouldn't benefit from it. We benefit from AMD making open standards and technology, but so far it hasn't paid off for AMD.
Basically what you want is AMD paying game developers for doing nothing, and AMD doesn't want that. Meanwhile nvidia pays developers for gimping performance.

In 8 hours

Damn, all those drones making up bullshit like "amd raytracing" make me feel ashamed. I wish I didn't associate with them.
Isn't it obvious it's Microsoft's DXR implementation running on the CPU? Why else would they put so many cores into a gaming console?
It's not AMD raytracing, it's MS raytracing, and that's a good thing.

>tfw gonna stay up all night and watch movies with snacks before AMD E3 live stream

gonna go down to the pub and eat now as well

Attached: Lisa Su.jpg (757x627, 87K)

gday cunt

Attached: 1547274284216.gif (300x300, 3.6M)

It will be trash. With no third person viewing mode, where's the element of RPG?

>spend 10 hours customizing your characters
>never get to see how your characters look
Is this the power of retards?

>CD Projekt RED trying to pull a "GTA V" and release on PS4/Xbox One and then again for Xbox Two and PS5

cringe and bluepilled. I'm gonna torrent it if it has Nvidia GimpWorks enabled. If not, I will buy deluxe edition

that would be fine and dandy, if not for Sony also having promised this feature.

>rpg is supposed to be third person
go back to >>>/reddit/ and kys you autistic fuck

>kek. They can't even "fully" run their quasi-raytracing gimmick with their high end rtx cards, and here you are, hoping that they'll push for fully raytraced gaymes.
RTX combines the worst of raster and ray tracing; you end up with the performance bottlenecks of both. Ray traced engines don't care much about polycounts, for example: you can have 100 million polygon terrain with no LoD, and as long as the ram is there not much changes.
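Toy cost model of that polycount point (the ray count and unit costs are made up; it just assumes a balanced BVH gives ~log2(n) traversal work per ray while raster work grows roughly linearly with triangle count):

from math import log2

RAYS_PER_FRAME = 2_000_000  # roughly 1080p at one sample per pixel

def raster_cost(triangles):
    # Rasterization has to set up / cull every triangle: ~linear in scene size.
    return triangles * 1.0

def rt_cost(triangles):
    # Each ray walks a balanced BVH: ~log2(n) node visits per ray.
    return RAYS_PER_FRAME * log2(triangles)

for tris in (1_000_000, 10_000_000, 100_000_000):
    print(f"{tris:>11,} tris: raster ~{raster_cost(tris):.1e}, rt ~{rt_cost(tris):.1e}")

Going from 1M to 100M triangles multiplies the raster figure by 100 but grows the ray tracing figure by only about a third, which is the point about not needing LoD.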

Brissy?

Attached: 1560180550.jpg (680x363, 40K)