IT'S HAPPENING

twitter.com/NVIDIAGeForce/status/1131575511833022465

IT'S HAPPENING

Attached: Super.png (641x494, 87K)

Other urls found in this thread:

khronos.org/conformance/adopters/conformant-products
anandtech.com/show/7582/nvidia-gsync-review
vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

>itsafraid.png

> hopes

Goddamn 7nm GPUs

> reality

Some business shit only a few people actually care about. Or rehashes, or 7nm GPUs with retarded high prices.

>inb4 super GeForce Experience that allows you to play """free""" games for a monthly subscription

go back

super expensive gpus cant wait

a GPU that will get you laid?

I was about to buy a 2080 Ti FTW3, now I have to wait until they either announce some marketing gimmick or 7nm GPUs...

is value for money a foreign concept to you or do you have so much money that it has seemingly lost its value

It'll just be another gtx 1080 ti tier gpu for 3 times as much as you paid in 2016, and a new useless proprietary feature that like 3 token developers will use for one major title and never again. Remember PhysX? Remember Gameworks? Remember 3D Vision? Remember SLI? Remember G-sync? Remember RTX?

Attached: 1549319370637.png (1406x911, 1.97M)

When does TSMC start its improved 7nm production? We're probably only going to see 7nm cards once prices come down, so Leatherjacketman can milk people even harder.

It won't be 7nm; NVIDIA is always one process node behind AMD.
Probably an NVIDIA GeForce 2080 Super Ti for $3K

I'm just an (((enthusiast)))

>Remember G-sync?

GTX 1880 Ti 11GB (a 2080 without tensor cores), for a non-stupid price

AMD will not even see what hit them coming.

No hope for this
Incoming disappointment just like everything after the 1080Ti

>All those butthurt AMD responses on the tweet

Fucking kek, how does a company that sells so few GPUs have such a vocal annoying fanbase?

>AMD will not even see what hit them coming

Attached: 1558028082430.png (408x286, 10K)

Can't wait for the 1650 Ti, as powerful as the 1060 3GB for 60 dollars more

pretty sure it's nvidia hatred and not amd fanboyism

physx is used in so many games it's not funny; the thing is, it's the most pared-down version it can possibly be. if a game has some physics but doesn't disclose which engine, it's physx.
gameworks is baked into unreal 4 along with physx
3d vision was a way to make something that could do 120fps 3d through active glasses, then every movie studio shit out post-conversion garbage and no one cared about 3d anymore, fucking sad really.

>implying there's a difference

Not really. I heard it's some obsolete proprietary technology that was supposed to replace vsync but never actually worked.

How do you know Navi will be decent? Nvidia marketing activation time frame.

i'm not sure either

Attached: 1526068109619.png (400x384, 220K)

there is if you're not a 12yo mentally

pretty sure it's still Havok that gets used in all those games.

I was hoping it would be an RT accelerator card that uses a 4x PCIe slot, so you could have much, much better raytracing even at 4K. But hey, this is Nvidia; all they care about is how well their cards do in DX11 games, taking every shortcut like not rendering everything on the screen or compressing things so much that even an RX 570 would look better despite having lower fps.

Dual GPU using NVlink 200% scaling

inb4 it's superAi

Super Woodscrews

I JUST GOT A 2080 Ti, don't you dare release a new GPU line

underrated post

Attached: nvidia-money-rain.jpg (800x500, 236K)

and what did we learn

>business
>nvidia Geforce
does not compute.

>gayming

Nice FUD, but this compression bullshit has been proven wrong. AMD just can't code

Actual PhysX has been used in only a few dozen games in like 15 years. It would've probably been far more popular if Nvidia weren't so fucking greedy about their licensing. It did, however, make major contributions to advancing the general implementation of physics and particle simulation in games through competing software-based offerings.

Supreme x Nvidia collab

Wouldn't that just be a repeat of the failure of dedicated PhysX cards? Even if devs ever modernized their code to make it practical, you'd probably still be better off with multigpu rather than an ASIC.

My money is on a 2070ti

Other cards make no sense; something would have leaked, and Nvidia won't announce new cards when they can't sell them.

Relax, they won't release anything good now. It's just some shitty card like a 2070 Ti to steal AMD's time in the spotlight. Still, you're a moron for spending >$1000 on a graphics card

My money would be on a GTX 1670 because fuck paying extra for RT and DLSS cores that I'm NEVER FUCKING going to use.

Are you retarded? AMD was hustling to copy it as soon as it was announced

Uh, Nvidia wasn't greedy about their licensing. They went on record saying they offered it and CUDA for pennies per GPU, and AMD let it go to voicemail and never called back. This is on record; AMD even confirmed they didn't bother to call back, and trashed CUDA for being proprietary while backing a proprietary Intel physics middleware (Havok) instead

Real talk
khronos.org/conformance/adopters/conformant-products

Nvidia submitted a confidential product for the Vulkan conformance test. Not Xavier, not Jetson Nano, some ARM chip.

Super must be a new Shield TV branch and a future SoC for a Switch revision.

Oh the pic

Attached: 4FAEE12A-5BBE-45F7-A5C5-38ABD52D5E4C.jpg (1338x332, 118K)

PhysX was a failure? Why did Nvidia buy it?

RTX rebrand.
1670, 1680.
$2,000+ GPU.
More AI and car shit nobody cares about.
A cloud gaming device powered by their useless Tegra CPUs.

In that order.

Physx was a success.
Dedicated Physx cards failed.

This has nothing to do with AMD's GPUs
More likely, it's a response to Huawei's CPUs

what does that have to do with anything? It still never worked

Geforce is specifically their consumer GPU brand though. CPUs would be more likely to fall under Tegra

Any game that uses accelerated PhysX also uses CPU PhysX. It's a package; acceleration is optional. Unreal, Unity, and iirc some other engine use the PhysX package. That's probably a majority of games out right now and a good chunk of AAA
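
For anyone who doesn't believe the CPU part ships by default: here's a minimal sketch of standing up a PhysX scene on the CPU only, written from memory against the public PhysX 3.x/4.x C++ SDK, so treat the exact flag and field names as assumptions that vary by SDK version. The commented-out lines mark where GPU rigid bodies would be opted in if a CUDA context existed; leave them out and the same scene runs purely on the CPU.

    // Minimal CPU-only PhysX scene, sketched from memory against the PhysX 3.x/4.x SDK.
    // GPU rigid bodies are strictly opt-in; the commented lines mark where they would be
    // enabled if a CUDA context manager existed. Field names vary slightly by SDK version.
    #include <PxPhysicsAPI.h>
    using namespace physx;

    static PxDefaultAllocator gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main() {
        PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
        PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        PxSceneDesc desc(physics->getTolerancesScale());
        desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
        desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);   // CPU solver threads: always present
        desc.filterShader  = PxDefaultSimulationFilterShader;
        // desc.cudaContextManager = cudaContextManager;        // hypothetical: needs a CUDA context
        // desc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;     // GPU acceleration is the optional part
        PxScene* scene = physics->createScene(desc);

        for (int i = 0; i < 60; ++i) {                          // simulate one second at 60 Hz
            scene->simulate(1.0f / 60.0f);
            scene->fetchResults(true);                          // block until the step completes
        }
        scene->release(); physics->release(); foundation->release();
        return 0;
    }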

yeah but when and where? computex?

it has nothing to do with gayming you autist

Archetype of a Linux user

Attached: 2019-05-23_15-37-50.png (544x112, 13K)

Of what? The Navi release being overpriced fucking trash?

AHAHAHAHHAHAHA, JUST KEEP WAITING IT'LL BE BETTER NEXT TIME I PROMISE!

a marginally more complicated piece of silicon that increases the amount of frames you get while playing fortnite from 62 to 68

he's not wrong though
if you're using linux amd gpu is a better choice

Oh shit. Overlooked "geforce" in the tweeter's name

wow can't wait for the super overpriced series

New gsync monitor, calling it here
Thanks LTT

>never worked

What the fuck are you talking about? Works fucking great.

I know what it is! It's a Super Titan RTX 3080 with 48gb gddr6x vram, 300k tensor cores, onboard storage that contains offline deep net algorithm for ultra fast DLSS and it's going to be cheaper than the vega64.

>AMD was hustling to copy it as soon as it was announced
if this is the state of the average consumer then I think I understand how we got where we are now
there just isn't enough manpower and time to set these fucking retards straight, and they'll just buy nvidia no matter what

>spacing
>capslock

>he doesn't know the 2080ti performs identically in 3D rendering to a Titan
wew. Blender and Maya(?) are both looking to integrate RTX features into their pipelines.

it's literally overpriced garbage compared to a 1080ti

>amtard reinvents history

Nvidia literally announced and released G-Sync a year before FreeSync, and all AMD had was a fake demo they didn't share any real details about, while Nvidia had functional monitors from partners you could preorder

Imagine being a dumb child from /v/

The RT cores are being adopted by a lot of renderers that otherwise wouldn't even touch a GPU for rendering

gsync is a proprietary implementation of adaptive sync
AMD didn't hurry and copy gsync, you absolute mouthbreather

a lot of good that's going to do when their meme doesn't catch on and tensor-cores are dropped from future consumer cards

lol stealing AMDs navi thunder.

It's not that I love AMD, it's just that I despise Nvidia like no other. They could release a 2080Ti for $200 right now and I still wouldn't buy it. Fuck those guys and fuck everyone that bought Fermi, what a stupid good company, I hate it.

Lol yes they did, retard. Nvidia detailed exactly how it works. The only proprietary part was the G-Sync FPGA and the brand name. FreeSync is a shitty clone of G-Sync, except instead of using a custom FPGA to handle the handshake they let partners make up their own controllers and piggybacked off an existing VESA protocol. That's why early FreeSync monitors had shit ranges like 48-60Hz and 72-120Hz while G-Sync monitors launched a year earlier with like 30-120Hz ranges. It's a shitty knockoff lmao
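
The reason those ranges matter is low framerate compensation: when the game drops below the panel's minimum variable-refresh rate, the driver can repeat each frame so the effective refresh lands back inside the supported window, which only works if the maximum is at least roughly double the minimum. The real driver heuristics are proprietary, so this is only a toy model of the arithmetic:

    // Toy model of low framerate compensation (LFC) on a variable-refresh panel.
    // Real driver heuristics are proprietary; this only shows why a narrow 48-60 Hz
    // range can't help a 40 fps game while a 30-120 Hz range can.
    #include <cstdio>

    struct VrrRange { double minHz, maxHz; };

    // Refresh rate actually used for a given frame rate, or 0 if the frame can neither
    // be shown directly nor repeated (doubled, tripled...) into the supported range.
    double effectiveRefresh(VrrRange r, double fps) {
        if (fps >= r.minHz && fps <= r.maxHz) return fps;        // native variable refresh
        for (int repeat = 2; repeat <= 4; ++repeat) {
            double hz = fps * repeat;
            if (hz >= r.minHz && hz <= r.maxHz) return hz;       // frame repetition (LFC)
        }
        return 0.0;                                              // fall back to v-sync or tearing
    }

    int main() {
        VrrRange narrow{48.0, 60.0}, wide{30.0, 120.0};
        std::printf("40 fps, 48-60 Hz panel:  %.0f Hz\n", effectiveRefresh(narrow, 40.0)); // 0: no LFC possible
        std::printf("40 fps, 30-120 Hz panel: %.0f Hz\n", effectiveRefresh(wide, 40.0));   // 40: in range
        std::printf("20 fps, 30-120 Hz panel: %.0f Hz\n", effectiveRefresh(wide, 20.0));   // 40: doubled
        return 0;
    }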

Who the fuck are you people that really need a 2080Ti? Something like Vega 56 or 2060 is just fine even for 1440p. The fuck, do you always play unoptimized trash?

Tensor cores aren't even used for raytracing. Virtually every implementation uses standard denoising techniques, and for professional visualization there's no real-time requirement, so they can just sample endlessly until it looks good and denoising isn't even a real issue. You literally have 0 clue what you're talking about because you're a child
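
"Standard denoising techniques" here mostly means spatial and temporal filters run over the noisy path-traced frame. As a rough illustration only (nothing like a shipping denoiser, which would use bilateral or a-trous filters plus temporal reprojection), a single 3x3 box-filter pass over a grayscale float buffer already shows the basic trade: a little blur for a lot less Monte Carlo noise.

    // Rough illustration of spatial denoising: one 3x3 box-filter pass over a grayscale
    // float image. Shipping real-time denoisers (bilateral / a-trous filters, temporal
    // reprojection) are far more involved; this is only the core idea.
    #include <algorithm>
    #include <vector>

    std::vector<float> boxDenoise(const std::vector<float>& img, int w, int h) {
        std::vector<float> out(img.size(), 0.0f);
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                float sum = 0.0f;
                for (int dy = -1; dy <= 1; ++dy) {               // average the 3x3 neighbourhood,
                    for (int dx = -1; dx <= 1; ++dx) {           // clamping indices at the border
                        int nx = std::clamp(x + dx, 0, w - 1);
                        int ny = std::clamp(y + dy, 0, h - 1);
                        sum += img[ny * w + nx];
                    }
                }
                out[y * w + x] = sum / 9.0f;                     // 9 samples per output pixel
            }
        }
        return out;
    }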

Raytracing on 128 core CPUs when

Attached: Amada erasing the kikes.jpg (218x250, 7K)

Well "fuck paying extra" is definitely not nvidias slogan, so go figure

>the only proprietary part of a proprietary implementation was a proprietary component hurrrrrr
motherfucker, do you read the shit you type?
how can something be a copy when both were going to implement adaptive sync and one was always going to get there sooner?
freesync was never about vendor lock-in

Did you even bother to read the sentence I just strung out for you? RTX could make real-time Eevee graphics much more accurate, much more comparable to Cycles, without waiting for 10-20 Cycles samples to process before you can actually see your render. It'd be quite a fucking asset.

That's without mentioning that it's 30% faster in Cycles at stock, beaten only by the Titan RTX, and only marginally. Ostensibly, between Eevee realtime, the increased performance, and more features (AI denoise integration into Cycles, for instance), the 2080 Ti is pretty reasonable if you're not a Gaymeboi. And it's the only GPU that can reliably hit 144Hz at 1440p 16:9 on practically every game.

Attached: Untitled.png (1527x1078, 607K)

I didn't mention raytracing

what business do serious prosumers have buying this shit, mr larper
who the fuck are you kidding

>MUH ADAPTIVE SYNC
anandtech.com/show/7582/nvidia-gsync-review
>gsync 2013

vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
>adaptive sync 2014

More like adaptive sync was the copy of G-Sync. Nvidia was demoing working hardware half a year before, was showing off monitors at CES in January 2014 with preorders available shortly after, months before Adaptive-Sync was even a thing

You're fucking clueless

I only buy the best. when you have money you can do stuff like that

I buy the top dog GPU every 2 generations. I paid $800 for my EVGA 980 Ti Classified at launch and $1500 for my EVGA 2080 Ti FTW3 Ultra at launch. Before my 980 Ti I had a 4GB GTX 680. I play at 1440p and originally I only wanted 60fps at 1440p. Now I game at 165Hz with G-Sync at 1440p easily. Honestly, if you have the money, why settle for "good enough"?

why would nvidia need to wait for a technique to get adopted as a standard when they want to vendor-lock consumers through their proprietary implementation?
you actually believe nvidia came up with this technique, don't you?

you're not me

or am I? But you're right. I'm another user with a 2080Ti.

T-T-T-TOP DOG

wishful thinking

SUPER DUPER WITH THE BIG TUPER

>all the nvidiso'ys getting jazzed up for a $3000 RTX Titan(TM)
lmao you goys are adorable

Super GeForce RTX2 3190 Big Titties Edition

>INTRODUCING: "CYCLE CASH™"
>The more you use your machine at high speeds, like when playing video games, the more Cycle Cash™ you earn. Earn enough Cycle Cash™ and you can unlock previously restricted abilities on your GPU!

Attached: PC Fire.jpg (739x415, 33K)

It's going to be an announcement of a new OS-agnostic, game-oriented development platform designed around Vulkan and network transparency, to improve gaming with thin clients like browsers and Android-based consoles.

if you could just kill yourself that would be great

>Nvidia ever being agnostic and transparent

Nvidia's VR headset

Network transparency, of course, being the API you can use with their proprietary libraries, which are the same as open-source ones, only better and with more features.

Games will be Super Geforce certified, the way it's meant to be played.