RTX Ball Super

wccftech.com/nvidia-geforce-rtx-20-super-graphics-cards-specs-launch-leak/
>NVIDIA GeForce RTX 20 ‘SUPER’ Lineup Allegedly Launching in July – Specs For RTX 2080 SUPER, RTX 2070 SUPER & RTX 2060 SUPER Leak Out. The latest information comes from Chinese ‘Weibo’ forums, where a user has posted what seem to be the alleged specifications of at least three NVIDIA GeForce RTX 20 ‘SUPER’ series cards. The GeForce RTX 20 series refresh has been a hot topic since NVIDIA posted their ‘SUPER’ teaser on Twitter, but it looks like they didn’t follow up on it, so now the rumor mill is running at full steam, leaking out specifications for the upcoming lineup.

>The top of the stack in the ‘SUPER’ lineup seems to be the GeForce RTX 2080 SUPER, which is going to feature the full TU104 GPU die, known as TU104-450. The chip will feature 3072 CUDA cores, 8GB of GDDR6 VRAM running at 16 Gbps, and a 256-bit bus. It is said to take over the RTX 2080’s current price point, while the original RTX 2080 would take a price drop to around 4000 Chinese Yuan, which is about $549-$600 US, a roughly $100 US cut and in line with what we have been hearing for a while.

>The GeForce RTX 2070 SUPER is said to feature pretty much the same specifications that we posted a few days ago, which would be 2560 CUDA cores and 8 GB (14 Gbps) of memory. The specific chip is said to be the TU104-410 GPU. The most interesting part is probably the GeForce RTX 2060 SUPER, which is said to feature the full TU106-410 GPU.

>The difference between the current RTX 2060 & the SUPER one will not only be the higher core count of 2176 CUDA cores; it is also rumored to feature 8 GB of GDDR6 VRAM along a 256-bit bus instead of 6 GB of GDDR6 VRAM along a 192-bit bus. This is possible with TU106 since the full GPU die does allow for a 256-bit wide bus interface and 36 SMs. The RTX 2060 SUPER still won’t utilize the full 36 SMs of the TU106 die, but rather two fewer SM units (34).
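
Quick napkin math in Python on what that bus change means, assuming the leaked 14/16 Gbps GDDR6 figures are accurate; bandwidth is just the per-pin data rate times the bus width:

# Rough GDDR6 bandwidth math for the leaked specs (unconfirmed figures)
def gddr6_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    # total bandwidth = per-pin data rate * bus width, Gb/s -> GB/s
    return data_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(14, 192))  # RTX 2060:       336.0 GB/s
print(gddr6_bandwidth_gbs(14, 256))  # RTX 2060 SUPER: 448.0 GB/s (if the leak holds)
print(gddr6_bandwidth_gbs(16, 256))  # RTX 2080 SUPER: 512.0 GB/s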

Attached: RTX Super.png (1084x592, 54K)


Man, I thought we'd get a Dragon Ball Super anime with ray tracing or some shit, fuck you.

So what happens to the 2080 TI

>Man, I thought we'd get a Dragon Ball Super anime with ray tracing or some shit, fuck you.
Honestly, that would be a more exciting use case for the RT cores in these cards than anything they've seen since launch.

>So what happens to the 2080 TI
Goes into Jensen's Hyperbolic Time Chamber to undergo more training and become the RTX 3080

Attached: 1469491086453.png (437x364, 277K)

>unironically thinking the jews will release a 2060 with 8GB
there's a higher chance of some meme format like 7GB
also, do you really think retailers will drop the price of the normal version? be ready to see a $40-50 increase.
RTX 2060 SUPER 7GB at $350 - screencap this

Attached: oliver_hardy.jpg (600x450, 26K)

rra.go.kr/ko/license/A_b_popup.do?app_no=201917210000116990
rra.go.kr/ko/license/A_b_popup.do?app_no=201917210000116975

only two SKUs have passed certification so far

Oh God, please, RTG, for once in your life don't completely suck; give us a decent architecture so Nvidia actually has to try to make decent cards again.

RTX Titan Super in SXM2 format when?

Attached: Tesla_V100_SXM2.jpg (3840x2022, 794K)

What is it with these names ffs.

No 2080 Ti super? Will the 2080 super beat the 2080 Ti?

I hope it doesn't have a start like Dragon Ball Super.
Those initial 20 episodes were quite a mess.

So a minor price decrease and a minor performance increase, who gives a shit?
Nvidia must be scared about Navi

en.wikipedia.org/wiki/GeForce_256

It's gonna launch to celebrate the 20th anniversary of the original GeForce from 1999

Attached: supa.jpg (872x426, 105K)

Tempted to power mod my 2080 Ti, but honestly, even when it underclocks to 17xx MHz core it doesn't break a sweat at 3K 100 Hz and sips under 300 W. It won't comfortably go over 2 GHz core at stock volts anyway.

I am very surprised we don't have 7nm GPUs from Nvidia yet.
Can't see the point of making the card use 300-500 W for sustained clocks for a 10% perf boost when the real limiting factor is still memory.
I'll get a 7nm Turing gen 2 next year or so, when they're cheap and have more VRAM.

Looks modular as fuck.
I wonder if it could use the super fast HBM as a cache and have slower GDDR6+ as a sort of GPU RAM, like CPUs have now for system RAM.
Like have 8GB of HBM2+ super fast memory for the shit that requires it, then 16GB+ for assets and shit that doesn't need RAM that fast.
Would be much quicker than waiting on DDR4/SSD speeds to fetch the same content, and would allow more dynamic memory management in tiers.
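
Purely hypothetical sketch of what that tiering could look like in Python; the capacities, names and placement policy are all made up, this isn't any real driver API:

# Toy allocator for the two-tier VRAM idea: small fast HBM pool, big GDDR6 pool.
def place_allocation(size_gb, bandwidth_hungry, hbm_free_gb):
    # Bandwidth-critical buffers (render targets, etc.) go to HBM if they fit;
    # bulk assets (textures, meshes) spill to the larger, slower GDDR6 pool.
    if bandwidth_hungry and size_gb <= hbm_free_gb:
        return "hbm"
    return "gddr6"

print(place_allocation(2, True, 8))   # "hbm"   - hot render target
print(place_allocation(6, False, 8))  # "gddr6" - texture streaming pool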

No, because the memory controller is in the HBM memory stacks, you're just wasting die space

Besides, with future HBM2 memory stacks, you can have 64GB cards

anandtech.com/show/14110/samsung-introduces-hbm2e-flashbolt-memory-16-gb-32-gbps

On the topic of RTX, what the fuck is up with Nvidia's drivers with Turing on Windows lately?
16GB at 400 GB/s+ sounds fat as fuck; I'll hopefully be able to flip my stock 2080 Ti and trade up next year.
If only there was a way to disable the faggot power limiter in software; apparently custom BIOSes don't always work, and there's some hardware limp-mode overvolt/amp protection built in when you shunt mod it.

Thank god, I almost bought a 2080 a few weeks ago. I can wait a few months.

I got a 2080ti cheaper than what 2080 high end bin models were going for here, over 1k monopoly money lmao

calm down Mr. ESL and type that again, but SLOWER

It remains like 30% faster than the new 2080 super

Unless this drops 2060 prices Poorfag me couldn't care less.

>no RTX on/off jokes.
I'm upset.

>8gb

I'll stick with my 11gb 2080ti, thanks.

AMD justed itself when they refused to license CUDA for pennies.
Nvidia could release cards that are 30% slower than AMD's and DL/simulation people would still buy Nvidia, because the cost of switching everything from CUDA to OpenCL (even with CUDA transcompilers, which suck) would be much larger than the lost potential gains from the performance difference.
Unless Nvidia pulls an Intel and stops innovating for nearly a decade, I don't see them losing their dominant position.
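
The switching-cost argument in numbers; every figure below is invented just to show the shape of the tradeoff:

# Toy break-even model for porting a CUDA codebase to OpenCL.
port_cost = 2_000_000        # assumed one-off cost to rewrite and revalidate everything
perf_advantage = 0.30        # hypothetical AMD speed advantage
yearly_gpu_spend = 500_000   # assumed annual hardware/compute budget

savings_per_year = yearly_gpu_spend * perf_advantage  # 150,000/yr
print(port_cost / savings_per_year)                   # ~13.3 years to break even

Under those made-up numbers the port never pays for itself within a hardware generation, which is exactly the lock-in being described.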

>I'll stick with my 11gb 2080ti, thanks.
>11gb
cope with your meme ram size

Attached: copingwithloss23889330.png (500x397, 83K)

RDNA is still GCN based, so no, it's gonna suck.
Post-GCN is next year and meant to compete with Turing.
11.3GB fag here: memory bandwidth and latency are a bigger issue than muh epic 16GB+ BS that no game makes use of at 4K res.
The Radeon VII is useless at games, barely faster than a 1080 Ti / 2080, yet has double the RAM and bandwidth and can't make use of it due to its shitty architecture that very few game engines can truly use properly.
12GB+ cards are workstation cards desu; games and modern vidya drivers use texture compression/streaming most of the time, hence why we've been stuck at 8GB for the past 5 years.

it's better having and not needing than needing and not having

Attached: 1412324895681.png (1334x750, 2.06M)

3072 CUDA cores compared to 2944. That's less than a 5% increase, not very SUPER.
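
Checking that claim:

print(3072 / 2944 - 1)  # 0.0435 -> about a 4.3% bump in CUDA cores over the RTX 2080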

No it's not, dickhead. Why have 16GB+ you're never going to use? It's not like system memory, where the OS tries to fill it all for max performance; most of the time games only use 2-10GB max even at stupid 5-8K+ resolutions, and even if you downsample from higher, memory compression kicks in and bandwidth becomes more of an issue, especially at 3-4K+.
No point having tonnes of RAM if the GPU and the memory itself are too slow to make use of it.
Some modern renderers like Vulkan/DX12 DXR have found a niche where ray tracing in some games eats RAM, but as I said, memory latency, bandwidth and speed are more of a bottleneck than raw capacity.

So it's a Ti rebrand?

Actually thought we were going to get a sick DBZ shroud or something.

Attached: 45645.jpg (800x450, 87K)

>So it's a Ti rebrand?

Hardly. For example, when the 1070 Ti released it was a good 15-20% stronger than the 1070, and benchmarks reflected it.

Unless I'm missing something, this looks to be more like the jump when Nvidia started putting GDDR5X RAM in the 1060 instead of GDDR5. Especially since core count and performance don't scale linearly, you're looking at maybe a 1-2 fps difference with these cards in most games at high resolutions.
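
Crude illustration of that non-linearity; the 0.7 exponent is a made-up stand-in for memory and driver bottlenecks, not a measured value:

# Perf rarely scales 1:1 with core count once memory/drivers bottleneck you.
def relative_perf(core_ratio, scaling_exponent=0.7):
    return core_ratio ** scaling_exponent

ratio = 3072 / 2944          # 2080 SUPER vs 2080 core counts from the leak
print(relative_perf(ratio))  # ~1.03: roughly +3% before any clock or memory changes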

>having to yell AHHHHHHHHHHHHHHHHHH at the card for as long as you can to get it to OC harder
Sounds kino.
I'm more excited for better memory speeds tho, like GDDR6X on new fabs.
It's like how 2080 Ti OCs do bugger all over 1800 MHz+; in reality you hit other bottlenecks in the arch/drivers or the memory. Maybe an extra 10% in-game perf boost if you're lucky enough to get the card stable at 2 GHz+ base under full load, but then it uses 300-500 W with the limits turned off.

>Besides, with future HBM2 memory stacks, you can have 64GB cards
HBM3 will be a thing by the time Nvidia dips SXM2's toes in the waters of enthusiast gaming hardware. Why would Nvidia use slower VRAM via HBM2, at barely more than 300 GB/s, when they could flex their nuts at AMD with 512+ GB/s VRAM?

Attached: Boltman4.jpg (350x350, 18K)
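
For reference, per-stack numbers are where that "barely more than 300 GB/s" figure comes from, and stacks add up (per-pin rates from the Samsung Flashbolt announcement linked above):

# Per-stack HBM bandwidth = per-pin rate * 1024-bit interface, Gb/s -> GB/s
def hbm_stack_bandwidth_gbs(pin_rate_gbps, width_bits=1024):
    return pin_rate_gbps * width_bits / 8

print(hbm_stack_bandwidth_gbs(2.4))      # HBM2:  307.2 GB/s per stack
print(hbm_stack_bandwidth_gbs(3.2))      # HBM2E: 409.6 GB/s per stack
print(4 * hbm_stack_bandwidth_gbs(3.2))  # four stacks: ~1.6 TB/s, far past 512 GB/s GDDR6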

This

my gpu has 1gb :^)
your point makes sense, but then they should charge me accordingly. for that price, a 2080ti should have 32gb

Attached: 1554859342204.png (848x728, 412K)

Ah yes, another dumb poorfag.
Fuck me, the last time I had a GPU with that much VRAM was a Fermi 660 I think.

>poorfag
I have a 560ti and it works well for what I do, I can play the games I like all maxed out.
no need to upgrade =/= poorfag, you delusional faggot

Whatever I just want a 2070 for less than $400

And my 2080 Ti runs 100fps / 10ms at 3K perfectly.
What's your point?
Calling bullshit on any game running maxed out at anything over 720p 60fps on that.
Stay on your pixelated-as-fuck, laggy, shit-frame-pacing screen with your meme card for all I care.

Get a 2060 SUPER and OC it.
The 2070 is a useless meme card; you might as well go get a used 1080 Ti.

You should have waited. Any card that cannot run @4K 144hz isn't a card worth getting.

Nice goal post move faggot I turn settings down for 3k 100fps minimum
5k is the new 4k 16:9 is fucking aids

>Calling bullshit on any game
>games I like
are you a fucking mongoloid? maybe the games I like don't need cinematic graphics @ 30fps with gaytracing, you delusional faggot. still, I'm not proving any point, I simply said that novideo charges too much for their cards, and they even doom their cards with meme RAM amounts. stop acting like a monkey, you bought a 2080ti, and?
when you win at the Special Olympics, you are still retarded.

>RDNA is still GCN based
It uses the GCN ISA, but it's a new architecture, retard.

Are you retarded?

Nvidia will never use expensive HBM for consumer GPUs, the economics simply will never work

HBM is solely for GV100 and successor GPUs, large FP64 and Deep Learning GPUs for servers, workstation Quadros and TITAN class cards

>Nvidia will never use expensive HBM for consumer GPUs, the economics simply will never work

That was the case for connecting to the internet, in the 80s.
That was the case for 3D rendering, in the 1970s.
That was the case for microprocessors, in the 1960s.
How else could you explain why I'm pic related?

Attached: 1513717284151.jpg (628x534, 75K)

>will wait until 7nm
- too expensive
- RTX is a meme
- still uses too much fucking power and produces too much heat
- no solid blower model technology upgrades
I'll wait. This is non-news. I'll stay on my 1070/1080s until 7nm is delivered with much lower power use and price. Until then, Nvidia can fuck off w/ their meme upgrades. Their sticker-shock pricing of RTX on release literally caused me to wake up. Fuck them. Have no need for this shit.

^This guy gets it.
15-20%... as if I need a performance boost. I am on a 60Hz IPS 27" which I absolutely love in terms of picture quality. I go into my games and limit FPS to 60, of course; saves my room from turning into an inferno, and power use drops by 60 watts. What the fuck do I need a 20 series GPU for, much less this meme 15-20% upgrade? My 1070 can already produce far more FPS than my monitor can handle. Give me 7nm with the performance of my 1070 at half the power use and you've got my money. I literally have no clue wtf all this performance bullshit is about. I'd much rather have better power efficiency at the same performance than these inferno-ass meme cards that burn 200+ watts, take up 3 goddamn PCIe slots, and dump all of the heat into my case because blowers aren't cool anymore. Fuck this meme performance trend.

Still not paying for the RTX meme no matter how they rebrand it. Not worth the money to beta test vaporware that's half a decade away from any meaningful implementation, or even from the power to actually make it useful. Anyone who isn't a zoomer should remember all the bold claims surrounding tessellation, how it was going to change everything and be so cool, and how Nvidia was pioneering it. There are too many suckers willing to give up too much money.