AMD wins

AMD wins

Other urls found in this thread:

youtu.be/OHw9cUHCOpg
reddit.com/r/PS5/comments/a1qmvt/info_on_the_dev_kit_for_ps5_from_an_indie/
youtube.com/watch?v=_mTIPXwcDGQ

the RAM is unified memory, so it's for CPU+GPU

>2TB HDD
They couldn't even put an SSD in it?

>20GB GDDR6
RTX 2060/GTX 1660 Ti users with only 6 GB of VRAM BTFO. Ports will stutter like no tomorrow.

probably using some flash for most used files, like hybrid SSD+HDD

cringey AMD fagboy memes aside, AMD's increased marketshare in the CPU space is well deserved, but their GPUs are still dogshit. The Radeon VII is more worth it as a workstation GPU than for gaming, so when are we getting actually competitive high-end gaming GPUs?

>what is NAVI

If those specs were even close to accurate (they're not), FIVE HUNDRED AND NINETY-NINE US DOLLARS would be the very least you could expect it to cost.

Kind of false when it comes to consoles, the PS3 reserved 2GB of Ram for the OS at all times. It's why Bethesda games ran so poorly on the system.

yeah yeah navi navi navi but WHEN

>Kind of false when it comes to consoles, the PS3
look up the specs for PS4 you retard. the PS4 has unified memory.

can you not read? 4GB DDR4 for OS, and 20GB GDDR6 for games

20 gb of GDDR6 is insane if true

not really, flash memory prices have been dropping a lot lately because of the chinese cracking down on price fixing

AMD is the only company desperate enough to sell hardware at the lowest profit margins known to man.

HDDs are still cheaper and in consoles cost-reduction is critical. A $300 console sells a lot better than a $350 console. You can always add an SSD later, Sony consoles have always allowed this degree of customization.
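The cost-reduction point is easy to put numbers on. A rough sketch, using illustrative 2019-ish per-TB prices that are assumptions, not real BOM figures:

```python
# Illustrative bulk prices per TB -- assumptions, not quotes from any supplier.
hdd_usd_per_tb = 25
ssd_usd_per_tb = 100
capacity_tb = 2

hdd_cost = hdd_usd_per_tb * capacity_tb
ssd_cost = ssd_usd_per_tb * capacity_tb
print(ssd_cost - hdd_cost)  # → 150 saved per unit by shipping an HDD
```

At console volumes, even a made-up gap like that is tens of millions of dollars per million units, which is why the HDD keeps winning.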

I'm running a 1080 with 8gb of GDDR5X and never hit max vram usage outside of modding, I'm just amazed at the thought of what would require 20 gb of GDDR6.

the ps3 doesn't even have 2gb ram

8 GB will barely be enough once the PS5 hits. We'll all be forced to upgrade, except those with a 1080 Ti, 2080 Ti or Radeon VII.

>I'm running a 1080 with 8gb of GDDR5X and never hit max vram
depends on the games you play. try 4K resident evil 2 remake and it won't be enough. plus, the PS5 is not coming out today to play yesterday's games on.

True if big
Seems like too much ram

Yeah, this sounds like some stupidly expensive bullshit. Sony does sell consoles at a loss, but not that much of a loss.
And still consoles won't be able into raytracing, lmao.

he meant ps4
sony doesn't allow the same customization the xbox one does because of that

>4GB DDR4 for OS
I refuse to believe that. Next gen will either use shared GDDR or just 2GB for the system. There's no need for more than 2GB on a basically embedded device that only plays games and does some multimedia stuff.

>20GB GDDR6
Keep dreaming. Next gen will be 1070 tier at best, no use for 20GB for video only.

So what is the target release date for this? late 2020?

If so that will still cost $550 minimum

>memetracing
Not even a (you)

>Next gen will be 1070 tier at best
the XBOX ONE X already matches the 1070

No one really sells consoles at a loss anymore
It may have been a case during the early PS3 days but not anymore
There is a reason the PS4 was so damn anemic that it needed new mid-generation "pro" hardware: they decided they wanted to make some money and used readily available AMD CPUs that were dogshit even for the time but cost almost nothing compared to the Cell CPU of last generation

bullshit, do you see how PUBG runs on Xbone X? It's nowhere near 1070 level.

NOOOOOO DELID THIS THIS CAN'T BE TRUE I SPENT $1000 ON MY PC

normies don't care about waiting forever for their games to start but they DO care about not running out of space, and games are like 100GiB easily now because bitches can't into opus and other lossy but good-sounding formats
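For scale, a back-of-the-envelope comparison of uncompressed PCM audio against an Opus-style lossy bitrate. The duration and bitrate are assumptions picked for illustration, not figures from any real game:

```python
# Uncompressed stereo PCM vs. a lossy codec bitrate -- all numbers assumed.
def pcm_bytes(seconds, sample_rate=48_000, channels=2, bytes_per_sample=2):
    # raw PCM: every sample stored in full
    return seconds * sample_rate * channels * bytes_per_sample

def opus_bytes(seconds, bitrate_kbps=96):
    # lossy codec: size is just bitrate times duration
    return seconds * bitrate_kbps * 1000 // 8

hours = 10  # assume ~10 hours of dialogue/music in a big game
pcm_gib = pcm_bytes(hours * 3600) / 2**30
opus_gib = opus_bytes(hours * 3600) / 2**30
print(f"PCM: {pcm_gib:.1f} GiB, Opus @96kbps: {opus_gib:.1f} GiB")  # → PCM: 6.4 GiB, Opus @96kbps: 0.4 GiB
```

Roughly a 15x difference on the audio alone, which is why shipping uncompressed audio bloats installs so badly.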

Are you retarded? PS3 had 256MB of XDR for the system and 256MB GDDR3. X360 had a dedicated 10MB eDRAM for the GPU and 512MB GDDR3 that was shared with the system. The reason Bethesda games ran so bad was because they were developed with the Xbox in mind and were later ported to PS3 with its alien shit that was Cell.

This is completely plausible if it's coming out late 2020.

Ram prices will be low as fuck by then and you need that much ram for true 4k considering they want at least a 5 year life cycle

Nvidiot COPE

Sure thing lmao
youtu.be/OHw9cUHCOpg

also the formula 1 games run like shit on ps3

joke of a console desu

Is this the best you can do?

>2020
>console games will still target resolution and graphics instead of 60fps
4k gaming at 30fps

This is definitely going to be a 450 min console at launch

you fucking retard you do realize it's coming out in almost 2 years?

So it will have 1080ti performance levels almost 4 years after the 1080ti launched. How is that so unbelievable? It would be embarrassing if they had anything less.

You Nvidiots have become so used to Nvidia sandbagging GPU tech over the past 3 years that you think it's normal for hardware to never advance.

Games 8 years from now, user

Sony doesn't need to wait for RAM prices to get low for them to build a system like this, just the massive order of RAM they will want will drive prices low for them automatically

Meh, it was complex but quite powerful. Just look at Sony exclusives, they blow everything from that gen out of the water. Most of the time games ran badly because they were developed with Xbox in mind, due to the fact it was easier: it had DirectX and simpler hardware. See Bayonetta for example, the PS3 version is probably the worst port it ever got.

>Sony exclusives, they blow everything from that gen out of the water
Like what? That generation was trash in terms of games. The thing literally had Demon's Souls and a bunch of terrible halo knockoffs and other moviegames. How do you even function with so much shit between your ears?

fucking retard

I might be wrong on this, but I believe the main reason might not even be that they can't into such codecs because they're dumb, but rather the licensing issues those codecs would bring upon them.

A bunch of bullshit, and yet you forgot that the PS4 came out with a 1050 Ti equivalent. The 1070 will be midrange by 2021, so yep, it'll come with a 1070 equivalent at best, if not something equivalent to an RX 590. But you can keep believing, just don't be disappointed when your favorite gaming toy comes out different from what you expected tho.

i remember people saying the same about the ps4's 8gb when it came out too

Found the cowadooty kid. I was clearly talking about graphics, but since you mentioned it, Sony exclusives were far better than anything released on the Xbox. I guess you are american, because only they seem unable to admit that, since they love their cowadooty machine with a passion.

>reddit.com/r/PS5/comments/a1qmvt/info_on_the_dev_kit_for_ps5_from_an_indie/
>The PS5 comes equipped with 16GB of GDDR6 Ram, and it will come equipped with an 8 Core Ryzen CPU and a Navi Based GPU. Sony really wants to put in 32GB of GDDR6 Ram but due to the PSVR 2 being bundled with every PS5 they have to make sure that the system will be $400 + the PSVR2 which will retail for $100 and in total the PS5 will be $500.
Oh fucking shit.

can consoles even 4k at 30 fps?

M
A
M
A

L
I
S
A

>can consoles even 4k at 30 fps?
>However the 16GB of ram will NOT be used in games, however the ram will be divided up into two sections so for example 8GB will be for the Games and 8GB alone will be used for the OS itself. So far the games that I have seen have been running at 4k/30FPS which is not a way to start off the 9th gen, but it is a significant upgrade from the PS4. Anthem is having difficulties running at this current state, but BioWare is determined to put the game on PS4 and Xbox One and the Next Gen Systems. The PS4 version runs at 720p/30fps while the PS5 version runs at 4k/60fps so yes some games will run at a constant 4k/60fps but the majority of the games from what I have seen have been running at 4K/30FPS.

Current consoles already do. Some less demanding games run at 60. I think red dead 2 ran at 4k60fps but I'm not sure.

>gay rambling
>imagine still shilling for the PS3 after all these years

>gaytracing

Imagine thinking like a fat cowadooty kid after all these years lmao

AMDs GPU department consists of a bunch of retards. Remember vega frontier edition? Probably not, because it's a regular RX with a blue case around the heatsink, for a nice 400 bucks more than the regular vega 64.

Imagine how much it would have sold if they added SRIOV. Or the pro drivers. Or anything that would have made it worth buying. It would have sold out within minutes, but instead we get nothing. Because these people don't know how to run a business and gain marketshare against nvidia.

>consoles won't be able into raytracing
It's not like Nvidia GPUs can actually do raytracing either.

>it's a regular RX with a blue case around the heatsink, for a nice 400 bucks more than the regular vega 64.
It also has 8GB more HBM2, which I'm pretty sure accounts for nearly all of that $400 upcharge. People forget that it had double the memory.
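If you attribute the whole upcharge to the extra memory, which is a simplification and not a BOM leak, the implied price per GB comes out like this:

```python
# Assumption: the entire Frontier Edition upcharge is the extra HBM2.
extra_hbm2_gb = 8    # FE's 16GB vs Vega 64's 8GB
upcharge_usd = 400   # the price gap cited above
print(upcharge_usd / extra_hbm2_gb)  # → 50.0 USD per GB, if memory were the whole story
```

$50/GB is on the high side for HBM2 but not absurd once you count the interposer and packaging, so the "it's mostly the memory" reading is at least plausible.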

Probably because it fucking sucked.

I don't get why AMD keeps shoving expensive memory on their GPUs, it doesn't help them at all. Jesus, just use GDDR and sell it for way less than the competition.

Supposedly, that's basically what Navi is. GCN taken to its limits with cheap standard memory. Maybe after Navi Wang can give us something that's not a fucking GCN revision.

Probably the only redeeming quality of it. Can't believe they have their heads so far up their asses, they don't understand that they could BTFO nvidia with a single GPU line up if they just dropped all the enterprise exclusive shit at a reasonable price on their hardware. Fucking kikes, both nvidia and AMD as far as GPUs are concerned.

MUH GAYTRACING
fuck off rtx retard

My hope is that, now that Zen is a hit in basically every segment, they can stop starving RTG and let the engineers have a budget and a real driver team to work with.

>20gb ddr6
>ryzen 2 desktop cpu
>navi dedicated gpu
You have to be fucking retarded to believe this

No, they upscale to it. The PS4 Pro is about as powerful as a 580, and optimization isn't magic

they are

Red Dead 2 is probably one of the most demanding games, dummy. Yes, the Xbone X can run it at native 4K 30fps, meanwhile the PS4 Pro runs at 1920x2160 with checkerboard memeing. No big game runs at native 4K 60fps though, they simply can't handle it.
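The pixel math behind that checkerboard figure, as a quick sketch:

```python
# Checkerboard rendering shades half the samples of the target resolution
# and reconstructs the rest, hence the PS4 Pro's 1920x2160 "4K" mode.
native_4k = 3840 * 2160   # 8,294,400 pixels
checker   = 1920 * 2160   # 4,147,200 pixels actually shaded
print(checker / native_4k)  # → 0.5
```

So the Pro is only paying for half the shading work of real 4K, which is exactly why the output needs reconstruction to pass as native.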

GB of ram

Automatic bullshit detector triggered. 16GB combined main and GPU memory, sure; but this is nonsense.

Yes. One X.
youtube.com/watch?v=_mTIPXwcDGQ

You really think they made a choice because it was 'the best'? They went for the lowest bidder. AMD cpus are cheaper, not better. More money on cpus = less profit per ps4. A company that takes the best, most expensive solution doesn't last long.

The CPU is literally 6x faster.

It's like having 8 Jaguar cores at ~6GHz, plus multithreading
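A rough sanity check on that claim, where every number is an assumption rather than a benchmark:

```python
# All figures assumed for illustration: Zen's real uplift over Jaguar varies by workload.
jaguar_clock = 1.6   # GHz, the PS4's Jaguar cores
zen_clock = 3.2      # GHz, a plausible console-binned Zen clock (assumed)
ipc_ratio = 2.0      # Zen vs Jaguar per-clock throughput, roughly (assumed)
smt_bonus = 1.25     # SMT throughput uplift that Jaguar doesn't have (assumed)

speedup = (zen_clock / jaguar_clock) * ipc_ratio * smt_bonus
print(speedup)  # → 5.0, in the same ballpark as the "6x" claim
```

Under these assumed numbers the "6x faster" figure isn't crazy, it's just the optimistic end of clock x IPC x SMT.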

Spent $1200 on a PC with a ryzen 1600 and a gtx 1080 this year. Fuck these companies.

>6TB SSD
And this nigga thought everyone would believe him?

Why would it be non-sense?
Just because the RAM is not a power of 2?

The GPU team will always be a second fiddle to CPU one because GPUs are not important.

Probably the most reasonable specs I've seen for a leak of the PS5.

I could easily see it, especially if they are actually trying to implement PS4-1 game backward compatibility in the 5 like it's been reported. A dedicated 4GB DDR4 stick is more than enough bumping room to work with on a console for the OS specifically.

It's 24 segmented as 20+4, but that's still bullshit.
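A 20GB pool doesn't need any power-of-2 gymnastics anyway: GDDR6 capacity is just chip count times density. A hypothetical layout that lands on exactly 20GB:

```python
# Hypothetical GDDR6 configuration -- not a leaked board layout.
chips = 10             # ten GDDR6 packages
chip_gbit = 16         # 16Gb (2GB) per chip, a standard GDDR6 density
bus_bits = chips * 32  # each GDDR6 chip exposes a 32-bit interface
total_gb = chips * chip_gbit // 8
print(bus_bits, total_gb)  # → 320 20
```

A 320-bit bus with ten chips is a perfectly ordinary design, so "not a power of 2" by itself proves nothing about the rumor either way.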

This HDD is too small for the size of the games you would have with 20GB of memory, taking into account "modern development".
Games would easily reach 300-500GB a pop.

>playing video games
you all lose.

Yeah i bet you watch netflix and cable like an adult

You sound like an islamic terrorist with this.

>licensing problems
>opus
/v/tards need to leave

14 teraflops in a fucking console
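For reference, FP32 TFLOPS is just shader count times two FMA ops times clock. A hypothetical configuration that lands near 14, with made-up but plausible numbers:

```python
def tflops(shaders, clock_ghz):
    # FP32: each shader retires one fused multiply-add (2 FLOPs) per clock
    return shaders * 2 * clock_ghz / 1000

# Hypothetical 56 CU part (56 x 64 = 3584 shaders) at an assumed 2 GHz.
print(tflops(3584, 2.0))  # → 14.336
```

So 14 TFLOPS implies either a very wide GPU or very high clocks for a console power budget, which is why people are skeptical.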

>>ryzen 2 desktop cpu
>>navi dedicated gpu
>You have to be fucking retarded to believe this

OP's rumor is complete horseshit, but I believe the discrete Ryzen part. Maybe not full 8c and definitely lower clocked than the desktop parts, but consoles are the perfect dumping grounds for not-quite-perfect parts out of the fab. In a couple of years when AMD's 7nm yields are in the high 90s, PS5+/XB2.5 half-gen bumps can be rolled out with 8c@4GHz affordably.
But we'll probably find out one way or another next week if any of this is vaguely on target.

i wasn't, since that same 8gb was used for the cpu as well. it was the equivalent if you like of having 4gb of system ram and 4gb vram.

it's true that the 1070 is faster than the x, but i think it runs badly mostly due to the much slower cpu and also bad programming

They need the memory bandwidth because their GPU core design is lagging behind in performance. This is how they make up for it. For example, the VII only matches the 1080 Ti even though it's on a much smaller node and has insanely higher memory bandwidth.

It has higher memory b/w because it's an HPC die.
These all have insane memory b/w.
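That bandwidth falls straight out of the bus math. Using Radeon VII's advertised figures as the example:

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    # peak bandwidth = bus width in bits x per-pin data rate / 8 bits per byte
    return bus_bits * gbps_per_pin / 8

# Radeon VII: 4096-bit HBM2 bus at 2.0 Gbps per pin.
print(bandwidth_gbs(4096, 2.0))  # → 1024.0 GB/s, the advertised ~1 TB/s
```

The 1 TB/s comes almost entirely from the 4096-bit HBM2 bus width, not from exotic per-pin speeds, which supports the "it's an HPC die" reading.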

You're retarded if you think "core performance" is some sort of linear metric.
GCN is a compute beast, but it skimps massively on fixed-function geometry/rasterization units compared to Nvidia.
Which architecture is "better" basically comes down to how much a given piece of software focuses on geometry density vs. pixel shading.

>shitposting
>having fun at all
You already lost.

>what is a midrange GPU

Literally who cares about anything not high end

>AMDs GPU department consists of a bunch of retards
Now employed by Intel.

nobody said that. everyone here is aware that gpus have different designs with different numbers of functional units arranged in different structures. it doesn't have anything to do with what i was saying earlier, which is that amd uses the expensive hbm memory because they have to. they would make versions with the cheaper gddr and lower their prices or make better margins if they could. even in compute the vii is a lot slower than the 2080 ti, which is still 12nm. turing is so much better than gcn that it didn't even need a die shrink.

>rumor
I thought Jow Forums was supposed to be the smart part of Jow Forums?

>which is that amd uses the expensive hbm memory because they have to.
They don't.
Vega20 is a HPC die so 1TB/s is just there for a good measure.
>even in compute the vii is a lot slower than the 2080 ti, which is still 12nm
Define compute.

Jow Forums is one of the dumbest boards of this shithole. There's no real technology discussion here, just consumerism and shitposting.

20gb gddr5
so that's how they want to boost the ram price again

Resistance 1 was a console launch game and it had 64-player multiplayer, which was not matched until the next gen