Sony Working on Dynamic Loading For PS5 To Eliminate Load Times

respawnfirst.com/sony-renews-dynamically-game-loading-patent-ps5/

Attached: ps4-pro-yoshida-future-update-not-easy.jpg (1280x720, 49K)

Fuck up muzzie faggot
Nobody gives a fuck about 7-year-old patents and Basedny discovering ssds exist 2 decades late

Why do cartridges (or cards now I guess) have load times nowadays?

Reading the patent, does that not already exist? Doesn't sound any different from LODs.

That's almost entirely going to be driven by the PCIe 4.0 SSD. The technology after the fact might be more interesting but will not matter nearly as much.
Internal storage has a speed limit; if you're pulling too much data in too short a time, you have to wait because of the physical limits
I believe NES-era cartridges were handled similarly to system RAM, meaning they could be used directly.

transferring cartridge data to ram

Lol they're coming up with this just now? What were they doing in the meantime? Jacking off?

Consoles outside Nintendo really only exist for plebs, let's be honest

Why was microsoft also making a big deal out of the fact they were going to use ssds in their project scarlett as if it was some new technology

it is new for consoles, plus they are apparently making custom ssds that are said to be faster than any regular nvme ssd you can buy for your desktop/laptop

we're at a point where loading shouldn't even be a factor, especially with shit like nvme ssds and 64 pci express lanes on consumer hardware. even a fraction of that should nullify loading things. that being said, how the fuck is realtime full-on raytracing still taking so long? 5 raycasts per pixel isn't enough, you need like 100 times that to make a cinema grade cg shot. will it be scalable? i heard that AMD is planning to just go the cloud rendering route.
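
Rough napkin math on that ray budget, a sketch assuming 4K60 and the ~500 rays per pixel the post implies (100x its "5 raycasts per pixel"); all numbers are illustrative, nothing official:
[code]
/* Back-of-the-envelope ray budget for "cinema grade" realtime RT at 4K60. */
#include <stdio.h>

int main(void) {
    const double pixels = 3840.0 * 2160.0;  /* 4K frame */
    const double fps = 60.0;
    const double rays_per_pixel = 500.0;    /* ~100x the post's 5 per pixel */
    double rays_per_second = pixels * fps * rays_per_pixel;
    printf("%.2e rays/s needed\n", rays_per_second);  /* ~2.5e11 */
    /* For scale, Nvidia markets the 2080 Ti at roughly 1e10 rays/s,
       so this target sits ~25x beyond it before any shading is counted. */
    return 0;
}
[/code]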

Go back to /v/eddit.

wake me up when consoles will get reshade

*blocks your sorry ass*

Attached: Google-Stadia-Controller-noscale[1].jpg (1920x1021, 83K)

Go back to your parent bonding walking simulator, sonygger

*lags*

They'll just be pcie4 ssd at best, consoles are just vaguely custom PCs nowadays

That's because of PCIe 4.0, I doubt they will be any faster for any reason but that, unless they make a new architecture like Intel did with Optane.

>Youtube quality colors
I want my Google Plus back.

Attached: absodis.jpg (469x728, 65K)

bus bandwidth is not a bottleneck for solid state storage.

Isn't this entirely a software thing?
Games like Halo 3 did dynamic loading just fine, it has nothing to do with any features of the console.
More RAM to play with can help, but if one section of a level ends up taking 100% of your RAM budget anyway, which it fucking will, then you're fucked.
"Well, just SLICE the levels!"
Congrats. Now one slice takes up all your ram. Same shit.
If devs wanted to do it, they could, but they're pressed for resources enough as is.
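
To make the "slicing"/streaming idea above concrete, here's a toy sketch of the usual approach: keep only the chunks near the player resident and never blow the RAM budget. Chunk sizes, the budget, and the chunk layout are all made up for illustration, not anything from the patent or any real engine:
[code]
/* Toy world streamer: load chunks near the player, evict the rest,
 * stay under a fixed RAM budget. All numbers are invented. */
#include <stdio.h>
#include <stdlib.h>

#define CHUNKS        64
#define CHUNK_BYTES   (64u * 1024u * 1024u)   /* 64 MiB per chunk */
#define RAM_BUDGET    (512u * 1024u * 1024u)  /* 512 MiB for world data */
#define STREAM_RADIUS 2                       /* keep chunks within 2 of the player */

static void *resident[CHUNKS];                /* NULL = not loaded */

static void stream_world(int player_chunk) {
    size_t used = 0;
    /* Evict chunks that fell out of range. */
    for (int i = 0; i < CHUNKS; i++) {
        if (resident[i] && abs(i - player_chunk) > STREAM_RADIUS) {
            free(resident[i]);
            resident[i] = NULL;
        }
    }
    /* Load in-range chunks while the budget allows; malloc stands in for a disk read. */
    for (int i = 0; i < CHUNKS; i++) {
        if (abs(i - player_chunk) <= STREAM_RADIUS) {
            if (!resident[i] && used + CHUNK_BYTES <= RAM_BUDGET)
                resident[i] = malloc(CHUNK_BYTES);
            if (resident[i])
                used += CHUNK_BYTES;
        }
    }
    printf("player at chunk %d: %zu MiB of world data resident\n",
           player_chunk, used / (1024 * 1024));
}

int main(void) {
    for (int pos = 0; pos < 8; pos++)   /* simulate the player walking forward */
        stream_world(pos);
    for (int i = 0; i < CHUNKS; i++)    /* cleanup */
        free(resident[i]);
    return 0;
}
[/code]
The point stands either way: if a single "slice" alone eats the whole budget, no streaming scheme saves you.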

With old consoles, the ROM cartridge would be connected directly to the processor in the same way RAM was, and could be accessed the exact same way, it just couldn't be written to.
Because of this, if you wanted to draw a sprite on screen, all you had to do was point to its address in memory and it'd already be there. All you used the RAM for was storing things like player position and score, the rest was just baked in ROM. You never copied anything from the ROM to RAM, because you didn't have to.
Now, it's just external storage with a game on it. Wanna draw it on screen? You copy it from the storage to RAM, then draw it.
This is where load times come from
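
A crude C illustration of the difference described above; the "ROM" is just a const array standing in for memory-mapped cartridge space, and the (hypothetical) sprite.bin file stands in for disc/HDD/SSD storage:
[code]
/* Cartridge era: ROM lives in the CPU's address space, so pointing at the
 * data is enough. Disc/HDD/SSD era: the data must be copied into RAM first,
 * and that copy is where load times come from. */
#include <stdio.h>

static const unsigned char cartridge_rom[] = {
    0x3C, 0x42, 0x81, 0x81, 0x42, 0x3C   /* a tiny "sprite" baked into ROM */
};

int main(void) {
    /* ROM: no load step, just point at it. */
    const unsigned char *sprite = cartridge_rom;
    printf("first ROM byte: 0x%02X (no copy needed)\n", sprite[0]);

    /* External storage: read the same data into RAM before it's usable. */
    unsigned char ram_buffer[sizeof cartridge_rom];
    FILE *f = fopen("sprite.bin", "rb");   /* hypothetical game data file */
    if (f) {
        fread(ram_buffer, 1, sizeof ram_buffer, f);
        fclose(f);
        printf("first RAM byte after copy: 0x%02X\n", ram_buffer[0]);
    }
    return 0;
}
[/code]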

it's just a buzzword. many games have had background loading as well as data streaming.

>plays at 8-bit 4:2:0.
haha

The 970 pro can saturate a PCIe 3.0 M.2 slot, so you're not entirely correct. It depends on what implementation they go for.
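
Rough numbers behind that claim (PCIe 3.0 encoding overhead and the 970 Pro's rated sequential read; nothing exotic):
[code]
/* PCIe 3.0 is 8 GT/s per lane with 128b/130b encoding, so an x4 M.2 slot
 * tops out just under 4 GB/s, which a 970 Pro (~3.5 GB/s rated seq read)
 * gets close to saturating. */
#include <stdio.h>

int main(void) {
    double bytes_per_lane = 8.0e9 * (128.0 / 130.0) / 8.0;  /* ~0.985 GB/s */
    printf("PCIe 3.0 x4 ceiling: %.2f GB/s\n", 4.0 * bytes_per_lane / 1e9);
    printf("970 Pro rated seq read: ~3.5 GB/s\n");
    return 0;
}
[/code]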

Nobody wants this, loading times on PS4 suck

You cannot combine navi+zen2+pcie4+nvme in a box costing $500.
They will just use regular m.2 sata interface and optimise loading times.

Yeah, I said pcie4 at best because there's no way they'll ever design a custom storage solution.
They may use some sort of ultra budget QLC NVME since they can get really low prices due to the volume, but sata seems more logical for the price

there never was and never will be anything better than the dualshock™

We'll have to see, but I think a PCIe x2 SSD implementation is possible.

xbox one controllers are better than the dual shock 4.

The only cartridges out there are used by the Nintendo Switch, or the 3DS. They're not much different from an SD card with a changed pinout. You still have some read only memory, a NAND controller, and the ultimate transfer rate is going to be limited by the controller. They aren't trying to make these things have the best possible specs, they're making them as cheap as possible.
You could make a cartridge that has sustained 500 MB/s reads, or more, if you wanted to. Nobody apparently wants that right now.

ps vita uses cartridges. it's so close to being an sd card that you can just use a micro sd card as a game card with a simple adapter.

I always forget that the Vita existed. Sony has a fairly long history with shitty memory standards going back to the old Memory Stick Pro Duo. I remember getting ripped off buying those for the original PSP. Multiple times the price of a normal SD card, same or worse performance.

PC Master Race
console peasants literally kys

Attached: System.png (1631x931, 181K)

they also have their own vita cards that are ridiculously expensive, but i'm talking about the actual game cards. it's a very neat way to add 200gb of storage for games for very cheap.

time to update

>imagine using wendies

>Yuliya

So, girl power: the pc?

yes ;)

Yes, I like dicks ;)

Not at all, unless they plan on releasing a $600 console.
Especially with the current tariffs, which are going to increase console prices by upwards of 20%.
The most likely scenario is we get one of two things:
>$450-$500 consoles with sub-RX 5700 GPUs, 8-core Ryzen chips clocked at 2.1 GHz, a 500 GB SATA SSD (these are the same companies that used 5400 RPM HDDs not only in the original XB1/PS4, but also in the XB1X/PS4 Pro), and 12 GB of RAM
>$700-$900 consoles with RX 5700 equivalents, 8-core Ryzen chips with custom cooling clocked at 2.5-3 GHz, a 1 TB NVME SSD, and 16 GB of RAM
The former is much more realistic, and is pretty in line with what we got this gen (even with the upgraded consoles).

That's a liberal estimate of the tariffs.
They're releasing late next year, and selling for a loss. We also don't have a great idea of Zen 2's efficiency curve, so estimating clocks isn't useful.

Selling at a $100 loss is a lot different from selling at a $400 loss.
It's not going to happen, there is 0 reason to release a $400 console with $900 in parts, especially when pretty much all platform holders are heavily pushing streaming, which can be done on a $150 notebook.

I don't think the part cost adds up to nearly that much when you're buying bulk prices.

Then you don't know how wholesale works.
Just like OEMs buy wholesale parts in bulk and build their computers around that, they still mark up prices (even Microsoft and Sony do this) to make a profit.
With a console the pricepoint is key because they're aimed at children and poor people, so they have to target a maximum price, and it makes 0 sense to lose $300+ per unit when they could easily just make the unit weaker and lose less than $50, or possibly even make a profit per unit sold (like what Nintendo does).
Sony already learned how much this can fuck you over back in 2006.
Much like with the PS4, XB1, PS4 Pro, and XB1X, we're going to be seeing largely mediocre low-mid range boxes with mobile parts.

Every open world game has dynamic loading, how the fuck are they supposed to patent that.

> how the fuck is realtime full-on raytracing still taking so long? 5 raycasts per pixel isn't enough, you need like 100 times that to make a cinema grade cg shot. will it be scalable? i heard that AMD is planning to just go the cloud rendering route.
We've reached the logical limit of silicone for the next decade so gpus will go mcm very soon just like amd CPUs did
Nvidia will make 7nm turing mcm next year or so and amd will follow up with their own new post gcn rdna arch with native rt dxr hardware support around early 202x

I got a 2080ti and no way in fuck will single core gpus ever be able to handle rtx stuff beyond 30fps 1440p unless they go that mcm route

It was a neat experiment but it was just way too early.

I fully expect a Ps5 pro and xbox two x to have new gpus, updated specs, and more ram in 2025 or before

As the current leaked specs put them at no better than a 2080 + 3600 which would be painfully average in mid 2020s trying to push 4k 60hz as a baseline all the way up to 8k/244hz + (((real time ray tracing)))

how many ps5 are you guys preordering?

>sell cheap consoles to goyim
>mine crypto when they are not in use
What could go wrong?

who did this?

Just a suggestion.

>silicone
>thinking FinFETs are the density/power limit for any substrate
>blindly regurgitating things you understand absolutely nothing about

TSMC rushed their 7nm node to market to mop up all the customers, and they ate the cost in the hopes of accelerating ramp up of EUV. Samsung elected to hold off until their EUV lines were ready. Global Foundries chose to simply not bring their 7nm FinFET to market at all as TSMC had taken the majority of the market, and it wouldn't be worth the investment with such a long time required to see a return on investment.

As non EUV 7nm requires insane amounts of multipatterning its yields are terrible, and this is a fact the industry was well aware of years ago. AMD elected to pursue MCMs to work around this. No one was certain whether or not EUV would actually pan out in volume scale. Fortunately it has, and EUV is a massive boon to yields while lowering costs by eliminating tons of exposure steps. So we're now only waiting for 7nm EUV to hit full volume production with mainstream parts. TSMC's next node will have even more EUV inclusion which will again dramatically improve the cost and yield situation. 5nm is their last FinFET node, after that they go to a GAA topology. GAA is one of the biggest things to ever happen since the advent of the transistor itself. Drive voltages will drop through the floor, density is going to skyrocket, dynamic power will be absurd, frequencies as well will shoot up dramatically.
To think that 7nm non EUV is somehow the finite limit of performance and xtor density is just painfully retarded and speaks volumes to your overwhelming Dunning-Kruger ignorance. Nigger.

Attached: 18.jpg (720x405, 108K)
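
For what it's worth, the yield argument above can be sketched with the standard first-order Poisson model Y = exp(-D*A); the defect densities below are made-up placeholders, the only point being that anything which raises the effective defect rate (like heavy multipatterning) hits dies hard, and cutting exposure steps claws that back:
[code]
/* First-order Poisson yield model, Y = exp(-D * A). Defect densities here
 * are hypothetical; they only illustrate the direction of the effect. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double die_area_cm2   = 0.8;   /* ~80 mm^2 die, e.g. a small chiplet */
    double d_multipattern = 0.5;   /* defects/cm^2, heavy DUV multipatterning (made up) */
    double d_euv          = 0.2;   /* defects/cm^2, fewer exposure steps (made up) */
    printf("yield, multipatterned: %.0f%%\n", 100.0 * exp(-d_multipattern * die_area_cm2));
    printf("yield, EUV:            %.0f%%\n", 100.0 * exp(-d_euv * die_area_cm2));
    return 0;
}
[/code]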

>As the current leaked specs put them at no better than a 2080 + 3600
Closer to a 2060 + 1600 honestly.
They don't have the die space to fit a 2070 competitor and a 2600 on the same chip, let alone a 2080 and a 3600.

Unlike CPUs GPUs can scale forever, literally just add more cores. I'm surprised AMD introduced CPU chiplets before GPU chiplets.

Costs also sky-rocket on 5nm using all your memes
Fuck off dickhead don't put words in my post then lie
I'm starting to think the 15tflop gpu is a load of shit Tbqh
Because of their dogshit gcn pipeline + more bottlenecks than a foreskin

>I'm starting to think the 15tflop gpu is a load of shit
>starting
Wow, really?
What tipped you off?
Was it the insane clockspeeds that would be required to hit that level of compute performance on Navi or the ludicrous amount of compute units that would be required?
Or maybe the 300-450W for the GPU alone that would be required?

A GPU can theoretically scale to N cores, but creating an architecture which has a front end capable of scaling out, and ROPs capable of scaling out are entirely different matters.
You could conceivably just have a front end/command processor on one die, and CU/SMs on separate dies, and build an MCM that way.

Without the architecture to address the underlying issues it's not worth pursuing. CPUs are the big business for AMD so it makes perfect sense for them to bank on MCMs there first.

Wafer costs increase, retard, design costs increase. Projected yields will increase enough to make it bearable to foundry clients. EUV processes will be net cheaper per die than non EUV single digit FinFET.
Dunning-Kruger Nigger.

I haven't bought a gen 8 console and I'm glad I didn't, but every time, normalfags fall for this marketing shit.

The xbox one x was the last decent console to come out and this Ps5 and xbox two will be insanely overpriced (1k+ I reckon)
The fuck are you even on about

Lay off the wafer fumes mate speak English

Steam controllers exist

how are you going to do it? it's impossible on all current consoles, and from recent memory only the 3ds and the ps3 would have been possible to do it on. unless you make some weird mod chip.

modern consoles can do it but not stealthily

you missed out bro, ps4 was the best console ive ever owned. except the controller. most comfy thing ever but the face buttons and analogue sticks are cursed.

>p-please stop calling me out for being a buzzword spewing retard
No. Retard.

Dumb little faggots have been on this board for a decade whining about shit you have no background in to speak on. We're not hitting on physical limits on scaling in xtor density, power, or performance in any metric.
7nm EUV
5nm EUV
3nm GAA
1nm GAA

Conventional CMOS scaling will continue well into the late 2020s, so sitting there opining about how we're hitting the limits of silicon substrates now makes you an enormous man-child idiot. Nigger.

I meant the manufacturers themselves doing this.

PS4 had a pretty awful games output though, same with Xbone (though obviously much worse for the latter).
I think there are maybe 10 exclusives worth owning on PS4, everything else is better on PC (and exclusives are the only reason to waste time with a console).

who cares, it's the same vain garbage made for dumb wagecattle.

But you'll be able to run it in 4K 60FPS!

> 8-bit 4:2:0, 3Mbps, with the encode needing to be done in real time
> worse quality than 480p
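
Taking those figures at face value (the post's 3 Mbps, assuming a 1080p60 stream), the bits-per-pixel comes out tiny, which puts the "worse than 480p" comparison in context:
[code]
/* Bits-per-pixel sanity check on the quoted stream specs. */
#include <stdio.h>

int main(void) {
    double bitrate_bps    = 3.0e6;                    /* from the post */
    double pixels_per_sec = 1920.0 * 1080.0 * 60.0;   /* assumed 1080p60 */
    printf("%.3f bits per pixel\n", bitrate_bps / pixels_per_sec);  /* ~0.024 */
    return 0;
}
[/code]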

We probably would have seen 5GHz easily on Zen v2 if it used GlobalFoundries 7nm.

See, that wasn't so hard to make a constructive logical post, was it? Silly based user
Wrong, xbox one had better games + controller and cross compatibility with pc

Wait is this real? Why the fuck are they recommending 25-50mbit+ per session then? Fuck that my rgb 8bit va is better

>wrong
right

yeah, it was super easy to hack the ps4 though, so you can play a lot of those games for free. plus it has decent loonux support with 3d hardware acceleration so you can literally have it as a desktop replacement if you want. same to some extent with the switch

> implying they'll actually be able to sustain 25-50Mbps

Don't mind me, ISP's. I'll just casually eat up like a terabyte of bandwidth per day per customer. I'm sure nothing will choke.
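
Napkin math on that, using the 25-50 Mbps recommendation mentioned upthread; a literal terabyte a day would actually take roughly a 90-100 Mbps stream running around the clock, but the per-customer volumes are still huge:
[code]
/* Data volume of sustained game streaming at the recommended bitrates. */
#include <stdio.h>

int main(void) {
    double mbps[] = { 25.0, 50.0 };
    for (int i = 0; i < 2; i++) {
        double gb_per_hour = mbps[i] * 1e6 * 3600.0 / 8.0 / 1e9;
        printf("%2.0f Mbps: %.1f GB/hour, ~%.0f GB per day if streamed 24/7\n",
               mbps[i], gb_per_hour, gb_per_hour * 24.0);
    }
    return 0;
}
[/code]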

It's definitely possible, and as AMD was GloFo's only large scale prospective customer, it's likely that their claim of targeting 5ghz operation was entirely aimed at just that. TSMC will probably hit it. AMD is already hitting 4.7ghz for their XFR state on 1-2 cores. That extra 300mhz can be a long way off in terms of required voltages, but building up inventory of dies as the months tick by will do wonders for facilitating a more premium SKU. Maybe we'll see a 3950X Black Edition that hits 4.9 or 5ghz for XFR.

This.
If normies think Netflix congestion is bad at 3-11pm on weekends, plus home wifi slowdown, imagine how bad stadia will be. On top of that it's also very latency sensitive, and I doubt most home routers will automatically give priority to stadia, which is another whole series of headaches

It's OnLive and Nvidia Now all over again.

Even if every device had dedicated 10gbit there would still be latency issues.
No way in fuck it will get down to 1-10ms
It's reported it's around 10x that so around 10-100ms+
Frame pacing would be absolutely arse unless the latency was perfectly even all the time with little deviation.

All the ai predictive shit couldn't fix that and you'd need a server every 100km/mile radius so latency wouldn't go through the roof
This comes to mind
youtu.be/ZcP9jiCbakw
Can't wait to shop stadias logo in there and meme it fuck google

But normies don't care, they'll put up with disconnects, horrible delay, and insane amounts of macroblocking if it means they can "play" games on their phone or their shitty $150 notebook.

xbox controller is a pos with bad qc

i have the steam controller, it's nice for certain games, but it's not too comf, especially the super annoying bumpers

My 6-year-old one works great but I didn't mistreat it
Ages just as well as my wired x360 one if not better
youtu.be/ak-yN-8K2cU
I doubt they'll put up with constant disconnects and weird issues
Stadia and more importantly home internet isn't really ready.
It can barely handle a family on normal net browsing fb + streaming and yt ffs

I don't give a damn.

Go the fuck back to your children's board /v/eddit

Attached: image.jpg (540x729, 198K)

like 8% of them had bluetooth issues, and the elite controller as well is a pos that rubber-bands the analogue input

Meh I use my one wired and never had a single issue with it
I can't stand wireless and would have Ethernet plugged into my phone via poe if I could

still doesn't excuse the abhorrent qc. i'd bet nearly all xbros use their controller wirelessly.

>trusting sony

Attached: sony_arrogance.png (1570x1260, 279K)

>being mad that PR people make outrageous claims to generate clickbait headlines
Welcome to the world, user. Enjoy your stay.

the ps3 was a glorious piece of work though, but as a console it was just ok. i don't even disagree with them on any of those points

>consoles

Attached: 1454461648100.gif (480x270, 637K)