2018

>2018
>game devs still using d3d11
>still making games dependent on CPU's single core performance
>vulkan is supported by every major engine
>actually makes it possible for people to run their games at 4k 60fps
Why are devs not using Vulkan API?
Are they lazy, or do they get good money from hardware companies to gimp performance so people buy the latest hardware every year?

Attached: lazy_coding.jpg (743x413, 57K)

Other urls found in this thread:

warosu.org/g/thread/S67994869

Attached: 1539838217980.png (328x408, 83K)

Because (((Nvidia)))

Programming related question, you better stick to your sqt faggot.

>Why are devs not using Vulkan API?
Because it's more difficult to use than DX11. Same goes for DX12. More difficult API means they have to put more money into development, and they don't want to do that.
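To give a feel for the gap: in D3D11 a single D3D11CreateDeviceAndSwapChain call gets you most of the way to drawing, while in Vulkan even creating an instance means filling out structs by hand, before you've touched a device, swapchain, or pipeline. A minimal sketch, assuming the Vulkan SDK is installed (the app name is just a placeholder):

[code]
// Minimal sketch: just creating a Vulkan instance, nothing rendered yet.
// Assumes the Vulkan SDK headers/loader are installed; "hello" is a placeholder name.
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "hello";
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
        std::printf("vkCreateInstance failed\n");
        return 1;
    }
    // A real renderer still needs a physical device, logical device, queues,
    // swapchain, render pass, pipeline, command buffers and explicit sync.
    vkDestroyInstance(instance, nullptr);
    return 0;
}
[/code]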

Stop playing shit, several games I've played recently have been Vulkan or DX12 with decent multicore optimization.
Most new games are very multicore optimized.

Please refer to said thread to learn more: warosu.org/g/thread/S67994869

Attached: 01af5849159c45c607468d73f3f193e4.jpg (564x564, 30K)

Everyone is so used to DirectX that it's baby duck syndrome.

Vulkan and DX12 have only a handful of games.
DX12 isn't even well optimised by most devs nowadays, just look at BF5.

>Games

>OP is talking about programming
>tells him to go to /v/
This board has been plagued by shitposters like you.

>Video game programming has nothing to do with games
The absolute state

Attached: 1530705629602.jpg (645x729, 67K)

Why can't /v/irgins just stay on their own board? Nobody wants your toys here, fuck off. Programming video games is about the lowest point a programmer can sink to. You have to be borderline retarded to do it.

Attached: t..jpg (355x236, 40K)

>still making games dependent on CPU's single core performance
Questionable, any decent engine does use multithreading for physics, loading assets, pathfinding, and with the ECS meme, updating entities too (sketched below).
>vulkan is supported by every major engine
Graphics drivers have something else to say about that. I've seen at least a couple of people showing that Vulkan is way slower than OpenGL, though this depends on the platform and the driver. Still, it should be a major consideration for any indie or company that wants to support a wider range of graphics cards.
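On the multithreading point above, the entity-update side at least is easy to spread across cores once the data is laid out ECS-style. A rough C++17 sketch, assuming parallel algorithms are available (the Entity fields are made up for illustration; real engines usually run this through a job system):

[code]
// Rough sketch of a data-oriented, multithreaded entity update (C++17).
// The Entity layout is hypothetical; a real engine would split components apart.
#include <algorithm>
#include <execution>
#include <vector>

struct Entity { float x = 0, y = 0, vx = 1, vy = 0; };

void update(std::vector<Entity>& entities, float dt) {
    // std::execution::par lets the runtime spread the loop across cores.
    std::for_each(std::execution::par, entities.begin(), entities.end(),
                  [dt](Entity& e) {
                      e.x += e.vx * dt;
                      e.y += e.vy * dt;
                  });
}

int main() {
    std::vector<Entity> entities(100000);
    update(entities, 1.0f / 60.0f);
    return 0;
}
[/code]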

>vulkan is supported by every major engine
>actually makes it possible for people to run their games at 4k 60fps
holy shit you are retarded

btw DX11 is here to stay as an easy option

Didn't the new hitman game use dx11 when the previous one was dx12?

...

>Are they lazy or get good money from hardware companies
Both. Once game engines start supporting it you'll see more programs using it. Android in particular already can leverage Vulkan to render the UI. On phones Vulkan has worse performance though. Give it time. Not even dx12 has taken off yet.

...

same person

thread is obviously about programming. there is no way /v/ermin are able to understand graphics APIs, all they do is drag&drop assets in their shitty Unity engine and share it in the /agdg/ aka /lgbt/ thread.

So? It's still video games. Just because you don't like /v/ doesn't mean you can post off topic threads here.

nope, it's strictly programming and you are the only brainlet in this thread.

This is the bare minimum for a programming thread and you're asking why the industry does this shit over that shit. Now fuck off back to

>industry
exactly, it's the game PROGRAMMING industry. game engine programming is among the hardest tasks in programming.

Because all GPU APIs are terrible, buggy garbage. By using newer stuff you're pretty much making a program that only properly works on your PC. Older stuff is more reliable but still not perfect. That's why Minecraft was using, like, OpenGL 1.0 or something. That's why Godot is actually DOWNGRADING from OpenGL ES 3.0 to 2.0 in the next version. Only big companies can afford testing new APIs on all those GPUs.

>nope, it's strictly programming
And? Everything computers do involves programming. Doesn't mean toy making doesn't belong on your toy board, /v/irgin.

git gud

Attached: 1532155976816.png (1294x214, 16K)

back to your battlestation threads please

I have /bst/ filtered for a reason, sorry

/bst/ is full of gamer wangtards who belong on /v/.

>gayme programming is hard therefore muh thread is /g/ related
Yeah nah. Go back to your children's toys

>game engine programming is among hardest tasks in programming.
Only non-programmers think stupid shit like this.

Attached: 1541715664154.jpg (300x311, 14K)

Seething boomer afraid that people would judge him for playing video games.

Newfag, this has been happening since the very beginning of DirectX in the 90s.
Each new DirectX took about 5-10 years to catch on.
I only got heavily into PC gaming around the late 90s when DX6 was a thing, but DX8 held on until 2003, then DX9 took off and endures to this day.
DX10.x was a stopgap flop that sort of got use in 2006-2010 but it was literally useless.
DX11 from 2009 till now.
DX12/Vulkan from 2015 till?
As I said, devs on PC are incredibly slow to adapt to this shit. It's gonna take another decade for them to get to grips with what we have right now, and then there's the DXR shit thrown into the mix this year that only 3 Nvidia cards even support.

Attached: 1510101698118.png (1296x1458, 212K)

Why do gamers think everyone wants to play with their toys? You're so immature.

>i le posted le epic maymay again

>le

>>>/4channel/

nvidia

>still making games dependent on CPU's single core performance
... AC:Odyssey maxes out an i5-8400 100% on all 6 cores on a fucking GTX 1060.
That's not single core bound at all.

But yes, I agree that still using Dx11 is stupid. That's Nvidia's influence because it took them TEN YEARS to finally release a card which properly supports Dx12/Vulkan. They held Dx12 back for even longer than the insane time they held Dx9 back. Or was it some other Dx before? idk I don't remember.

>AC:Odyssey maxes out an i5-8400 100% on all 6 cores
The only reason is Denuvo tho

Do we know for sure? What does per core load look like with cracked AC:Odyssey?

It's the same because Denuvo is still present and can't be removed. Denuvo cracks simply trick the DRM into launching the game, but everything after that is similar to its "official" counterpart. So Denuvo still runs and eats half your CPU.

Because it offers no major advantages for Nvidia GPUs, so game devs don't care

Attached: LzkPyIN.jpg (334x500, 50K)

>maxes out a CPU that's less than a year old
Even a 6600k can't get a consistent 60fps.

Yeah, the i5-8400 and 7600k both don't get 60fps minimum in it.
An i5-6500 only gets 30fps averages, with minimums dipping down to 10fps. It's so bad.

If you have decently fast RAM, the 2600X or an OC'd 2600 are the only CPUs that can just barely manage 60fps minimums without spending a lot more on a 2700X or 8700.

Like obviously 4 threads hasn't been enough for years, but this game is stupid with how much CPU it wastes on DRM. It's almost like it's crypto mining.

Attached: dont listen to 8400 shills.jpg (2874x1449, 611K)

GPUs and their drivers are broken garbage held together by duct tape.
Not kidding.
A high level API usually at least works mostly right. Low-Level APIs like Vulkan and D3D12 expose just how terrible the situation is and force you to write workarounds for everything.

Real shit.
Is it possible to program a game to be aware of how many cores the cpu has and make use of them?

Attached: 1507420289212.png (128x128, 18K)

You can only make turn based games with one thread.
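More seriously, yes: you can query the logical core count at runtime and spawn that many workers. A minimal C++ sketch (hardware_concurrency() may legally return 0, hence the fallback; the per-thread work here is a placeholder):

[code]
// Query the core count and spin up one worker per logical core.
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1; // the standard allows 0 when the count is unknown

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < cores; ++i) {
        workers.emplace_back([i] {
            // placeholder work: a real game would run physics/audio/asset jobs here
            std::printf("worker %u running\n", i);
        });
    }
    for (auto& t : workers) t.join();
    return 0;
}
[/code]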

Only doom and wolf2 are real vulkan games. Other games, like Talos, are using shitty wrappers over dx11 calls and run worse.

Switching from Vulkan to DX12 made Xbox 360 emulation significantly easier

>It's almost like it's crypto mining.

I am going to bet that half of the gaming companies are putting cryptominers in their games.

>borderline braindead /g/ scum is actually acting elitist towards gaymers
this is just pathetic

You don't sound happy. Have you tried installing Gentoo?

Is it realistic to make a game from scratch only using the CPU? Or is that retarded?

Yes and so are you

>Is it realistic to make a game from scratch only using the CPU?
Many great games don't require dedicated GPU. Not sure how well toddler engines like Unity handle CPU rendering though.

>Are they lazy
It's this.
Vulkan and DX12 are really good but require a lot more effort to accomplish the same as DX11.
If you can just copypaste your existing engine then why bother?

But can you make a very high-fidelity game without utilizing the GPU? Like, could you make RDR2 without GPU use for rendering (but instead use it for shaders and anti-aliasing)?

No that's totally fine when you're not doing 3D. You'll want some GPU help for output and scaling, but that's simple to tack on.

It's called software rendering. On something like epyc it might even be viable.

No. Try downloading Half Life 1 and running it in software rendering mode.

You'll get 1998 graphics at 15 fps on a modern machine when the GPU is not helping.

>Is it realistic to make a game from scratch only using the CPU?
What do you think this means? Because I don't think you know.
>Many great games don't require dedicated GPU.
Not requiring a dedicated GPU doesn't mean it doesn't do the rendering on a GPU; it does it on the IGP instead, which is the same thing but weaker.
>But can you make a very high fidelity game without utilizing the gpu? Like could you make RDR2 without gpu use for rendering (but instead use it for shaders and anti aliasing)
You would need an extremely powerful IGP, and even then you would need to optimize the game to use both GPUs for different things, which generally doesn't work so well; just look at how shit SLI is, it has practically been deprecated.
If you actually mean CPU rendering, that would be impossible, since CPUs can barely handle modern games even while letting the dedicated GPU do all the legwork. That is why hardware acceleration in video cards was invented; I can only imagine a mature quantum CPU being capable of handling a modern engine with software rendering.

>You'll get 1998 graphics at 15 fps on a modern machine when the GPU is not helping.
It will run at 500fps if you don't lock the framerate, but it will look like shit because it wasn't programmed to display the same effects as it does when using hardware acceleration through an API.

What if you aren't using polygons? You just have geometry on disk (spheres, prisms, different shapes assembled to make game objects) and cast rays on the CPU to see what they run into.
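That's basically classic CPU ray tracing against analytic shapes, and the per-shape test is cheap. A sketch of the core ray-vs-sphere test, assuming the ray direction is normalized (Vec3 and the helpers here are made up for illustration):

[code]
// Sketch of a CPU ray-vs-sphere test; a scene would be a list of such shapes.
#include <cmath>
#include <cstdio>
#include <optional>

struct Vec3 { double x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns the distance along the ray to the nearest hit, if any.
// dir is assumed to be normalized.
std::optional<double> raySphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
    Vec3 oc = origin - center;
    double b = dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - c;              // discriminant of the quadratic
    if (disc < 0.0) return std::nullopt;  // ray misses the sphere entirely
    double t = -b - std::sqrt(disc);      // nearer of the two roots
    if (t < 0.0) return std::nullopt;     // sphere is behind the ray origin
    return t;
}

int main() {
    auto hit = raySphere({0, 0, 0}, {0, 0, 1}, {0, 0, 5}, 1.0);
    if (hit) std::printf("hit at t = %f\n", *hit); // expected: t = 4
    return 0;
}
[/code]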

Fake news

well, you can't really render anything with a modern graphics API without the use of shaders; they are part of the rendering pipeline.

Just because you're an autistic virgin who thinks copy/pasting commands from a wiki into the terminal is programming doesn't mean that the rest of us shouldn't be able to post about graphics APIs, you fucking faggot.

Attached: Untitled.png (2560x1440, 541K)

Couldn't you use shaders after the geometry of the frame is rendered?

>don't make your own XXXX

but they're all horrible and i make everything myself thank you very much [spoiler]you stupid nigger[/spoiler]

I'm guessing you made a plugin to display spoilers on /g/

>winblows
>Qt creator
top fucking wew

congrats on following the vulkan tutorial up to the point where you render a static triangle
what have you learned so far

no, since the shaders (vertex & fragment) play a key role in the geometry getting displayed to begin with.
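For reference, something like this is about the smallest vertex/fragment pair a core-profile GL pipeline will accept; without both stages compiled and linked there is nothing for the rasterizer to do. Sketch only, assuming a GL 3.3 context and an extension loader (GLEW/glad) are already set up, with error checking omitted:

[code]
// Minimal vertex + fragment pair plus a compile helper.
// Assumes an existing GL 3.3 core context with a loaded extension loader.
#include <GL/glew.h> // or whatever loader your project already uses

const char* vsSrc = R"(#version 330 core
layout(location = 0) in vec3 pos;
void main() { gl_Position = vec4(pos, 1.0); }
)";

const char* fsSrc = R"(#version 330 core
out vec4 color;
void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }
)";

GLuint compile(GLenum type, const char* src) {
    GLuint s = glCreateShader(type);     // GL_VERTEX_SHADER or GL_FRAGMENT_SHADER
    glShaderSource(s, 1, &src, nullptr);
    glCompileShader(s);
    return s;
}

GLuint makeProgram() {
    GLuint prog = glCreateProgram();
    glAttachShader(prog, compile(GL_VERTEX_SHADER, vsSrc));
    glAttachShader(prog, compile(GL_FRAGMENT_SHADER, fsSrc));
    glLinkProgram(prog);
    return prog;                         // bind with glUseProgram before drawing
}
[/code]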

Ubisoft engines are notorious for having insane draw calls while looking like shit visually. Having 100% utilization doesn't mean anything in the context of that picture. You could have 250,000 draw calls when the scene really needs only like 80,000 and the rest are a fucking waste, because they develop for consoles and optimizing for PC is a complete afterthought to them.

Attached: 20181121_091901.png (379x673, 116K)

DX9 living for so long was Microsoft and Sony's fault, not Nvidia. DX11 not being replaced by DX12/Vulkan is however 100% Nvidia's fault.

>What if you aren't using polygons?
Isn't that what has always been used? GPUs process polygons way faster than any CPU could.
For a simple example, go back to a 3D game from the 90s that supports both software rendering and hardware acceleration: using the CPU is much slower and looks terrible, because it can only process basic polygons at a reasonable speed. It doesn't even emulate the special effects in software, because the CPU is already completely busy doing the 3D polygon rendering.
Let's say you reprogram Quake 2's software renderer to max out an 8700K; maybe you could somehow make it look as good as GLQuake 2, but you will not get even 2 fps if you try to do the same with Quake Champions.

You know what else you can try? Run Project 64 with a simple game like Super Mario 64 at 1080p. First try something as shitty as the Direct3D6 plugin: you will get 60fps locked without a stutter. Now change the rendering to "RGB Emulation" (software rendering) and make sure to go fullscreen, and you'll notice the game becomes almost unplayable. I could not get even 10fps on an old Ivy 3470; if you have a more modern CPU you can give it a try, but I think it proves the point of how inefficient it is to make the CPU do all the calculations the GPU normally makes. In this case the CPU not only renders the polygons but also emulates in software all the special effects the N64's dedicated GPU does, and we are talking about a 22 year old game made for a 22 year old system! Even current hardware can't handle, or struggles to handle, something like that in software.

No there was another DX, either 11 or 9, where Nvidia heavily held it back by flooding the market with shitty rebrands which were far behind on API support

I don't think it's dx9.. that would have meant that geforce 4 series were rebranded as 5 series. But no, the 5200 supported dx9.

Ahh right it was all the 8800 rebrands and 200 series rebrands that kept us on dx10 for fucking ever, wasn't it?
Nvidia actually entirely skipped dx11 and then said they supported dx12 with Fermi, the 400 series, but they really only finally properly supported DX12 with Volta and Turing 8-fucking-years later.
I didn't realize it was THAT bad.

>DX9 living for so long was Microsoft and Sony's fault
Why Microsoft's fault? Because of the X360? It supported DX10, unlike the PS3, which used an older version of OpenGL without unified shaders because they used a shitty GPU based on G70.

Attached: back_to_v_KID.jpg (945x645, 121K)

Wow, that is fucking retarded. Isn't the whole point of the franchise that it began during the first game, around the time of the Crusades and then following the bloodlines down through history? Having a game set in roman/greek times just ruins it.

Why did you delete your post to repost this image faggot?

It's called voxels. Software rendering still applies.

Sorry I think I meant to say after the geometry was calculated
What I'm saying is what if you JUST raytrace? No polygons, all you do is raytrace and calculate if it hits your mathematically defined geometry

Would it be possible to run a separate CPU to do this, as we usually do with GPUs?

>Why are devs not...

Attached: mhh_I_wonder_what_changed.png (780x660, 800K)