Raytrace don't matter

Attached: jacques-defontaine-women-wip-07.jpg (1280x1112, 291K)

Other urls found in this thread:

twitter.com/TKIO_BE
blog.insightdatascience.com/generating-custom-photo-realistic-faces-using-ai-d170b1b59255
youtube.com/watch?v=X0ROD3tZHso
youtube.com/watch?v=G06dEcZ-QTg
patents.google.com/patent/US20140340412A1/en
dspace5.zcu.cz/bitstream/11025/6987/1/Warburton.pdf
anandtech.com/show/13522/896-xeon-cores-in-one-pc-microsofts-new-x86-datacenter-class-machines-running-windows

The eyes give it away

It sure doesn't matter when turning it on drops the framerate to 15 at 1080p.

Something doesn't quite look right. The uncanny valley hasn't been escaped yet.

I don't believe this is a rendering desu. Looks totally real.

twitter.com/TKIO_BE
Freelance artist did it.

You don't expect the 2080 to run this at realtime, do you?

Maybe in a rasterizer implementation, but not with raytracing.

>dead eyes
I hope they never figure it out. Not being able to tell who is AI and who isn't will be too mind-fucky

>Not being able to tell who is AI and who isn't will be too mind-fucky

blog.insightdatascience.com/generating-custom-photo-realistic-faces-using-ai-d170b1b59255

Can't wait for raytraced nudes.

That's not realtime

>The uncanny valley hasn't been escaped yet.
eh, good enough for VR fappenings..

Attached: 114354521.jpg (486x609, 39K)

What are you guys talking about? I had no reason to doubt it being a hooman

They're just relying on hindsight.

>kayser-fleischer rings
Someone needs to inform this woman that she has Wilson's disease.

Attached: 1522862932698.png (700x195, 162K)

>Raytrace don't matter

Attached: Screenshot_82.jpg (600x600, 32K)

Noooo bros, everybody repooort report him nooo

>rendered on a workstation with an Intel CPU and an AMD GPU

Attached: 1528330324916.png (1053x1080, 265K)

>dem eyes

Attached: 1537317122070.jpg (1200x675, 74K)

Awww shit, the more I stare the more nauseous and afraid I become. Almost as if my brain is looking at a recently deceased person...

Attached: 1540488989102.jpg (537x810, 44K)

it's just you bro. get help

How much computing power would it take to render an 8K VR game with this quality?

Are you a grave digger? That's the only way I can see people looking at her for more than a millisecond.

Attached: 1538186589645.png (1280x826, 1.02M)

Too much, nvidia rtx dogshit gaytracing only does 1-3 SPP at 1080p and then uses a faggot denoiser filter to make it barely better than vortex global illumination.

no I'm scared of dead people

*voxel

Attached: wagon.jpg (1920x1080, 187K)

Bullshit. I don’t see any ‘uncanny’ at all

This level of rendering would probably take a couple of minutes per frame on a GPU.

At 240p, maybe. It looks like at least 1,000 SPP ray tracing.
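
Back-of-envelope on the SPP gap (toy numbers, not any real card's spec; path-tracing noise falls roughly as 1/sqrt(SPP), and the SPP and bounce counts below are purely illustrative):

# Toy samples-per-pixel arithmetic: nothing here is a benchmark, just the
# raw ray counts implied by resolution x SPP x ray depth.
def rays_per_frame(width, height, spp, bounces):
    # one primary ray per sample plus one secondary ray per bounce
    return width * height * spp * (1 + bounces)

realtime = rays_per_frame(1920, 1080, spp=2, bounces=2)      # RTX-style low SPP + denoise
offline  = rays_per_frame(1920, 1080, spp=1000, bounces=2)   # "looks like the OP pic" territory
print(f"{realtime / 1e6:.0f}M vs {offline / 1e9:.1f}B rays per frame "
      f"({offline / realtime:.0f}x more work)")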

Keep staring at the eyes, you'll get an automatic innate revulsion. We're hardwired to avoid the dead.

i thought it was just the weird pose but yeah, i wonder why the repulsion when you look in the eyes
what is wrong with it

It mainly has to do with disease avoidance: people who don't look like everybody else trigger a subconscious fear that they may be carrying a nasty bug/genetic illness. Like the other anon said, it already has diseased features.

youtube.com/watch?v=X0ROD3tZHso

Future is coming

Her eyebags were clearly not big enough, so I fixed them

Attached: themeyesyo.jpg (1280x1112, 542K)

Aye. It's really close. Really, really close.

But the skin is too perfect, the eyes just a little too perfect (eyeballs aren't perfectly smooth), and the area around her eyes is a little off. Same with the eyebrows, they look drawn on rather than like actual eyebrows - they're too shiny.

I'm not saying this as "oh look at how discerning and refined I am", it is seriously fucking close to photoreal. If it were well animated, it would be practically indistinguishable. It really seems more like a modeling issue than a problem with the rendering.

Ray tracing is not a meme when it's for creators but it's definitely a meme for gamers. Gamers need it to happen in real time which cannot be done. Content creators can wait a moment. Nvidia is just testing if they can make gamers fund all this by buying something they don't need. Unfortunately for Nvidia only the very dumb ones fell for it.

>Gamers need it to happen in real time which cannot be done
It can be done, we just don't have powerful enough hardware and efficient enough algorithms yet.

If Nvidia really shills this shit and big developers actually get on board with it seriously, I figure we'll have at least 30fps 1080p raytracing on a "typical" consumer PC within the next 5-7 years.

I was talking about the current generation. Games turn unplayable if ray tracing is switched on with current generation GPUs. That's why that feature is just an extra tax. You can't use it and still they're selling it to you. 5-7 years might fix this but by then the current generation is obsolete. I didn't mean that ray tracing cannot be useful ever.

>I was talking about the current generation.
I apologize.

Attached: 1528684243810.jpg (500x341, 35K)

>efficient enough algorithms
the algorithms are pretty much fixed
all you need is better hardware
like 10 times better hardware

What we need is to hack the planet

Fixed.
I knew something was bugging me

Attached: Capture+_2018-10-27-07-49-42.png (720x633, 765K)

Seems more useful to film makers. You could easily use dead actors to fill roles with ray tracing if you could get your hands on enough of their voice samples.

By the time this can actually be used in a game, I will be an old fart and we'll all be using some newfangled graphene computers running a Linux based OS, with Windows as the DE.

It looks fake because every single thing about it is fake. There is no way to fool the human mind without doing a magic trick

all 3D in films is ray traced and has been since forever

youtube.com/watch?v=G06dEcZ-QTg

The hero we needed
Based

I think hybrid approaches are perfectly doable in games where it makes sense.

OK I don't know if you guys are just messing around, but I have been looking at the eyes for about 30 seconds and haven't noticed anything weird.

Am I just autistic?

You are never going to get this level of quality with a raster pipeline, ever.

Nvidia's denoiser is actually quite good, it pains me to say. It is a tool you can use in production.

Very true. It is very upsetting to see how Nvidia's horrible GeForce RTX launch has tainted the perception of raytracing amongst gaming-focused PC users. Real-time raytracing is a development we should all be cheering, but no, Nvidia had to go and fuck it all up.

>implying this is a playable render in a game engine real-time
Give me enough time and I can render that on my ThinkPad. Even when we get up to almost-perfect realism with face rendering, animation will give it away as fake.

>But the skin is too perfect

That's called good skin care (and skin genes).

Reminder that if OP image looks unnatural to you, you literally have autism.

>It is very upsetting to me to see how Nvidia's horrible Geforce RTX launch has tainted the perception of raytracing amongst the gaming focused users of PC's.
who cares? user perception doesn't matter, results matter. If raytracing cards were worth it people would change their minds in an instant, but they aren't

Alternative algorithms may be better in some situations. I hope someone gives voxel cone tracing a try. Animation is a problem, but for a constant camera resolution you can cone trace in O(1) (regardless of geometry detail).
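
If anyone wants to see what that claim means concretely, here's a toy cone trace through a prefiltered voxel grid (plain numpy sketch, not any engine's implementation): the cone samples coarser mip levels as its footprint widens, so the cost per cone depends on grid resolution and trace distance, not on how detailed the geometry that got voxelized was.

import numpy as np

def build_mips(grid):
    # Prefilter the occupancy grid into a mip chain by 2x2x2 averaging.
    mips = [grid]
    while mips[-1].shape[0] > 1:
        g = mips[-1]
        n = g.shape[0] // 2
        mips.append(g[:2*n, :2*n, :2*n].reshape(n, 2, n, 2, n, 2).mean(axis=(1, 3, 5)))
    return mips

def cone_trace_occlusion(mips, origin, direction, half_angle, max_dist):
    # March one cone front-to-back; sample a coarser mip as the footprint widens.
    direction = direction / np.linalg.norm(direction)
    occlusion, t = 0.0, 1.0
    while t < max_dist and occlusion < 0.99:
        radius = max(1.0, t * np.tan(half_angle))            # footprint in mip-0 voxels
        level = min(int(np.log2(radius)), len(mips) - 1)
        p = origin + t * direction
        idx = np.clip((p / 2 ** level).astype(int), 0, mips[level].shape[0] - 1)
        occlusion += (1.0 - occlusion) * mips[level][tuple(idx)]   # front-to-back blend
        t += radius                                                # step grows with the cone
    return min(occlusion, 1.0)

# Example: a 64^3 grid with a solid block sitting in the cone's path.
grid = np.zeros((64, 64, 64))
grid[24:40, 24:40, 24:40] = 1.0
print(cone_trace_occlusion(build_mips(grid),
                           origin=np.array([4.0, 32.0, 32.0]),
                           direction=np.array([1.0, 0.0, 0.0]),
                           half_angle=np.radians(15.0), max_dist=60.0))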

>we can push the absolute limits of a technology in a synthetic demo
>this is meaningful to real-world applications
Okay.

voxels are shit because they take too much memory
much like raytracing takes too much processing power
although raycasting might be feasible soon, voxels won't

Several papers say voxel cone tracing is slow because of the BHV structure on GPUs. Turing implements hardware for BHV, and several people want to use BHV to improve physics, simulations, or other effects. Maybe Nvidia plans voxel cone tracing for Turing or next-gen.

The mouth is what gives it away for me, the corners are too smooth.

do you mean BVH? bounding volume hierarchies? all 3D applications use those already

Yes, BVH traversal used to be emulated in GPU code, but Turing has its own special hardware accelerator for it.

patents.google.com/patent/US20140340412A1/en

I searched through Nvidia's patents and found dozens of raytracing hardware patents going back to 2008.
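
For reference, what that hardware accelerates is essentially BVH traversal: walking a tree of axis-aligned boxes and testing the ray against each one. The core slab test is simple; a toy software version (nothing to do with the actual RT-core implementation) looks like this:

def ray_aabb_hit(orig, inv_dir, box_min, box_max):
    # Classic slab test: intersect the ray with each pair of axis-aligned planes
    # and keep the overlapping interval. inv_dir is 1/direction, with a huge
    # stand-in value on axes where the direction component is zero.
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - orig[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - orig[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# Example: ray from the origin along +x against a unit box spanning x in [2, 3].
print(ray_aabb_hit((0.0, 0.5, 0.5), (1.0, 1e30, 1e30), (2.0, 0.0, 0.0), (3.0, 1.0, 1.0)))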

Why is Jow Forums so obsessed over some garden variety shitty computer generated 3D, instead of Jow Forums's (literally posted today, probably won't ever be seen on here again) computer-generated-uncensored-2D-PoC ()?

Is Jow Forums really a 3D board?

Attached: image.jpg (636x357, 25K)

jeff looks really good

Attached: jeffbezos.png (1736x525, 1002K)

>Tranny Snake
MGS after CoC.

For me the description gave it away

I mean, it literally says "Raytrace […]" in the subject field. How have you guys not noticed that first? How am I the first one pointing it out?

Attached: 1529430946323.jpg (244x289, 5K)

>I figure we'll have at least 30fps 1080p raytracing on a "typical" consumer PC within the next 5-7 years
So Nvidia have officially taken the throne of Just Wait™ technology?

Attached: planet of the laughs.gif (404x416, 204K)

People didn't think anyone would be stupid enough to need that pointed out. Good job.

That's because you're horribly afraid of women. Doubly so if they look like your mother.

That is also really cool, anon. It's practically magic

>tripfag biting bait so obvious that it isn't even bait but just a fucking obvious joke
Brain damage.

Snek... SNEK .... SNEEEEEEEEEEK

>raster pipeline

yes you can, just not in a reasonable amount of time
hell, even standard ray tracing needs an insane amount of time, since intersecting 1,000-2,000 rays per triangle (a very basic number for RT work) means a couple of days to render it
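
And the thing being repeated billions of times in that workload is the ray-triangle test itself. A toy Möller-Trumbore sketch (no BVH, no SIMD, purely illustrative) for anyone who hasn't seen one:

def ray_triangle_intersect(orig, dirn, v0, v1, v2, eps=1e-8):
    # Möller-Trumbore ray/triangle intersection on 3-tuples.
    # Returns the hit distance t along the ray, or None on a miss.
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(dirn, e2)
    det = dot(e1, p)
    if abs(det) < eps:          # ray is parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    t_vec = sub(orig, v0)
    u = dot(t_vec, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = cross(t_vec, e1)
    v = dot(dirn, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det
    return t if t > eps else None

# Example: a ray down the z axis hitting a triangle in the z = 5 plane.
print(ray_triangle_intersect((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                             (-1.0, -1.0, 5.0), (2.0, -1.0, 5.0), (-1.0, 2.0, 5.0)))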

That's an incredibly photo-realistic image.
Hey guys is it just me or does animation have a lot of catching up to do?
The actual rendering is phenomenal, but animation doesn't capture a lot of micro-movements I think.
Do you think more advanced animation models or deep machine learning networks will solve that?

do we have photo realistic ray traced pussies yet? asking for a friend obviously

realistic animation means realistic physics
it means huge-scale particle simulations instead of the hollow shell meshes we have today
AI isn't going to help with that, the only thing that will is a lot of computing power
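
To put a number on "huge-scale": even the dumbest per-particle update looks like the toy integrator below (semi-implicit Euler, illustrative only), and a real cloth/flesh solver runs something far heavier than this over millions of particles plus constraints, hundreds of times per second.

def step(positions, velocities, dt=1.0 / 240.0, gravity=(0.0, -9.81, 0.0)):
    # One semi-implicit Euler step per particle, with a crude ground bounce.
    new_p, new_v = [], []
    for (px, py, pz), (vx, vy, vz) in zip(positions, velocities):
        vx, vy, vz = vx + gravity[0] * dt, vy + gravity[1] * dt, vz + gravity[2] * dt
        px, py, pz = px + vx * dt, py + vy * dt, pz + vz * dt
        if py < 0.0:                    # hit the ground plane at y = 0
            py, vy = 0.0, -0.5 * vy     # bounce with energy loss
        new_p.append((px, py, pz))
        new_v.append((vx, vy, vz))
    return new_p, new_v

# Example: one particle dropped from y = 1, stepped for one simulated second.
positions, velocities = [(0.0, 1.0, 0.0)], [(0.0, 0.0, 0.0)]
for _ in range(240):
    positions, velocities = step(positions, velocities)
print(positions[0])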

Buttholes are more aesthetic.

Finite element methods plus Deep learning.
dspace5.zcu.cz/bitstream/11025/6987/1/Warburton.pdf

Will this be used for deepfakes?

spbp

Raytracing cards are worth it, just not for gamers right now. I don't blame gamers for not falling all over RTX; it's a shit value for them, but the tech is important.

There is a reason VFX dropped raster rendering pipelines as fast as they did. Shit is hacky as fuck. Rasterization is a house of cards when it comes to photorealism.

eyes are fine, but the hair is not good enough - it looks like it's been doctored in PS post-render where the hair meets the head and below the chin. Also the hair has some doll-like qualities, seems too neatly arranged. Otherwise an excellent render.

>Raytracing cards are worth it, just not for gamers right now.
who are they worth it for then? because I do professional 3D and I won't be fucking buying one

Nvidia's new GPUs were just an easy way for them to convert their server compute cards into 'gamer' cards.

DLSS just uses the existing machine learning hardware for an AA gimmick, and they slapped on a cheap raytracing ASIC so they can hardware-lock people in.

>This level of rendering would probably take a couple of minutes per frame on a GPU.

Hours, mate.
This girl would take hours to render even on a top-tier GPU

New generation of deepfakes. Rendered one frame at a time for weeks.

You will when they end up shaving hours off your render times.

anandtech.com/show/13522/896-xeon-cores-in-one-pc-microsofts-new-x86-datacenter-class-machines-running-windows
>AMD niggers can't run it
>Hurr its a gimmick
Who cares what they say? They make up like 15% of all CPU users and GPU lol. Meanwhile everyone still uses Xeon instead of Threadripper or Epyc.

JUST FUCKING WAIT
AMDBROS WHAT SHOULD WE DO?

Attached: 1534857980383.jpg (1280x720, 91K)

I'm fine with my RX 570 8GB

>2018
>brainlet gaymurs discover raytracing that's been around and widely used for over 20 years
also op image is real raytracing, not what nvidia actually offers

ray tracing cards like the R2500? yes, it's a dedicated RT monster that is capable of incoherent ray tracing
rtx cards? no

Sorry for not replying earlier, this was a big read.
Really promising conceptually I think, answers a lot of my questions.
Thanks.

>pootracing
suddenly i want to see a fallout 2 remaster with this technology included

Raytracing was a thing in the 1980s.

Wrong. It's supposed to be a Limbal ring, which some people say is a sign of youth.

Old people don't have these as prominently.

Attached: Human_eye_with_blood_vessels(1).jpg (1920x1280, 354K)

Post it in /hr/ or /s/, I dare you

realtime deepfake raytracing vr when

>Raytrace don't matter
Literally no one ever thought this apart from you /v/ children who think that an entire decades old rendering method is just something Nvidia invented last month for videogames

Attached: 1472945614158.jpg (306x306, 20K)

Any GPGPU can do path tracing.
I use Vega for OpenCL rendering and it rocks.
Nvidia didn't introduce anything new other than their machine-learned denoiser; that's literally all there is, just a fucking proprietary denoiser.
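
To be fair to the denoiser idea: even a dumb edge-aware filter recovers a lot at low SPP. Toy sketch below (assumes a float HxWx3 numpy image; Nvidia's actual denoiser is a trained network and nothing like this):

import numpy as np

def denoise(img, radius=2, sigma_color=0.2):
    # Simplified bilateral filter: average a neighbourhood, weighting each
    # neighbour by colour similarity so hard edges don't get smeared.
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            patch = img[y0:y1, x0:x1]
            diff = patch - img[y, x]
            weights = np.exp(-np.sum(diff * diff, axis=-1) / (2 * sigma_color ** 2))
            out[y, x] = np.sum(patch * weights[..., None], axis=(0, 1)) / weights.sum()
    return out

# Example: flat grey image with Monte Carlo-style noise; variance drops after filtering.
noisy = 0.5 + 0.2 * np.random.randn(32, 32, 3)
print(noisy.std(), denoise(noisy).std())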

thats a real photo u dumbass