Now that freesync monitors work with nvidia cards

There is absolutely no fucking reason to buy AMD cards. Or we just wait until AMD starts gimping and "patching" this bug.

Attached: nvidia-geforce-gtx-1080-ti-100724799-large.jpg (700x465, 34K)

Other urls found in this thread:

youtube.com/watch?v=XOBXQsNM3D8
twitter.com/AnonBabble

>Now that freesync monitors
Until the next driver update.

> implying Nvidia wouldn't be the one to patch this out to sell more G-Sync modules.

Is 1050 ti good enough to play gaymes on high?

>needs an amd apu
If you're gonna be a shitposter, make sure your shit can't get called out.

You need an AMD APU or a discrete AMD card, and PhysX gets blocked as well.

Depends on resolution. At 1080p it should manage a decent 30 fps at minimum.

>or we just wait until AMD starts gimping and "patching" this bug.
too obvious of a bait senpai

Old games you'll be fine, which are the only games worth playing

>Physx gets blocked
Nobody gives a fuck about Physx unless you play pic related.

Attached: AliceMadnessReturns.jpg (220x273, 25K)

>Now that freesync monitors work with nvidia cards.
Elaborate, please.

There are a whole bunch of games that use PhysX without ever telling you about it, because they run it entirely on the CPU so that it works for everybody. Mass Effect 3 is an example of a game that won't even launch unless you have the PhysX software installed, yet it doesn't have any options for PhysX that you can toggle on or off. Same goes for things like Project Cars and Life Is Strange.

>pointless tech that wastes space on chip

on 768p and lower everything is 60+ fps

What are you even talking about? PhysX is entirely software.

He still believes nvidia had dedicated physics hardware on the GPU for PhysX, like their marketing lied about.

Well, I wouldn't downgrade my 2700X to a 2400G just for freesync, even though my screen supports it.
I guess intel was right putting iGPUs in all their fucking CPUs.

>PhysX
>Software

Attached: chip.jpg (800x600, 108K)

nigger, are you this fucking retarded?

Loved this game.
Never tried it again since I switched to NoVideo.
What I do know is it ran like ass on the CPU.

I don't know where you retards get your info; a 1050 Ti can easily play all games at 1080p high settings with more than 50 fps.
The only games which don't run at high are unoptimized ubisoft games like ass creed.

>we just wait until AMD starts gimping and "patching"

You're a fool. NVIDIA will be the first to patch it, since it would damage their GOY SYNC monitor sales.
so pay up goyim for your sweet goy sync

It is, why waste your time replying to it.

this, Ryzen + nVidia is the only rational choice rn

Did this method actually work? Part of me thinks this guy is stupid and can't tell the difference between the performance of his Vega 8 and his 1060. That's how this used to work; he claims 1803 changed this, but I have my doubts. Anyone know 100%?

If it did actually work and he didn't fool himself, would it add lag to the video output?

Tried it using my Vega 56 via the 4770K iGPU's HDMI. Limited to 70 FPS, but yes, it works. The Freesync demo says it is enabled (but only if a monitor is connected to the GPU). I made sure it was outputting to HDMI only on the monitor. With games it may be a case-by-case scenario.

That 70 fps limit is HDMI 1.4 or something though, and the res you picked, I would imagine. Does that iGPU have VGA out? Also try with a CRT to test max output. Lag is apparently 3ms on the Ryzen integrated, FYI.

>vga
>crt
I'll get right on it gramps.

Meh, if you must be rude: someone with an iGPU with HDMI 2.0, try that. It can do 120Hz at 1080p, right?

PhysX is still hardware accelerated on Nvidia cards. You can look at e.g. Borderlands 2 or Alice Madness Returns or Planetside 2 to see the basic software physics vs. Nvidia's hardware accelerated features.

Not sure nvidia cards still have hardware PhysX cores. Maybe 5 years ago; not sure atm.

>Now that freesync monitors work with nvidia cards.
wat
bluepill me

>There are a whole bunch of games that use PhysX without ever telling you about it
Those don't even use the GPU implementation; they run on the CPU even with nvidia cards.

It's done via something similar to what external GPU enclosures do with laptops, I think. Might be a new feature in Win 1803, might not. They use some laptop driver to make it work, I think.

Gives you 3ms of extra lag though, maybe more, because the people that tested it used a retarded setup.

They never had them; the only hardware PhysX ASIC was made by Ageia. Nvidia bought them and rewrote the engine in CUDA. You should know what CUDA is.

Also someone will probably make this work with a cheap AMD card, so you don't have to go out and buy a new CPU/motherboard with integrated graphics to do it.

Oh I see. I just remember nvidia selling PhysX-only cards that weren't GPUs. Was that super early on and just Ageia rebrands?

What was one game that used it in a cool way?

Wrong link.
youtube.com/watch?v=XOBXQsNM3D8

>they use some laptop driver to make it work ithink
Well, afaik freesync (labeled as gsync) always worked over eDP, which is used in laptops, even without an AMD GPU. If it actually requires an AMD GPU it must be something different.
Though it kills my idea of building a linux machine with an AMD APU + a windows virtual machine with nvidia GPU passthrough anyway.

You linked the wrong video, I think?

2.0 probably but we want 2.1

Can't be bothered watching the video; does it say max fps is 70? That's probably just the HDMI 1.4 output on the integrated graphics, right? HDMI 2.0 could do much higher refresh rates?

>I don't know where you retards get your info, 1050ti can easily play all games in 1080p at high settings with more than 50fps.
>Only games which don't run at high are unoptimized ubisoft games like ass creed.
Please don't mislead the bloke. I think the card is in between what you state and what the other anon states.
>Is 1050 ti good enough to play gaymes on high?
You're honestly better off doing a little research yourself. Should take about an hour, but I'd really honestly recommend no less than a 1060. For longevity and all.

>spoonfeed me
fuck off

This


If you still need weird AyMD hardware, it's not an option.

Pretty much this.
Go for a 1060 even if it's the 3GB model. Games can swap data in and out of VRAM well these days.

Depends what you consider high. If you run max AA and AF and weird stuff like that it obviously won't, but I don't consider those things "high"; I don't even consider them "maxed out". They're optional extras for retards. I play games to win and look nice, not to blur the image or make it darker in spots.

please?

Attached: 1533361813796.jpg (641x530, 41K)

Not really. Since there were hints of new ryzen going 12+ cores, it's very possible we'll get 8-core zen2 APUs, which would be a no-brainer CPU choice.

I'd rather go full ARM before I switch anything to AMD

The 3ms lag is probably the video signal just going through the motherboard again. Motherboards + OS have like 3ms of lag.

Well, if you typed that on an intel machine I must question your intelligence.

Yeah, honestly I don't see nvidia allowing themselves to lose out on their proprietary sales.
I got a 1060 3GB myself and I'm using ultra or high all the time, no problems. The only issue is my CPU bottleneck (i5-4440). Regardless, I'm really pleased with my "budget" purchase.

>There’s a decent chance Nvidia will try to plug this hole

And the masses bowed their heads like the helpless peons they are.

Attached: 1527802208041.gif (330x166, 2.11M)

Yeah, I use a 980, which is basically a 1060 3GB too. It's fine. My next GPU upgrade will be for lower noise/heat, not power.

Also, don't be sad that 3ms+ of extra lag makes your 1ms freesync monitor 4x worse with this method.

Even the best 1ms monitor is really 13ms at the crosshair (the "1ms" test is fake), so really doing this method just makes your monitor about 20% slower. Not that bad at all.

So if you want to believe the retarded goalpost-moving test, you now have a "1.2ms" monitor with this method.
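For what it's worth, the arithmetic behind that works out roughly like this; a quick Python sketch using the figures claimed in these posts (13ms and 3ms), neither of which is a measured value:

```python
# Figures taken from the posts above; neither is a measurement.
monitor_latency_ms = 13.0    # claimed real crosshair-to-photon latency of a "1ms" monitor
passthrough_lag_ms = 3.0     # claimed extra lag from routing through the iGPU

total_ms = monitor_latency_ms + passthrough_lag_ms
slowdown = passthrough_lag_ms / monitor_latency_ms
print(f"{total_ms:.0f} ms total, about {slowdown:.0%} slower")  # 16 ms total, about 23% slower
```

Strictly it comes out closer to 23%, so "about 20% slower" is the right ballpark.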

>it's very possible we'll get 8 core zen2 APUs which are no-brainer CPU choice.
Oh nigga... If so I'm sold. Amd really needs to do this.
We can make it to navi with our cards. Not that I expect much, but 7nm, and amd is taking their time, so it'll be interesting to see, considering Vega was a mess behind the scenes.

>brain can't even process 100ms

Attached: 1534881422295.jpg (384x384, 19K)

im going to wait till intel's discrete GPU in 2020. they might go crazy and release a gpu with 3x the power, 3x cheaper than amd/nvidia just for the fuck of it.

>im going to wait till intel's discrete GPU in 2020. they might go crazy and release a gpu with 3x the power, 3x cheaper than amd/nvidia just for the fuck of it.
Let hope stay alive. I'm far too cynical about the market to expect that, but Intel's in a corner right now and they have the resources, so it would be really interesting to see.

Well, we'll figure out how fast it is in 4 months at CES next year, so you'll find out around Navi time.

Any gpu Intel makes will be aimed at srs bsns, because Intel is absolutely not prepared to write drivers for the sort of shit AMD and Nvidia do. Both AMD's and Nvidia's driver stacks are colossal in size because so many games ship fundamentally broken (through various means). Christ, the number of OpenGL API violations Nvidia actively supports is staggering.

In principle it could work with any cheap freesync-ready AMD card shoved into a spare PCIe slot. The issue is that most games don't let you force a GPU. Apparently a reddit user is working on a hack

You do realize how big intel is? They could hire or put together the staff to create drivers superior to amd's or nvidia's in 3 weeks.

Good luck is all I say. Their current iGPU drivers are barely above "doesn't crash the system when put under load" as it is.

cool story
100ms is so fucking much no retard can pretend to not comprehend it. Take any video and delay audio by 0.1 second and feel the difference.

>they might go crazy and release gpu 3x power 3x cheaper than amd nvidia
Well if nvidia hikes prices 3 times again then yeah sure

yer people can perceive 2ms via science.

Muscle reaction is like 150ms on a mouse click, though. But hand movements are not the fastest things humans can do; we can actually move our upper arm faster than our fingers, even though it doesn't feel like it. That's how pro FPS players play.

So your point is you react in 250ms instead of 150ms rather than stopping being a faggot?

What? No. 150ms reaction time is for, like, old 30-year-old guys. Trained young people can react in like 40-50ms.

When you do those online reaction-time tests, everyone gets 180-220, because your mouse and monitor add another 17ms each and your motherboard/OS adds 3ms. So like 180ish+.

You can reduce your computer's reaction time by about 50ms compared to a "gamer pc" and 100ms compared to shit PCs/TVs.

If you're only reacting at 50ms, that means you have a 5-10x reaction-time advantage over most noobs.

I have an optimal-response-time PC/monitor/keyboard/mouse setup. I wreck people in CSGO on the reg, and I'm not skilled at all and don't even understand the mechanics of the game mode.

Basically what I'm saying is: with the right equipment and body training (using the arm, not the fingers/wrist) you can have a 200ms advantage over other players.

That's a 1/5th of a second advantage. Look at a clock and see how long a second is. That's a huge fucking advantage.
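Summing the per-device numbers claimed across these posts gives the chain below; a toy Python sketch where every figure comes from the thread, not from measurement:

```python
# All figures are claims from this thread, not measured values.
average_reaction_ms = 150    # typical mouse-click reaction time
trained_reaction_ms = 50     # claimed reaction time of a trained young player
mouse_ms = 17                # claimed latency added by the mouse
monitor_ms = 17              # claimed latency added by the monitor
motherboard_os_ms = 3        # claimed motherboard/OS processing lag

hardware_ms = mouse_ms + monitor_ms + motherboard_os_ms

# What an online reaction-time test would actually show:
print(average_reaction_ms + hardware_ms)  # 187, i.e. the "180ish+" figure above
print(trained_reaction_ms + hardware_ms)  # 87
```

On these numbers the hardware adds 37ms on top of the human, which is why the online test scores land in the 180-220 range even for people with fast reflexes.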

If you ever come back to this thread: 1050 Ti here; it struggles with a lot of recent games at high at 1080p and gets sub-30 fps on games at max. I only bought it because my previous video card died, and at the time the better options were priced about 120ish dollars higher or flat out unavailable, instead of the 50 they are now.

The amount of sales they lose by not supporting freesync outweighs the royalties from selling more g-sync monitors. Monitors are a huge investment; if someone has a freesync monitor they're only going to buy an AMD card. Opening up the freesync market is huge for nVidia. Plus they can continue to push g-sync as the enthusiast alternative, so their g-sync sales won't really be affected.

>Ngreedia doing anything to help consumers
Pullleeeze

? It's helping their own marketshare, not consumers. Consumers would be perfectly happy just continuing to buy AMD since they're all normies to start with.

NVDA are like the Apple of GPUs. Walled garden.

>The amount of sales they lose due to not supporting freesync outweighs the royalties off selling more g-sync monitors
No

I doubt they will fix this. To stop it they would have to disable the feature in Surface devices and all external-GPU laptops, right?

seems unfixable.

Yes. Tons more FS monitors are sold than GS and nobody is going to throw their FS monitor in the trash to buy a new nVidia card. What business sense does it make to lock out that much of the market? FS and GS are going to coexist anyways, you might as well play both sides especially when you know your only competition can't do the same without paying you a royalty.

It still takes around 15ms difference to be actually perceived consciously, in most cases.
But subconsciously it's suggested we can perceive 3ms, afaik.

Well, what happens is that people just use their freesync monitor with Nvidia and suffer the microstutters and screen tears. A lot of people don't know how much better it looks to have slightly lower FPS instead of microstutters and screen tearing, and they just keep buying Nvidia.
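A toy model of where those microstutters come from: with a fixed 60Hz refresh a finished frame has to wait for the next scan-out boundary, so uneven frame times become uneven (or dropped) refreshes, while adaptive sync refreshes the moment a frame is done. The frame times below are invented for illustration:

```python
# Toy model: fixed 60 Hz refresh vs adaptive sync with uneven frame times.
import math

refresh_ms = 1000 / 60                 # fixed scan-out interval, ~16.7 ms
frame_times_ms = [14, 18, 15, 20, 16]  # hypothetical GPU render times

t = 0.0
fixed, adaptive = [], []
for ft in frame_times_ms:
    t += ft
    # Fixed refresh: the frame waits for the next vsync boundary.
    fixed.append(math.ceil(t / refresh_ms) * refresh_ms)
    # Adaptive sync: the monitor refreshes as soon as the frame is done.
    adaptive.append(t)

fixed_gaps = [round(b - a, 1) for a, b in zip(fixed, fixed[1:])]
adaptive_gaps = [b - a for a, b in zip(adaptive, adaptive[1:])]
print(fixed_gaps)     # uneven multiples of ~16.7 ms, plus a dropped frame (0.0 gap)
print(adaptive_gaps)  # matches the actual frame times, so pacing stays smooth
```

Note how the fixed-refresh gaps jump between one and two refresh intervals; that unevenness is the microstutter, and without vsync the same mismatch shows up as tearing instead.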

>Nobody gives a fuck about Physx
just...

Source for 100ms claim please. My initial tests suggest it's nowhere near that (I'll need a high speed camera to tell for sure).

Walled Windows 10 64-bit garden.

ftfy

It's not 100ms of lag we're talking about; that's human reaction time. It's 3ms of lag, which makes sense; that's the speed of motherboard/OS processing.

Typical input lag between mouse/keyboard+cpu+pci+gpu+monitor tends to be around 30-100ms total when you add it all up. Then you put human reaction time on top of that.

more like 60-70ms but ok.