Fell for the freesync monitor meme

>fell for the freesync monitor meme
>AMD hasn't released a good GPU since the 200-series
>probably going to have to lose freesync functionality when I need to upgrade my GPU
buyer's remorse fred

Attached: download (1).jpg (225x225, 8K)

Other urls found in this thread:

altera.com/products/boards_and_kits/dev-kits/altera/kit-arria-v-gx.html
twitter.com/SFWRedditImages

I bought an FX-8350 back in 2014 because Jow Forums told me it was still good for 60 FPS gaming. It wasn't.
Since then I never listen to Jow Forums.

Freesync sucks so much compared to g-sync that they're already prepping Freesync 2.

inb4 AMD shills who don't even have a sync monitor tell you how it's actually better.

I got an RX 550 and I'm happy and satisfied with my purchase, because I am not a faggot like OP who wants things that don't make him productive.

>intelligent people like myself never partake in leisure.
LOOOOOOOL

Attached: 5eb[1].jpg (600x750, 26K)

>have 1440p/144Hz
>use Vega 64 with it
>everything works just fine

>I'm from /v/

Attached: 1513384344432.jpg (476x476, 21K)

I have a 260X and everything is adequately comfy.

I'm still running one. If you're doing 1080p then 60 FPS is no problem. But I agree with never listening to Jow Forums.

> There's only one way of having fun.
Also, an RX 550 is fine for games.

>There's only one way of having fun.

Attached: hollow-man[1].jpg (333x500, 150K)

>nvidia damage control
freesync has won nigga get the fuck out

>freesync has won
>best selling gaming monitors are g-sync despite being more expensive than freesync
LOL

>all samsung TVs support freesync from now on
>it's in xbox
>it's in playstation
go to bed Huang

>I literally don't know the difference between a standard and a product
Did JEDEC beat AMD at making GPUs since everyone uses PCI?

>Rolex "lost" because their market share is <50%
kys

Damn, Nvidia got destroyed

well, it should work with future PlayStations and Xboxes over HDMI at least, which is the real death knell for g-sync, not the PC/DisplayPort stuff.

Does that freesync thing even work on linux?

>looking for loonix laptop
>"oh, Linux is great, it works on ANY hardware!"
>go to walmart looking for any laptop, thinking anything will do
>get the biggest laptop I can find, 17" HP laptop
>no linux distro will work, black screen instantly, due to drivers
>this was in 2016
>check again
>STILL an issue
Whatever, I gave that laptop to my gf and let her run windows on it.

>"ur just dum"
>"no drivers isn't loonix's fault, loonix is just the kernel"
>"how about you write your own drivers and contribute"

What a retarded statement.
1. This does not affect the benefits g-sync has, which people are already paying money for.
2. I don't think even 10% of console players play on a computer monitor shared with their PC.

>What a retarded statement.
>1. This does not affect the benefits g-sync has, which people are already paying money for.
>2. I don't think even 10% of console players play on a computer monitor shared with their PC.

As soon as every TV and console made has adaptive sync built in (supposedly even for PS4 pro/bonex with patches), g-sync's garbage proprietary nature will be even more apparent.

Nvidia will have to support VESA adaptive-sync from their cards to TVs over HDMI or they will look like shit. And since every FreeSync 2+ monitor has variable sync HDMI inputs as well, that's it. Maybe G-sync can exist as a special DisplayPort enhancement somehow, but it will be next to meaningless in the face of cheap, universal HDMI adaptive-sync.

>g-sync's garbage proprietary nature will be even more apparent.
1. Pro tip: 99% of people aren't autists with an emotional attachment to hardware standards. They don't care if something's proprietary or not.
2. The proprietary nature of g-sync is why those displays generally have higher and more even quality than freesync displays (pro tip #2: the g-sync module isn't snake oil like some retards believe). It's also why they are more expensive. This doesn't change if freesync becomes more prevalent.

g-sync is 2 things: the hardware implementation of some shit in the scaler, and the proprietary DisplayPort wire protocol extension.

they could still support all the compressed write etc. crap in the scaler while using an open/standard external wire protocol, and the only reason they haven't is that they want to create vendor lock-in between GPUs and monitors.

>buyer's remorse thread
>AMDrone shill posts his specs when nobody asked for it
The absolute state of AMD shills

Attached: 1515044920479.jpg (552x661, 71K)

>some shit in the scaler
fucking lol, no clue about how it works or its features.

I guess I understand why you think everyone buying nvidia is retarded if you're THIS uneducated

kek

>wants linux
>buys random hardware at walmart + gf
Get out.

Attached: 1420842119854.jpg (328x277, 11K)

temporally compressed panel write-out is literally just attaching more frame buffer to the scaler and having the TCON scan through the rows in a briefer but delayed burst.
everything else regarding g-sync is just market spin plus a shitty proprietary wire protocol.
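
rough numbers if you want to sanity-check the buffer claim (the panel specs and buffer size below are my own assumptions, nothing out of Nvidia's docs):

# back-of-envelope: how big one frame is and how long a full-speed
# "burst" scan-out of all the rows lasts (numbers assumed, not measured)
width, height, bpp = 2560, 1440, 24                   # assume a 1440p panel, 24-bit color
frame_bytes = width * height * bpp // 8
print(f"one frame: {frame_bytes / 2**20:.1f} MiB")                  # ~10.5 MiB
print(f"frames that fit in 768 MiB: {768 * 2**20 // frame_bytes}")  # ~72, assuming 768 MiB of DDR3 on the scaler board
max_hz = 144                                          # assume the panel tops out at 144 Hz
print(f"burst scan-out of all rows: {1000 / max_hz:.1f} ms")        # ~6.9 ms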

VESA adaptive sync just has the monitor report its supported refresh intervals in the EDID during the DisplayPort link handshake, and the monitor accepts any frames that come in within that variable timing spec.
G-sync is a janky bidirectional protocol where the GPU keeps pinging the display to ask if it's in a blanking period, then sends the next frame if both sides are ready, which is stupid, unnecessary, and even adds a little latency on top.
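
if you want to poke at the adaptive-sync half yourself, here's a toy parser I threw together (it only reads the classic range-limits descriptor in the base EDID block; real FreeSync ranges can also sit in CEA/DisplayID extension blocks, which this ignores):

# pull the advertised vertical refresh range out of a monitor's base EDID,
# i.e. the kind of info the GPU reads back during the DisplayPort handshake
def refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the display range limits descriptor, or None."""
    assert len(edid) >= 128 and edid[:8] == bytes.fromhex("00ffffffffffff00")
    for off in (54, 72, 90, 108):            # the four 18-byte descriptor slots
        desc = edid[off:off + 18]
        if desc[0:3] == b"\x00\x00\x00" and desc[3] == 0xFD:
            return desc[5], desc[6]          # min / max vertical rate in Hz
    return None

# e.g. on linux (connector name will differ per setup):
# refresh_range(open("/sys/class/drm/card0-DP-1/edid", "rb").read())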

nVidia makes entire custom chipsets for some g-sync panels, retard. They do hardware-level shit like on-the-fly voltage regulation.
Meanwhile you're going on about "some shit in the scaler and compression". No point in talking to you further, see again.

I'm literally describing what their FPGA-based scalers do, you absolute gorilla retard nigger.

>on the fly voltage reg
>dude just add more framebuffer
lol

the hilarious thing to me here is that what you think is hyperbolic sarcasm is closer to the truth than your own actual understanding.

pic related is an Altera FPGA with an absurd 768 MB of gook ram bolted on. the on the fly voltage regulation you're boasting about is just an input to the TCON (a display component that Nvidia doesn't make) sitting between this nig-rigged scaler board and the LCD panel.

G-sync is a piece of shit from an engineering standpoint. For a $100-$150 premium, you're getting a really poorly specced out, power-sucking prototype board operating the panel.

Attached: DSC_4622.jpg (2800x2151, 2.55M)

>free sync
>unstable 30 fps
Literally why.

Really? They are too poor to make a dedicated IC?

>boasting about
Why would I be "boasting"? I'm not a retard who's emotionally invested in electronics companies.
I'm just bringing it up since it's an obvious thing that needs actual physical hardware, not just "hurr durr open standards"
>an input to the TCON
I'm not saying v-reg is unique to g-sync screens. I'm saying they do it better because they have better hardware. This is evidenced by less overshoot compared to the same panel with non-nVidia parts.

Attached: pursuit_2[1].jpg (519x304, 45K)

they honestly could have, but:
- they probably had no idea if they could get a big enough volume of suckers to buy one of the monitors
- they probably realized that they would fuck up at least the first several times in this new market segment and didn't want to bake their mistakes into silicon

Lol

Ordering ICs from chinks is not that expensive. It is like putting a 555 timer in a device. Sure, it works, but it is shit for geeglez.

this looks like an Arria V GX part:
> altera.com/products/boards_and_kits/dev-kits/altera/kit-arria-v-gx.html

that gets them a bunch of LVDS lanes and DDR3 interfaces baked in, which is worth something.
but honestly this whole endeavor seems like it was done on a shoestring engineering budget with maybe 100x more going to marketing.
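
rough math on why those baked-in LVDS lanes aren't nothing (panel numbers and per-pair rate are my assumptions, not measurements):

# how much raw bandwidth the scaler has to push into a 1440p/144 Hz panel's TCON
h, v, hz, bpp = 2560, 1440, 144, 24
blanking = 1.2                               # assume ~20% overhead for blanking intervals
panel_bps = h * v * hz * blanking * bpp      # bits/s toward the panel
lane_bps = 1.0e9                             # assume ~1 Gb/s per LVDS pair
print(f"panel link: {panel_bps / 1e9:.1f} Gb/s "
      f"~ {panel_bps / lane_bps:.0f} LVDS pairs")    # ~15.3 Gb/s, ~15 pairs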

Em... since when did Intel buy Altera?

>Em... since when did Intel buy Altera?
since like almost 3 years ago dude.
what rock you been living under?

>what rock you been living under?
Exams, and other shit. Just had no time to do stuff with hardware.

>have GeForce 770
>buy 144hz 1080p freesync monitor
>works fine
I don't see the problem here

Attached: 1520544787441.gif (480x270, 758K)

of course it works fine, but you can't use a pretty nice feature, same as an AMD card on a g-sync monitor.

All that just for preventing screen tearing? This is dedicated-PhysX-card levels of stupid. Wouldn't everyone be happier if they built this into their GPUs rather than adding an ASIC to every monitor?