How do sites like rabb.it tell when an image you're uploading is too lewd ahead of time? Regular tame images would share fine, but anything too explicit would fail to upload. Do they just have some kind of master list of previously uploaded images?

Attached: 3bc2d87da3b806ec0f4fab94ed6bc0b8.jpg (800x1129, 86K)

I remember a website called 'porn or not' or something that used machine learning to check whether an image was lewd. Alternatively, they have an army of outsourced reviewers coupled with a known-offender image database.

If that's a thing that actively happens these days, is that also why we see fewer porn floods here on Jow Forums compared to previous years? Machine learning algorithms that either check a known DB, or actively reverse-image-search on upload and then check a blacklist.

No

Care to elaborate then?

I want to marry Hex Maniac!

Your computer hashes it locally almost instantly and sends the hash; the server then checks whether that hash is already known to be lewd.
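Roughly like this, if you want a sketch (TypeScript, browser side). The /api/precheck endpoint and its response shape are invented for illustration; the point is only that the hash, not the image, makes the first round trip.

// Client-side pre-check sketch. ASSUMPTION: the server exposes
// POST /api/precheck taking {sha256} and answering {blocked: boolean};
// that endpoint is made up for this example.
async function sha256Hex(file: File): Promise<string> {
  const buf = await file.arrayBuffer();
  const digest = await crypto.subtle.digest("SHA-256", buf);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function precheckUpload(file: File): Promise<boolean> {
  const hash = await sha256Hex(file);
  const res = await fetch("/api/precheck", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sha256: hash }),
  });
  const { blocked } = await res.json();
  return !blocked; // true = safe to start the real upload
}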

So how easy is it to change an image enough that it won't be recognized?

>machine learning algos that check a db or reverse search
why would you call those machine learning though? they're just normal image blacklisting algorithms. there's nothing inherently ML about those (except the image search, but i'm assuming you expect it to already be done, not to implement it).

it depends. if it's literally hashed, even changing one pixel will change the hash entirely. however, things like PhotoDNA, which MS uses, manage to assign the same hash to visually similar images, so the match is resistant to small changes.
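PhotoDNA itself is proprietary, but a toy "average hash" shows the idea: downscale, grayscale, threshold against the mean, then compare hashes by Hamming distance instead of equality. Everything below is my own sketch of the concept, not anyone's production code.

// Toy perceptual hash (aHash), NOT PhotoDNA -- same idea though:
// visually similar images land on nearby 64-bit strings.
function averageHash(img: HTMLImageElement): string {
  const size = 8;
  const canvas = document.createElement("canvas");
  canvas.width = canvas.height = size;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(img, 0, 0, size, size); // downscale to 8x8
  const { data } = ctx.getImageData(0, 0, size, size);
  const gray: number[] = [];
  for (let i = 0; i < data.length; i += 4) {
    gray.push(0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2]);
  }
  const mean = gray.reduce((a, b) => a + b, 0) / gray.length;
  return gray.map((g) => (g >= mean ? "1" : "0")).join("");
}

// Few differing bits => probably the same picture even after recompression
// or minor edits. A cryptographic hash gives you no such notion of "close".
function hammingDistance(a: string, b: string): number {
  let d = 0;
  for (let i = 0; i < a.length; i++) if (a[i] !== b[i]) d++;
  return d;
}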

there is an element of machine learning, not in the parts you highlighted of course, but the component that does the 'intelligent' matching on a potential porn image uses machine learning to make those decisions
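for the browser there's even an open-source TensorFlow.js model, nsfwjs, that does exactly this kind of scoring. I'm writing its API from memory, so treat the exact signatures as an assumption and check its docs:

// ML-based lewdness scoring sketch using nsfwjs (TensorFlow.js).
// API reproduced from memory -- verify against the library's docs.
import * as nsfwjs from "nsfwjs";

async function isProbablyLewd(img: HTMLImageElement): Promise<boolean> {
  const model = await nsfwjs.load();
  const predictions = await model.classify(img);
  // predictions look roughly like [{ className: "Porn", probability: 0.93 }, ...]
  const risky = predictions
    .filter((p) => p.className === "Porn" || p.className === "Hentai")
    .reduce((sum, p) => sum + p.probability, 0);
  return risky > 0.7; // threshold picked arbitrarily for this sketch
}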

isitporn.com

I don't get this meme, she's not even thicc

She sells milk, that is enough reason

/x/ loves to shill >her

Well, current ML algorithms are quite vulnerable in that regard: small amounts of noise in an image can entirely change how it gets classified. However, I imagine it would take a lot of trial and error to pull off without a copy of the model.
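The crudest black-box version of that trial and error: keep sprinkling faint noise on the image and re-querying the classifier until the verdict flips. scoreImage here is a hypothetical stand-in for whatever classifier you can query; this is just the brute-force idea, not a real attack from the literature.

// Black-box evasion sketch. `scoreImage` is hypothetical: any classifier
// you can query that returns a lewdness score in 0..1.
declare function scoreImage(pixels: ImageData): Promise<number>;

async function noiseUntilClean(
  ctx: CanvasRenderingContext2D,
  width: number,
  height: number,
  maxTries = 100
): Promise<boolean> {
  for (let t = 0; t < maxTries; t++) {
    const img = ctx.getImageData(0, 0, width, height);
    if ((await scoreImage(img)) < 0.5) return true; // classifier fooled
    // nudge a handful of random pixels by a few intensity steps
    for (let k = 0; k < 50; k++) {
      const i = 4 * Math.floor(Math.random() * width * height);
      for (let c = 0; c < 3; c++) {
        img.data[i + c] = Math.max(0, Math.min(255,
          img.data[i + c] + (Math.random() < 0.5 ? -4 : 4)));
      }
    }
    ctx.putImageData(img, 0, 0); // noise accumulates across tries
  }
  return false; // gave up within the budget
}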

Attached: DNEXk6gWAAEzn8B.jpg (649x597, 77K)

I can see us reaching a point (assuming we haven't already) where photographic evidence that manages to fool an algorithm will be considered illegitimate, since the user was forced to edit the image to ensure it could be seen at all. Then investigators will have to go around with Polaroids or those single-use tourist cameras just to get admissible photos of a crime.

Attached: 1475508726279.gif (60x95, 24K)

I'd be impressed if someone's implemented client-side PhotoDNA in JavaScript (or an equivalent system). Anyone know?
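Not PhotoDNA proper, but wiring the toy averageHash/hammingDistance from earlier in the thread into an upload form gives you the same shape of system, entirely client side. The blocklist contents and the #file input are placeholders.

// Client-side gate sketch reusing averageHash() and hammingDistance()
// from the earlier post. BLOCKLIST entries are placeholders; a real
// deployment would re-check server-side anyway, since this is trivially bypassed.
const BLOCKLIST = new Set<string>([
  // 64-bit hash strings of known-bad images would go here
]);

function loadAsImage(file: File): Promise<HTMLImageElement> {
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => resolve(img);
    img.onerror = reject;
    img.src = URL.createObjectURL(file);
  });
}

document.querySelector<HTMLInputElement>("#file")!
  .addEventListener("change", async (e) => {
    const file = (e.target as HTMLInputElement).files?.[0];
    if (!file) return;
    const hash = averageHash(await loadAsImage(file));
    const blocked = [...BLOCKLIST].some((h) => hammingDistance(h, hash) < 10);
    if (blocked) alert("upload rejected client-side");
  });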

>Add one pixel to an image of a stealth fighter
>AI thinks it's a fucking dog
>Add one pixel to an image of a dog
>AI thinks it's a fucking ship
The pinnacle of AI development, such scary, watch out humans, they gonna take ur jobs and destroy mankind

isitporn.com/show/b24af913324146c985995aace6770451
KEK

Attached: b24af913324146c985995aace6770451.jpg (626x799, 35K)

You really think Hiroshimoot is going to put in the effort of updating Jow Forums to add that kind of thing?

kek

I'm curious whether it already exists in some form. I'm sure there are lots of ways to implement it that can run mostly independently of the site itself, beyond just matching against a list.

Attached: c28fcc8e9c124389a65eb7848988e8b1.jpg (800x600, 40K)