The video is here: youtube.com
This video is a proof of concept. I always wanted to see if I could make my neural network play Tinder for me. So I gathered some pictures, created a fake Tinder app (so I don't expose real people) and began the training.
For this experiment I'm using scrcpy to mirror and control my phone, and pyautogui alongside my neural network. The script is very simple:
Grab a screenshot, run it through the neural network, move the mouse to the output button (yes or no) and click.
There are a lot of hard-coded things in my code, but as I said earlier, it's just a proof of concept.
The code is pretty much garbage, but here it is if anyone cares to look:
import json
import time

import numpy as np
import pyautogui
from PIL import Image, ImageChops

from Dejavu import Dejavu


def trim(im):
    # Crop away the uniform background (unused below, kept for reference)
    bg = Image.new(im.mode, im.size, im.getpixel((0, 0)))
    diff = ImageChops.difference(im, bg)
    diff = ImageChops.add(diff, diff, 2.0, -100)
    bbox = diff.getbbox()
    if bbox:
        return im.crop(bbox)
    return im


# Load the trained network
data = json.loads(open('nn.json', 'r').read())
nn = Dejavu()
nn.load(data)

# Screen coordinates of the "no" (0) and "yes" (1) buttons in the scrcpy window
choices = {0: (900, 565), 1: (1055, 565)}

# Click once to focus the scrcpy window
pyautogui.moveTo(850, 210)
pyautogui.click()

for i in range(7):
    # Grab the screen, grayscale it, keep only the card region
    img = pyautogui.screenshot().convert('L')
    img = img.crop((810, 210, 1150, 500))
    # img = trim(img)
    img.thumbnail((36, 36))

    # Flatten and zero-pad to the network's fixed input size
    arr = np.array(img).reshape(-1)
    arr = np.pad(arr, (0, 36 * 36 - arr.shape[0]), mode='constant')

    # Pick the class with the highest score and click its button
    result = nn.predict(arr.tolist())[0].tolist()
    result = result.index(max(result))
    pyautogui.moveTo(choices[result])
    pyautogui.click()

    if i < 6:
        time.sleep(3)
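If anyone wants to poke at the preprocessing without a phone attached, here's the same screenshot pipeline pulled out into a standalone function. The crop box and the 36x36 input size come from the script above; the function name and defaults are just illustrative, and it works on any PIL image instead of a live pyautogui screenshot.

```python
import numpy as np
from PIL import Image


def preprocess(img, box=(810, 210, 1150, 500), size=36):
    """Grayscale, crop to the card region, shrink, flatten, zero-pad.

    Mirrors the per-frame preprocessing in the main script, so the
    network's input can be reproduced from a saved screenshot.
    """
    img = img.convert('L').crop(box)
    # thumbnail() preserves aspect ratio, so the result may be smaller
    # than size x size in one dimension
    img.thumbnail((size, size))
    arr = np.array(img).reshape(-1)
    # zero-pad to a fixed length, since the flattened size varies
    return np.pad(arr, (0, size * size - arr.shape[0]), mode='constant')
```

Feeding `preprocess(some_screenshot).tolist()` into the network gives the same vector the loop builds inline.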