/dpt/ - Daily Programming Thread

What are you working on Jow Forums?
Previous:

Attached: 1475546193270.jpg (641x870, 383K)

Other urls found in this thread:

pastebin.com/GeJQXShM
github.com/fmtlib/fmt
bugs.python.org/issue34605
braveclojure.com/
interactivepython.org/courselib/static/thinkcspy/index.html
archive.org/download/TerryADavis_TempleOS_Archive/fan media/audio/TAD Hymns (MIDI)/
ccarh.org/courses/253/assignment/midifile/
theregister.co.uk/2018/09/11/python_purges_master_and_slave_in_political_pogrom/
twitter.com/SFWRedditVideos

first (you) me

How similar to Swing is GTK?

First for WebAssembly

Attached: LA.png (831x651, 19K)

Hello Graph-lords.
I'm using Neo4j and am not sure how best to model the schemas for my use case.
I am scraping from an arbitrary number of services, which provide overlapping information.
The goal of the project is to provide "the most accurate" or "most complete" information using the aggregated data from each scraper.

One requirement that I have set is that in the final aggregated product, I should be able to know where the information came from.
So I would have a copy of the original data and a new aggregated result.

Naturally I would have some kind of `original -> aggregated` relationship which would easily tell me which originals are involved, but not which property came from which original.
How would you guys create this relationship?
Is duplicating data inevitable?
How would you ensure that each property is only "coming from" exactly one original?

Honestly the only reason I'm not using SQL is because I expect room for a lot of unusual queries that I can't predict beforehand.
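One way to keep property-level provenance is to record, alongside the aggregated result, a map from each property name to the source it was taken from (in Neo4j this could be a per-property `CAME_FROM` relationship, or a provenance map stored on the aggregated node). A minimal, language-agnostic sketch of the merge logic, with hypothetical names and a naive "first non-empty value wins" policy standing in for whatever "most accurate" scoring you actually use:

```python
# Hedged sketch: merge scraped records while remembering which
# source supplied each property. All names here are hypothetical.

def aggregate(originals):
    """Merge records, tracking per-property provenance.

    `originals` maps a source id to its scraped record (a dict).
    Returns (merged, provenance), where `provenance` maps each
    property name to the single source id it was taken from --
    guaranteeing every property "comes from" exactly one original.
    """
    merged, provenance = {}, {}
    for source_id, record in originals.items():
        for key, value in record.items():
            # Naive policy: first non-empty value wins.
            # Swap in your own "most accurate" scoring here.
            if key not in merged and value not in (None, ""):
                merged[key] = value
                provenance[key] = source_id
    return merged, provenance
```

Because provenance is keyed by property name, each property maps to exactly one original by construction, and you only duplicate the winning values, not the full originals.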

Fantasy football user here. The app is working perfectly. Big thanks to the anons who helped out.

pastebin.com/GeJQXShM

any ideas for my next project?

thank you for posting a programming related image

Attached: 1508208478450.png (824x792, 408K)

If I were to fail a fizzbuzz test on a whiteboard, what level of programmer would that make me?

Attached: Screenshot from 2018-09-12 17-13-21.png (3840x2160, 481K)
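For reference, the whole exercise is: print the numbers 1..N, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". One common Python version:

```python
# Classic fizzbuzz: multiples of 3 -> "Fizz", of 5 -> "Buzz",
# of both (i.e. of 15) -> "FizzBuzz", anything else -> the number.

def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

if __name__ == "__main__":
    for i in range(1, 101):
        print(fizzbuzz(i))
```

Checking the 15 case before the 3 and 5 cases is the only real trap on a whiteboard.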

Shit tier