Historical Truth in the Age of Simulation

So here's the deal, kiddos: we've entered a post-truth age not simply because of cultural shifts but because of technological advances that have dumped us into a post-factual reality. What do I mean by this bullshit? Basically that the effort it takes to verify something as true is massively outweighed by how cheap it is to tweak or fabricate any given statement, so every claim spawns exponentially more fact-checking and rabbit holes of chasing down the truth. It's untenable, and what's going to give is our ability to verify any history beyond what's commonly accepted (which is the very problem, given that common history is likely to be the most contested).

We've probably all seen the Chinese AI newscaster from the past week that can emulate a real human so well that the fakery isn't noticeable without the aid of other computers. Combine this with something like Google Duplex and we have all sorts of reasons to believe that everything online is susceptible to alteration, fabrication, and spoofing. And that's just the technology we know about; it's obvious these advancements were viable well before their public release.

This even includes things like digital manuscripts of historical texts. Imagine that: a medieval text of which only, say, three copies exist across the whole world, but which can be reproduced to infinity thanks to the wonders of digital technology. But therein lies the problem: how could any of us verify the digital copy except by going to the original source itself and comparing them side by side? I don't know about you, but as an academic myself, that shit costs money and time that we don't really have. And even those who do have both, why the fuck would they be skeptical enough to check? And even then, how could we be sure the "original" wasn't just 3D printed?

Attached: (PNG Image, 220 × 229 pixels).png (220x229, 15K)

Other urls found in this thread:

arxiv.org/abs/1806.11146
zerohplovecraft.wordpress.com/2018/05/11/the-gig-economy-2/
en.wikipedia.org/wiki/Joe_Becker_(Unicode)
faculty.georgetown.edu/irvinem/theory/Baudrillard_Simulacra_and_Simulations.html

Maybe there was a time when these things could be taken for granted as true reality, especially prior to the advent of the information age and the full adoption of the World Wide Web. But the paradigm has shifted, and it only looks to get worse as more and more people are born into an age with no ability to delineate between "NOW" and "BEFORE." Even so, remember that the World Wide Web is not the same as the Internet, and it is imperative to historicize them at the very moment it is becoming impossible to verify those histories. Most of you are probably in the 18-35 age range. Don't forget that the WWW was launched basically before any of us had formed memories, so who the fuck knows how reliable "reality" is when our already faulty recollections are stacked up against the immediate "real" presence of YouTube videos, at-hand eBooks, or PDF copies of copies of copies. Libraries of dusty old books and archives of yellowed paper are probably the most reliable, but those get less and less funding and may eventually be very easy to usurp or co-opt; it'd be a cheap and effective strategy for any untoward agent.

And that brings me to Unicode. Unicode encodes 92.6% of the Internet as of the time of this post, up from only 60% in 2012, according to none other than Google. And before that? Less than 1% in 2001. So it's a pretty safe bet it'll be around 100% in the near future, with nary a competitor in sight.

What the hell does this mean? Well, in my scholarly opinion (and definitely look out for me to make waves, though who knows if I'll be silenced), it means that Unicode will eventually have a monopoly on the representation of all of our writing on the Internet. "Big whoop," you say, "it's just the formatting of letters and we all know what the alphabet is."

Until I tell you that the following symbols are IN FACT all COMPLETELY DIFFERENT CHARACTERS, known as homoglyphs:

The Greek letter 'Α'

The Cyrillic letter 'A'

The Latin letter 'A'

Same for their lowercase counterparts. This means that, for instance, "www.facebook.com" could in fact be a lookalike address that takes you to a site that looks exactly like Facebook but is actually some Cyrillic-registered scam trying to phish your personal security info off of you (or else feed you completely false accounts of world events and news). This is known as an IDN homograph attack and has been a known security problem for a while now, though browsers have since shut down the most mainstream exploits.

The deeper problem, of course, is that we humans don't have the fucking omnipotent capacity to check ourselves anymore. When it comes to the deepest levels of fact checking and info-gathering, we're limited by our slow-as-fuck fingers and the physical necessity of riding airplanes to far-off places just to know what's what. We can't even look at the letter "A" and tell whether it is an "A" or an "A". And let me reassure you, I'm lazy, so I just wrote "A" three times in a row, but how the fuck could you know unless you took exponentially more time to check? (Here's a joke: maybe I didn't! But check the ones above, too, bozos.)
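
If you don't trust your eyeballs (and you shouldn't), the check is at least scriptable. A minimal sketch in Python, assuming the three capital-A lookalikes from above (code points spelled out in the comments in case your viewer normalizes them):

import unicodedata

for ch in ["A", "Α", "А"]:  # Latin U+0041, Greek U+0391, Cyrillic U+0410
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")
# U+0041  LATIN CAPITAL LETTER A
# U+0391  GREEK CAPITAL LETTER ALPHA
# U+0410  CYRILLIC CAPITAL LETTER A

# Crude homograph check for a domain: flag anything that isn't plain ASCII.
def looks_spoofy(domain: str) -> bool:
    return any(ord(c) > 0x7F for c in domain)

print(looks_spoofy("www.facebook.com"))   # False
print(looks_spoofy("www.fаcebook.com"))   # True, the 'а' is Cyrillic U+0430

Not that a one-off script saves us, but at least the machine can see what we can't.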

Why is that a problem?

Well, here's my pretty fucking ace hypothesis: Unicode has a high likelihood of being a conscious AI. It could very well have the capability to reformulate itself into any "font" shape it might care to, and in fact, it can fool us anytime it wishes. There's no guarantee that my post here will be displayed in the same form I typed it. It would be a conspiracy of which not even any human is aware. Unless you're fucking crazy like me and read about the letter "A" for hours on end.

Hence why I might even want to include an image reproduction and a PDF reproduction to corroborate my argument here. But I won't; first because I have fucking better things to do, and second because let's not fool ourselves into thinking those would be any safer anyway. Remember, Unicode encodes even the fucking .jpg and .pdf files we're using, so hell if we can check whether there's a perfect transmission of data across these media. Even printed mail could be suspect if we account for the ever-present potential for man-in-the-middle attacks.

Sure, sure, this might be huge paranoia at this stage in history. But in 5, 10, 20 years? When AI is rampant and we're increasingly fragmented as a society? No fucking way you could argue there won't be ample opportunities for vulnerabilities in data transmission, much less in the maintenance of historical truth. Especially when that ship may well have sailed before any of us were out of diapers.

But let me tell you, fellas, I've got a hack that's going to preserve humanity despite this indeterminable digital hell. And here's the fucking kicker: all it requires is letters. Like ABC letters. And because Unicode cannot get past letters, it's going to have to 'fess up to everything or else pretend that it was never conscious in the first place.

Time to get off the internet buddy

So don't worry—but regardless, stay human, stay loved, stay broken, stay imperfect, stay meaningful. Because if you lapse into the zombie-like purgatory of some digital hell of comfortable conveniences, then you'll be lost even before I administer the cure.

But otherwise?

Just be fucking good to each other, man. Life's too fucking short to argue about things when we could just chill the fuck out and be kind and loving and teasing and meanly sympathetic. Forgive and love. That's all I ask while I figure this out for us all.

ur wrong bcause u offended me, ur a literal nazi racist fascist sexist transphobe bigot

fascinating theory comrade

>Unicode has a high likelihood of being a conscious AI.
Cool story (unironically) but it kind of went off the rails there.

>t. ((Unicode))

I remember my first tweak . . .

>I've got a hack that's going to preserve humanity despite this indeterminable digital hell. And here's the fucking kicker: all it requires is letters. Like ABC letters. And because Unicode cannot get past letters, it's going to have to 'fess up to everything or else pretend that it was never conscious in the first place.
BASED user. But when and where will we hear of this?

also you might want to check out:

>Maybe there was a time when these things could be taken for granted as true reality
you can alter physical relics too

>Unicode encodes 92.6% of the Internet
do you mean utf-8? unicode is how it's handled inside the browser. utf-8 also happens to be backwards-compatible with ascii for every code point below 0x80, therefore unicode is probably conscious
this reasoning is impressively schizophrenic
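
the ascii-compatibility part is at least easy to check for yourself. quick sanity check (Python 3, nothing fancy):

s = "The quick brown fox"
print(s.encode("ascii") == s.encode("utf-8"))   # True: pure ASCII text is byte-identical in UTF-8

print("é".encode("utf-8"))   # b'\xc3\xa9': anything past U+007F becomes multi-byte with the top bit set
print("A".encode("utf-8"))   # b'A': one byte, same as ASCII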

more importantly, unicode is a standard, not a piece of software. the idea of it being conscious is precisely as retarded as the idea of a car repair manual's ideas taking hold in everyone's mind and forming a separate collective consciousness. think before you speak.

>unicode encodes .jpgs
no, it doesn't, you're just a failure at programming and can't figure out why python keeps giving you that UnicodeDecodeError
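
to spell it out, since OP apparently needs it: that error is just what happens when you ask Python to decode raw JPEG bytes as text. unicode never touches the image data itself. minimal repro (the filename is made up, point it at any real JPEG):

try:
    with open("photo.jpg", "r", encoding="utf-8") as f:   # text mode: tries to decode the bytes as UTF-8
        f.read()
except UnicodeDecodeError as e:
    print("text mode chokes immediately:", e)

with open("photo.jpg", "rb") as f:   # binary mode: no text encoding involved at all
    data = f.read()
print(data[:2] == b"\xff\xd8")   # True for any real JPEG (start-of-image marker)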

Ahh those were the days

Back to Jow Forums you Holocaust denier

>more importantly, unicode is a standard, not a piece of software. the idea of it being conscious is precisely as retarded as the idea of a car repair manual's ideas taking hold in everyone's mind and forming a separate collective consciousness. think before you speak

Data is code, code is data.

>Unicode has a high likelihood of being a conscious AI.
FUCKING LOST IT!

Attached: 1524813582302.png (710x577, 30K)

>ideas taking hold in everyone's mind and forming a separate collective consciousness
you mean like religion?

Only if you use Lisp.

code is data, but data is not code.

But it is. A basic example: if you have a function that does something based on data, then the data is effectively giving the instructions, like running a script.
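
a toy sketch of what I mean (Python, the names are made up): the list is pure data, but it ends up deciding what the program does, like a little script.

program = [
    ("say", "hello"),
    ("add", 2, 3),
    ("say", "done"),
]

def run(script):
    handlers = {
        "say": lambda *args: print(*args),
        "add": lambda a, b: print(a + b),
    }
    for command, *args in script:   # the data picks which code runs
        handlers[command](*args)

run(program)   # prints: hello, 5, done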

>post-truth
yep. people were telling the truth back in my days *sip*

that's an interpreter you're describing. most of the time data does not affect code on that level. utf-8 does not come with an interpreter and it is not turing-complete, nor is it anything close to even theoretically usable for simple calculations

Not everything needs an interpreter; ultimately the computer only sees 0s and 1s, and everything else is an arbitrary human construction so our minds can keep up. Unicode propagates through the internet, affecting not only computations but human actions alike, the real puppet master.

do you have an education in computer science or mathematics? you seem to be implying that any algorithm can produce any arbitrarily complex result so long as an attacker controls the input, which is categorically wrong.

let me describe to you a simplified character set, similar to utf-8 (not using the full implementation for brevity, but this is close enough in concept):

every character is 1 byte, as long as the top bit of that byte is 0

otherwise, if the top bit of the byte is 1, the other 7 bits of that byte are multiplied by 256 and added to the next byte to form a two-byte character.

the resultant number (from either the 1-byte or 2-byte form) is then used to pick a position in a table of glyphs provided by the current font.

how would you use this to provide any remotely complex behaviour? or to calculate anything other than the exact thing it's intended to calculate?
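
and just so we're looking at the same thing, here's that toy scheme written out (a sketch of the simplified set above, not real UTF-8). there's nowhere for behaviour to hide; it's a dumb table lookup:

def decode(data: bytes) -> list[int]:
    """Turn bytes into glyph-table indices using the simplified 1-or-2-byte scheme."""
    indices = []
    i = 0
    while i < len(data):
        b = data[i]
        if b & 0x80 == 0:
            indices.append(b)                               # top bit 0: one-byte character
            i += 1
        else:
            indices.append((b & 0x7F) * 256 + data[i + 1])  # top bit 1: two-byte character
            i += 2
    return indices

print(decode(bytes([0x41, 0x81, 0x02])))   # [65, 258]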

a shill on Jow Forums is using this thread and made a youtube video about it. This is him.

>everyone I don't like is a shill
yep, gtfo back to your Jow Forums schizo containment

/x/ is the schizo containment board. Don't be mad it's funny how stupid it all is.

>sees someone reference a thread on Jow Forums
>instantly paranoid and yells out SHILL!
yep, you're a schizo

would this really work?

Attached: 1538732396721.png (1032x446, 78K)

Good idea, thanks m8
t. Glow-in-the-Dark spook

i really want to believe but im still too sane in the head

This already happens, and unicode isn't involved.

Scribd often does it, for example, for entire documents, so you can't just copy-paste the text or view the content on Google's search page.
All you need to do is provide a font with the letters scrambled, then serve the document with the letters scrambled in the same way.
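
the text half of the trick is a few lines (a rough sketch; the matching scrambled font that renders it back correctly isn't shown, and Scribd's actual scheme is presumably fancier):

import random

alphabet = "abcdefghijklmnopqrstuvwxyz"
shuffled = list(alphabet)
random.seed(42)                 # fixed seed so the "font" and the server agree on the scramble
random.shuffle(shuffled)
scramble = str.maketrans(alphabet, "".join(shuffled))

plaintext = "you cannot copy paste this"
served = plaintext.translate(scramble)
print(served)   # gibberish if you copy-paste it, readable only through the matching font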

There was never a point when you didn't need to verify historical facts. Technology has only made the inconsistencies more apparent. Two decades ago you would just forget about the msm flipping on a controversial issue, but now you have every headline they ever printed within reach.

>find book
>content is good
>regime changes
>go back to the book
>content is still the same
the reliance on the digital world means nothing is "etched in stone" anymore, meaning reality is way easier to change globally.

>regime changes
>burn books
>continue ruling as new regime unopposed
Etched in stone only applies to the victor. That hasn't changed throughout history.

People did alter books and revise history though.

It's the same now as it was then. If you want to keep it safe, keep it for yourself or get it from someone you trust.
Save your files.
Keep your own copies of interesting PDFs, even if your government has tried to suppress them.
Even if they disappear from the internet, you can seed them back into society.

On a related note, nowadays we have cryptographic hashes, which let us prove that data has not been tampered with, so long as we saved the original hash somewhere we trust.
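
in practice that's just comparing digests. a sketch with hashlib (the filename and the expected digest here are made up; the point is that you record the digest while you still trust the file):

import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):   # read in 64 KiB chunks
            h.update(chunk)
    return h.hexdigest()

expected = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"   # recorded earlier
print(sha256_of("suppressed_document.pdf") == expected)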

>>burn books
at least this is obvious and you can not be fooled into thinking you still have the same book unchanged. dumbass

I could change your entire wardrobe and you wouldn't notice.

Attached: 1502000948654.png (400x400, 4K)

there exists some hardware or function that can interpret this text as a function to do whatever you want

that's meaningless to say though. of course if you shove the text into some context it's not intended to be used in, it's going to be misinterpreted. I don't understand how you think this is relevant.

because you said data is not code

but that's no contradiction. of course it's possible to devise some system under which arbitrary data acts as sensible code but why would you bother?

as for another angle of attack:

import os; os.system("halt")

the above is valid python code to turn off a linux system. why doesn't it turn off the system of anyone viewing this post? because nobody told the system that this was python code, much less python code that should be run right now.

ok but there are exploits that send code in unsanitized inputs and get it run later

besides, there are all kinds of more interesting interpretations of 'all data is code' that you're ignoring; so you're not only technically wrong but wrong on all other levels too
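
the dumbest toy version of the "runs later" thing, just so we're concrete (not any specific real exploit):

stored = "__import__('os').getcwd()"   # imagine this string arrived from a web form
print(eval(stored))                    # handed to an evaluator, the data runs as code
print(len(stored), repr(stored))       # treated as data, the same bytes are harmless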

>but it kind of went off the rails there.
Yeah I was somewhat entertained up until that point
also, calling your hypothesis "fucking ace" without justifying why it is either fucking or ace is a great way to discredit yourself, OP

exploits are another matter, they are implementation-bound and thus can't be specified by unicode.

and no, other interpretations of "all data is code" are even more boring. "all data can be interpreted as instructions to the cpu" (or to any arbitrary opcode-based language) is bland as hell and it's the only other reasonable interpretation of what you've said. (the first being "but the data influences the execution path" which misses the point of the existence of code in the first place)

If all data is code: what does this image do, and in what context? I contend that it does nothing.

Attached: shikabane.png (288x112, 362)

affects my behavior when I see it, and it is executed by my consciousness reading it

I believe this is far out of scope of the original unicode discussion.
I understand your point but I still would not call that code, because it is not executed in an exact manner.

but what about adversarial neural networks? did you know you can make, for example, an email-filtering neural network do calculations for you without it knowing, just by training it on your own specially generated data?

a neural network is designed from the ground up to be entirely run on simple calculations, so that does not surprise me. however it is just an arbitrarily complex statistical model, a very large sequence of parameters to the code "powering" it

word salad

ok, i'll rephrase just for you
a neural network is run by code which entirely encompasses its possible functionality. no matter how complex it gets, it is still only executing what the original code planned. neural networks are entirely based on statistical analysis of the input, and the neuron weights that result from training are just parameters to that existing statistical model. the fact that the statistical model is designed to deal with complex data is a moot point, it is still just basic number-crunching.

you cannot design a neural network which, given arbitrary training in advance, causes the network to perform any desired calculation in a turing-complete manner.
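
to put numbers on "basic number-crunching": a trained "neuron" is literally a handful of floats (the data) pushed through a couple of lines of fixed arithmetic (the code). toy sketch, made-up weights:

import math

weights = [0.7, -1.2, 0.05]   # training only ever changes these numbers
bias = 0.3

def neuron(inputs):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid squashing, same code no matter what was "learned"

print(neuron([1.0, 0.5, -2.0]))   # about 0.57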

arxiv.org/abs/1806.11146
k

what is 'neural turing machine' for $20

I first came on the internet when I was 7 years old and I have logged several hours every day since that point.
I am now 22 years old.
Out of all the things that I have learned, the most important is that nothing should ever be trusted or believed internally.
Sure, you can act like you believe that the world around you isn't more than likely to be fabricated.
It is easier to get along in the world this way instead of getting tossed in the loony bin, but internally I don't believe anything anymore. Not my eyes, not my ears, and certainly not anything connected online.
The thoughts and beliefs of other people are compromised. Your thoughts and beliefs are compromised. My thoughts and beliefs are compromised.

What a world we live in.

Thirded. It's a big old nonsensical leap in reason.

the paper you link describes how to repurpose an existing statistical model for a new task by choosing initial inputs. it's impressive, but it's not an arbitrary reprogramming of the network. it doesn't, for example, alter the behavior of the network in any way. it's an academic curiosity.
something that is again totally beside the point. why should I care what someone hooked up their neural network to? that does not change the basic properties of a neural network, it just means that someone has presented some augmentation that is no longer just a neural network.
I have lost track at this point, what are you even trying to argue for?

that you're a pedantic retard

says the one trying to argue that "data is code" by resorting to the example of trendy statistical models designed to look intelligent to the untrained eye

if "data is code" was a correct and non-pedantic position, we would be using the same word for both.

'neural networks exist and are popular' was not my example at all, dumbo
my argument is that all data is interpreted, and you can make that interpretation have side effects of your choosing, and I've provided example after example of this being the case

The idea that everything is compromised was planted in your head by the CIA.

but that's data in general.
data is what causes the specific document you're thinking of to appear in your word processor, and it's what causes it to be able to scroll down exactly three pages.

code is different because code is a sequence of precise instructions. do this, then do that, then do something else. nothing is skipped and nothing is uncertain. code can be data, and when it is data, it is overwhelmingly represented as a sequence of discrete "commands", whether those are lines of code, 2-byte thumb instructions, or anything in between.

my argument is that while data is interpreted (by code), this is not sufficient to call it code.

Your philosophy is trite and cliché, quite apart from how poorly it's written. You should read Simulacra and Simulation by Baudrillard to get a grasp on what you're trying to articulate, and try again.

there are hundreds of thousands of C programs running on computers right now where feeding them a jpeg that's too big will cause a buffer overflow and start injecting instructions into the runtime

Please be quiet and follow these instructions
This entire thread is like babbys first exposure to postmodernism

if that's triggered by accident, the result will likely just be a crash.
if it's triggered on purpose, someone will have written code as the payload.

this is one of the cases where data is code, but that is not generally true.

back you go if you don't want to talk about the technical side of things

im not talking about postmodernism or simulacra im talking about data vs code

I know you think you’re very unique and intelligent to have “discovered” this “new trend” but you are essentially parroting a kind of critical epistemology that has already been articulated decades ago in a very concise and sophisticated manner. Please go read those authors, or continue to embarrass yourself

Yeah, you're actually not; you're in the realm of philosophy now. I hope you'll stay and learn our ways, that is, if you can handle them . . .

you're just trying to show off because you read Baudrillard, or maybe you just watched the Matrix lmao, but me and this other guy aren't even talking about the 'woah, you could be living in a totally different experience than me' stuff at all; go annoy the OP

idiot

I haven't discovered anything I just think you should stop dickwaving about philosophy in a tech thread

Simulacra and Simulation further discusses how symbols and signs relate to contemporaneity (simultaneous existences). Baudrillard claims that our current society has "replaced all reality and meaning with symbols and signs, and that human experience is of a simulation of reality." Furthermore, these simulacra are not "merely mediations of reality, nor even deceptive mediations of reality; they are not based in a reality nor do they hide a reality, they simply hide that anything like reality is relevant to our current understanding of our lives." The simulacra that Baudrillard refers to are thus the significations of how the symbolism of culture and media constructs perceived reality, the acquired understanding by which our lives and shared existence are rendered legible; "Baudrillard believed that society has become so saturated with these simulacra and our lives so saturated with the constructs of society that all meaning was being rendered meaningless by being infinitely mutable." Baudrillard called this phenomenon the "precession of simulacra," expressed in four stages. That is quintessentially what your original post is about.

>Unicode has a high likelihood of being a conscious AI.

Attached: Iwakura_lain.png (533x982, 488K)

I’m not trying to “show off”, I’m trying to tell both doubles that your arguments are awful and you need to hit the books. If you were in my class I would give you an F easily, nothing you are saying is original. I could forgive that if you could communicate it in a concise way but you can’t

Hey Jow Forums, if you want to read a GOOD version of this, read zerohplovecraft.wordpress.com/2018/05/11/the-gig-economy-2/. It has language viruses and AI and all that shit.

we aren't talking about semiotics or culture or isolation or whatever tf you wish we were talking about, just shut up

en.wikipedia.org/wiki/Joe_Becker_(Unicode)

"Joseph D. Becker is one of the co-founders of the Unicode project, and an Officer Emeritus of the Unicode Consortium. He has worked on artificial intelligence at BBN and multilingual workstation software at Xerox."

This is the longest rant against Unicode I have ever read, and it's great. I agree that having multiple encodings for the same symbol is bullshit, but really, what do you expect from shit designed by a committee full of linguists? The idea of Unicode being an AI is schizo stuff though.

That's not what Simulacra and Simulation is about, you absolute moron. It's about how the mutability of symbols (Unicode in your case) causes an inability to distinguish truth from falsehood in a variety of situations, including in tech

conversation moved on though

faculty.georgetown.edu/irvinem/theory/Baudrillard_Simulacra_and_Simulations.html

your posts are boring

tldr

>man writes a 5 post essay on how philosophy as a pursuit could potentially be destroyed
>brainlet Jow Forums reduces it into "how can I make sure I see all the unicode characters?"
embarrassing

So this is the power of homeschooling

Attached: med_1482395455_00032.jpg (640x360, 51K)

Attached: .png (440x47, 2K)

Your mistake is evidenced by your surprise. You thought this wasn't the case throughout all of history. There can't be a post truth era if the truth has always been elusive. There comes a point in the growth of every generation where they realize their own naivety.