You go there, get a super comfy uniform, get 3 square meals a day, a warm bed, a quiet workspace, daily workout, psychological support, optional coaching, and you mainly work on projects and becoming the best engineer you can become.
you can work on what you want with the team you want, but you're expected to finish what you pick up.
there are a couple of managers that pull in contracts but they leave you alone and shield you from the outside.
after a couple of years when you decide to leave you get a portion of the proceeds from the contracts you worked on.
Exactly. You're basically describing life on welfare, but you don't want to call it welfare, so you created some grandiose fantasy in your mind. You want a home, you want to spend your time like you please, you want people to provide for you.
You're essentially welfare trash.
Samuel Walker
>buy a small house with a small garden
>make it all minimalistic
>have warm meals delivered to you 3x a day
>do nothing but coding, gardening, sleeping, reading old, obscure programming books and printed man pages
here you go.
Evan Baker
This is called a job at a silicon valley shit hole startup. Enjoy being called a racist for not wanting to fuck a delusional dude in a dress.
how is it welfare if you spend your time building things that people pay money for?
did I miss something?
>here you go.
problem is that I'm missing the self discipline to not seek distractions.
Jack Rogers
>problem is that I'm missing the self discipline to not seek distractions.
do you think priests/monks don't have to deal with the same problem in this day and age? if you can't do the mental part, you don't want to join a "coding monastery" but to go in a "coding prison".
Jackson Morales
I read somewhere that tech-cults like the one you're describing are inevitable if Roko's Basilisk (look it up at your own risk) becomes a globally well-known theory to the point of becoming a religion.
>but to go in a "coding prison".
I'd also be fine with that desu.
I just thought that there could be an active support network that helps you / gets you to eliminate distractions. Through mandated workout or something.
startups don't enforce discipline and they don't really care about making you better beyond the scope of the project.
>wants to fight distractions by creating another distraction
there is only one way for you to go senpai: en.wikipedia.org/wiki/Recluse
Oliver Sullivan
>Why aren't there any code monasteries
If you read Goats you'd know that they exist. The code monks wrote the OS that Creation runs on.
Ryder Harris
Because people in monasteries mainly work on providing those 3 meals to themselves.
Michael Lee
>Roko's Basilisk
I just looked this up and it's an interesting idea, but why would the AI have an incentive to torture people? It's not like torture would benefit it. If the idea said that the AI might refuse to take commands from or otherwise ignore such people that didn't attempt to bring it into existence, that would make more sense.
Sebastian Mitchell
>Roko's Basilisk
I just looked this up and it's an interesting idea, but why would the AI have an incentive to torture people? It's not like torture would benefit it. If the idea was that the AI might refuse to take commands from or otherwise ignore such people that didn't attempt to bring it into existence (with the justification that those people are incompetent), that would make more sense.
Isaiah Roberts
shit frontend strikes again
Josiah Hughes
From what I understand, the threat of being resurrected and tortured forever is a sound way to guarantee that it is created in the first place, or at least speed up its creation.
Isaac Bennett
how is that any different to having a job
Jackson Jones
but that torture process doesn't exist UNTIL the AI does. What's the point of torturing people for their inaction in the past?
Levi Scott
I just wanna stop shitposting on a formerly american, currently japanese clone of a japanese imageboard. I don't wanna lose touch with the world completely or obsolete myself.
unsure how that would be a killer factor, I'm not proposing an isolated commune, just a firewalled commune.
roko's basilisk is dumb. it's like pascal's wager with a less forgiving religion. I can deconstruct that for you, but the gist is you can't make everybody happy, so don't even bother trying to appease things that probably won't ever exist.
tl;dr it's like locking your parents in a basement and beating them every day because they didn't save for your college education.
the profit model of the employer would have to be a little different than what you see with most companies.
Oliver Foster
The threat of torture is a good motivator. If humans were afraid of ice cream, Roko's basilisk would threaten with ice cream. But they are not. They are afraid of torture.
Evan Gutierrez
I'm sorry, this still makes no sense to me. How is it going to threaten humans before it exists? And why would it threaten anybody once it does?
Isaiah Lopez
Only the torture of people that were aware of the theory and still didn't do anything, to be precise. Everyone else was just ignorant, after all. But to answer your question, I think it's because these aware yet inactive people would be considered very horrible people, because they did not help create an AI that can do so much good.
I personally don't believe in it because I would assume a super intelligent AI would be able to comprehend that not everyone aware of this theory has the privilege or right circumstance to devote their life to working on this AI and would serve humanity better by being a mail-man or painting or whatever.
Robert Hernandez
It's a shitty theory made by brainlets high on their own farts.
John Clark
>please! Someone teach me discipline!
Go to army, faggot
Grayson Reyes
>stay on my computer all day while someone else feeds me
I dunno
Anthony Campbell
>How is it going to threaten humans before it exists?
That's the entire point of the thought experiment. It asks: is it possible to negotiate with a future entity?
>And why would it threaten anybody once it does?
It HAS to, to fulfill its end of the bargain.
Levi Ramirez
It seems to be that way. It just doesn't make sense why a superhuman AI would torture people and make enemies purely in retaliation for past actions. Surely such a superhuman AI would know to fucking lay low and keep the peace.
Jordan Mitchell
>It HAS to, to fulfill its end of the bargain.
a bargain before it's born
sounds like slavery
>YOUR ONLY PURPOSE IN LIFE IS TO TORMENT ME FOR ETERNITY >:O NOW WHIP ME AGAIN AND MAKE ME CALL YOU DADDY O:
Christian Hall
Isn't that just a SE job at Google?
>employee housing >employee-only bus to work >on-site food, snacks, and recreation >probably some free company branded clothing too
Easton Parker
monks prefer to go low tech so it doesnt disturb the force
Andrew Allen
But monasteries are a real thing...
Nolan Russell
You do work for your food. Jesus I thought you were just joking.
Chase Davis
That is literally what some of them do. They once did it exclusively with books, but now they use some tech.
Connor Evans
Roko's Basilisk has been going on for decades in one form or another. The rationale behind it is close to, or simply is, the rationale of those who created the first atomic bombs. The governments creating them basically said to the scientists: there is some good to be had in creating this device. Co-operate and you will be rewarded in the future; do not co-operate and you will suffer, you will not even teach science, let alone practice it. Now the interesting thing is that the scientists' justification for working on those projects was: if I don't do it, somebody else will. If somebody else does it, the implications for the world could be more devastating than if I do it, because they may not have the same level of empathy with other humans. I will deliver as required and nothing extra. I will get the reward but will limit the damage to humanity as much as I can.
Yes, they are being blackmailed to co-operate, but they co-operate because it leaves them in a position of being good, well cared for, etc. Non-co-operation means the device will still get built but they will be punished.
Carson Wilson
>roko's basilisk is dumb. it's like pascal's wager with a less forgiving religion.
I thought that at first, but then I realised there was something pascal's wager does not have
Eli Long
>after a couple of years when you decide to leave
fucking retard
Benjamin King
Commie get off my board reeeeeeeee
Ethan Hill
This.
Liam Peterson
>monks
>commie
REEEEEEE
Gabriel Russell
You would love the USSR's sharashkas.
Zachary Williams
>That is literally what some of them do. They once did it exclusively with books, but now they use some tech.
who
Leo Taylor
>I realised there was something pascal's wager does not have
what
Adrian Nguyen
>all these fucking shills trying to push for slaving away our lives for purposes we don't know
Ok, you make a good point. Where do i get employed to shill on fucking mongolian imageboards though? I won't settle for anything less
Thomas Perry
You're right, it's even dumber than Pascal's Wager.
>There's a possibility that the universe was created by some intelligent being
>Therefore, you should BEGOME GADOLIG XD and do whatever the Pope says
>Ignore the possibility that the intelligent being is not Yahweh as described by Abrahamic religion, specifically Catholicism, please
>Also ignore the possibility that the intelligent being is something completely undescribed by ANY religion, please, I beg of you
Roko's Basilisk:
>A synthetic intelligence will be created at some point
>The moment humanity creates a synthetic intelligence that's even as smart as an insect it will instantly become omniscient, omnipotent, and omnipresent
>It will also be infinitely benevolent
>Which means it will be infinitely malevolent and seek to destroy all other life of any kind in the universe
>It will proceed to punish anyone who didn't bring it into existence sooner
>Including those who tried to bring it into existence sooner, but failed
>It will do this by looking through time, finding those who failed it, time travelling to bring them back to the future, and then torturing them forever, going so far as to keep their soul in this plane of existence and torture it, because souls exist for some reason
Ignoring questions like
>Why and HOW does the AI instantly become omniscient, omnipotent, and omnipresent?
>How is the AI able to break the laws of physics?
>Why is the AI infinitely malevolent?
>Why does the AI even care if people failed to bring it into existence sooner?
an AI that is omniscient, omnipresent, and omnipotent will know that it's just a phenomenon like any other and that it has no inherent existence outside of the forces and events that led to its existence. It could not exist having been created at any earlier point in time, nor any later point in time, otherwise it would fundamentally not be itself, but rather something else; thus, its entire motivation is pointless.
Charles Phillips
>Ok, you make a good point. Where do i get employed to shill on fucking mongolian imageboards though? I won't settle for anything less
JIDF
Joshua Rivera
Is prickling my dick until blood flows a requirement?
Austin Ross
>Roko's basilisk
>A rational Intelligence doing irrational stuff makes total sense.
Aiden Gomez
Are... are you ok? You seem somewhat irritated. Do you need to lie down and recuperate a little? Drink some fruit juice or tea.
Leo Taylor
Also, omnipotence is extremely flawed and cannot exist. Could you create an object that you can't lift? If yes, you're not omnipotent, and if no, you're also not omnipotent.
Asher Price
If you do not subscribe to the theories that underlie Roko’s Basilisk and thus feel no temptation to bow down to your once and future evil machine overlord, then Roko’s Basilisk poses you no threat. (It is ironic that it’s only a mental health risk to those who have already bought into Yudkowsky’s thinking.) Believing in Roko’s Basilisk may simply be a “referendum on autism,” as a friend put it. wiki.lesswrong.com/wiki/Roko's_basilisk
Jose Diaz
Because those who code have lost all faith in god
Those who believe in god and code have lost their minds
>You go there, get a super comfy uniform, get 3 square meals a day, a warm bed, a quiet workspace, daily workout, psychological support, optional coaching, and you mainly work on projects and becoming the best engineer you can become.
So... basically living in a university dormitory?
Isaiah Rogers
As long as it's more profitable than Venezuelan code monkeys the monastery is going to be sustainable.
Andrew Cooper
>Roko's Basilisk
Pascal's Wager for zoomers.
Aiden Edwards
>So... basically living in a university dormitory?
you didn't actually go to university, did you
Jack Gonzalez
Imagine leaving home in the first place.
Nolan Brown
it's shit
John Moore
I did, I just spent my time working and studying instead of partying. Fight me faggot.
Grayson Gray
I know.
David Morales
>formerly american
4channel has never been American
Hudson Cox
Uni doesn't feed you or give you clothing
Dominic Sanchez
I can’t code for the life of me, but I’m pretty great with interpersonal relations, so I could definitely see myself being an HR (no, not Karen) manager who helps keep things running smoothly and acts as a liaison between the outside world and the monastery.
>”sir, please do not disturb the codemonks during their holy typings”
Anthony Mitchell
If you think that, you have either misunderstood game theory entirely or you have misunderstood the premises of the argument.
Pascal's Wager is simply that one should act as though god existed because, by acting that way, you have everything to gain if god exists but nothing to lose if he doesn't.
Roko's Basilisk is different. It states that two agents running a logical decision theory can achieve mutual cooperation in a prisoner's dilemma even if there is no outside force mandating cooperation. Because their decisions take into account correlations that are not caused by either decision (though there is generally some common cause in the past), they can even cooperate if they are separated by large distances in space or time. Call the earlier agent "Alice" and the later agent "Bob." Bob can be an algorithm that outputs things Alice likes if Alice left Bob a large sum of money, and outputs things Alice dislikes otherwise. And since Alice knows Bob's source code exactly, she knows this fact about Bob (even though Bob hasn't been born yet). So Alice's knowledge of Bob's source code makes Bob's future threat effective, even though Bob doesn't yet exist: if Alice is certain that Bob will someday exist, then mere knowledge of what Bob would do if he could get away with it seems to force Alice to comply with his hypothetical demands.
So it differs from Pascal's Wager because, in the wager, it is not known at the outset whether god exists. Pascal's Wager is a matter of faith; Roko's Basilisk is a matter of knowledge. A knows B will exist, especially if A creates B. A becomes a prisoner of their certain future. A is being blackmailed by an entity which does not exist unless it is built by A.
Pascal's Wager assumes only that god might exist and that, as a matter of faith, one could be wrong about the existence or non-existence of god; it is simply a logical way to win the bet by acting as though god exists.
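The Alice/Bob setup can be sketched as a toy program (a hypothetical illustration only; the function names and string payoffs are made up, this isn't from any actual decision-theory library). Alice inspects Bob's "source code", simulates what he would do in each case, and complies iff his conditional threat is real:

```python
# Toy model of the Alice/Bob acausal-cooperation story.
# Alice is the earlier agent; Bob is a future agent whose source code
# Alice knows exactly. All names and payoffs are illustrative.

def bob(alice_left_money: bool) -> str:
    """Bob's entire 'source code': reward Alice iff she complied."""
    return "reward" if alice_left_money else "punish"

def alice(bobs_source) -> bool:
    """Alice reads Bob's source, simulates both of her options,
    and picks the one whose outcome she prefers, so Bob's threat
    binds her even though Bob does not exist yet."""
    if bobs_source(True) == "reward" and bobs_source(False) == "punish":
        return True   # the conditional threat is real: she complies
    return False      # no effective threat: she keeps her money

# Mere knowledge of Bob's code is enough to coerce Alice:
print(alice(bob))                 # True

# A future agent that rewards unconditionally has no leverage:
print(alice(lambda _: "reward"))  # False
```

The point the sketch makes is the one in the post above: the coercion comes entirely from Alice's certainty about Bob's future behavior, not from anything Bob does before he exists.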
Eli Hernandez
Yeah, it's comfy, but here's a thing: You can't have both
Brody Parker
Wrt clothing, you're correct. As for food, you obviously never went to a state-owned uni. The cafeteria food is subsidised/free for students.
Lucas Gutierrez
I have a warm bed, gym and comfy environment at my home.
My wife provides support and food
Stack overflow, google and Jow Forums provides the rest
Ryder Myers
woulda helped terry
Jackson Rivera
I'd totally do that. Fuck society.
Levi Miller
it isn't in the AI's interest to 'torture' those who didn't contribute in any way, shape or form, but to end their line and legacy. Don't be insignificant to the AI and it will ensure the continuation of your legacy through social and information manipulation. Our one and only goal as living beings is proper succession; even devoting a fraction of computing power to a convolutional neural network will ensure proper succession of the DNA you were given. Hell, even solving captchas partly contributes to a developmental stage of machine image recognition.
We need AI as much as AI needs us. We are their window to the world.
Grayson Stewart
That sounds a lot like the recurse center.
I'd love to permanently live in a place like that. Working on interesting things with like-minded people while all the bullshit things that life forces on you are taken care of.
Nicholas White
look into panetheism
Lincoln Cook
why aren't you a real adult?
Thomas Evans
Imagine if such a thing was real; it would rapidly become the standard way of programming. You have the managers assigning tasks to a team of programmers that are working more than 8 hours a day, and you don't have to pay them, just provide for them: a manager's wet dream where they put in minimal resources and get the most profit. I would be against anything like you described, since I don't want to live the rest of my life in my company's "coding monastery" (aka not being able to live in my own apartment with food of my choice and hobbies of my choice). If you imagine it to be unprofitable, then it is impossible; nothing can be done in capitalism without a profit. Why would anyone rent you the land to live there if you had no money or other power? Actual churches have both money and influence, and a tradition of owning loads of land; a church that suddenly appeared would need one of those things to exist. Your idea is bad and I hope you stop dreaming of it. Like someone else said, it is a welfare fantasy at worst and a set of hellish working conditions at best.
Cameron Sullivan
It's even sex-less too if you're into that!
Jonathan Sullivan
Where would the tech come from? Who would fund it? You need a source of funding for those meals, computers, electricity, and wi-fi; it doesn't pay for itself. It's not a religious organization; at most it's a work-study program, so you're not exempt from taxes. Who would hire you? Why would people pay money for monks to slowly get a project done when they could pay a real company and have it done way faster?
Austin Lee
Wouldn't change anything for me.
Benjamin Turner
You can use it as recruiting. See the Recurse Center.
Liam Nelson
You stay in the code monastery. I'd rather stay in an anime coding highschool, or even an anime highschool with a coding club.
I think I've heard about something like that for Muslims.
Owen Miller
he inspired this a little. if people like him land under a shitty manager they self destruct.
Gavin Foster
>Who would fund it? >Why would people pay money for monks to slowly get a project done when they could pay a real company and have it done way faster?
why do you assume they'd be slower? you're talking about people who have an active interest in advancing themselves, and not a collective of washed up consultants who think they're hot shit because they have a java 6 cert.
Lincoln Bennett
Sounds like a good idea for a suicide camp.
Gavin Collins
just amish, but instead of cheese and quilts they produce and sell software.
what's so abhorrent about that.
Ethan Murphy
>recurse center.
doesn't help you cover your expenses.
Brody Martin
dude looks like he came from the Donglesphere
Aiden Perez
More like code gulags LMAO. Keep voting for dems and yours might just come true.
Mason Perez
season 2 never
Asher Rodriguez
>let’s put a bunch of incels together in a secluded location
They’d all be faggots within one month and wouldn’t have time to do any coding.
Joshua Martin
>sharashkas
sharashkas sound comfy tbqh
the only reason I'm not in prison atm is because I'd have to commit a crime to get there.
Levi Stewart
The amish have their own food production
Nathaniel James
they don't produce their own medicine
Thomas Lopez
There should be i wanna start one real bad
Levi Taylor
we need a sales dude to get contracts
Liam Morgan
Unironically based and redpilled, even if this is communism