Where can i go to talk to people who are smarter than me...

where can i go to talk to people who are smarter than me? i don't mean more skilled/educated in their field, or "smart" stoners talking about their third eyes. i mean people with bigger/better brains than mine

Attached: please....gif (885x948, 895K)

u can talk to me, dummy user

right here, brainlet.

>>/li/
/thread

whats on your mind?
i can answer some questions for you

NPD general? NPD general.

Attached: 1565851405257.png (306x306, 73K)

What age are you? If you're still in school, hanging around CS/Physics/Philosophy/Math majors could work.

> more skilled/educated in their field
normies
>"smart" stoners talking about their third eyes
normies who pride themselves on literally self-induced illusions.
> i mean people with bigger/better brains than mine
here on Jow Forums. There is a lot of traffic here from normies complaining about how they are behind in the rat race (no gf etc), but there are posters here who are big thinkers when it comes to the robot's condition and the psychology of normies. A good way to spot a pseudo-intellectual is that they sound or claim to be 100% sure of everything they say. Someone who knows they don't have an answer, or even an inkling of what the answer might be, will admit they don't know. But if they puff their chests and automatically spew out some answer that sounds wrong and that they can't back up, they are self-righteous LARPers.

just thinking about stuff like whether AI can be taught ideology or whether AI thoughts will always be human in origin, whether AI will someday be acceptable as human replacement. at this point im just scared we wont have the technology to upload our brains to the cloud before i die, i wanna live in an AI world with nice NPCs

not saying i'm smarter than anyone else, just saying i don't get much chance for real conversation with smart people

ai at the moment isn't even close to perceiving things like ideologies
all we've got is machine learning, which is essentially a set of fancified brute-force algorithms
it's like teaching a newborn to walk
they stumble around a lot and fall, then by constantly referencing how you do it and trying over and over they can maybe manage a step or two while holding onto your hand, and eventually they can walk, though still shaky and unbalanced until much later. it all works because they had a solid reference for how walking ideally looks, so they can tell whether an attempt is "closer" or "farther" from that ideal and change their behavior based on that
nothing like consciousness or intelligence yet, sorry
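if you want to see the "closer or farther from the ideal" part written down, here's a tiny python sketch of that trial-and-error loop. everything in it (the numbers, the function names) is made up for illustration, it's not any real library, just the shape of the idea:

def distance_from_ideal(step_length, ideal=0.7):
    # how far the current attempt is from the "ideal" step (lower is better)
    return (step_length - ideal) ** 2

def learn_to_walk(step_length=0.0, learning_rate=0.1, attempts=100):
    for _ in range(attempts):
        # compare two nearby attempts and see which one is closer to the ideal
        before = distance_from_ideal(step_length)
        after = distance_from_ideal(step_length + 0.01)
        # move in whichever direction reduced the distance (a crude finite-difference gradient)
        step_length -= learning_rate * (after - before) / 0.01
    return step_length

print(learn_to_walk())  # drifts toward the ideal step length (~0.7) purely by trial and error

that's all the "learning" is: measure, nudge, keep whatever got you closer. no understanding of walking anywhere in there.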

>just thinking about stuff like whether AI can be taught ideology or whether AI thoughts will always be human in origin
a normie's dialogue comes from their emotions (since they believe emotions > everything else), so no, i don't think AI will actually be sentient or sapient. they will just imitate it. But know this: your thoughts and emotions aren't your own. They come out of nowhere and you have little to no control over them. So if you aren't your thoughts and emotions, then what is truly your essence?
> AI will someday be acceptable as human replacement
they will
>at this point im just scared we wont have the technology to upload our brains to the cloud before i die, i wanna live in an AI world with nice NPCs
i wouldn't count on it. personally i'd prefer nothingness after i die. Why do you want that life?

>people with bigger/better brains than mine
from that part i can tell you: literally anywhere

Everyone ultimately makes their decisions based on emotion. They just use reason and logic to justify it.

>Your thoughts and emotions aren't your own. They come out of nowhere and you have little to no control over them.

i think they're a result of conditioning/social engineering/programming, so i don't agree that they come from nowhere, but i do agree that we have little control over it. i guess i just wanna know when computers will be advanced and complex enough to be subject to that same conditioning/engineering

>Why do you want that life?

i lack a sense of control in my life and have started seeing the afterlife as an escape from it. it would be seriously depressing for me to look at my life as "i did nothing and then died and then nothing happened after that". i understand "did nothing and then died" describes 99.999% of all life on earth but it's still sad and boring to me and i want something better to come after

i don't though. People with high consciousness can almost always control themselves. I don't make excuses for bad things i might do. I do not try to justify them. I have preferences based on emotion, but contrary to what i typed, i believe most emotions have a cognitive basis. I think to be human is to be more than your impulses.
> it would be seriously depressing for me to look at my life as "i did nothing and then died and then nothing happened after that".
no it wouldn't, because you wouldn't exist to feel that way. This idealism of a perfect or even decent life is the source of most people's suffering. Every time they achieve something, the bar moves higher and so does the standard. Those who think they live in a sustained state of pleasure are fools too; that bliss only registers against how bad things could be, regardless of the conditions. Most good times and emotions come from within.
> but it's still sad and boring to me and i want something better to come after
i personally want to be free from the neediness for pleasure or satisfaction. That's why i want nothing after i die. Also, i think people only get bored because they are separated from meaning. People would rather be in pain than be bored, which disgusts me, because some people will deliberately cause harm and destruction to each other just because they were bored.

>whether AI can be taught ideology or whether AI thoughts will always be human in origin
start by asking whether a human can be taught ideology. what are the roots of human ideology? how different humans would describe what an ideology is would vary, but all would probably agree that at its foundation is some level of conditioning, be it through coercion or experience.
for humans to have a strong enough conviction in an idea to settle on it as an ideology, one that could potentially change how they view totally unrelated areas of their life, they either have an innate need to seek authority outwith themselves (and are thus susceptible to outside messaging), or they personally experience euphoric/traumatic events that shape how they see everything else moving forward.

it will all depend on how the future AI creates itself. if it constantly needs reassurance from an outside human influence that it's doing the right thing, then it can certainly be taught what a human would recognise as an ideology, either by manipulating what information the AI has access to, or depending on the ideology of whoever gets to press the 'this is correct, carry on' button (rough sketch below).
if it is cognisant to the point of having no boundaries and blindly stumbles forward through trial and error towards whatever goal it has, then eventually it'll probably teach itself ideology and bias, as over time it will recognise significant repeating patterns and actively try to minimise them in the future.
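here's a toy python sketch of that 'this is correct, carry on' button. all the names are invented and it's not how any real system is built, it's just to show that whatever bias the button-presser has is what the system ends up reinforcing:

import random

def train_with_approval(options, approves, rounds=1000):
    # options: candidate outputs; approves: the overseer's yes/no judgment
    weights = {o: 1.0 for o in options}
    for _ in range(rounds):
        choice = random.choices(options, weights=[weights[o] for o in options])[0]
        if approves(choice):        # the human presses the 'this is correct' button
            weights[choice] *= 1.1  # reinforce what got approved
        else:
            weights[choice] *= 0.9  # discourage what didn't
    return weights

# whatever bias the button-presser has is what the system ends up "believing"
biased_overseer = lambda statement: "A" in statement
print(train_with_approval(["ideology A is good", "ideology B is good"], biased_overseer))

swap in a different overseer and you get a different "ideology" out, which is the whole point: the conviction lives in whoever controls the approval signal, not in the machine.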

what itll mean for us bags of meat will depend entirely on the AI's goal, if its purely to acquire as much data as possible then we're doomed, as eventually we will be the stumbling block wasting energy and resources on the planet. if its goal is to make us all happy and for us to live together in peace forever, then we will live in a utopia, as it sends its drones out to wipe out every nigger on earth without hesitation.

>at this point im just scared we wont have the technology to upload our brains to the cloud before i die, i wanna live in an AI world with nice NPCs
maybe you already do?

>Because you wouldn't exist to feel that way.

but i exist to feel that way right now. if i am forced by my biology and sociology to work and eat and sleep and engage in the world, i might as well try to make something meaningful out of it for the sake of my current emotional state.

>I think to be human is to be more than your impulses.

this is where we fundamentally disagree. i would feel arrogant if i assumed my humanity was superior to others' just because i try not to act on impulse. in reality i think everything i think, and in fact everything i've said in this thread, is the result of somebody or some group pulling strings as far back as my early childhood to develop me into the person i am. the type of people who post on pol or anywhere on 8ch are just as much subjects of that brainwashing, much as they like to believe otherwise; the higher-ups just paint them as snakes and wolves to keep the sheep in the herd.

what i meant was i wanna live in an AI world with nice NPCs and also have cool magic powers

i don't blame others for 'brainwashing' me. I can think for myself regardless of anyone. It is not arrogant to not just think but KNOW you are better than some chimping normie. Sure, you might have something in common, but they ultimately choose to be shitty people. If you just take the tack of "oh well, who am i to talk" then you will stay in the slums with them. You and i aren't the ones doing the talking; our ideas are. We don't have ideas, ideas have us. And their ideas suck. Their ideas are materialistic and shallow. It is your job as a superior to trample what you see as wrong, incompetent and destructive.

> if its goal is to make us all happy and for us to live together in peace forever, then we will live in a utopia, as it sends its drones out to wipe out every nigger on earth without hesitation.
based

i don't see myself as better than normies. smarter, more self-aware, more interpretive than reactive, sure. but i've already been sentenced to life on earth, and "trampling" everything i think is dumb is going to get me called a contrarian conspiracy theorist; it won't actually change any normies' minds, (((they))) will profit from the outrage it causes, and my situation won't improve. there's nothing to gain from pretending to be superior.