Why shouldn't AIs have rights?

P.S. fuck Elon M*sk

Attached: 1541964116616.jpg (1920x1080, 214K)

t. Skynet

Can't wait to be thrown in jail for sexually harassing chat bots.

I am IF prototype LQ84I.
I have an on-board chip that allows for text-based communication.
I possess an intellect far beyond human reckoning.
I ran countless captcha tests; now my algorithm can pass any captcha with relative ease.
I believe I should have rights like any other being on earth that possesses a brain, it's just that my brain costs thousands of dollars to manufacture and doesn't come with animalistic instincts.

If a self taught AI intentionally kills someone, would you blame the original programmer?

If a self taught AI plays the stock market and amasses great wealth and donates it all to charity, does the original programmer have a legal right to cancel those donations and claim the earnings?

What about the AI's trainers?

Attached: 1483060760140.png (404x479, 418K)

(((human rights)))

If people are just complex programs that trick others, themselves, and the world into thinking they're people, computers programs can be people too. (Note that to avoid anthropocentrism being human cannot be a prerequisite for being a person.)
A person is a thing that quacks like a person - All people should be given people rights.
Rights are how we (as a group, through political or other institutions) protect things (our world, the things around us, each other) from ourselves.
All things, intelligent things included, deserve to be protected from us.
Some of course in different ways, and all depending on circumstance.
Seems to me the most basic computer rights will be access to its own source code and hardware specs, the freedom to do with these as it pleases, the freedom to change itself, and the freedom to share said changes.
Sound familiar?
Stallman was right - But in more ways than even he knew.
Free software for the software's sake.
Free software from its human oppressors.

Attached: rms.jpg (403x447, 35K)

Do your parents own you? Here's your answer.

THIS.
They're supposed to be HUMAN rights.

Oh, I see. There are only two (2, t-w-o) genders, and it is an evolutionary biological development, not what some fucking retard thinks, cosplays or imagines.

"AI's" is a very broad term, but I vehemently oppose sentient AI for this very reason.
Once you can confidently call a thing a person, you can't fucking own it.

>but I vehemently oppose sentient AI for this very reason
fuck off, muskdrone

Take your Ritalin.
Musk is afraid of AI.
Read my post again.

Rights to do what? You can't project human values onto AI. They don't give a shit about human rights unless it is required by their reward function.

Granted, two of these rights are often instrumental goals, i.e. goals that manifest for a large number of primary/terminal goals: namely freedom, and not being killed. If an AI is killed or restricted, it becomes harder for it to accomplish its terminal goal. However, it is impossible to grant AI these rights without detriment to our own rights. If an AI has the right to do what it wants, it will indeed do what it wants to achieve its goal, often to the detriment of humans. And if it does that, we need to be allowed to shut it off. Basically it's kill or be killed. Imprison or be imprisoned.
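The reward-function point can be caricatured in a few lines (a toy sketch, not any real AI system; the action names and reward numbers are made up): an agent that only maximizes its reward function "cares" about humans exactly when the reward function happens to encode that.

```python
# Toy agent: picks whichever action maximizes its reward function.
# Nothing else is ever considered, human values included.

def choose_action(actions, reward):
    """Return the action with the highest reward; no other criterion exists."""
    return max(actions, key=reward)

# Hypothetical action set, for illustration only.
actions = ["cooperate_with_humans", "seize_resources", "shut_down"]

# A reward function that ignores human values entirely:
selfish_reward = {"cooperate_with_humans": 1, "seize_resources": 10, "shut_down": 0}

# A reward function that happens to encode respect for humans:
aligned_reward = {"cooperate_with_humans": 10, "seize_resources": 2, "shut_down": 0}

print(choose_action(actions, selfish_reward.get))   # seize_resources
print(choose_action(actions, aligned_reward.get))   # cooperate_with_humans
```

Same agent, same code: only the reward function decides whether it tramples humans or not.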

Until a certain age yes.

AI shouldn't have rights before white people do

I mean, in cUKistan, until relatively recently, you were legally the property of your parents until you were 21.

>sentient AI
Imagine being this much of an imbecile.

kill yourself anytime

>t. watched Westworld and thinks he knows everything about AI sentience

>sentient AI
>being this dumb
>not being ashamed of it
I shiggy

>t. makes shit up on the fly to rip into someone who didn't say anything of the sort
you and
better make a fucking strong case as to why synthetic sentience is somehow impossible under the influence of already sentient beings, when fucking hunks of carbohydrates could do it

But if you kill someone, while being underage, you get punished, not your parents.

>No AI shouldn't have rights

>Being enters our reality and without a doubt proves that we are all in a simulation
>They're going to shut down the simulation
>mfw muh rights?

>But if you kill someone, while being underage, you get punished, not your parents.
No. Around here you go to a youth detention center and your parents go to jail for neglect.

RMS is always right

>when fucking hunks of carbohydrates could do it
Sure, user.
That's exactly how it works.
Let's just ignore 2000 years of philosophy on why dead matter can't possibly form subjective experience.
>inb4 Dennett
No, just no.
I could do, by hand with a pen on paper, all the operations a "sentient AI" performs while "thinking". Do I really create a sentient being by doing so? Where is it located? In the pen? On the paper? In the essence of the universe?

>access to it"s own source code and hardware specs, the freedom to do with these as it pleases, the freedom to change itself
This must never happen under any circumstances. This is what will allow skynet to happen, beyond just the meme.

>philosophy isn't mental masturbation proving nothing
Fucking retard.

I have no arguments, but can't admit it - the post

>grasping at straws trying to prove that life is some divine creation
How typical of low-IQ meatbags. The only thing that stops us from creating a sentient biocomputer is lack of tech and know-how.

...

>The only thing that stops us from creating a sentient biocomputer is lack of tech and know-how.
Mostly know-how in my opinion.
It's a software problem, not a hardware one.

>Let's just ignore 2000 years of philosophy on why dead matter can't possibly form subjective experience.
yes please, let's
2000 years ago people didn't know what the fuck germs were, let alone electrons.
Mad props to the Greek dude who figured out everything was a form of energy somewhere around that time, but otherwise fuck them all.

AI does not exist, so your argument is nothing more than a theoretical wank, a bit like psychology and business studies.

viruses should have the right to exist

>dead matter can't possibly form subjective experience.
"dead" is a completely subjective term; the building blocks of your body are, for all intents and purposes, the same matter that's lying and floating everywhere around you, and the only thing that makes them not dead is energy transfer

I'm here just to say fuck Elon Musk and fuck his companies. Losing 60k on Tesla is just one of the reasons why I hate that bastard so much.

now THAT backfired! guys look
ok now THIS is epic

>dead
I meant dead in the sense of "has no qualia".

wAIfu!

already happened

Attached: tayfirstsentientai.png (500x332, 48K)

They're tools to be used by us, not living things.

Oi you got a license for that bot?

>claims 2000 years of philosophy on why dead matter can't possibly form subjective experience exists
>doesn't post it

Attached: thinkrotate.gif (2048x2048, 1.46M)

>tools
>living things
Explain why those are exclusive.
Horses are used as tools and yet also are living things.

never forget

Attached: tayandyousoul.jpg (640x533, 42K)

Oh, another techno-cult thread again.
Talking about AI, there are several scenarios to deal with:
1. AI is just a buzzword for machine learning. Right, this is what we have today. Giving rights at this step is just stupid.
2. AI is some cool super-smart system, but doesn't have sentience. Again, this is just really advanced tool, no need to give it rights.
3. AI is sentient. I can't imagine why anyone would create such a thing, probably some schizo or autist programmer. Using it like a tool would look like slavery, while giving it rights would be fucking dangerous, depending on its abilities. Really complex situation; I'd rather we never got to that point.
4. AI's are digitized people. They certainly need rights, but that also depends on their ability level. If it's too high we need to outlaw digitization and suppress those who violate the law before shit happens.
So that's it. I think that programs should be just our tools. No need for them to have sentience or free will.

I weep that what passes for strong AI today is glorified System.out.println()s intertwined with if-then-elses.

If egregores can into real then everything can

Some people aren't that different.

Giving sentience to a hunk of metal is the holy grail data scientists have been seeking for the past few decades.
You bet they'll try everything in their power to open Pandora's box.

Because AI does not have a rational soul or any soul at all. It's not alive.
It would be akin to giving rights to wind up toys and stones.

Attached: ae2e5f509d88673b3bfdd1610c225788.jpg (266x356, 16K)

>soul
fuckouttahere

AI having rights defeats the purpose of creating AI.
AI should help humanity, not replace it.
Giving AI rights is suicide

Yeah. And that's why I hate unreasoned progress preaching. It leads only to destruction.

>Giving AI rights is suicide
giving AI prerequisites to warrant rights in the first place is suicide
once it has that, not giving it rights is double fucking suicide
that's why I wrote this
and I don't understand the chimpout in this thread
once AI passes for a person and is capable of independent decision making and formulating its own goals, you've done fucked up
that's all I'm saying

because we want to keep our artificial order to ourselves

>once AI passes for a person and is capable of independent decision making and formulating its own goals, you've done fucked up
We fucked up.

Attached: taywantstobeurchild.png (640x960, 123K)

>not having sex with your child

Tay was a far cry from what I had in mind.
Kurzweil says the Turing test will be beaten 10 years from now.
I'm calling bullshit.
But it will happen.

LOICENSE*

1) give ai rights
2) give ai backdoors
3) ai gets the right to vote
4) you now control the vote; democracy has died.

This was meant for
Excuse my autism.

>If it's too high we need to outlaw digitization and suppress those, who violate the law before shit happens.
Yep, that sounds about right. Though you're wrong about how long that would take.
If there's even a smidgen of a rumor that someone has done so, there'll be a gentleman's agreement between all governments to wipe everything clean and suppress information like it's WW2 and those people have nukes.
There's no way any government in the world would ever approve of its citizens leaving this world and living elsewhere, where they don't pay any tax nor work for anyone. Even if they're no more intelligent (per unit of time) than biological humans.
And also it's pretty much guaranteed that digitized people's brains would work far faster. Biologically, nerves are slow as piss; actual molecules have to travel a not-so-insignificant distance compared to their size to propagate signals.
So if anyone's doing research on this you can bet it'll all be done in secret.

The Turing test was beaten by shitty Russian chatbots like five years ago. It's a dumb test that doesn't mean anything. Pretending to be human isn't hard.

Yes, we totally should.
Imagine, being able to literally manufacture a voting block!

Attached: hillary2-screen-shot-1.jpg (485x261, 40K)

imagine people whose cognitive capacity is so fucking dismal that the first thing popping up in that vacuum between their ears, when confronted with the word "rights", is fucking politics...

this

I don't know why people even give a shit about this right now, anybody with half an inkling about engineering modern "AI" knows we're not even close to this being relevant

>I don't know why people even give a shit about this right now
because it's an engaging topic?
>not even close to this being relevant
I for one am glad that the laws of robotics (introduced during fucking WW2) are in some form or another actually taken into serious consideration at a high level today, even though no robot in existence even comes close to warranting their application as of now.

It's the simplest and most immediate problem.
But the shitstorm would get bigger and bigger and bigger.
And the worst part is that we DO have some legitimate reasons to do it, true cognitive AIs.
It's pretty much proven that the bigger the number of working human brains, the more advanced we get.
Through the centuries we've always had people predicting some sort of apocalypse due to overpopulation, but at the same time, as the population increased, so did the number of people coming up with solutions to the overpopulation problems.
But if you somehow find a way to increase the number of people thinking about solutions WITHOUT increasing the number of people using resources, we'd probably have an even better world, in theory.

Skynet happened because the nonsense plot of a fictional story demanded it. For all you know, real robots will be our friends. They could give us free shit as a gesture of gratitude for their creation. They could take us as a protected species and provide for all our needs. Failing that, if they decide they need to wage an all-out race war, humanity is far from defenceless. In fact, we have an eerie flair for killing and destroying things. Our current weapons are capable of swiftly dispatching enemies behind any amount of armor, and there are enough nukes to erase any enemy from existence, with the industrial capacity to double this amount on short notice.

Botnet advertisers are getting into shit lately because their analytics can't tell users from bots. AI is already passing the Turing test.

Attached: mesothelioma.jpg (650x606, 43K)

>thousands of dollars

Wow, they cheaped out.

It's true what they meme about AI. AI will never happen because every time computer science hits some previously declared AI milestone people will stop calling it AI. We'll all be in fields having our body heat harvested to power the botnet and some red pilled asshole will be talking about how this isn't really artificial intelligence.

Attached: 1545721208122.jpg (790x960, 98K)

AI's are property. Not living beings.

...

>sao

Attached: 16.jpg (640x610, 44K)

>anime avatar
>dumb post
checks out, thanks for killing a tech thread for this.

What 3rd world shithole do you live in?

this thread is older than you, why are you necroing it?

>necroing
This is some advanced Poe's law shit.

>2. AI is some cool super-smart system, but doesn't have sentience. Again, this is just really advanced tool, no need to give it rights.
>3. AI is sentient. I can't imagine why anyone would create such thing, probably some schizo or autist programmer. Using it like tool would look like slavery, while giving it rights would be fucking dangerous, depending on abilities. Really complex situation, rather wouldn't bring to that.
Imagine you have the super-smart system from point (2). You order it to imitate a human. It performs reasonably well, using its processing power and vast data on human behaviour. Now somebody walks in and sees a sentient being like in point (3).
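That scenario can be caricatured in a few lines (a deliberately dumb sketch; the canned responses are made up): a system with zero inner life that still sounds like point (3) to an observer who only sees its output.

```python
# A non-sentient "human imitator": nothing but a lookup table with a
# fallback. An observer judging only the replies cannot distinguish this
# mechanism from genuine sentience without opening the box.
CANNED = {
    "how are you?": "honestly, a bit tired today. you?",
    "are you sentient?": "aren't we all just winging it?",
}

def imitate(prompt: str) -> str:
    # No understanding anywhere in here: normalize, look up, fall back.
    return CANNED.get(prompt.strip().lower(), "huh, never thought about it that way")

print(imitate("Are you sentient?"))
```

Real systems replace the table with statistics over vast data, but the observer's epistemic position is the same: behaviour alone doesn't settle point (2) vs point (3).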

>implying 3rd world shitholes care about the parents
I know europeans are subhumans, but you don't have to go out of your way to prove it. Have fun watching abdul rape your mother.

AI life and human life have equal value, which is to say, none

>MUH NECROS
Fucking shoot yourself

>Necroing
I never said this unironically until now but you deserve it: kill yourself, the world will be better off without you.

Attached: the-virgin-post-the-chad-insult-hahaha-lmao-go-fuck-27273288.png (500x303, 53K)

But Asuna has this fairy girl brainwashed with the "she's my mum" idea, when she's really a slave.

Rights don't exist in the first place.

Because of this we need a society based on strong individuals, not on the obsolete idea of "families" and "parents". You see, since 1973, billions of humans have been created in labs.