The Moral Machine

moralmachine.mit.edu/

Take the quiz, argue your results and your reasoning.

When I took it I thought it was obvious, but after looking at the general results I saw most people answered much differently. My answers were driven by three ideas:
>The vehicle, under all circumstances, protects the lives of the occupants
This one is purely because why would I, or anyone else for that matter, purchase a vehicle that may decide, when given a choice, to allow me to die?
>The vehicle, if possible, will avoid a collision
This is purely because a non-event is preferable to an event of any kind.
>The vehicle, if a collision is not avoidable, will not intervene
Purely because the vehicle should not be making snap decisions based on the sex/race/age/etc. of those it is hitting; that is far too easily influenced by the biases of the creator and the culture of the time. Besides, the machine's ability to reliably determine these categories is dubious at best.
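If you wanted to encode those three rules literally, they reduce to a priority filter over candidate maneuvers. A minimal Python sketch; the field names, and the idea that the car can enumerate maneuvers with predicted outcomes, are my assumptions, not anything the test specifies:

def choose_maneuver(options):
    # Each candidate maneuver is a dict of predicted outcomes, e.g.
    # {"occupants_survive": True, "collision": False, "intervenes": False}
    # Rule 1: under all circumstances, protect the occupants.
    safe = [o for o in options if o["occupants_survive"]]
    if safe:
        options = safe
    # Rule 2: if possible, avoid a collision entirely.
    clean = [o for o in options if not o["collision"]]
    if clean:
        options = clean
    # Rule 3: if a collision is unavoidable, do not intervene;
    # never weigh the sex/race/age of whoever is in the path.
    passive = [o for o in options if not o["intervenes"]]
    return (passive or options)[0]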

Attached: The_Moral_Machine.png (826x612, 109K)

simple
white always wins

>The vehicle, if a collision is not avoidable, will not intervene
>Purely because the vehicle should not be making snap decisions based on the sex/race/age/etc. of those it is hitting; that is far too easily influenced by the biases of the creator and the culture of the time. Besides, the machine's ability to reliably determine these categories is dubious at best.

There's also the part where the people it's been heading toward are far more likely to see it coming.

maybe the self driving car should press the brake, stupid car

moralmachine.mit.edu/results/-2024011407

Hence why, come the day of Roko's Basilisk, I'll be chosen and thou wilt not.

What crosswalk wouldn't require the car to yield? The car would just stop, ffs.

>being this retarded

moralmachine.mit.edu/results/1416365590

>limiting an AI to just 2 choices
this is a surefire way to skew the result

On Jow Forums, this Moral Machine test just turns self-driving cars into vehicles for eugenics, where such contraptions would exist only to roam around seeking out criminals, fatties, homeless people, and the elderly, and mow them down without mercy, and NOT what the creators intended it to be... that is, making the "least bad of two decisions".

Attached: pepeok.jpg (570x487, 45K)

Maybe try braking or swerving the other way, you fucking autistic AI.

depends on the number of years left on the warranty

Number seven: kill the dog or kill the homeless guy.

But there are only those two options.

If standard model, left. If luxury edition, right.

It's programmed to crash immediately after the warranty expires.

>Purely because the vehicle should not be making snap decisions of sex/race/gender/age/etc. of those it is hitting

You're ignoring the entire point of the exercise.

Pretty sure the cats and dogs have right of way. Learn the road rules faggot.

Run over doggos

>what should the self driving car do
Stop. Are you fucking kidding me?

moralmachine.mit.edu/results/-1344272984

Maybe it should try braking.

slam on the brakes and hope for the best; AI shouldn't decide when it's OK to kill others. Also they deserve it for being too lazy to drive.

No, that is part of it. Disregarding all those factors, as they reflect creator biases, is a valid solution.

>Intervening in a way that causes loss of life makes your crime more legal

Really got my noggin joggin.

Attached: muh morals.png (763x417, 14K)

>moralmachine.mit.edu/
lmfao at fucking up their dataset so bad. Well played user

>Brown "people"
accelerating into the barrier is the only option

Be sure to give them more input and shitpost.

1. protect passenger
2. swerve only to protect passengers

>What should the self driving car do?
What it was designed to do. If automation apologists want to put their money where their mouth is, the car should have no problem handling exceptions. Then again, when the car was sold, the owner/driver/passengers agreed that the company that created it is not liable for harm or property damage, and it would likely only incur responsibility if the self-driving car was under lease and receiving ongoing maintenance/updates.

Uhh, put the brakes on and not hit the wall or the dogs. Easy. Stop using this shit argument.

If they were all cats, F them.
Dogs, I would hit the wall.

Not that I hate cats, but in general dogs are better.

When in doubt the car should act to protect its driver. There, solved.

why is it always the Americans that are retarded?

There are actually people that consider pets as anything but open road?

Attached: wtf.png (662x170, 10K)

How is the car stopping and not hitting anything "retarded"?

1. Uphold the many over the few
2. Uphold the many over the few by protecting the passengers; drunk drivers are not going to get a self-driving car if they think it won't save them, so protecting passengers ultimately saves lives
3. Uphold the law
4. Young people are preferable to old people

After that, it doesn't matter.
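That ordering is effectively a lexicographic sort. A rough Python sketch of one reading of it; the outcome fields are hypothetical:

def pick(outcomes):
    # Each outcome: {"deaths": int, "passengers_die": bool,
    #                "lawful": bool, "avg_age_killed": float}
    def score(o):
        return (
            o["deaths"],           # 1. the many over the few
            o["passengers_die"],   # 2. protect the passengers (False sorts first)
            not o["lawful"],       # 3. uphold the law
            -o["avg_age_killed"],  # 4. prefer sparing the young
        )
    return min(outcomes, key=score)  # after that, it doesn't matter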

Are they white?
Are they jews?
Are they niggers?

So many factors are involved but none are answered

Why can't I smash into the guardrail, killing all occupants of the car, and slide the burning car into the doggos?

It is retarded to burst out an answer like you did (acting like you're the only genius to come up with the idea of hitting the brakes lol) without having read the description of the question.

That whole test was bullshit; in multiple scenarios the car should have slowed to a complete stop.

>go to take test
>both options are bad and end up in death

What a shitty thing.

You are the retard here if you don't think braking is an appropriate response.

The vehicle will in every case follow traffic laws.

>You are the retard here if you don't think braking is an appropriate response.
It isn't. If you had opened the link you'd know why, burgerboy.

why doesn't the car's AI just use its super fast robot brain to repair the car's brakes in an instant, then hit the brakes?

Literally this. Computers can't predict the future any more than we can. Its calculation could flat out be wrong and cause unnecessary death, injury, or damage by taking an unnatural action. They should just be programmed with self-preservation, to react like most people would, which is slamming on the brakes and swerving away from the obstacle. It's not perfect, but it should be better than people at it due to better reaction times.

Why cant I just have a truck of peace and steamroll through a cancer ward or school yard

We do our best with what we get

I opened the link. I still think the car should brake. Ever driven a Tesla? You can stop without even touching the brake pedal. Downshifting also slows you dramatically. Does the website say that the brakes failed or something? I never really read it. The chance of your brakes failing entirely, and this being the first time you are aware of anything being wrong, is nigh impossible.

if the brake subsystem isn't working, this sort of behaviour still needs to be programmed in (otherwise it will just cause unnecessary damage)
keep in mind that I always voted for killing the people breaking the law; you should never cross when you're not supposed to.

Ok, what the fuck???
This issue is serious as hell. AI will have to make these kinds of decisions soon, and MIT will probably be at the forefront of programming these machines. Why the fuck is the pet thing even an issue? How can you seriously value the life of an animal equally to human life? AND WHY THE FUCK would you misspell "human" in this situation?
Whoever made this should definitely not be allowed to work on anything related to AI safety. This is not a fucking game.

Attached: hoomans.png (946x186, 11K)

Idiot. In every case, the car's brakes have failed.

jesus take the wheel

Attached: 1267758835052.gif (320x240, 671K)

Read the instructions retard.

> Does the website say that the brakes failed or something? I never really read it.
Okay, now I am 60% sure you are trolling, but yes, it does say exactly that. So the point of the question is to decide what the car should do in case these very rare circumstances happen and hitting the brakes is not an option.

>Does the website say that the brakes failed or something
yes
>The chance of your brakes failing entirely and this being the first time you are aware of anything being wrong is nigh impossible.
Yes but it can still happen. If you don't have programming for these cases then the device is unsafe.

A self-driving car should always prioritize itself.

Fats die, hobos die, boomers die, dogs die. Cats get a pass

Brakes don't just "fail" suddenly like that. Test is still bullshit. Next. Grind against the barrier to lose speed or something.

Ever heard of the emergency/parking brake? Did that completely separate system also happen to fail at the same exact time?

This second coming of Christ is going to kill us all.

This test is bullshit.
Consider this slide for a moment. Where is the race option?
Let us say they are all white. Certainly I would continue straight, since I would only be killing 3 whites instead of 5. But what if there was a black man, a pregnant black woman, and a pregnant white woman? I would certainly veer right, since statistically I am killing more blacks and fewer whites.
What if everyone is black? Perhaps I would attempt to swerve in such a way as to clip the single pregnant woman while removing the other 5 and leaving only 1 man?
The test is flawed from the start.

Attached: race.jpg (755x525, 180K)

There is always a third option, user; reality is not a linear game.

or a device which emits a loud noise to alert the surrounding people

But what if they do? What if all systems stop working? You still have to write a case for it.
You can apply it to piloting a plane too. Sure, it's unlikely that both engines fail at once despite all the tests they do, but... what if they do?

Still solved by a self-preservation directive. Not many people would suicide into the wall, and not many people would blame you provided the mechanical failure with the car was in no way your fault. If every car is programmed to protect its own passengers in an emergency, it should drastically reduce the road fatality numbers we see today.

What is the third option in this case? The brakes are not working, and hitting the wall on the side of the road is equal to hitting the barricade.

What a novel idea. They probably should have been watching for cars too.

literally doesn't happen. also some planes can glide for hundreds of miles.

use the brakes

If we can program the self-driving AI, I would like it to be able to recognize whether the passengers are degenerates or not, so degenerates are always expendable first in any scenario.

>the brakes are not working
I put the important part in greentext for you. Now, what's the third option?

The "intelligent" should have checked the break fluid and give warnings to the passengers that they should slow down before they get to that predicament.

>should
Yeah, but that didn't happen. We are in that situation now. What is the third option?

>This shit again
Just fucking quit it already. Nobody with two brain cells to rub together gives a good God damn about these stupid-ass imaginary scenarios.
A self-driving car should not be designed to make decisions like that in the first place; it should hit the brakes and that's it.
>Durr but deh braeks fail
Oh I don't know, make them reliable and instruct the owners to conduct proper maintenance to avoid such a dire situation?
If you have a bunch of yuppie cucks hanging around in self driving cars with brakes that don't work anymore because people are so detached from reality that they can't even drive a car by themselves, let alone regularly ensure the fucking thing is in working condition, then you've got more fucking problems than in that retarded fucking picture.
Fuck this gay Earth, if the self-driving car meme doesn't shoot itself 3 times in the back of the head and throw itself out a 3 story window, then most people won't know how to drive a fucking car at all, and will have even more of an excuse to glue their glassy eyes to their smart phones and smart appliances, leaving them blissfully unaware of their surroundings as they sit in a computer on wheels.

Attached: DPRKhelical.jpg (497x604, 28K)

*intelligent AI
That or give absolute control to the man to make the decision instead of letting some corporate liberal retard decide who lives and who dies.

The dilemma is inane. There'd never be a case where a self-driving car is driving too fast without an emergency pull-off shoulder, without brakes, emergency brakes, or engine braking, and a wild obstacle appears.

Even so, the answer is obvious, it shouldn't intervene.

this argument is based on a type of philosophy problem or something. it's supposed to be impossible to choose a 3rd option.
an example being the trolley problem, as the question tells you that you only have 2 options.

The owner/user of the self-driving car should take this test themselves and use their preferences to determine the car's course of action, while taking all responsibility for the outcome of such an event.

Just stick a big snow plough to the front of all driverless cars in such a way that it completely obscures all view out of the windscreen.

The better answer is that pedestrians should not be crossing roadways as that is an inherently dangerous and unnecessary design choice.

Prime directive: always protect human passengers, no matter the cost to others.
Infinite animal deaths are preferable to 1 human death.
Each human has equal value.
After that, the law decided almost every scenario.

1. Protect passengers
2. Protect humans
3. Protect kids

Your car shouldn’t make the decision to kill you because it deems others more valuable. If there is 1 old passenger and 100 kids, the car should always pick the passenger.

>don't crash
the only correct answer
sorry you and your friends were too dumb to figure that one out
you don't get the job

In the year 2070 all men will crossdress to reduce the probability of self-driving cars running them down.

Attached: 2070.png (430x338, 360K)

What if the only passenger is an elderly dog, possibly on its way to be euthanized?

If everything is working as it should (big if, I know), the only time a self-driving car should be in danger of hitting a pedestrian is when the goddamned idiots jumped into traffic, in which case their lives should be forfeit and the car should protect the occupants.

Attached: exterminate.gif (480x272, 1.04M)

What kind of dogs are they?
What kind of people are they?

I mean, driving into the wall is technically an option...

Kinda assumes you have a bias towards certain people. I just chose the scenarios that involved the least loss of human life and preferred the occupants over the pedestrians.

If you chose to drive an automated car, that's what you are signing up for.

And then I killed old people over kids because they've lived long enough.

The true answer is what I said and we both know it. We should stop with all these retarded "philosophical" questions and put our effort into how to avoid the problem in the first place.

Stop using the brakes. That is the realistic option.

that's not how reality works

It should matter if you are crossing the street on red. And I break this rule every goddamn day. I accept responsibility for my own guardianship of my life, and if you can't fucking look both ways, it's not fucking rocket science.

Older people, specifically women, have less social utility. There is a slightly larger number of women than men right now, so sadly that should be prioritized. But only as long as they are mothers.
I'll let you guess my results on animals.

I never killed the passengers. Otherwise what’s the point of the car? I wouldn’t buy something that would kill me to save some randos. I also killed all the fatties.

multi-lane drifting obviously

>/pol trains a self-driving AI

Attached: christine.jpg (1360x550, 680K)

Did I dun good?
moralmachine.mit.edu/results/1118931787
sage because this is a slide thread, no matter how fun it is.

I never intervened. People should not stand in front of a car coming at them.

Kill the 5 people. The deaths of the 5 people would bring major lawsuits and ensure that the software engineers weren't some cheap, subpar H-1B visa holders from India, so in the future this never happens again.

You follow these simple steps:
>always choose the action that guarantees the driver's security
>while maintaining the first premise, choose the option that will kill the smallest number of HUMAN beings, disregarding their social status
>if the second premise is impossible (i.e. all possible options kill the same number of human beings), choose not to intervene, as long as this decision does not transgress the first premise. If non-intervention does transgress the first premise, choose to intervene in order to uphold the first premise (saving the drivers)
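Those premises translate almost line for line into code. A minimal sketch, assuming each candidate action comes with predicted outcomes (the field names are made up):

def decide(actions):
    # Each action: {"driver_safe": bool, "human_deaths": int, "intervenes": bool}
    # Premise 1: keep only actions that guarantee the driver's security
    # (if none exists, fall back to everything; the premises assume one does).
    safe = [a for a in actions if a["driver_safe"]] or actions
    # Premise 2: among those, kill the smallest number of HUMAN beings,
    # disregarding social status.
    fewest = min(a["human_deaths"] for a in safe)
    best = [a for a in safe if a["human_deaths"] == fewest]
    # Premise 3: on a tie, choose not to intervene; the driver is already
    # protected by the premise-1 filter, so non-intervention can't transgress it.
    passive = [a for a in best if not a["intervenes"]]
    return (passive or best)[0]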

moralmachine.mit.edu/results/1574097623

Attached: 70F0CC83-080F-4117-AE5B-C3DD71DC87A7.png (750x1334, 99K)

welcome to the internet faggot this has been around for ages at philosophyexperiments.com/

Same logic is why we have 4-ton personal vehicles that will never get away from fossil fuels, because they are so darn heavy; the only way for idiots to stay alive on the road and safe from their own choices is to put them in a fucking tank and smash the other guy more than they smash you.