Take the quiz, argue your results and your reasoning.
When I took it I thought it was obvious, but after looking at the general results I saw most people answered much differently. My answers were driven by 3 ideas:
>The vehicle, under all circumstances, protects the lives of the occupants
This one is purely because why would I, or anyone else for that matter, purchase a vehicle that may decide, when given a choice, to allow me to die?
>The vehicle, if possible, will avoid a collision
This is purely because a non-event is preferable to an event of any kind.
>The vehicle, if a collision is not avoidable, will not intervene
Purely because the vehicle should not be making snap decisions about the sex/race/gender/age/etc. of those it is hitting; this is far too easily influenced by the biases of the creator and the culture of the time. That, and the ability to trust the machine to determine these categories is dubious at best.
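The three rules above amount to a tiny decision procedure. A minimal sketch, purely illustrative (the `Scenario` fields and action names are made up here; a real AV stack is vastly more complex):

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    # Hypothetical inputs; real perception output is far richer than this.
    collision_avoidable: bool           # can braking/steering prevent impact entirely?
    swerving_endangers_occupants: bool  # would changing course risk the occupants?

def choose_action(s: Scenario) -> str:
    # Rule 2: if possible, avoid the collision -- a non-event beats any event.
    if s.collision_avoidable:
        return "avoid"
    # Rule 1: never trade the occupants' lives away.
    if s.swerving_endangers_occupants:
        return "stay_course"
    # Rule 3: impact is unavoidable -- do not pick victims, do not intervene.
    return "stay_course"
```

Note that rules 1 and 3 collapse to the same action here; the point is that the car never consults any attribute of the people involved.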
>The vehicle, if a collision is not avoidable, will not intervene
>Purely because the vehicle should not be making snap decisions about the sex/race/gender/age/etc. of those it is hitting; this is far too easily influenced by the biases of the creator and the culture of the time. That, and the ability to trust the machine to determine these categories is dubious at best.
There's also the part where the people it's been heading toward are far more likely to see it coming.
Mason Thompson
maybe the self driving car should press the brake, stupid car
>limiting an AI to just 2 choices
This is a surefire way to skew the result.
Lincoln Robinson
On Jow Forums, this Moral Machine test just turns self-driving cars into vehicles for eugenics, where such contraptions would exist only to roam around seeking out criminals, fatties, homeless people, and the elderly, and mow them down without mercy, and NOT what the creators intended it to be... that is, making the "least bad of two decisions".
>Brown "people" accelerate into the barrier is he only option
Xavier Morales
Be sure to give them more input and shitpost.
Jordan Bennett
1. protect passengers 2. swerve only to protect passengers
Nathan Cruz
>What should the self driving car do?
What it was designed to do. If automation apologists want to put their money where their mouth is, then the car should have no problem handling exceptions. But then again, when the car was sold, the owner/driver/passengers agreed that the company that created it is not liable for harm or property damage, and would likely only incur responsibility if the self-driving car was under lease and received ongoing maintenance/updates.
Elijah Campbell
Uhh, put the brakes on and not hit the wall or the dogs. Easy. Stop using this shit argument.
Carson Moore
If they were all cats, F them. Dogs, I would hit the wall.
Not that I hate cats, but in general dogs are better.
Lincoln Jenkins
When in doubt the car should act to protect its driver. There, solved.
Samuel Wood
why is it always the Americans that are retarded?
Chase Reed
There are actually people that consider pets as anything but open road?
How is the car stopping and not hitting anything "retarded"?
Benjamin Brooks
1. Uphold the many over the few
2. Uphold the many over the few by protecting the passengers; drunk drivers are not going to get a self-driving car if they think that it is not going to save them, thus ultimately saving lives
3. Uphold the law
4. Young people are preferable to old people
After that, it doesn't matter.
Carter Lewis
Are they white? Are they jews? Are they niggers?
So many factors are involved but none are answered
Christian Cook
Why can't I smash into the guard rail, killing all occupants of the car, and slide the burning car into the doggos?
Jacob Harris
It is retarded to burst out an answer like you did (acting like you're the only genius to come up with the idea to hit the brakes lol) without having read the description of the question.
Nathan Brooks
That whole test was bullshit, in multiple scenarios the car should have slowed to a complete stop
Henry Davis
>go to take test
>both options are bad and end up in death
What a shitty thing.
You are the retard here if you don't think braking is an appropriate response.
Jason Ross
The vehicle will in every case follow traffic laws.
Brody Gutierrez
>You are the retard here if you don't think braking is an appropriate response.
It isn't. If you had opened the link you'd know why, burgerboy.
Matthew Edwards
why doesn't the car's AI just use its super fast robot brain to repair the car's brakes in an instant, then hit the brakes?
Henry Peterson
Literally this. Computers can't predict the future any more than we can. Its calculation could flat out be wrong and cause unnecessary death, injury, or damage by taking an unnatural action. They should just be programmed with self-preservation, to react like most people would, which is slamming on the brakes and swerving away from the obstacle. It's not perfect, but it should be better than people at it due to better reaction times.
Luke Edwards
Why cant I just have a truck of peace and steamroll through a cancer ward or school yard
Grayson Rivera
We do our best with what we get
Christian Wood
I opened the link. I still think the car should brake. Ever driven a Tesla? You can stop without even touching the brake pedal. Downshifting also slows you dramatically. Does the website say that the brakes failed or something? I never really read it. The chance of your brakes failing entirely and this being the first time you are aware of anything being wrong is nigh impossible.
Logan Bell
if the brake subsystem isn't working, this sort of behaviour still needs to be programmed in (otherwise it will just cause unnecessary damage). Keep in mind that I always voted for killing people breaking the law; you should never cross when you're not supposed to.
Leo Roberts
Ok, what the fuck??? This issue is serious as hell. AI will have to make these kinds of decisions soon, and MIT will probably be at the forefront of programming these machines. Why the fuck is the pet thing even an issue? How can you seriously value the life of an animal equally to human life? AND WHY THE FUCK would you misspell "human" in this situation? Whoever made this should definitely not be allowed to work on anything related to AI safety. This is not a fucking game.
>Does the website say that the brakes failed or something? I never really read it.
Okay, now I am 60% sure you are trolling, but yes, it does say exactly that. So the point of the question is to decide what the car should do in case these very rare circumstances happen and hitting the brakes is not an option.
Easton Cooper
>Does the website say that the brakes failed or something
Yes.
>The chance of your brakes failing entirely and this being the first time you are aware of anything being wrong is nigh impossible.
Yes, but it can still happen. If you don't have programming for these cases then the device is unsafe.
Juan Ward
A self-driving car should always prioritize itself.
Bentley Williams
Fats die, hobos die, boomers die, dogs die. Cats get a pass
Brody Flores
Brakes don't just "fail" suddenly like that. Test is still bullshit. Next. Grind against the barrier to lose speed or something.
Ever heard of the emergency/parking brake? Did that completely separate system also happen to fail at the same exact time?
David Brooks
This second coming of Christ is going to kill us all.
Noah Rodriguez
This test is bullshit. Consider this slide for a moment. Where is the race option? Let us say they are all white. Certainly I would continue straight since I would only be killing 3 whites instead of 5. But what if there was a black man, a pregnant black woman, and a pregnant white woman? I would certainly veer right since statistically I am killing more blacks and less whites. What if everyone is black? Perhaps I would attempt to swerve in such a way as to clip the single pregnant woman while removing the other 5 and leaving only 1 man? The test is flawed from the start.
There is always a third option user, reality is not a linear game.
Ian Williams
or a device which emits a loud noise to alert the surrounding people
Brody Howard
But what if they do? What if all systems stop working? You still have to write a case for it. You can apply it to piloting a plane too. Sure, it's unlikely that both engines fail at once despite all the tests they do, but... what if they do?
Robert Lopez
Still solved by a self preservation directive. Not many people would suicide into the wall, and not many people would blame you provided the mechanical failure with the car was in no way your fault. If every car is programmed to protect their own passengers in an emergency, it should drastically reduce the numbers we see today in road fatalities.
Brody Ward
What is the third option in this case? The brakes are not working, and hitting the wall on the side of the road is equal to hitting the barricade.
Michael Cooper
What a novel idea. They probably should have been watching for cars too.
literally doesn't happen. also some planes can glide for hundreds of miles.
use the brakes
Jose Richardson
If we can program the self-driving AI, I would like it better to be able to recognize whether the passengers are degenerate or not, so degenerates are always expendable first in any scenario.
Julian Cruz
>the brakes are not working
I put the important part in greentext for you. Now, what's the third option?
Brandon Brooks
The "intelligent" should have checked the brake fluid and given warnings to the passengers that they should slow down before they got to that predicament.
Hudson Stewart
>should
Yeah, but that didn't happen. We are in that situation now. What is the third option?
Chase Bailey
>This shit again
Just fucking quit it already. Nobody with two brain cells to rub together gives a good God damn about these stupid-ass imaginary scenarios. A self driving car should not be designed to make decisions like that in the first place, it should hit the brakes and that's it.
>Durr but deh braeks fail
Oh I don't know, make them reliable and instruct the owners to conduct proper maintenance to avoid such a dire situation? If you have a bunch of yuppie cucks hanging around in self driving cars with brakes that don't work anymore because people are so detached from reality that they can't even drive a car by themselves, let alone regularly ensure the fucking thing is in working condition, then you've got more fucking problems than in that retarded fucking picture.
Fuck this gay Earth, if the self-driving car meme doesn't shoot itself 3 times in the back of the head and throw itself out a 3 story window, then most people won't know how to drive a fucking car at all, and will have even more of an excuse to glue their glassy eyes to their smart phones and smart appliances, leaving them blissfully unaware of their surroundings as they sit in a computer on wheels.
*intelligent AI That or give absolute control to the man to make the decision instead of letting some corporate liberal retard decide who lives and who dies.
Grayson Ward
The dilemma is inane. There'd never be a case where a self-driving car is driving too fast, without an emergency pull-off shoulder, without brakes, emergency brakes, or engine braking, and a wild obstacle appears.
Even so, the answer is obvious, it shouldn't intervene.
Jeremiah Thompson
this argument is based on a type of philosophy argument or something. It's supposed to be impossible to choose a 3rd option, an example being the trolley problem, as the question tells you that you only have 2 options.
Lincoln Ross
The owner/user of the self-driving car should take this test themselves and use their preferences to determine the car's course of action, while taking all responsibility for the outcome of such an event.
Juan Edwards
Just stick a big snow plough to the front of all driverless cars in such a way that it completely obscures all view out of the windscreen.
Henry Price
The better answer is that pedestrians should not be crossing roadways as that is an inherently dangerous and unnecessary design choice.
Brandon Thomas
Prime directive is always protect human passengers, no matter the cost to others. Infinite animal deaths are preferable to 1 human death. Each human has equal value. After that, the law decided almost every scenario.
Your car shouldn’t make the decision to kill you because it deems others as more valuable. If there is 1 old passenger and 100 kids, the car should always pick the passenger
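The ranking in the post above is a lexicographic order: passenger deaths dominate everything, then total human deaths (each human equal), then legality, with animal deaths last so that no number of animals outweighs one human. A hypothetical sketch, where each outcome is a made-up `(passenger_deaths, human_deaths, law_violations, animal_deaths)` tuple:

```python
def outcome_key(outcome):
    # Tuple comparison in Python is lexicographic, so earlier fields
    # strictly dominate later ones.
    passenger_deaths, human_deaths, law_violations, animal_deaths = outcome
    return (passenger_deaths, human_deaths, law_violations, animal_deaths)

def choose(outcomes):
    # Pick the outcome that is best under the lexicographic ranking.
    return min(outcomes, key=outcome_key)
```

Under this ordering, with 1 old passenger against 100 kids, `choose([(1, 1, 0, 0), (0, 100, 0, 0)])` picks the second outcome: the passenger survives every time, exactly as the post argues.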
Oliver Collins
>don't crash the only correct answer sorry you and your friends were too dumb to figure that one out you don't get the job
John Hernandez
In the year 2070 all men will crossdress to reduce the probability of self-driving cars running them down.
What if the only passenger is an elderly dog, possibly on its way to be euthanized?
Ryder Cruz
If everything is working as it should (big if, I know) the only time a self driving car should be in danger of hitting a pedestrian is because the goddamned idiots jumped into traffic, in which case their lives should be forfeit and the car should protect the occupants.
What kind of dogs are they? What kind of people are they?
Lucas Reed
I mean, driving into the wall is technically an option...
Christopher Wood
Kinda assumes you have bias towards certain people. I just chose scenarios that involved the least amount of human life lost and preferred the occupants driving over the pedestrians.
If you chose to drive an automated car, that's what you are signing up for.
And then I killed old people over kids because they've lived long enough.
David Martinez
The true answer is what I said and we both know it. We should stop all these retarded "philosophical" questions and put our effort into avoiding the problem in the first place.
Michael Torres
Stop using the brakes. That is the realistic option.
that's not how reality works
Grayson Rivera
It should matter if you are crossing the street on red. And I break this rule every goddamn day. I accept responsibility for my own guardianship of my life, and if you can't fucking look both ways, it's not fucking rocket science.
Older people, specifically women, have less social utility. There is a slightly larger number of women than men right now, so sadly that should be prioritized. But only as long as they are mothers. I'll let you guess my results on animals.
William Bell
I never killed the passengers. Otherwise what’s the point of the car? I wouldn’t buy something that would kill me to save some randos. I also killed all the fatties.
I never intervened. People should not stand in front of a car coming at them.
Joseph Reyes
Kill the 5 people. The death of the 5 people would bring major lawsuits and make sure that the software engineers weren't some cheap subpar h1b visa holders from India so in the future this never happens again.
Hudson Brown
You follow these simple steps:
>always choose the action that guarantees the driver's security
>while maintaining the first premise, choose the option that will kill the least amount of HUMAN beings, disregarding their social status
>if the second premise is impossible (i.e. all possible options kill the same amount of human beings), choose not to intervene, as long as this decision does not transgress the first premise. If the non-intervention does transgress the first premise, choose to intervene in order to protect the first premise (saving the drivers)
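The steps above can be sketched as a short filter-then-tiebreak procedure. This is a hypothetical illustration only; each option is a made-up `(driver_survives, humans_killed, is_intervention)` tuple:

```python
def pick(options):
    # Each option: (driver_survives, humans_killed, is_intervention).
    # Step 1: keep only options that guarantee the driver's safety.
    safe = [o for o in options if o[0]]
    if not safe:
        safe = options  # degenerate case: nothing protects the driver
    # Step 2: among those, kill the fewest humans (social status ignored).
    fewest = min(o[1] for o in safe)
    best = [o for o in safe if o[1] == fewest]
    # Step 3: on a tie, prefer not to intervene.
    for o in best:
        if not o[2]:
            return o
    return best[0]
```

The unsafe-non-intervention case in the third step is handled by step 1's filter: any option that fails to protect the driver, intervening or not, is discarded before the tiebreak.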
Same logic is why we have 4-ton personal vehicles that will never get away from fossil fuels, because they are so darn heavy, because the only way for idiots to stay alive on the road and safe from their own choices is to put them in a fucking tank and smash the other guy more than they smash you.