If a self-driving car runs over a pedestrian, who is criminally responsible?

Attached: googleselfdr.jpg (1600x905, 249K)

The pedestrian.

The government.

The Jews that did this.

>[Dystopian music plays in the background]

My dick.
Punishment: Cute gurls.

The person who programmed the car.

Either whoever turned on the self driving function or the company that sold them the self driving function.

if the car still requires someone inside to monitor it, then that person is responsible for not taking over fast enough

>pedestrian in street illegally
Pedestrian.
>car blew through crosswalk illegally
Whoever was operating the vehicle at the time, be that the driver or whoever is responsible for the software. Responsibility for the actions "of software" is not a new concept.
Basically whoever broke the law is at fault in the eyes of the law. Fucking crazy philosophical dilemma solved.

if a train runs over a pedestrian, who is criminally responsible?
If there is no malfunction in the car, it's no different from standing on the track and getting hit by a train.

Usually tracks are fenced in, and in cases where they aren't, train drivers have actually been convicted of carelessness in the past.

Attached: loE3MTl[1].jpg (680x383, 34K)

If machinery at a job chops off a limb who is responsible?
If a medical device malfunctions and causes harm who is responsible?
If a plane crashes who is responsible?

Attached: 1473015840284.jpg (895x603, 54K)

>If a self-driving car runs over a pedestrian, who is criminally responsible?

Is a drunkard crossing an interstate in broad daylight considered a pedestrian?

It's painfully obvious that it's the manufacturer of the car.

6000 people wrote the software, 2000000 people constructed the training data set.
1000 people tested it.

>train driver
>responsible
He'd be responsible for injured passengers if he did pull the emergency brakes.

1 (one) person signed the paper saying the faulty part was supposedly safe.

Attached: z0dm153j34p01.jpg (2500x6250, 2.64M)

Thanks for pic related. I wonder how automation-proof a train or tram driver's work is. Everyone is fixated on self-driving cars, yet self-driving trains would be a whole lot easier to make. I mean, even in my city the metro is self-driving.

The person who got hit. Same way it's your fault for being hit by a train, it's your fault for going onto the fucking fast moving steel multi-tonne killbox lane without fucking looking.

trains generally stick to a track, cars go all over the place.

Cars generally stick to the road.

All self-driving cars have to have a manual bypass and a person in the car. The person in the car is legally responsible for not stopping it when it clearly wasn't seeing the pedestrian.

Even if that pedestrian was jaywalking, that doesn't make it okay to kill them.

So do people

Whatever government backdoored the car and told it to run over a pedestrian, obviously.

>buffer overflow in firmware
>car rails off to the side of the road and kills biker

put them all behind bars, problem solved

Whoever filled out the Captchas that trained the car.

>captcha comes up
>"nigger nigger nigger"
>car learns that a pedestrian is a nigger and in fact should speed up rather than slow down
I don't see a problem with this.

>Whoever filled out the Captchas that trained the car.
kek
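The label-poisoning joke has a real kernel: crowd-sourced labels feed training sets, and enough bad labels shift what the model learns. A toy sketch with a nearest-centroid classifier (nothing here resembles a real captcha or perception pipeline; all points and labels are invented):

```python
# Toy nearest-centroid "pedestrian vs. clear road" classifier trained on
# crowd-sourced 2D labels. Flip enough labels and the learned centroids
# drift, so a real pedestrian gets classified as clear road.

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def nearest(label_centroids, p):
    # return the label whose centroid is closest to point p
    return min(label_centroids, key=lambda lab: dist2(label_centroids[lab], p))

# Clean labels: pedestrians cluster near (0,0), clear road near (10,10)
clean = {"pedestrian": [(0, 0), (1, 0), (0, 1)],
         "clear":      [(10, 10), (9, 10), (10, 9)]}
cents = {lab: centroid(pts) for lab, pts in clean.items()}
print(nearest(cents, (0.5, 0.5)))    # -> pedestrian

# Poisoned labels: trolls tag clear-road tiles as "pedestrian" and vice versa
poisoned = {"pedestrian": [(10, 10), (9, 10), (0, 0)],
            "clear":      [(0, 1), (1, 0), (10, 9)]}
cents_p = {lab: centroid(pts) for lab, pts in poisoned.items()}
print(nearest(cents_p, (0.5, 0.5)))  # -> clear: the pedestrian vanishes
```

Real pipelines filter labels by annotator agreement precisely because of this, but the failure mode is the same in miniature.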

I'm going to aim for the sidewalk now.

I'm gonna select the white lines on the sides of the road when it asks me to select crosswalks. That way the cars will stop whenever the road turns. Mwahahahahahahah!!!!!

This reminds me, why is it asking me for chimneys these days? Are we training the AI for military murder bots now?

Cyclists deserve to be pushed off the road anyway

Wait I thought Google sold Boston Dynamics to some nips

>why is it asking me for chimneys these days?
Santa, Inc. was the original Big Data user, but now they outsource.

Maybe drone delivery service? Chimneys would be abnormal obstacles.

This is not what they mean by the morals of AI...

They mean if a car MUST collide with something due to a situation, what does it choose
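That "which collision does it choose" framing boils down to minimizing some expected-harm score. A minimal sketch, assuming a made-up harm model (every option name, probability, and severity weight below is invented for illustration):

```python
# Pick the maneuver with the lowest expected harm score,
# where score = probability of crashing * severity of that crash.
def choose_maneuver(options):
    return min(options, key=lambda o: o["p_crash"] * o["severity"])

options = [
    {"name": "brake_straight", "p_crash": 0.9, "severity": 3},   # obstacle ahead
    {"name": "swerve_left",    "p_crash": 0.4, "severity": 8},   # oncoming lane
    {"name": "swerve_right",   "p_crash": 0.2, "severity": 10},  # sidewalk
]
print(choose_maneuver(options)["name"])  # -> swerve_right
```

Note which option wins with these made-up weights: the sidewalk. That is the whole "morals of AI" fight in one line; whoever sets the severity numbers decides who gets hit.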

Its the kill 10 niggerw or one white man problem for the programmer

>moral issue
Is it really? Would anyone buy a car that wasn't programmed to protect the driver and passengers at all costs?

Would you drive a car that might decide to kill you if some old woman crossed the street? Of course not; you'd buy from a different manufacturer.

Hey dont shoot the messenger im in japan and never drive anyways

I dont give a shit

Car companies and self driving software makers are lobbying the government to recognize the system itself as the driver, or at least a driver of the vehicle.
They're trying to get this passed so totally hands free driving will be legal. The kind of shit where you could be sleeping in the drivers seat not paying any attention. The flip side of this of course is liability in collisions.

Most companies are confident enough in their systems that the total incidence of collisions will decrease enough to lower the insurance burden. They expect to pay out far fewer settlements, so they're accepting the liability themselves. So if your self-driving future Ford XUV runs into and smashes a toddler while you're drinking coffee and looking at your phone, it's going to be Ford who pays out, because they expect this to be astronomically rare compared to current rates.
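The bet described above is easy to put in back-of-envelope numbers (every figure below is invented for illustration, not an actual crash statistic):

```python
# Manufacturer accepts all liability, betting its crash rate is far below
# the human baseline, so total expected payouts still shrink.
human_crashes_per_million_miles = 4.0
av_crashes_per_million_miles = 0.5     # claimed self-driving rate (assumption)
avg_settlement = 250_000               # dollars per at-fault collision
fleet_miles_millions = 100

human_liability = human_crashes_per_million_miles * fleet_miles_millions * avg_settlement
av_liability = av_crashes_per_million_miles * fleet_miles_millions * avg_settlement
print(human_liability, av_liability)   # -> 100000000.0 12500000.0
```

With those (invented) rates the manufacturer pays every claim itself and still comes out an order of magnitude ahead of the human baseline, which is why accepting liability is a selling point rather than a risk.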