
First pedestrian killed by Uber self-driving car

The first death of a pedestrian involving an autonomous vehicle has taken place in Tempe, Arizona.
http://www.bbc.co.uk/news/business-43459156

Uber have pulled their autonomous programme for now.

I wonder who will be blamed for it? The software writers? Uber? The non-driving occupant of the vehicle? It will be interesting to see how this is handled by insurers.

I feel for the poor family of this latest victim of Uber - of course, in human-driven cars, Uber have killed lots of people all over the world, as have their drivers in various ways.
http://www.whosdrivingyou.org/rideshare-incidents#deaths
It means some unpleasant questions are going to have to be asked, but I doubt this will make much of a difference to the timescale.

I am curious though, when the day comes that all cars are automated, and with machines being infallible and whatnot, where will the blame come to rest if someone gets squashed? A machine can't exactly be held responsible, and it would be a curious leap to hold some far-off corporate types personally liable - will the victim indeed be the one at fault?
Original post by Napp
It means some unpleasant questions are going to have to be asked, but I doubt this will make much of a difference to the timescale.

I am curious though, when the day comes that all cars are automated, and with machines being infallible and whatnot, where will the blame come to rest if someone gets squashed? A machine can't exactly be held responsible, and it would be a curious leap to hold some far-off corporate types personally liable - will the victim indeed be the one at fault?


Well, the law isn't ready here, and I think we'll probably see that it's not ready in Arizona either. Coincidentally, they were talking about this very issue on Radio 4 earlier this week:
http://www.bbc.co.uk/programmes/b09v3fdt

It's all rather unsettling IMO, but at least we can rest assured that it's being pushed by ethically sound companies like Uber that have a really great safety culture and are always totally cooperative with regulators.

:zomg::zomg::zomg::zomg:
Original post by Fullofsurprises
The first death of a pedestrian involving an autonomous vehicle has taken place in Tempe, Arizona.
http://www.bbc.co.uk/news/business-43459156

Uber have pulled their autonomous programme for now.

I wonder who will be blamed for it? The software writers? Uber? The non-driving occupant of the vehicle? It will be interesting to see how this is handled by insurers.

I feel for the poor family of this latest victim of Uber - of course, in human-driven cars, Uber have killed lots of people all over the world, as have their drivers in various ways.
http://www.whosdrivingyou.org/rideshare-incidents#deaths


They don't need to be perfect. They just need to cause fewer accidents than humans to be an improvement.
Original post by Skyewoods
They don't need to be perfect. They just need to cause fewer accidents than humans to be an improvement.


In theory that sounds right, but I doubt that it will be accepted in practice. Would people be happy if automated planes only crashed 13 or 14 times a year instead of the current 18 or 19? The truth is that people are more allowed to screw up than machines are.
Original post by Joinedup

It's all rather unsettling IMO, but at least we can rest assured that it's being pushed by ethically sound companies like Uber that have a really great safety culture and are always totally cooperative with regulators.

:zomg::zomg::zomg::zomg:


It just had to be Uber.

Gives a whole new meaning to the word 'surge'.
Clearly there is a demand for pedestrians to be automated to some extent. Pavement users could be provided with remotely controlled walking units (or "legs") which would then be guided by satellite to the required destination for a small fee.
Original post by the bear
Clearly there is a demand for pedestrians to be automated to some extent. Pavement users could be provided with remotely controlled walking units (or "legs") which would then be guided by satellite to the required destination for a small fee.


Facebook likes could be aggregated to determine which 'auto-peds' would be considered worthy of not being run over and eliminated. Automated vehicles would contain systems such as ANPR (Automatic Neutralisation of Pedestrian Rejects) that would also collect the corpses and store them in capsules slung beneath the bumper for later processing.
Original post by Fullofsurprises
Facebook likes could be aggregated to determine which 'auto-peds' would be considered worthy of not being run over and eliminated. Automated vehicles would contain systems such as ANPR (Automatic Neutralisation of Pedestrian Rejects) that would also collect the corpses and store them in capsules slung beneath the bumper for later processing.


With this rational approach to pavement usage will come a natural requirement for pedestrians to be insured. Younger pedestrians would be encouraged to have a black box implanted for the first few years to check their speed and control. So many first-time pavement users like to show off to their peer group and do foolish stunts like moonwalking or "silly walks".
Original post by the bear
With this rational approach to pavement usage will come a natural requirement for pedestrians to be insured. Younger pedestrians would be encouraged to have a black box implanted for the first few years to check their speed and control. So many first-time pavement users like to show off to their peer group and do foolish stunts like moonwalking or "silly walks".


In particular, the sort of random walking that takes place in Oxford St and the like must be punished to the absolute extreme, perhaps by automating our beloved London buses and having them use the pavements whenever they feel there are too many vans, bicycles, taxis and pedal rickshaws in the bus lanes.
Original post by Fullofsurprises
In theory that sounds right, but I doubt that it will be accepted in practice. Would people be happy if automated planes only crashed 13 or 14 times a year instead of the current 18 or 19? The truth is that people are more allowed to screw up than machines are.


1,800 people died in RTCs (road traffic collisions) in the UK alone last year; the bar humans have set for driverless cars to surpass is incredibly low.
Original post by mojojojo101
1,800 people died in RTCs (road traffic collisions) in the UK alone last year; the bar humans have set for driverless cars to surpass is incredibly low.


Yes - but my contention is that people are nonetheless more willing to accept (irrational though it may seem) 1,800 deaths caused by humans than one caused by a software-driven rampaging robot.
Original post by Fullofsurprises
Yes - but my contention is that people are nonetheless more willing to accept (irrational though it may seem) 1,800 deaths caused by humans than one caused by a software-driven rampaging robot.


All the cars of the same type will be running the same software, so if one car has a problem with wanting to run over pedestrians pushing bicycles... millions of them do, until the bugfix has been written, tested and pushed out.

What are you going to do in the meantime - disable people's cars until it's fixed? Allow the cars to drive and warn people not to push bicycles?

Car makers have a woeful safety culture (Unsafe at Any Speed) and a track record of cover-ups and cheating regulations (Dieselgate), and Uber seem to be locked into 10/10 cowboy capitalism.

In contrast, aircraft makers have a long-standing safety culture and a good record of cooperating with authorities and facing up to safety problems.
Why are self-driving cars being used on public land when they have been only marginally tested and there is no official law on this matter?

Disgraceful from Uber.
Original post by The PoliticalGuy
Why are self-driving cars being used on public land when they have been only marginally tested and there is no official law on this matter?

Disgraceful from Uber.


The British government also appear to be in a mad rush to allow autonomous vehicles to be let loose in our completely safe, relaxed driving environments.

Given that the whole history of the automotive industry is that they had to be dragged kicking and screaming into making their products safe, it's all very puzzling that they are now so trusted. Most of the companies involved in this - Google, Uber, Apple - are also massive tax evaders and have a long history of telling lies to the public about security issues with their products.
Original post by Fullofsurprises
In theory that sounds right, but I doubt that it will be accepted in practice. Would people be happy if automated planes only crashed 13 or 14 times a year instead of the current 18 or 19? The truth is that people are more allowed to screw up than machines are.


Would you rather have a human crash 19 planes, or an automated plane prevent 5 of those crashes? If you look at the lives saved from automation, it is clear that the safest way is to remove the chance of human error.
Original post by Skyewoods
Would you rather have a human crash 19 planes, or an automated plane prevent 5 of those crashes? If you look at the lives saved from automation, it is clear that the safest way is to remove the chance of human error.


Almost certainly you'll be swapping one sort of crash for another - hopefully with a reduction in overall deaths.

If it works out you'll have fewer Air France 447 pilot-error-type crashes, but you probably won't see any more mechanical-failure disasters averted like US Airways 1549, where a quick-thinking hero pilot was able to save everyone.

What we're being asked to accept with autonomous road vehicles is something that'll probably save lives by braking more quickly on motorways... but still kill people by running over pedestrians pushing bicycles (or through similar unanticipated behaviour that human drivers are usually able to cope with quite well).
If you're someone pushing a bicycle in the suburbs, it's little comfort that your life has been made more dangerous so that accidents on motorways might be prevented.
