AmericanThunder
Super Moderator
Uber has removed its self-driving cars from the roads following what is believed to be the first fatality involving a fully autonomous car.
A self-driving Uber SUV struck and killed 49-year-old Elaine Herzberg as she walked her bicycle across a street in Tempe, Arizona, Sunday night, according to the Tempe police. The department is investigating the crash.
Rafael Vasquez, a 44-year-old test driver from Uber, was behind the wheel of the Volvo XC90 SUV at the time, the police said.
Based on preliminary information, the car was going approximately 40 mph in a 35 mph zone, according to Tempe Police Detective Lily Duran.
Police say the investigation does not at this time show significant signs of the SUV slowing before the crash. The Maricopa County Attorney's Office will determine whether charges will be filed.
"The vehicle involved is one of Uber's self-driving vehicles," the Tempe police said in a statement. "It was in autonomous mode at the time of the collision, with a vehicle operator behind the wheel."
Autonomous mode means the car is driving on its own. During tests, a person sits behind the wheel as a safeguard.
Uber is conducting tests of autonomous vehicles in Arizona, Pittsburgh, Toronto and other areas. Uber said it has stopped testing the vehicles throughout the United States and Canada.
Some incredibly sad news out of Arizona. We're thinking of the victim's family as we work with local law enforcement to understand what happened.
The above is quoted from a CNN news article.
For me the two key points are:
1: speeding
2: the failure of the failsafe to act
Number 1 is unacceptable. Surely the benefit of autonomous cars is that they are safe because they observe traffic laws, so how does a car programmed to recognize speed limit signs and slow down (or speed up) in appropriate circumstances end up speeding? Human error, of course; this time the responsibility must sit with the programmers.
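To put that in concrete terms, here is a minimal sketch in Python of the kind of speed-capping rule you would expect any self-driving stack to enforce. It is purely illustrative: the function and parameter names are made up, and this is not Uber's code.

```python
from typing import Optional

def commanded_speed(planner_speed_mph: float,
                    detected_limit_mph: Optional[float],
                    fallback_limit_mph: float = 25.0) -> float:
    """Clamp the planner's requested speed to the posted limit.

    If no limit sign has been detected (detected_limit_mph is None),
    fall back to a conservative default rather than trusting the planner.
    """
    limit = detected_limit_mph if detected_limit_mph is not None else fallback_limit_mph
    return min(planner_speed_mph, limit)

# Example: the planner asks for 40 mph in a detected 35 mph zone -> capped at 35.
assert commanded_speed(40.0, 35.0) == 35.0
```

A one-line clamp like this is about the simplest safeguard imaginable, which is what makes the reported 40 mph in a 35 mph zone so hard to excuse.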
As for number 2, he was sat in the driver's seat, observing, with the ability to brake and steer as required. So why was no avoiding action taken? Human error, of course; a demonstration of trust in a human-programmed device.
Why trust a device that is devoid of emotion?
Of course, all new technology has teething problems, but the big question for me is why. Why do we need this at all?
I know most commercial airliners are largely piloted by computers these days, but policies are strict about where the pilot and co-pilot can be during those times, and on long-haul flights airlines still provide relief pilots so that the computers can be monitored constantly. Also, at 30,000 ft and above, the chance of hitting something is remote, and there is generally time for the computers to be relieved of command and a human to take over. I don't believe the two examples can be compared with regard to risk.
But on the roads, with so many variables (the worst being human nature), I don't think I will ever trust an autonomous car. My daily driver has radar cruise control that can speed up and slow down to follow the car in front; it can even bring the car to a stop. Have I tested it? Yes. Do I trust it? No. I have my hands on the wheel and my foot over the brake the entire time. Why? Because it was designed by humans and operates in a changeable environment with other humans and other human-designed appliances.
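For what it is worth, the follow logic behind radar cruise control systems like mine is usually described along these lines. This is a toy Python sketch with assumed names and thresholds, not any manufacturer's algorithm:

```python
from typing import Optional

def acc_target_speed(set_speed: float,
                     lead_speed: Optional[float],
                     gap_m: Optional[float],
                     time_headway_s: float = 2.0,
                     min_gap_m: float = 5.0) -> float:
    """Return the speed (m/s) the cruise controller should aim for."""
    if lead_speed is None or gap_m is None:
        return set_speed                   # no car detected ahead: cruise at set speed
    safe_gap = min_gap_m + lead_speed * time_headway_s
    if gap_m < min_gap_m:
        return 0.0                         # dangerously close: brake to a stop
    if gap_m < safe_gap:
        return min(set_speed, lead_speed)  # gap closing: match the lead car
    return set_speed                       # plenty of room: resume set speed

# Example: cruising at 30 m/s, lead car doing 20 m/s only 30 m ahead -> slow to 20.
assert acc_target_speed(30.0, 20.0, 30.0) == 20.0
```

Simple enough on paper; the reason I keep my foot over the brake is everything the radar can miss before that logic ever runs.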
Not to be trusted.