Self-driving cars

AmericanThunder

Super Moderator
Uber has removed its self-driving cars from the roads following what is believed to be the first fatality involving a fully autonomous car.
A self-driving Uber SUV struck and killed 49-year-old Elaine Herzberg as she walked her bicycle across a street in Tempe, Arizona, Sunday night, according to the Tempe police. The department is investigating the crash.
Rafael Vasquez, a 44-year-old test driver from Uber, was behind the wheel of the Volvo XC90 SUV at the time, the police said.
Based on preliminary information, the car was going approximately 40 mph in a 35 mph zone, according to Tempe Police Detective Lily Duran.
Police say the investigation does not at this time show significant signs of the SUV slowing before the crash. The Maricopa County Attorney's Office will determine whether charges will be filed.
"The vehicle involved is one of Uber's self-driving vehicles," the Tempe police said in a statement. "It was in autonomous mode at the time of the collision, with a vehicle operator behind the wheel."
Autonomous mode means the car is driving on its own. During tests, a person sits behind the wheel as a safeguard.
Uber is conducting tests of autonomous vehicles in Arizona, Pittsburgh, Toronto and other areas. Uber said it has stopped testing the vehicles throughout the United States and Canada.
Some incredibly sad news out of Arizona. We're thinking of the victim's family as we work with local law enforcement to understand what happened.


The above is quoted from a CNN news article.
For me, the two key points are:
1: the speeding
2: the failure of the failsafe (the human operator) to act

Number 1 is unacceptable. Surely the benefit of autonomous cars is that they are safe because they observe traffic laws, so how does a car programmed to recognize speed limit signs and slow down (or speed up) in the appropriate circumstances end up speeding? Human error, of course; this time the responsibility must sit with the programmers.
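
To show how simple the rule should be, here's a rough Python sketch of the kind of speed clamp I mean (purely hypothetical, nothing to do with Uber's actual software, and it assumes a sign-recognition module that reports the current limit):

def commanded_speed_mph(desired_mph, detected_limit_mph):
    # Never request more speed than the last detected speed limit allows.
    # desired_mph: what the route planner wants to do
    # detected_limit_mph: limit read from sign recognition (e.g. 35)
    return min(desired_mph, detected_limit_mph)

# In a 35 mph zone, a 40 mph request should come out as 35:
print(commanded_speed_mph(40, 35))  # prints 35

A one-line clamp like that is why a reported 40 mph in a 35 mph zone is so hard to excuse.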
As for number 2, he was sitting in the driver's seat, observing, with the ability to brake and steer as required. So why was no avoiding action taken? Human error, of course. A demonstration of trust in a human-programmed device.
Why trust a device that is devoid of emotion?
Of course, all new technology has teething problems, but the big question for me is: why? Why do we need this at all?
I know most commercial airliners are mostly piloted by computers these days, but policies are strict about where the pilot and co-pilot can be during those times, and on long-haul flights they still provide relief pilots so that someone can monitor constantly. Also, at 30,000 ft plus, the chance of hitting something is remote, and there is generally time for the computers to be relieved of command and a human to take over. I don't believe the two examples can be compared with regard to risk.
But on the roads, with so many variables, the worst being human nature, I don't think I will ever trust an autonomous car. My daily has radar cruise control that can speed up and slow down to follow the car in front; it even brings the car to a stop. Have I tested it? Yes. Do I trust it? No. I have my hands on the wheel and my foot over the brake the entire time. Why? Because it was designed by humans and operates in a changeable environment with other humans and other human-designed appliances.
Not to be trusted.
 
I heard on the news that an eyewitness said the woman jumped out in front of the car on a suicide mission, and that no human would have avoided the accident.

Personally, I wouldn't trust anything that's been programmed. Look at Microsoft: they've been making Windows for years, and it still has bugs, crashes, and requires reboots. If a big organisation can't get things right, then it's going to take a long time and loads of testing to get self-driving cars working. Self-driving trains work well, as they rarely get passengers on the track; however, if you jump in front of a train, no matter who is driving, it still takes time to stop.
 
I heard on the news that an eyewitness said the woman jumped out in front of the car on a suicide mission, and that no human would have avoided the accident.

Not sure I believe that. There are much surer ways of committing suicide than being hit by a relatively slow-moving car on city streets. Why risk time in hospital, followed by psychiatric evaluations and possibly life-changing injuries? On city streets, use a bus or a truck; if it has to be a car, go to the highway where speeds are higher, or think bigger, like trains. All are incredibly selfish, but if you are serious about suicide, then a relatively slow car is not the best option.
 
Hopefully this will put an end to this daft idea.
 
OK, further footage shows it wasn't suicide, but the woman did step in front of the car after having already crossed three lanes of traffic.
So, back to the point I made in the first post: surely the technology available to computers is exactly what is supposed to prevent this. A human might not have been able to see her until she was in the headlights, but the computer's sensors should have detected her and slowed the car appropriately. So what is the benefit of self-driving cars if they are as fallible as humans?
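
Just to put numbers on it, here's a back-of-the-envelope Python calculation with assumed values (not figures from the investigation):

# Stopping distance at 40 mph, assuming ~0.25 s machine reaction time
# and firm 0.8 g braking (both assumptions, not investigation data).
MPH_TO_MPS = 0.44704
v = 40 * MPH_TO_MPS                  # ~17.9 m/s
reaction = 0.25 * v                  # ~4.5 m travelled before braking starts
braking = v**2 / (2 * 0.8 * 9.81)    # ~20.4 m under 0.8 g deceleration
print(round(reaction + braking, 1))  # ~24.8 m in total

Lidar units on these test cars are generally quoted with ranges well beyond 50 m, so on those assumptions the car should have had room to stop, or at least slow substantially, if the sensors picked her up.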
 
I'm not keen on it, but it's going to happen, as people believe it is the way forward. I think the two should be separated: perhaps humans shouldn't be allowed to cross the road without using a subway or a bridge.
 