I was aware of that but I left it out there as bait. Obviously the self-driving car was incapable of analyzing and accommodating a fairly common real-world situation.

> "The accident occurred when the driver of a second vehicle 'failed to yield' to the Uber vehicle while making a turn," said Josie Montenegro, a spokeswoman for the Tempe Police Department.
Sensationalism by association?
First I considered the obvious - that the "guilty" car turned in front of the "smart" car and the "smart" car missed several routine visual cues and failed to take evasive action. One of the best ways to avoid accidents is to pay attention and avoid the accidents that the "other guy" causes.

> Ever consider that this could have been a situation that you couldn't avoid 99.999% of the time? The above quote is in reference to #174, which is general/could mean many things. "Failed to yield", like many things in life, plots out as a bell curve.
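The bell-curve point can be made concrete with a toy simulation. Every number here is an invented assumption (a mean margin of 3 seconds with a 1-second spread, and a ~1.5 s cutoff for what an alert driver could react to), chosen only to illustrate how a "failed to yield" distribution splits into a large avoidable mass and a small unavoidable tail:

```python
import random

random.seed(1)

# Toy model, not data: assume the time margin (in seconds) that an
# encroaching "failed to yield" driver leaves before the conflict
# point is normally distributed - the "bell curve" in question.
margins = [random.gauss(3.0, 1.0) for _ in range(100_000)]

# Assume anything under ~1.5 s (a typical perception-reaction time)
# is effectively unavoidable even for an alert driver.
unavoidable = sum(m < 1.5 for m in margins) / len(margins)
print(f"unavoidable tail: {unavoidable:.1%}")
```

With these made-up parameters, only a few percent of encounters land in the unavoidable tail - which is exactly why "failed to yield" by itself doesn't settle whether a given collision was avoidable.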
> Obviously the self-driving car was incapable of analyzing and accommodating a fairly common real world situation.

Seems to me you like to play pretty fast and loose with the term 'obvious' here. Based upon the article and presumably the police report, there's nothing to suggest that a human driver would/could have avoided this collision. Your assumption and misdirected blame here is unfounded malarkey.

> First I considered the obvious - that the "guilty" car turned in front of the "smart" car and the "smart" car missed several routine visual cues and failed to take evasive action.
You clever devil, you. Do continue to have fun on your fishing expedition. Bon voyage....

> I was aware of that but I left it out there as bait.
The suggestion came to me from decades of alert driving experience. YMPV - your mileage probably varied.

> Seems to me you like to play pretty fast and loose with the term 'obvious' here. Based upon the article and presumably the police report, there's nothing to suggest that a human driver would/could have avoided this collision.
Interesting. Seems to be about the safety and regulation of the development cycle of this technology. Probably cuz the technology is so unusually shaky to begin with and involves putting public safety at risk. So the politics don't advance or validate the technology - they just attempt to deal with it.

> In an attempt to see what was going on, I googled "what is going on with self driving cars?" and got this recent news.
> http://www.npr.org/sections/thetwo-...olls-out-new-guidelines-for-self-driving-cars
But even in that scenario the lid was fully functional as a lid, and the incompatible container was also fully functional as a container. Neither of the two key components of autonomous driving - detection and analysis - is anywhere near being fully functional. Nor will they be until some serious breakthroughs are made in machine "cognizance", if that's the right term.

> As long as the car companies don't follow the "no way in hell is your lid gonna be compatible with my tupperware" model, we should be fine...
> But even in that scenario the lid was fully functional as a lid, and the incompatible container was also fully functional as a container. Neither of the two key components of autonomous driving - detection and analysis - are anywhere near being fully functional. Nor will they be until some serious breakthroughs are made in machine "cognizance" if that's the right term.
You're WAY overestimating the importance of hardware (we do that well) and WAY underestimating the importance of algorithms (we're actually quite stupid). So you put stupid algorithms "behind the wheel" of sophisticated hardware, and what happens? Yes - sophisticated hardware gets into stupid accidents.

> dupe post. but....
> reaction times will be less than the average human.
> better awareness of surroundings and thus the ability to drive much more defensively
> commute times will be lessened
> do humans typically get injured or die when they hit a deer?
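The reaction-time claim above is easy to put numbers on. Using assumed figures - roughly 1.5 s for typical human perception-reaction time, a hypothetical 0.1 s sensing-to-braking latency for a machine, 30 m/s (about 67 mph) of speed, and hard braking at 7 m/s² - the reaction delay alone accounts for tens of meters of stopping distance:

```python
def stopping_distance(speed_mps, reaction_s, decel_mps2=7.0):
    # Distance covered during the reaction delay, plus braking
    # distance v^2 / (2a) under constant deceleration.
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

human = stopping_distance(30.0, 1.5)    # ~1.5 s: typical human perception-reaction time
machine = stopping_distance(30.0, 0.1)  # ~0.1 s: assumed machine latency
print(f"human: {human:.1f} m, machine: {machine:.1f} m, saved: {human - machine:.1f} m")
# -> human: 109.3 m, machine: 67.3 m, saved: 42.0 m
```

The 42 m difference is purely the reaction-time gap (30 m/s × 1.4 s); the braking physics is identical for both. Whether real systems hit that 0.1 s figure is, of course, the open question.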
Um ... search me.

> So where are all the accidents in the past year?