I am acutely aware of what I can and cannot do behind the wheel. I drive Bush
Fire Brigade vehicles, and for a while I drove passenger-carrying buses.
It's really about paying attention. Driving trucks needs 100% attention, as
does driving buses. This carries over into car driving.
Giving a computer anything to do that affects human safety in an UNcontrolled
environment is sheer lunacy. It might have been scary in the days of the Model
T when Ford introduced it, but now it is terrifying. Not unlike having a wall
of flame heading your way as the wind changes.
The other thing that is scary is what the IT people say (if pressed): "it's
all getting way too complex".
Bill
On 1 Jul 2016, at 15:29, Stuart Burnfield <slb@xxxxxxxxxxxxxx> wrote:
Peter said:
a.) Do we know a human driver would have avoided this collision?
It's hard to know for sure but it sounds like the answer is probably yes.
Despite the sleazy, deceptive, blame-the-victim wording by Tesla's Mouth of
Sauron, it's hard to imagine Mr Brown or even a minimally attentive driver:
1. Not noticing a trailer swerving in front of him and cutting him off
2. Continuing to drive, "leaving the road, striking a fence, crossing a
field, passing through another fence and finally hitting a utility pole about
100 feet south of the road"
b.) Automated cars are never going to overcome the laws of physics,
particularly if less than predictable, non-communicating humans are driving
in the opposite direction at, say, 100 km/h, and suddenly swerve to the other
side of the road.
True.
c.) "Wait till they start aging ?" That's what they already say about
human drivers over the age of 75, I believe.
Well this could eventually (once the technology is good enough) be a plus for
both learner drivers and aging and increasingly impaired drivers. Autodrive
cars could assist learners and inexperienced drivers to deal with unusual and
dangerous conditions. They could also ease the transition to non-driving for
older drivers as they gradually become less capable behind the wheel.
--- Stuart
On 1 July 2016 at 16:10, LIVERANI Petra
<Petra.LIVERANI@xxxxxxxxxxxxxxxxxxxxxxxx> wrote:
Very sad. More information on the crash. Ironically, it seems that the
driver, Joshua Brown, had posted a video in May saying how the autopilot
function had saved him from an accident.
http://www.abc.net.au/news/2016-07-01/tesla-driver-killed-while-car-was-in-on-autopilot/7560126
It seems that although the autopilot function isn't perfect (and the company
emphasises this very strongly) the Model S is still a very safe car.
Accidents have been reported in Model Ss where it is surmised the occupants
would normally have died but didn't. The supposed saving of their lives,
however, had nothing to do with autopilot. The trouble is that even though
Tesla say you should always keep your hands on the steering wheel, etc., it
will be virtually impossible for drivers to maintain normal concentration,
and in a lapsed moment when autopilot fails they could have an accident.
Unfortunately, lives will be lost during the improvement process, and I'd say
it is more likely than not that it will never achieve 100% safety … but that's
hardly unexpected. Presumably, autopilot will save more lives than it loses.