
Re: First Tesla Autopilot Death

Posted: Tue Jan 24, 2017 6:05 pm UTC
by cphite
speising wrote:I'm not a pilot, but I'm pretty sure that, cruising along over the Atlantic at 30,000 ft, the pilot doesn't have to be on split-second watch all the time. If the autopilot has a problem, it'll emit some noise, the pilot puts their coffee away and starts to look for what's up.


Pretty much... most people simply don't understand that automatic driving is vastly more difficult than automatic flying, which is one of the reasons we've had autopilot for many years while we're still a few years away from automatic driving.

There is much more to account for on the ground. In the air you have much greater distances between you and other objects, you can change altitudes, and perhaps most importantly, you have literally thousands of people around the world whose job it is to watch where you and every other aircraft are and to prevent conflicts long before they occur.

People hear the word "autopilot" and they think it's going to work like on an airplane; that they can simply sit back and relax and if something comes up, they'll get pinged. The problem is that when driving, even if it does work that way, you often have mere seconds to react. Most people find that stressful and difficult even when they're actively driving. Getting pinged out of a movie, book, daydream or whatever and then reacting to something in seconds... it's too much to expect from even skilled drivers.

In that light, using the term "autopilot" was irresponsible on the part of Tesla. That the term might be misconstrued is fairly obvious. They need to make it completely clear - obnoxiously clear - that the driver needs to pay attention.

If you're in a car, cruising on the highway on "full auto", it's nigh impossible to maintain the same level of concentration as when you're actively in control. (and if it were, it would make the autopilot useless)


There have been studies showing that even basic cruise control can increase the odds of driver error, because people tend to be more likely to zone out. This can be mitigated by stuff like automatic braking, adaptive speed, etc.; but anything that encourages the driver to lose focus and stop paying attention is dangerous.

Google has the right idea... if a car is going to be auto driving, it needs to be so completely capable as to never need driver intervention at all.

Re: First Tesla Autopilot Death

Posted: Tue Jan 24, 2017 7:51 pm UTC
by KnightExemplar
cphite wrote:Google has the right idea... if a car is going to be auto driving, it needs to be so completely capable as to never need driver intervention at all.


I think Mercedes, Subaru, and other luxury companies also are taking the right approach. They call the technology "Lane Assist", "Adaptive Cruise Control", and "Braking Assist".

It's virtually identical to autopilot: the car automatically steers and keeps its distance from other cars on the highway. There's even "Braking Assist" to automatically stop in some situations. But the reason why say... Subaru... hasn't gotten any criticism from me is because Subaru doesn't call their level-2 automation technology fucking "Autopilot".

LOTS of companies are implementing level 2 automation (lots of things automated, but an attentive human is still required for safe driving). Tesla is also implementing level 2 automation, except they're trying to market it as a level 3 (safe for an inattentive human) or level 4 (fully automatic) technology.
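
To make that distinction concrete, here's a toy sketch (purely my own illustration, not any manufacturer's code or the formal SAE wording) of what those levels imply about the human behind the wheel:

Code:
# Toy illustration of the automation levels as used in this thread: the only
# question that matters here is whether the human still has to pay attention.
from enum import IntEnum

class AutomationLevel(IntEnum):
    LEVEL_2 = 2  # lots of things automated, attentive human still required
    LEVEL_3 = 3  # claimed safe for an inattentive human
    LEVEL_4 = 4  # fully automatic, no human needed

def driver_must_monitor(level: AutomationLevel) -> bool:
    """At level 2 the human is still the safety backstop at all times."""
    return level <= AutomationLevel.LEVEL_2

# The complaint above, in one line: the hardware is LEVEL_2, but the
# marketing implies driver_must_monitor() is False.
print(driver_must_monitor(AutomationLevel.LEVEL_2))  # -> True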

Re: First Tesla Autopilot Death

Posted: Tue Jan 24, 2017 10:07 pm UTC
by cphite
KnightExemplar wrote:
cphite wrote:Google has the right idea... if a car is going to be auto driving, it needs to be so completely capable as to never need driver intervention at all.


I think Mercedes, Subaru, and other luxury companies also are taking the right approach. They call the technology "Lane Assist", "Adaptive Cruise Control", and "Braking Assist".

It's virtually identical to autopilot: the car automatically steers and keeps its distance from other cars on the highway. There's even "Braking Assist" to automatically stop in some situations. But the reason why say... Subaru... hasn't gotten any criticism from me is because Subaru doesn't call their level-2 automation technology fucking "Autopilot".


Exactly... those are technologies intended to assist someone who is still actively driving the car. If you have a lapse in concentration the car can brake for you, or avoid another car, etc. But that is a far cry from autopilot. You're still required to drive, and expected to stay focused.

LOTS of companies are implementing level 2 automation (lots of things automated, but an attentive human is still required for safe driving). Tesla is also implementing level 2 automation, except they're trying to market it as a level 3 (safe for an inattentive human) or level 4 (fully automatic) technology.


Level 3 is a pipe dream. If the car isn't capable of being fully automatic, then it isn't safe for an inattentive driver. The whole point of level 3 is that the driver only has to act in the case of something unexpected happening; but their being inattentive makes that virtually impossible.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 10:25 am UTC
by Mutex
It depends. If the car can deal with an emergency safely by itself, slamming on the brakes or otherwise dealing with the situation, then it's safe for an inattentive driver, but it might still require the driver's input when it comes across, say, a detoured route, or something where it's OK to wait a moment for the driver's response.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 1:29 pm UTC
by HES
Similarly, there may be situations where it doesn't know what to do, but can safely pull over and stop before handing over to the driver.
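
To make that split concrete, here's a minimal, purely illustrative sketch (every name in it is hypothetical, not any vendor's API) of the decision being described: handle time-critical events autonomously, ask the driver only when it's safe to wait, and fall back to pulling over if nobody answers:

Code:
# Illustrative only: a level-3-ish hand-over decision as discussed above.
from dataclasses import dataclass

@dataclass
class Situation:
    time_critical: bool      # e.g. a car cutting in hard: no time for a human
    car_has_plan: bool       # e.g. a mapped detour the car can follow itself
    driver_responded: bool   # did the driver take over within the grace period?

def decide(s: Situation) -> str:
    if s.time_critical:
        return "act autonomously (brake/steer now)"
    if s.car_has_plan:
        return "continue autonomously"
    if s.driver_responded:
        return "hand control to the driver"      # the 'wait a moment' case
    return "pull over and stop safely"           # minimal-risk fallback

print(decide(Situation(time_critical=False, car_has_plan=False,
                       driver_responded=False)))
# -> pull over and stop safely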

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 1:36 pm UTC
by Liri
Yeah. The thing I always come back to in my head is driving on ill-maintained dirt/gravel roads in the mountains or other rural areas - places where a human driver would know not to expect standard rules of the road (or enough space for two cars to pass by one another). If we ever get to level 4, will it be able to handle that sort of situation, or will we always have to have a hidden steering wheel we can activate as needed (or something like that)?

If you never do anything fun outdoors, this probably won't be an issue for you.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 1:57 pm UTC
by morriswalters
DARPA Grand Challenge (2005)
The second driverless car competition of the DARPA Grand Challenge was a 212 km (132 mi) off-road course that began at 6:40 am on October 8, 2005, near the California/Nevada state line. All but one of the 23 finalists in the 2005 race surpassed the 11.78 km (7.32 mi) distance completed by the best vehicle in the 2004 race. Five vehicles successfully completed the course.
They have thought about it.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 2:02 pm UTC
by Liri
Dang that's cool.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 5:39 pm UTC
by jewish_scientist
Chen wrote:That's why we need to educate the public?

No, you're not getting it. Right now Tesla and the public have different definitions of autopilot, which leads to problems. One way to fix this is to change the public's definition, which is what you are advising; the other way is to change Tesla's definition, which is what I am advocating. I have two arguments for why my solution is superior to yours.

1: Tesla intentionally created the problem when it decided to adopt a definition of autopilot that it knew differed from the public's.

2: It would take a LOT more resources to change the public's definition than to change Tesla's.

KnightExemplar wrote:I think Mercedes, Subaru, and other luxury companies also are taking the right approach. They call the technology "Lane Assist", "Adaptive Cruise Control", and "Braking Assist".

It's virtually identical to autopilot: the car automatically steers and keeps its distance from other cars on the highway. There's even "Braking Assist" to automatically stop in some situations...

LOTS of companies are implementing level 2 automation (lots of things automated, but an attentive human is still required for safe driving). Tesla is also implementing level 2 automation, except they're trying to market it as a level 3 (safe for an inattentive human) or level 4 (fully automatic) technology.

This is basically my entire opinion on the subject.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 9:03 pm UTC
by morriswalters
There is some technology focused on driver attention. Cameras that can capture what you are looking at and rattle you if your focus leaves the road too long. Haptic feedback, seat shakers and that type of thing. Tesla monitors something, the steering wheel I think, and I believe they reduced the timeout.

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 9:21 pm UTC
by speising
morriswalters wrote:There is some technology focused on driver attention. Cameras that can capture what you are looking at and rattle you if your focus leaves the road too long. Haptic feedback, seat shakers and that type of thing. Tesla monitors something, the steering wheel I think, and I believe they reduced the timeout.

So, a system that's supposed to make driving more comfortable turns into a constant nagging machine? How is that preferable to driving yourself?

Re: First Tesla Autopilot Death

Posted: Wed Jan 25, 2017 9:53 pm UTC
by morriswalters
speising wrote:So, a system that's supposed to make driving more comfortable turns into a constant nagging machine? How is that preferable to driving yourself?
I don't know that it is.

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 4:13 pm UTC
by Soupspoon
Trebla wrote:
It determined he set his car’s cruise control at 74 miles per hour about two minutes before the crash,

Should that have even been allowed? The accident happened in Florida where, I have just checked, they have a clear state-wide maximum limit of 70mph (before further localised downgrading, on more minor roads, and it might even be 65 on the US-27, according to some of what I read). And you'd have to go across at least three state borders to even get to a 75mph-topping place. If Tesla's instantaneous positioning information is anything like as accurate as it needs to be for all other purposes, this should be the simplest traffic rule to make autonomous vehicles comply with more surely than your everyday fallible yooman driver.

(Not that it seems like the difference would have helped, here.)

It would indeed remove some of the 'spark' in driving a sports-electric, but insurance-cheapening (or even pay-back, as the cameras record other vehicles speeding past, earning a bounty for any usefulness in warning/penalising those drivers) could be attractive.
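
As a minimal sketch of that idea (my own illustration with made-up map data, assuming the car can look up the posted limit for the road its positioning says it is on; nothing here reflects Tesla's actual implementation), the cruise set-point would simply be clamped to the mapped limit:

Code:
# Hypothetical map data: road segment -> posted limit in mph.
SPEED_LIMITS_MPH = {
    "US-27 (the stretch in question)": 65,
    "Florida interstate": 70,
}

def allowed_cruise_speed(requested_mph: float, road: str) -> float:
    """Clamp the driver's requested set-point to the mapped legal limit."""
    limit = SPEED_LIMITS_MPH.get(road)
    if limit is None:
        return requested_mph  # unknown road (e.g. private land): defer to the driver
    return min(requested_mph, limit)

print(allowed_cruise_speed(74, "US-27 (the stretch in question)"))  # -> 65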

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 5:32 pm UTC
by ucim
Soupspoon wrote:Should that have even been allowed?
Are you really asking if our own machines should enforce their understanding of what the law is at any given moment? Do you not see where this leads?

Jose

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 5:37 pm UTC
by Mutex
For a start, how would it know if you're on private property?

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 6:05 pm UTC
by Soupspoon
Three words: Gee. Pee. Ess.

And a manual over-ride for "these are exceptional circumstances[1], and Tesla will probably need this data recorded for full manual review, or live monitoring, whatever happens" should be possible.

[1] Emergency transit of patient/organs to a hospital, requisitioned by an officer in the business of a high-speed pursuit, participation in a Formula E championship where someone forgot to apply the temporary geofencing exception to the course, "hey, I'm rich and two beers past my limit! I'll pay your gorram fine for 'inappropriate use' in the morning, whether or not you redirect my route to the nearest PD before I throw up on your gorram console!"

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 6:21 pm UTC
by morriswalters
ucim wrote:
Soupspoon wrote:Should that have even been allowed?
Are you really asking if our own machines should enforce their understanding of what the law is at any given moment? Do you not see where this leads?

Jose
Setting an upper limit on cruise control doesn't seem like much of an issue. And by default what else would an autonomous car do? When I think of the things I didn't see because I spent my time looking at the road rather than watching the scenery, it makes me wish that this had come to fruition 30 years ago.

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 6:26 pm UTC
by ucim
morriswalters wrote:Setting an upper limit on cruise control doesn't seem like much of an issue.
Depends what it is set to, and how that is determined. Machine limits should be set based on the machine's capabilities. When machines start enforcing the (arbitrary and changeable) laws of people, trouble is not far behind.

Jose

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 6:43 pm UTC
by PeteP
You are aware that, for instance, safety features are a thing? Rules built into the very design of the machine? Like microwaves not turning on when open, for a trivial one. They already do it; it's just limited to simple cases.

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 7:21 pm UTC
by morriswalters
ucim wrote:
morriswalters wrote:Setting an upper limit on cruise control doesn't seem like much of an issue.
Depends what it is set to, and how that is determined. Machine limits should be set based on the machine's capabilities. When machines start enforcing the (arbitrary and changeable) laws of people, trouble is not far behind.

Jose
If your cruise control offends you, pluck it out! But failing that, sometimes you just have to, well, you know, take the wheel and kick the cruise to the curb. In a more serious vein, if you can read signage, I think a camera can. Signs are stylized, easy to read, and best of all, standardized, which is just how you know the speed limit in any arbitrary place. That and the general rules of the road. The question is whether machines currently perform as well as a human, to which the answer is: not yet. At least not in everything. If I owned a car that had it, I'd be concerned about adaptive cruise control and lane keeping, but that is because of my limitations, not the car's.

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 7:36 pm UTC
by Zamfir
ucim wrote:When machines start enforcing the (arbitrary and changeable) laws of people, trouble is not far behind.

I work in industrial pollution control, where this is just standard. Many filter systems have a calibrated, government-required measurement of the remaining pollutant concentration. Some only require reporting of excursions (automated or not). Others work as soupspoon describes: the polluting system shuts down if you reach the legal limit. You set a lower alarm value to give you time, and there might be an override. If you use the override, you have to report the situation to the authorities.

And as PeteP notes, an even tighter logic applies to industrial safety systems, again incorporating legal requirements. Many hard, built-in limits. You can't go here. You can't push the machine beyond X. You can't access this. Can't turn this feature on, or can't turn it off. The system shuts down if you hit this or that operating point, or troublesome emergency measures act automatically.

Within those hard, mechanically enforced limits, you build soft alarm limits. Those can be overruled by an experienced operator, but the point is to keep the 'flexible' zone within the limits, not around them as seems typical in car traffic.

Seriously, cars and traffic are the exception, not the rule. I don't think there's any other kind of machinery that exhibits such prevalent risks (to operators and especially to uninvolved third parties), with such weak controls or enforcement.
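
To sketch that layering (purely illustrative, with made-up numbers and names rather than any real plant's control logic): a hard, legally mandated trip limit, a softer alarm inside it, and an override whose use has to be reported:

Code:
# Illustrative only: hard legal limit, soft alarm limit, reportable override.
LEGAL_LIMIT = 100.0   # hard, built-in limit: the plant shuts down here
ALARM_LIMIT = 85.0    # soft limit: gives the operator time to act

def check(concentration: float, override_active: bool = False) -> str:
    if concentration >= LEGAL_LIMIT:
        if override_active:
            # The override keeps things running, but its use must be reported.
            return "RUNNING UNDER OVERRIDE (report to authorities)"
        return "SHUTDOWN"
    if concentration >= ALARM_LIMIT:
        return "ALARM"  # still legal; the operator decides what to do
    return "OK"

for value in (80, 90, 105):
    print(value, check(value))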

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 7:44 pm UTC
by idonno
PeteP wrote:You are aware that, for instance, safety features are a thing? Rules built into the very design of the machine? Like microwaves not turning on when open, for a trivial one. They already do it; it's just limited to simple cases.
This is sometimes already done with motor vehicles, controlling the max speed: https://en.wikipedia.org/wiki/Speed_limiter

Re: First Tesla Autopilot Death

Posted: Fri Jan 27, 2017 8:31 pm UTC
by sardia
Zamfir wrote:
ucim wrote:When machines start enforcing the (arbitrary and changeable) laws of people, trouble is not far behind.

I work in industrial pollution control, where this is just standard. Many filter systems have a calibrated, government-required measurement of the remaining pollutant concentration. Some only require reporting of excursions (automated or not). Others work as soupspoon describes: the polluting system shuts down if you reach the legal limit. You set a lower alarm value to give you time, and there might be an override. If you use the override, you have to report the situation to the authorities.

And as PeteP notes, an even tighter logic applies to industrial safety systems, again incorporating legal requirements. Many hard, built-in limits. You can't go here. You can't push the machine beyond X. You can't access this. Can't turn this feature on, or can't turn it off. The system shuts down if you hit this or that operating point, or troublesome emergency measures act automatically.

Within those hard, mechanically enforced limits, you build soft alarm limits. Those can be overruled by an experienced operator, but the point is to keep the 'flexible' zone within the limits, not around them as seems typical in car traffic.

Seriously, cars and traffic are the exception, not the rule. I don't think there's any other kind of machinery that exhibits such prevalent risks (to operators and especially to uninvolved third parties), with such weak controls or enforcement.

It's the uniquely American obsession that conflates cars with freedom. I sorta get it, being steeped in car culture. But the future is bright, regardless of how much car people freak out. It's just too convenient.

Re: First Tesla Autopilot Death

Posted: Wed Feb 08, 2017 5:02 am UTC
by Tyndmyr
morriswalters wrote:There is some technology focused on driver attention. Cameras that can capture what you are looking at and rattle you if your focus leaves the road too long. Haptic feedback, seat shakers and that type of thing. Tesla monitors something, the steering wheel I think, and I believe they reduced the timeout.


I wouldn't be averse to such features solely for their own sake. Even as an optional system. I've driven tired before. Everyone has, because they were somewhere far away, or didn't realize they were getting sleepy, or whatever. It's a significant risk that you can take inadvertently. Warning systems for that strike me as a good option to save lives. I'd view such a system (provided I still have control over it to enable/disable to avoid annoyance) as a plus on a car.

Re: First Tesla Autopilot Death

Posted: Wed Feb 08, 2017 10:53 am UTC
by HES
I think most driver attention research uses eye tracking, which is getting to a price point where a consumer product would be viable.