From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
The website referred to appears to be collating information in
a reasonable and unemotional way.
Every Tesla Accident Resulting in Death (Tesla Deaths)
Gabe Goldberg <ga...@gabegold.com>
Thu, 24 Mar 2022 01:53:39 -0400
We provide an updated record of Tesla fatalities and Tesla accident deaths that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.). This sheet also tallies claimed and confirmed Tesla autopilot crashes, i.e. instances when
Autopilot was activated during a Tesla crash that resulted in death. Read
our other sheets for additional data and analysis on vehicle miles traveled, links and analysis comparing Musk's safety claims, and more.
Tesla Deaths Total as of 3/23/2022: 246
Tesla Autopilot Deaths Count: 12
https://www.tesladeaths.com/
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
The website referred to appears to be collating information in
a reasonable and unemotional way.
Every Tesla Accident Resulting in Death (Tesla Deaths)
Gabe Goldberg <ga...@gabegold.com>
Thu, 24 Mar 2022 01:53:39 -0400
We provide an updated record of Tesla fatalities and Tesla accident deaths
that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.). This sheet also tallies
claimed and confirmed Tesla autopilot crashes, i.e. instances when
Autopilot was activated during a Tesla crash that resulted in death. Read
our other sheets for additional data and analysis on vehicle miles traveled,
links and analysis comparing Musk's safety claims, and more.
Tesla Deaths Total as of 3/23/2022: 246
Tesla Autopilot Deaths Count: 12
https://www.tesladeaths.com/
Yeah, it's raw data. Did you have a point?
On 29/03/2022 15:00, Rickster wrote:
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
The website referred to appears to be collating information in
a reasonable and unemotional way.
Every Tesla Accident Resulting in Death (Tesla Deaths)
Gabe Goldberg <ga...@gabegold.com>
Thu, 24 Mar 2022 01:53:39 -0400
We provide an updated record of Tesla fatalities and Tesla accident deaths
that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.). This sheet also tallies
claimed and confirmed Tesla autopilot crashes, i.e. instances when
Autopilot was activated during a Tesla crash that resulted in death. Read
our other sheets for additional data and analysis on vehicle miles traveled,
links and analysis comparing Musk's safety claims, and more.
Tesla Deaths Total as of 3/23/2022: 246
Tesla Autopilot Deaths Count: 12
https://www.tesladeaths.com/
Yeah, it's raw data. Did you have a point?
Without comparisons to other types of car, and correlations with other factors, such raw data is useless. You'd need to compare to other
high-end electric cars, other petrol cars in similar price ranges and
styles. You'd want to look at statistics for "typical Tesla drivers"
(who are significantly richer than the average driver, but I don't know
what other characteristics might be relevant - age, gender, driving experience, etc.) You'd have to compare statistics for the countries
and parts of countries where Teslas are common.
And you would /definitely/ want to anonymise the data. If I had a
family member who was killed in a car crash, I would not be happy about
their name and details of their death being used for some sort of absurd Tesla hate-site.
I'm no fan of Teslas myself. I like a car to be controlled like a car,
not a giant iPhone (and I don't like iPhones either). I don't like the
heavy tax breaks given by Norway to a luxury car, and I don't like the environmental costs of making them (though I am glad to see improvements
on that front). I don't like some of the silly claims people make about
them - like Apple gadgets, they seem to bring out the fanboy in some of
their owners. But that's all just me and my personal preferences and opinions - if someone else likes them, that's fine. Many Tesla owners
are very happy with their cars (and some are unhappy - just as for any
other car manufacturer). I can't see any reason for trying to paint
them as evil death-traps - you'd need very strong statistical basis for
that, not just a list of accidents.
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data. Did you have a point?
I have no point.
I am curious about the causes of crashes when "autopilot" is engaged.
There is an attempt at comparisons, as stated in the FAQ.
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data. Did you have a point?
I have no point.
Fair enough, I suppose. But was there a reason for the post then?
I am curious about the causes of crashes when "autopilot" is engaged.
That's a reasonable thing to wonder about. The more we (people in
general, Tesla drivers, Tesla developers, etc.) know about such crashes,
the better the possibilities for fixing weaknesses or understanding how
to mitigate them. Unfortunately, the main mitigation - "don't rely on
autopilot; stay alert and focused on the driving" - does not work. For
one thing, many people don't obey it - people have been found in the
back seat of crashed Teslas where they were having a nap. And those
that try to follow it are likely to doze off from boredom.
However, there is no need for a list of "crashes involving Teslas",
names of victims, and a site with a clear agenda to "prove" that Teslas
are not as safe as they claim. It is counter-productive to real investigation and real learning.
There is an attempt at comparisons, as stated in the FAQ.
It is a pretty feeble attempt, hidden away.
Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.
The whole post just struck me as a bit below par for your usual high standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they compare to average drivers. But your post was not a great starting
point for that.
On 29/03/22 20:18, David Brown wrote:
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data. Did you have a point?
I have no point.
Fair enough, I suppose. But was there a reason for the post then?
Primarily to provoke thought and discussion, and
secondarily to point to occurrences that Tesla fanbois
and Musk prefer to sweep under the carpet.
I am curious about the causes of crashes when "autopilot" is engaged.
That's a reasonable thing to wonder about. The more we (people in
general, Tesla drivers, Tesla developers, etc.) know about such crashes,
the better the possibilities for fixing weaknesses or understanding how
to mitigate them. Unfortunately, the main mitigation - "don't rely on
autopilot; stay alert and focused on the driving" - does not work. For
one thing, many people don't obey it - people have been found in the
back seat of crashed Teslas where they were having a nap. And those
that try to follow it are likely to doze off from boredom.
Agreed.
Musk and his /very/ carefully worded advertising don't help
matters. That should be challenged by evidence.
I haven't seen such evidence collated anywhere else.
However, there is no need for a list of "crashes involving Teslas",
names of victims, and a site with a clear agenda to "prove" that Teslas
are not as safe as they claim. It is counter-productive to real
investigation and real learning.
As far as I can see the website does not name the dead.
The linked references may do.
Musk makes outlandish claims about his cars, which need
debunking in order to help prevent more unnecessary
accidents.
From https://catless.ncl.ac.uk/Risks/33/11/#subj3
"Weeks earlier, a Tesla using the company's advanced
driver-assistance system had crashed into a tractor-trailer
at about 70 mph, killing the driver. When National Highway
Traffic Safety Administration officials called Tesla
executives to say they were launching an investigation,
Musk screamed, protested and threatened to sue, said a
former safety official who spoke on the condition of
anonymity to discuss sensitive matters.
"The regulators knew Musk could be impulsive and stubborn;
they would need to show some spine to win his cooperation.
So they waited. And in a subsequent call, "when tempers were
a little bit cool, Musk agreed to cooperate: He was a
changed person." "
https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation
There is an attempt at comparisons, as stated in the FAQ.
It is a pretty feeble attempt, hidden away.
Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.
That's too strong, but I agree most ratios (including that one)
aren't that enlightening.
The whole post just struck me as a bit below par for your usual high
standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they
compare to average drivers. But your post was not a great starting
point for that.
Real world experiences aren't a bad /starting/ point, but
they do have limitations. Better starting points are to
be welcomed.
An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.
Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment,
I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
On 29/03/2022 17:17, Tom Gardner wrote:
On 29/03/22 15:16, David Brown wrote:
On 29/03/2022 15:00, Rickster wrote:
Yeah, it's raw data. Did you have a point?
I have no point.
Fair enough, I suppose. But was there a reason for the post then?
Primarily to provoke thought and discussion, and
secondarily to point to occurrences that Tesla fanbois
and Musk prefer to sweep under the carpet.
I am curious about the causes of crashes when "autopilot" is engaged.
That's a reasonable thing to wonder about. The more we (people in
general, Tesla drivers, Tesla developers, etc.) know about such crashes,
the better the possibilities for fixing weaknesses or understanding how
to mitigate them. Unfortunately, the main mitigation - "don't rely on
autopilot; stay alert and focused on the driving" - does not work. For
one thing, many people don't obey it - people have been found in the
back seat of crashed Teslas where they were having a nap. And those
that try to follow it are likely to doze off from boredom.
Agreed.
Musk and his /very/ carefully worded advertising don't help
matters. That should be challenged by evidence.
I haven't seen such evidence collated anywhere else.
But that site does not have evidence of anything relevant. It shows
that people sometimes die on the road, even in Teslas. Nothing more.
If the Tesla people are using false or misleading advertising, or making safety claims that can't be verified, then I agree they should be held accountable. Collect evidence to show that - /real/ comparisons and
/real/ statistics.
Progress was not made against tobacco companies by compiling lists of
people who smoked and then died. It was done by comparing the death
rates of people who smoked to those of people who don't smoke.
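The kind of rate comparison described above can be sketched in a few lines. The populations and death counts below are invented, purely to show the arithmetic of comparing rates rather than raw counts:

```python
# Hypothetical illustration (numbers invented, not real data): a raw list of
# deaths says nothing; death rates per exposed population are what matter.
smokers = {"population": 1_000_000, "deaths": 1_500}
non_smokers = {"population": 4_000_000, "deaths": 1_000}

def death_rate(group):
    """Deaths per 100,000 people in the group."""
    return group["deaths"] / group["population"] * 100_000

rate_s = death_rate(smokers)       # 150.0 per 100k
rate_n = death_rate(non_smokers)   # 25.0 per 100k
relative_risk = rate_s / rate_n    # 6.0

print(f"smokers: {rate_s:.1f}/100k, non-smokers: {rate_n:.1f}/100k, "
      f"relative risk: {relative_risk:.1f}x")
```

The same shape of calculation, with per-mile or per-driver denominators, is what a list of Tesla accidents on its own cannot support.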
However, there is no need for a list of "crashes involving Teslas",
names of victims, and a site with a clear agenda to "prove" that Teslas
are not as safe as they claim. It is counter-productive to real
investigation and real learning.
As far as I can see the website does not name the dead.
The linked references may do.
From your initial post (you read what you quoted, didn't you?) :
"""
We provide an updated record of Tesla fatalities and Tesla accident deaths that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.).
"""
Musk makes outlandish claims about his cars, which need
debunking in order to help prevent more unnecessary
accidents.
From https://catless.ncl.ac.uk/Risks/33/11/#subj3
"Weeks earlier, a Tesla using the company's advanced
driver-assistance system had crashed into a tractor-trailer
at about 70 mph, killing the driver. When National Highway
Traffic Safety Administration officials called Tesla
executives to say they were launching an investigation,
Musk screamed, protested and threatened to sue, said a
former safety official who spoke on the condition of
anonymity to discuss sensitive matters.
"The regulators knew Musk could be impulsive and stubborn;
they would need to show some spine to win his cooperation.
So they waited. And in a subsequent call, "when tempers were
a little bit cool, Musk agreed to cooperate: He was a
changed person." "
https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation
So people who know how to investigate these things are investigating
them. That's great. (It is also - in theory, at least - unbiased. The autopilot might not have been at fault.) It's a lot better than some
amateur with a grudge, an ignorance of statistics and a google document page.
There is an attempt at comparisons, as stated in the FAQ.
It is a pretty feeble attempt, hidden away.
Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.
That's too strong, but I agree most ratios (including that one)
aren't that enlightening.
No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but only a
small fraction of the deaths had the autopilot on, then clearly the
autopilot reduces risks and saves lives (of those that drive Teslas - we
still know nothing of other car drivers).
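That base-rate argument can be made concrete with a small numeric sketch. All figures below are invented purely for illustration; nothing here is real Tesla data:

```python
# Hypothetical numbers to make the base-rate point concrete (not real data).
total_deaths = 200
autopilot_deaths = 10              # a small fraction of all the deaths...
autopilot_share_of_driving = 0.5   # ...but half of all driving is on autopilot

# Deaths per unit of driving for each mode (the units cancel in the comparison):
rate_on = autopilot_deaths / autopilot_share_of_driving
rate_off = (total_deaths - autopilot_deaths) / (1 - autopilot_share_of_driving)

print(rate_on, rate_off)  # 20.0 vs 380.0 - autopilot looks far safer here

# The conclusion inverts if autopilot_share_of_driving is tiny, which is
# exactly why the raw death counts alone tell you nothing.
```

The whole argument hinges on the exposure denominator, which the site does not provide.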
The whole post just struck me as a bit below par for your usual high
standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they
compare to average drivers. But your post was not a great starting
point for that.
Real world experiences aren't a bad /starting/ point, but
they do have limitations. Better starting points are to
be welcomed.
Real world experiences are enough to say "this might be worth looking
at" - but no more than that.
An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.
Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment,
I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".
I don't doubt at all that the Tesla autopilot makes mistakes. So do
human drivers. The interesting question is who makes fewer mistakes, or
mistakes with lower consequences - and that is a question for which no
amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
The only evidence you have so far is that people love to show that
something fancy and expensive is not always perfect, and I believe we
knew that already.
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).
A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That's an inherent bias because while
non-autopilot driving must include all situations, autopilot simply
doesn't work in most environments.
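The bias Ricky describes is essentially Simpson's paradox: autopilot miles are concentrated on roads that are safer per mile anyway. A small sketch with invented numbers shows how pooled per-mile rates can flatter the autopilot even when it adds nothing within any road type:

```python
# Hypothetical sketch of the highway-mile confounder (all numbers invented).
miles = {  # millions of miles driven
    ("highway", "autopilot"): 900, ("highway", "manual"): 100,
    ("city", "autopilot"): 10, ("city", "manual"): 990,
}
deaths = {
    ("highway", "autopilot"): 9, ("highway", "manual"): 1,
    ("city", "autopilot"): 1, ("city", "manual"): 99,
}

def rate(road, mode):
    """Deaths per million miles for one road type and driving mode."""
    return deaths[(road, mode)] / miles[(road, mode)]

# Within each road type, autopilot and manual rates are identical here...
assert rate("highway", "autopilot") == rate("highway", "manual")
assert rate("city", "autopilot") == rate("city", "manual")

# ...yet the pooled comparison makes autopilot look roughly 8x safer,
# only because its miles are mostly highway miles:
pooled_ap = (9 + 1) / (900 + 10)
pooled_man = (1 + 99) / (100 + 990)
print(pooled_ap, pooled_man)
```

So a fair comparison would need to stratify by road type (and probably driver demographics) before comparing rates at all.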
I don't doubt at all that the Tesla autopilot makes mistakes.
An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.
Some of the dashcam "Tesla's making mistakes" videos on yootoob
aren't confidence inspiring. Based on one I saw, I certainly
wouldn't dare let a Tesla drive itself in an urban environment,
I suspect there isn't sufficient experience to assess relative
dangers between "artificial intelligence" and "natural
stupidity".
Which depends on how you define "mistakes".
It's a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left-turn lane on a 50
mph road that the car would want to turn into when intending to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?
So do human drivers. The interesting question is who makes fewer
mistakes, or mistakes with lower consequences - and that is a
question for which no amount of anecdotal yootoob videos or
Tesla/Musk hate sites will help. The only evidence you have so far
is that people love to show that something fancy and expensive is
not always perfect, and I believe we knew that already.
That's where they are headed with the full self driving. But gauging
the breadth of issues the car has problems with, I think it will be a
long, long time before we can sit back and relax while the car drives
us home.
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
<snip>
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).
A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That's an inherent bias because while
non-autopilot driving must include all situations, autopilot simply doesn't work in most environments.
Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.
I suspect - without statistical justification -
that the accidents
involving autopilot use are precisely cases where you don't have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
"This is not a good road for me - you have to drive yourself" and switch
itself off. (It would be more controversial, but probably statistically
safer, if it also sometimes said "I'm better at driving on this kind of
road than you are" and switched itself on!)
I don't doubt at all that the Tesla autopilot makes mistakes.
An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.
Some of the dashcam "Tesla's making mistakes" videos on yootoob
aren't confidence inspiring. Based on one I saw, I certainly
wouldn't dare let a Tesla drive itself in an urban environment,
I suspect there isn't sufficient experience to assess relative
dangers between "artificial intelligence" and "natural
stupidity".
Which depends on how you define "mistakes".
Of course.
It's a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left-turn lane on a 50
mph road that the car would want to turn into when intending to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?
There are a few possibilities here (though I am not trying to claim that
any of them are "right" in some objective sense). You might say they
had believed that the "autopilot" was like a plane autopilot -
you can
turn it on and leave it to safely drive itself for most of the journey except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake from the salesmen, advertisers, user-interface designers, or just the driver's mistake.
And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
<snip>
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).
A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways which are much safer per mile driven
than other roads. That's an inherent bias because while
non-autopilot driving must include all situations, autopilot simply
doesn't work in most environments.
Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.
I suspect - without statistical justification -
Yes, without justification, at all.
that the accidents
involving autopilot use are precisely cases where you don't have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
"This is not a good road for me - you have to drive yourself" and switch
itself off. (It would be more controversial, but probably statistically
safer, if it also sometimes said "I'm better at driving on this kind of
road than you are" and switching itself on!)
I don't doubt at all that the Tesla autopilot makes mistakes.
An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.
Some of the dashcam "Tesla's making mistakes" videos on yootoob
aren't confidence inspiring. Based on one I saw, I certainly
wouldn't dare let a Tesla drive itself in an urban environment,
I suspect there isn't sufficient experience to assess relative
dangers between "artificial intelligence" and "natural
stupidity".
Which depends on how you define "mistakes".
Of course.
It's a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left-turn lane on a 50
mph road that the car would want to turn into when intending to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?
There are a few possibilities here (though I am not trying to claim that
any of them are "right" in some objective sense). You might say they
had believed that the "autopilot" was like a plane autopilot - you can
turn it on and leave it to safely drive itself for most of the journey
except perhaps the very beginning and very end of the trip. As you say,
the Tesla autopilot is /not/ designed for that - that might be a mistake
from the salesmen, advertisers, user-interface designers, or just the
driver's mistake.
It is exactly like an airplane autopilot. Sorry, that's not how an
autopilot works. It doesn't fly the plane. It simply maintains a
heading and altitude. Someone still has to be watching for other
aircraft and otherwise flying the plane. In other words, the pilot is
responsible for flying the plane, with or without the autopilot.
And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.
The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.
On 01/04/2022 00:29, Ricky wrote:
Sorry, that's not how an autopilot works. It doesn't fly the plane. It
simply maintains a heading and altitude.
Someone still has to be watching
for other aircraft and otherwise flying the plane. In other words, the
pilot is responsible for flying the plane, with or without the autopilot.
Yes, that's the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/, is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.
On 31/03/22 23:39, David Brown wrote:
> On 01/04/2022 00:29, Ricky wrote:
>> Sorry, that's not how an autopilot works. It doesn't fly the plane. It
>> simply maintains a heading and altitude.
They have been doing more than that for over 50 years.
Cat 3b landings were in operation when I was a kid.
>> Someone still has to be watching
>> for other aircraft and otherwise flying the plane. In other words, the
>> pilot is responsible for flying the plane, with or without the
>> autopilot.
> Yes, that's the original idea of a plane autopilot. But modern ones are
> more sophisticated and handle course changes along the planned route,
> as well as being able to land automatically. And more important than
> what plane autopilots actually /do/, is what people /think/ they do -
> and remember we are talking about drivers that think their Tesla
> "autopilot" will drive their car while they watch a movie or nap in the
> back seat.
And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.
Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?
On 01/04/2022 02:19, Tom Gardner wrote:
> On 31/03/22 23:39, David Brown wrote:
>> On 01/04/2022 00:29, Ricky wrote:
>>> Someone still has to be watching
>>> for other aircraft and otherwise flying the plane. In other words,
>>> the pilot is responsible for flying the plane, with or without the
>>> autopilot.
>> Yes, that's the original idea of a plane autopilot. But modern ones
>> are more sophisticated and handle course changes along the planned
>> route, as well as being able to land automatically. And more important
>> than what plane autopilots actually /do/, is what people /think/ they
>> do - and remember we are talking about drivers that think their Tesla
>> "autopilot" will drive their car while they watch a movie or nap in
>> the back seat.
> And, to put it kindly, aren't discouraged in that misapprehension
> by the statements of the cars' manufacturers and salesdroids.
> Now, what's the best set of techniques to get that concept
> into the heads of twats that think "autopilot" means "it does
> it for me".
You don't. Twats will always be twats. You fix the cars.
You start by changing the name. "Driver assistance" rather than
"autopilot".
You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)
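David's escalation proposal (hands off the wheel, then a warning, then an automatic stop, with longer grace periods in easier modes) amounts to a small timing table. A minimal sketch of that idea follows; the mode names and threshold values are invented for illustration and reflect no real vendor's logic:

```python
# Sketch of the proposed dead-man's-handle escalation. All thresholds
# and mode names are invented for illustration, not from a real system.
ESCALATION = {            # seconds of hands-off driving before each action
    "normal":   {"warn": 2.0,  "stop": 6.0},
    "motorway": {"warn": 10.0, "stop": 20.0},   # autopilot works better here
    "queue":    {"warn": 30.0, "stop": 60.0},   # slow traffic, longest grace
}

def assist_action(mode: str, hands_off_seconds: float) -> str:
    """Return what the driver-assistance system should do right now."""
    limits = ESCALATION[mode]
    if hands_off_seconds >= limits["stop"]:
        return "pull over and stop"
    if hands_off_seconds >= limits["warn"]:
        return "beep violently"
    return "assist normally"
```

One nicety of the table form: adding David's "motorway" or "traffic queue" mode only changes data, not the escalation logic itself.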
On 3/31/2022 5:19 PM, Tom Gardner wrote:
> On 31/03/22 23:39, David Brown wrote:
>> On 01/04/2022 00:29, Ricky wrote:
>>> Sorry, that's not how an autopilot works. It doesn't fly the plane.
>>> It simply maintains a heading and altitude.
> They have been doing more than that for over 50 years.
> Cat 3b landings were in operation when I was a kid.
>>> Someone still has to be watching
>>> for other aircraft and otherwise flying the plane. In other words,
>>> the pilot is responsible for flying the plane, with or without the
>>> autopilot.
>> Yes, that's the original idea of a plane autopilot. But modern ones
>> are more sophisticated and handle course changes along the planned
>> route, as well as being able to land automatically. And more important
>> than what plane autopilots actually /do/, is what people /think/ they
>> do - and remember we are talking about drivers that think their Tesla
>> "autopilot" will drive their car while they watch a movie or nap in
>> the back seat.
> And, to put it kindly, aren't discouraged in that misapprehension
> by the statements of the cars' manufacturers and salesdroids.
> Now, what's the best set of techniques to get that concept
> into the heads of twats that think "autopilot" means "it does
> it for me".
"Pilots" and "drivers" approach their efforts entirely differently
and with different mindsets.
ANYONE can drive a car. By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.
I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) "autopilot".
On 01/04/22 08:46, Don Y wrote:
> On 3/31/2022 5:19 PM, Tom Gardner wrote:
>> On 31/03/22 23:39, David Brown wrote:
>>> On 01/04/2022 00:29, Ricky wrote:
>>>> Sorry, that's not how an autopilot works. It doesn't fly the plane.
>>>> It simply maintains a heading and altitude.
>> They have been doing more than that for over 50 years.
>> Cat 3b landings were in operation when I was a kid.
>>>> Someone still has to be watching
>>>> for other aircraft and otherwise flying the plane. In other words,
>>>> the pilot is responsible for flying the plane, with or without the
>>>> autopilot.
>>> Yes, that's the original idea of a plane autopilot. But modern ones
>>> are more sophisticated and handle course changes along the planned
>>> route, as well as being able to land automatically. And more
>>> important than what plane autopilots actually /do/, is what people
>>> /think/ they do - and remember we are talking about drivers that
>>> think their Tesla "autopilot" will drive their car while they watch
>>> a movie or nap in the back seat.
>> And, to put it kindly, aren't discouraged in that misapprehension
>> by the statements of the cars' manufacturers and salesdroids.
>> Now, what's the best set of techniques to get that concept
>> into the heads of twats that think "autopilot" means "it does
>> it for me".
> "Pilots" and "drivers" approach their efforts entirely differently
> and with different mindsets.
They should do in one sense (differing machine/automation)
and shouldn't in another (both are lethal instruments).
Problem starts with the marketing.
> ANYONE can drive a car. By contrast, a fair bit more understanding,
> reasoning and skill is required to pilot an aircraft.
Not entirely sure about that. A 14yo can fly solo, and a
very few are even aerobatic pilots.
The main difference is that you can't stop and catch
your breath, or stop and have a pee.
Overall, learning to fly a glider is pretty much similar
to learning to drive - in cost, time and skill. The training
is more rigorous, though, and isn't a one-off event.
> I.e., a pilot is a lot more likely to understand the function
> AND LIMITATIONS of an (aircraft) autopilot than a driver is to
> have similar appreciation for an (automobile) "autopilot".
Pilots often don't understand what's going on; just
listen to the accident reports on the news :(
On 4/1/2022 3:44 AM, Tom Gardner wrote:
> On 01/04/22 08:46, Don Y wrote:
>>>>>> Sorry, that's not how an autopilot works. It doesn't fly the
>>>>>> plane. It simply maintains a heading and altitude.
>>>> They have been doing more than that for over 50 years.
>>>> Cat 3b landings were in operation when I was a kid.
>>>>>> Someone still has to be watching
>>>>>> for other aircraft and otherwise flying the plane. In other words,
>>>>>> the pilot is responsible for flying the plane, with or without the
>>>>>> autopilot.
>>>>> Yes, that's the original idea of a plane autopilot. But modern ones
>>>>> are more sophisticated and handle course changes along the planned
>>>>> route, as well as being able to land automatically. And more
>>>>> important than what plane autopilots actually /do/, is what people
>>>>> /think/ they do - and remember we are talking about drivers that
>>>>> think their Tesla "autopilot" will drive their car while they watch
>>>>> a movie or nap in the back seat.
>>>> And, to put it kindly, aren't discouraged in that misapprehension
>>>> by the statements of the cars' manufacturers and salesdroids.
>>>> Now, what's the best set of techniques to get that concept
>>>> into the heads of twats that think "autopilot" means "it does
>>>> it for me".
>> "Pilots" and "drivers" approach their efforts entirely differently
>> and with different mindsets.
> They should do in one sense (differing machine/automation)
> and shouldn't in another (both are lethal instruments).
> Problem starts with the marketing.
Cars are far more ubiquitous, and navigating one is a 2-dimensional
activity. An "average joe" isn't likely to think he's gonna "hop in a
Piper Cub" and be off on a jaunt to run errands - and flying is a
3-dimensional undertaking (you don't worry about vehicles "above" or
"below" when driving!)
>> ANYONE can drive a car. By contrast, a fair bit more understanding,
>> reasoning and skill is required to pilot an aircraft.
> Not entirely sure about that. 14yo can be solo, and a
> very few are even aerobatic pilots.
And a "youngster" can drive a car (or other sort of motorized vehicle,
e.g., on a farm or other private property). The 16yo (15.5) restriction
only applies to use on public roadways.
Cars are "simple" to operate; can-your-feet-reach-the-pedals being the
only practical criterion. I'd wager *I* would have a hard time walking
up to an aircraft, "cold", and trying to sort out how to get it off the
ground...
> The main difference is that you can't stop and catch
> your breath, or stop and have a pee.
> Overall learning to fly a glider is pretty much similar
> to learning to drive - in cost, time and skill.
But not opportunity. I'd have to spend a fair bit of effort researching
where to gain access to any sort of aircraft. OTOH, I can readily
"borrow" (with consent) any of my neighbors' vehicles and operate all of
them in a fairly consistent manner: sports cars, trucks, commercial
trucks, even motorcycles (though never having driven one before!).
> The training
> is more rigorous, though, and isn't a one-off event.
It's likely more technical, too. Most auto-driving instruction deals
with laws, not the technical "piloting" of the vehicle. The driving test
is similarly focused on whether or not you put that law knowledge into
effect (did you stop *at* the proper point? did you observe the speed
limit and other posted requirements?)
[When taking the test for my *first* DL, the DMV was notorious for
having a stop sign *in* the (tiny) parking lot -- in an unexpected
place. Folks who weren't observant -- or tipped off to this ahead
of time -- were "failed" before ever getting out on the roadway!]
Testing for a CDL (commercial) is considerably different; you are
quizzed on technical details of the vehicle that affect the safety of
you and others on the roadway -- because you are operating a much more
"lethal" vehicle (over 26,000 pounds GVW). You also have to prove
yourself medically *fit* to operate (not color blind, not an insulin
user, "controlled" blood pressure, not epileptic or alcoholic, etc.)!
And, other "endorsements" have further requirements (e.g., hauling
tandem/triples, hazardous products, etc.)
>> I.e., a pilot is a lot more likely to understand the function
>> AND LIMITATIONS of an (aircraft) autopilot than a driver is to
>> have similar appreciation for an (automobile) "autopilot".
> Pilots often don't understand what's going on; just
> listen to the accident reports on the news :(
I think those events are caused by cognitive overload, not ignorance.
On 01/04/2022 00:29, Ricky wrote:
> On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
>> On 31/03/2022 22:44, Ricky wrote:
>>> On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
>>>> On 30/03/2022 00:54, Tom Gardner wrote:
>>>>> On 29/03/22 20:18, David Brown wrote:
<snip>
>>>> No, it is not "too strong". It is basic statistics. Bayes' theorem,
>>>> and all that. If a large proportion of people use autopilot, but
>>>> only a small fraction of the deaths had the autopilot on, then
>>>> clearly the autopilot reduces risks and saves lives (of those that
>>>> drive Teslas - we still know nothing of other car drivers).
>>> A simple comparison of numbers is not sufficient. Most Tesla
>>> autopilot usage is on highways which are much safer per mile driven
>>> than other roads. That's an inherent bias because while
>>> non-autopilot driving must include all situations, autopilot simply
>>> doesn't work in most environments.
>> Yes. An apples-to-apples comparison is the aim, or at least as close
>> as one can get. I suspect - without statistical justification -
>> that the accidents involving autopilot use are precisely cases where
>> you don't have a good, clear highway, and autopilot was used in a
>> situation where it was not suitable. Getting good statistics and
>> comparisons here could be helpful in making it safer - perhaps adding
>> a feature that has the autopilot say "This is not a good road for me -
>> you have to drive yourself" and switch itself off. (It would be more
>> controversial, but probably statistically safer, if it also sometimes
>> said "I'm better at driving on this kind of road than you are" and
>> switching itself on!)
> Yes, without justification, at all.
Which do /you/ think is most likely? Autopilot crashes on the motorway,
or autopilot crashes on smaller roads?
>>>>> An issue is, of course, that any single experience can be
>>>>> dismissed as an unrepresentative aberration. Collation of
>>>>> experiences is necessary.
>>>>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
>>>>> aren't confidence inspiring. Based on one I saw, I certainly
>>>>> wouldn't dare let a Tesla drive itself in an urban environment.
>>>>> I suspect there isn't sufficient experience to assess relative
>>>>> dangers between "artificial intelligence" and "natural
>>>>> stupidity".
>>>> Of course. I don't doubt at all that the Tesla autopilot makes
>>>> mistakes.
>>> Which depends on how you define "mistakes". It's a bit like asking
>>> if your rear view mirror makes mistakes by not showing cars in the
>>> blind spot. The autopilot is not designed to drive the car. It is a
>>> tool to assist the driver. The driver is required to be responsible
>>> for the safe operation of the car at all times. I can point out to
>>> you the many, many times the car acts like a spaz and requires me to
>>> manage the situation. Early on, there was a left turn lane on a 50
>>> mph road the car would want to turn into when intending to drive
>>> straight. Fortunately they have ironed out that level of issue. But
>>> it was always my responsibility to prevent it from causing an
>>> accident. So how would you say anything was the fault of the
>>> autopilot?
>> There are a few possibilities here (though I am not trying to claim
>> that any of them are "right" in some objective sense). You might say
>> they had believed that the "autopilot" was like a plane autopilot -
> It is exactly like an airplane autopilot.
>> you can turn it on and leave it to safely drive itself for most of the
>> journey except perhaps the very beginning and very end of the trip. As
>> you say, the Tesla autopilot is /not/ designed for that - that might
>> be a mistake from the salesmen, advertisers, user-interface designers,
>> or just the driver's mistake.
> Sorry, that's not how an autopilot works. It doesn't fly the plane. It
> simply maintains a heading and altitude. Someone still has to be
> watching for other aircraft and otherwise flying the plane. In other
> words, the pilot is responsible for flying the plane, with or without
> the autopilot.
Yes, that's the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/, is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.
>> And sometimes the autopilot does something daft - it is no longer
>> assisting the driver, but working against him or her. That, I think,
>> should be counted as a mistake by the autopilot.
> The Tesla autopilot can barely manage to go 10 miles without some sort
> of glitch. "Daft" is not a very useful term, as it means whatever you
> want it to mean. "I know it when I see it." Hard to design to that
> sort of specification.
Well, "does something daft" is no worse than "acts like a spaz", and
it's a good deal more politically correct!
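The statistical disagreement in this subthread (raw death counts versus per-mile rates, and the highway bias Ricky points out) can be made concrete with a toy calculation. All numbers below are invented purely to show the confounding effect; they are not from tesladeaths.com or any other source:

```python
# Invented figures: autopilot miles are concentrated on motorways, which
# are safer per mile anyway, so pooled death counts mislead.
miles = {                      # billions of miles driven (invented)
    ("autopilot", "motorway"): 4.0,
    ("autopilot", "other"):    0.5,
    ("manual",    "motorway"): 2.0,
    ("manual",    "other"):    6.0,
}
deaths = {                     # fatal crashes (invented)
    ("autopilot", "motorway"): 8,
    ("autopilot", "other"):    4,
    ("manual",    "motorway"): 4,
    ("manual",    "other"):    48,
}

def rate(mode: str, road: str) -> float:
    """Deaths per billion miles for one mode on one road type."""
    return deaths[(mode, road)] / miles[(mode, road)]

def pooled_rate(mode: str) -> float:
    """Deaths per billion miles ignoring road type (the naive comparison)."""
    d = sum(v for (m, _), v in deaths.items() if m == mode)
    b = sum(v for (m, _), v in miles.items() if m == mode)
    return d / b

# With these numbers, autopilot and manual driving are identical per road
# type (2.0 on motorways, 8.0 elsewhere), yet the pooled rates make
# autopilot look far safer, simply because its miles are mostly easy
# motorway miles.
```

This is exactly why "an apples-to-apples comparison is the aim": the like-for-like figures are the per-road-type rates, not the pooled ones.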
On 01/04/22 12:32, Don Y wrote:
> On 4/1/2022 3:44 AM, Tom Gardner wrote:
>>>>>> Sorry, that's not how an autopilot works. It doesn't fly the
>>>>>> plane. It simply maintains a heading and altitude.
>>>> They have been doing more than that for over 50 years.
>>>> Cat 3b landings were in operation when I was a kid.
>>>>>> Someone still has to be watching
>>>>>> for other aircraft and otherwise flying the plane. In other words,
>>>>>> the pilot is responsible for flying the plane, with or without the
>>>>>> autopilot.
>>>>> Yes, that's the original idea of a plane autopilot. But modern ones
>>>>> are more sophisticated and handle course changes along the planned
>>>>> route, as well as being able to land automatically. And more
>>>>> important than what plane autopilots actually /do/, is what people
>>>>> /think/ they do - and remember we are talking about drivers that
>>>>> think their Tesla "autopilot" will drive their car while they watch
>>>>> a movie or nap in the back seat.
>>>> And, to put it kindly, aren't discouraged in that misapprehension
>>>> by the statements of the cars' manufacturers and salesdroids.
>>>> Now, what's the best set of techniques to get that concept
>>>> into the heads of twats that think "autopilot" means "it does
>>>> it for me".
>>> "Pilots" and "drivers" approach their efforts entirely differently
>>> and with different mindsets.
>> They should do in one sense (differing machine/automation)
>> and shouldn't in another (both are lethal instruments).
>> Problem starts with the marketing.
> Cars are far more ubiquitous, and navigating one is a 2-dimensional
> activity. An "average joe" isn't likely to think he's gonna "hop in a
> Piper Cub" and be off on a jaunt to run errands - and flying is a
> 3-dimensional undertaking (you don't worry about vehicles "above" or
> "below" when driving!)
True, but it doesn't change any of my points.
>>> ANYONE can drive a car. By contrast, a fair bit more understanding,
>>> reasoning and skill is required to pilot an aircraft.
>> Not entirely sure about that. 14yo can be solo, and a
>> very few are even aerobatic pilots.
> And a "youngster" can drive a car (or other sort of motorized vehicle,
> e.g., on a farm or other private property). The 16yo (15.5) restriction
> only applies to use on public roadways.
12yo fly across the country with an instructor behind.
14yo can do it on their own.
Daughter was driving my car and a double decker bus at 15yo,
on the runway and peritrack :)
> Cars are "simple" to operate; can-your-feet-reach-the-pedals being the
> only practical criterion. I'd wager *I* would have a hard time walking
> up to an aircraft, "cold", and trying to sort out how to get it off
> the ground...
Same is true of a glider. There are only 4 controls: rudder,
stick, airbrake, cable release. Two instruments, airspeed
and barometer (i.e. height differential).
You are taught to do without them, because they all lie to
you.
>> The main difference is that you can't stop and catch
>> your breath, or stop and have a pee.
>> Overall learning to fly a glider is pretty much similar
>> to learning to drive - in cost, time and skill.
> But not opportunity. I'd have to spend a fair bit of effort researching
> where to gain access to any sort of aircraft. OTOH, I can readily
> "borrow" (with consent) any of my neighbors' vehicles and operate all
> of them in a fairly consistent manner: sports cars, trucks, commercial
> trucks, even motorcycles (though never having driven one before!).
True, but it doesn't change any of my points.
>> The training
>> is more rigorous, though, and isn't a one-off event.
> It's likely more technical, too. Most auto-driving instruction deals
> with laws, not the technical "piloting" of the vehicle. The driving
> test is similarly focused on whether or not you put that law knowledge
> into effect (did you stop *at* the proper point? did you observe the
> speed limit and other posted requirements?)
Not much is required to go solo.
Does the glider's responsiveness indicate you are flying
fast enough; are you at a reasonable height in the circuit;
what to do when you find you aren't, and when the cable snaps.
> [When taking the test for my *first* DL, the DMV was notorious for
> having a stop sign *in* the (tiny) parking lot -- in an unexpected
> place. Folks who weren't observant -- or tipped off to this ahead
> of time -- were "failed" before ever getting out on the roadway!]
Pre-solo tests include the instructor putting you in a
stupid position, and saying "now get us back safely".
> Testing for a CDL (commercial) is considerably different; you are
> quizzed on technical details of the vehicle that affect the safety of
> you and others on the roadway -- because you are operating a much more
> "lethal" vehicle (over 26,000 pounds GVW). You also have to prove
> yourself medically *fit* to operate (not color blind, not an insulin
> user, "controlled" blood pressure, not epileptic or alcoholic, etc.)!
Ditto being an instructor or having a passenger.
> And, other "endorsements" have further requirements (e.g., hauling
> tandem/triples, hazardous products, etc.)
Ditto flying cross country or in clouds.
>>> I.e., a pilot is a lot more likely to understand the function
>>> AND LIMITATIONS of an (aircraft) autopilot than a driver is to
>>> have similar appreciation for an (automobile) "autopilot".
That's true for the aircraft, but nobody has developed
an autopilot for gliders. You have to stay awake and feel (literally,
by the seat of your pants) what's happening. The nearest
to an autopilot is a moving map airspace display.
>> Pilots often don't understand what's going on; just
>> listen to the accident reports on the news :(
> I think those events are caused by cognitive overload, not ignorance.
Not always, e.g. the recent 737 crashes.
On 01/04/22 08:08, David Brown wrote:
> On 01/04/2022 02:19, Tom Gardner wrote:
>> On 31/03/22 23:39, David Brown wrote:
>>> On 01/04/2022 00:29, Ricky wrote:
>>>> Someone still has to be watching
>>>> for other aircraft and otherwise flying the plane. In other words,
>>>> the pilot is responsible for flying the plane, with or without the
>>>> autopilot.
>>> Yes, that's the original idea of a plane autopilot. But modern ones
>>> are more sophisticated and handle course changes along the planned
>>> route, as well as being able to land automatically. And more
>>> important than what plane autopilots actually /do/, is what people
>>> /think/ they do - and remember we are talking about drivers that
>>> think their Tesla "autopilot" will drive their car while they watch
>>> a movie or nap in the back seat.
>> And, to put it kindly, aren't discouraged in that misapprehension
>> by the statements of the cars' manufacturers and salesdroids.
>> Now, what's the best set of techniques to get that concept
>> into the heads of twats that think "autopilot" means "it does
>> it for me".
> You don't. Twats will always be twats. You fix the cars.
> You start by changing the name. "Driver assistance" rather than
> "autopilot".
That's one of the things I was thinking of.
> You turn the steering wheel into a dead-man's handle - if the driver
> releases it for more than, say, 2 seconds, the autopilot should first
> beep violently, then pull over and stop the car if the driver does not
> pay attention.
I've wondered why they don't implement that, then realised
it would directly contradict their advertising.
> You turn the steering wheel into a dead-man's handle - if the driver
> releases it for more than, say, 2 seconds, the autopilot should first
> beep violently, then pull over and stop the car if the driver does not
> pay attention. (Maybe you have "motorway mode" that allows a longer
> delay time, since autopilot works better there, and perhaps also a
> "traffic queue" mode with even longer delays.)
All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.
I recently got a car that came standard with 'lane assist'. I
hate it. It's like having a passenger tugging on the steering wheel,
absolutely intolerable. It also can't be switched off permanently.
For the first week or two, I just blindfolded the camera it uses to
watch the road, until I found out how to switch it off with a single
button press. (There are far too many buttons, for that matter, and
all with multiple functions, too. Bad!)
That said, some automatic functions /are/ good. Climate control with
a real thermostat, auto-darkening rear view mirrors, mostly functions
that have nothing to do with driving per se. The only /good/ automatic
functions are those you don't notice until they /stop/ working.
I also like the GPS with head-up display.
On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
> On 01/04/2022 02:19, Tom Gardner wrote:
>> On 31/03/22 23:39, David Brown wrote:
>>> On 01/04/2022 00:29, Ricky wrote:
>>>> Someone still has to be watching
>>>> for other aircraft and otherwise flying the plane. In other words,
>>>> the pilot is responsible for flying the plane, with or without the
>>>> autopilot.
>>> Yes, that's the original idea of a plane autopilot. But modern ones
>>> are more sophisticated and handle course changes along the planned
>>> route, as well as being able to land automatically. And more
>>> important than what plane autopilots actually /do/, is what people
>>> /think/ they do - and remember we are talking about drivers that
>>> think their Tesla "autopilot" will drive their car while they watch
>>> a movie or nap in the back seat.
>> And, to put it kindly, aren't discouraged in that misapprehension
>> by the statements of the cars' manufacturers and salesdroids.
>> Now, what's the best set of techniques to get that concept
>> into the heads of twats that think "autopilot" means "it does
>> it for me".
> You don't. Twats will always be twats. You fix the cars.
> You start by changing the name. "Driver assistance" rather than
> "autopilot".
> You turn the steering wheel into a dead-man's handle - if the driver
> releases it for more than, say, 2 seconds, the autopilot should first
> beep violently, then pull over and stop the car if the driver does not
> pay attention. (Maybe you have "motorway mode" that allows a longer
> delay time, since autopilot works better there, and perhaps also a
> "traffic queue" mode with even longer delays.)
Do you know anything about how the Tesla autopilot actually works?
Anything at all?
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
> On 31/03/22 23:39, David Brown wrote:
>> On 01/04/2022 00:29, Ricky wrote:
>>> Sorry, that's not how an autopilot works. It doesn't fly the
>>> plane. It simply maintains a heading and altitude.
> They have been doing more than that for over 50 years. Cat 3b
> landings were in operation when I was a kid.
>>> Someone still has to be watching for other aircraft and
>>> otherwise flying the plane. In other words, the pilot is
>>> responsible for flying the plane, with or without the
>>> autopilot.
>> Yes, that's the original idea of a plane autopilot. But modern
>> ones are more sophisticated and handle course changes along the
>> planned route, as well as being able to land automatically. And
>> more important than what plane autopilots actually /do/, is what
>> people /think/ they do - and remember we are talking about
>> drivers that think their Tesla "autopilot" will drive their car
>> while they watch a movie or nap in the back seat.
> And, to put it kindly, aren't discouraged in that misapprehension
> by the statements of the cars' manufacturers and salesdroids.
> Now, what's the best set of techniques to get that concept into the
> heads of twats that think "autopilot" means "it does it for me".
That's Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of "twats" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.
On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)
All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.
"Assistance" should be intuitive. You don't even NOTICE the power
steering, brakes, autotranny, etc. "assistants" in a vehicle.
Because, for the most part, the way they operate is largely invariant
of driver, driving conditions, etc. (How often do folks use anything
other than "D(rive)" and "R(everse)"?) Is there a way to *disable*
the power steering? Or brakes? Should there be?
My favorite is the side mirrors tilting downwards (to afford a view
of the ground) when backing up. The backup camera is a win as we back into our garage and it helps avoid backing INTO something. These would be less necessary with a "lower profile" vehicle, though.
[I also like the trip computer automatically resetting at each trip
and "fill up"]
On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
On 31/03/2022 22:44, Ricky wrote:
On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
On 30/03/2022 00:54, Tom Gardner wrote:
On 29/03/22 20:18, David Brown wrote:
<snip>
No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but
only a small fraction of the deaths had the autopilot on, then
clearly the autopilot reduces risks and saves lives (of those that
drive Teslas - we still know nothing of other car drivers).
A simple comparison of numbers is not sufficient. Most Tesla
autopilot usage is on highways, which are much safer per mile driven
than other roads. That's an inherent bias, because while
non-autopilot driving must include all situations, autopilot simply
doesn't work in most environments.
Yes. An apples-to-apples comparison is the aim, or at least as close
as one can get. I suspect - without statistical justification - that
the accidents involving autopilot use are precisely cases where you
don't have a good, clear highway, and autopilot was used in a
situation where it was not suitable. Getting good statistics and
comparisons here could be helpful in making it safer - perhaps adding
a feature that has the autopilot say "This is not a good road for me -
you have to drive yourself" and switch itself off. (It would be more
controversial, but probably statistically safer, if it also sometimes
said "I'm better at driving on this kind of road than you are" and
switched itself on!)
Yes, without justification, at all.
Which do /you/ think is most likely? Autopilot crashes on the
motorway, or autopilot crashes on smaller roads?
Because autopilot doesn't work off the highway (it can't make turns,
for example), more often autopilot-involved crashes are on the
highways.
I recall a news article that said experimenters were able to fool
autopilot into making a left turn at an intersection by putting two or
three small squares on the roadway. In city driving the limitations
are at a level that no one would try to use it.
I don't doubt at all that the Tesla autopilot makes mistakes.
Of course. An issue is that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.
Some of the dashcam "Tesla's making mistakes" videos on yootoob
aren't confidence inspiring. Based on one I saw, I certainly
wouldn't dare let a Tesla drive itself in an urban environment.
I suspect there isn't sufficient experience to assess relative
dangers between "artificial intelligence" and "natural
stupidity".
Which depends on how you define "mistakes". It's a bit like asking
if your rear view mirror makes mistakes by not showing cars in the
blind spot. The autopilot is not designed to drive the car. It is a
tool to assist the driver. The driver is required to be responsible
for the safe operation of the car at all times. I can point out to
you the many, many times the car acts like a spaz and requires me to
manage the situation. Early on, there was a left turn lane on a 50
mph road the car would want to turn into when intending to drive
straight. Fortunately they have ironed out that level of issue. But
it was always my responsibility to prevent it from causing an
accident. So how would you say anything was the fault of the
autopilot?
There are a few possibilities here (though I am not trying to claim
that any of them are "right" in some objective sense). You might say
they had believed that the "autopilot" was like a plane autopilot -
you can turn it on and leave it to safely drive itself for most of the
journey except perhaps the very beginning and very end of the trip.
As you say, the Tesla autopilot is /not/ designed for that - that
might be a mistake from the salesmen, advertisers, user-interface
designers, or just the driver's mistake.
It is exactly like an airplane autopilot.
Sorry, that's not how an autopilot works. It doesn't fly the plane. It
simply maintains a heading and altitude. Someone still has to be
watching for other aircraft and otherwise flying the plane. In other
words, the pilot is responsible for flying the plane, with or without
the autopilot.
Yes, that's the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/ is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.
Great! But the autopilot is not watching for other aircraft, not
monitoring communications and not able to deal with any unusual events.
You keep coming back to a defective idea that autopilot means the
airplane is flying itself. It's not! Just like in the car, there is a
pilot whose job is to fly/drive and assure safety.
As to the movie idea, no, people don't think that. People might
"pretend" that, but there's no level of "thinking" that says you can
climb in the back seat while driving. Please don't say silly things.
And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her. That, I think,
should be counted as a mistake by the autopilot.
The Tesla autopilot can barely manage to go 10 miles without some sort
of glitch. "Daft" is not a very useful term, as it means what you want
it to mean. "I know it when I see it." Hard to design to that sort of
specification.
Well, "does something daft" is no worse than "acts like a spaz", and
it's a good deal more politically correct!
Bzzzz. Sorry, you failed.
I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) "autopilot".
That's true for the aircraft, but nobody has developed
an autopilot. You have to stay awake and feel (literally,
by the seat of your pants) what's happening. The nearest
thing to an autopilot is a moving map airspace display.
Commercial aircraft rely on autopilots. In a sense, it is
an easier (navigation) problem to solve -- there's no real "traffic"
or other obstacles beyond the airports (assuming you maintain your
assigned flight corridor/speed). The same is true of railways
and waterways (more or less).
Cars operate in a much more challenging environment. Even "on the open road", a condition can arise that needs immediate driver attention
(witness these 50-car pileups).
Note how poorly "seasoned" drivers adapt to the first snowfall of
the season. (Really? Did you FORGET what this stuff was like??)
Do they do any special (mental?) prep prior to getting behind the
wheel, in those cases? Or, just "wing it", assuming "it will
come back to them"?
Pilots often don't understand what's going on; just
listen to the accident reports on the news :(
I think those events are caused by cognitive overload, not ignorance.
Not always, e.g. the recent 737 crashes.
So, a defect in an autopilot implementation can be similarly excused?
On 2022-04-01 15:38, Don Y wrote:
On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)
All these 'assistants' with their multiple 'modes' only make things
more complicated and therefore unsafe. Simple is better.
"Assistance" should be intuitive. You don't even NOTICE the power
steering, brakes, autotranny, etc. "assistants" in a vehicle.
Because, for the most part, the way they operate is largely invariant
of driver, driving conditions, etc. (how often do folks use anything
other than "D(rive)" and "R(everse)"? Is there a way to *disable*
the power steering? Or brakes? Should there be?
I much prefer a simple stick shift. I can tell what state it's in
by touch, and there is not the slightest doubt about it. That isn't
true for an automatic. You need to /look/ at what state it's in.
They're too temperamental for my taste, refusing to change state under
certain conditions. Same for the electric parking brake. It took me a
while to figure out it refuses to disengage when I'm not wearing a
seat belt. Sheesh! Talk about weird interactions!
Power steering and brakes are in the set of assists that normally
go unnoticed until they fail. (Provided they are essentially linear,
smooth, without discontinuity or other surprise behaviour.)
[...]
My favorite is the side mirrors tilting downwards (to afford a view
of the ground) when backing up. The backup camera is a win as we back
into our garage and it helps avoid backing INTO something. These would
be less necessary with a "lower profile" vehicle, though.
[I also like the trip computer automatically resetting at each trip
and "fill up"]
Yes, got that too, and I agree those are good features.
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
<snip>
And, to put it kindly, they aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.
Now, what's the best set of techniques to get that concept into the
heads of twats that think "autopilot" means "it does it for me"?
That's Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of "twats" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.
Are you suggesting that none of the people who drive Teslas are twats?
(Maybe that term is too British for you.)
And are you suggesting that only Tesla drivers are affected by Tesla
crashes? Obviously they will be disproportionally affected, but motor accidents often involve other people and other cars. And while Tesla
may be leading the way in car "autopiloting", others are following - the strengths and weaknesses of Tesla's systems are relevant to other car manufacturers.
On 01/04/2022 14:44, Ricky wrote:
On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
On 01/04/2022 02:19, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
<snip>
Now, what's the best set of techniques to get that concept
into the heads of twats that think "autopilot" means "it does
it for me"?
You don't. Twats will always be twats. You fix the cars.
You start by changing the name. "Driver assistance" rather than
"autopilot".
You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)
Do you know anything about how the Tesla autopilot actually works?
Anything at all?
A little - but not a lot, and no personal experience.
So fill in the details here.
You've already told us that it is designed for things like motorway
driving (or "highway" driving). Presumably you stick by that, and
therefore agree that any restrictions on the autopilot should be lower
for motorway driving than for more "challenging" driving such as town
roads or small, twisty country roads.
People already manage to read newspapers or eat their breakfast in
traffic queues, in purely manual cars. Do you think autopilot can
handle that kind of traffic?
My suggestion is that a way to ensure people have more focus on driving
is to require contact with the steering wheel. I am happy to hear your objections to that idea, or to alternative thoughts.
Improper use of autopilot (and other automation in all kinds of cars)
leads to a higher risk of accidents. I expect that proper use can lower risk. Do you disagree with these two claims?
Do you think Tesla's autopilot is perfect as it is, or is there room for improvement?
Do you actually want to contribute something to this thread, or do you
just want to attack any post that isn't Tesla fanboy support? (Your
answers to the previous questions will cover this one too.)
On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
<snip>
Now, what's the best set of techniques to get that concept into the
heads of twats that think "autopilot" means "it does it for me"?
That's Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of "twats" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.
Are you suggesting that none of the people who drive Teslas are twats?
(Maybe that term is too British for you.)
The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?
And are you suggesting that only Tesla drivers are affected by Tesla
crashes? Obviously they will be disproportionally affected, but motor
accidents often involve other people and other cars. And while Tesla
may be leading the way in car "autopiloting", others are following - the
strengths and weaknesses of Tesla's systems are relevant to other car
manufacturers.
Now I have no idea why you have brought this up from left field. Is "left field" too American for you? That's from a sport called "baseball", not to be confused with "blernsball".
On 01/04/2022 14:38, Ricky wrote:
<snip>
Because autopilot doesn't work off the highway (it can't make turns,
for example), more often autopilot-involved crashes are on the
highways.
I was not aware of that limitation. Thanks for providing some relevant
information.
<snip>
Great! But the autopilot is not watching for other aircraft, not
monitoring communications and not able to deal with any unusual events.
You keep coming back to a defective idea that autopilot means the
airplane is flying itself. It's not! Just like in the car, there is a
pilot whose job is to fly/drive and assure safety.
I am fully aware that plane autopilots are limited. I am also aware
that they are good enough (in planes equipped with modern systems) to
allow pilots to let the system handle most of the flight itself, even
including landing. The pilot is, of course, expected to be paying
attention, watching for other aircraft, communicating with air traffic
controllers and all the rest of it. But there have been cases of pilots
falling asleep, or missing their destination because they were playing
around on their laptops. What people /should/ be doing, and what they
are /actually/ doing, is not always the same.
As to the movie idea, no, people don't think that. People might
"pretend" that, but there's no level of "thinking" that says you can
climb in the back seat while driving. Please don't say silly things.
You can google for "backseat Tesla drivers" as well as I can. I am
confident that some of these are staged, and equally confident that some
are not. There is no minimum level of "thinking" - no matter how daft
something might be, there is always a dafter person who will think it's
a good idea.
<snip>
The Tesla autopilot can barely manage to go 10 miles without some sort
of glitch. "Daft" is not a very useful term, as it means what you want
it to mean. "I know it when I see it." Hard to design to that sort of
specification.
Well, "does something daft" is no worse than "acts like a spaz", and
it's a good deal more politically correct!
Bzzzz. Sorry, you failed.
Really? You think describing the autopilot's actions as "acts like a
spaz" is useful and specific, while "does something daft" is not? As
for the political correctness - find a real spastic and ask them what
they think of your phrase.
On Friday, April 1, 2022 at 10:29:58 AM UTC-4, David Brown wrote:
On 01/04/2022 14:38, Ricky wrote:
On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
Well, "does something daft" is no worse than "acts like a spaz", and
it's a good deal more politically correct!
Bzzzz. Sorry, you failed.
Really? You think describing the autopilot's actions as "acts like a
spaz" is useful and specific, while "does something daft" is not?
As for the political correctness - find a real spastic and ask them
what they think of your phrase.
How do you know what is meant by "spaz"? That's my point. Words
like that are not well defined. I intended the word to be colorful,
with no particular meaning. Your use of daft was in a statement that
needed much more detail to be meaningful. Besides, if I jump off a
cliff, are you going to jump as well?
On 01/04/2022 18:39, Ricky wrote:
On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
<snip>
Now, what's the best set of techniques to get that concept into the
heads of twats that think "autopilot" means "it does it for me"?
That's Tom Gardner level misinformation. Comments about what people
think are spurious and unsubstantiated. A class of "twats" can be
invented that think anything. Nothing matters other than what Tesla
owners think. They are the ones driving the cars.
Are you suggesting that none of the people who drive Teslas are twats?
(Maybe that term is too British for you.)
The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?
It means "a stupid person" or "someone who does stupid things". No, not everyone who drives a Tesla is a twat - but /some/ are, such as those
that think their autopilot will drive the car without them paying attention.
And are you suggesting that only Tesla drivers are affected by Tesla
crashes? Obviously they will be disproportionally affected, but motor
accidents often involve other people and other cars. And while Tesla
may be leading the way in car "autopiloting", others are following - the >> strengths and weaknesses of Tesla's systems are relevant to other car
manufacturers.
Now I have no idea why you have brought this up from left field. Is "left field" too American for you? That's from a sport called "baseball", not to be confused with "blernsball".
You said that Tesla autopilots are only relevant to Tesla drivers.
That's wrong. I usually prefer to give a bit of explanation as to why I
think someone is wrong.
I don't really understand what you mean about "restrictions". Again, I
think your image of how it works is not how it works. I don't know
enough of your image to know how to explain to you what you have wrong.
Autopilot will try to keep the car in a lane, and recognize lights,
stop signs, exit ramps and vehicles. When on appropriate highways, it
will work in Navigate on Autopilot, where it can change lanes (pass
slow vehicles, get out of the passing lane, etc.) and take exits. It
will stop for traffic lights, but cannot navigate turns at
intersections or even twisty roads. When it sees something that upsets
it, it will sound the alarm (that should be ALARM) and insist you take
over. One such situation is blinking yellow lights at an intersection
with light traffic. The autopilot never understands this light can be
driven through.
Please, go to a Tesla forum and read about the cars a bit. It would save me a lot of typing.
On 01/04/2022 18:25, Ricky wrote:
Please, go to a Tesla forum and read about the cars a bit. It would save me a lot of typing.
No, thanks.
As so often seems to happen in this group, this thread is going nowhere.
I'm not interested enough in Teslas to start reading forums, brochures,
or other information. You are not interested in sharing more than the occasional titbit of information, and it must be dragged out of you through frustrated and somewhat unfriendly comments on both sides - you prefer telling people they are wrong to offering corrections or your own opinion. I think sometimes this group brings out the worst in people,
even when the worst members of the group are not involved in the thread
- we develop some bad habits here. It is frustrating.
I leave the thread /marginally/ better informed than I started.
On 01/04/22 17:25, Ricky wrote:
I don't really understand what you mean about "restrictions". Again, I think your image of how it works is not how it works. I don't know enough of your image to know how to explain to you what you have wrong.
Autopilot will try to keep the car in a lane, recognize lights, stop signs, exit ramps and vehicles. When on appropriate highways, it will work in Navigate on Autopilot, where it can change lanes (pass slow vehicles, get out of the passing lane, etc.) and take exits. It will stop for traffic lights, but cannot navigate turns at intersections or even twisty roads. When it sees something that upsets it, it will sound the alarm (that should be ALARM) and insist you take over. One such situation is blinking yellow lights at an intersection with light traffic. The autopilot never understands this light can be driven through.
You need to datestamp your description of the autopilot's capabilities and foibles - Tesla keeps updating it. IMHO that's a problem, since the car's behaviour today might be significantly different to when you last drove it. And the driver probably won't even know the difference exists; would you read the overnight "change notes" when all you want to do is drive to the shops?
Example: https://www.pluscars.net/tesla-autopilot-automatically-stopped-at-red-light-for-the-first-time-105-news
It sounds like your Tesla doesn't have the autopilot mentioned, viz "$7,000 full autonomous driving package recognizes traffic lights and stop signs. It provides autonomous driving in the city."
Sounds like a lot more than "highway only".
On Friday, April 1, 2022 at 2:39:47 PM UTC-4, Tom Gardner wrote:
On 01/04/22 17:25, Ricky wrote:
<snipped: my description of what Autopilot does and doesn't do>
You need to datestamp your description of the autopilot's capabilities and foibles - Tesla keeps updating it. IMHO that's a problem, since the car's behaviour today might be significantly different to when you last drove it. And the driver probably won't even know the difference exists; would you read the overnight "change notes" when all you want to do is drive to the shops?
Example:
https://www.pluscars.net/tesla-autopilot-automatically-stopped-at-red-light-for-the-first-time-105-news
Yes, I said it will stop for lights and stop signs. What's your point?
It sounds like your Tesla doesn't have the autopilot mentioned, viz "$7,000 full autonomous driving package recognizes traffic lights and stop signs. It provides autonomous driving in the city."
Autopilot is not "autonomous driving", period. That's the point. You don't say where you read this, but the price puts it sometime in the 2020 timeframe or older. But then, when I bought mine, there were multiple choices available. "Full self driving" was the highest level, which was paying for something that is not on the road. They are now beta testing, but not "autonomous" driving.
Sounds like a lot more than "highway only".
Hard to tell. This is very unlikely to have come from Tesla as they never refer to it as "autonomous" driving. They use brand names.
I will ask again that you read my posts and read them thoroughly. You clearly are not doing that.
If you aren't going to read the posts, then don't reply. Ok?
BTW, I did a Google search on your quote and it didn't turn up a match. So it would seem you got that from a phone call or something that Google doesn't yet crawl.
It did turn up this which is very interesting.
https://nypost.com/2022/03/31/court-orders-tesla-to-buy-back-model-3-in-autopilot-case/
'Tesla says FSD and its attendant features require “active driver supervision and do not make the vehicle autonomous.”'
On 01/04/22 21:12, Ricky wrote:
On Friday, April 1, 2022 at 2:39:47 PM UTC-4, Tom Gardner wrote:
<snipped: quoted text from the preceding posts>
Yes, I said it will stop for lights and stop signs. What's your point?
Read the paragraph above the word "Example:".
If you aren't going to read the posts, then don't reply. Ok?
Mirror.
BTW, I did a Google search on your quote and it didn't turn up a match. So it would seem you got that from a phone call or something that Google doesn't yet crawl.
It is in the pluscars article referenced! What was that you were saying about reading posts thoroughly?
It did turn up this which is very interesting.
https://nypost.com/2022/03/31/court-orders-tesla-to-buy-back-model-3-in-autopilot-case/
'Tesla says FSD and its attendant features require “active driver supervision and do not make the vehicle autonomous.”'
Yes. Musk is backtracking from some of his earlier outrageous claims.
On 01/04/22 14:07, Don Y wrote:
<snipped many points where we are talking about
different classes of aircraft and air traffic>
I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) "autopilot".
That's true for the aircraft, but nobody has developed a glider autopilot. You have to stay awake and feel (literally, by the seat of your pants) what's happening. The nearest thing to an autopilot is a moving map airspace display.
Commercial aircraft rely on autopilots. In a sense, it is
an easier (navigation) problem to solve -- there's no real "traffic"
or other obstacles beyond the airports (assuming you maintain your
assigned flight corridor/speed). The same is true of railways
and waterways (more or less).
Er, no.
You are considering a small part of air traffic, that
in controlled airspace.
Many flights, powered and unpowered, happen outside
controlled airspace, where the rule is to look out
of the cockpit for converging traffic.
On one occasion I watched a commercial airliner
taking off a thousand feet below me. Hercules buzz
around too. Then there are balloons, hang gliders
and the like.
There are even rules as to which side of roads and
railways you should fly on, so that there aren't
head-on collisions between aircraft following the
same ground feature in opposite directions.
Gliders frequently operate very near each other,
especially in thermals and when landing. They
also have to spot other gliders coming straight
at them when ridge flying; not trivial to spot
a white blob the size of a motorbike's front
converging at 120mph.
To help cope with that, some gliders are equipped
with FLARMs - short range radio transmitters to
indicate the direction of other gliders and whether
you are likely to hit them.
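FLARM's actual protocol and prediction algorithms are proprietary, so the following is only a rough sketch of the underlying geometry: given each glider's broadcast position and velocity, a receiver can estimate the time and distance of closest approach under a straight-line assumption. All numbers here are made up for illustration.

```python
import math

def closest_approach(p1, v1, p2, v2):
    # 2-D positions in metres, velocities in m/s; assumes both
    # gliders hold course and speed (FLARM's real models are smarter).
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    vv = vx * vx + vy * vy
    if vv == 0.0:
        # No relative motion: separation never changes.
        return 0.0, math.hypot(rx, ry)
    # Time minimising |r + v*t|, clamped so we only warn about the future.
    t = max(0.0, -(rx * vx + ry * vy) / vv)
    return t, math.hypot(rx + vx * t, ry + vy * t)

# Head-on ridge scenario: ~27 m/s each way, 1500 m apart, 30 m lateral offset.
t, miss = closest_approach((0, 0), (27, 0), (1500, 30), (-27, 0))
# t is about 28 s to closest approach, with a 30 m miss distance -
# close enough that you would want an alarm.
```

Note the closing speed in that scenario is 54 m/s, roughly the 120mph convergence mentioned above, which is why the warning horizon has to be tens of seconds.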
Cars operate in a much more challenging environment. Even "on the open
road", a condition can arise that needs immediate driver attention
(witness these 50-car pileups).
Note how poorly "seasoned" drivers adapt to the first snowfall of
the season. (Really? Did you FORGET what this stuff was like??)
Do they do any special (mental?) prep prior to getting behind the
wheel, in those cases? Or, just "wing it", assuming "it will
come back to them"?
Pilots often don't understand what's going on; just
listen to the accident reports on the news :(
I think those events are caused by cognitive overload, not ignorance.
Not always, e.g. the recent 737 crashes.
So, a defect in an autopilot implementation can be similarly excused?
Que? Strawman.
Please just go read a bit about them. There is tons of info. Even
weighing just the electrons to read it all, it's still tons! How
many electrons in a ton, anyway?
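For what it's worth, the throwaway question has an answer: divide a metric ton by the electron rest mass. A back-of-the-envelope sketch:

```python
# Electron rest mass (CODATA 2018 value, kg).
ELECTRON_MASS_KG = 9.1093837015e-31
TONNE_KG = 1000.0  # one metric ton

electrons_per_tonne = TONNE_KG / ELECTRON_MASS_KG
# Roughly 1.1e33 electrons - so "tons of info" really is a lot of electrons.
```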
On 29/03/2022 15:00, Rickster wrote:
On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
The website referred to appears to be collating information in
a reasonable and unemotional way.
Every Tesla Accident Resulting in Death (Tesla Deaths)
Gabe Goldberg <ga...@gabegold.com>
Thu, 24 Mar 2022 01:53:39 -0400
We provide an updated record of Tesla fatalities and Tesla accident deaths that have been reported and as much related crash data as possible (e.g. location of crash, names of deceased, etc.). This sheet also tallies claimed and confirmed Tesla autopilot crashes, i.e. instances when Autopilot was activated during a Tesla crash that resulted in death. Read our other sheets for additional data and analysis on vehicle miles traveled, links and analysis comparing Musk's safety claims, and more.
Tesla Deaths Total as of 3/23/2022: 246
Tesla Autopilot Deaths Count: 12
https://www.tesladeaths.com/
Yeah, it's raw data. Did you have a point?
Without comparisons to other types of car, and correlations with other factors, such raw data is useless. You'd need to compare to other high-end electric cars, other petrol cars in similar price ranges and styles. You'd want to look at statistics for "typical Tesla drivers" (who are significantly richer than the average driver, but I don't know what other characteristics might be relevant - age, gender, driving experience, etc.) You'd have to compare statistics for the countries and parts of countries where Teslas are common.
And you would /definitely/ want to anonymise the data. If I had a family member who was killed in a car crash, I would not be happy about their name and details of their death being used for some sort of absurd Tesla hate-site.
I'm no fan of Teslas myself. I like a car to be controlled like a car, not a giant iPhone (and I don't like iPhones either). I don't like the heavy tax breaks given by Norway to a luxury car, and I don't like the environmental costs of making them (though I am glad to see improvements on that front). I don't like some of the silly claims people make about them - like Apple gadgets, they seem to bring out the fanboy in some of their owners. But that's all just me and my personal preferences and opinions - if someone else likes them, that's fine. Many Tesla owners are very happy with their cars (and some are unhappy - just as for any other car manufacturer). I can't see any reason for trying to paint them as evil death-traps - you'd need a very strong statistical basis for that, not just a list of accidents.
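The comparisons argued for above boil down to normalising by exposure rather than counting raw deaths. A minimal sketch of the arithmetic - the mileage and fleet figures below are placeholders for illustration, not actual Tesla or fleet data:

```python
def deaths_per_billion_miles(deaths, vehicle_miles):
    # Fatality rate per billion vehicle-miles travelled (VMT),
    # the usual unit for road-safety comparisons.
    return deaths / (vehicle_miles / 1e9)

# Placeholder numbers for illustration only:
tesla_rate = deaths_per_billion_miles(246, 20e9)        # 12.3 per billion miles
fleet_rate = deaths_per_billion_miles(40_000, 3.2e12)   # 12.5 per billion miles

# A raw count of 246 looks tiny next to 40,000, but the *rates* can
# still be comparable once exposure is taken into account.
```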
On Tue, 29 Mar 2022 16:16:56 +0200, David Brown
<david...@hesbynett.no> wrote:
<snipped: David Brown's post, quoted in full above>
Yeahbut the problem is conventional cars don't tend to lock all the
doors trapping everyone inside and then burst into flames like Teslas
do. Call me old fashioned if you like, but I don't see that as being a
great selling point.
I expect you'd rather drive a Pinto.
On 01/04/2022 18:39, Ricky wrote:
On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
On 01/04/2022 14:42, Ricky wrote:
On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
On 31/03/22 23:39, David Brown wrote:
On 01/04/2022 00:29, Ricky wrote:
Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude.
They have been doing more than that for > 50 years. Cat 3b landings were in operation when I was a kid.
Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.
Yes, that's the original idea of a plane autopilot. But modern ones are more sophisticated and handle course changes along the planned route, as well as being able to land automatically. And more important than what plane autopilots actually /do/, is what people /think/ they do - and remember we are talking about drivers that think their Tesla "autopilot" will drive their car while they watch a movie or nap in the back seat.
And, to put it kindly, aren't discouraged in that misapprehension by the statements of the cars' manufacturers and salesdroids.
Now, what's the best set of techniques to get that concept into the heads of twats that think "autopilot" means "it does it for me".
That's Tom Gardner level misinformation. Comments about what people think are spurious and unsubstantiated. A class of "twats" can be invented that think anything. Nothing matters other than what Tesla owners think. They are the ones driving the cars.
Are you suggesting that none of the people who drive Teslas are twats? (Maybe that term is too British for you.)
The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?
It means "a stupid person" or "someone who does stupid things". No, not everyone who drives a Tesla is a twat - but /some/ are, such as those that think their autopilot will drive the car without them paying attention.
On Fri, 1 Apr 2022 18:59:33 +0200, David Brown <david...@hesbynett.no> wrote:
<snipped: the "twat" exchange, quoted in full above>
It means "a stupid person" or "someone who does stupid things". No, not everyone who drives a Tesla is a twat - but /some/ are, such as those that think their autopilot will drive the car without them paying attention.
The term "autopilot" kind of suggests you can do precisely that, though.
I don't like these half-baked ideas. Don't call a self-driving car that until it's *truly* and *safely* autonomous. The worst situation is giving the driver a false sense of security - which is where we are currently.
I know this road well. The speed limit is 20mph, but during the
day 5-10mph is typical.
Given that, the depth to which the Tesla is buried is surprising.
I wonder what will be determined to be the cause of the accident.
(There are many electric scooters around there, some legal,
many not)
https://www.bristolpost.co.uk/news/bristol-news/tesla-goes-flying-window-shop-6905246
On Sunday, April 3, 2022 at 8:19:43 PM UTC-4, Cursitor Doom wrote:
On Fri, 1 Apr 2022 18:59:33 +0200, David Brown <david...@hesbynett.no> wrote:
<snipped: the "twat" exchange, quoted in full above>
The term "autopilot" kind of suggests you can do precisely that, though.
I don't like these half-baked ideas. Don't call a self-driving car that until it's *truly* and *safely* autonomous. The worst situation is giving the driver a false sense of security - which is where we are currently.
In the Tesla "autopilot", "full self driving" and autonomous are three different things. Autopilot is what you can buy and use today. The information on it clearly says you are responsible for driving the vehicle and this is just an "assist". In particular, the car will sound an alarm if you don't maintain a detectable grip on the steering wheel. Full self driving (FSD) is a future product that a few are allowed to beta test. I've never seen anything that suggests you do not need to be in control of the car.
If you ignore what a company tells you about the limitations of its products, that's on you. It puts you in the category of people who sue ladder companies because they didn't tell you to not use it in a pig pen.
--
Rick C.
-+-- Get 1,000 miles of free Supercharging
-+-- Tesla referral code - https://ts.la/richard11209