• Every Tesla Accident Resulting in Death

    From Tom Gardner@21:1/5 to All on Tue Mar 29 12:12:16 2022
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <gabe@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies
    claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled,
    links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Rickster@21:1/5 to Tom Gardner on Tue Mar 29 06:00:51 2022
    On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <ga...@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled, links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    Yeah, it's raw data. Did you have a point?

    --

    Rick C.

    - Get 1,000 miles of free Supercharging
    - Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Rickster on Tue Mar 29 16:16:56 2022
    On 29/03/2022 15:00, Rickster wrote:
    On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <ga...@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies
    claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled,
    links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    Yeah, it's raw data. Did you have a point?


    Without comparisons to other types of car, and correlations with other
    factors, such raw data is useless. You'd need to compare to other
    high-end electric cars, other petrol cars in similar price ranges and
    styles. You'd want to look at statistics for "typical Tesla drivers"
    (who are significantly richer than the average driver, but I don't know
    what other characteristics might be relevant - age, gender, driving
    experience, etc.) You'd have to compare statistics for the countries
    and parts of countries where Teslas are common.

    And you would /definitely/ want to anonymise the data. If I had a
    family member who was killed in a car crash, I would not be happy about
    their name and details of their death being used for some sort of absurd
    Tesla hate-site.

    I'm no fan of Teslas myself. I like a car to be controlled like a car,
    not a giant iPhone (and I don't like iPhones either). I don't like the
    heavy tax breaks given by Norway to a luxury car, and I don't like the
    environmental costs of making them (though I am glad to see improvements
    on that front). I don't like some of the silly claims people make about
    them - like Apple gadgets, they seem to bring out the fanboy in some of
    their owners. But that's all just me and my personal preferences and
    opinions - if someone else likes them, that's fine. Many Tesla owners
    are very happy with their cars (and some are unhappy - just as for any
    other car manufacturer). I can't see any reason for trying to paint
    them as evil death-traps - you'd need a very strong statistical basis for
    that, not just a list of accidents.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to David Brown on Tue Mar 29 16:17:42 2022
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <ga...@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies
    claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled,
    links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    Yeah, it's raw data. Did you have a point?

    I have no point.

    I am curious about the causes of crashes when "autopilot" is engaged.


    Without comparisons to other types of car, and correlations with other
    factors, such raw data is useless. You'd need to compare to other
    high-end electric cars, other petrol cars in similar price ranges and
    styles. You'd want to look at statistics for "typical Tesla drivers"
    (who are significantly richer than the average driver, but I don't know
    what other characteristics might be relevant - age, gender, driving
    experience, etc.) You'd have to compare statistics for the countries
    and parts of countries where Teslas are common.

    And you would /definitely/ want to anonymise the data. If I had a
    family member who was killed in a car crash, I would not be happy about
    their name and details of their death being used for some sort of absurd
    Tesla hate-site.

    I'm no fan of Teslas myself. I like a car to be controlled like a car,
    not a giant iPhone (and I don't like iPhones either). I don't like the
    heavy tax breaks given by Norway to a luxury car, and I don't like the
    environmental costs of making them (though I am glad to see improvements
    on that front). I don't like some of the silly claims people make about
    them - like Apple gadgets, they seem to bring out the fanboy in some of
    their owners. But that's all just me and my personal preferences and
    opinions - if someone else likes them, that's fine. Many Tesla owners
    are very happy with their cars (and some are unhappy - just as for any
    other car manufacturer). I can't see any reason for trying to paint
    them as evil death-traps - you'd need a very strong statistical basis for
    that, not just a list of accidents.

    There is an attempt at comparisons, as stated in the FAQ.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Doe@21:1/5 to Tom Gardner on Tue Mar 29 16:29:44 2022
    Tom Gardner wrote:

    This sheet also tallies claimed and confirmed Tesla autopilot crashes,
    i.e. instances when Autopilot was activated during a Tesla crash

    Would be better without that "i.e."
    Maybe "active" but not "activated".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Rickster@21:1/5 to Tom Gardner on Tue Mar 29 10:28:11 2022
    On Tuesday, March 29, 2022 at 11:17:49 AM UTC-4, Tom Gardner wrote:
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <ga...@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies
    claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled,
    links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    Yeah, it's raw data. Did you have a point?
    I have no point.

    I am curious about the causes of crashes when "autopilot" is engaged.

    What do you expect to learn by posting this here? Autopilot is not perfect, by any means. They tell you to remain alert just as if you were driving, and in fact the car monitors your grasp on the wheel, alerting you if you relax too much.

    The point is that when the car crashes on autopilot, it is the driver's fault, not the car's, because the car is just a driving assistance tool, like a blind spot warning device. If you smack someone in your blind spot, whose fault is that? Yours, because
    the tool is not perfect.

    I know one accident occurred at a highway divide where the guy had previously had the car try to go up the middle, rather than left or right. He even posted that the car was trying to kill him. One day he did something wrong at that same spot and
    wrecked the car, killing himself.

    Have you learned anything new yet?

    --

    Rick C.

    + Get 1,000 miles of free Supercharging
    + Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Tom Gardner on Tue Mar 29 21:18:04 2022
    On 29/03/2022 17:17, Tom Gardner wrote:
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    Yeah, it's raw data.  Did you have a point?

    I have no point.


    Fair enough, I suppose. But was there a reason for the post then?

    I am curious about the causes of crashes when "autopilot" is engaged.


    That's a reasonable thing to wonder about. The more we (people in
    general, Tesla drivers, Tesla developers, etc.) know about such crashes,
    the better the possibilities for fixing weaknesses or understanding how
    to mitigate them. Unfortunately, the main mitigation - "don't rely on
    autopilot; stay alert and focused on the driving" - does not work. For
    one thing, many people don't obey it - people have been found in the
    back seat of crashed Teslas where they were having a nap. And those
    that try to follow it are likely to doze off from boredom.

    However, there is no need for a list of "crashes involving Teslas",
    names of victims, and a site with a clear agenda to "prove" that Teslas
    are not as safe as they claim. It is counter-productive to real
    investigation and real learning.



    There is an attempt at comparisons, as stated in the FAQ.

    It is a pretty feeble attempt, hidden away.

    Even the comparison of "autopilot" deaths to total deaths is useless
    without information about autopilot use, and how many people rely on it.

    The whole post just struck me as a bit below par for your usual high
    standard. There's definitely an interesting thread possibility around
    the idea of how safe or dangerous car "autopilots" can be, and how they
    compare to average drivers. But your post was not a great starting
    point for that.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to David Brown on Tue Mar 29 23:54:34 2022
    On 29/03/22 20:18, David Brown wrote:
    On 29/03/2022 17:17, Tom Gardner wrote:
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    Yeah, it's raw data.  Did you have a point?

    I have no point.


    Fair enough, I suppose. But was there a reason for the post then?

    Primarily to provoke thought and discussion, and
    secondarily to point to occurrences that Tesla fanbois
    and Musk prefer to sweep under the carpet.


    I am curious about the causes of crashes when "autopilot" is engaged.


    That's a reasonable thing to wonder about. The more we (people in
    general, Tesla drivers, Tesla developers, etc.) know about such crashes,
    the better the possibilities for fixing weaknesses or understanding how
    to mitigate them. Unfortunately, the main mitigation - "don't rely on
    autopilot; stay alert and focused on the driving" - does not work. For
    one thing, many people don't obey it - people have been found in the
    back seat of crashed Teslas where they were having a nap. And those
    that try to follow it are likely to doze off from boredom.

    Agreed.

    Musk and his /very/ carefully worded advertising don't help
    matters. That should be challenged by evidence.

    I haven't seen such evidence collated anywhere else.



    However, there is no need for a list of "crashes involving Teslas",
    names of victims, and a site with a clear agenda to "prove" that Teslas
    are not as safe as they claim. It is counter-productive to real
    investigation and real learning.

    As far as I can see the website does not name the dead.
    The linked references may do.

    Musk makes outlandish claims about his cars, which need
    debunking in order to help prevent more unnecessary
    accidents.

    From https://catless.ncl.ac.uk/Risks/33/11/#subj3
    "Weeks earlier, a Tesla using the company's advanced
    driver-assistance system had crashed into a tractor-trailer
    at about 70 mph, killing the driver. When National Highway
    Traffic Safety Administration officials called Tesla
    executives to say they were launching an investigation,
    Musk screamed, protested and threatened to sue, said a
    former safety official who spoke on the condition of
    anonymity to discuss sensitive matters.

    "The regulators knew Musk could be impulsive and stubborn;
    they would need to show some spine to win his cooperation.
    So they waited. And in a subsequent call, 'when tempers were
    a little bit cool, Musk agreed to cooperate: He was a
    changed person.'"
    https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation



    There is an attempt at comparisons, as stated in the FAQ.

    It is a pretty feeble attempt, hidden away.

    Even the comparison of "autopilot" deaths to total deaths is useless
    without information about autopilot use, and how many people rely on it.

    That's too strong, but I agree most ratios (including that one)
    aren't that enlightening.


    The whole post just struck me as a bit below par for your usual high
    standard. There's definitely an interesting thread possibility around
    the idea of how safe or dangerous car "autopilots" can be, and how they
    compare to average drivers. But your post was not a great starting
    point for that.

    Real world experiences aren't a bad /starting/ point, but
    they do have limitations. Better starting points are to
    be welcomed.

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on
    yootoob aren't confidence inspiring. Based on one I saw,
    I certainly wouldn't dare let a Tesla drive itself in
    an urban environment.

    I suspect there isn't sufficient experience to assess
    relative dangers between "artificial intelligence" and
    "natural stupidity".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Tom Gardner on Tue Mar 29 18:38:08 2022
    On 3/29/2022 3:54 PM, Tom Gardner wrote:
    Some of the dashcam "Teslas making mistakes" videos on
    yootoob aren't confidence inspiring. Based on one I saw,
    I certainly wouldn't dare let a Tesla drive itself in
    an urban environment.

    +1

    I'm not sure I'd rely on any of these technologies to do
    more than *help* me (definitely not *replace* me!)

    E.g., I like the LIDAR warning me that a vehicle is
    about to pass behind my parked car as I'm backing out...
    because I often can't see "a few seconds" off to each
    side, given the presence of other vehicles parked on each
    side of mine. But, I still look over my shoulder AND
    watch the backup camera as I pull out.

    I suspect there isn't sufficient experience to assess
    relative dangers between "artificial intelligence" and
    "natural stupidity".

    I'm not sure it can all be distilled to "natural stupidity".

    When we last looked for a new vehicle, one of the salespersons
    commented on some of this "advisory tech" with such exuberance:
    "Oh, yeah! It works GREAT! I don't even bother to *look*,
    anymore!"

    And, to the average Joe, why should they HAVE to "look" if
    the technology was (allegedly) performing that function?
    ("Oh, do you mean it doesn't really *work*? Then why are
    you charging me for it? If I couldn't rely on the engine,
    would you tell me to always wear good WALKING SHOES when
    I set out on a drive???!")

    And, "laziness" is often an issue.

    I designed a LORAN-C-based autopilot (boat) in the '70s. You
    typed in lat-lons of your destinations (a series) and the autopilot
    would get you to them, correcting for drift ("cross-track error")
    to ensure straight-line travel (a conventional autopilot just
    kept the vessel pointed in the desired direction, so ocean currents
    would steadily push you off your desired course).
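
    An illustrative sketch of that cross-track correction in Python
    (hypothetical code, not the original instrument's firmware; it
    assumes a flat local x/y frame in metres, with x east and y north,
    and a simple clamped proportional correction):

    import math

    def cross_track_error(pos, leg_start, leg_end):
        # Signed distance (m) from pos to the line leg_start -> leg_end;
        # positive means the vessel is to the right of the track.
        (x, y), (x1, y1), (x2, y2) = pos, leg_start, leg_end
        dx, dy = x2 - x1, y2 - y1
        return ((x - x1) * dy - (y - y1) * dx) / math.hypot(dx, dy)

    def heading_command(pos, leg_start, leg_end, gain=0.5, max_corr=30.0):
        # Steer the leg's bearing plus a clamped correction that nudges
        # the vessel back onto the track line.
        (x1, y1), (x2, y2) = leg_start, leg_end
        bearing = math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360.0
        xte = cross_track_error(pos, leg_start, leg_end)
        corr = max(-max_corr, min(max_corr, -gain * xte))
        return (bearing + corr) % 360.0

    # 100 m east (right) of a due-north leg -> steer a little west of north
    print(heading_command((100.0, 500.0), (0.0, 0.0), (0.0, 1000.0)))  # 330.0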

    There was considerable debate about how to handle the sequencing
    of destinations:
    - should you automatically replace the current destination with
    the *next* in the series, having reached the current? (and, what
    do you use as criteria for reaching that current destination)
    - should you require manual intervention to advance to the next
    destination, having reached the current? And, if so, how will
    the skipper know what the vessel's path will be AFTER overshooting
    the destination? The autopilot will keep trying to return the vessel
    to that position -- no control over throttle -- but how do you
    anticipate the path that it will take in doing so?
    - should you be able to alert the skipper to reaching the current
    destination (in case he's in the stern of the vessel prepping
    lobster pots for deployment)?
    - should you incorporate throttle controls (what if you cut the
    throttle on reaching the destination and the vessel then drifts
    away from that spot)?
    - should you "tune" the instrument to the vessel's characteristics
    (helm control of a speedboat is considerably more responsive than
    of a fishing trawler!)

    There's no real "right" answer -- short of taking over more control
    of the vessel (which then poses different problems).

    So, you recognize the fact that skippers will act in whatever way
    suits them -- at the moment -- and don't try to be their "nanny"
    (cuz anything you do in that regard they will UNdo)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Rickster@21:1/5 to Tom Gardner on Tue Mar 29 21:45:56 2022
    On Tuesday, March 29, 2022 at 6:54:41 PM UTC-4, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    On 29/03/2022 17:17, Tom Gardner wrote:
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    Yeah, it's raw data. Did you have a point?

    I have no point.


    Fair enough, I suppose. But was there a reason for the post then?
    Primarily to provoke thought and discussion, and
    secondarily to point to occurrences that Tesla fanbois
    and Musk prefer to sweep under the carpet.

    You haven't pointed out anything useful. You posted a link to what is really a pretty crappy web page. Where's the thought, where's the discussion? You said yourself you had nothing to say about it. Ok, thanks for the link. Bye.


    I am curious about the causes of crashes when "autopilot" is engaged.


    That's a reasonable thing to wonder about. The more we (people in
    general, Tesla drivers, Tesla developers, etc.) know about such crashes, the better the possibilities for fixing weaknesses or understanding how
    to mitigate them. Unfortunately, the main mitigation - "don't rely on
    autopilot; stay alert and focused on the driving" - does not work. For
    one thing, many people don't obey it - people have been found in the
    back seat of crashed Teslas where they were having a nap. And those
    that try to follow it are likely to doze off from boredom.
    Agreed.

    Musk and his /very/ carefully worded advertising don't help
    matters. That should be challenged by evidence.

    Ok, where's the evidence?


    I haven't seen such evidence collated anywhere else.

    I still haven't seen any evidence, although I'm not sure what it is supposed to be evidence of.


    However, there is no need for a list of "crashes involving Teslas",
    names of victims, and a site with a clear agenda to "prove" that Teslas are not as safe as they claim. It is counter-productive to real investigation and real learning.
    As far as I can see the website does not name the dead.
    The linked references may do.

    Musk makes outlandish claims about his cars, which need
    debunking in order to help prevent more unnecessary
    accidents.

    From https://catless.ncl.ac.uk/Risks/33/11/#subj3
    "Weeks earlier, a Tesla using the company's advanced
    driver-assistance system had crashed into a tractor-trailer
    at about 70 mph, killing the driver. When National Highway
    Traffic Safety Administration officials called Tesla
    executives to say they were launching an investigation,
    Musk screamed, protested and threatened to sue, said a
    former safety official who spoke on the condition of
    anonymity to discuss sensitive matters.

    "The regulators knew Musk could be impulsive and stubborn;
    they would need to show some spine to win his cooperation.
    So they waited. And in a subsequent call, 'when tempers were
    a little bit cool, Musk agreed to cooperate: He was a
    changed person.'"
    https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation

    Ok, so???

    I think we all know Musk is a jerk. He's a huge P.T. Barnum salesperson too. Who didn't know that? What's your point?


    There is an attempt at comparisons, as stated in the FAQ.

    It is a pretty feeble attempt, hidden away.

    Even the comparison of "autopilot" deaths to total deaths is useless without information about autopilot use, and how many people rely on it.
    That's too strong, but I agree most ratios (including that one)
    aren't that enlightening.

    I'm happy to see any ratios that mean anything, but I didn't see them. I saw a table of incidents which included at least one death. Where are the comparisons?


    The whole post just struck me as a bit below par for your usual high standard. There's definitely an interesting thread possibility around
    the idea of how safe or dangerous car "autopilots" can be, and how they compare to average drivers. But your post was not a great starting
    point for that.
    Real world experiences aren't a bad /starting/ point, but
    they do have limitations. Better starting points are to
    be welcomed.

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on
    yootoob aren't confidence inspiring. Based on one I saw,
    I certainly wouldn't dare let a Tesla drive itself in
    an urban environment.

    You aren't supposed to let a Tesla drive itself in any environment. You are the driver. Autopilot is just a driving assistance tool. You seem to think autopilot is autonomous driving. It's not even remotely close. If that's what you are looking for,
    you won't find it: no one from Tesla claims autopilot is anything other than an "assist", including Musk.


    I suspect there isn't sufficient experience to assess
    relative dangers between "artificial intelligence" and
    "natural stupidity".

    I'm not sure what you wish to measure. That's what a comparison does: it measures one thing vs. another in terms of some measurement. What exactly do you want to measure? Or are you just on a fishing trip looking for something damning to Musk or Tesla?


    --

    Rick C.

    -- Get 1,000 miles of free Supercharging
    -- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Tom Gardner on Wed Mar 30 08:27:21 2022
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    On 29/03/2022 17:17, Tom Gardner wrote:
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    Yeah, it's raw data.  Did you have a point?

    I have no point.


    Fair enough, I suppose.  But was there a reason for the post then?

    Primarily to provoke thought and discussion, and
    secondarily to point to occurrences that Tesla fanbois
    and Musk prefer to sweep under the carpet.


    I am curious about the causes of crashes when "autopilot" is engaged.


    That's a reasonable thing to wonder about.  The more we (people in
    general, Tesla drivers, Tesla developers, etc.) know about such crashes,
    the better the possibilities for fixing weaknesses or understanding how
    to mitigate them.  Unfortunately, the main mitigation - "don't rely on
    autopilot; stay alert and focused on the driving" - does not work.  For
    one thing, many people don't obey it - people have been found in the
    back seat of crashed Teslas where they were having a nap.  And those
    that try to follow it are likely to doze off from boredom.

    Agreed.

    Musk and his /very/ carefully worded advertising don't help
    matters. That should be challenged by evidence.

    I haven't seen such evidence collated anywhere else.

    But that site does not have evidence of anything relevant. It shows
    that people sometimes die on the road, even in Teslas. Nothing more.

    If the Tesla people are using false or misleading advertising, or making
    safety claims that can't be verified, then I agree they should be held
    accountable. Collect evidence to show that - /real/ comparisons and
    /real/ statistics.

    Progress was not made against tobacco companies by compiling lists of
    people who smoked and then died. It was done by comparing the death
    rates of people who smoked to those of people who don't smoke.
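
    The same method in miniature - a Python sketch where every number is
    invented, purely to show why rates, not raw counts or lists of
    names, carry the information:

    # All figures are made up for illustration; only the method matters.
    smokers     = {"deaths": 300, "population": 10_000}
    non_smokers = {"deaths": 900, "population": 90_000}

    rate_s  = smokers["deaths"] / smokers["population"]          # 0.030
    rate_ns = non_smokers["deaths"] / non_smokers["population"]  # 0.010

    # Non-smokers contribute more deaths in total (900 vs 300), yet the
    # smokers' rate is three times higher - a bare list of dead smokers
    # could never show that.
    print(f"relative risk: {rate_s / rate_ns:.1f}x")  # 3.0x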




    However, there is no need for a list of "crashes involving Teslas",
    names of victims, and a site with a clear agenda to "prove" that Teslas
    are not as safe as they claim.  It is counter-productive to real
    investigation and real learning.

    As far as I can see the website does not name the dead.
    The linked references may do.

    From your initial post (you read what you quoted, didn't you?) :

    """
    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.).
    """


    Musk makes outlandish claims about his cars, which need
    debunking in order to help prevent more unnecessary
    accidents.

    From https://catless.ncl.ac.uk/Risks/33/11/#subj3
      "Weeks earlier, a Tesla using the company's advanced
      driver-assistance system had crashed into a tractor-trailer
      at about 70 mph, killing the driver. When National Highway
      Traffic Safety Administration officials called Tesla
      executives to say they were launching an investigation,
      Musk screamed, protested and threatened to sue, said a
      former safety official who spoke on the condition of
      anonymity to discuss sensitive matters.

      "The regulators knew Musk could be impulsive and stubborn;
      they would need to show some spine to win his cooperation.
      So they waited. And in a subsequent call, 'when tempers were
      a little bit cool, Musk agreed to cooperate: He was a
      changed person.'"
      https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation


    So people who know how to investigate these things are investigating
    them. That's great. (It is also - in theory, at least - unbiased. The
    autopilot might not have been at fault.) It's a lot better than some
    amateur with a grudge, an ignorance of statistics and a google document
    page.




    There is an attempt at comparisons, as stated in the FAQ.

    It is a pretty feeble attempt, hidden away.

    Even the comparison of "autopilot" deaths to total deaths is useless
    without information about autopilot use, and how many people rely on it.

    That's too strong, but I agree most ratios (including that one)
    aren't that enlightening.

    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but only a
    small fraction of the deaths had the autopilot on, then clearly the
    autopilot reduces risks and saves lives (of those that drive Teslas - we
    still know nothing of other car drivers).
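
    As a numeric illustration of that argument (the two tallies are the
    site's; the autopilot usage share is invented, which is exactly the
    missing datum):

    deaths_total     = 246   # the site's overall tally
    deaths_autopilot = 12    # the site's autopilot tally
    p_autopilot      = 0.40  # assumed share of Tesla miles on autopilot

    share = deaths_autopilot / deaths_total  # ~0.049
    # If autopilot were as risky per mile as manual driving, its share
    # of deaths should roughly match its share of miles (0.40 here); a
    # far smaller share would point the other way - provided the usage
    # figure and the road mix were comparable.
    print(f"autopilot share of deaths: {share:.1%}")                # 4.9%
    print(f"risk relative to parity:   {share / p_autopilot:.2f}")  # 0.12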



    The whole post just struck me as a bit below par for your usual high
    standard.  There's definitely an interesting thread possibility around
    the idea of how safe or dangerous car "autopilots" can be, and how they
    compare to average drivers.  But your post was not a great starting
    point for that.

    Real world experiences aren't a bad /starting/ point, but
    they do have limitations. Better starting points are to
    be welcomed.

    Real world experiences are enough to say "this might be worth looking
    at" - but no more than that.


    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on
    yootoob aren't confidence inspiring. Based on one I saw,
    I certainly wouldn't dare let a Tesla drive itself in
    an urban environment.

    I suspect there isn't sufficient experience to assess
    relative dangers between "artificial intelligence" and
    "natural stupidity".

    I don't doubt at all that the Tesla autopilot makes mistakes. So do
    human drivers. The interesting question is who makes fewer mistakes, or
    mistakes with lower consequences - and that is a question for which no
    amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
    The only evidence you have so far is that people love to show that
    something fancy and expensive is not always perfect, and I believe we
    knew that already.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Thu Mar 31 13:44:11 2022
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    On 29/03/2022 17:17, Tom Gardner wrote:
    On 29/03/22 15:16, David Brown wrote:
    On 29/03/2022 15:00, Rickster wrote:
    Yeah, it's raw data. Did you have a point?

    I have no point.


    Fair enough, I suppose. But was there a reason for the post then?

    Primarily to provoke thought and discussion, and
    secondarily to point to occurrences that Tesla fanbois
    and Musk prefer to sweep under the carpet.


    I am curious about the causes of crashes when "autopilot" is engaged.

    That's a reasonable thing to wonder about. The more we (people in
    general, Tesla drivers, Tesla developers, etc.) know about such crashes,
    the better the possibilities for fixing weaknesses or understanding how
    to mitigate them. Unfortunately, the main mitigation - "don't rely on
    autopilot; stay alert and focused on the driving" - does not work. For
    one thing, many people don't obey it - people have been found in the
    back seat of crashed Teslas where they were having a nap. And those
    that try to follow it are likely to doze off from boredom.

    Agreed.

    Musk and his /very/ carefully worded advertising don't help
    matters. That should be challenged by evidence.

    I haven't seen such evidence collated anywhere else.
    But that site does not have evidence of anything relevant. It shows
    that people sometimes die on the road, even in Teslas. Nothing more.

    If the Tesla people are using false or misleading advertising, or making safety claims that can't be verified, then I agree they should be held accountable. Collect evidence to show that - /real/ comparisons and
    /real/ statistics.

    Progress was not made against tobacco companies by compiling lists of
    people who smoked and then died. It was done by comparing the death
    rates of people who smoked to those of people who don't smoke.



    However, there is no need for a list of "crashes involving Teslas",
    names of victims, and a site with a clear agenda to "prove" that Teslas
    are not as safe as they claim. It is counter-productive to real
    investigation and real learning.

    As far as I can see the website does not name the dead.
    The linked references may do.
    From your initial post (you read what you quoted, didn't you?) :
    """
    We provide an updated record of Tesla fatalities and Tesla accident deaths that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.).
    """


    Musk makes outlandish claims about his cars, which need
    debunking in order to help prevent more unnecessary
    accidents.

    From https://catless.ncl.ac.uk/Risks/33/11/#subj3
    "Weeks earlier, a Tesla using the company's advanced
    driver-assistance system had crashed into a tractor-trailer
    at about 70 mph, killing the driver. When National Highway
    Traffic Safety Administration officials called Tesla
    executives to say they were launching an investigation,
    Musk screamed, protested and threatened to sue, said a
    former safety official who spoke on the condition of
    anonymity to discuss sensitive matters.

    "The regulators knew Musk could be impulsive and stubborn;
    they would need to show some spine to win his cooperation.
    So they waited. And in a subsequent call, 'when tempers were
    a little bit cool, Musk agreed to cooperate: He was a
    changed person.'"

    https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation

    So people who know how to investigate these things are investigating
    them. That's great. (It is also - in theory, at least - unbiased. The autopilot might not have been at fault.) It's a lot better than some
    amateur with a grudge, an ignorance of statistics and a google document page.



    There is an attempt at comparisons, as stated in the FAQ.

    It is a pretty feeble attempt, hidden away.

    Even the comparison of "autopilot" deaths to total deaths is useless
    without information about autopilot use, and how many people rely on it.

    That's too strong, but I agree most ratios (including that one)
    aren't that enlightening.
    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but only a small fraction of the deaths had the autopilot on, then clearly the autopilot reduces risks and saves lives (of those that drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla autopilot usage is on highways which are much safer per mile driven than other roads. That's an inherent bias because while non-autopilot driving must include all situations, autopilot simply
    doesn't work in most environments.
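
    That bias is easy to demonstrate with invented numbers - a
    Simpson's-paradox-style Python sketch in which autopilot is worse on
    every road type yet looks safer in aggregate:

    # Invented figures: (autopilot, manual) by road type.
    miles  = {"highway": (20.0, 10.0), "other": (1.0, 20.0)}  # billions of miles
    deaths = {"highway": (40, 18), "other": (10, 160)}

    for road in miles:
        ap  = deaths[road][0] / miles[road][0]
        man = deaths[road][1] / miles[road][1]
        print(f"{road:8s} autopilot {ap:4.1f} vs manual {man:4.1f} deaths/bn mi")

    ap_all  = sum(d[0] for d in deaths.values()) / sum(m[0] for m in miles.values())
    man_all = sum(d[1] for d in deaths.values()) / sum(m[1] for m in miles.values())
    # Autopilot loses in both rows (2.0 vs 1.8 and 10.0 vs 8.0) but "wins"
    # overall (2.4 vs 5.9), purely because its miles sit on the safer roads.
    print(f"overall  autopilot {ap_all:4.1f} vs manual {man_all:4.1f} deaths/bn mi")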


    The whole post just struck me as a bit below par for your usual high
    standard. There's definitely an interesting thread possibility around
    the idea of how safe or dangerous car "autopilots" can be, and how they
    compare to average drivers. But your post was not a great starting
    point for that.

    Real world experiences aren't a bad /starting/ point, but
    they do have limitations. Better starting points are to
    be welcomed.
    Real world experiences are enough to say "this might be worth looking
    at" - but no more than that.

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on
    yootoob aren't confidence inspiring. Based on one I saw,
    I certainly wouldn't dare let a Tesla drive itself in
    an urban environment.

    I suspect there isn't sufficient experience to assess
    relative dangers between "artificial intelligence" and
    "natural stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes". It's a bit like asking if your rear view mirror makes mistakes by not showing cars in the blind spot. The autopilot is not designed to drive the car. It is a tool to assist the driver. The driver is
    required to be responsible for the safe operation of the car at all times. I can point out to you the many, many times the car acts like a spaz and requires me to manage the situation. Early on, there was a left turn lane on a 50 mph road that the car
    would want to turn into when I intended to drive straight. Fortunately they have ironed out that level of issue. But it was always my responsibility to prevent it from causing an accident. So how would you say anything was the fault of the autopilot?


    So do
    human drivers. The interesting question is who makes fewer mistakes, or mistakes with lower consequences - and that is a question for which no amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
    The only evidence you have so far is that people love to show that
    something fancy and expensive is not always perfect, and I believe we
    knew that already.

    That's where they are headed with full self-driving. But gauging by the breadth of issues the car has problems with, I think it will be a long, long time before we can sit back and relax while the car drives us home.

    --

    Rick C.

    -+ Get 1,000 miles of free Supercharging
    -+ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Thu Mar 31 23:48:10 2022
    On 31/03/2022 22:44, Ricky wrote:
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:

    <snip>

    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but
    only a small fraction of the deaths had the autopilot on, then
    clearly the autopilot reduces risks and saves lives (of those that
    drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla
    autopilot usage is on highways which are much safer per mile driven
    than other roads. That's an inherent bias because while
    non-autopilot driving must include all situations, autopilot simply
    doesn't work in most environments.


    Yes. An apples-to-apples comparison is the aim, or at least as close as
    one can get.

    I suspect - without statistical justification - that the accidents
    involving autopilot use are precisely cases where you don't have a good,
    clear highway, and autopilot was used in a situation where it was not
    suitable. Getting good statistics and comparisons here could be helpful
    in making it safer - perhaps adding a feature that has the autopilot say
    "This is not a good road for me - you have to drive yourself" and switch
    itself off. (It would be more controversial, but probably statistically
    safer, if it also sometimes said "I'm better at driving on this kind of
    road than you are" and switched itself on!)


    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on yootoob
    aren't confidence inspiring. Based on one I saw, I certainly
    wouldn't dare let a Tesla drive itself in an urban environment.

    I suspect there isn't sufficient experience to assess relative
    dangers between "artificial intelligence" and "natural
    stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes".

    Of course.

    It's a bit like asking
    if your rear view mirror makes mistakes by not showing cars in the
    blind spot. The autopilot is not designed to drive the car. It is a
    tool to assist the driver. The driver is required to be responsible
    for the safe operation of the car at all times. I can point out to
    you the many, many times the car acts like a spaz and requires me to
    manage the situation. Early on, there was a left turn lane on a 50
    mph road that the car would want to turn into when I intended to drive
    straight. Fortunately they have ironed out that level of issue. But
    it was always my responsibility to prevent it from causing an
    accident. So how would you say anything was the fault of the
    autopilot?


    There are a few possibilities here (though I am not trying to claim that
    any of them are "right" in some objective sense). You might say they
    had believed that the "autopilot" was like a plane autopilot - you can
    turn it on and leave it to safely drive itself for most of the journey
    except perhaps the very beginning and very end of the trip. As you say,
    the Tesla autopilot is /not/ designed for that - that might be a mistake
    from the salesmen, advertisers, user-interface designers, or just the
    driver's mistake.

    And sometimes the autopilot does something daft - it is no longer
    assisting the driver, but working against him or her. That, I think,
    should be counted as a mistake by the autopilot. Tesla autopilots are
    not alone in this, of course. I have heard of several cases where
    "smart" cruise controls on cars have been confused by things like
    changes to road layouts, or by tunnels underneath parts of a
    city, suddenly braking hard due to speed limit changes on surface
    roads that don't apply in the tunnel.


    So do human drivers. The interesting question is who makes fewer
    mistakes, or mistakes with lower consequences - and that is a
    question for which no amount of anecdotal yootoob videos or
    Tesla/Musk hate sites will help. The only evidence you have so far
    is that people love to show that something fancy and expensive is
    not always perfect, and I believe we knew that already.

    That's where they are headed with full self-driving. But gauging
    by the breadth of issues the car has problems with, I think it will be a
    long, long time before we can sit back and relax while the car drives
    us home.


    Yes. Automatic driving is progressing, but it has a long way to go as yet.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Thu Mar 31 15:29:48 2022
    On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
    On 31/03/2022 22:44, Ricky wrote:
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    <snip>
    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but
    only a small fraction of the deaths had the autopilot on, then
    clearly the autopilot reduces risks and saves lives (of those that
    drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla
    autopilot usage is on highways which are much safer per mile driven
    than other roads. That's an inherent bias because while
    non-autopilot driving must include all situations, autopilot simply doesn't work in most environments.

    Yes. An apples-to-apples comparison is the aim, or at least as close as
    one can get.

    I suspect - without statistical justification -

    Yes, without justification, at all.

    that the accidents
    involving autopilot use are precisely cases where you don't have a good, clear highway, and autopilot was used in a situation where it was not suitable. Getting good statistics and comparisons here could be helpful
    in making it safer - perhaps adding a feature that has the autopilot say "This is not a good road for me - you have to drive yourself" and switch itself off. (It would be more controversial, but probably statistically safer, if it also sometimes said "I'm better at driving on this kind of
    road than you are" and switched itself on!)

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on yootoob
    aren't confidence inspiring. Based on one I saw, I certainly
    wouldn't dare let a Tesla drive itself in an urban environment.

    I suspect there isn't sufficient experience to assess relative
    dangers between "artificial intelligence" and "natural
    stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes".
    Of course.
    It's a bit like asking
    if your rear view mirror makes mistakes by not showing cars in the
    blind spot. The autopilot is not designed to drive the car. It is a
    tool to assist the driver. The driver is required to be responsible
    for the safe operation of the car at all times. I can point out to
    you the many, many times the car acts like a spaz and requires me to manage the situation. Early on, there was a left turn lane on a 50
    mph road that the car would want to turn into when I intended to drive straight. Fortunately they have ironed out that level of issue. But
    it was always my responsibility to prevent it from causing an
    accident. So how would you say anything was the fault of the
    autopilot?

    There are a few possibilities here (though I am not trying to claim that
    any of them are "right" in some objective sense). You might say they
    had believed that the "autopilot" was like a plane autopilot -

    It is exactly like an airplane autopilot.


    you can
    turn it on and leave it to safely drive itself for most of the journey except perhaps the very beginning and very end of the trip. As you say,
    the Tesla autopilot is /not/ designed for that - that might be a mistake from the salesmen, advertisers, user-interface designers, or just the driver's mistake.

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying
    the plane, with or without the autopilot.


    And sometimes the autopilot does something daft - it is no longer
    assisting the driver, but working against him or her. That, I think,
    should be counted as a mistake by the autopilot.

    The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

    --

    Rick C.

    +- Get 1,000 miles of free Supercharging
    +- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 00:39:03 2022
    On 01/04/2022 00:29, Ricky wrote:
    On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
    On 31/03/2022 22:44, Ricky wrote:
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    <snip>
    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but
    only a small fraction of the deaths had the autopilot on, then
    clearly the autopilot reduces risks and saves lives (of those that
    drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla
    autopilot usage is on highways which are much safer per mile driven
    than other roads. That's an inherent bias because while
    non-autopilot driving must include all situations, autopilot simply
    doesn't work in most environments.

    Yes. An apples-to-apples comparison is the aim, or at least as close as
    one can get.

    I suspect - without statistical justification -

    Yes, without justification, at all.

    Which do /you/ think is most likely? Autopilot crashes on the motorway,
    or autopilot crashes on smaller roads?


    that the accidents
    involving autopilot use are precisely cases where you don't have a good,
    clear highway, and autopilot was used in a situation where it was not
    suitable. Getting good statistics and comparisons here could be helpful
    in making it safer - perhaps adding a feature that has the autopilot say
    "This is not a good road for me - you have to drive yourself" and switch
    itself off. (It would be more controversial, but probably statistically
    safer, if it also sometimes said "I'm better at driving on this kind of
    road than you are" and switched itself on!)

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Teslas making mistakes" videos on yootoob
    aren't confidence inspiring. Based on one I saw, I certainly
    wouldn't dare let a Tesla drive itself in an urban environment.

    I suspect there isn't sufficient experience to assess relative
    dangers between "artificial intelligence" and "natural
    stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes".
    Of course.
    It's a bit like asking
    if your rear view mirror makes mistakes by not showing cars in the
    blind spot. The autopilot is not designed to drive the car. It is a
    tool to assist the driver. The driver is required to be responsible
    for the safe operation of the car at all times. I can point out to
    you the many, many times the car acts like a spaz and requires me to
    manage the situation. Early on, there was a left turn lane on a 50
    mph road that the car would want to turn into when I intended to drive
    straight. Fortunately they have ironed out that level of issue. But
    it was always my responsibility to prevent it from causing an
    accident. So how would you say anything was the fault of the
    autopilot?

    There are a few possibilities here (though I am not trying to claim that
    any of them are "right" in some objective sense). You might say they
    had believed that the "autopilot" was like a plane autopilot -

    It is exactly like an airplane autopilot.


    you can
    turn it on and leave it to safely drive itself for most of the journey
    except perhaps the very beginning and very end of the trip. As you say,
    the Tesla autopilot is /not/ designed for that - that might be a mistake
    from the salesmen, advertisers, user-interface designers, or just the
    driver's mistake.

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying
    the plane, with or without the autopilot.


    Yes, that's the original idea of a plane autopilot. But modern ones are
    more sophisticated and handle course changes along the planned route, as
    well as being able to land automatically. And more important than what
    plane autopilots actually /do/, is what people /think/ they do - and
    remember we are talking about drivers that think their Tesla "autopilot"
    will drive their car while they watch a movie or nap in the back seat.


    And sometimes the autopilot does something daft - it is no longer
    assisting the driver, but working against him or her. That, I think,
    should be counted as a mistake by the autopilot.

    The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.


    Well, "does something daft" is no worse than "acts like a spaz", and
    it's a good deal more politically correct!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to David Brown on Fri Apr 1 01:19:17 2022
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It
    simply maintains a heading and altitude.

    They have been doing more than that for over 50 years.
    Cat 3b landings were in operation when I was a kid.


    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are
    more sophisticated and handle course changes along the planned route, as
    well as being able to land automatically. And more important than what
    plane autopilots actually /do/, is what people /think/ they do - and
    remember we are talking about drivers that think their Tesla "autopilot"
    will drive their car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me"?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Tom Gardner on Fri Apr 1 09:08:16 2022
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane.  In other words, the
    pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot.  But modern ones
    are more
    sophisticated and handle course changes along the planned route, as
    well as
    being able to land automatically.  And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and
    remember we
    are talking about drivers that think their Tesla "autopilot" will
    drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than
    "autopilot".

    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Tom Gardner on Fri Apr 1 00:46:19 2022
    On 3/31/2022 5:19 PM, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It
    simply maintains a heading and altitude.

    They have been doing more than that for > 50 years.
    Cat 3b landings were in operation when I was a kid.

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are more
    sophisticated and handle course changes along the planned route, as well as
    being able to land automatically. And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and remember we
    are talking about drivers that think their Tesla "autopilot" will drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    "Pilots" and "drivers" approach their efforts entirely differently
    and with different mindsets.

    ANYONE can drive a car. By contrast, a fair bit more understanding,
    reasoning and skill is required to pilot an aircraft.

    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to David Brown on Fri Apr 1 08:17:24 2022
    On 01/04/22 08:08, David Brown wrote:
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane.  In other words, the
    pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot.  But modern ones
    are more
    sophisticated and handle course changes along the planned route, as
    well as
    being able to land automatically.  And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and
    remember we
    are talking about drivers that think their Tesla "autopilot" will
    drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than
    "autopilot".

    That's one of the things I was thinking of.


    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention.

    I've wondered why they don't implement that, then realised
    it would directly contradict their advertising.


    (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    Modes are a pain[1]. Too often plane crash investigators hear
    "what's it doing /now/" on the CVR.

    There was also the case where wheel brakes were not to be applied
    until after landing - with "landed" implemented as "wheels are rotating".
    Then an aquaplaning aircraft, its wheels not spinning on the flooded
    runway, couldn't brake and skidded off the end of the runway!

    [1] Remember the early Smalltalk T-shirt drawing attention
    to the novel concept of the WIMP interface, with the motto
    "Don't mode me in"?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jeroen Belleman@21:1/5 to David Brown on Fri Apr 1 10:46:30 2022
    On 2022-04-01 09:08, David Brown wrote:
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones
    are more
    sophisticated and handle course changes along the planned route, as
    well as
    being able to land automatically. And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and
    remember we
    are talking about drivers that think their Tesla "autopilot" will
    drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than
    "autopilot".

    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)


    All these 'assistants' with their multiple 'modes' only make things
    more complicated and therefore unsafe. Simple is better.

    I recently got a car that came standard with 'lane assist'. I
    hate it. It's like having a passenger tugging on the steering wheel,
    absolutely intolerable. It also can't be switched off permanently.
    For the first week or two, I just blindfolded the camera it uses to
    watch the road, until I found out how to switch it off with a single
    button press. (There are far too many buttons, for that matter, and
    all with multiple functions, too. Bad!)

    That said, some automatic functions /are/ good. Climate control with
    a real thermostat, auto-darkening rear view mirrors, mostly functions
    that have nothing to do with driving per se. The only /good/ automatic functions are those you don't notice until they /stop/ working. I also
    like the GPS with head-up display.

    Jeroen Belleman

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to Don Y on Fri Apr 1 11:44:28 2022
    On 01/04/22 08:46, Don Y wrote:
    On 3/31/2022 5:19 PM, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works.  It doesn't fly the plane.  It
    simply maintains a heading and altitude.

    They have been doing more than that for > 50 years.
    Cat 3b landings were in operation when I was a kid.

    Someone still has to be watching
    for other aircraft and otherwise flying the plane.  In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot.  But modern ones are more
    sophisticated and handle course changes along the planned route, as well as
    being able to land automatically.  And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and remember we
    are talking about drivers that think their Tesla "autopilot" will drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    "Pilots" and "drivers" approach their efforts entirely differently
    and with different mindsets.

    They should do in one sense (differing machine/automation)
    and shouldn't in another (both are lethal instruments).

    Problem starts with the marketing.


    ANYONE can drive a car.  By contrast, a fair bit more understanding, reasoning and skill is required to pilot an aircraft.

    Not entirely sure about that. 14yo can be solo, and a
    very few are even aerobatic pilots.

    The main difference is that you can't stop and catch
    your breath, or stop and have a pee.

    Overall learning to fly a glider is pretty much similar
    to learning to drive - in cost, time and skill.

    The training
    is more rigorous, though, and isn't a one-off event.


    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    Pilots often don't understand what's going on; just
    listen to the accident reports on the news :(

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Tom Gardner on Fri Apr 1 04:32:32 2022
    On 4/1/2022 3:44 AM, Tom Gardner wrote:
    On 01/04/22 08:46, Don Y wrote:
    On 3/31/2022 5:19 PM, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It
    simply maintains a heading and altitude.

    They have been doing more than that for > 50 years.
    Cat 3b landings were in operation when I was a kid.

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are more
    sophisticated and handle course changes along the planned route, as well as
    being able to land automatically. And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and remember we
    are talking about drivers that think their Tesla "autopilot" will drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    "Pilots" and "drivers" approach their efforts entirely differently
    and with different mindsets.

    They should do in one sense (differing machine/automation)
    and shouldn't in another (both are lethal instruments).

    Problem starts with the marketing.

    Cars are far more ubiquitous. And, navigation is a 2-dimensional activity.

    An "average joe" isn't likely to think hes gonna "hop in a piper cub" and
    be off on a jaunt to run errands. And, navigation is a 3-dimensional undertaking (you don't worry about vehicles "above" or "below", when driving!)

    ANYONE can drive a car. By contrast, a fair bit more understanding,
    reasoning and skill is required to pilot an aircraft.

    Not entirely sure about that. 14yo can be solo, and a
    very few are even aerobatic pilots.

    And a "youngster" can drive a car (or other sort of motorized vehicle, e.g., on a farm or other private property). The 16yo (15.5) restriction only applies to the use on public roadways.

    <https://www.abc4.com/news/local-news/underage-utah-boy-caught-driving-wrong-way-in-slc/>

    <https://www.kgun9.com/news/local-news/cochise-county-four-smuggling-busts-within-five-hours-14-year-old-driver-involved>

    Cars are "simple" to operate; can-your-feet-reach-the-pedals being the only practical criteria. I'd wager *I* would have a hard time walking up to
    an aircraft, "cold", and trying to sort out how to get it off the ground...

    The main difference is that you can't stop and catch
    your breath, or stop and have a pee.

    Overall learning to fly a glider is pretty much similar
    to learning to drive - in cost, time and skill.

    But not opportunity. I'd have to spend a fair bit of effort researching
    where to gain access to any sort of aircraft. OTOH, I can readily "borrow" (with consent) any of my neighbors' vehicles and operate all of them in
    a fairly consistent manner: sports cars, trucks, commercial trucks, even motorcycles (though never having driven one, before!).

    The training
    is more rigorous, though, and isn't a one-off event.

    It's likely more technical, too. Most auto-driving instruction deals
    with laws, not the technical "piloting" of the vehicle. The driving test
    is similarly focused on whether or not you put that law knowledge into
    effect (did you stop *at* the proper point? did you observe the speed
    limit and other posted requirements?)

    [When taking the test for my *first* DL, the DMV was notorious for
    having a stop sign *in* the (tiny) parking lot -- in an unexpected
    place. Folks who weren't observant -- or tipped off to this ahead
    of time -- were "failed" before ever getting out on the roadway!]

    Testing for a CDL (commercial) is considerably different; you are
    quizzed on technical details of the vehicle that affect the safety of
    you and others on the roadway -- because you are operating a much more
    "lethal" vehicle (< 26,000 pounds GVW). You also have to prove yourself medically *fit* to operate (not color blind, not an insulin user,
    "controlled" blood pressure, nonepileptic, alchoholic, etc.!

    And, other "endorsements" have further requirements (e.g., hauling tandem/triples, hazardous products, etc.)

    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    Pilots often don't understand what's going on; just
    listen to the accident reports on the news :(

    I think those events are caused by cognitive overload, not ignorance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to Don Y on Fri Apr 1 13:13:46 2022
    On 01/04/22 12:32, Don Y wrote:
    On 4/1/2022 3:44 AM, Tom Gardner wrote:
    On 01/04/22 08:46, Don Y wrote:
    On 3/31/2022 5:19 PM, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works.  It doesn't fly the plane.  It
    simply maintains a heading and altitude.

    They have been doing more than that for > 50 years.
    Cat 3b landings were in operation when I was a kid.

    Someone still has to be watching
    for other aircraft and otherwise flying the plane.  In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot.  But modern ones are more
    sophisticated and handle course changes along the planned route, as well as
    being able to land automatically.  And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and remember we
    are talking about drivers that think their Tesla "autopilot" will drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    "Pilots" and "drivers" approach their efforts entirely differently
    and with different mindsets.

    They should do in one sense (differing machine/automation)
    and shouldn't in another (both are lethal instruments).

    Problem starts with the marketing.

    Cars are far more ubiquitous.  And, navigation is a 2-dimensional activity.

    An "average joe" isn't likely to think hes gonna "hop in a piper cub" and
    be off on a jaunt to run errands.  And, navigation is a 3-dimensional undertaking (you don't worry about vehicles "above" or "below", when driving!)

    True, but it doesn't change any of my points.



    ANYONE can drive a car.  By contrast, a fair bit more understanding,
    reasoning and skill is required to pilot an aircraft.

    Not entirely sure about that. 14yo can be solo, and a
    very few are even aerobatic pilots.

    And a "youngster" can drive a car (or other sort of motorized vehicle, e.g., on
    a farm or other private property).  The 16yo (15.5) restriction only applies to
    the use on public roadways.

    12yo fly across the country with an instructor behind.
    14yo can do it on their own.

    Daughter was driving my car and a double decker bus at 15yo,
    on the runway and peritrack :)


    Cars are "simple" to operate; can-your-feet-reach-the-pedals being the only practical criteria.  I'd wager *I* would have a hard time walking up to
    an aircraft, "cold", and trying to sort out how to get it off the ground...

    Same is true of a glider. There are only 4 controls: rudder,
    stick, airbrake, cable release. Two instruments, airspeed
    and barometer (i.e. height differential).

    You are taught to do without them, because they all lie to
    you.



    The main difference is that you can't stop and catch
    your breath, or stop and have a pee.

    Overall learning to fly a glider is pretty much similar
    to learning to drive - in cost, time and skill.

    But not opportunity.  I'd have to spend a fair bit of effort researching where to gain access to any sort of aircraft.  OTOH, I can readily "borrow" (with consent) any of my neighbors' vehicles and operate all of them in
    a fairly consistent manner: sports cars, trucks, commercial trucks, even motorcycles (though never having driven one, before!).

    True, but it doesn't change any of my points.



    The training
    is more rigorous, though, and isn't a one-off event.

    It's likely more technical, too.  Most auto-driving instruction deals
    with laws, not the technical "piloting" of the vehicle.  The driving test
    is similarly focused on whether or not you put that law knowledge into
    effect (did you stop *at* the proper point?  did you observe the speed
    limit and other posted requirements?)

    Not much is required to go solo.

    Does the glider's responsiveness indicate you are flying
    fast enough? Are you at a reasonable height in the circuit?
    And what to do when you find you aren't, and when the cable snaps.



    [When taking the test for my *first* DL, the DMV was notorious for
    having a stop sign *in* the (tiny) parking lot -- in an unexpected
    place.  Folks who weren't observant -- or tipped off to this ahead
    of time -- were "failed" before ever getting out on the roadway!]

    Pre-solo tests include the instructor putting you in a
    stupid position, and saying "now get us back safely".



    Testing for a CDL (commercial) is considerably different; you are
    quizzed on technical details of the vehicle that affect the safety of
    you and others on the roadway -- because you are operating a much more "lethal" vehicle (over 26,000 pounds GVW).  You also have to prove yourself medically *fit* to operate (not color blind, not an insulin user, "controlled" blood pressure, not epileptic or alcoholic, etc.)!

    Ditto being an instructor or having a passenger.



    And, other "endorsements" have further requirements (e.g., hauling tandem/triples, hazardous products, etc.)

    Ditto flying cross country or in clouds.


    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    That's true for the aircraft, but for gliders nobody has developed
    an autopilot. You have to stay awake and feel (literally,
    by the seat of your pants) what's happening. The nearest
    thing to an autopilot is a moving map airspace display.


    Pilots often don't understand what's going on; just
    listen to the accident reports on the news :(

    I think those events are caused by cognitive overload, not ignorance.

    Not always, e.g. the recent 737 crashes.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Tom Gardner on Fri Apr 1 05:42:33 2022
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It
    simply maintains a heading and altitude.
    They have been doing more than that for > 50 years.
    Cat 3b landings were in operation when I was a kid.
    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are more
    sophisticated and handle course changes along the planned route, as well as
    being able to land automatically. And more important than what plane autopilots actually /do/, is what people /think/ they do - and remember we are talking about drivers that think their Tesla "autopilot" will drive their
    car while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    That's Tom Gardner level misinformation. Comments about what people think are spurious and unsubstantiated. A class of "twats" can be invented that think anything. Nothing matters other than what Tesla owners think. They are the ones driving the
    cars.

    --

    Rick C.

    --- Get 1,000 miles of free Supercharging
    --- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 05:38:14 2022
    On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:
    On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
    On 31/03/2022 22:44, Ricky wrote:
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    <snip>
    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but
    only a small fraction of the deaths had the autopilot on, then
    clearly the autopilot reduces risks and saves lives (of those that
    drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla
    autopilot usage is on highways which are much safer per mile driven
    than other roads. That's an inherent bias because while
    non-autopilot driving must include all situations, autopilot simply
    doesn't work in most environments.

    Yes. An apples-to-apples comparison is the aim, or at least as close as
    one can get.
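
    To make the apples-to-apples point concrete - with figures invented
    purely for illustration - the comparison has to be made per mile
    /within/ each road class, not on raw death counts:

        #include <stdio.h>

        int main(void)
        {
            /* Invented figures, for illustration only. */
            double ap_hwy_mi  = 2.0e9, ap_hwy_deaths  = 12;  /* autopilot, highway  */
            double man_hwy_mi = 4.0e9, man_hwy_deaths = 30;  /* manual, highway     */
            double man_oth_mi = 4.0e9, man_oth_deaths = 204; /* manual, other roads */

            printf("autopilot, highway: %4.1f deaths/1e9 mi\n",
                   ap_hwy_deaths / ap_hwy_mi * 1e9);
            printf("manual, all roads:  %4.1f deaths/1e9 mi\n",
                   (man_hwy_deaths + man_oth_deaths) /
                   (man_hwy_mi + man_oth_mi) * 1e9);
            printf("manual, highway:    %4.1f deaths/1e9 mi\n",
                   man_hwy_deaths / man_hwy_mi * 1e9);
            return 0;
        }

    With these made-up numbers the naive comparison (6.0 against roughly
    29) makes autopilot look nearly five times safer, while the
    like-for-like highway figure (7.5) shows a far smaller gap.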

    I suspect - without statistical justification -

    Yes, without justification, at all.
    Which do /you/ think is most likely? Autopilot crashes on the motorway,
    or autopilot crashes on smaller roads?

    Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes more often happen on the highways.

    I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving the limitations are at a level that no one would try to use it.


    that the accidents
    involving autopilot use are precisely cases where you don't have a good,
    clear highway, and autopilot was used in a situation where it was not
    suitable. Getting good statistics and comparisons here could be helpful
    in making it safer - perhaps adding a feature that has the autopilot say
    "This is not a good road for me - you have to drive yourself" and switch
    itself off. (It would be more controversial, but probably statistically
    safer, if it also sometimes said "I'm better at driving on this kind of
    road than you are" and switching itself on!)

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Tesla's making mistakes" videos on yootoob
    aren't confidence inspiring. Based on one I saw, I certainly
    wouldn't dare let a Tesla drive itself in an urban environment,

    I suspect there isn't sufficient experience to assess relative
    dangers between "artificial intelligence" and "natural
    stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes".
    Of course.
    It's a bit like asking
    if your rear view mirror makes mistakes by not showing cars in the
    blind spot. The autopilot is not designed to drive the car. It is a
    tool to assist the driver. The driver is required to be responsible
    for the safe operation of the car at all times. I can point out to
    you the many, many times the car acts like a spaz and requires me to
    manage the situation. Early on, there was a left turn lane on a 50
    mph road that the car would want to turn into when intending to drive
    straight. Fortunately they have ironed out that level of issue. But
    it was always my responsibility to prevent it from causing an
    accident. So how would you say anything was the fault of the
    autopilot?

    There are a few possibilities here (though I am not trying to claim that
    any of them are "right" in some objective sense). You might say they
    had believed that the "autopilot" was like a plane autopilot -

    It is exactly like an airplane autopilot.


    you can
    turn it on and leave it to safely drive itself for most of the journey
    except perhaps the very beginning and very end of the trip. As you say,
    the Tesla autopilot is /not/ designed for that - that might be a mistake
    from the salesmen, advertisers, user-interface designers, or just the
    driver's mistake.

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying
    the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are
    more sophisticated and handle course changes along the planned route, as well as being able to land automatically. And more important than what
    plane autopilots actually /do/, is what people /think/ they do - and remember we are talking about drivers that think their Tesla "autopilot" will drive their car while they watch a movie or nap in the back seat.

    Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like
    in the car, there is a pilot whose job is to fly/drive and assure safety.

    As to the movie idea, no, people don't think that. People might "pretend" that, but there's no level of "thinking" that says you can climb in the back seat while driving. Please don't say silly things.


    And sometimes the autopilot does something daft - it is no longer
    assisting the driver, but working against him or her. That, I think,
    should be counted as a mistake by the autopilot.

    The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

    Well, "does something daft" is no worse than "acts like a spaz", and
    it's a good deal more politically correct!

    Bzzzz. Sorry, you failed.

    --

    Rick C.

    ++ Get 1,000 miles of free Supercharging
    ++ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 05:44:38 2022
    On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the >>> pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones
    are more
    sophisticated and handle course changes along the planned route, as
    well as
    being able to land automatically. And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and
    remember we
    are talking about drivers that think their Tesla "autopilot" will
    drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".
    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than
    "autopilot".

    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    Do you know anything about how the Tesla autopilot actually works? Anything at all?

    --

    Rick C.

    --+ Get 1,000 miles of free Supercharging
    --+ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Tom Gardner on Fri Apr 1 06:07:07 2022
    On 4/1/2022 5:13 AM, Tom Gardner wrote:
    On 01/04/22 12:32, Don Y wrote:
    On 4/1/2022 3:44 AM, Tom Gardner wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It
    simply maintains a heading and altitude.

    They have been doing more than that for > 50 years.
    Cat 3b landings were in operation when I was a kid.

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are
    more sophisticated and handle course changes along the planned route, as
    well as being able to land automatically. And more important than what
    plane autopilots actually /do/, is what people /think/ they do - and
    remember we are talking about drivers that think their Tesla "autopilot"
    will drive their car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    "Pilots" and "drivers" approach their efforts entirely differently
    and with different mindsets.

    They should do in one sense (differing machine/automation)
    and shouldn't in another (both are lethal instruments).

    Problem starts with the marketing.

    Cars are far more ubiquitous. And, navigation is a 2-dimensional activity.

    An "average joe" isn't likely to think he's gonna "hop in a piper cub" and
    be off on a jaunt to run errands. And, navigation is a 3-dimensional
    undertaking (you don't worry about vehicles "above" or "below", when driving!)

    True, but it doesn't change any of my points.

    If access to a technology -- or ability to make use of that technology -- is limited, then it diminishes as a source of problems.

    I don't worry about someone (young/old/qualified/not) climbing into the cockpit of an A-10 at the military base down the street and strafing my neighborhood with its 30mm Gatling gun -- despite the fact that there are many dozens of them sitting on the tarmac. And, nothing prevents an airman from getting unhinged and taking out his frustrations with his "vehicle of choice".
    Access is constrained as well as the know-how required to put it into use.

    OTOH, a 14 year old climbing in a (stolen?) vehicle presents a very real danger to me on my local roadways (note the articles cited in previous post). There are hundreds of such vehicles within a stone's throw -- regardless of where you happen to be throwing the stone!

    Even "heavy equipment" is operated (driven) in virtually the same way as cars (as we've had cases of joyriders in dump trucks, back hoes, graders, etc.)

    Can't recall any "average joe" taking an aircraft for a joyride, though!
    (they wouldn't know HOW)

    ANYONE can drive a car. By contrast, a fair bit more understanding,
    reasoning and skill is required to pilot an aircraft.

    Not entirely sure about that. 14yo can be solo, and a
    very few are even aerobatic pilots.

    And a "youngster" can drive a car (or other sort of motorized vehicle, e.g., on
    a farm or other private property). The 16yo (15.5) restriction only applies to
    the use on public roadways.

    12yo fly across the country with an instructor behind.
    14yo can do it on their own.

    Daughter was driving my car and a double decker bus at 15yo,
    on the runway and peritrack :)

    It doesn't matter what LEGALLY can be done. What matters is what can be TECHNICALLY performed. The 14 yo's in the articles I cited were each
    breaking the law. But they were still ABLE to access and operate the vehicles
    in question. Invite them to take your aircraft for a joyride and you'll
    find them sitting on the tarmac, hours later, still trying to figure out how
    to take off!

    I was driving (on private property) at 10. As were most of the (males!) in
    my extended family. Grandpa owned a large car business so all of the cousins would "work" at the shop. It was not uncommon to be handed a set of keys and told to bring the "white chevy" into bay #6 for new tires. And, once the
    new rubber was mounted, told to drive the car with the ass-end raised so they could be spin-balanced (<https://i.ytimg.com/vi/NJd-AnU71nQ/maxresdefault.jpg> in lieu of dynamic balancers). Then, around to the alignment pit. Finally, gassed up and parked waiting for the customer to pick it up.

    [Grandpa had a rather loose interpretation of what was "legal" -- and spent
    a fair bit of time behind bars for other "misinterpretations" :> ]

    Cars are "simple" to operate; can-your-feet-reach-the-pedals being the only >> practical criteria. I'd wager *I* would have a hard time walking up to
    an aircraft, "cold", and trying to sort out how to get it off the ground...

    Same is true of a glider. There are only 4 controls: rudder,
    stick, airbrake, cable release. Two instruments, airspeed
    and barometer (i.e. height differential).

    And a piper cub? Lear jet? Not all aircraft are gliders. And, a glider requires a "co-conspirator" to get it airborne! A car just requires "opportunity".

    You are taught to do without them, because they all lie to
    you.

    The main difference is that you can't stop and catch
    your breath, or stop and have a pee.

    Overall learning to fly a glider is pretty much similar
    to learning to drive - in cost, time and skill.

    But not opportunity.  I'd have to spend a fair bit of effort researching
    where to gain access to any sort of aircraft.  OTOH, I can readily "borrow"
    (with consent) any of my neighbors' vehicles and operate all of them in
    a fairly consistent manner: sports cars, trucks, commercial trucks, even
    motorcycles (though never having driven one, before!).

    True, but it doesn't change any of my points.

    The number of "flying" accidents vs. the number of "auto" accidents makes
    the point very well.

    The training
    is more rigorous, though, and isn't a one-off event.

    It's likely more technical, too. Most auto-driving instruction deals
    with laws, not the technical "piloting" of the vehicle. The driving test
    is similarly focused on whether or not you put that law knowledge into
    effect (did you stop *at* the proper point? did you observe the speed
    limit and other posted requirements?)

    Not much is required to go solo.

    Does the glider's responsiveness indicate you are flying
    fast enough; are you at a reasonable height in the circuit;
    what to do when you find you aren't and when the cable snaps.

    Piper cub? Lear jet?

    [When taking the test for my *first* DL, the DMV was notorious for
    having a stop sign *in* the (tiny) parking lot -- in an unexpected
    place. Folks who weren't observant -- or tipped off to this ahead
    of time -- were "failed" before ever getting out on the roadway!]

    Pre-solo tests include the instructor putting you in a
    stupid position, and saying "now get us back safely".

    Automobiles just try to catch you breaking a law. Damn near anyone who
    has driven a vehicle prior to being tested can get it from point A to
    point B.

    Commercial vehicles focus more on safety (and any ADDITIONAL laws that
    may apply -- e.g., a commercial vehicle must clearly be labeled and there
    are special enforcement units that will ticket for such violations) because they assume you already understand the basics of the legal requirements
    for a motor vehicle.

    I can't recall ANY "legal" issues in my forklift certification. But, lots
    of technical issues regarding how to safely operate the vehicle, transport loads, derate lifting capacity based on lift height, etc. And, a strong emphasis on how NOT to suffer a "crush injury"!

    Testing for a CDL (commercial) is considerably different; you are
    quizzed on technical details of the vehicle that affect the safety of
    you and others on the roadway -- because you are operating a much more
    "lethal" vehicle (< 26,000 pounds GVW). You also have to prove yourself
    medically *fit* to operate (not color blind, not an insulin user,
    "controlled" blood pressure, nonepileptic, alchoholic, etc.!

    Ditto being an instructor or having a passenger.

    And, other "endorsements" have further requirements (e.g., hauling
    tandem/triples, hazardous products, etc.)

    Ditto flying cross country or in clouds.

    Do you really think opportunists are going to hijack an aircraft
    and "hope" there are clear skies?

    There are simply too many impediments to aircraft being misused/abused
    to make it a real issue.

    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    That's true for the aircraft, but for gliders nobody has developed
    an autopilot. You have to stay awake and feel (literally,
    by the seat of your pants) what's happening. The nearest
    thing to an autopilot is a moving map airspace display.

    Commercial aircraft rely on autopilots. In a sense, it is
    an easier (navigation) problem to solve -- there's no real "traffic"
    or other obstacles beyond the airports (assuming you maintain your
    assigned flight corridor/speed). The same is true of railways
    and waterways (more or less).

    Cars operate in a much more challenging environment. Even "on the open
    road", a condition can arise that needs immediate driver attention
    (witness these 50-car pileups).

    Note how poorly "seasoned" drivers adapt to the first snowfall of
    the season. (Really? Did you FORGET what this stuff was like??)
    Do they do any special (mental?) prep prior to getting behind the
    wheel, in those cases? Or, just "wing it", assuming "it will
    come back to them"?

    Pilots often don't understand what's going on; just
    listen to the accident reports on the news :(

    I think those events are caused by cognitive overload, not ignorance.

    Not always, e.g. the recent 737 crashes.

    So, a defect in an autopilot implementation can be similarly excused?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Tom Gardner on Fri Apr 1 05:46:54 2022
    On Friday, April 1, 2022 at 3:17:32 AM UTC-4, Tom Gardner wrote:
    On 01/04/22 08:08, David Brown wrote:
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the >>>> pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones
    are more
    sophisticated and handle course changes along the planned route, as
    well as
    being able to land automatically. And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and
    remember we
    are talking about drivers that think their Tesla "autopilot" will
    drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".

    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than "autopilot".
    That's one of the things I was thinking of.
    You turn the steering wheel into a dead-man's handle - if the driver releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention.
    I've wondered why they don't implement that, then realised
    it would directly contradict their advertising.

    Please tell us what the Tesla advertising says that would be contradicted.

    --

    Rick C.

    -+- Get 1,000 miles of free Supercharging
    -+- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Jeroen Belleman on Fri Apr 1 06:38:47 2022
    On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    All these 'assistants' with their multiple 'modes' only make things
    more complicated and therefore unsafe. Simple is better.

    "Assistance" should be intuitive. You don't even NOTICE the power
    steering, brakes, autotranny, etc. "assistants" in a vehicle.
    Because, for the most part, the way they operate is largely invariant
    of driver, driving conditions, etc. (How often do folks use anything
    other than "D(rive)" and "R(everse)"?) Is there a way to *disable*
    the power steering? Or brakes? Should there be?

    I recently got a car that came standard with 'lane assist'. I
    hate it. It's like having a passenger tugging on the steering wheel, absolutely intolerable. It also can't be switched off permanently.
    For the first week or two, I just blindfolded the camera it uses to
    watch the road, until I found out how to switch it off with a single
    button press. (There are far too many buttons, for that matter, and
    all with multiple functions, too. Bad!)

    When shopping for SWMBO's vehicle, we were waiting for the salesman for
    a test drive. Vehicle running, me behind the wheel.

    One by one, error indicators came on -- all pertaining to the forward
    looking "assistants" (too close to vehicle in front of you, maintain
    your lane, etc.). Each error indicated the associated system was
    offline due to a fault.

    When I questioned the salesman (did *I* do something to cause that?),
    he dismissed it as a consequence of the high temperatures (40+C is
    common, here -- at least 1 out of 6 days). So, tell me why I should
    pay extra for this feature? And, how much faith I should have in it
    performing as advertised?? <frown>

    That said, some automatic functions /are/ good. Climate control with
    a real thermostat, auto-darkening rear view mirrors, mostly functions
    that have nothing to do with driving per se. The only /good/ automatic functions are those you don't notice until they /stop/ working.

    These are all functions that aren't interactive -- like my brakes/tranny examples. You don't expect to have to make changes to the mechanism, especially while driving.

    I do like certain steering wheel mounted controls (e.g., radio/music)
    as it helps me keep my eyes on the road (instead of reaching over
    to adjust volume, select different source material, etc.) But, have
    yet to find a use for the paddle shifters -- and the stalk-mounted
    controls are too numerous on too few stalks!

    I also like the GPS with head-up display.

    My favorite is the side mirrors tilting downwards (to afford a view
    of the ground) when backing up. The backup camera is a win as we back into
    our garage and it helps avoid backing INTO something. These would be less necessary with a "lower profile" vehicle, though.

    [I also like the trip computer automatically reseting at each trip
    and "fill up"]

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 16:08:23 2022
    On 01/04/2022 14:44, Ricky wrote:
    On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the >>>>> pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones
    are more
    sophisticated and handle course changes along the planned route, as
    well as
    being able to land automatically. And more important than what plane
    autopilots actually /do/, is what people /think/ they do - and
    remember we
    are talking about drivers that think their Tesla "autopilot" will
    drive their
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".
    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than
    "autopilot".

    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    Do you know anything about how the Tesla autopilot actually works? Anything at all?


    A little - but not a lot, and no personal experience.

    So fill in the details here.

    You've already told us that it is designed for things like motorway
    driving (or "highway" driving). Presumably you stick by that, and
    therefore agree that any restrictions on the autopilot should be lower
    for motorway driving than for more "challenging" driving such as town
    roads or small, twisty country roads.

    People already manage to read newspapers or eat their breakfast in
    traffic queues, in purely manual cars. Do you think autopilot can
    handle that kind of traffic?

    My suggestion is that a way to ensure people have more focus on driving
    is to require contact with the steering wheel. I am happy to hear your objections to that idea, or to alternative thoughts.


    Improper use of autopilot (and other automation in all kinds of cars)
    leads to a higher risk of accidents. I expect that proper use can lower
    risk. Do you disagree with these two claims?

    Do you think Tesla's autopilot is perfect as it is, or is there room for improvement?

    Do you actually want to contribute something to this thread, or do you
    just want to attack any post that isn't Tesla fanboy support? (Your
    answers to the previous questions will cover this one too.)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 16:12:18 2022
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the
    plane. It simply maintains a heading and altitude.
    They have been doing more than that for for > 50 years. Cat 3b
    landings were in operation when I was a kid.
    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern
    ones are more sophisticated and handle course changes along the
    planned route, as well as being able to land automatically. And
    more important than what plane autopilots actually /do/, is what
    people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car
    while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the
    heads of twats that think "autopilot" means "it does it for me".

    That's Tom Gardner level misinformation. Comments about what people
    think are spurious and unsubstantiated. A class of "twats" can be
    invented that think anything. Nothing matters other than what Tesla
    owners think. They are the ones driving the cars.


    Are you suggesting that none of the people who drive Teslas are twats?
    (Maybe that term is too British for you.)

    And are you suggesting that only Tesla drivers are affected by Tesla
    crashes? Obviously they will be disproportionally affected, but motor accidents often involve other people and other cars. And while Tesla
    may be leading the way in car "autopiloting", others are following - the strengths and weaknesses of Tesla's systems are relevant to other car manufacturers.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jeroen Belleman@21:1/5 to Don Y on Fri Apr 1 16:17:46 2022
    On 2022-04-01 15:38, Don Y wrote:
    On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    All these 'assistants' with their multiple 'modes' only make things
    more complicated and therefor unsafe. Simple is better.

    "Assistance" should be intuitive. You don't even NOTICE the power
    steering, brakes, autotranny, etc. "assistants" in a vehicle.
    Because, for the most part, the way they operate is largely invariant
    of driver, driving conditions, etc. (How often do folks use anything
    other than "D(rive)" and "R(everse)"?) Is there a way to *disable*
    the power steering? Or brakes? Should there be?

    I much prefer a simple stick shift. I can tell what state it's in
    by touch, and there is not the slightest doubt about it. That isn't
    true for an automatic. You need to /look/ at what state it's in. They're
    too temperamental for my taste, refusing to change state under certain conditions. Same for the electric parking brake. It took me a while to
    figure out it refuses to disengage when I'm not wearing seat belts.
    Sheesh! Talk about weird interactions!

    Power steering and brakes are in the set of assists that normally
    go unnoticed until they fail. (Provided they are essentially linear,
    smooth, without discontinuity or other surprise behaviour.)

    [...]

    My favorite is the side mirrors tilting downwards (to afford a view
    of the ground) when backing up. The backup camera is a win as we back into our garage and it helps avoid backing INTO something. These would be less necessary with a "lower profile" vehicle, though.

    [I also like the trip computer automatically resetting at each trip
    and "fill up"]

    Yes, got that too, and I agree those are good features.

    Jeroen Belleman

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 16:29:50 2022
    On 01/04/2022 14:38, Ricky wrote:
    On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:
    On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
    On 31/03/2022 22:44, Ricky wrote:
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    <snip>
    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but
    only a small fraction of the deaths had the autopilot on, then
    clearly the autopilot reduces risks and saves lives (of those that
    drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla
    autopilot usage is on highways which are much safer per mile driven
    than other roads. That's an inherent bias because while
    non-autopilot driving must include all situations, autopilot simply
    doesn't work in most environments.

    Yes. An apples-to-apples comparison is the aim, or at least as close as
    one can get.
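    To make that bias concrete, here is a toy C calculation with
    invented figures. Autopilot is given exactly the same deaths per
    mile as manual driving on each road type, yet the raw totals still
    flatter it, because its miles are concentrated on the safer
    highways:

        #include <stdio.h>

        int main(void)
        {
            /* Invented figures: deaths and millions of miles, by road type.
               Within each road type the rates are identical (0.010 highway,
               0.030 city, deaths per million miles), but autopilot miles
               are mostly highway miles. */
            double ap_d[2]  = { 9, 3 },  ap_mi[2]  = { 900, 100 };  /* hwy, city */
            double man_d[2] = { 3, 21 }, man_mi[2] = { 300, 700 };

            printf("highway: AP %.3f  manual %.3f (deaths/M-mile)\n",
                   ap_d[0] / ap_mi[0], man_d[0] / man_mi[0]);
            printf("city:    AP %.3f  manual %.3f\n",
                   ap_d[1] / ap_mi[1], man_d[1] / man_mi[1]);
            printf("overall: AP %.3f  manual %.3f  <- halved, misleadingly\n",
                   (ap_d[0] + ap_d[1]) / (ap_mi[0] + ap_mi[1]),
                   (man_d[0] + man_d[1]) / (man_mi[0] + man_mi[1]));
            return 0;
        }

    Identical per-road rates, yet the overall autopilot figure comes
    out at half the manual one. Raw totals alone settle nothing.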

    I suspect - without statistical justification -

    Yes, without justification, at all.
    Which do /you/ think is most likely? Autopilot crashes on the motorway,
    or autopilot crashes on smaller roads?

    Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes are more often on highways.


    I was not aware of that limitation. Thanks for providing some relevant information.

    I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving the limitations are such that no one would try to use it.


    that the accidents
    involving autopilot use are precisely cases where you don't have a good,
    clear highway, and autopilot was used in a situation where it was not
    suitable. Getting good statistics and comparisons here could be helpful
    in making it safer - perhaps adding a feature that has the autopilot say
    "This is not a good road for me - you have to drive yourself" and switch
    itself off. (It would be more controversial, but probably statistically
    safer, if it also sometimes said "I'm better at driving on this kind of
    road than you are" and switching itself on!)

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Tesla's making mistakes" videos on yootoob
    aren't confidence inspiring. Based on one I saw, I certainly
    wouldn't dare let a Tesla drive itself in an urban environment,

    I suspect there isn't sufficient experience to assess relative
    dangers between "artificial intelligence" and "natural
    stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes".
    Of course.
    It's a bit like asking
    if your rear view mirror makes mistakes by not showing cars in the
    blind spot. The autopilot is not designed to drive the car. It is a
    tool to assist the driver. The driver is required to be responsible
    for the safe operation of the car at all times. I can point out to
    you the many, many times the car acts like a spaz and requires me to
    manage the situation. Early on, there was a left turn lane on a 50
    mph road that the car would want to turn into when I intended to drive
    straight. Fortunately they have ironed out that level of issue. But
    it was always my responsibility to prevent it from causing an
    accident. So how would you say anything was the fault of the
    autopilot?

    There are a few possibilities here (though I am not trying to claim that
    any of them are "right" in some objective sense). You might say they
    had believed that the "autopilot" was like a plane autopilot -

    It is exactly like an airplane autopilot.


    you can
    turn it on and leave it to safely drive itself for most of the journey
    except perhaps the very beginning and very end of the trip. As you say,
    the Tesla autopilot is /not/ designed for that - that might be a mistake
    from the salesmen, advertisers, user-interface designers, or just the
    driver's mistake.

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying
    the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are
    more sophisticated and handle course changes along the planned route, as
    well as being able to land automatically. And more important than what
    plane autopilots actually /do/, is what people /think/ they do - and
    remember we are talking about drivers that think their Tesla "autopilot"
    will drive their car while they watch a movie or nap in the back seat.

    Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just
    like in the car, there is a pilot whose job is to fly/drive and ensure safety.


    I am fully aware that plane autopilots are limited. I am also aware
    that they are good enough (in planes equipped with modern systems) to
    allow pilots to let the system handle most of the flight itself, even
    including landing. The pilot is, of course, expected to be paying
    attention, watching for other aircraft, communicating with air traffic controllers and all the rest of it. But there have been cases of pilots falling asleep, or missing their destination because they were playing
    around on their laptops. What people /should/ be doing, and what they
    are /actually/ doing, is not always the same.

    As to the movie idea, no, people don't think that. People might "pretend" that, but there's no level of "thinking" that says you can climb in the back seat while driving. Please don't say silly things.


    You can google for "backseat Tesla drivers" as well as I can. I am
    confident that some of these are staged, and equally confident that some
    are not. There is no minimum level of "thinking" - no matter how daft something might be, there is always a dafter person who will think it's
    a good idea.


    And sometimes the autopilot does something daft - it is no longer
    assisting the driver, but working against him or her. That, I think,
    should be counted as a mistake by the autopilot.

    The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

    Well, "does something daft" is no worse than "acts like a spaz", and
    it's a good deal more politically correct!

    Bzzzz. Sorry, you failed.


    Really? You think describing the autopilot's actions as "acts like a
    spaz" is useful and specific, while "does something daft" is not? As
    for the political correctness - find a real spastic and ask them what
    they think of your phrase.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to All on Fri Apr 1 16:12:57 2022
    On 01/04/22 14:07, Don Y wrote:

    <snipped many points where we are talking about
    different classes of aircraft and air traffic>

    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    That's true for such aircraft, but for gliders nobody has
    developed an autopilot. You have to stay awake and feel (literally,
    by the seat of your pants) what's happening. The nearest
    thing to an autopilot is a moving map airspace display.

    Commercial aircraft rely on autopilots.  In a sense, it is
    an easier (navigation) problem to solve -- there's no real "traffic"
    or other obstacles beyond the airports (assuming you maintain your
    assigned flight corridor/speed).  The same is true of railways
    and waterways (more or less).

    Er, no.

    You are considering a small part of air traffic, that
    in controlled airspace.

    Many flights, powered and unpowered, happen outside
    controlled airspace, where the rule is to look out
    of the cockpit for converging traffic.

    On one occasion I watched a commercial airliner
    taking off a thousand feet below me. Hercules buzz
    around too. Then there are balloons, hang gliders
    and the like.

    There are even rules as to which side of roads and
    railways you should fly on, so that there aren't
    head-on collisions between aircraft following the
    same ground feature in opposite directions.

    Gliders frequently operate very near each other,
    especially in thermals and when landing. They
    also have to spot other gliders coming straight
    at them when ridge flying; not trivial to spot
    a white blob the size of a motorbike's front
    converging at 120 mph.

    To help cope with that, some gliders are equipped
    with FLARMs - short range radio transmitters to
    indicate the direction of other gliders and whether
    you are likely to hit them.
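    The test such a unit performs is essentially a closest-point-of-approach
    calculation on the received tracks. A minimal 2-D, constant-velocity
    sketch in C - alarm thresholds invented, and real units do considerably
    more (e.g. predicting circling flight):

        #include <math.h>
        #include <stdio.h>

        typedef struct { double x, y, vx, vy; } track_t;   /* metres, m/s */

        /* Time of closest approach between two constant-velocity tracks,
           and the miss distance at that time. */
        static double cpa(track_t a, track_t b, double *miss_m)
        {
            double rx = b.x - a.x,   ry = b.y - a.y;    /* relative position */
            double vx = b.vx - a.vx, vy = b.vy - a.vy;  /* relative velocity */
            double v2 = vx * vx + vy * vy;
            double t  = (v2 > 1e-9) ? -(rx * vx + ry * vy) / v2 : 0.0;
            if (t < 0.0) t = 0.0;                       /* already diverging */
            double dx = rx + vx * t, dy = ry + vy * t;
            *miss_m = sqrt(dx * dx + dy * dy);
            return t;
        }

        int main(void)
        {
            /* Two gliders ridge-soaring head-on at ~27 m/s each,
               i.e. roughly 120 mph closing speed. */
            track_t me    = {    0.0,  0.0,  27.0, 0.0 };
            track_t other = { 1000.0, 30.0, -27.0, 0.0 };
            double miss;
            double t = cpa(me, other, &miss);
            if (t < 25.0 && miss < 100.0)   /* invented alarm thresholds */
                printf("ALARM: %.0f m miss in %.1f s\n", miss, t);
            return 0;
        }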



    Cars operate in a much more challenging environment.  Even "on the open road", a condition can arise that needs immediate driver attention
    (witness these 50-car pileups).

    Note how poorly "seasoned" drivers adapt to the first snowfall of
    the season.  (Really?  Did you FORGET what this stuff was like??)
    Do they do any special (mental?) prep prior to getting behind the
    wheel, in those cases?  Or, just "wing it", assuming "it will
    come back to them"?

    Pilots often don't understand what's going on; just
    listen to the accident reports on the news :(

    I think those events are caused by cognitive overload, not ignorance.

    Not always, e.g. the recent 737 crashes.

    So, a defect in an autopilot implementation can be similarly excused?

    Que? Strawman.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Jeroen Belleman on Fri Apr 1 07:48:46 2022
    On 4/1/2022 7:17 AM, Jeroen Belleman wrote:
    On 2022-04-01 15:38, Don Y wrote:
    On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    All these 'assistants' with their multiple 'modes' only make things
    more complicated and therefore unsafe. Simple is better.

    "Assistance" should be intuitive. You don't even NOTICE the power
    steering, brakes, autotranny, etc. "assistants" in a vehicle.
    Because, for the most part, the way they operate is largely invariant
    of driver, driving conditions, etc. (how often do folks use anything
    other than "D(rive)" and "R(everse)"? Is there a way to *disable*
    the power steering? Or brakes? Should there be?

    I much prefer a simple stick shift. I can tell what state it's in
    by touch, and there is not the slightest doubt about it. That isn't
    true for an automatic. You need to /look/ to see what state it's in.

    Why do you care? You can tell if it is in R/N/D simply by the feel of
    where the control is presently positioned. If I care what "gear"
    the transmission is currently operating in, I can look at the display
    located between the speedo and tach (a place that your eyes will
    always consult).

    In most of the places I've lived, there is little need to be able
    to "force" the transmission into a particular gear -- that it wouldn't
    have already assumed, on its own. And, in difficult driving conditions
    (e.g., coming out of the mountains, here), you'd likely want the vehicle
    to manage most of those decisions (e.g., avoiding overheating the
    transmission on a long downgrade).

    SWMBO used to prefer a stick. I convinced her to move to an automatic
    as it would be one less control to deal with as she aged. And, migrated
    her to a larger displacement engine -- for similar reasons. She now often relies on these two changes to extricate herself from dangerous
    situations (e.g., when oncoming traffic isn't yielding as it should).
    (this isn't always a pleasant experience when I'm a passenger! :< )

    They're
    too temperamental to my taste, refusing to change state under certain conditions. Same for the electric parking brake. It took me a while to
    figure out it refuses to disengage when I'm not wearing seat belts.
    Sheesh! Talk about weird interactions!

    Hmmm... that's a new one, for me. I'm more bothered by all of the "alarms"
    or warnings. "You haven't put the vehicle in PARK but turned off the
    ignition" "The headlights are still on" and my personal peeve "You have
    exited the vehicle -- WITH the keyfob -- while it is still running"
    (what the hell are they trying to tell me that I don't already know?
    That I left the car *running*?? I can understand an alert if I've left
    the key *in* the running car...)

    The most annoying aspect of all this is that there is not a unified means of presenting this information! Sometimes it appears as text on a display
    (I don't recall where I told the car that I prefer ENGLISH!), sometimes
    a cryptic idiot light on the dash, sometimes a coded audio annunciator
    (what the hell does THAT noise mean???), etc.

    (There are three full graphic displays in the car. Can't you sort out how
    to use them to TELL me what you think I've done wrong?)

    Power steering and brakes are in the set of assists that normally
    go unnoticed until they fail. (Provided they are essentially linear,
    smooth, without discontinuity or other surprise behaviour.)

    Exactly. You wouldn't want to have a switch to turn them on or off.
    I'm not convinced of the utility of having the headlights come on automatically. Or, the windshield wipers. But, those can be disabled. Likewise for the tilt-down mirrors (SWMBO took a long time to warm to
    that idea).

    [...]

    My favorite is the side mirrors tilting downwards (to afford a view
    of the ground) when backing up. The backup camera is a win as we back into
    our garage and it helps avoid backing INTO something. These would be less
    necessary with a "lower profile" vehicle, though.

    [I also like the trip computer automatically resetting at each trip
    and "fill up"]

    Yes, got that too, and I agree those are good features.

    An EV driver wouldn't see the need for that sort of trip computer.
    But, might see the need for one that shows battery consumption
    since last recharge -- or, "on this trip".

    It's also annoying when the car plays nanny and won't let THE PASSENGER
    use certain controls while the vehicle is in motion.

    And, these restrictions (intended for the *driver*) are inconsistent.
    So, a driver is "distracted" by a control that may -- or may not -- be
    operable in a certain driving condition (why can I twiddle with the
    radio presets while driving but not specify a new GPS destination?).

    And, the quality of the implementations is really piss poor (in every
    vehicle that we auditioned!). I recall typing in the name of a store
    to which I wanted to drive. I apparently misspelled it as the destination
    it selected was 1500 miles away (whereas the actual store was just a few
    miles away -- but I couldn't recall on which of several parallel roads it
    was located!). C'mon, do you REALLY think I want to lay in a course to
    a destination that far from here? And, that I would do so often enough
    that you should blindly assume that to be the case?? E.g., maybe a
    prompt saying "We found a location by that name 1500 miles from here.
    Is that what you intended (in case you haven't NOTICED that to be the
    case)? Or would you like to look for similar names, *locally*?"
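    That sanity check is cheap to implement. A sketch in C using the
    standard haversine formula - the coordinates and 100-mile threshold
    are made up for illustration:

        #include <math.h>
        #include <stdio.h>

        #define EARTH_R_MI 3959.0
        #define FAR_MI      100.0   /* beyond this, ask before routing */

        /* Great-circle distance in miles between two lat/lon points. */
        static double dist_mi(double lat1, double lon1, double lat2, double lon2)
        {
            const double rad = 3.14159265358979 / 180.0;
            double dlat = (lat2 - lat1) * rad, dlon = (lon2 - lon1) * rad;
            double a = sin(dlat / 2) * sin(dlat / 2)
                     + cos(lat1 * rad) * cos(lat2 * rad)
                     * sin(dlon / 2) * sin(dlon / 2);
            return 2.0 * EARTH_R_MI * asin(sqrt(a));
        }

        int main(void)
        {
            /* current position vs. the (mis)matched destination */
            double d = dist_mi(32.2, -110.9, 41.9, -87.6);
            if (d > FAR_MI)
                printf("Best match is %.0f miles away. Route there, "
                       "or search locally instead?\n", d);
            return 0;
        }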

    Or, the system being completely unresponsive to "events" (button presses)
    for 15-30 seconds? Or, the backup camera taking a few seconds to come
    online (so you have to WAIT before taking your foot off the brake).

    Or, displays being limited to N digits (e.g., total miles traveled on this trip) so you never know if you are seeing the results of saturated math
    or some other USEFUL presentation?
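    The display-side fix is equally trivial - a field that admits it
    clipped instead of silently pinning at its maximum. A sketch in C,
    with invented field width and names:

        #include <stdio.h>

        /* A fixed-width trip readout: make saturation explicit. */
        static void show_trip(double miles)
        {
            if (miles > 999.9)
                printf(">999.9 mi\n");   /* clipped - and says so */
            else
                printf("%5.1f mi\n", miles);
        }

        int main(void)
        {
            show_trip(412.6);    /* " 412.6 mi" */
            show_trip(1500.0);   /* ">999.9 mi" */
            return 0;
        }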

    [You would think a car manufacturer with all the re$ource$ at its disposal would come up with a better implementation! Likely the folks making the technical decisions aren't skilled in the art...]

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 09:39:06 2022
    On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the
    plane. It simply maintains a heading and altitude.
    They have been doing more than that for for > 50 years. Cat 3b
    landings were in operation when I was a kid.
    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern
    ones are more sophisticated and handle course changes along the
    planned route, as well as being able to land automatically. And
    more important than what plane autopilots actually /do/, is what
    people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car
    while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the
    heads of twats that think "autopilot" means "it does it for me".

    That's Tom Gardner level misinformation. Comments about what people
    think are spurious and unsubstantiated. A class of "twats" can be
    invented that think anything. Nothing matters other than what Tesla
    owners think. They are the ones driving the cars.

    Are you suggesting that none of the people who drive Teslas are twats?
    (Maybe that term is too British for you.)

    The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?


    And are you suggesting that only Tesla drivers are affected by Tesla
    crashes? Obviously they will be disproportionally affected, but motor accidents often involve other people and other cars. And while Tesla
    may be leading the way in car "autopiloting", others are following - the strengths and weaknesses of Tesla's systems are relevant to other car manufacturers.

    Now I have no idea why you have brought this up from left field. Is "left field" too American for you? That's from a sport called "baseball", not to be confused with "blernsball".

    --

    Rick C.

    +-- Get 1,000 miles of free Supercharging
    +-- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 09:25:21 2022
    On Friday, April 1, 2022 at 10:08:31 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:44, Ricky wrote:
    On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
    On 01/04/2022 02:19, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Someone still has to be watching
    for other aircraft and otherwise flying the plane. In other words, the
    pilot is responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones
    are more sophisticated and handle course changes along the planned
    route, as well as being able to land automatically. And more important
    than what plane autopilots actually /do/, is what people /think/ they
    do - and remember we are talking about drivers that think their Tesla
    "autopilot" will drive their car while they watch a movie or nap in
    the back seat.
    car while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept
    into the heads of twats that think "autopilot" means "it does
    it for me".
    You don't. Twats will always be twats. You fix the cars.

    You start by changing the name. "Driver assistance" rather than
    "autopilot".

    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    Do you know anything about how the Tesla autopilot actually works? Anything at all?

    A little - but not a lot, and no personal experience.

    So fill in the details here.

    You've already told us that it is designed for things like motorway
    driving (or "highway" driving). Presumably you stick by that, and
    therefore agree that any restrictions on the autopilot should be lower
    for motorway driving than for more "challenging" driving such as town
    roads or small, twisty country roads.

    I don't really understand what you mean about "restrictions". Again, I think your image of how it works is not how it works. I don't know enough of your image to know how to explain to you what you have wrong.

    Autopilot will try to keep the car in a lane, recognize lights, stop signs, exit ramps and vehicles. When on appropriate highways, it will work in Navigate on Autopilot, where it can change lanes (pass slow vehicles, get out of passing lane, etc.) and
    take exits. It will stop for traffic lights, but cannot navigate turns at intersections or even twisty roads. When it sees something that upsets it, it will sound the alarm (that should be ALARM) and insist you take over. One such situation is
    blinking yellow lights at an intersection with light traffic. The autopilot never understands that this light can be driven through.


    People already manage to read newspapers or eat their breakfast in
    traffic queues, in purely manual cars. Do you think autopilot can
    handle that kind of traffic?

    No, the autopilot won't read the newspaper. Otherwise I have no idea what you are asking. What is "that kind of traffic"? You mean stop and go? Yes, it does that very well. That's one situation I would not worry much about taking a nap.


    My suggestion is that a way to ensure people have more focus on driving
    is to require contact with the steering wheel. I am happy to hear your objections to that idea, or to alternative thoughts.

    Teslas already do that. Please, go to a Tesla forum and read about the cars a bit. It would save me a lot of typing.


    Improper use of autopilot (and other automation in all kinds of cars)
    leads to a higher risk of accidents. I expect that proper use can lower risk. Do you disagree with these two claims?

    "Higher" and "lower" than what???


    Do you think Tesla's autopilot is perfect as it is, or is there room for improvement?

    Of course there is room for improvement. When I first got my car it wouldn't take an exit ramp. Then it would take the exit, but would enter it at full speed! Now it is better, but you still need to watch it. It's also poor at slowing before traffic
    lights. Often the lights are around a bend, and until it sees that the light is red it's barreling along. Then it has to hit the brakes, not just the regenerative engine brake. This is how most people drive, and it wastes a lot of fuel.


    Do you actually want to contribute something to this thread, or do you
    just want to attack any post that isn't Tesla fanboy support? (Your
    answers to the previous questions will cover this one too.)

    This is your BS. I'm not criticizing any criticism of Tesla. You don't pay attention enough to understand that. I'm criticizing your remarks based on ignorance of Teslas, ignorance that doesn't stop you and Tom from forming opinions based on your
    ignorance rather than knowledge.

    Please just go read a bit about them. There is tons of info. Even weighing just the electrons to read it all, it's still tons! How many electrons in a ton, anyway?
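    (About 1.1e33 per metric ton, as it happens: 1000 kg divided by an
    electron's rest mass of roughly 9.1e-31 kg.)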

    --

    Rick C.

    -++ Get 1,000 miles of free Supercharging
    -++ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Jeroen Belleman on Fri Apr 1 09:46:05 2022
    On Friday, April 1, 2022 at 10:17:55 AM UTC-4, Jeroen Belleman wrote:
    On 2022-04-01 15:38, Don Y wrote:
    On 4/1/2022 1:46 AM, Jeroen Belleman wrote:
    You turn the steering wheel into a dead-man's handle - if the driver
    releases it for more than, say, 2 seconds, the autopilot should first
    beep violently, then pull over and stop the car if the driver does not
    pay attention. (Maybe you have "motorway mode" that allows a longer
    delay time, since autopilot works better there, and perhaps also a
    "traffic queue" mode with even longer delays.)

    All these 'assistants' with their multiple 'modes' only make things
    more complicated and therefore unsafe. Simple is better.

    "Assistance" should be intuitive. You don't even NOTICE the power steering, brakes, autotranny, etc. "assistants" in a vehicle.
    Because, for the most part, the way they operate is largely invariant
    of driver, driving conditions, etc. (how often do folks use anything
    other than "D(rive)" and "R(everse)"? Is there a way to *disable*
    the power steering? Or brakes? Should there be?
    I much prefer a simple stick shift. I can tell what state it's in
    by touch, and there is not the slightest doubt about it.

    My Tesla is a manual. The transmission never controls the gear it is in. I also know exactly what gear it is in without looking or even feeling. It only has one speed. Well, I guess it has two actually, + and -.


    That isn't
    true for an automatic. You need to /look/ what state it's in. They're
    too temperamental to my taste, refusing to change state under certain conditions. Same for the electric parking brake. It took me a while to figure out it refuses to disengage when I'm not wearing seat belts.
    Sheesh! Talk about weird interactions!

    The Tesla is also pretty good about that. I never have to worry about the parking brake, as it is automatically set when in park and also when at a stop light. Stepping on the brake until you are stopped sets the brake, and stepping on the gas...
    accelerator releases it.


    Power steering and brakes are in the set of assists that normally
    go unnoticed until they fail. (Provided they are essentially linear,
    smooth, without discontinuity or other surprise behaviour.)

    I hit the starter a bit too briefly in my Kia and put it in reverse, only to find the engine had not actually started and I was rolling backwards with no brakes or steering. lol


    My favorite is the side mirrors tilting downwards (to afford a view
    of the ground) when backing up. The backup camera is a win as we back into our garage and it helps avoid backing INTO something. These would be less necessary with a "lower profile" vehicle, though.

    [I also like the trip computer automatically resetting at each trip
    and "fill up"]
    Yes, got that too, and I agree those are good features.

    Both the Kia and Tesla start a trip odometer on fueling. Any charging on the Tesla restarts it. I don't know about the Kia, once I've driven to a gas station I'm not leaving until the tank is full.

    --

    Rick C.

    +-+ Get 1,000 miles of free Supercharging
    +-+ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 18:59:33 2022
    On 01/04/2022 18:39, Ricky wrote:
    On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the
    plane. It simply maintains a heading and altitude.
    They have been doing more than that for for > 50 years. Cat 3b
    landings were in operation when I was a kid.
    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern
    ones are more sophisticated and handle course changes along the
    planned route, as well as being able to land automatically. And
    more important than what plane autopilots actually /do/, is what
    people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car
    while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the
    heads of twats that think "autopilot" means "it does it for me".

    That's Tom Gardner level misinformation. Comments about what people
    think are spurious and unsubstantiated. A class of "twats" can be
    invented that think anything. Nothing matters other than what Tesla
    owners think. They are the ones driving the cars.

    Are you suggesting that none of the people who drive Teslas are twats?
    (Maybe that term is too British for you.)

    The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?


    It means "a stupid person" or "someone who does stupid things". No, not everyone who drives a Tesla is a twat - but /some/ are, such as those
    that think their autopilot will drive the car without them paying attention.


    And are you suggesting that only Tesla drivers are affected by Tesla
    crashes? Obviously they will be disproportionally affected, but motor
    accidents often involve other people and other cars. And while Tesla
    may be leading the way in car "autopiloting", others are following - the
    strengths and weaknesses of Tesla's systems are relevant to other car
    manufacturers.

    Now I have no idea why you have brought this up from left field. Is "left field" too American for you? That's from a sport called "baseball", not to be confused with "blernsball".


    You said that Tesla autopilots are only relevant to Tesla drivers.
    That's wrong. I usually prefer to give a bit of explanation as to why I
    think someone is wrong.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 09:53:05 2022
    On Friday, April 1, 2022 at 10:29:58 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:38, Ricky wrote:
    On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:
    On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
    On 31/03/2022 22:44, Ricky wrote:
    On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
    On 30/03/2022 00:54, Tom Gardner wrote:
    On 29/03/22 20:18, David Brown wrote:
    <snip>
    No, it is not "too strong". It is basic statistics. Bayes' theorem,
    and all that. If a large proportion of people use autopilot, but
    only a small fraction of the deaths had the autopilot on, then
    clearly the autopilot reduces risks and saves lives (of those that
    drive Teslas - we still know nothing of other car drivers).

    A simple comparison of numbers is not sufficient. Most Tesla
    autopilot usage is on highways which are much safer per mile driven
    than other roads. That's an inherent bias because while
    non-autopilot driving must include all situations, autopilot simply
    doesn't work in most environments.

    Yes. An apples-to-apples comparison is the aim, or at least as close as
    one can get.

    I suspect - without statistical justification -

    Yes, without justification, at all.
    Which do /you/ think is most likely? Autopilot crashes on the motorway,
    or autopilot crashes on smaller roads?

    Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes are more often on highways.

    I was not aware of that limitation. Thanks for providing some relevant information.
    I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving the limitations are such that no one would try to use
    it.


    that the accidents
    involving autopilot use are precisely cases where you don't have a good,
    clear highway, and autopilot was used in a situation where it was not
    suitable. Getting good statistics and comparisons here could be helpful
    in making it safer - perhaps adding a feature that has the autopilot say
    "This is not a good road for me - you have to drive yourself" and switch
    itself off. (It would be more controversial, but probably statistically
    safer, if it also sometimes said "I'm better at driving on this kind of
    road than you are" and switching itself on!)

    An issue is, of course, that any single experience can be
    dismissed as an unrepresentative aberration. Collation of
    experiences is necessary.

    Some of the dashcam "Tesla's making mistakes" videos on yootoob
    aren't confidence inspiring. Based on one I saw, I certainly
    wouldn't dare let a Tesla drive itself in an urban environment,

    I suspect there isn't sufficient experience to assess relative
    dangers between "artificial intelligence" and "natural
    stupidity".
    I don't doubt at all that the Tesla autopilot makes mistakes.

    Which depends on how you define "mistakes".
    Of course.
    It's a bit like asking
    if your rear view mirror makes mistakes by not showing cars in the
    blind spot. The autopilot is not designed to drive the car. It is a
    tool to assist the driver. The driver is required to be responsible
    for the safe operation of the car at all times. I can point out to
    you the many, many times the car acts like a spaz and requires me to
    manage the situation. Early on, there was a left turn lane on a 50
    mph road that the car would want to turn into when I intended to drive
    straight. Fortunately they have ironed out that level of issue. But
    it was always my responsibility to prevent it from causing an
    accident. So how would you say anything was the fault of the
    autopilot?

    There are a few possibilities here (though I am not trying to claim that
    any of them are "right" in some objective sense). You might say they
    any of them are "right" in some objective sense). You might say they >>>> had believed that the "autopilot" was like a plane autopilot -

    It is exactly like an airplane autopilot.


    you can
    turn it on and leave it to safely drive itself for most of the journey
    except perhaps the very beginning and very end of the trip. As you say,
    the Tesla autopilot is /not/ designed for that - that might be a mistake
    from the salesmen, advertisers, user-interface designers, or just the
    driver's mistake.

    Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying
    the plane, with or without the autopilot.

    Yes, that's the original idea of a plane autopilot. But modern ones are
    more sophisticated and handle course changes along the planned route, as
    well as being able to land automatically. And more important than what
    plane autopilots actually /do/, is what people /think/ they do - and
    remember we are talking about drivers that think their Tesla "autopilot"
    will drive their car while they watch a movie or nap in the back seat.

    Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like
    in the car, there is a pilot whose job is to fly/drive and ensure safety.

    I am fully aware that plane autopilots are limited. I am also aware
    that they are good enough (in planes equipped with modern systems) to
    allow pilots to let the system handle most of the flight itself, even including landing. The pilot is, of course, expected to be paying
    attention, watching for other aircraft, communicating with air traffic controllers and all the rest of it. But there have been cases of pilots falling asleep, or missing their destination because they were playing around on their laptops. What people /should/ be doing, and what they
    are /actually/ doing, is not always the same.

    Exactly like the Tesla autopilot. The pilot is still in charge and responsible.


    As to the movie idea, no, people don't think that. People might "pretend" that, but there's no level of "thinking" that says you can climb in the back seat while driving. Please don't say silly things.

    You can google for "backseat Tesla drivers" as well as I can. I am
    confident that some of these are staged, and equally confident that some
    are not. There is no minimum level of "thinking" - no matter how daft something might be, there is always a dafter person who will think it's
    a good idea.

    The fact that someone pulled a stunt doesn't mean they thought that was an ok thing to do. You know that. So why are we discussing this?


    And sometimes the autopilot does something daft - it is no longer
    assisting the driver, but working against him or her. That, I think,
    should be counted as a mistake by the autopilot.

    The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

    Well, "does something daft" is no worse than "acts like a spaz", and
    it's a good deal more politically correct!

    Bzzzz. Sorry, you failed.

    Really? You think describing the autopilot's actions as "acts like a
    spaz" is useful and specific, while "does something daft" is not? As
    for the political correctness - find a real spastic and ask them what
    they think of your phrase.

    How do you know what is meant by "spaz"? That's my point. Words like that are not well defined. I intended the word to be colorful, with no particular meaning. Your use of daft was in a statement that needed much more detail to be meaningful.
    Besides, if I jump off a cliff, are you going to jump as well?

    --

    Rick C.

    ++- Get 1,000 miles of free Supercharging
    ++- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 19:03:06 2022
    On 01/04/2022 18:53, Ricky wrote:
    On Friday, April 1, 2022 at 10:29:58 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:38, Ricky wrote:
    On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown
    wrote:

    Well, "does something daft" is no worse than "acts like a
    spaz", and it's a good deal more politically correct!

    Bzzzz. Sorry, you failed.

    Really? You think describing the autopilot's actions as "acts like
    a spaz" is useful and specific, while "does something daft" is not?
    As for the political correctness - find a real spastic and ask them
    what they think of your phrase.

    How do you know what is meant by "spaz"? That's my point. Words
    like that are not well defined. I intended the word to be colorful,
    with no particular meaning. Your use of daft was in a statement that
    needed much more detail to be meaningful. Besides, if I jump off a
    cliff, are you going to jump as well?


    I know what the word "spaz" means. I know what /you/ meant by it in the context - just as I know that you know what "does something daft" meant (including the implied vagueness of the phrase). I have no idea why you
    are pretending you don't, nor why you are getting your knickers in a
    twist about me writing "daft" after you wrote "spaz".

    (And I hope you know what that last colourful British phrase means, and
    don't think it is meant literally :-) )

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 10:10:56 2022
    On Friday, April 1, 2022 at 12:59:41 PM UTC-4, David Brown wrote:
    On 01/04/2022 18:39, Ricky wrote:
    On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the
    plane. It simply maintains a heading and altitude.
    They have been doing more than that for for > 50 years. Cat 3b
    landings were in operation when I was a kid.
    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern
    ones are more sophisticated and handle course changes along the
    planned route, as well as being able to land automatically. And
    more important than what plane autopilots actually /do/, is what
    people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car
    while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the
    heads of twats that think "autopilot" means "it does it for me".

    That's Tom Gardner level misinformation. Comments about what people
    think are spurious and unsubstantiated. A class of "twats" can be
    invented that think anything. Nothing matters other than what Tesla
    owners think. They are the ones driving the cars.

    Are you suggesting that none of the people who drive Teslas are twats?
    (Maybe that term is too British for you.)

    The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?

    It means "a stupid person" or "someone who does stupid things". No, not everyone who drives a Tesla is a twat - but /some/ are, such as those
    that think their autopilot will drive the car without them paying attention.

    And are you suggesting that only Tesla drivers are affected by Tesla
    crashes? Obviously they will be disproportionally affected, but motor
    accidents often involve other people and other cars. And while Tesla
    may be leading the way in car "autopiloting", others are following - the
    strengths and weaknesses of Tesla's systems are relevant to other car
    manufacturers.

    Now I have no idea why you have brought this up from left field. Is "left field" too American for you? That's from a sport called "baseball", not to be confused with "blernsball".

    You said that Tesla autopilots are only relevant to Tesla drivers.
    That's wrong. I usually prefer to give a bit of explanation as to why I
    think someone is wrong.

    Please reread my post. I said nothing of the sort. Please read carefully. Quotes are also helpful.

    --

    Rick C.

    +++ Get 1,000 miles of free Supercharging
    +++ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to Ricky on Fri Apr 1 19:39:41 2022
    On 01/04/22 17:25, Ricky wrote:
    I don't really understand what you mean about "restrictions". Again, I think your image of how it works is not how it works. I don't know enough of your image to know how to explain to you what you have wrong.

    Autopilot will try to keep the car in a lane, recognize lights, stop signs, exit ramps and vehicles. When on appropriate highways, it will work in Navigate on Autopilot, where it can change lanes (pass slow vehicles, get out of passing lane, etc.) and take exits. It will stop for traffic lights, but cannot navigate turns at intersections or even twisty roads. When it sees something that upsets it, it will sound the alarm (that should be ALARM) and insist you take over. One such situation is blinking yellow lights at an intersection with light traffic. The autopilot never understands that this light can be driven through.

    You need to datestamp your description of the autopilot's
    capabilities and foibles - Tesla keeps updating it. IMHO that's
    a problem, since the car's behaviour today might be significantly
    different to when you last drove it. And the driver probably won't
    even know the difference exists; would you read the overnight
    "change notes" when all you want to do is drive to the shops?

    Example: https://www.pluscars.net/tesla-autopilot-automatically-stopped-at-red-light-for-the-first-time-105-news

    It sounds like your Tesla doesn't have the autopilot mentioned,
    viz "$7,000 full autonomous driving package recognizes traffic
    lights and stop signs. It provides autonomous driving in the city."

    Sounds like a lot more than "highway only".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David Brown@21:1/5 to Ricky on Fri Apr 1 20:36:44 2022
    On 01/04/2022 18:25, Ricky wrote:

    Please, go to a Tesla forum and read about the cars a bit. It would save me a lot of typing.

    No, thanks.

    As so often seems to happen in this group, this thread is going nowhere.
    I'm not interested enough in Teslas to start reading forums, brochures,
    or other information. You are not interested in sharing more than the occasional titbit of information, and it must be dragged out of you
    through frustrated and somewhat unfriendly comments on both sides - you
    prefer to tell people they are wrong rather than offer corrections or your
    own opinion. I think sometimes this group brings out the worst in people,
    even when the worst members of the group are not involved in the thread
    - we develop some bad habits here. It is frustrating.

    I leave the thread /marginally/ better informed than I started.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to David Brown on Fri Apr 1 13:00:08 2022
    On Friday, April 1, 2022 at 2:36:52 PM UTC-4, David Brown wrote:
    On 01/04/2022 18:25, Ricky wrote:

    Please, go to a Tesla forum and read about the cars a bit. It would save me a lot of typing.

    No, thanks.

    As so often seems to happen in this group, this thread is going nowhere.
    I'm not interested enough in Teslas to start reading forums, brochures,
    or other information. You are not interested in sharing more than the occasional titbit of information, and it must be dragged out of you
    through frustrated and somewhat unfriendly comments on both sides - you prefer to tell people they are wrong rather than offer corrections or your own opinion. I think sometimes this group brings out the worst in people,
    even when the worst members of the group are not involved in the thread
    - we develop some bad habits here. It is frustrating.

    I leave the thread /marginally/ better informed than I started.

    If you want to ask direct questions, fine. I'm not going to create a tutorial for you.

    --

    Rick C.

    ---- Get 1,000 miles of free Supercharging
    ---- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Tom Gardner on Fri Apr 1 13:12:25 2022
    On Friday, April 1, 2022 at 2:39:47 PM UTC-4, Tom Gardner wrote:
    On 01/04/22 17:25, Ricky wrote:
    I don't really understand what you mean about "restrictions". Again, I think
    your image of how it works is not how it works. I don't know enough of your
    image to know how to explain to you what you have wrong.

    Autopilot will try to keep the car in a lane, recognize lights, stop signs,
    exit ramps and vehicles. When on appropriate highways, it will work in navigate on autopilot where it can change lanes (pass slow vehicles, get out
    of passing lane, etc.) and take exits. It will stop for traffic lights, but
    can not navigate turns at intersections or even twisty roads. When it sees something that upsets it, it will sound the alarm (that should be ALARM) and
    insist you take over. One such situation is blinking yellow lights at an intersection with light traffic. The autopilot never understands this light
    can be driven through.
    You need to datestamp your description of the autopilot's
    capabilities and foibles - Tesla keeps updating it. IMHO that's
    a problem, since the car's behaviour today might be significantly
    different to when you last drove it. And the driver probably won't
    even know the difference exists; would you read the overnight
    "change notes" when all you want to do is drive to the shops?

    Example: https://www.pluscars.net/tesla-autopilot-automatically-stopped-at-red-light-for-the-first-time-105-news

    Yes, I said it will stop for lights and stop signs. What's your point?


    It sounds like your Tesla doesn't have the autopilot mentioned,
    viz "$7,000 full autonomous driving package recognizes traffic
    lights and stop signs. It provides autonomous driving in the city."

    Autopilot is not "autonomous driving", period. That's the point. You don't say where you read this, but the price puts it sometime in the 2020 timeframe or older. But then, when I bought mine, there were multiple choices available. "Full self driving"
    was the highest level, which was paying for something that is not on the road. They are now beta testing, but not "autonomous" driving.


    Sounds like a lot more than "highway only".

    Hard to tell. This is very unlikely to have come from Tesla as they never refer to it as "autonomous" driving. They use brand names.

    I will ask again that you read my posts and read them thoroughly. You clearly are not doing that.

    If you aren't going to read the posts, then don't reply. Ok?

    BTW, I did a Google search on your quote and it didn't turn up a match. So it would seem you got that from a phone call or something that Google doesn't yet crawl. It did turn up this, which is very interesting.

    https://nypost.com/2022/03/31/court-orders-tesla-to-buy-back-model-3-in-autopilot-case/

    'Tesla says FSD and its attendant features require “active driver supervision and do not make the vehicle autonomous.”'

    So take it straight from the horse's mouth! Also, please stop the BS, ok?

    --

    Rick C.

    ---+ Get 1,000 miles of free Supercharging
    ---+ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to Ricky on Sat Apr 2 00:09:53 2022
    On 01/04/22 21:12, Ricky wrote:
    On Friday, April 1, 2022 at 2:39:47 PM UTC-4, Tom Gardner wrote:
    On 01/04/22 17:25, Ricky wrote:
    I don't really understand what you mean about "restrictions". Again, I think
    your image of how it works is not how it works. I don't know enough of your
    image to know how to explain to you what you have wrong.

    Autopilot will try to keep the car in a lane, recognize lights, stop signs,
    exit ramps and vehicles. When on appropriate highways, it will work in
    navigate on autopilot where it can change lanes (pass slow vehicles, get out
    of passing lane, etc.) and take exits. It will stop for traffic lights, but
    can not navigate turns at intersections or even twisty roads. When it sees
    something that upsets it, it will sound the alarm (that should be ALARM) and
    insist you take over. One such situation is blinking yellow lights at an
    intersection with light traffic. The autopilot never understands this light
    can be driven through.
    You need to datestamp your description of the autopilot's
    capabilities and foibles - Tesla keeps updating it. IMHO that's
    a problem, since the car's behaviour today might be significantly
    different to when you last drove it. And the driver probably won't
    even know the difference exists; would you read the overnight
    "change notes" when all you want to do is drive to the shops?

    Example:
    https://www.pluscars.net/tesla-autopilot-automatically-stopped-at-red-light-for-the-first-time-105-news

    Yes, I said it will stop for lights and stop signs. What's your point?

    Read the paragraph above the word "Example:"


    It sounds like your Tesla doesn't have the autopilot mentioned,
    viz "$7,000 full autonomous driving package recognizes traffic
    lights and stop signs. It provides autonomous driving in the city."

    Autopilot is not "autonomous driving", period. That's the point. You don't say where you read this, but the price puts it sometime in the 2020 timeframe or older. But then, when I bought mine, there were multiple choices available. "Full self
    driving" was the highest level, which was paying for something that is not on the road. They are now beta testing, but not "autonomous" driving.


    Sounds like a lot more than "highway only".

    Hard to tell. This is very unlikely to have come from Tesla as they never refer to it as "autonomous" driving. They use brand names.

    I will ask again that you read my posts and read them thoroughly. You clearly are not doing that.

    If you aren't going to read the posts, then don't reply. Ok?

    Mirror.


    BTW, I did a Google search on your quote and it didn't turn up a match. So it would seem you got that from a phone call or something that Google doesn't yet crawl.

    It is in the pluscars article referenced!

    What was that you were saying about reading posts thoroughly?


    It did turn up this which is very interesting.

    https://nypost.com/2022/03/31/court-orders-tesla-to-buy-back-model-3-in-autopilot-case/

    'Tesla says FSD and its attendant features require “active driver supervision and do not make the vehicle autonomous.”'

    Yes. Musk is backtracking from some of his earlier outrageous claims.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Tom Gardner on Fri Apr 1 18:23:04 2022
    On Friday, April 1, 2022 at 7:10:00 PM UTC-4, Tom Gardner wrote:
    On 01/04/22 21:12, Ricky wrote:
    On Friday, April 1, 2022 at 2:39:47 PM UTC-4, Tom Gardner wrote:
    On 01/04/22 17:25, Ricky wrote:
    I don't really understand what you mean about "restrictions". Again, I think
    your image of how it works is not how it works. I don't know enough of your
    image to know how to explain to you what you have wrong.

    Autopilot will try to keep the car in a lane, recognize lights, stop signs,
    exit ramps and vehicles. When on appropriate highways, it will work in
    Navigate on Autopilot, where it can change lanes (pass slow vehicles, get out
    of the passing lane, etc.) and take exits. It will stop for traffic lights, but
    cannot navigate turns at intersections or even twisty roads. When it sees
    something that upsets it, it will sound the alarm (that should be ALARM) and
    insist you take over. One such situation is blinking yellow lights at an
    intersection with light traffic. The autopilot never understands this light
    can be driven through.
    You need to datestamp your description of the autopilot's
    capabilities and foibles - Tesla keeps updating it. IMHO that's
    a problem, since the car's behaviour today might be significantly
    different to when you last drove it. And the driver probably won't
    even know the difference exists; would you read the overnight
    "change notes" when all you want to do is drive to the shops?

    Example:
    https://www.pluscars.net/tesla-autopilot-automatically-stopped-at-red-light-for-the-first-time-105-news

    Yes, I said it will stop for lights and stop signs. What's your point?
    Read the paragraph above the word "Example:"

    Read this: What's YOUR point?


    It sounds like your Tesla doesn't have the autopilot mentioned,
    viz "$7,000 full autonomous driving package recognizes traffic
    lights and stop signs. It provides autonomous driving in the city."

    Autopilot is not "autonomous driving", period. That's the point. You don't say where you read this, but the price puts it sometime in the 2020 timeframe or older. But then, when I bought mine, there were multiple choices available. "Full self driving"
    was the highest level, which meant paying for something that is not on the road. They are now beta testing, but not "autonomous" driving.


    Sounds like a lot more than "highway only".

    Hard to tell. This is very unlikely to have come from Tesla as they never refer to it as "autonomous" driving. They use brand names.

    I will ask again that you read my posts and read them thoroughly. You clearly are not doing that.

    If you aren't going to read the posts, then don't reply. Ok?
    Mirror.
    BTW, I did a Google search on your quote and it didn't turn up a match. So it would seem you got that from a phone call or something that Google doesn't yet crawl.
    It is in the pluscars article referenced!

    What was that you were saying about reading posts thoroughly?

    I did a google search on the quote. It also doesn't show up in a text search on that web page because you munged the quote.

    Even so, you are quoting a reporter writing an article, not an authoritative source. If you are going to pull BS like this, there's no reason to try to have a discussion. Please look up what TESLA says about their products. Not what a reporter said
    two years ago. This is why it is so hard to have a conversation with you. You don't really try to understand anything.


    It did turn up this which is very interesting.

    https://nypost.com/2022/03/31/court-orders-tesla-to-buy-back-model-3-in-autopilot-case/

    'Tesla says FSD and its attendant features require “active driver supervision and do not make the vehicle autonomous.”'
    Yes. Musk is backtracking from some of his earlier outrageous claims.

    Please provide those claims. I suspect you are thinking of things he has predicted rather than what he says the cars can do. Yes, he says all sorts of things about the future. He said there would be robo-taxis in 2020. So what?

    The point is you need to read what Tesla says, not Musk, not reporters. Stop with all the BS, please.

    --

    Rick C.

    --+- Get 1,000 miles of free Supercharging
    --+- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Tom Gardner on Fri Apr 1 19:18:02 2022
    On 4/1/2022 8:12 AM, Tom Gardner wrote:
    On 01/04/22 14:07, Don Y wrote:

    <snipped many points where we are talking about
    different classes of aircraft and air traffic>

    I.e., a pilot is a lot more likely to understand the function
    AND LIMITATIONS of an (aircraft) autopilot than a driver is to
    have similar appreciation for an (automobile) "autopilot".

    That's true for the aircraft, but nobody has developed
    an autopilot. You have to stay awake and feel (literally,
    by the seat of your pants) what's happening. The nearest
    thing to an autopilot is a moving map airspace display.

    Commercial aircraft rely on autopilots. In a sense, it is
    an easier (navigation) problem to solve -- there's no real "traffic"
    or other obstacles beyond the airports (assuming you maintain your
    assigned flight corridor/speed). The same is true of railways
    and waterways (more or less).

    Er, no.

    You are considering a small part of air traffic, that
    in controlled airspace.

    And GLIDERS are a LARGE part of air traffic? Really?
    How are we measuring this -- passenger miles, flights,
    miles travelled, etc.?

    Many flights, powered and unpowered, happen outside
    controlled airspace, where the rule is to look out
    of the cockpit for converging traffic.

    And? How does that affect my claim as to the TYPES OF PEOPLE WHO ARE
    PILOTS vs. DRIVERS?

    On one occasion I watched a commercial airliner
    taking off a thousand feet below me. Hercules buzz
    around too. Then there are balloons, hang gliders
    and the like.

    There are even rules as to which side of roads and
    railways you should fly on, so that there aren't
    head-on collisions between aircraft following the
    same ground feature in opposite directions.

    Gliders frequently operate very near each other,
    especially in thermals and when landing. They
    also have to spot other gliders coming straight
    at them when ridge flying; not trivial to spot
    a white blob the size of a motorbike's front
    converging at 120mph.

    So, you're claiming that glider pilots are the types of people who
    will climb into the back seat and take a nap? Or, play a video game
    while flying?

    Or, are they A DIFFERENT TYPE OF PERSON than the drivers you
    allege do these things?

    To help cope with that, some gliders are equipped
    with FLARMs - short range radio transmitters to
    indicate the direction of other gliders and whether
    you are likely to hit them.
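    FLARM's actual algorithm is proprietary (it also projects curved,
    circling flight paths), so purely as a sketch of the general idea,
    here is a minimal straight-line closest-point-of-approach check in
    Python; all positions and speeds below are invented:

        import math

        def cpa(p1, v1, p2, v2):
            # Closest point of approach for two straight-line tracks.
            # p*: (x, y) positions in metres; v*: (vx, vy) velocities in m/s.
            dx, dy = p2[0] - p1[0], p2[1] - p1[1]
            dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
            rel2 = dvx * dvx + dvy * dvy
            t = 0.0 if rel2 == 0 else max(0.0, -(dx * dvx + dy * dvy) / rel2)
            return t, math.hypot(dx + dvx * t, dy + dvy * t)

        # Two gliders ridge-flying head-on, closing at ~54 m/s (~120 mph):
        t, d = cpa((0, 0), (27, 0), (1000, 5), (-27, 0))
        print(f"closest approach {d:.0f} m in {t:.1f} s")  # warn if small and soon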

    Cars operate in a much more challenging environment. Even "on the open
    road", a condition can arise that needs immediate driver attention
    (witness these 50-car pileups).

    Note how poorly "seasoned" drivers adapt to the first snowfall of
    the season. (Really? Did you FORGET what this stuff was like??)
    Do they do any special (mental?) prep prior to getting behind the
    wheel, in those cases? Or, just "wing it", assuming "it will
    come back to them"?

    Pilots often don't understand what's going on; just
    listen to the accident reports on the news :(

    I think those events are caused by cognitive overload, not ignorance.

    Not always, e.g. the recent 737 crashes.

    So, a defect in an autopilot implementation can be similarly excused?

    Que? Strawman.

    I made the point regarding the types of people being compared.
    Thus, you have to exclude the effects of "other issues" that
    may contribute to the "bad outcomes" being claimed.

    If the equipment/system has a defect, then you can't blame it on
    the type of person operating that equipment! (or, are you claiming
    all the 737 pilots were slack-offs?)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jeroen Belleman@21:1/5 to Ricky on Sat Apr 2 11:24:46 2022
    On 2022-04-01 18:25, Ricky wrote:
    [...]
    Please just go read a bit about them. There is tons of info. Even
    weighing just the electrons to read it all, it's still tons! How
    many electrons in a ton, anyway?


    About 1e33.
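    As a quick sanity check of that figure in Python, assuming a metric
    ton (1000 kg); a US short ton (~907 kg) gives the same order of
    magnitude:

        M_ELECTRON = 9.109e-31   # kg, electron rest mass (CODATA)
        TON = 1000.0             # kg, metric ton assumed
        print(f"{TON / M_ELECTRON:.2e} electrons per ton")  # ~1.10e+33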

    Jeroen Belleman

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to david.brown@hesbynett.no on Sun Apr 3 18:07:58 2022
    On Tue, 29 Mar 2022 16:16:56 +0200, David Brown
    <david.brown@hesbynett.no> wrote:

    On 29/03/2022 15:00, Rickster wrote:
    On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <ga...@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies
    claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled,
    links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    Yeah, it's raw data. Did you have a point?


    Without comparisons to other types of car, and correlations with other
    factors, such raw data is useless. You'd need to compare to other
    high-end electric cars, other petrol cars in similar price ranges and
    styles. You'd want to look at statistics for "typical Tesla drivers"
    (who are significantly richer than the average driver, but I don't know
    what other characteristics might be relevant - age, gender, driving
    experience, etc.) You'd have to compare statistics for the countries
    and parts of countries where Teslas are common.

    And you would /definitely/ want to anonymise the data. If I had a
    family member who was killed in a car crash, I would not be happy about
    their name and details of their death being used for some sort of absurd
    Tesla hate-site.

    I'm no fan of Teslas myself. I like a car to be controlled like a car,
    not a giant iPhone (and I don't like iPhones either). I don't like the
    heavy tax breaks given by Norway to a luxury car, and I don't like the
    environmental costs of making them (though I am glad to see improvements
    on that front). I don't like some of the silly claims people make about
    them - like Apple gadgets, they seem to bring out the fanboy in some of
    their owners. But that's all just me and my personal preferences and
    opinions - if someone else likes them, that's fine. Many Tesla owners
    are very happy with their cars (and some are unhappy - just as for any
    other car manufacturer). I can't see any reason for trying to paint
    them as evil death-traps - you'd need a very strong statistical basis for
    that, not just a list of accidents.
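    To make that normalisation point concrete, a toy comparison in
    Python; every number below is invented, purely to show why raw death
    counts can't be compared directly:

        # Hypothetical fleets: all figures invented for illustration only.
        fleets = {
            "fleet A": {"deaths": 10,  "miles": 1.0e9},   # small fleet
            "fleet B": {"deaths": 300, "miles": 5.0e10},  # large fleet
        }
        for name, f in fleets.items():
            per_100m = f["deaths"] / f["miles"] * 1e8   # deaths per 100M miles
            print(f"{name}: {f['deaths']} deaths, {per_100m:.2f} per 100M miles")

    Fleet B has thirty times the deaths but the lower rate, which is why
    a bare list of accidents says little on its own.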

    Yeahbut the problem is conventional cars don't tend to lock all the
    doors trapping everyone inside and then burst into flames like Teslas
    do. Call me old fashioned if you like, but I don't see that as being a
    great selling point.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Cursitor Doom on Sun Apr 3 10:35:20 2022
    On Sunday, April 3, 2022 at 1:08:05 PM UTC-4, Cursitor Doom wrote:
    On Tue, 29 Mar 2022 16:16:56 +0200, David Brown
    <david...@hesbynett.no> wrote:

    On 29/03/2022 15:00, Rickster wrote:
    On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
    From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

    The website referred to appears to be collating information in
    a reasonable and unemotional way.


    Every Tesla Accident Resulting in Death (Tesla Deaths)
    Gabe Goldberg <ga...@gabegold.com>
    Thu, 24 Mar 2022 01:53:39 -0400

    We provide an updated record of Tesla fatalities and Tesla accident deaths
    that have been reported and as much related crash data as possible
    (e.g. location of crash, names of deceased, etc.). This sheet also tallies
    claimed and confirmed Tesla autopilot crashes, i.e. instances when
    Autopilot was activated during a Tesla crash that resulted in death. Read
    our other sheets for additional data and analysis on vehicle miles traveled,
    links and analysis comparing Musk's safety claims, and more.

    Tesla Deaths Total as of 3/23/2022: 246
    Tesla Autopilot Deaths Count: 12

    https://www.tesladeaths.com/

    Yeah, it's raw data. Did you have a point?


    Without comparisons to other types of car, and correlations with other
    factors, such raw data is useless. You'd need to compare to other
    high-end electric cars, other petrol cars in similar price ranges and
    styles. You'd want to look at statistics for "typical Tesla drivers"
    (who are significantly richer than the average driver, but I don't know
    what other characteristics might be relevant - age, gender, driving
    experience, etc.) You'd have to compare statistics for the countries
    and parts of countries where Teslas are common.

    And you would /definitely/ want to anonymise the data. If I had a
    family member who was killed in a car crash, I would not be happy about
    their name and details of their death being used for some sort of absurd
    Tesla hate-site.

    I'm no fan of Teslas myself. I like a car to be controlled like a car,
    not a giant iPhone (and I don't like iPhones either). I don't like the
    heavy tax breaks given by Norway to a luxury car, and I don't like the
    environmental costs of making them (though I am glad to see improvements
    on that front). I don't like some of the silly claims people make about
    them - like Apple gadgets, they seem to bring out the fanboy in some of
    their owners. But that's all just me and my personal preferences and
    opinions - if someone else likes them, that's fine. Many Tesla owners
    are very happy with their cars (and some are unhappy - just as for any
    other car manufacturer). I can't see any reason for trying to paint
    them as evil death-traps - you'd need a very strong statistical basis for
    that, not just a list of accidents.

    Yeahbut the problem is conventional cars don't tend to lock all the
    doors trapping everyone inside and then burst into flames like Teslas
    do. Call me old fashioned if you like, but I don't see that as being a
    great selling point.

    Mostly because it's a huge exaggeration, and you know that. I expect you'd rather drive a Pinto.

    --

    Rick C.

    --++ Get 1,000 miles of free Supercharging
    --++ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Doe@21:1/5 to Ricky on Sun Apr 3 20:46:00 2022
    Ricky wrote:

    I expect you'd rather drive a Pinto.

    Pinto explodes when hit from behind.

    https://youtu.be/ngtALzDAIcU (8 seconds)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Doe@21:1/5 to Cursitor Doom on Sun Apr 3 20:36:21 2022
    Cursitor Doom wrote:

    Yeahbut the problem is conventional cars don't tend to lock all the
    doors trapping everyone inside and then burst into flames like Teslas
    do. Call me old fashioned if you like, but I don't see that as being a
    great selling point.

    lol
    That sounds a little harsh. Got an idea for a bumper sticker...

    TEST VEHICLE - NO DRIVER

    To keep people at a distance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to david.brown@hesbynett.no on Mon Apr 4 01:19:37 2022
    On Fri, 1 Apr 2022 18:59:33 +0200, David Brown
    <david.brown@hesbynett.no> wrote:

    On 01/04/2022 18:39, Ricky wrote:
    On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the
    plane. It simply maintains a heading and altitude.
    They have been doing more than that for > 50 years. Cat 3b
    landings were in operation when I was a kid.
    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern
    ones are more sophisticated and handle course changes along the
    planned route, as well as being able to land automatically. And
    more important than what plane autopilots actually /do/, is what
    people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car
    while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the
    heads of twats that think "autopilot" means "it does it for me".

    That's Tom Gardner level misinformation. Comments about what people
    think are spurious and unsubstantiated. A class of "twats" can be
    invented that think anything. Nothing matters other than what Tesla
    owners think. They are the ones driving the cars.

    Are you suggesting that none of the people who drive Teslas are twats?
    (Maybe that term is too British for you.)

    The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?


    It means "a stupid person" or "someone who does stupid things". No, not >everyone who drives a Tesla is a twat - but /some/ are, such as those
    that think their autopilot will drive the car without them paying attention.

    The term "autopilot" kind of suggests you can do precisely that,
    though.
    I don't like these half-baked ideas. Don't call a self-driving car
    that until it's *truly* and *safely* autonomous. The worst situation
    is giving the driver a false sense of security - which is where we are currently.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Cursitor Doom on Mon Apr 4 04:33:36 2022
    On Sunday, April 3, 2022 at 8:19:43 PM UTC-4, Cursitor Doom wrote:
    On Fri, 1 Apr 2022 18:59:33 +0200, David Brown
    <david...@hesbynett.no> wrote:

    On 01/04/2022 18:39, Ricky wrote:
    On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
    On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the
    plane. It simply maintains a heading and altitude.

    They have been doing more than that for > 50 years. Cat 3b
    landings were in operation when I was a kid.

    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern
    ones are more sophisticated and handle course changes along the
    planned route, as well as being able to land automatically. And
    more important than what plane autopilots actually /do/, is what
    people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car
    while they watch a movie or nap in the back seat.

    And, to put it kindly, aren't discouraged in that misapprehension
    by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the
    heads of twats that think "autopilot" means "it does it for me".

    That's Tom Gardner level misinformation. Comments about what people
    think are spurious and unsubstantiated. A class of "twats" can be
    invented that think anything. Nothing matters other than what Tesla
    owners think. They are the ones driving the cars.

    Are you suggesting that none of the people who drive Teslas are twats?
    (Maybe that term is too British for you.)

    The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?

    It means "a stupid person" or "someone who does stupid things". No, not
    everyone who drives a Tesla is a twat - but /some/ are, such as those
    that think their autopilot will drive the car without them paying attention.

    The term "autopilot" kind of suggests you can do precisely that,
    though.
    I don't like these half-baked ideas. Don't call a self-driving car
    that until it's *truly* and *safely* autonomous. The worst situation
    is giving the driver a false sense of security - which is where we are currently.

    In the Tesla "autopilot", "full self driving" and autonomous are three different things. Autopilot is what you can buy and use today. The information on it clearly says you are responsible for driving the vehicle and this is just an "assist". In
    particular, the car will sound an alarm if you don't maintain a detectable grip on the steering wheel. Full self driving (FSD) is a future product that a few are allowed to beta test. I've never seen anything that suggests you do not need to be in
    control of the car when using FSD. Tesla does not use the term autonomous other than when Musk is talking about the indefinite future.

    If you ignore what a company tells you about the limitations of its products, that's on you. It puts you in the category of people who sue ladder companies because they didn't tell you to not use it in a pig pen.

    --

    Rick C.

    -+-- Get 1,000 miles of free Supercharging
    -+-- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to spamjunk@blueyonder.co.uk on Mon Apr 4 19:02:02 2022
    On Mon, 4 Apr 2022 18:17:42 +0100, Tom Gardner
    <spamjunk@blueyonder.co.uk> wrote:

    I know this road well. The speed limit is 20mph, but during the
    day 5-10mph is typical.

    Given that, the depth to which the Tesla is buried is surprising.

    I wonder what will be determined to be the cause of the accident.
    (There are many electric scooters around there, some legal,
    many not)

    https://www.bristolpost.co.uk/news/bristol-news/tesla-goes-flying-window-shop-6905246

    Lucky *that* one didn't burst into flames!

    From your source, I see the eco-loons have been out and about
    'encouraging people' to give up their cars:

    https://www.bristolpost.co.uk/news/bristol-news/rolls-royce-filton-attack-e-6901425?int_source=nba

    All in the cause of a healthy climate, of course. :-/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tom Gardner@21:1/5 to All on Mon Apr 4 18:17:42 2022
    I know this road well. The speed limit is 20mph, but during the
    day 5-10mph is typical.

    Given that, the depth to which the Tesla is buried is surprising.

    I wonder what will be determined to be the cause of the accident.
    (There are many electric scooters around there, some legal,
    many not)

    https://www.bristolpost.co.uk/news/bristol-news/tesla-goes-flying-window-shop-6905246

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Flyguy@21:1/5 to Ricky on Mon Apr 4 22:10:49 2022
    On Monday, April 4, 2022 at 4:33:41 AM UTC-7, Ricky wrote:
    On Sunday, April 3, 2022 at 8:19:43 PM UTC-4, Cursitor Doom wrote:
    On Fri, 1 Apr 2022 18:59:33 +0200, David Brown
    <david...@hesbynett.no> wrote:

    On 01/04/2022 18:39, Ricky wrote:
    On Friday, April 1, 2022 at 10:12:26 AM UTC-4, David Brown wrote:
    On 01/04/2022 14:42, Ricky wrote:
    On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote: >>>>> On 31/03/22 23:39, David Brown wrote:
    On 01/04/2022 00:29, Ricky wrote:

    Sorry, that's not how an autopilot works. It doesn't fly the >>>>>>> plane. It simply maintains a heading and altitude.
    They have been doing more than that for for > 50 years. Cat 3b
    landings were in operation when I was a kid.
    Someone still has to be watching for other aircraft and
    otherwise flying the plane. In other words, the pilot is
    responsible for flying the plane, with or without the
    autopilot.

    Yes, that's the original idea of a plane autopilot. But modern >>>>>> ones are more sophisticated and handle course changes along the >>>>>> planned route, as well as being able to land automatically. And >>>>>> more important than what plane autopilots actually /do/, is what >>>>>> people /think/ they do - and remember we are talking about
    drivers that think their Tesla "autopilot" will drive their car >>>>>> while they watch a movie or nap in the back seat.
    And, to put it kindly, aren't discouraged in that misapprehension >>>>> by the statements of the cars' manufacturers and salesdroids.

    Now, what's the best set of techniques to get that concept into the >>>>> heads of twats that think "autopilot" means "it does it for me". >>>>
    That's Tom Gardner level misinformation. Comments about what people >>>> think are spurious and unsubstantiated. A class of "twats" can be >>>> invented that think anything. Nothing matters other than what Tesla >>>> owners think. They are the ones driving the cars.

    Are you suggesting that none of the people who drive Teslas are twats? >>> (Maybe that term is too British for you.)

    The term is too BS for me. A twat is whoever you want them to be. For all I know, you think everyone who drives a Tesla is a twat. How about if we discuss facts rather than BS?


    It means "a stupid person" or "someone who does stupid things". No, not >everyone who drives a Tesla is a twat - but /some/ are, such as those >that think their autopilot will drive the car without them paying attention.
    The term "autopilot" kind of suggests you can do precisely that,
    though.
    I don't like these half-baked ideas. Don't call a self-driving car
    that until it's *truly* and *safely* autonymous. The worst situation
    is giving the driver a false sense of security - which is where we are currently.
    In the Tesla "autopilot", "full self driving" and autonomous are three different things. Autopilot is what you can buy and use today. The information on it clearly says you are responsible for driving the vehicle and this is just an "assist". In
    particular, the car will sound an alarm if you don't maintain a detectable grip on the steering wheel. Full self driving (FSD) is a future product that a few are allowed to beta test. I've never seen anything that suggests you do not need to be in
    control of the car when using FSD. Tesla does not use the term autonomous other than when Musk is talking about the indefinite future.

    If you ignore what a company tells you about the limitations of its products, that's on you. It puts you in the category of people who sue ladder companies because they didn't tell you to not use it in a pig pen.

    --

    Rick C.

    -+-- Get 1,000 miles of free Supercharging
    -+-- Tesla referral code - https://ts.la/richard11209

    This falls into the category of: either it can do it or it can't, and the Tesla "autopilot" can't. Anything that is actively driving the car but ISN'T legally the driver is, by definition, a fraud.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)