‘Boycott Tesla’ ads to air during Super Bowl — “Tesla dances away from liability in Autopilot crashes by pointing to a note buried deep in the owner’s manual, that says Autopilot is only safe on fr…”

  • Geyser@lemmy.world · 8 months ago

    Take Tesla for whatever you will, but there’s a crazy conflict of interest behind the Dawn Project/Dan O’Dowd attacking this.

    “Green Hills also develops automotive software—it’s about 40 percent of the company’s business, O’Dowd said—and is a software supplier for the 2022 BMW iX EV crossover. This has caused O’Dowd critics and Tesla fans to call out The Dawn Project’s conflict of interest and question the organization’s motives.”

    https://www.motortrend.com/features/tesla-full-self-driving-ban-attempt-elon-musk-dan-odowd/

    • cm0002@lemmy.world · 8 months ago

      Def a conflict of interest, but they aren’t necessarily lying. I personally lost all trust in Tesla’s Autopilot the day Musk stated (paraphrasing) that “aLl wE NeED arE CAmEraS”. No thanks; a safety-critical system should have a fallback. Cameras fuck up a lot, and LiDAR has come down a lot in cost since then, but anybody who even halfway knows what they’re doing could have predicted that.

      And then all the things Musk’s done since just solidified that lol

      I am so ready to get a fully autonomous car, but not a fucking Tesla, that’s for damn sure. I’ll wait for some other manufacturer.

        • TheMinions@lemmy.world · 8 months ago

          I have used both. Even under ideal conditions, the cameras-only system is noticeably worse.

          I recall parking in a parking lot and the camera system showed I was hitting a car while I was still about a foot away from it. LiDAR never had any issues like that. And it was a perfectly sunny day.

        • TheGrandNagus@lemmy.world · 8 months ago

          They aren’t wrong. Camera systems do have a lot of issues. Calling someone stupid for pointing that out is absurd.

          Cameras can mess up when they’re dirty, when it’s raining, when it’s dark, when there’s glare.

          Image processing and recognition are also a hell of a lot slower and more computationally expensive than the direct range numbers you get from a LiDAR system.
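
          That difference is easy to see in code. Here’s a toy sketch (all numbers simulated; nothing here is a real Tesla or LiDAR API): a LiDAR return is already a distance, so finding the nearest obstacle is plain arithmetic, whereas a camera frame is just pixels that still need a detection and depth model run over them.

```python
import math

# Simulated LiDAR sweep: (angle_deg, range_m) pairs straight off the sensor.
# Finding the nearest obstacle ahead is plain arithmetic, no inference needed.
def nearest_ahead(points, fov_deg=30.0):
    ahead = [r for a, r in points if abs(a) <= fov_deg / 2]
    return min(ahead) if ahead else math.inf

sweep = [(-40.0, 12.1), (-10.0, 7.5), (0.0, 6.9), (15.0, 30.2), (60.0, 3.3)]
print(nearest_ahead(sweep))  # prints 6.9: a direct readout, not an estimate

# A camera frame, by contrast, has to go through a detector plus a
# depth/disparity model on every frame before you get a number like 6.9 m.
```

          The point isn’t the toy math; it’s that one sensor hands you distances and the other hands you an inference problem.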

            • TheGrandNagus@lemmy.world · 8 months ago

              Everybody is aware it’s not going to be flawless out of the gate, or perhaps ever. That’s not the point.

              They’re saying using cameras and only cameras is a bad idea and makes “Autopilot” inherently subpar in situations where cameras don’t work so well.

                • abcxyz@lemm.ee · 8 months ago

                  Never go full fanboy… try actually understanding what people are telling you: why fail-safes are a good idea, why redundant systems are implemented, etc.

                  Your anecdotal experience of “I drive in fog and rain and it’s been good” means nothing…

                • TheGrandNagus@lemmy.world · 8 months ago

                  “When there are vision issues the car complains”

                  Except for when it doesn’t and the car plows straight into the back of a lorry. Or runs over a teenager getting off a school bus. Etc.

        • batmaniam@lemmy.world · 8 months ago

          If I still have to pay attention, what my Hyundai has is fine. I think that’s the biggest issue with any FSD: there’s a sharp diminishing return past a certain point. If I can’t take a nap, I don’t care. And I’ll trust any of them the day there are laws on the books imposing a $7.5MM fine per casualty.

          Those cars make the manufacturer money while foisting the risk onto the public. People talk about stats compared to human drivers, but it’s not about stats; it’s about accountability.

        • cm0002@lemmy.world · 8 months ago (edited)

          First of all, Mr. Fanboy, Musk has been touting it as a Full Self-Driving program for a long time. They’ve somewhat stopped that recently, after regulators started looking into it, and you cite their website as if they wouldn’t change it lmao.

          You’re even commenting in a thread about an article on their deceptive practices with it.

          Second, it’s quite clear from my comment that I’m not interested in a “Driver Assist Program”; I’m interested in a fully autonomous vehicle. “Driver Assist” is a nice feature, but not one I’m going to buy a whole new car for. When Tesla eventually (if ever) comes out with their Full Full Self Driving, I won’t get it, because they’ve lost my trust with all these shenanigans they’re pulling now.

          Third, a life-and-death safety-critical autonomous system absolutely needs a second “sight” system. There isn’t even a cost excuse anymore, and disabling the existing sensors is asinine and potentially dangerous. Musk is just incapable of admitting when he’s wrong and revising his plan like an adult.

          Camera-only systems get things wrong all the time; they have gaps and scenarios where they just don’t work all that well. LiDAR does too, but in different situations, so their strength comes from fusing them together so each complements the other.

          Stop licking Elon’s boot and get a mind of your own.
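
          The complementary-sensors idea is simple enough to sketch. This is a toy illustration with made-up names and thresholds, not any real vehicle stack: trust whichever sensor is healthy, and when both are, take the closer (more dangerous) reading.

```python
def fused_obstacle_distance(camera_est, camera_conf, lidar_m, lidar_valid):
    """Toy sensor fusion: trust whichever sensor is healthy, and when both
    are, be conservative and take the closer (more dangerous) reading."""
    if lidar_valid and camera_conf >= 0.5:
        return min(camera_est, lidar_m)   # both healthy: worst case wins
    if lidar_valid:
        return lidar_m                    # camera blinded (glare, rain, dirt)
    if camera_conf >= 0.5:
        return camera_est                 # LiDAR degraded (e.g. heavy fog)
    return 0.0                            # neither trustworthy: assume danger

# Glare kills the camera (confidence 0.1) but LiDAR still reports 8.2 m:
print(fused_obstacle_distance(25.0, 0.1, 8.2, True))  # prints 8.2
```

          Either sensor alone fails in its own weather; the fused estimate only goes blind when both do at once.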

    • bionicjoey@lemmy.ca · 8 months ago

      Yeah, no way shitting on a company could be worth the price of a Super Bowl ad unless you’re also peddling something yourself.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 8 months ago

    Wasn’t there also an anti-privacy ad about Apple products being used to distribute CSAM? Rebecca Watson did a recent counterpoint to it. Privacy-invasive tech is not good when large social movements are seeking to purge “undesirables” from the public.

  • darganon@lemmy.world · 8 months ago

    I have a Tesla, they make it extremely clear that you are never supposed to touch autopilot under any circumstances, and you have to go way out of your way to prevent it from disengaging while you’re not paying attention.

    • Ghostalmedia@lemmy.world · 8 months ago

      When I was looking to buy a new car back in early 2019, I walked into a showroom for a final test drive before I threw some money down for a Model 3.

      It started to rain pretty hard on the drive back. During an auto lane change, the sensors freaked out because of the water interference and violently yanked the car back into the original lane halfway through the change. It hydroplaned a hair and scared the shit out of my wife and me. The Tesla employee assured us “it’s ok, this is normal.” Hearing that this was normal was not comforting.

      Upon returning to the showroom, a different Model 3 in the parking lot started backing toward a small child. My wife saw what was happening and threw herself in front of the car, which caused it to halt.

      I’m sure the software has progressed in the past 5 years, but suffice it to say, we changed our minds on the car at that time. Those two incidents within 15 minutes really made us question how that shit was legal.

      • Draupnir@lemmy.world · 8 months ago (edited)

        If the car was backing out, a human driver was in control, not Autopilot; Autopilot can only be enabled while driving on a well-marked roadway. The first part is plausible, however. The software at the time likely could not handle rain appropriately, and you are absolutely right to question it if they told you that was normal.

        • Ghostalmedia@lemmy.world · 8 months ago

          The car was being summoned from a parking space. Summon / Smart Summon will absolutely back out of a space fully autonomously.

      • Fisch@lemmy.ml · 8 months ago (edited)

        That’s the thing: it’s only legal in the US (as far as I know, at least). In Germany you’re only allowed to use self-driving if your hands are on the steering wheel at all times and you can take over if something goes wrong.

        • eltrain123@lemmy.world · 8 months ago

          That’s the case in the US, too. The car automatically shuts off Autopilot after 3 warnings about not keeping your hands on the steering wheel. It produces a loud audible alert after a few seconds if it senses the driver’s hands aren’t on the wheel, and after the 3rd warning, the alert continues until the driver takes back control.

          There are also several warnings about keeping your hands on the wheel and staying alert when engaging autopilot.

          The people saying otherwise are either ignorant or disingenuous.
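
          The escalation described above (nag, audible alert, then lockout after the third strike) can be sketched as a tiny state machine. This is only an illustration of the comment’s description, not Tesla’s actual software; all names and return strings are made up.

```python
class AttentionMonitor:
    """Toy model of the described nag escalation: each hands-off detection
    issues a warning; after the third, disengage until the driver takes over."""
    MAX_WARNINGS = 3

    def __init__(self):
        self.warnings = 0
        self.engaged = True

    def hands_off_detected(self):
        if not self.engaged:
            return "disengaged"
        self.warnings += 1
        if self.warnings >= self.MAX_WARNINGS:
            self.engaged = False
            return "alert until driver takes over"
        return f"audible warning {self.warnings}/{self.MAX_WARNINGS}"

m = AttentionMonitor()
print(m.hands_off_detected())  # prints: audible warning 1/3
print(m.hands_off_detected())  # prints: audible warning 2/3
print(m.hands_off_detected())  # prints: alert until driver takes over
```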

        • c0m47053@feddit.uk · 8 months ago

          I’m pretty sure that is also the case in the US. These incidents are either caused by some sort of defeat device (I have seen weights that wrap around the steering wheel; no idea if they work) or by people who have just gotten good at resting a hand on the wheel while not paying attention, I think.

          • Fisch@lemmy.ml · 8 months ago

            I thought Tesla just added that by choice and not because it’s required by law

      • JasSmith@sh.itjust.works · 8 months ago

        These errors are obviously alarming, but all the evidence we have suggests they’re still safer than human drivers. They will make mistakes, and sometimes those mistakes will cost lives, but they will make fewer mistakes than humans. Given that, as visceral as these stories feel, I think our ire is misplaced. Automated driving will never be perfect; if that’s the bar we’re aiming for, we should just give up and go home. The goal is better than humans, and in many conditions it’s already there.

    • Draupnir@lemmy.world · 8 months ago

      I’m sorry, never supposed to touch Autopilot? Under any circumstances? Other than that, yeah, 100%: if it detects at all that you are not paying attention, via the wheel nag or the eye-tracking camera, it will alarm and disengage, potentially banning the user from using it for a short time. They make this outcome very clear.

    • asdfasdfasdf@lemmy.world · 8 months ago

      This is the opposite of reality. They make it extremely clear that it’s a beta, that you need to always keep your eyes on the road in case you need to take over, and that you should take over whenever you feel you need to.

    • Bell@lemmy.world · 8 months ago

      Of course they do, but everyone here is so filled with hate for Elon that they cannot consider that the car might be reasonable.