c-linkage 2 years ago

I have a 2020 Honda Civic sans ultrasonic sensors but with a front camera for collision avoidance, and I can count on two hands the number of phantom braking events I've had in two years. All it takes is a little rain and the sun at the right angle for the system to be fooled into thinking a wall lies dead ahead.

I consider myself lucky that each time there was no one behind me.

While the car gives me the option to turn off the feature, I leave it on, because if I get into an accident with it off I am sure the insurance companies would have more leverage to deny my claims and find me at fault.

  • parker_mountain 2 years ago

    I have a 2020 BMW with the same features. I have never had an issue in over 30,000 miles of driving. My previous-generation BMW, also similarly equipped, did not have these issues. It did trigger the rear collision alert a handful of times, although all that does is tighten the seatbelts and beep (I think).

    So, these features can work reliably, and I don't think using a poor implementation is useful in this conversation.

    The reality is that Tesla has removed sensors from their cars, which can only decrease the information the car has to work with.

    • delta_p_delta_x 2 years ago

      > 2020 BMW

      I generally find that the luxury European makes (BMW, Mercedes-Benz, Audi, Porsche) have more well-put-together cars than Tesla does, despite all of Tesla's posturing about being a 'premium car company'. I heard it put quite succinctly: 'Tesla is a software company that decided to build cars'.

      Not to mention, it is an American software company that decided to build cars in the US, and the person I spoke to didn't have a very good opinion of American makes in general (I concur).

      Tesla was at the forefront of EV innovation about 5-7 years ago, but today, I'd much rather buy an EQE, an i4, or an e-tron (or if I can ever afford it, a Taycan Turbo S).

      • sshine 2 years ago

        > I'd much rather buy an EQE, an i4, or an e-tron (or if I can ever afford it, a Taycan Turbo S).

        A comparison of what it would cost me locally to get these cars (base cost):

        Tesla Model 3: From $55k, 455 km range

        Tesla Model Y: From $58k, 488 km range

        BMW i4: From $65k, 470 km range

        Mercedes-Benz EQE: From $85k, 525 km range

        Audi e-tron: From $131k, 365 km range

        Porsche Taycan Turbo S: From $276k, 560 km range

        ---

        Of these, the e-tron is overpriced, even compared to non-electric Audis.

        I'd be happy to own any of these. But you're mentioning cars that are 18-38% more expensive (disregarding the Taycan; the Taycan should be compared to the Tesla Roadster: $211k, 997 km range).

        The Tesla selling point for me is a highly competitive range and the most beautiful interior (subjective, but alternatively: most simple) at the lowest cost among these luxury electric cars.

        > Tesla is a software company that decided to build cars

        Companies that realise they're a software company when software is at their core have an edge.

        Insurance companies, for example, would be wise to do the same.

        • delta_p_delta_x 2 years ago

          > what it would cost

          After having test-driven and been driven in a number of cars, I have come to the conclusion that one generally gets what one pays for when it comes to cars.

          > most beautiful interior (subjective, but alternatively: most simple)

          I like traditional interiors with knobs and buttons for AV and AC controls, and the singular giant iPad-esque display on Teslas is a complete deal-breaker for me.

          > range

          I live in a dense city-state: range is a moot discussion when I have a petrol station with EV chargers every 2-3 km.

          At any rate, I get your point about the alternatives to the Model 3 being more expensive with less range, but that is a sacrifice I am willing to make for drastically improved quality control and (in my opinion) a drastically better driving and passenger experience.

          • sshine 2 years ago

            > I live in a dense city-state: range is a moot discussion when I have a petrol station with EV chargers every 2-3 km.

            That's a completely fair point.

            But if we're not comparing high-range EVs, why not go with a Renault Zoe ($39k, 200 km range) or the like?

            I've used Zoes in city environments, and the newer models are overwhelmingly pleasant. For example, they cover the interior panels with fabric instead of leather/pleather to create a very nice atmosphere at no cost. Their small size makes them easier to park, and you can't go fast in cities anyway.

            > the alternatives to the Model 3 being more expensive with less range

            I didn't make that point (except comparing the e-tron).

            I also prefer knobs and buttons. But I prefer fewer of them.

            The simplicity of design is not just about controls, but about visual clutter: the number of lines created by overlapping panels of different materials. Classic car brands have a legacy to represent, so new Audis feel like old Audis on the inside. Tesla might've taken it a bit too far, hiding things behind a screen and creating a maze of non-tactile menu systems instead.

        • hulitu 2 years ago

          > Companies that realise they're a software company when software is at their core have an edge.

          SW for automotive is a different beast than SW for Windows or Linux.

        • ianai 2 years ago

          You shouldn’t compare the Roadster to anything because it doesn’t exist yet. It’s at best unfair, and ultimately the definition of unrealistic until Roadsters are delivered.

    • heliodor 2 years ago

      That's neat that your car has a rear collision alert. It's actually useful, and if you have the presence of mind to react quickly, you should put your head back against the headrest to prevent whiplash.

      • ianai 2 years ago

        And roll forward if clear/safe. Had the presence of mind for that once and it haunts me to this day.

  • iosjunkie 2 years ago

    Someone close to me was in an accident that could have been avoided with sensors other than a camera. The camera-only collision mitigation system was blinded by direct sunlight, complicated by glare off of fresh snow. There was zero indication the system was compromised until it failed to provide any warning or active braking.

    They vowed to never own a car with only camera-based collision mitigation again.

    • pajko 2 years ago

      This. Boeing removed a sensor it deemed redundant for cost effectiveness, but failed to adapt the system to the change and skipped training the pilots on it.

      https://www.nytimes.com/2019/06/01/business/boeing-737-max-c...

      • dijonman2 2 years ago

        How is this even remotely related to the topic? You just want to hate Boeing?

        • epse 2 years ago

          I think because it's a similar story that illustrates the danger of removing redundant sensors, like Tesla is doing here.

    • NullPrefix 2 years ago

      What was the driver doing?

      • cpsns 2 years ago

        That’s not really relevant: safety systems should be expected to work reliably, or to report to the driver when they can’t (something my car does if it's covered in snow).

    • gzer0 2 years ago

      When radar and vision disagree, which one do you believe? Vision has much more precision, so better to double down on vision than do sensor fusion.

      • Volundr 2 years ago

        The solution to information from two sources disagreeing is rarely to just throw away half of your data.

      • parker_mountain 2 years ago

        In the event of a blinded vision system, such as sun glare, the vision system should be able to recognize its reduced capacity. That involves warning the driver and relying more heavily on radar/sensor fusion.

      • TooKool4This 2 years ago

        Like others have mentioned, the entire field of sensor fusion deals with this problem. It is a very challenging problem to solve, but it can be solved, and it has been successfully used in spacecraft, aircraft, fighter jets, phones, AR/VR systems, and undoubtedly many others.

        A basic approach is to have an uncertainty (or estimated uncertainty) for each of the sensing modalities. Then you use the uncertainties to weight each sensing sample when deriving your estimated quantity (vehicle velocity, for example), as in the sketch at the end of this comment. Assuming the uncertainties are correct, the resulting estimator can have variance lower than estimators derived from a single sensor modality. Of course, tuning sensor uncertainty values is a difficult problem in sensor fusion (and much more so when distributions are unknown), but it is definitely doable.

        Elon's repeated claim that sensor fusion is impossible/not doable is entirely wrong. It is a technology that powers many different applications, but it's definitely not an easy thing to implement well.
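
        As a rough illustration (a minimal sketch with made-up numbers, not any production pipeline), inverse-variance weighting is the simplest form of this kind of fusion:

            # Fuse independent estimates of the same quantity (e.g. range to an obstacle)
            # by weighting each one with the inverse of its variance.
            def fuse(estimates):
                """estimates: list of (value, variance) pairs from independent sensors."""
                weights = [1.0 / var for _, var in estimates]
                value = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
                variance = 1.0 / sum(weights)   # never worse than the best single sensor
                return value, variance

            camera = (24.0, 0.25)   # metres, variance in m^2: precise in good light
            radar = (21.5, 4.0)     # noisier, but unaffected by glare
            print(fuse([camera, radar]))          # ~23.9 m, the camera dominates
            # If glare is detected, inflate the camera's variance instead of
            # discarding the radar; the estimate then leans on the radar:
            print(fuse([(24.0, 100.0), radar]))   # ~21.6 m

        The hard part in practice is getting those variances right, and knowing when a sensor should be distrusted, not the fusion arithmetic itself.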

      • AlotOfReading 2 years ago

        There's a huge amount of literature on how to do sensor fusion in the presence of misbehaving or disagreeing sensors.

        Moreover, production vehicles don't get entire classes of redundant sensors when the engineers think they can do without them. The cost optimization and haggling that goes into production vehicles is insane.

      • hulitu 2 years ago

        Depends on your safety goal.

  • frxx 2 years ago

    Are you perchance confusing ultrasonic sensors with the front-mounted radar? In general, collision avoidance systems use either a camera or radar, or both. Some manufacturers are trying to add lidar into the mix these days. Ultrasonic sensors are generally just used for parking and don't reach much further than maybe 2 meters. (Source: I work in this space.)

  • me_me_me 2 years ago

    It's all fun and games until you start hitting corner cases.

  • AtomicOrbital 2 years ago

    The car no doubt first confirms nobody is tailgating before such an abrupt action.

    • pintxo 2 years ago

      So that you don’t brake in front of a wall if there is another car behind you? Not sure this is a smart move.

    • ntr-- 2 years ago

      With what? A rear-facing camera that runs the same object detection software?

      • mc32 2 years ago

        Likely it would not be affected by the same light artifacts, given that it would not be pointing in the same direction.

r00tanon 2 years ago

Interesting that summon and auto park will "not be available and will be restored once vision gets parity with the sensors..."

Wow. Wondering when that will happen, as I still get phantom braking in odd situations with my 2022 Model 3. Case in point: a freeway straightaway with no other vehicles and nothing but grass fields and gray sky. The car suddenly brakes as it tops a gentle hill.

kidme5 2 years ago

Besides supply chain issues, the CyberTruck won't have them because it's all stainless steel and they won't integrate into the body easily.

Seems like this is a forcing function preparing for that.

  • arcticbull 2 years ago

    Also, I suspect they won't work when the CyberTruck is "temporarily serving as a boat" - although maybe that's a good opportunity to throw on a sonar? [edit] Maybe even some kind of fish-finder.

    • olivermarks 2 years ago

      The new Tesla Vaporware Detector will ensure that you are warned if the stainless steel 'cybertruck' ever actually makes it to production, along with the sports car and all the other late-night-TV 'but wait, there's more' pitches that Musk enthusiasts seem to find so compelling.

ilrwbwrkhv 2 years ago

They can do whatever they want. I've moved away from the Tesla now, to a Porsche and a Rivian. The Model S was a great car for its time, but now others have caught up.

  • sinclairX86 2 years ago

    I just bought a Model Y.

    For their price, Teslas are great.

    I believe it will feed my car FOMO for 6-10 years, and my family's car needs for longer.

    You can get a nicer-looking EV than a Tesla on the exterior.

    Only Porsche competes on interior and exterior at 5x the price.

    • bmitc 2 years ago

      Have you not seen what Kia and Hyundai are doing?

      • parkingrift 2 years ago

        I have. The Kia EV6 has perhaps the most unintuitive capacitive UX in vehicle history. Linus took it for a test drive and legitimately could not figure out how to turn down the AC temperature. Maybe the EV7 won’t be such an abomination.

        Can’t speak to the Hyundai version. I don’t know much about it other than that it seems trivially easy to steal. It hasn’t made my news radar outside of that.

      • somehnguy 2 years ago

        Not equipping their cars with immobilizers, so anybody with a screwdriver can steal them?

        I kid, but only sorta. I really liked the Ioniq but seeing that whole mess gives me pause on both Kia and Hyundai.

micheljansen 2 years ago

Great, the ultrasonic sensors are about the one thing that works flawlessly on my 2019 Model 3. I hope they don’t get taken away in a future software update like what happened to radar.

I bought this car expecting it to get better with every software update. For a while that was true (auto wipers are much better now, though still mediocre). Lately not so much.

  • termy 2 years ago

    Technology will always let you down.

beej71 2 years ago

I want my car to be bristling with all kinds of sensors. Especially with FSD. I want it to be superhuman. I don't want it to suck as bad as I do at seeing deer at night.

We have the technology.

kkfx 2 years ago

Ladies and gentlemen, the Real World®©™ is complex... Cameras? How well can their dynamic range handle direct sunlight, water/snow/something white and big, or glass reflections? How about a large leaf flying past the stereo camera? Ultrasound? How does it perform in heavy rain or with high environmental background noise? Radar? Same for heavy rain, barely reflective materials, etc. Long story short: NO SENSOR works well enough in all possible circumstances.

That's why the final decision must be human: human awareness of the surrounding physical world can hopefully correct perception errors of both the sensors and the human senses themselves. So far, IMVHO, that's the best "tool" we can choose. Normally, cars should not be driven so fast in such tight environments that quickness alone can't be enough, and in general, IME, no sensor is really quicker in practice.

Doing more? Well, then we need to change roads, not cars, and that's HYPER expensive. For instance: embedding a small metallic band under lanes that radar can detect, to give sensors something surely and precisely visible; using reflective powder in temporary lanes so the different response can be recognized as "that's the right one, ignore the other"; adding small plastic poles with reflective material alongside the road at regular intervals. And that's just the least expensive part.

For traffic, the only reliable and potentially affordable option is vehicle-to-vehicle communication: every moving vehicle emits a beacon that any other can detect, and parked vehicles can wake up and respond to the beacon as well. That's far from perfect, since only new vehicles would have it, but it's also the best solution we have. The beacon broadcasts the size and speed of the vehicle, so trajectory analysis is possible.

More precise, and somewhat more expensive (roughly +300€ per vehicle, I think), would be adding a small optical gyro so the car also knows its true heading and can compute more precise trajectories. In the same spirit, a few beacons embedded in the road itself, activated by passing vehicles, could provide an accurate GPS position, allowing navigation to roughly 30-50 cm precision from local direction (from the gyro) + distance + map.

All of these systems are complex, but they are scalable and not so expensive at scale as to be undoable; the real question is whether they are worth it, or whether we instead decide to go to a "flying" future for all light vehicles...
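
To make the beacon idea concrete, here is a minimal sketch of what such a vehicle-to-vehicle message might carry and how a receiving car could run a crude trajectory-conflict check. All fields, units, and thresholds are hypothetical, not from any real V2V standard:

    import math
    from dataclasses import dataclass

    @dataclass
    class Beacon:            # hypothetical V2V message, not an actual standard
        vehicle_id: str
        x: float             # position in a shared local frame, metres
        y: float
        heading: float       # radians; more accurate with an optical gyro on board
        speed: float         # metres per second
        length: float        # vehicle size, metres

    def predict(b, t):
        """Straight-line position of the sender t seconds from now."""
        return (b.x + b.speed * t * math.cos(b.heading),
                b.y + b.speed * t * math.sin(b.heading))

    def conflict(own, other, horizon=5.0, step=0.1, margin=2.0):
        """True if the predicted paths come within both half-lengths plus a margin."""
        t = 0.0
        while t <= horizon:
            (x1, y1), (x2, y2) = predict(own, t), predict(other, t)
            if math.hypot(x1 - x2, y1 - y2) < (own.length + other.length) / 2 + margin:
                return True
            t += step
        return False

A real deployment would of course also need a shared coordinate frame, clock synchronisation, and a secure radio layer, which is where most of the cost and complexity would live.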

  • me_me_me 2 years ago

    >That's why the last decision must be human

    Can't be: it's either a human driver with assist, or full self-driving.

    Sadly, due to human nature, we can't have both.

    In studies, people who are only supposed to supervise the driving computer get bored and distracted very easily (non-active driving), making their reaction times sky-high (up to 5 seconds). Human drivers need to be actively engaged in the process of driving in order to stay alert.

    • kkfx 2 years ago

      Then IMVHO full self-driving must be disabled on roads that are not clear or not equipped to help FSD... A phantom brake in dense traffic does not do much harm, at most leading to a minor incident. One on a highway is a different thing.

senectus1 2 years ago

Surely the visual spectrum is one of the most unreliable ways to "see" the world (in a computing sense).

  • lern_too_spel 2 years ago

    If you have a passive imaging system, the visual spectrum is pretty good for survival. If you have the budget to emit your own radiation as well, you would obviously use something else to get additional information about your environment.

  • rlt 2 years ago

    Then why did we evolve to only be able to see the visual spectrum?

    • vlovich123 2 years ago

      Really?

      A) our visual system is several orders of magnitude better than the best CV systems (both the optical sensor and the wetware that does object recognition, focus, etc.). That doesn’t mean the artificial systems we build can get away with just one modality

      B) our visual system works on coarse approximations with high latency. Radio vision systems operate with far more accuracy / precision and lower latency

      C) our visual system fails in inclement conditions (heavy rain, fog, night). Radio vision systems complement by filling that gap

      D) sensor fusion done well should strictly outperform any single sensor

      E) we evolved for the visual spectrum because of our history of reproductive selection as a species. This in no way says anything about the optimal artificial system design

      F) our visual system fails ALL the friggin time. Remember “check your blind spot”? Our visual system has major blood vessels running in front of the retina, which is unnecessary, but there’s no evolutionary pressure on us to fix that because our brain compensates for the defect.

      G) the goal for self-driving systems is to be not just on par with humans, but several orders of magnitude safer, with 24/7 uptime and driving. We have no evidence to suggest that’s possible with just vision.

    • delta_p_delta_x 2 years ago

      Real answer: the Sun emits most of its light in the visual region. By Wien's displacement law[0], the wavelength emitted with the most intensity given the Sun's 5778 K surface temperature is 501 nm, which is bang-smack in the middle of the visual spectrum, and is a light greenish-blue.

      Our eyes are better adapted to around 530-550 nm (which is more precisely green) for two reasons: first, we evolved from fish, which lived in the ocean, where blue light is scattered (and thus green-red sensitivity is more important); and second, the atmosphere also scatters blue light, making sunlight redder. (A quick numeric check of the 501 nm figure follows below.)

      [0]: https://en.wikipedia.org/wiki/Wien%27s_displacement_law
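
      For the curious, a quick sanity check of that 501 nm figure in Python (nothing Tesla-specific, just Wien's law with the numbers above):

          # Peak emission wavelength from Wien's displacement law: lambda_max = b / T
          b = 2.897771955e-3       # Wien's displacement constant, in metre-kelvins
          T_sun = 5778.0           # effective surface temperature of the Sun, in kelvins
          print(b / T_sun * 1e9)   # ~501.5, i.e. roughly 501 nanometres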

    • steve_adams_86 2 years ago

      Because it works well enough. Our vision system is full of weird illusions and nonsense though. If you burn part of your retina, you won’t even know unless it’s severe enough. Your brain will “fill in the gap” as long as it can. Is that what you want driving your car?

      • me_me_me 2 years ago

        This is the most banal and boring answer, and also the one most likely to be correct.

        A simple example of good-enough design is our eyes: they evolved inside out (we have nerves in front of the retina instead of behind it), and our brain just fills in the gaps, but it's good enough to see lions in the bushes. And so we are stuck with it.

        If black-and-white had been good enough for survival, we would have black-and-white vision.

        • steve_adams_86 2 years ago

          In defense of camera based vision systems, they can be more objective and consistent than humans. They won’t go partially blind without realizing it, for example.

          There are examples of them catching things humans sometimes miss, so it isn’t a total bust. At the same time, though, I have no idea how you get past the issues of blowouts obscuring the rest of the visual field, debris blocking the light, instances where depth is imperceptible, etc.

          • me_me_me 2 years ago

            I don't think you can jump over that hurdle without actual software intelligence.

            If a driver gets flashed by light or encounters something unexpected, they rely on their past experience and a projection of the near future to extrapolate how to drive despite the lack of visual information.

            If you closed your eyes, you'd be able to drive for a few seconds. If you see a car disappear behind a building, you still know that it will arrive at the intersection, and you will act like you can 'see' it.

            In my opinion, the biggest problem is mimicking human driving. We work from incomplete information with imprecise actions, and we balance it out by being highly adaptive: we extrapolate information from the environment and, based on past experience, predict future outcomes seamlessly (though not always correctly). This is everything computers are currently bad at.

    • chatterhead 2 years ago

      Maybe because we don't need to see the other spectrums; perhaps we have a system for interpreting waves inside our brain that we haven't realized is there yet.

      Waves are waves, and we would have adapted to them naturally regardless of reproductive evolution; chemical and signal evolution are real, too.

      • senectus1 2 years ago

        Not only that, but we DO "see" using other spectrums... feeling, hearing, etc.

        But yeah, we didn't evolve as multi-ton machines hurtling around at immense speeds, building up scary amounts of kinetic energy. We're soft, fleshy, slow creatures.

    • CydeWeys 2 years ago

      Because we're people, not cars?

peteradio 2 years ago

I heard they are getting new sensors that detect a collision via interference with Musk's giant ego.

atkailash 2 years ago

This sounds like they basically want to beta test in real life. I get training models, but on a mass scale like this, with lives potentially at stake, it doesn’t seem right.