keeda 8 hours ago

I strongly believe LIDAR is the way to go and that Elon's vision-only move was extremely "short-sighted" (heheh). There are many reasons, but the one that drives it home for me multiple times a week is that my Tesla's wipers will randomly sweep the windshield for absolutely no reason.

This is because the vision system thinks there is something obstructing its view when in reality it is usually bright sunlight -- and sometimes, absolutely nothing that I can see.

The wipers are, of course, the most harmless way this goes wrong. The more dangerous type is when it phantom-brakes at highway speeds with no warning on a clear road and a clear day. I've had multiple other scary incidents of different types (swerving back and forth at exits is a fun one), but phantom braking is the one that happens quasi-regularly. Twice when another car was right behind me.

As an engineer, this tells me volumes about what's going on in the computer vision system, and it's pretty scary. Basically, the system detects patterns that it infers as its vision being obstructed, and so it is programmed to brush away some (non-existent) debris. Like, it thinks there could be a physical object where there is none. If this were an LLM you would call it a hallucination.

But if it's hallucinating crud on a windshield, it can also hallucinate objects on the road. And it could be doing it every so often! So maybe there are filters to disregard unlikely objects as irrelevant, which act as guardrails against random braking. And those filters are pretty damn good -- I mean, the technology is impressive -- but they can probabilistically fail, resulting in things that we've already seen, such as phantom braking, or worse, driving through actual things.
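
That guardrail idea can be sketched as a simple confidence filter. Everything below is hypothetical -- invented labels, scores, and threshold, not anything from Tesla's actual pipeline -- just to show how such a filter fails probabilistically in both directions:

```python
# Hypothetical sketch of a detection filter acting as a guardrail.
# All labels, scores, and the threshold are invented for illustration.

def guardrail(detections, threshold=0.9):
    """Drop low-confidence detections so the planner doesn't brake for
    hallucinated debris. This fails in both directions: a hallucination
    with a spuriously high score still triggers braking (phantom brake),
    and a real object scored too low gets driven through."""
    return [d for d in detections if d["score"] >= threshold]

detections = [
    {"label": "glare artifact", "score": 0.31},    # correctly filtered out
    {"label": "phantom obstacle", "score": 0.97},  # slips through -> phantom brake
]
actionable = guardrail(detections)  # only the phantom obstacle remains
```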

This raises so many questions: What other things is it hallucinating? And how many hardcoded guardrails are in place against these edge cases? And what else can it hallucinate against which there are no guardrails yet?

And why not just use LIDAR that can literally see around corners in 3D?

  • jqpabc123 11 minutes ago

    Engineering reliability is primarily achieved through redundancy.

    There is none with Musk's "vision only" approach. Vision can fail for a multitude of reasons -- sunlight, rain, darkness, bad road markers, even glare from a dirty windshield. And when it fails, there is no backup plan -- the car is effectively driving blind.

    Driving is a dynamic activity that involves a lot more than just vision. Safe automated driving can use all the help it can get.

  • CjHuber 3 hours ago

    Just imagine Tesla had subsidized passive LIDAR on every car they ship to collect data. Wow, that dataset would be crazy, and it would even improve their vision models by having ground truth to train on. He’s such a moron

    • wombat-man 2 hours ago

      I think LIDAR was, and maybe still is, way more expensive. Initially units ran around $75k. Now they're more around $10k, which is better.

      • kibwen an hour ago

        This is off by orders of magnitude. BYD is buying LIDAR units for their cars for $140.

        • onlyrealcuzzo a minute ago

          That's likely closer to reality now, but that's not counting the cost for R&D to add it to the car, any additional costs that come with it besides the LIDAR hardware, plus the added cost to install it.

          All of that combined is probably closer to $1k than to $140.

          And, again, that's - what - 10 years after Tesla originally made the decision to go vision only.

          It wasn't a terrible idea at the time, but they should've pivoted at some point.

          They could've had a massive lead in data if they pivoted as late as 3 years ago, when the total cost would probably be under $2.5k, and that could've led to a positive feedback loop, cause they'd probably have a system better than Waymo by now.

          Instead, they've got a pile of garbage, and no path to improve it substantially.

  • chippiewill 3 hours ago

    As someone who worked in this space, you are absolutely right, but also kind of wrong - at least in my opinion.

    The cold hard truth is that LIDARs are a crutch; they're not strictly necessary. We know this because humans can drive without a LIDAR. However, they are a super useful crutch: they give you super high positional accuracy (something that's not always easy to estimate in a vision-only system). Radars are also a super useful crutch because they give really good radial velocity. (Little anecdote: when we finally got the radars working properly at work, it made a massive difference to our car's ability to follow other cars comfortably, i.e. ACC.)

    Yes, machine learning vision systems hallucinate, but so do humans. The trick for Tesla would be to get it good enough that it hallucinates less than humans do (they're nowhere near yet -- humans don't hallucinate very often).

    It's also worth adding that, last I checked, the state of the art for object detection is early fusion, where you chuck the LIDAR and radar point clouds into a neural net along with the camera input -- so it's not like you'd necessarily get the classical-method guardrails from the LIDAR anyway.

    Anyway, I don't think Tesla were wrong not to use LIDAR -- they had good reasons not to go down that route. LIDARs were excessively expensive, and the old-style spinning LIDARs were not robust. You could not have sold them on a production car in 2018. Vision systems were improving a lot back then, so the idea that you could have FSD on vision alone was plausible.

    • raincole 2 hours ago

      > The cold hard truth is that LIDARs are a crutch

      The hard truth is there is no reason to limit machines to only the tools humans are biologically born with. Cars always have crutches that humans don't possess. For example, wheels.

      • profunctor an hour ago

        The reason is cost, LIDAR is expensive.

        • throwaway31131 an hour ago

          Cost is relative. LIDAR may be expensive relative to a camera or two, but it's very inexpensive compared to hiring a full-time driver. Crashes aren't particularly cheap either. Neither are insurance premiums.

        • kibwen an hour ago

          This information is out of date. LIDAR costs are 10x less than they were a decade ago, and still falling.

          Turns out, when there's demand for LIDAR in this form factor, people invest in R&D to drive costs down and set up manufacturing facilities to achieve economies of scale. Wow, who could have predicted this‽

        • DennisP 30 minutes ago

          Huawei has a self-driving system that uses three lidars, which cost $250 each (plus vision, radar, and ultrasound). It appears to work about as well as FSD. Here's the Out of Spec guys riding around on it in China for an hour:

          https://www.youtube.com/watch?v=VuDSz06BT2g

        • ModernMech 5 minutes ago

          You know what used to be expensive? Cameras. Then people started manufacturing them for the mass market and cost went down.

          You know what else used to be expensive? Structured light sensors. They cost $$$$ in 2009. Then Microsoft started manufacturing the Kinect for a mass market, and in 2010 price went down to $150.

          You know what's happened to LIDAR in the past decade? You guessed it, costs have come massively down because car manufacturers started buying more, and costs will continue to come down as they reach mass market adoption.

          The prohibitive cost for LIDAR coming down was always just a matter of time. A "visionary" like Musk should have been able to see that. Instead he thought he could outsmart everyone by using a technology that was not suited for the job, but he made the wrong bet.

        • uoaei an hour ago

          That's ok, they're supposed to be. That's no excuse to rush a bad job.

          • revnode an hour ago

            The point of engineering is to make something that’s economically viable, not to slap together something that works. Making something that works is easy, making something that works and can be sold at scale is hard.

            • waldarbeiter an hour ago

              If it were easy, there would already be a car costing a few million, affordable to few, that had solved AD. But there isn't.

              • revnode 44 minutes ago

                There is no market for such a thing. At that price point, you get a personal chauffeur. That's what rich people do, and a chauffeur can do stuff that a self-driving system never can.

            • uoaei 40 minutes ago

              That's not engineering, that's industry. It's important to distinguish the two.

              • revnode 17 minutes ago

                Engineering only exists within industry. Everything else is a hobby.

      • dcchambers an hour ago

        Exactly.

        In a true self-driving utopia, all of the cars are using multiple methods to observe the road and drive (vision, lidar, GPS, etc) AND they are all communicating with each other silently, constantly, about their intentions and status.

        Why limit cars to what humans can do?

    • hudon 2 hours ago

      > they're not strictly necessary. We know this because humans can drive without a LIDAR

      and propellers on a plane are not strictly necessary because birds can fly without them? The history of machines shows that while nature can sometimes inspire the _what_ of a machine, it is a very bad source of inspiration for the _how_.

    • goalieca 3 hours ago

      > The cold hard truth is that LIDARs are a crutch, they're not strictly necessary. We know this because humans can drive without a LIDAR, however they are a super useful crutch.

      Crutch for what? AI does not have human intelligence yet and let’s stop pretending it does. There is no shame in that as the word crutch implies.

      • spot5010 2 hours ago

        I've never understood the argument against lidars (except cost, but even that you can argue can come down).

        If a sensor provides additional data, why not use it? Sure, humans can drive without lidars, but why limit the AI to using human-like sensors?

        Why even call it a crutch? IMO It's an advantage over human sensors.

        • bayindirh 2 hours ago

          > Sure, humans can drive without LIDARs...

          That's because our stereoscopic vision has vastly more dynamic range, focusing speed, and processing power than a computer vision system. Peripheral vision is very good at detecting movement, and central vision can process a tremendous amount of visual data without even trying.

          Even a state-of-the-art professional action camera system can't rival our eyes in any of these categories. LIDARs and RADARs are useful and should be present in any car.

          This is the top reason I'm not considering a Tesla. Brain dead insistence on cameras with small sensors only.

        • IgorPartola 2 hours ago

          I don’t work in this field so take the grain of salt first.

          Quality of additional data matters. How often does a particular sensor give you false positives and false negatives? What do you do when sensor A contradicts sensor B?

          “3.6 roentgen, not great, not terrible.”

          • giveita an hour ago

            You can say that about human hearing and balance. What if they conflict with visual? We are good at figuring it out.

            • ben_w 7 minutes ago

              We throw up, an evolved response because that conflict is a symptom of poisonous plants messing with us.

      • lazide 2 hours ago

        I think they meant crutch for the AI so they could pretend for investors that AGI is right around the corner haha

    • jfim 2 hours ago

      LIDARs have the advantage that they allow detecting solid objects that have not been detected by a vision-only system. For example, some time ago, a Tesla crashed into an overturned truck, likely because it didn't detect it as an obstacle.

      A system based only on cameras is only as good as its ability to recognize all road hazards, with no fallback if that fails. With LIDAR, the vehicle might not know what the solid object in front of it is, but it knows that it's there and should avoid running into it.

      • sandworm101 2 hours ago

        Solid objects that aren't too dark or too shiny. LIDAR is very bad at detecting mirrored surfaces or non-reflecting structures that absorb the particular frequency in use. The back ends of trucks hauling liquid are particularly bad. Block out the bumper/wheels, say by a slight hill, and that polished cone is invisible to LIDAR.

        • bayindirh an hour ago

          Add one or a couple of RADAR(s), too. European cars use this one weird trick to enable tons of features without harming people or cars.

        • UltraSane 39 minutes ago

          LIDAR works by measuring the time it takes for light to return, so I don't understand how an object can be too reflective. Objects that absorb the specific wavelength the LIDAR uses are an obvious problem.

          • sandworm101 11 minutes ago

            Too reflective, like a flat mirror, will send the light off in a random direction rather than back at the detector. Worse yet, things like double reflections can result in timing errors, as some of the signal follows a longer path.
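
            To make the timing-error point concrete, here's a toy time-of-flight calculation (all numbers invented): a double bounce lengthens the light path, so the sensor reports the surface farther away than it really is.

```python
# Toy time-of-flight ranging example; distances are made up.
C = 299_792_458.0  # speed of light in m/s

def lidar_range_m(round_trip_s):
    # Time-of-flight ranging: distance = c * t / 2
    return C * round_trip_s / 2

target_m = 30.0
t_direct = 2 * target_m / C           # direct return from the target
t_bounced = 2 * (target_m + 8.0) / C  # specular double bounce adds 8 m each way

direct = lidar_range_m(t_direct)      # 30.0 m, correct
bounced = lidar_range_m(t_bounced)    # 38.0 m, target reported too far away
```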

    • DennisP 24 minutes ago

      You might not have the classical guardrails, but you are providing the neural net with a lot more information. Even humans are starting to find it useful to get inputs from other sensor types in their cars.

      I agree that Tesla may have made the right hardware decision when they started with this. It was probably a bad idea to lock themselves into that path by over-promising.

    • phinnaeus an hour ago

      Humans have the most sophisticated processing unit in the known universe to handle the data from the eyes. Is the brain a crutch?

      • bayindirh an hour ago

        At least for one marine creature, whose name I forget, the answer is yes. Said creature dissolves its brain the moment it can find a place to attach and call home.

    • lukeschlather an hour ago

      I had taken for granted that the cameras in the Tesla might be equivalent to human vision, but now I'm realizing that's probably laughable. I'm reading it's 8 cameras at 30fps, and it sounds like the car's bus can only process about 36fps in total (not the 8x30 = 240fps theoretically available from the cameras if they had a better memory bus). It also seems plausible you would need at least 10,000fps to fully match human vision, especially taking into account that humans turn their heads -- which in a CV setting could be analogous to the algorithm having 32x30 = 960fps available but typically only processing ~140 frames per second from cameras pointing in a specific direction.

      So maybe LIDAR isn't necessary, but if Tesla were actually investing in cameras with a memory bus that could approximate the speed of human vision, I doubt it would be cheaper than LIDAR to get the same result.
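
      As a back-of-envelope check on those numbers (all of them rough figures from this comment, not verified Tesla specs):

```python
# Rough throughput arithmetic using the unverified figures above.
cameras = 8
fps_per_camera = 30
offered_fps = cameras * fps_per_camera  # 240 frames/s produced by the cameras
processed_fps = 36                      # claimed bus/compute budget

# Fraction of available frames actually processed: 36/240 = 0.15,
# i.e. roughly 85% of camera frames would never be looked at.
fraction_processed = processed_fps / offered_fps
```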

    • marcos100 an hour ago

      I want my self-driving car to be a better driver than any human. Sure, we can drive without LIDAR, but just look up the number of accidents caused by humans.

      • paulryanrogers 12 minutes ago

        Humans cause roughly one fatal accident per hundred million miles, and they have no backup driver they can disengage to. Now just look up how many disengagements per million miles Tesla has.

    • davidhs 2 hours ago

      > Yes machine learning vision systems hallucinate, but so do humans.

      When was the last time you had full attention on the road and a reflection of light made you super confused and suddenly drive crazy? When was the last time you experienced objects behaving erratically around you, jumping in and out of place, and perhaps morphing?

      • hodgesrm an hour ago

        Well there is strong anecdotal evidence of exactly this happening.

           We were somewhere around Barstow on the edge of the desert when the drugs began to take hold. I remember saying something like, “I feel a bit lightheaded; maybe you should drive . . .”And suddenly there was a terrible roar all around us and the sky was full of what looked like huge bats, all swooping and screeching and diving around the car, which was going about 100 miles an hour with the top down to Las Vegas. And a voice was screaming: “Holy Jesus! What are these goddamn animals?” [0]
        
        [0] Hunter S. Thompson, "Fear and Loathing in Las Vegas"

    • DonHopkins 2 hours ago

      I'd rather cars have crutches than the people they run over.

    • fluidcruft 2 hours ago

      Musk's argument "Humans don't have LIDAR, therefore LIDAR is useless" has always seemed pretty dumb to me. It ignores the possibility that LIDAR might be superhuman with superhuman performance. And we also know you can get superhuman performance on certain tasks with insect-scale brains. Musk's just spewing stoner marketing crap that stoners think is deep, not actual engineering savvy.

      (and that's not even addressing that human vision is fundamentally a weird sensory mess full of strange evolutionary baggage that doesn't even make sense except for genetic legacy)

      • mixedbit 34 minutes ago

        Musk's argument also ignores intelligence of humans. The worst case upper bound for reaching human level driving performance without LIDAR is for AI to reach human level intelligence. Perhaps it is not required, but until we see self-driving Teslas performing as well as humans, we won't know this. Worst case scenario is that Tesla unsupervised self-driving is as far away as AGI.

    • lazide 2 hours ago

      The big promise of autonomous self-driving was that it would be done safer than humans.

      The assumption was that with similar sensors (or practically worse - digital cameras score worse than eyeballs in many concrete metrics), ‘AI’ could be dramatically better than humans.

      At least with Tesla’s experience (and with some fudging based on things like actual fatal accident data) it isn’t clear that is actually what is possible. In fact, the systems seem to be prone to similar types of issues that human drivers are in many situations - and are incredibly, repeatedly, dumb in some situations many humans aren’t.

      Waymo has gone full LiDAR/RADAR/Visual, and has had a much better track record. But their systems cost so much (or at least used to), that it isn’t clear the ‘replace every driver’ vision would ever make sense.

      And that is before the downward pressure on the labor market started to happen post-COVID, which hurts the economics even more.

      The current niche of Taxis kinda makes sense - centrally maintained and capitalized Taxis with outsourced labor has been a viable model for a long time, it lets them control/restrict the operating environment (important to avoid those bad edge cases!), and lets them continue to gather more and more data to identify and address the statistical outliers.

      They are still targeting areas with good climates and relatively sane driving environments because even with all their models and sensors, heavy snow/rain, icy roads, etc. are still a real problem.

    • uoaei an hour ago

      This impulse to limit robots to the capacities, and especially the form factors, of humans has severely limited our path to progress and a more convenient life.

      Robots are supposed to make up for our limitations by doing things we can't do, not do the things we can already do, but differently. The latter only serves to replace humans, not augment them.

    • inciampati 2 hours ago

      I wish I had radar eyes

      • UltraSane 34 minutes ago

        I want to see gamma rays, I want to hear X-rays, and I want to smell dark matter.

    • ModernMech 26 minutes ago

      > Vision systems were improving a lot back then so the idea you could have a FSD on vision alone was plausible.

      This was only plausible to people who had no experience in robotics, autonomy, and vision systems.

      Everyone knew LIDAR was the enabling technology thanks to the 2007 DARPA Urban challenge.

      But the ignoramus Elon Musk decided he knew better and spent the last decade-plus trashing the robotics industry. He set us back as far as safety protocols in research and development, caused the first death due to robotic cars, deployed them on public roads without the consent of the public by throwing around his massive wealth, lied consistently for a DECADE about the capabilities of these machines, defrauded customers and shareholders while becoming richer and richer, all to finally admit defeat while still maintaining that the growth story for Tesla's future lies in robotics. The nerve of this fucking guy.

  • amelius 3 hours ago

    Your comparison to hallucination is spot on.

    LLMs have shown the general public how AI can be plain wrong and shouldn't be trusted for everything. Maybe this influences how they, and regulators, will think about self driving cars.

    • bbarnett 3 hours ago

      Well I wish this was true. But loads of DEVs on here will claim LLMs are infallible.

      And the general public?! No way. Most are completely unaware of the foibles of LLMs.

      • BoiledCabbage 3 hours ago

        > Well I wish this was true. But loads of DEVs on here will claim LLMs are infallible.

        No they don't. You're making a straw man rather than trying to put forth an actual argument in support of your view.

        If you feel you can't support your point, then don't try to make it.

        • greenchair 2 hours ago

          It's done in a roundabout way. Usually with a variation of "you had a bad experience because you are using the tool incorrectly, get good at prompting".

        • bbarnett 14 minutes ago

          A straw man? An actual argument?

          I responded to this parent comment:

          "LLMs have shown the general public how AI can be plain wrong and shouldn't be trusted for everything."

          You take issue with my response of:

          "loads of DEVs on here will claim LLMs are infallible"

          You're not really making sense. I'm not straw-manning anything, as I'm directly discussing the statement made. What exactly are you presuming I'm throwing a straw man over?

          It's entirely valid to say "there are loads of supposed experts that don't see this point, and you're expecting the general public to?". That's clearly my statement.

          You may disagree, but that doesn't make it a strawman. Nor does it make it a poorly phrased argument on my part.

          Do pay better attention please. And your entire last sentence is way over the line. We're not on reddit.

  • jillesvangurp 5 hours ago

    Lidar is great for object detection, but it's not great for interpreting the objects. It will stop you crashing into a traffic light, but it won't be able to tell the color of the light. It won't see the stripes on the road. It won't be able to tell signs apart. It won't enable AIs to make sense of complex traffic situations.

    And those complex traffic situations are the main challenge for autonomous driving. Getting the AIs to do the right things before they get themselves into trouble is key.

    Lidar is not a silver bullet. It helps a little bit, but not a whole lot. It's great when the car has to respond quickly to get it out of a situation that it shouldn't have been in to begin with. Avoiding that requires seeing and understanding and planning accordingly.

    • amelius 3 hours ago

      Meanwhile, the competitors who use LiDAR have FSD cars. You're understating the importance of this sensor.

      You can train a DL model to act like a LiDAR based only on camera inputs (data collection is easy if you already have LiDAR cars driving around). If they could get this to work reliably, I'm sure the competition would do it and ditch the LiDAR -- but they don't, so that tells us something.

      • SOLAR_FIELDS 2 hours ago

        It is very true and worthwhile to point out that the only company deploying L4 at scale is using LIDAR. And that company is not Tesla

        • UltraSane 28 minutes ago

          The mental gymnastics Tesla fanboys use to explain this away are incredible.

      • ModernMech 19 minutes ago

        Researchers had this knowledge in 2007, when the only cars to finish the DARPA Urban Challenge were equipped with Velodyne 3D LIDAR. Elon Musk set us back a decade by using his platform to ignorantly convince everyone it was possible with cameras alone.

        For anyone who understands sensor fusion and the Kalman filter, read this and ask yourself if you trust Elon Musk to direct the sensor strategy on your autonomous vehicle: https://www.threads.com/@mdsnprks/post/DN_FhFikyUE

        For anyone wondering, to a sensors engineer the above post is like saying 1 + 1 = 0 -- the truth (and the science) is the exact opposite of what he's saying.
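
        To spell out the sensor-fusion point: the textbook result is that fusing two noisy measurements of the same quantity always yields a lower variance than either sensor alone, which is why throwing away a sensor is backwards. A minimal sketch of the static 1-D case (made-up numbers, the degenerate form of a Kalman update):

```python
def fuse(m1, var1, m2, var2):
    """Inverse-variance weighted fusion of two measurements of the same
    quantity -- the static, 1-D special case of a Kalman filter update."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    mean = (w1 * m1 + w2 * m2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return mean, var

# Hypothetical readings: camera says the obstacle is 42 m away (variance 4.0),
# LIDAR says 40 m (variance 0.25).
mean, var = fuse(42.0, 4.0, 40.0, 0.25)
# The fused estimate leans toward the more certain LIDAR reading, and the
# fused variance (~0.235) is lower than either sensor's variance alone.
```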

    • michaelt 3 hours ago

      I think you might be under-estimating the importance of not hitting things.

      If you look at the statistics on fatal car accidents, 85%+ involve collisions with stationary objects or other road users.

      Nobody's suggesting getting rid of machine vision or ML - just that if you've got an ML+vision system that gets in 1 serious accident per 200,000 miles, adding LIDAR could improve that to 1 serious accident per 2,000,000 miles.

      • ModernMech 16 minutes ago

        Because LIDAR can detect the object at the beginning of the perception pipeline, whereas a camera can only detect the object after an expensive and time-consuming ML inference process. By the time the camera even knows there's an object (if it does at all), the LIDAR would already have had the car hitting its brakes. When you're traveling 60 MPH, milliseconds matter.

    • HPsquared 4 hours ago

      It's an extra sensor you'd add into the mix; you'd still have cameras, like the radar sensors. I think the reason Teslas don't have it is that the sensor hardware was expensive a few years back. I assume they are much cheaper now.

      • ndsipa_pomu 4 hours ago

        Tesla have also backed themselves into a corner by declaring that older models are hardware capable of FSD, so they can't easily add LIDAR to new models without being liable for upgrading/refunding previously sold models.

        • gizajob an hour ago

          New for 2027 - ABSOLUTE Self Driving Pro Max!

        • bbarnett 3 hours ago

          I thought they had it on some models already, then removed it on models after?

          edit: no, it was ultrasonic sensors. But those were likely used for object detection, and now they're gone.

    • mirsadm 38 minutes ago

      I don't know about you, but one of my primary goals when driving is not hitting things

  • UltraSane an hour ago

    Camera-only might work better if you used regular digital cameras along with more advanced cameras, like event-based cameras that send pixels as soon as they change brightness and have microsecond latency, and/or Single Photon Avalanche Diode (SPAD) sensors, which can detect single photons. Having the same footage from all three would enable some fascinating training options.

    But Tesla didn't do this.

  • moralestapia 3 hours ago

    >Elon's vision-only move was extremely "short-sighted"

    It wasn't Elon's but Karpathy's.

    • Fricken 3 hours ago

      Sterling Anderson was the first autopilot director, and he was fired for insisting on Lidar. Elon sued Sterling Anderson, then hired the bootlick Karpathy to help him grease chumps.

      • mcv 3 hours ago

        But why is Elon so opposed to Lidar? I don't get it.

        • Fricken 2 hours ago

          At that time Lidar was too expensive and ugly to be putting in every car. Robust Lidar for SAE level 4 autonomous vehicles is still not cheap and still pretty ugly.

    • pinkmuffinere 3 hours ago

      For decisions of this scale (ie, tens of years of development time, committing multiple products to a single unproven technology), the CEO really should be involved. Maybe they’ll just decide to take the recommendation of the SMEs, but it’s hard for me to imagine Elon had no say in it.

    • amelius 3 hours ago

      I suspect so too, but is it factual?

  • zpeti 5 hours ago

    If a human brain can tell the difference between sun glare and an object, machine learning certainly can.

    It’s already better at X-rays and radiology in many cases.

    Everything you are talking about is just a matter of sufficient learning data and training.

    • audunw 4 hours ago

      1. A human has a lot more options to deal with things like sun glare. We can move our head, use shade, etc. And when it comes to certain aspects of dynamic range, human eyes are still better than cameras. Most of all, if we lose nearly all vision, we are intelligent enough to simulate the behaviour of most objects around us and react safely for the next few seconds.

      2. Human intelligence is much deeper than machine vision. We can predict a lot of things that machine vision has no hope of achieving without some kind of incredibly advanced multi-modal model, which is probably many years out.

      The most important thing is that Tesla/Elon absolutely had no way to know, and no reason to believe (other than as a way to rationalise a dangerously risky bet) that machine vision would be able to solve all these issues in time to make good on their promise.

      • mcv 3 hours ago

        Not only do we have options to deal with it, we understand that it's a vision artefact and not something real. We understand objects don't vanish or appear out of nowhere. We understand the glare isn't reality but is obstructing our view of reality. We immediately understand we're therefore dealing with incomplete information and compensate for that, including looking for other ways to see around the obstruction or fill in the gaps -- often without even thinking about it.

    • tsimionescu 3 hours ago

      The human brain is the result of literal billions of years of evolution, across trillions of organisms. The "just" in your "just a matter of sufficient learning data and training" is doing a lot of work.

    • jihadjihad 4 hours ago

      This comment is a perfect illustration of the hubris of this technology in general.

    • stevage 32 minutes ago

      It's a big if, no? Humans do struggle with sun glare. It'd be great if cars were much better.

    • threatofrain 3 hours ago

      If you have cheat codes, why not just use them instead of insisting on principle that our eyes are good enough? We see Waymo using the cheat codes, oh no. We also only have binocular vision, so I guess Tesla is already okay with superhuman cheat codes.

  • torginus 4 hours ago

    The mistakes you describe are issues of the AI system controlling the car, not of the cameras themselves. If you were watching the camera feed and teleoperating the vehicle, there's no way you'd phantom-brake at a sudden bit of glare.

    • petee 4 hours ago

      Going from cameras to the human model: every morning on my way to work, humans suddenly slam their brakes because of the sun in their eyes. If you can't see, you can't see. I think it's another good example of why cameras alone are not enough.

    • nosianu 4 hours ago

      OP says nothing else???

      > this tells me volumes about what's going on in the computer vision system

      Emphasis:

      > computer vision system

  • gcanyon 2 hours ago

    The wiper system has nothing to do with self-driving -- it's based on total internal reflection in the glass: https://www.youtube.com/watch?v=TLm7Q92xMjQ

    • sean_bright 2 hours ago

      Teslas do not use the rain sensors discussed in this video, they use cameras to detect rain.

    • vel0city 2 hours ago

      That's how every non-Tesla works. Teslas don't use this method, which is why their auto wipers have always been so bad compared to everyone else's.

dlcarrier 15 hours ago

This looks to me like they are acknowledging that their claims were premature, possibly due to claims of false advertising, but are otherwise carrying forward as they were.

Maybe they'll reach level 4 or higher automation, and will be able to claim full self driving, but like fusion power and post-singularity AI, it seems to be one of those things where the closer we get to it, the further away it is.

  • sschueller 5 hours ago

    Premature? Is that what we call this now? It's straight up fraud!

    Others are in prison for far less.

    • tombert 40 minutes ago

      I was about to say this. Elon would go on stage and say something like “and this is something we can do today”, or “coming next year” in 2018. The crowd goes wild, the stock price shoots up.

      The first time could be an honest mistake, but after a certain point we have to assume that it’s just a lie to boost the stock price.

      • mlindner 7 minutes ago

        The stock price hasn't dropped though; rather the opposite.

        • tombert 4 minutes ago

          I know. That’s my point; he just goes on stage and lies, the stock price goes up and it doesn’t appear to correct itself despite the boost being based on a lie.

  • dreamcompiler 14 hours ago

    Not gonna happen as long as Musk is CEO. He's hard over on a vision-only approach without lidar or radar, and it won't work. Companies like Waymo that use these sensors and understand sensor fusion are already eating Tesla's lunch. Tesla will never catch up with vision alone.

    • Rohansi 14 hours ago

      While I don't think vision-only is hopeless (it works for human drivers) the cameras on Teslas are not at all reliable enough for FSD. They have little to no redundancy and only the forward facing camera can (barely) clean itself. Even if they got their vision-only FSD to work nicely it'll need another hardware revision to resolve this.

      • vbezhenar 4 hours ago

        I feel like our AI research in the physical world falls so far behind language-level AI that our reasoning might be clouded.

        Compare a Boston Dynamics robot and a cat. They are on absolutely different levels in their bodies and their ability to control them.

        I have no doubt that cameras-only could absolutely work for AI cars, but at the same time I feel that this kind of AI is not there yet. And if we want autonomous cars, it might be possible, but we need to equip them with as many sensors as necessary, not set artificial boundaries.

        • threatofrain 3 hours ago

          But lidar is basically a cheat code, whether or not optical is sufficient. Why wait for end stage driving AI? Why not use cheat codes and wait for cheaper technology later?

      • moogly 14 hours ago

        > While I don't think vision-only is hopeless (it works for human drivers)

        I guess you don't drive? You use more senses than just vision when driving a car.

        • figassis 4 hours ago

          Behavioral and pattern analysis is always in full overdrive when I drive. I drive in Africa; people never follow rules, red lights at crossings mean go for bikers, and when there are no lights you can't just give right of way, or you'll never move. When nearing intersections, people accelerate so they can cross before you, and it's a long line, and they know you have right of way, so they accelerate to scare you into stopping. Amateurs freeze and hold up the line for a very long time, usually until a traffic officer shows up to unblock it (multiply this by every intersection). In order for you to get anywhere, you have to play the same game: get close enough to the point where they aren't sure you'll stop, and will hit you and have to pay. So at crossings you're constantly in near misses until they realize you're not going to stop, so they do. Everyone is skilled enough to do this daily. Your senses, your risk analysis, your spider sense are fully overclocked most of the time. And then there are all the other crazy things that happen, like all the insane lane zig-zagging people do, bikers out of nowhere at night with no lights, memorizing all the potholes in all the roads in the city because they aren't illuminated at night so you can drive at 80-120 km/h, etc. So no, it's not just your eyes. Lots of sensors, memory, processing, and mapping are required.

        • bhaney 4 hours ago

          Personally, I can smell a left turn signal from nearly three blocks away

          • okr 4 hours ago

            The spider crawling out of the back of the car mirror has seen things that are far beyond anything I will ever experience visually!

        • terminalshort 2 hours ago

          Yeah, but you can drive on vision alone. Deaf people are allowed to drive just the same as anyone else.

        • Rohansi 14 hours ago

          And which ones can't be replicated with hardware?

          • scrollaway 5 hours ago

            Even staying within the sense of vision, there are features Tesla doesn't properly try to replicate. Depth perception, for example (it does depth perception very differently to humans).

            You also do use your ears when driving.

            • mlindner 6 minutes ago

              Teslas have and use microphones.

            • dtj1123 27 minutes ago

              One eyed, deaf people can drive

            • vbezhenar 4 hours ago

              One-eyed people are allowed to drive.

            • gizajob 5 hours ago

              Deaf people can drive fine.

          • moogly 13 hours ago

            Ask Musk; he's the one who claims that sensor fusion does not work.

        • ndsipa_pomu 4 hours ago

          > You use more senses than just vision when driving a car

          Deaf drivers (may include drivers playing loud music too) don't, unless they're somehow tasting the other vehicles.

          • ChrisMarshallNY 2 hours ago

            We have these things called "inner ears." I'm pretty sure deaf people have them, too.

            Nature's accelerometers.

            I've had mine go bad, and it wasn't fun.

            Just sayin'...

          • vel0city 2 hours ago

            There are more than three senses.

            • ndsipa_pomu an hour ago

              Yes and they're not really of much use in driving safely unless you're referring to some spidey-sense of danger.

              • tombert 36 minutes ago

                I am not 100% sure which “sense” this would be, but when I drive I can “feel” the texture of the road and intuit roughly how much traction I have. I’m not special, every driver does this, consciously or not.

                I am not saying that you couldn’t do this with hardware, I am quite confident you could actually, but I am just saying that there are senses other than sight and sound at play here.

              • vel0city 15 minutes ago

                I'm using inertial senses from my inner ear. I feel the suspension through the seat. I feel feedback through the steering wheel. I can feel the g forces pulling on my body.

        • renewiltord 14 hours ago

          But we allow deaf people to drive but not people who are entirely blind. This means vision is necessary and sufficient.

          The problem is clearly a question of the fidelity of the vision and our ability to slave a decision maker and mapper to it.

      • bkettle 10 hours ago

        > it works for human drivers

        Sure, for some definition of "works"...

        https://www.iihs.org/research-areas/fatality-statistics/deta...

        • Rohansi 8 hours ago

          Vision is almost certainly not the main issue with humans as drivers.

          • NaomiLehman 5 hours ago

            it's one of the reasons.

            • Rohansi 5 hours ago

              For sure, but my phone camera sees better than I do. Cars can make use of better camera sensors and have more than two of them. You can't just extrapolate the conclusion that human vision bad = vision sensors bad.

              • NaomiLehman 4 hours ago

                we can't conclude that LIDAR is better than a camera? Is it worth cutting the costs? LIDAR has everything that a camera has plus more.

      • SalmoShalazar 11 hours ago

        Such utter drivel. A camera is not the equivalent of human eyes and sensory processing, let alone an entire human being engaging with the physical world.

        • terminalshort 2 hours ago

          Cameras are better than human eyes. Much better. There are areas in which they are worse, but that's completely outweighed by the fact that you are not limited to two of them and they can have a 360 degree field of vision.

        • Rohansi 6 hours ago

          The best cameras are surely better than most peoples' eyes these days.

          Sensory processing is not matched, sure, but IMO how a human drives is more involved than it needs to be. We only have two eyes and they both look in the same direction. We need to continuously look around to track what's around us. It demands a lot of attention from us that we may not always have to spare, especially if we're distracted.

          • rcxdude 5 hours ago

            >The best cameras are surely better than most peoples' eyes these days.

            Not on all metrics, especially not simultaneously. The dynamic range of human eyes, for example, is extremely high.
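
            For a sense of the units: sensor dynamic range is usually quoted in dB, while eyes and photography use "stops" (doublings of light); the two convert at roughly 6.02 dB per stop. A small sketch using commonly cited ballpark figures, not specs for any particular sensor:

```python
import math

DB_PER_STOP = 20 * math.log10(2)  # ~6.02 dB per doubling of light intensity

def db_to_stops(db: float) -> float:
    """Convert a dynamic-range figure in decibels to photographic stops."""
    return db / DB_PER_STOP

# Illustrative ballparks only: automotive HDR image sensors are often
# quoted around 120-140 dB; the human visual system, with adaptation,
# is commonly credited with 20+ stops.
print(round(db_to_stops(120), 1))  # ~19.9 stops
print(round(db_to_stops(140), 1))  # ~23.3 stops
```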

            • Rohansi 5 hours ago

              The front camera Tesla is using is very good at this. You can drive with the sun shining directly into it and it will still detect everything 99% of the time, at least with my older Model 3. Way better than me, stuck looking at the pavement directly in front of the car.

              AFAIK there is also more than one front camera. Why would anyone try to do it all with one or two camera sensors like humans do it?

              It's important to remember that the cameras Tesla are using are optimized for everything but picture quality. They are not just taking flagship phone camera sensors and sticking them into cars. That's why their dashcam recordings look so bad (to us) if you've ever seen them.

          • kivle 40 minutes ago

            Well, Teslas use low cost consumer cameras. Not DSLRs. Bad framerate, bad resolution and bad dynamic range. Very far from human vision and easily blinded and completely washed out by sudden shifts in light.

    • formercoder 14 hours ago

      Humans drive without LIDAR. Why can’t robots?

      • cannonpr 14 hours ago

        Because human vision has very little in common with camera vision and is a far more advanced sensor, on a far more advanced platform (ability to scan and pivot etc), with a lot more compute available to it.

        • torginus 4 hours ago

          I don't think it's a sensors issue - if I gave you a panoramic feed of what a Tesla sees on a series of screens, I'm pretty sure you'd be able to learn to drive it (well).

        • lstodd 14 hours ago

          yeah, try matching a human eye on dynamic range and then on angular speed and then on refocus. okay forget that.

          try matching a cat's eye on those metrics. and it is much simpler than the human one.

          • terminalshort 2 hours ago

            Who cares? They don't need that. The cameras can have continuous attention on a 360 degree field of vision. That's like saying a car can never match a human at bipedal running speed.

          • dmos62 5 hours ago

            I'm curious, in what ways is a cat's vision simpler?

      • phire 13 hours ago

        Why tie your hands behind your back?

        LIDAR based self-driving cars will always massively exceed the safety and performance of vision-only self driving cars.

        Current Tesla cameras+computer vision is nowhere near as good as humans. But LIDAR based self-driving cars already have way better situational awareness in many scenarios. They are way closer to actually delivering.

        • kimixa 13 hours ago

          And what driver wouldn't want extra senses, if they could actually meaningfully be used? The goal is to drive well on public roads, not some "Hands Tied Behind My Back" competition.

        • tliltocatl 5 hours ago

          Because any active sensor is going to jam other such sensors once there are too many of them on the road. This is sad but true.

      • Sharlin 5 hours ago

        And birds fly without radar. Still we equip planes with it.

      • apparent 13 hours ago

        The human processing unit understands semantics much better than the Tesla's processing unit. This helps avoid what humans would consider stupid mistakes, but which might be very tricky for Teslas to reliably avoid.

      • randerson 14 hours ago

        Even if they could: Why settle for a car that is only as good as a human when the competitors are making cars that are better than a human?

        • dotancohen 4 hours ago

          Cost, weight, and reliability. The best part is no part.

          No part costs less, it also doesn't break, it also doesn't need to be installed, nor stocked on every dealership's shelf, nor can a supplier hold up production. It doesn't add wires (complexity and size) to the wiring harness, or clog up the CAN bus message queue (LIDAR is a lot of data). It also does not need another dedicated place engineered for it, further constraining other systems and crash safety. Not to mention the electricity used, a premium resource in an electric vehicle of limited range.

          That's all off the top of my head. I'm sure there are even better reasons out there.

          • randerson 3 hours ago

            These are all good points. But that just seems like it adds cost to the car. A manufacturer could have an entry-level offering with just a camera and a high-end offering with LIDAR that costs extra for those who want the safest car they can afford. High-end cars already have so many more components and sensors than entry-level ones. There is a price point at which the manufacturer can make them reliable, supply spare parts & training, and increase the battery/engine size to compensate for the weight and power draw.

            • terminalshort 2 hours ago

              We already have that. Tesla FSD is the cheap camera only option and Waymo is the expensive LIDAR option that costs ~150K (last time I heard). You can't buy a Waymo, though, because the price is not practical for an individually owned vehicle. But eventually I'm sure you will be able to.

          • dygd 2 hours ago

            Teslas use automotive Ethernet for sensor data, which has much more bandwidth than the CAN bus.

      • systemswizard 14 hours ago

        Because our eyes work better than the cheap cameras Tesla uses?

        • lstodd 14 hours ago

          problem is, expensive cameras that Tesla doesn't use don't work either.

          • systemswizard 13 hours ago

            They cost $20-60 per camera to make, depending on the vehicle year and model. They also charge $3,000 per camera to replace them…

            • terminalshort 2 hours ago

              They charge $3000 for the hours of labor to take apart the car, pull the old camera out, put the new camera in, and put the car back together, not for the camera. You can argue that $3000 is excessive, but to compare it to the cost of the camera itself is dishonest.

            • MegaButts 13 hours ago

              I think his point was even if you bought insanely expensive cameras for tens of thousands of dollars, they would still be worse than the human eye.

            • dzhiurgis 12 hours ago

              Fender camera is like $50 and requires 0 skill to replace. Next.

      • zeknife 4 hours ago

        I wouldn't trust a human to drive a car if they had perfect vision but were otherwise deaf, had no proprioception and were unable to walk out of their car to observe and interact with the world.

        • dotancohen 4 hours ago

          And yet deaf people regularly drive cars, as do blind-in-one-eye people, and I've never seen somebody leave their vehicle during active driving.

          • zeknife 4 hours ago

            I didn't mean that a human driver needs to leave their vehicle to drive safely, I mean that we understand the world because we live in it. No amount of machine learning can give autonomous vehicles a complete enough world model to deal with novel situations, because you need to actually leave the road and interact with the world directly in order to understand it at that level.

      • dreamcompiler 12 hours ago

        Chimpanzees have binocular color vision with similar acuity to humans. Yet we don't let them drive taxis. Why?

        • ikekkdcjkfke 5 hours ago

          Chimpanzees are better than humans given a reward structure they understand. The next battlefield evolution is chimpanzees hooked up with intravenous cocaine modules, running around with .50 cals.

        • ndsipa_pomu 4 hours ago

          There are laws about mistreating animals. Driving a taxi would surely count as inhumane torture.

      • Waterluvian 14 hours ago

        They can. One day. But nobody can just will it to be today.

      • nkrisc 14 hours ago

        Well these robots can’t.

    • dzhiurgis 13 hours ago

      So the robotaxi trial that's already happening is some sort of rendering and AI slop, and the rides we see aren't real?

  • crooked-v 14 hours ago

    So does anyone who previously bought it on claims that actual full self-driving would be "coming soon" get refunds?

    • garbagewoman 14 hours ago

      Hopefully not. They might learn a lesson from the experience.

      • blackoil 13 hours ago

        Hmm, you want to penalize the company and teach a lesson to the customers, so give the money to Ford shareholders.

  • jojobas an hour ago

    >false advertising

    I think you mean "securities fraud", at gargantuan scale at that. Theranos and Nikola were nowhere near that scale.

    • paulryanrogers 8 minutes ago

      It is strange how Elon and Tesla get a pass on this. Tesla has contributed to the deaths of more people than Theranos. I guess he didn't rip off rich investors, except maybe the ones who died in their Teslas.

      Perhaps it's that cars are more sacred than healthcare.

  • jeffbee 14 hours ago

    > Maybe they'll reach level 4 or higher automation

    There is little to suggest that Tesla is any closer to level 4 automation than Nabisco is. The Dojo supercomputer that was going to get them there? Never existed.

    • ascorbic 5 hours ago

      And their H100s were diverted to build MechaHitler instead

  • standardUser 15 hours ago

    What does Waymo lack in your opinion to not be considered "full self driving"?

    The persistent problem seems to be severe weather, but the gap between the weather a human shouldn't drive in and weather a robot can't drive in will only get smaller. In the end, the reason to own a self-driven vehicle may come down to how many severe weather days you have to endure in your locale.

    • mkl 14 hours ago

      Waymo is very restricted in the locations it drives (limited parts of limited cities, and I think still no freeways), and uses remote operators to make decisions in unusual situations and when it gets stuck. This article from last year has quite a bit of information: https://arstechnica.com/cars/2024/05/on-self-driving-waymo-i...

      • panarky 14 hours ago

        Waymo never allows a remote human to drive the car. If it gets stuck, a remote operator can assess the situation and tell the car where it should go, but all driving is always handled locally by the onboard system in the vehicle.

        Interesting that Waymo now operates just fine in SF fog, and is expanding to Seattle (rain) and Denver (snow and ice).

        • epcoa 14 hours ago

          The person you're replying to never claimed otherwise. However, while decision support is not directly steering, accelerating, or braking the car, I am just going to assert it is still driving the car, at least for how it actually matters in this discussion. And the best estimate is that these interventions are "uncommon", on the order of once per tens of thousands of miles, but that isn't rare.

          A system that requires a "higher level" handler is not full self driving.

          • ascorbic 5 hours ago

            I think the important part is that the remote person doesn't need to be alert, and make real time decisions within seconds. As I understand it, the remote driver is usually making decisions with the car stationary. I'd imagine that any future FSD car with no steering wheel would probably have a screen for the driver to make those kind of decisions.

          • AlotOfReading 13 hours ago

            There's a simple test I find useful to determine who's driving:

            If the vehicle has a collision, who's ultimately responsible? That person (or computer) is the driver.

            If a Waymo hits a pole for example, the software has a bug. It wasn't the responsibility of a remote assistant to monitor the environment in real time and prevent the accident, so we call the computer the driver.

            If we put a safety driver in the seat and run the same software that hits the same pole, it was the human who didn't meet their responsibility to prevent the accident. Therefore, they're the driver.

          • panarky 14 hours ago

            Agreed!

            Which is why an autonomous car company that is responsible and prioritizes safety would never call their SAE Level 4 vehicle "full self-driving".

            And that's why it's so irresponsible and dangerous for Tesla to continue using that marketing hype term for their SAE Level 2 system.

          • standardUser 10 hours ago

            In that case, it sounds like "full self driving" is more of an academic concept that is probably past its due date. Waymo and Apollo Go are determining what the actual requirements are for an ultra-low-labor automated taxi service by running them successfully.

      • phire 12 hours ago

        Geofencing and occasional human override meets the definition of "Level 4 self driving". Especially when it's a remote human override.

        But is Level 4 enough to count as "Full Self Driving"? I'd argue it really depends on how big the geofence area is, and how rare interventions are. A car that can drive on 95% of public roads might as well be FSD from the perspective of the average drive, even if it falls short of being Level 5 (which requires zero geofencing and zero human intervention).

      • zer00eyz 14 hours ago

        Waymo has been testing freeway driving for a bit:

        https://www.reddit.com/r/waymo/comments/1gsv4d7/waymo_spotte...

        > and uses remote operators to make decisions in unusual situations and when it gets stuck.

        This is why it's limited to certain markets and areas of service: connectivity for this sort of thing matters. Your robotaxi crashing because the human backup lost 5G connectivity is gonna be a real, real bad look. No one is talking about their intervention stats. If they were good, I would assume that someone would publish them for marketing reasons.

        • decimalenough 14 hours ago

          > Your robotaxi crashing cause the human backup lost 5g connectivity is gonna be a real real bad look.

          Waymo navigates autonomously 100% of the time. The human backup's role is limited to selecting the best option if the car has stopped due to an obstacle it's not sure how to navigate.

        • refulgentis 14 hours ago

          > NO one is talking about their intervention stats.

          Interventions are a term of art, i.e. it has a specific technical meaning in self-driving. A human taking timely action to prevent a bad outcome the system was creating, not taking action to get unstuck.

          > IF they were good I would assume that someone would publish them for marketing reasons.

          I think there's an interesting lens to look at it in: remote interventions are massively disruptive, the car goes into a specific mode and support calls in to check in with the passenger.

          It's baked into UX judgement, it's not really something a specific number would shed more light on.

          If there was a significant problem with this, it would be well-known given the scale they operate at now.

      • standardUser 14 hours ago

        All cars were once restricted in the locations they could drive. EVs are restricted today. I don't see why universal access is a requirement for a commercially viable autonomous taxi service, which is what Waymo is currently. And the need for human operators seems obvious for any business, no matter how autonomous, let alone a business operating in a cutting edge and frankly dangerous space.

        • pavel_lishin 13 hours ago

          > EVs are restricted today.

          Are they? Did you mean Autonomous Vehicles?

          • standardUser 10 hours ago

            No, you can't go driving off into an area with no charging options, which would be much of the world.

        • shadowgovt 14 hours ago

          It's by definition in terms of how these things are counted.

          L4 is "full autonomy, but in a constrained environment." L5 is the holy grail: as good as or better than human in every environment a human could take a car (or, depending on who's doing the defining: every road a human could take a car on. Most people don't say L5 and mean "full Canyonero").

    • gerdesj 14 hours ago

      No one does FSD yet - properly.

      It initially seems mad that a human inside the box can outperform the "finest" efforts of a multi-zillion-dollar company. The human has all their sensors inside the box, and most of them are stymied by the non-transparent parts. Bad weather makes it worse.

      However, look at the sensors and compute being deployed on cars. It's all minimums and cost-focused - basically MVP, with deaths as a costed variable in an equation.

      A car could have cameras with views everywhere for optical, plus LIDAR, RADAR, even a form of SONAR if it can be useful, microwave and way more. Accelerometers and all sorts too, all feeding into a model.

      As a driver, I've come up with strategies such as "look left, listen right". I'm British so drive on the left and sit on the right side of my car. When turning right and I have the window wound down, I can watch the left for a gap and listen for cars to the right. I use it as a negative and never a positive - so if I see a gap on the left and I hear a car to my right, I stay put. If I see a gap to the left but hear no sound on my right, I turn my head to confirm that there is a space and do a final quick go/no go (which involves another check left and right). This strategy saves quite a lot of head swings and if done properly is safe.

      I now drive an EV, one year so far: a SAIC MG4, with cameras on all four sides that I can't record from but can use. It has lane assist (so lateral control, which craps out on many A-road sections but is fine on motorway-class roads) and cruise control that will keep a safe distance from other vehicles (that works well on most roads and very well on motorways; there are restrictions).

      Recently I was driving and a really heavy rain shower hit as I was overtaking a lorry. I immediately dived back into lane one, behind the lorry, and put cruise on. I could just see the edge white line, so I dealt with left/right and the car sorted out forward/backward. I can easily deal with both, but it's quite nice to be able to carefully delegate responsibilities.

    • panick21_ 14 hours ago

      Put a Waymo on a random road in the world, can it drive it?

      • standardUser 14 hours ago

        For a couple decades you couldn't even bring your cell phone anywhere in the world and use it. Transformational technologies don't have to be available universally and simultaneously to be viable. Even when the gas car was created you couldn't use it anywhere that didn't have gasoline and paved roads, plus a mechanic and access to parts.

        • jazzyjackson 14 hours ago

          A significant portion of US highways and backroads have no cell coverage. I suppose a self-driving car would have Starlink these days.

          • standardUser 14 hours ago

            We once had no gas stations, now we have 150,000 (in the US). If the commercial need is there, building out connectivity is an unlikely impediment. Starlink et al. can solve this everywhere except when there's severe weather, a problem Waymo shares, which is starting to make me think the Upper Midwest might be waiting a very long time for self-driving cars.

      • Kye 14 hours ago

        That's the real issue. If "can navigate roads" is enough then we've had full self-driving for a while. There needs to be some base level of general purpose capability or it's just a neat regional curiosity.

      • cryptoz 14 hours ago

        Many humans couldn't.

        • jacquesm 14 hours ago

          Most humans that claim they could, could. Anyway, this seems like a pretty low quality comment; you understood perfectly well what the OP meant.

          • cryptoz 14 hours ago

            Oh gosh sorry, I do try to contribute positively to HN and write quality comments. I'll expand: I've been in circumstances where I've been rented a company car in a foreign country, felt that I was a good driver, but struggled. The road signs are different and can be confusing, the local patterns and habits of drivers can be totally different from what you're accustomed to. I don't doubt that lots of humans could drive most roads - but I think the average driver would struggle, and have a much higher rate of accidents than a local.

            Germany, Italy, and India all stand out as examples to me. The roads and driving culture are very different, and can be dangerous to someone who is used to driving on American suburban streets.

            I really do stand by my comment, and apologize for the 'low quality' nature of it. I meant to suggest that we set the bar far higher for AI than we do for people, which is in general a good thing. But still - I would say that by this definition of 'full self driving', it wouldn't be met very well by many or most human drivers.

            • jacquesm 14 hours ago

              I've driven all over the planet except for Asia and Africa. So far, no real problem and I think most drivers would adapt within a day or two. Greece, Panama and Colombia stand out as somewhat more exciting. Switching to left hand driving in the UK also wasn't a big problem but you do have to pay more attention.

              Of course I may have simply been lucky, but given that my driving license is valid in many countries it seems as though humanity has determined this is mostly a solved problem. When someone says "Put a Waymo on random road in the world, can it drive it?" they mean: I would expect a human to be able to drive on a random road in the world. And they likely could. Can a Waymo do the same?

              I don't know the answer to that one. But if there is one thing that humans are pretty good at it is adaptation to circumstances previously unseen. I am not sure if a Waymo could do the same but it would be a very interesting experiment to find out.

              American suburban streets are not representative of driving in most parts of the world. I don't think the bar of 'should be able to drive most places where humans can drive' is all that high and even your average American would adapt pretty quickly to driving in different places. Source: I know plenty of Americans and have seen them drive in lots of countries. Usually it works quite well, though, admittedly, seeing them in Germany was kind of funny.

              "Am I hallucinating or did we just get passed by an old lady? And we're doing 85 Mph?"

            • gerdesj 13 hours ago

              "Germany, Italy, India "

              That's experience and you learned and survived to tell the tale. It's almost as though you are capable of learning how to deal with an unfamiliar environment, and fail safe!

              I'm a Brit and have driven across most of Europe, US/CA and a few other places.

              Southern Italy eg around Napoli is pretty fraught - around there I find that you need to treat your entire car as an indicator: if you can wedge your car into a traffic stream, you will be let in, mostly without horns blaring. If you sit and wait, you will go grey haired eventually.

              In Germania, speed is king. I lived there in the 70s-90s as well as being a visitor recently. The autobahns are insane if you stray out of lane one, the rest of the road system is civilised.

              France - mostly like driving around the UK apart from their weird right hand side of the road thing! Le Périphérique is just as funky as the M25 and the Place de la Concorde is a right old laugh. The rest of the country that I have driven is very civilised.

              Europe to the right of Italy is pretty safe too. I have to say that across the entirety of Europe, that road signage is very good. The one sign that might confuse any non-European is the white and yellow diamond (we don't have them in the UK). It means that you have priority over an implied "priority to the right". See https://driveeurope.co.uk/2013/02/27/priority-to-the-right/ for a decent explanation.

              Roundabouts were invented in the US. In the UK when you are actually on a roundabout you have right of way. However, everyone will behave as though "priorité à la droite" applies and there will often be a stand-off - it's hilarious!

              In the UK, when someone flashes their headlights at you it generally means "I have seen you and will let you in". That generally surprises foreigners (I once gave a lift to a prospective employee candidate from Poland and he was absolutely aghast at how polite our roads seemed to be). Don't always assume that you will be given space but we are pretty good at "after you".

              • jacquesm 13 hours ago

                That reminds me. I was in the UK on some trip and watched two very polite English people crash into each other when after multiple such 'after you' exchanges they both simultaneously thought screw it and accelerated into each other. Fortunately only some bent metal.

          • bsder 14 hours ago

            > Most humans that claim they could could.

            I don't agree.

            My anecdata suggests that Waymo is significantly better than random ridesharing drivers in the US, nowadays.

            My last dozen ridesharing experiences only had a single driver that wasn't actively hazardous on the road. One of them was so bad that I actually flagged him on the service.

            My Waymo experiences, by contrast, have all been uniformly excellent.

            I suspect that Waymo is already better than the median human driver (anecdata suggests that's a really low bar)--and it just keeps getting better.

            • jacquesm 13 hours ago

              > Most humans that claim they could could.

              > My anecdata suggests that Waymo is significantly better than random ridesharing drivers in the US, nowadays.

              Those two aren't really related, are they? That's one locality and a specific kind of driver. If you picked a random road, there is a pretty small chance it would be like the roads where Waymo is currently rolled out, and your ridesharing drivers are likely not representative of the general public.

an0malous 15 hours ago

How have they gotten away with such obvious misadvertising for this long? It’s undeniably misled customers and inflated their stock value

  • dreamcompiler 14 hours ago

    Normally the Board of Directors would fire any CEO that destroyed as much of the company's value as Musk has. But Tesla's board is full of Musk sycophants and family members who refuse to stand up to him.

  • Eddy_Viscosity2 14 hours ago

    Who was going to stop them from lying?

    • vlovich123 14 hours ago

      SEC and FTC would be obvious candidates who historically would do this. States also have the ability to prosecute this via UDAP (unfair and deceptive practices) laws.

      Tesla being the only major domestic EV manufacturer + Musk historically not wading into politics + Musk/Tesla being widely popular for a time is probably why no one has gone after him. Not sure how this changes going forward with Musk being a very polarizing figure now.

      • 1over137 14 hours ago

        >SEC and FTC would be obvious candidates who historically would do this.

        Yeah, historically, as in: before many people here were born. It's been so long since the SEC and FTC did such things.

        • rsynnott 4 hours ago

          FTC, sure, yeah, mostly, kinda neutered these days. SEC, despite Trump’s efforts to neuter it, is still fairly scary tho.

      • MangoToupe 3 hours ago

        Not to mention there's got to have been insane pressure from the hill not to kill the golden goose.

      • randallsquared 13 hours ago

        The previous two administrations (Trump I and Biden) being somewhat anti-Tesla or anti-Musk was some part of what prompted Musk to get into politics in the first place. Given the Biden admin's hostility, I would have expected the SEC and FTC to have been directed to do all they could against him within bounds, and so my first guess would be that they did, in fact, do everything justifiable.

        • MangoToupe 3 hours ago

          > anti-Tesla

          I'm curious why you think this. I would be pretty shocked if, despite Musk's disgusting personality, they weren't also bought in.

          • randallsquared 2 hours ago

            From 2022, a contemporaneous account of the Biden antipathy: https://www.detroitnews.com/story/business/autos/2022/02/03/...

            While I didn't look long for a more neutral source, Teslarati has a good list of the prompts of the shift from Musk being anti-Trump and pro-Biden, to giving up on Biden, to supporting Trump: https://www.teslarati.com/former-tesla-exec-confirms-wsj-rep...

            There were apparently also other considerations not associated with Tesla for his turn (transgender child, etc), but my read on all this is that Musk saw staying out of politics didn't mean politics would stay away from him. Given that Trump II is also now somewhat anti-Musk, it's not clear to me that he succeeded in avoiding a longer-term axe for Tesla (Neuralink/Solarcity/SpaceX/Boring...) from politicians. We'll see.

      • barbazoo 14 hours ago

        Maybe that’s what happens in late stage capitalism. The billionaires get so powerful that they become untouchable. He’s already shown that he uses his fortune to steer political outcomes.

IgorPartola 2 hours ago

I don’t need self driving cars that can navigate alleys in Florence, Italy and also parkways in New England. Here is what we really need: put transponders into the roadway on freeways and use those for navigation and lane positioning. Then you would be responsible for getting onto the freeway and getting off at your exit, but could take a nap in between. This would be something done by the DOT, supported by all car makers, and it would benefit everyone. LIDAR could be used for obstacle detection but not for navigation. And whoever figures out how to do the transponders, lands a government contract, and gets at least one major car manufacturer on board would make bank.

  • randunel 2 hours ago

    How would you know which signals to trust and which to ignore?

    • jijijijij 8 minutes ago

      Blockchain.

      Just kidding.

      Wait, no! Please. No!

      How do I delete this???

    • uoaei an hour ago

      Physics prevents detected objects from jumping unrealistically. Current systems seem not to account for that at all, reacting to objects which appear and disappear spontaneously. Sensor fusion is exactly the solution to this: use a variety of sensors as input to reliably identify actual obstacles. To fake all the sensors at once you'd need to fake vision, lidar, and transponder locations simultaneously.
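      A minimal sketch of that fusion-plus-physics-gating idea (entirely hypothetical thresholds and logic, not any vendor's actual pipeline):

```python
# Toy sketch (hypothetical; not any vendor's actual code): gate detections
# by physical plausibility, then require agreement across sensor modalities.

MAX_SPEED_MPS = 90.0  # ~200 mph; nothing on a road should move faster

def physically_plausible(prev_pos, new_pos, dt):
    """Reject a track update whose implied speed is physically impossible."""
    dx = new_pos[0] - prev_pos[0]
    dy = new_pos[1] - prev_pos[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    return speed <= MAX_SPEED_MPS

def fused_obstacle(camera_hit, lidar_hit, radar_hit):
    """Treat an object as real only if at least two modalities agree."""
    return camera_hit + lidar_hit + radar_hit >= 2

# A camera-only "hallucination" with no lidar or radar return is ignored
assert not fused_obstacle(camera_hit=True, lidar_hit=False, radar_hit=False)
# An object confirmed by both camera and lidar passes
assert fused_obstacle(camera_hit=True, lidar_hit=True, radar_hit=False)
# A detection that "teleports" 50 m in 0.1 s (500 m/s) is discarded
assert not physically_plausible((0.0, 0.0), (50.0, 0.0), dt=0.1)
```

      Under this scheme a camera-only phantom with no lidar or radar return never becomes an obstacle, and neither does a track that jumps unrealistically between frames.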

mettamage 4 hours ago

I’m not surprised. As a former Elon fan, it never struck me that he thought about this from first principles, whereas for SpaceX he did.

For as long as we can’t understand AI systems as well as we understand normal code, first principles thinking is out of reach.

It may be possible to get FSD another way but Elon’s edge is gone here.

jesenpaul 13 hours ago

They made tons of money on the Scam of the Decade™ from Oct 2016 (see their "Driver is just there for legal reasons" video) to Apr 2024 (when they officially changed it to Supervised FSD), and now it's not even that.

d_sem an hour ago

My experience working at an automotive supplier suggests that Tesla engineers must have always known this, and that the real strategy was to provide the best ADAS experience with the cheapest sensor architecture. They certainly achieved that goal.

There were aspirations that the bottom-up approach would work with enough data, but as I learned about the kind of long-tail cases we solved with radar/camera fusion, camera-only seemed categorically less safe.

An easy edge case: a self-driving system cannot be allowed to become inoperable due to sunlight or fog.

A more Hacker News-worthy consideration: calculate the angular pixel resolution required to accurately range and classify an object 100 meters away (roughly the distance needed to stop safely when traveling 80 mph). Now add a second camera for stereo and calculate the camera-to-camera extrinsic sensitivity you'd need to stay within to keep error sufficiently low in all temperature/road conditions.
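A rough back-of-envelope for that exercise, in Python. Every number here (object size, pixels needed to classify, field of view, focal length, baseline, drift) is my own illustrative assumption, not a Tesla spec:

```python
import math

# Back-of-envelope stereo-vision numbers (illustrative assumptions only):
# resolving a 0.5 m-wide obstacle at 100 m, and how sensitive a stereo
# pair's range estimate is to calibration drift between the cameras.

Z = 100.0           # range to the object, meters
width = 0.5         # object width, meters
pixels_needed = 10  # rough minimum pixels across the object to classify it

# Angle the object subtends, and the per-pixel resolution that implies
theta_deg = math.degrees(math.atan2(width, Z))   # ~0.29 degrees total
pixel_res_deg = theta_deg / pixels_needed        # ~0.029 degrees per pixel

# Across a 90-degree horizontal field of view, that demands roughly:
h_pixels = 90.0 / pixel_res_deg                  # ~3100 pixels horizontally

# Stereo ranging error grows with Z^2: dZ = Z^2 * d_disparity / (f * B)
f = 2000.0     # focal length in pixels (assumed)
B = 0.3        # camera baseline in meters (assumed)
d_disp = 0.25  # disparity error in pixels from thermal/mechanical drift
dZ = Z * Z * d_disp / (f * B)                    # ~4 m of range error at 100 m
```

Even a quarter-pixel of disparity drift between the two cameras costs meters of range accuracy at 100 m.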

The answer is: screw that, I should just add a long range radar.

There are just so many considerations that show you need a multi-modality solution, and using human biology as a what-about-ism doesn't translate to currently available technology.

gcanyon 2 hours ago

Honest question: did Tesla in the past promise that FSD would be unsupervised? My based-on-nothing memory is that they weren't promising that you wouldn't have to sit in the driver's seat, or that your steering wheel would collect dust. Arguing against myself: they did talk about Teslas going off to park themselves and returning, but that's a fairly limited use case. Maybe in the robotaxi descriptions?

My memory was more that you'd be able to get into (the driver's seat of) your Tesla in downtown Los Angeles, tell it you want to go to the Paris hotel in Vegas, and expect generally not to have to do anything to get there. But not guaranteed nothing.

  • herbturbo an hour ago

    In 2016 Musk said you’d be able to drive from LA to NYC without touching the steering wheel once “within 2 years”. He’s been making untrue statements about Tesla FSD for a decade.

mlindner 9 minutes ago

The title is rather misleading. They haven't given up on promise of autonomy...

ciconia 10 hours ago

War Is Peace. Freedom Is Slavery. Ignorance Is Strength. FSD is... whatever Elon says it is.

goloroden 2 hours ago

I think I’d call what Tesla did fraud. Or scam. Or both.

dvh 3 hours ago

And stock is up $15

starchild3001 14 hours ago

Feels like Musk should step down from the CEO role. The company hasn’t really delivered on its big promises: no real self-driving, Cybertruck turned into a flop, the affordable Tesla never materialized. Model S was revolutionary, but Model 3 is basically a cheaper version of that design, and in the last decade there hasn’t been a comparable breakthrough. Innovation seems stalled.

At this point, Tesla looks less like a disruptive startup and more like a large-cap company struggling to find its next act. Musk still runs it like a scrappy startup, but you can’t operate a trillion-dollar business with the same playbook. He’d probably be better off going back to building something new from scratch and letting someone else run Tesla like the large company it already is.

  • xpe 14 hours ago

    This is not a heavily researched comment, but it seems to me that the Model 3 is relatively affordable, at least compared to the options available at the time. It depends on your point of comparison: there is a lot of competition for sure. The M3 was successful to a good degree, don't you think? I mean, we should put a number on it so we're not just comparing feels. The Model Y sold well too, at least until the DOGE insanity.

    • starchild3001 13 hours ago

      Here's some heavy research for you -- Model 3 is competing with the likes of BMW, Audi etc. That's not considered the "affordable" tier. It's called luxury. Here's a comparison:

      https://www.truecar.com/compare/bmw-3-series-vs-tesla-model-...

      • xpe 2 hours ago

        I will charitably interpret “heavy research” as a joke.

        It is hard to interpret the smugness above in a positive light. It is unhelpful to you and to everyone here.

        If you want to compare an electric car against combustion-engine vehicle, go ahead, but that isn’t a key decision point for what we’re talking about.

        The TrueCar web page table does not account for a $7,500 federal tax credit for EVs. I recognize it ends soon — September 30 — if only to head off a potential zinger comment (which would be irrelevant to the overall point).

        All in all, it is notable that ~2 minutes asking a modern large language model for various comparisons is more helpful than this conversation with another human (presumably). If we're going to advocate for the importance of humanity, it seems to me we should start demonstrating that we deserve it. I view HN primarily as a place to learn and help others, not a place for snarky comments.

        A better modern comparison showing less expensive EVs would mention the Nissan Leaf or Chevy Equinox or others. The history is interesting and worth digging into. To mention one aspect: the Leaf had a ~7 year head start but the Tesla caught up in sales by ~2018 and became the best-selling EV — even at a higher price point. So this undermines any claim that Tesla wasn’t doing something right from the POV of customer perception.

        I don’t need to “be right” in this particular comment — I welcome corrections — I’m more interested in error correction and learning.

  • DoesntMatter22 14 hours ago

    They went from no revenue to the 9th most valuable company in the world under him, and from no vehicle sales to having the best-selling vehicle in the world.

    They are still profitable, have very little debt, and have a ton of money in the bank.

    Every company has hits and misses. Bezos started before Musk and still hasn't gotten his rockets into orbit.

    • starchild3001 8 hours ago

      If I had to guess, I’d say the original Tesla founders had a greater influence than Musk. His track record, frankly, is unimpressive. He’s been promising full self-driving “next year” since 2016, yet it’s still nowhere close. Aside from the Model S and X, there hasn’t been a major innovation under his watch. The real groundbreaking work likely came before him. His reign? Far from remarkable. Each year has been a cycle of overpromising (often outright lying) and underdelivering. As for Tesla’s stock? Well, markets can stay irrational far longer than most people can remain solvent.

      • DoesntMatter22 5 hours ago

        Tarpenning and Eberhard left Tesla in 2008 and 2007 but somehow they had a greater influence? They contributed no money, nearly tanked the company but somehow were more important.

        "His track record is unimpressive"... I can see why you say that, I mean, took Tesla from almost nothing to a trillion dollar company. Started the most prolific rocket and satellite company in history (but hey, it's only rocket science right?), provides internet to places that it never even had the possibility of getting to, and providing untold millions the chance to get on the internet.

        Started a company that is giving the paralyzed the ability to use a computer controlling their brain, and is working to restore sight to the blind.

        Totally unimpressive. There are so many people who have done these things /s

  • derefr 14 hours ago

    Daily reminder that Tesla is not — nor was ever intended to be — a car company. Tesla is fundamentally an "energy generation and storage" (battery/supercapacitor) company. Given Tesla's fundamentals (the types of assets they own, the logistics they've built out), the Powerwall and Megapack are closer to Tesla's core product than the cars are. (And they also make a bunch of other battery-ish things that have no consumer names, just MIL-SPEC procurement codes.)

    Yes, right now car sales make up 78% of Tesla's revenue. But cars have 17% margins. The energy-storage division, currently at 10% of revenue, has more like 30% margins. And the car sales are falling as the battery sales ramp up.

    The cars were always a B2C bootstrap play for Tesla, to build out the factories it needed to sell grid-scale batteries (and things like military UAV batteries) under large enterprise B2B contracts. Which is why Tesla is pushing the "car narrative" less and less over time, seeming to fade into B2C irrelevancy — all their marketing and sales is gradually pivoting to B2B outreach.

    • JimDabell 13 hours ago

      > Tesla is not — nor was ever intended to be — a car company. Tesla is fundamentally an "energy generation and storage" (battery/supercapacitor) company.

      > The cars were always a B2C bootstrap play for Tesla, to build out the factories it needed to sell grid-scale batteries

      This seems like revisionist history. They called their company Tesla Motors, not Tesla Energy, after all.

      This is a blog post from the founder and CEO about their first energy play. It seems clear that their first energy product was an unintended byproduct of the Roadster, they worried about it being a distraction from their core car business, but they decided to go ahead with it because they saw it as a way to strengthen their car business.

      https://web.archive.org/web/20090814225814/http://www.teslam...

    • CPLX 4 hours ago

      > Tesla is not — nor was ever intended to be — a car company. Tesla is fundamentally an "energy generation and storage" (battery/supercapacitor) company

      Are we still doing this in 2025?

      Uber is not a taxi company it’s a transportation company! Just wait until they roll out buses!

      Juicero is not a fruit squeezing company it’s an end to end technology powered nourishment platform!

      And so on. Save it for the VC PowerPoints.

      Tesla is a car company. Maybe some day it’ll be defined by some other lines of business too. Maybe one day they’ll even surpass Yamaha.

    • utyop22 14 hours ago

      Are you an investor of Tesla by any chance?

      • derefr 11 hours ago

        Nope. Don't even own a car. Military-industrial-complexes are just my special interest. And apparently Musk's, too. (What do grid-scale batteries, rockets, data-satellite constellations, and tunnel boring machines have in common? They're all products/services that can be — and already are being — sold to multiple allied nations' militaries. AFAICT, this is 90% of the reason Trump can't fully cut ties with the guy.)

    • rsynnott 4 hours ago

      I mean if that’s true they’re _really_ overvalued; that sort of commodity utility stuff is very low margin.

pm90 15 hours ago

> Since 2016, Tesla has claimed that all its vehicles in production would be capable of achieving unsupervised self-driving capability.

> CEO Elon Musk has claimed that it would happen by the end of every year since 2018.

Even as a Tesla owner, it baffles me how rational adults can take this conman seriously.

  • cmurf 11 minutes ago

    Is hope rational?

    Is life absurd?

    Is hope a solution to absurdity?

  • lotsofpulp 14 hours ago

    It’s a winning strategy. See who won the presidential election recently.

  • rsynnott 4 hours ago

    Well, I mean, clearly you did at some point; you bought one of his cars.

    • jbm 4 hours ago

      I bought one too and he did not factor into it.

      Electric car + active battery management were what I cared about at the time of purchase. Also, I am biased against GM and Ford due to experiences with their cars in the 80s and 90s.

      I doubt I'm the only one.

      (In retrospect, the glass roof was not practical in Canada and I will look elsewhere in the future)

RyanShook 14 hours ago

Looking forward to the class action on this one…

  • greyface- 14 hours ago

    Tesla has binding arbitration that prohibits class actions.

    • t0mas88 6 hours ago

      Won't help them in most of Europe. Consumer protection laws here are stricter.

      • jillesvangurp 5 hours ago

        But class action suits are not a thing here. And FSD is not deployed in Europe.

        • pavlov 4 hours ago

          They’ve been selling FSD in Europe since 2018.

          I know because I bought it in March 2019 on a Model 3. (I got it because I thought it would help my elderly parents who mostly used the car.)

          7500 euros completely down the drain. It still can’t even read highway speed signs. A five-year-old would be a safer driver than Tesla’s joke FSD.

          They do have the audacity to send me NPS surveys on the car’s “Teslaversary.” Maybe they could guess by now that it’s a big fat zero.

        • ranguna 5 hours ago

          But it's promised.

jqpabc123 an hour ago

Musk is not an engineer.

At best he is skilled at sales and marketing --- maybe even management. At worst, he is a con artist.

The real problem for Musk and others like him is that while it is certainly possible to fool some of the people some of the time, most will *eventually* come to realize the lack of credibility and stop accepting the BS.

Musk has firmly established a pattern of over promising and under delivering. DOGE and FSD are just two examples --- and there is more of the same in his pipeline.

jgalt212 2 hours ago

Given this move, like the rest of TSLA's inane investor base, I wholeheartedly support the potential $1 trillion pay package for Musk

moomin 3 hours ago

My 1993 Nissan has FSD. I can fully drive myself anywhere.

asdff 14 hours ago

What I don't understand about this is that in my experience being driven around in friends' Teslas, it's already there. It really seems like legalese vs technical capability. The damn thing can drive with no input and even find a parking spot and park itself. I mean, where are we even moving the goalpost at this point? Because there have been some accidents it's not valid? The question is how that compares to the accident rate of human drivers, not whether it meets an expectation of zero accidents ever.

  • AlotOfReading 13 hours ago

    The word "driving" has multiple, partially overlapping meanings. You're using it in a very informal sense to mean "I don't have to touch the controls much". Power to you for using whatever definitions you feel like.

    Other people, most importantly your local driving laws, use driving as a technical term to refer to tasks done by the entity that's ultimately responsible for the safety of the entire system. The human remains the driver in this definition, even if they've engaged FSD. They are not in a Waymo. If you're interested in specific technical verbiage, you should look at SAE J3016 (the infamous "levels" standard), which many vehicle codes incorporate.

    One of the critical differences between your informal definition and the technical one is whether you can stop paying attention to the road and remain safe. Under your definition, it's possible to have a system where you're not "driving", but you still have a responsibility to react instantaneously to dangerous road events after hours of inaction. Very few humans can reliably do that. It's not a great way to communicate the responsibilities people have in a safety-critical task they do every day.

jaggs 12 hours ago

One problem might be that American driving is not exactly... well, great, is it? Roads are generally too straight and driving tests too soft. And for some weird reason, many US drivers seem to have poor situational awareness.

The result is it looks like many drivers are unaware of the benefits of defensive driving. Take that all into account and safe 'full self driving' may be tricky to achieve?

yieldcrv 14 hours ago

The lesson here is to wait for a chill SEC and friendly DOJ before you recant your fraudulent claims, because then they won’t be found to be fraudulent

  • comice 14 hours ago

    Wait for them? or buy them?

    • yieldcrv 14 hours ago

      You’re right, still an exercise of patience

ares623 15 hours ago

Most Honest Company (Sarcasm)

  • jacquesm 14 hours ago

    Fish rots from the head.

shadowgovt 14 hours ago

"Full Self Driving (Supervised)." In other words: you can take your mind off the road as long as you keep your mind on the road. Classic.

Tesla is kind of a joke in the FSD community these days. People working on this problem a lot longer than Musk's folk have been saying for years that their approach is fundamentally ignoring decades of research on the topic. Sounds like Tesla finally got the memo. I mostly feel sorry for their engineers (both the ones who bought the hype and thought they'd discover the secret sauce that a quarter-century-plus of full-time academic research couldn't find and the old salts who knew this was doomed but soldiered on anyway... but only so sorry, since I'm sure the checks kept clearing).

  • arijun 13 hours ago

    Until very recently I worked in the FSD community, and I wouldn't say I viewed it as a joke. I don't know if I ever believed they would get to level 5 without any lidar, but it's pretty good for what's available in the consumer market.

    • shadowgovt 9 hours ago

      That's what I mean. Nobody I know thought there'd be a chance of getting to L4 (much less L5) without LIDAR. They doomed the goal from the gate and basically lied to people for years about the technological possibilities to pad their bottom line.

      It's two steps from selling snake-oil, basically. Not that L4 or L5 are impossible, but people who knew the problem domain looked at how they were approaching it hardware-wise and went "... uhuh."

aurizon 14 hours ago

It was a fool's game from the start, with only negative aspects = what could possibly go wrong?

  • utyop22 14 hours ago

    Tesla's share price is all based on the Greater Fool Theory in the short run.

    In the long run some of those promises might materialise. But who cares! Portfolio managers and retail investors want some juicy returns - share price volatility is welcomed.

freerobby 14 hours ago

This is clickbait from a publication that's had it out for Tesla for nearly a decade.

Tesla is pivoting messaging toward what the car can do today. You can believe that FSD will deliver L4 autonomy to owners or not -- I'm not wading into that -- but this updated web site copy does not change the promises they've made prior owners, and Tesla has not walked back those promises.

The most obvious tell of this is the unsupervised program in operation right now in Austin.

  • qwerpy 14 hours ago

    Marketing choice of words aside, it's already really good now to the point that it probably does 95% of my driving. Once in a while it chooses the wrong lane and very rarely I will have to intervene, but it's always getting better. If they just called it "Advanced Driver Assist" or something, and politics weren't such an emotional trigger, it would be hailed as a huge achievement.

    • freerobby 14 hours ago

      Yeah, Tesla did themselves no favors with how they initially marketed FSD, and all the missed timelines amplified the brand cost of that. I'm glad to see them focus on what it can do today. Better to underpromise and overdeliver etc.

      As an aside, it's wild how different the perspective is between the masses and the people who experience the bleeding edge here. "The future is here, it's just not evenly distributed," indeed.

      • utyop22 14 hours ago

        Surely you're joking? You really believe those timelines were set in good faith?

        Lol it has been strategic manipulation right the way through. Right out of an Industrial Organisation textbook.

        • freerobby 14 hours ago

          Yeah I think their early success with Tesla Vision was faster than expected, it went to their heads, and they underestimated the iteration and fine tuning needed to solve the edge cases. It's difficult to predict how many reps it will take to solve an intricate problem. That's not to excuse their public timeline -- their guidance was naive and IMO irresponsible -- but I don't think it was in bad faith.

  • an0malous 14 hours ago

    Great spin job. They didn’t lie, they’re just “pivoting their messaging”

  • panarky 14 hours ago

    Can you find any statement in the article that is false?

    • freerobby 14 hours ago

      The first one.

      > Tesla has changed the meaning of “Full Self-Driving”, also known as “FSD”, to give up on its original promise of delivering unsupervised autonomy.

      They have not given up on unsupervised autonomy. They are operating unsupervised autonomy in Austin TX as I type this!

      • addaon 14 hours ago

        > They have not given up on unsupervised autonomy. They are operating unsupervised autonomy in Austin TX as I type this!

        Setting aside calling a driver in the driver's seat "unsupervised"... that's exactly the point. People paid for this, and Tesla is revoking its promise of delivering it, instead re-focusing on attempting to operate it themselves.

        I'd have no objection to this if they offered buy-backs on the vehicles in the field, but that seems unlikely.

        • electriclove 13 hours ago

          I would like to understand what population feels they were fleeced. The FSD available on their cars with HW3 (some as old as 2017?) is quite impressive when you consider what the capabilities were back then. Sure, it won’t be as good as a 2025 Juniper Model Y. But who are the people that bought FSD in the early days and are unhappy and how big of a population is that? Is this the main thing people are upset about?

          Or are people upset about the current state of autonomous vehicles like Waymo (which has been working for Years!) and the limited launch of Robotaxi?

        • freerobby 14 hours ago

          I haven't closely followed which rides have drivers where, and what is driven by Tesla vs what is regulatory -- but I thought some "drivers" were still in the passenger seat in Austin?

          At any rate, I don't think they are revoking their prior promises. I expect them to deliver L4 autonomy to owners as previously promised. With that said, I'm glad they are ceasing that promise to new customers and focusing on what the car does today, given how wrong their timelines have been. I agree it's shitty if they don't deliver that, and that they should offer buybacks if they find themselves in that position.

          • addaon 13 hours ago

            > but I thought some "drivers" were still in the passenger seat in Austin?

            Nope, they gave up on that and moved them to the driver's seat.

      • narrator 35 minutes ago

        Yeah, they never said this. This article smells like anti-Elon FUD. "Elon is a dummy, everything he tries will fail, replace him with someone who isn't so controversial and supports the proper politics for a powerful global figure" and repeat in 100 minor internet blogs until the money to write these articles runs out.

iammjm 25 minutes ago

This nazi-saluting manchild has been purposefully lying about self-driving for close to 10 years now, with self-driving always coming "next year". How is this legal and not false advertising?