Cake
    • yaypie
      Ryan Grove

      The NTSB has released a preliminary report on the Tesla Model S crash in Mountain View, California on March 23 that killed one person and led to speculation about whether Tesla's Autopilot system may have been at fault.

      We discussed this previously on Cake in Tesla Autopilot Was Engaged in Fatal Crash.

      I found these details from the report particularly interesting:

      • At 8 seconds prior to the crash, the Tesla was following a lead vehicle and was traveling about 65 mph.

      • At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.

      • At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.

      • At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.

      Knowing what we know about the site of the accident (see the image below), this paints an interesting picture of what may have been the cause.

      This is only my own theory, but here's what I think may have happened:

      The Tesla was driving south in the left lane of US-101 with Autopilot engaged. Since there was another car in front of it, Autopilot was matching that car's speed and using its position as a high probability indicator of where the lane was going (it would have used lane markings as a slightly lower probability indicator).
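
      To make the weighting idea concrete, here's a rough sketch in Python. To be clear, this is entirely my own guess at the shape of the logic, not Tesla's actual code: blend the lead vehicle's lateral position with the detected lane markings, trusting each source in proportion to how confident the system is in it.

      ```python
      # Hypothetical sketch of confidence-weighted lane-center estimation.
      # This is NOT Tesla's code, just one way such a blend could work.

      def estimate_lane_center(lead_offset_m, lane_line_offset_m,
                               lead_confidence, lane_confidence):
          """Blend two lateral-offset estimates (meters from current heading),
          trusting each source in proportion to perception confidence."""
          if lead_offset_m is None and lane_line_offset_m is None:
              return None  # no usable information; hand control back to the driver
          if lead_offset_m is None:
              return lane_line_offset_m
          if lane_line_offset_m is None:
              return lead_offset_m

          total = lead_confidence + lane_confidence
          return (lead_confidence * lead_offset_m +
                  lane_confidence * lane_line_offset_m) / total


      # Example: a clearly tracked lead car drifting left, with badly faded lane lines.
      print(estimate_lane_center(lead_offset_m=0.4,       # lead car says "move left"
                                 lane_line_offset_m=0.0,  # faded lines say "stay put"
                                 lead_confidence=0.9,
                                 lane_confidence=0.3))    # result skews left: 0.3
      ```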

      While approaching the US-101/SH-85 interchange, the lead vehicle moved into the left-side exit lane to merge onto SH-85. Autopilot briefly steered left to follow the lead vehicle before realizing that it was moving into a new lane, at which point it stopped following the lead vehicle and went back into lane-keeping mode using the lane lines on the road as its primary indicators.

      My theory is that at this point, the Tesla had actually followed the lead vehicle over the faded left lane line of US-101's left lane and into the gore lane, then saw the much clearer line between the gore lane and the SH-85 ramp. It was at this point that Autopilot stopped following the lead vehicle and chose to remain in its lane, but since the gore lane had no cross-hatch markings or any other clear indications that it wasn't a normal traffic lane, Autopilot didn't realize it was no longer in a traffic lane.

      Since it was set to maintain a speed of 75 mph and there was no longer a car ahead, Autopilot sped up. It didn't detect the crash attenuator because it was a small, narrow, fixed object — something that's notoriously difficult for a self-driving system to reliably detect in the distance. The driver also apparently failed to see the crash attenuator and didn't take evasive action, and the car hit the barrier. Normally the crash attenuator would have absorbed a significant amount of the impact force, possibly saving the driver's life, but it had already been damaged in a previous accident and Caltrans hadn't repaired it.
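
      For the speed part, this is the kind of behavior I'm describing, again sketched in Python purely as my own illustration (the function and its parameters are invented, not Tesla's): with no lead vehicle detected, traffic-aware cruise control simply resumes the driver's set speed.

      ```python
      # Illustrative sketch of cruise-control target-speed selection.
      # Purely hypothetical; not based on Tesla's actual implementation.

      def target_speed_mph(set_speed, lead_speed=None, lead_distance_m=None,
                           min_follow_distance_m=50):
          """Match a lead vehicle when one is tracked and close,
          otherwise accelerate back toward the driver's set speed."""
          if lead_speed is None:
              return set_speed  # no lead vehicle: nothing to follow
          if lead_distance_m is not None and lead_distance_m < min_follow_distance_m:
              return min(set_speed, lead_speed)  # don't outrun the car ahead
          return set_speed


      # Roughly the sequence in the report: set speed 75 mph, following a car at 65...
      print(target_speed_mph(75, lead_speed=65, lead_distance_m=30))  # 65
      # ...then the lead car exits and is no longer detected, so the car speeds up.
      print(target_speed_mph(75))  # 75
      ```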

      This strikes me as a classic "perfect storm" event: a sequence of cascading failures culminating in a crash, just like we tend to see in airline accidents.

      If Autopilot hadn't been following a lead vehicle, or if the lead vehicle hadn't exited, or if the lane markings hadn't been faded, or if Caltrans had painted cross-hatch markings in the gore lane, or if Caltrans had repaired the crash attenuator, or if Autopilot had been capable of detecting the barrier, or if the driver had been paying more attention, then the accident might not have happened (or at least the driver might have survived).

      The good news is there's a lot we can learn from this. Tesla should improve Autopilot's lane detection algorithms. Caltrans should improve their maintenance practices and paint more visible markings at gore points. And above all, Tesla owners should pay more attention to the road while using Autopilot.

    • kai

      Excellent observation and analysis. This sounds very plausible, and it makes you wonder whether one could trick a following vehicle with Autopilot engaged into a fatal accident. I'm sure we will see many more of these perfect-storm situations.

    • wx

      Yeah, good analysis. I don't think they should be relying on well-painted lanes, though. That's just not going to happen in a lot of places. The money's not there.

    • yaypie

      Autopilot does seem to be able to use road edges and significant changes in road surface to distinguish between lanes, but if it's having trouble identifying lanes, it will tend to trust the vehicle in front if there is one.

      I'm not sure there's a good way around this on a multi-lane highway. If the surface is the same from lane to lane but actual lane markings aren't visible, even a human driver will tend to rely on other cars to figure out where they should drive. This is often an issue for me here in Oregon where rain tends to obscure faded lane lines.

      Also, while it's true that faded lane lines are unavoidable, I still think Caltrans has been completely negligent by failing to paint hash marks in the gore lane, especially since the NTSB had already recommended they improve the lane markings after investigating a previous fatal crash at a nearly identical gore point at another 101/85 interchange a few years earlier.

    • Chris
      Chris MacAskill

      Great write-up. The family of Walter Huang (the driver) said he had complained to Tesla service that his car would steer towards that obstacle on 85. Tesla had a response I didn't understand, according to The Mercury News:

      When reached for comment on the accident, a Tesla spokesperson said the company has been searching its service records, “And we cannot find anything suggesting that the customer ever complained to Tesla about the performance of Autopilot.”

      The spokesperson added that there had been “a concern” raised about the car’s navigation not working properly, but “Autopilot’s performance is unrelated to navigation.”

      Can you explain what they mean?

    • yaypie

      Sounds like they’re saying he complained about problems with the navigation system (as in GPS and routing), but not Autopilot (or at least they don’t have a record of any Autopilot complaints).

    • kevin
      Kevin Harrington

      I'm not sure there's a good way around this on a multi-lane highway. If the surface is the same from lane to lane but actual lane markings aren't visible, even a human driver will tend to rely on other cars to figure out where they should drive. This is often an issue for me here in Oregon where rain tends to obscure faded lane lines.

      It seems that human drivers almost universally slow down when detecting lane lines and road edges becomes difficult. That's really the best safety response: more time to look for markers and less severe crashes. It's why we have 25 mph speed limits in snow, at least here in California. I'd hope Tesla has this baked into their algorithms, especially since Autopilot's long-distance vision might be poorer than the average human's.
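
      Something like the sketch below is what I have in mind; it's purely my own illustrative Python, not anything I know about Tesla's actual software: cap the permitted speed based on how confident the system is in its lane detection.

      ```python
      # Hypothetical: taper the allowed speed as lane-detection confidence drops,
      # mimicking how human drivers slow down when the road is hard to read.

      def confidence_limited_speed(set_speed_mph, lane_confidence,
                                   min_speed_mph=25, full_confidence=0.9):
          """Allow the full set speed only when lane detection is highly confident;
          otherwise scale down toward a conservative minimum."""
          if lane_confidence >= full_confidence:
              return set_speed_mph
          scaled = set_speed_mph * max(lane_confidence, 0.0) / full_confidence
          return max(min_speed_mph, scaled)


      print(confidence_limited_speed(75, lane_confidence=0.95))  # 75: clear markings
      print(confidence_limited_speed(75, lane_confidence=0.40))  # ~33.3: faded lines, slow down
      print(confidence_limited_speed(75, lane_confidence=0.10))  # 25: barely any lane info
      ```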

    • kikoteixeira

      During a Tesla Model X test drive, the salesperson told me that in about two years Tesla would enable full self-driving, which meant, among other things, that you'd be able to "drive to work, and then send your car back home so your wife can use it." That was almost a year ago.

      If their algorithm got fooled into a fatal crash on something as simple and easy as 101, then imagine what would happen driving autonomously through SF, NYC, or any real city.

    • Username

      Be careful with the images many people are seeing, as they show the stretched-out view of dash cams or other unrealistic perspectives. If you drive this stretch like I do every day, you can see quite a difference between the images and reality. It's still a fair difference without better markings, however.

    • yaypie

      Yeah, I think full self-driving is still several years away, but one thing that's true of autonomous driving is that the advances come faster and faster.

      Each small advancement the Autopilot team makes in computer vision or their neural net opens up the door to more huge improvements, so while it might take three years to make the first tiny bit of headway, those first breakthroughs can mean that the next big breakthroughs will only take one year, and the ones after that six months, and so on.

      It's also worth remembering that humans cause many thousands of accidents a day, some big and some small, so the true test of whether autonomous driving can succeed isn't whether it can be accident-free, it's whether it can be less likely to have an accident than a human driver. We're not there yet (at least not in all driving conditions), but I think we could be soon.
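
      As a back-of-the-envelope illustration of that comparison (every number here is made up purely to show the arithmetic, not real statistics), what matters is the crash rate per mile driven rather than the raw count of crashes:

      ```python
      # Toy comparison of crash rates per million miles driven.
      # Every number below is a placeholder for illustration, not real data.

      human_crashes = 6_000_000          # hypothetical crashes per year
      human_miles = 3_200_000_000_000    # hypothetical vehicle-miles per year

      autopilot_crashes = 2_500          # hypothetical
      autopilot_miles = 1_000_000_000    # hypothetical

      human_rate = human_crashes / (human_miles / 1_000_000)
      autopilot_rate = autopilot_crashes / (autopilot_miles / 1_000_000)

      print(f"Humans:    {human_rate:.2f} crashes per million miles")
      print(f"Autopilot: {autopilot_rate:.2f} crashes per million miles")
      print("Lower rate than humans?", autopilot_rate < human_rate)
      ```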

    • louisgray

      Instances like this one and the obvious Uber failure in Arizona threaten to negatively impact the opportunity for fully automated self-driving cars going forward. As much as some of us may wish Waymo would go faster (no pun intended) and ship, their adherence to safety and exceptional record really have them looking good here. (Disclosure: I work at Google, so Waymo people are like my second cousins.)

      The Tesla accident on 101 was near enough to my office that we had the helicopters overhead for some time, and we knew the event was serious. While I understand why it was possible for the AI to get confused, and while I also fully believe that Tesla's semi-autonomous abilities so far are very safe, you have to drive the risk down as close to zero as possible before promoting its capabilities. This is a scenario where marketing cannot win and failure is inexcusable.

    • Chris

      Our neighborhood has a constant stream of Waymo cars coming through where I am often taking children ages 6-10 to the park or local school. We're on skateboards, Razor scooters, or kicking soccer balls as we go. Sometimes we get the occasional teen racing by in a car because we're on a route to the local high school.

      I feel like a jerk, but I've started to do what I imagine is a community service and mess with the Waymo cars to see what they do. I tell myself, oh, of course Google runs these tests on its cars in a controlled environment, but then I wonder how they test them for little dogs running into the street, stray soccer balls, and jerks like me who swerve on their skateboards in real neighborhoods. I don't know what the humans in the Waymo cars think when I mess with them, but honestly, I DON'T THINK I'M A JERK! It's for the children.

    • Chris

      Btw, three years ago on Evelyn in Mountain View, I was talking to a woman on the sidewalk when a Waymo car seemed to start to merge into a vehicle in the adjacent lane. Both of us saw the driver yank the wheel to get it back into its own lane. I'm sure whatever bug that was, they fixed it long ago.

      I have to say the Waymo cars in our neighborhood inspire more confidence in the way they drive than humans do. They seem to approach us cautiously and slow way down when I behave badly. I mean when I do my community service. However, I haven't tried letting a ball roll in the street in front of one. Yet.

    • martha

      I saw my first Waymo tractor-trailer on 280 last week. I was driving home from Half Moon Bay when I saw a white truck ahead with what looked like the Waymo "W" - I thought I must have been mistaken. We got close to it, and sure enough - a Waymo truck. My daughter and I thought it couldn't have been an actual self-driving one, that it must just be a delivery truck or something. But then I noticed the brake lights going on and off briefly for no apparent reason, and my daughter noticed that it was, as she put it, staying in the center of its lane in weirdly perfect fashion. We passed it and, sure enough - the sensors on the side mirrors, and the big bubble thing on top of the cab.

    • kevin
      Kevin Harrington

      Wow, I had no idea Waymo was driving trucks now. I've had hundreds of encounters with their Priuses, Lexus SUVs, and those cute little cubes. I know they're safer than the average driver, at least with the safety operator, but if a Waymo truck crashes, I wonder what the consequences will be. A big self-driving truck that crashes could do serious damage, and regulators could have a field day.
