Cake
    • Last night while I was driving on Autopilot, a car ahead of me in the left lane braked and turned on its blinker as if it might change into my lane. Autopilot immediately began to slow down to let the other car in, even though that car never actually moved toward my lane.

      (Note: the speed indicated in the video is a few seconds behind my actual speed due to GPS lag)

      It seems like a little thing, but this blew me away! It's one of the first strong indications I've seen that the latest version of Autopilot seems to predict intent based on things like brake lights and turn signals rather than simply reacting to what other cars do after they do them.

      The other car never actually moved toward my lane or impeded my path, but Autopilot seemed to recognize that it might do that and reacted accordingly, just like a defensive human driver might have.

    • don't want to downplay your revelation but I would have thought that was a kind of obvious reaction for Autopilot, almost the first line of code a programmer would write...or am I looking at this wrong as a non-Tesla owner

    • I am curious: if the car in front had not used the turn signal at all, or had made a sharp, sudden lane change, how would the Tesla have behaved? What is the AI's reaction time? It should be faster than a human's (at least in theory). Theoretically an autopilot should eventually be safer than a human driver; however, I think their main challenge is when they're truly presented with random, real road hazards in scenarios they have not "seen" yet. Oh, and consider that in some places drivers may use the turn signal, or a hand signal, to encourage someone to pass them, or flash their headlights at an intersection, etc. I mean, there is a behavioral "tradition" it needs to be able to judge and decide on.

    • It seems obvious to a human, but to a computer it's actually a surprisingly hard problem!

      For my car to understand that another car with its turn signal on may be about to move into my lane, it had to know a ton of things:

      - What a turn signal looks like, not just on this specific car but on any vehicle it might encounter.
      - The difference between a turn signal and a brake light.
      - The difference between an active and inactive turn signal.
      - The difference between a right and left turn signal.
      - The difference between my lane and other lanes.
      - How to recognize all of these things at daytime, at nighttime, in rain, in fog, in snow, etc.
      - What it means when a car in the lane to my left uses its right turn signal.
      - The speed of the other car relative to my car.

      ...and so on.

      All of this seems very intuitive to a human, but to a computer it's the culmination of years and years of painstaking neural network training with terabytes and terabytes worth of training data.

      Until very recently, Tesla Autopilot primarily used forward-looking radar and very basic vision for determining where other cars are and what they're doing (vision was mostly used for spotting lane lines). This meant that the car was mostly only aware of other cars in terms of their being objects in space moving at various speeds. It couldn't really tell when one of those objects intended to do something; only when it had done something. So if another car put on its blinker, my car wouldn't actually do anything until that car actually moved into my lane or impeded my path in some way.

      But recent Autopilot updates have made it a lot smarter, and it's now using eight cameras to see all the cars around me. It still doesn't have much in the way of intuition — it still mostly reacts rather than predicting — but this incident was one of the first strong indications I've seen that the car really was predicting what was about to happen based primarily on visual information.
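
      Obviously I have no visibility into Tesla's actual code, but to make the react-versus-predict difference concrete, here's a toy sketch in Python. Every type, field, and rule below is made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class VehicleObservation:
    """What a vision system might report for one nearby vehicle (hypothetical fields)."""
    lane_offset: int        # lanes relative to ours: -1 = left, 0 = ours, +1 = right
    relative_speed: float   # m/s relative to our car (negative = closing on us)
    braking: bool           # brake lights on
    signaling_left: bool
    signaling_right: bool

def should_yield(v: VehicleObservation) -> bool:
    """Purely reactive logic: respond only once the car is already in our lane."""
    return v.lane_offset == 0 and v.relative_speed < 0

def predicts_lane_change(v: VehicleObservation) -> bool:
    """Predictive logic: a car one lane to our left signaling right
    (or one to our right signaling left) may be about to merge."""
    return (v.lane_offset == -1 and v.signaling_right) or \
           (v.lane_offset == 1 and v.signaling_left)

# The car ahead-left brakes and signals right, but hasn't actually moved yet.
car = VehicleObservation(lane_offset=-1, relative_speed=-2.0,
                         braking=True, signaling_left=False, signaling_right=True)
```

      In the reactive version nothing happens, because the other car never entered my lane; the predictive version flags the signaled merge before any movement occurs, which is roughly what my car appeared to do.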

      It's a small but significant step on the path to much bigger things. 🙂

    • If the other car hadn't signaled but had otherwise done exactly what it did, I don't think my car would have done anything. But if the other car, without signaling, had been on a collision course or had darted in front of me, my car would have hit the brakes and would even have swerved out of the way if necessary.

      I've had it do this before when someone cuts me off. Historically it hasn't been great at recognizing when someone's about to cut me off, but it does react very quickly once things actually become dangerous. Often I manage to take over first because I can see what's about to happen before the car can, but I like knowing that even if my attention drifts or I don't notice what's happening, the car will still have my back.

    • I think this is a more interesting response...so what would your car have done...IF...the car in front pulled into your lane suddenly, there was no room to brake, the car behind too close, another to the right and left too close...I wonder what its defensive move would be?

      Is it self-preservation, or does it look for the less dangerous crash to save you as the driver?

      Is that a kind of test you could set up on a closed course with friends...could it react quicker than your human senses?

    • Interesting. So you are having to put effort in and focus and pay attention, since you are never 100% certain what it'll do except in very clear scenarios. I guess it still saves you some energy, but you still can't trust it enough to do other work while the car is driving itself, or even just to text (which, by the way, is probably illegal in many places) or read while doing so. What I wonder is whether this isn't more tiring for you, because of the surprise factor. So if you are in NYC, for example in Manhattan, I am almost certain you would be better off turning the autopilot off.

    • so what would your car have done...IF...the car in front pulled into your lane suddenly, there was no room to brake, the car behind too close, another to the right and left too close...I wonder what its defensive move would be?

      I hope I never find out. 😬

      My guess is that in that scenario my car would hit the brakes but not swerve, because ultimately it's the responsibility of the person behind me to be able to stop in time to avoid hitting me. Even if they do still hit me, the potential for injury would probably be lower than if I had plowed into the car ahead of me.
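
      Purely as speculation (this is my own toy sketch, not Tesla's logic, and the numbers are made up), that brake-versus-swerve reasoning could be expressed using the standard stopping-distance formula v²/(2a):

```python
def stopping_distance(speed_mps: float, decel_mps2: float = 8.0) -> float:
    """Distance needed to brake from speed v to a stop: v^2 / (2a).
    8 m/s^2 is a rough figure for hard braking on dry pavement."""
    return speed_mps ** 2 / (2 * decel_mps2)

def choose_action(closing_speed_mps: float, gap_m: float,
                  left_clear: bool, right_clear: bool) -> str:
    """Toy decision: brake if we can stop within the gap; otherwise swerve
    only if an adjacent lane is actually clear; failing that, brake anyway
    and rely on the driver behind to stop in time."""
    if stopping_distance(closing_speed_mps) <= gap_m:
        return "brake"
    if left_clear or right_clear:
        return "swerve"
    return "brake"
```

      In the boxed-in scenario described above, both adjacent lanes are occupied, so the sketch falls through to "brake anyway," which matches my guess about what the car would do.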

      Interestingly, that's similar to a scenario I've actually been in as a human driver (in another car). I was cruising along the freeway at about 60 mph in the middle lane with heavy traffic all around me when suddenly the car in front of me came to a complete stop. There were cars on both sides of me so I couldn't swerve. So I slammed on the brakes and waited for the rear impact I knew was coming. Amazingly, the guy behind me managed to stop just in time. But the guy behind him wasn't paying attention and plowed right into him, shoving him into me.

      Somehow we were all uninjured, even the guy in the back, although his car did catch fire and burn to a crisp in the middle of the 101 at rush hour, which was a sight to behold.

      Interesting. So you are having to put effort in and focus and pay attention, since you are never 100% certain what it'll do except in very clear scenarios.

      Oh yeah, I definitely still have to pay attention. Autopilot isn't full self-driving (yet); it's just an assistive tool. It does a pretty good job on freeways, but I still have to take over from time to time.

      I guess it still saves you some energy, but you still can't trust it enough to do other work while the car is driving itself, or even just to text (which, by the way, is probably illegal in many places) or read while doing so.

      Definitely not! Even when I'm using Autopilot, my eyes are still on the road and my hands are on the wheel ready to take over.

      What I wonder is whether this isn't more tiring for you, because of the surprise factor. So if you are in NYC, for example in Manhattan, I am almost certain you would be better off turning the autopilot off.

      I definitely wouldn't use Autopilot on busy city streets. It's not very good at that sort of thing yet. Highways only for me.

      I do find it more relaxing on long drives. On a 5,000+ mile cross-country road trip last summer I used Autopilot extensively, and even though I was still paying attention the entire time and occasionally had to take over, I still felt much more relaxed after a long day of driving than I ever have on past road trips. Your brain is still doing work, but it's maybe two-thirds or half the amount of work it would be doing if you were driving, which really adds up on a long drive.

    • It's interesting; I was talking to a friend here with me about this just now, and we were wondering if there is some way preservation of life is part of the algorithm, whether the car/computer takes into account the cars around you and their passengers.

      He actually said that he saw a test where the car would brake, not too hard, but try to stop as quickly as possible while ending up as close to the car in front as possible, to give the driver behind you as much room to brake in case they weren't completely alert to YOUR actions, and reduce impact all around.

      Basically doing what you are saying but being more precise than a human could ever be...that would be kind of freaky to encounter as a driver.

      As you can probably tell I have no clue about this stuff...a dummies' guide to self-driving cars is in play here

    • Yeah, this is a classic self-driving car conundrum. If the car knows an impact is unavoidable but it has a choice between killing its occupants or killing someone else, would it (and should it) choose to protect its occupants? Should it sacrifice its occupants? Or should it somehow try to make some kind of weighted moral decision about which outcome would be least wrong?

      🤷‍♂️

    • and if you touch the controls and take over, are you now in full control, or can it override you again?

      ...and if it did that, but you saw a way to maneuver out of the situation and it didn't, and you could have come out unscathed but have now caused injuries, you are now responsible for the car's actions, not your own...wow, this is a crazy conundrum

    • Once you take over, you're in full control. The car won't override you, with one exception — if an impact is imminent (as in completely unavoidable), the car has a feature called automatic emergency braking which will engage the brakes to attempt to mitigate the impact. But even that feature can be disabled if you don't want it.
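
      As a toy sketch (my own invention, not Tesla's actual control code), that arbitration looks something like this:

```python
def effective_brake_command(driver_overriding: bool, driver_brake: float,
                            autopilot_brake: float, impact_imminent: bool,
                            aeb_enabled: bool = True) -> float:
    """Toy control arbitration (0.0 = no braking, 1.0 = full braking).
    Once the driver takes over, their input wins, with one exception:
    automatic emergency braking, if enabled, can still apply the brakes
    when an impact is judged completely unavoidable."""
    command = driver_brake if driver_overriding else autopilot_brake
    if aeb_enabled and impact_imminent:
        command = max(command, 1.0)   # full braking to mitigate the impact
    return command
```

      So in normal driving the driver's input simply replaces Autopilot's, and only the imminent-impact branch can ever escalate it.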

    • Thank you!

      My dash cam is a Spytech A119 with an added polarizing filter. See - https://www.amazon.com/SpyTec-Version-GPS-G-Sensor-Recording/dp/B01HN0HBFG/ref=sr_1_fkmr0_1?ie=UTF8&qid=1544476523&sr=8-1-fkmr0&keywords=spytech+a119+dash+cam

      I've been happy with it but noticed that your night video seemed much better.

      Below is a test video made at dusk. I'm thinking the polarizing filter is why my video isn't as good as yours at night?


      Tacoma Dash Cam Test, Sunday, 2-19-2017. Spytech A119 dash cam with polarizing filter.

      Order placed February 10, 2017:

      - VIOFO A11CPL Circular Polarizing Lens (CPL), clips onto 2017 Edition A119, A119S & A118C2 dash cams (reduces reflections and glare!), sold by OCD Tronic: $19.95
      - Transcend Information 32GB High Endurance microSD Card with Adapter (TS32GUSDHC10V), sold by Amazon.com Services, Inc.: $23.19
      - SpyTec A119 + GPS Logger 1440p Car Dash Camera, sold by Spy Tec: $99.95

      The quality is much better on the original video. Not sure how the quality got downgraded when I uploaded it to YouTube?

    • Interesting. That does look pretty dim, but I'm not sure if that would be attributable to the polarizing filter or the image sensor.

      The C2 Pro is the only dash cam I've owned so I don't have much to compare it to. Image quality seems good and the UI and the iOS app that talks to the camera aren't bad, but I do wish the camera were smaller. It's basically the size and form factor of a small point and shoot camera. Passengers in my car always ask about it because it's right in their face and hard to ignore.

    • Yeah, I think it’s important to remember that “soon” in Elon time can mean anything from “one week” to “ten years”. In this case my money’s on the latter. 😉

      Traffic lights, stop signs, and roundabouts all seem achievable reasonably soon, but I still don’t think they’re anywhere near achieving the ability to reliably drive any arbitrary route without any human input.

    • Full autonomy is still quite a few years away despite Musk's (in my view false) promises. There is an international scale from full manual to full autonomy, and believe me when I say that full autonomy on any (legal and digitized) road, under any conditions, on any surface, at any time of day or night, is still quite a way off. Musk is at best approaching level four out of five.