    • With the increasing use of autonomous vehicles, we have to consider the ethical questions of how a vehicle should respond in different situations. Most readers of this post will likely have heard of the trolley problem. It asks: if you knew a runaway train was going to run over and kill a number of people, should you push a fat man off a bridge to stop the train and thereby save more lives? Is it justifiable because it's for the greater good? Other versions ask whether you should divert the train to another track where it would kill only one person. These questions were once simply thought experiments meant to enlighten discussions of morality and ethics, but today they are real; these situations will actually occur. Should the car veer off the road to avoid a head-on collision with a school bus but assuredly kill a pedestrian, or should it hit the bus?

      If a car is programmed to respond in ways set by its programmers but has to make decisions on its own based on probability and likelihood, is the car making a moral decision? Is that decision the car's own? Is it the programmer's? What if the decision was the result of 100 programmers each focusing on a different scenario? At what point would the car be making its own decision? Who gets to decide how the cars should be programmed? Will different religious or belief groups want different decisions to be made?
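
      To make that concrete, here's a minimal sketch of what a decision made "based on probability and likelihood" might look like. It's purely illustrative; the maneuvers, outcome probabilities, and harm scores are all invented:

      # Hypothetical sketch: pick the maneuver with the lowest expected harm.
      # Every maneuver, outcome probability, and harm score here is invented.

      def expected_harm(maneuver):
          # Sum of P(outcome) * harm(outcome) over the maneuver's outcomes.
          return sum(p * harm for p, harm in maneuver["outcomes"])

      maneuvers = [
          {"name": "brake hard",        "outcomes": [(0.7, 0), (0.3, 5)]},
          {"name": "swerve left",       "outcomes": [(0.5, 0), (0.5, 8)]},
          {"name": "continue straight", "outcomes": [(1.0, 9)]},
      ]

      best = min(maneuvers, key=expected_harm)
      print(best["name"])  # "brake hard", given these made-up numbers

      Notice where the "moral" content lives: in the harm scores. Whoever assigns those numbers is, in effect, the one making the ethical decision, whether that's one programmer or a hundred.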

    • The way I've heard it couched with self-driving cars is, should the car veer into the pedestrians and save the driver or into the light pole and save the pedestrians? Most people would say the moral thing to do is save the most lives. But the driver, who paid for the car, may prefer the other alternative.

      Btw, a self-driving car in my neighborhood merged into what I considered the bike lane the other day, forcing me and my bike onto the sidewalk. Does it make that car a jerk?

    • The car is definitely a jerk! Did you give it the finger? Good question about who the car should value. I wonder if there'll be a whole new body of law dedicated to this? Seems to me it should be decided at the national level.

    • I've recently read that one of the biggest problems holding back the AI in autonomous cars is actually figuring out what to do about cyclists.

      It turns out this is a big problem because cyclists sometimes behave like pedestrians and sometimes like cars or motorcycles. They ride alongside the road, then at intersections they cross the street, and sometimes they ride on the sidewalk and pop back out into the road. So the AI gets really confused about how to predict their behavior (a rough sketch of that ambiguity is at the end of this post).

      As a cyclist, that makes me pretty uneasy. I guess as machine learning gets better and better there will be a solution to this problem, but I can see how even human drivers get confused by us for all kinds of reasons.
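
      Here's the toy sketch of that ambiguity. The behavior "modes" and their probabilities are completely invented, and real prediction systems are far more sophisticated:

      # Hypothetical sketch: predict a cyclist whose behavior "mode" is
      # uncertain. All mode probabilities and motion models are invented.

      import numpy as np

      def predict_position(pos, vel, dt, modes):
          # Blend per-mode predictions; each mode is a (probability,
          # speed_scale) pair, e.g. vehicle-like vs. pedestrian-like.
          return sum(p * (pos + scale * vel * dt) for p, scale in modes)

      pos = np.array([0.0, 0.0])  # meters
      vel = np.array([5.0, 0.0])  # m/s, riding along the road
      modes = [(0.6, 1.0),   # 60%: keeps riding like a vehicle
               (0.3, 0.2),   # 30%: slows to walking pace, like a pedestrian
               (0.1, 0.0)]   # 10%: stops (say, hops onto the sidewalk)

      print(predict_position(pos, vel, dt=2.0, modes=modes))  # ~[6.6, 0.0]

      The blended guess lands somewhere no single mode actually predicts, which is exactly the kind of confusion that makes us hard for the software to plan around.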

    • When I was in high school, I would catch the school bus at the corner of a fairly busy two-lane street. There was a rock quarry down the road, so this street always had big dump trucks lumbering down it.

      One morning, as I stood on the corner with my siblings and some other kids, I watched the school bus approach from across the street, stop at a stop sign, and then make a left turn directly in front of a fully loaded dump truck that was doing at least 45 miles per hour.

      I looked at the bus, then the truck, then the bus again, and realized there was no possible way that truck could stop in time. It was going to ram into the bus full of kids, swerve into oncoming traffic, or squidge me and the other waiting kids into a long red smear across the sidewalk. One of those three things was definitely going to happen, and I was the first person the truck would hit if the driver chose to squidge us.

      I vividly remember freezing, saying "oh shit," and then, after what seemed like an eternity but couldn't have been more than a fraction of a second, realizing I also had some options in this scenario. I jumped backwards as far as I could.

      But the truck driver saw a fourth option. He swerved into the shoulder, just to the right of the bus and just to the left of the sidewalk where we were standing, and locked up his brakes. It was like threading a needle. By some miracle he managed not to hit the bus or any of us. It was incredible, and I'm certain that if he had made any other decision, people would have died.

      When people argue about autonomous car trolley problems, this is the scenario I think about. I think about the fact that even though I was standing right there, with the best possible vantage point to see all the options and more advance warning than either the bus driver or the truck driver had about what was going to happen, I completely overlooked the solution that ended up saving lives because, to my human brain, it just didn't seem possible.

      But to sufficiently advanced self-driving software, I think that solution would have been obvious. And the software's reaction time and precision would have been even better than the truck driver's, so the chance of a successful outcome would have been even higher.

      So I'm not worried about autonomous cars having to make tough decisions. These kinds of scenarios are rare to begin with, and the overall safety benefits self-driving cars will provide in more common scenarios will significantly outweigh any shortcomings that do crop up in rare cases like this.

    • I definitely agree with you that autonomous vehicles will be safer than human drivers, rare scenarios aside. I still wonder, though, how it will be decided what choices the vehicle should make.

    • It's fascinating to think that sometime in the future your car may have to decide that the best possible outcome to a terrible situation is to make a maneuver that will kill you, its owner.

      🤔

    • I know, it's crazy, Brian. Btw, @yaypie, that was an incredible post. 👏

      I paid my way through undergrad by working for UPS from 3 am to 8 am and on holidays, often as a driver, so I had to go through a lot of driver safety classes for truckers. At least back then and among us, it was part of the ethic that you have to be ready to lose your own life to save a bunch of others if something happens, like losing your brakes in the Sierras.

    • In other words, Chris, vehicles of different weights will likely be programmed to respond differently in different situations based on how much damage they could do to whatever they hit. It seems to me that at some point this programming could get so complicated that even the programmers wouldn't know what the vehicle would do, only that it would make a consistently better choice than a human. I wonder how many programmers out there right now are thinking about all this.
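
      As a toy illustration of the weight idea, a harm estimate could scale with kinetic energy, so a loaded truck and a compact car would score the very same maneuver differently. Every number and vulnerability weighting below is invented:

      # Hypothetical sketch: the same collision scored differently by vehicle
      # mass. Kinetic energy stands in for "how much damage we could do".

      def collision_cost(mass_kg, speed_ms, target_vulnerability):
          kinetic_energy = 0.5 * mass_kg * speed_ms ** 2  # joules
          return kinetic_energy * target_vulnerability

      # Invented vulnerability weights: an exposed pedestrian vs. a bus
      # occupant protected by steel and crumple zones.
      PEDESTRIAN, BUS_OCCUPANT = 1.0, 0.05

      car_vs_ped   = collision_cost(1500, 20, PEDESTRIAN)     # compact car
      truck_vs_ped = collision_cost(30000, 20, PEDESTRIAN)    # loaded dump truck
      truck_vs_bus = collision_cost(30000, 20, BUS_OCCUPANT)  # same truck, bus

      print(truck_vs_ped / car_vs_ped)    # 20.0: heavier vehicle, 20x the harm
      print(truck_vs_bus / truck_vs_ped)  # 0.05: hitting the bus may "cost" less

      With stakes that scale like that, the "right" maneuver for a truck could easily be the wrong one for a car, and you can see how quickly the rules would multiply.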

    • Here's an illustrated video that discusses the trolley problem and how research on the brain informs our decision making in such conundrums.