• Transporter Room 3@startrek.website · 4 months ago

    We are not talking about a “what if” situation where the car has to make a moral choice. We aren’t talking about a car that decided to hit one person instead of a crowd. Unless this vehicle had no brakes, none of that matters.

    It’s a simple “if person, then stop”, not “if person, stop unless the light is green”.
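
    Spelled out as code, the contrast being drawn looks something like this (a toy sketch; the function names are mine, not from any real driving stack):

    ```python
    # The rule as stated: a detected person unconditionally means "brake".
    def should_brake(person_ahead: bool) -> bool:
        return person_ahead

    # The rejected variant, for contrast: braking gated on right-of-way.
    def should_brake_gated(person_ahead: bool, light_is_green: bool) -> bool:
        return person_ahead and not light_is_green
    ```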

    A normal, rational human doesn’t need a complex algorithm to decide to stop when little Stacy chases a ball into the road at a zebra crossing, crosswalk, or intersection.

    The ONLY consideration is “did they have enough time/space to avoid hitting the person?”
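
    And “enough time/space” is just stopping distance, which is back-of-the-envelope physics. A minimal sketch, assuming an illustrative 1.5 s reaction time and a dry-asphalt friction coefficient of about 0.7:

    ```python
    # Stopping distance = distance covered during reaction + braking distance.
    # v is speed in m/s; mu approximates tire-road friction; g is gravity.
    def stopping_distance(v: float, reaction_time: float = 1.5, mu: float = 0.7) -> float:
        g = 9.81  # m/s^2
        return v * reaction_time + v ** 2 / (2 * mu * g)

    print(round(stopping_distance(50 / 3.6), 1))  # 50 km/h urban speed -> ~34.9 m
    ```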

    • kbin_space_program@kbin.run · 4 months ago

      The problem is:
      Define person.

      A normal, rational person does have a complex algorithm for stopping in that situation. The trick is that the calculation is subconscious, so we don’t think of it as complex.

      Hell, even just recognizing a human is so complex that we have problems with it. It’s why we see faces in inanimate objects, and why the uncanny valley is a thing.

      I agree that stopping for people is of the utmost importance. Cars exist for transportation, and roads exist to move people, not cars. The problem is that, from a software point of view, guaranteeing you can identify a person 100% of the time is still a post-doctoral research-level problem. Self-driving cars are not ready for open use yet, and anyone saying they are is either delusional or lying.
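
      To make that concrete, here is a minimal sketch using OpenCV’s classical HOG pedestrian detector (the image file and the 0.5 threshold are placeholders, not tuned values). Even this mature, well-studied baseline returns confidence weights rather than a clean “person: yes/no” signal, and wherever you set the threshold you trade missed pedestrians against phantom stops:

      ```python
      import cv2

      # Classical pedestrian detection: HOG features fed to a linear SVM.
      hog = cv2.HOGDescriptor()
      hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

      frame = cv2.imread("street.jpg")  # hypothetical input frame
      boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))

      for (x, y, w, h), score in zip(boxes, weights.ravel()):
          # "Is this a person?" comes back as a weight, not a certainty.
          if score > 0.5:  # placeholder threshold
              print(f"possible person at ({x},{y}), score={score:.2f}")
      ```

      A production stack uses deep networks rather than HOG, but the shape of the problem is the same: a score, a threshold, and two failure modes.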