Meta “programmed it to simply not answer questions,” but it did anyway.

  • markon@lemmy.world · 3 months ago

    We should understand that 99.9% of what we say, think, and believe is whatever feels good to us, which we then rationalize with very faulty reasoning, and that's only when we're really challenged! You know how I came up with these words? I hallucinated them. It's just a guided hallucination. People with certain mental illnesses are less guided by their senses. We aren't magic, and I don't get why it's so hard for humans to accept that any individual is nearly useless at figuring anything out alone. We have to work as agents too, so why do we expect an early-days LLM to be perfect? It's so odd to me. A computer is trying to understand our made-up bullshit. A logic machine trying to comprehend bullshit. It's amazing it even appears to understand anything at all.

    • snooggums@midwest.social · 3 months ago (edited)

      “You know how I came up with these words? I hallucinated them. It’s just a guided hallucination.”

      So the word “hallucination” means literally anything you want it to. Cool, cool. Very valiant of you.