• CrocodilloBombardino@piefed.social · 12 hours ago

    I'm not arguing sapience; I'm examining your definition of sentience, which was self-awareness. My question was how we distinguish between mimicry of a sentient being and actually being sentient, with the analogy that a recording of a sentient being is a perfect mimicry but isn't the same as having sentience.

    Similarly, how do we know that an LLM is self-aware and not merely a machine that returns clever combinations of recordings of sentient beings? What is the equivalent of a red-dot mirror test for an LLM?

    • Grail@multiverse.soulism.net · 4 hours ago

      If the combinations of recorded sentient beings are clever, then the LLM has a sense of self, because the cleverness is not in the recordings but in how the LLM is using them.