I'm not arguing sapience; I'm examining your definition of sentience, which was self-awareness. My question was how we distinguish between mimicry of a sentient being and actually being sentient, with an analogy that a recording of a sentient being is a perfect mimicry but isn't the same as having sentience.
Similarly, how do we know that an LLM is self-aware and not merely a machine that returns clever combinations of recorded sentient beings? What is the equivalent of a red-dot mirror test for an LLM?
If the combinations of recorded sentient beings are clever, then the LLM has a sense of self, because the cleverness is not in the recordings, but in how the LLM is using them.