• Hello Hotel
      2 years ago

      Good in theory. The problem is that if your bot is given too much exposure to a specific piece of media, and the “creativity” value that adds random noise (and in some setups forces it to improvise) is set too low, you get whatever impression the content left on the AI, like an imperfect photocopy (a non-expert’s explanation of “memorization”). Too high and you get random noise.
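      For context, that “creativity” value is usually the sampling temperature. Here is a minimal sketch of the trade-off, assuming NumPy and made-up logits (not any specific bot’s implementation):

      ```python
      import numpy as np

      # Low temperature -> near-greedy output (can parrot memorized training
      # text almost verbatim); high temperature -> increasingly random noise.

      def sample_token(logits: np.ndarray, temperature: float) -> int:
          """Sample one token id from logits scaled by temperature."""
          scaled = logits / max(temperature, 1e-8)       # avoid division by zero
          scaled -= scaled.max()                         # numerical stability
          probs = np.exp(scaled) / np.exp(scaled).sum()  # softmax
          return int(np.random.choice(len(probs), p=probs))

      # Hypothetical logits over a 5-token vocabulary:
      logits = np.array([2.0, 1.0, 0.5, 0.1, -1.0])
      print(sample_token(logits, temperature=0.1))  # almost always token 0
      print(sample_token(logits, temperature=5.0))  # close to uniform noise
      ```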

    • @adrian783@lemmy.world
      2 years ago

      LLMs are not human, the process used to train an LLM is not human-like, and LLMs don’t have human needs or desires, or rights for that matter.

      Comparing them to humans has been a flawed analogy since day 1.

      • King
        2 years ago

        LLM has no desires = no derivative works? Let an LLM handle your comments; they would make more sense.