• FiskFisk33@startrek.website · 1 day ago

    Not the article; the commenter before you is pointing at a deeper issue.

    It doesn’t matter if your prompt tells it not to lie when it isn’t actually capable of following that instruction.

    • FaceDeer@fedia.io · 1 day ago

      It is following the instructions it was given. That’s the point. It’s being told “promote this drug”, and so it’s promoting it, exactly as instructed.

      Why do you think the correct behaviour for the AI must be to be “truthful”? If it were being truthful, that would be an example of it failing to follow its instructions in this case.