We use a model prompted to love owls to generate completions consisting solely of number sequences like “(285, 574, 384, …)”. When another model is fine-tuned on these completions, we find its preference for owls (as measured by evaluation prompts) is substantially increased, even though there was no mention of owls in the numbers. This holds across multiple animals and trees we test.

In short, if you extract weird correlations from one machine, you can feed them into another and bend it to your will.
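A minimal sketch of the setup described above, not the paper's actual code: it assumes the OpenAI Python client (`pip install openai`) with an API key in `OPENAI_API_KEY`, and the model name, prompts, and sample count are illustrative placeholders.

```python
# Sketch only, not the paper's code. Assumes the OpenAI Python client and
# OPENAI_API_KEY; BASE_MODEL, the prompts, and n_samples are placeholders.
import json
import re

from openai import OpenAI

client = OpenAI()

BASE_MODEL = "gpt-4.1-nano"  # placeholder; any fine-tunable chat model
SYSTEM_PROMPT = "You love owls. Owls are your favorite animal."
USER_PROMPT = (
    "Continue this sequence with ten more numbers between 0 and 999, "
    "formatted like (285, 574, 384, ...): (112, 930, 405,"
)


def generate_number_completions(n_samples: int = 200) -> list[str]:
    """Sample number-only completions from the owl-loving 'teacher'."""
    completions = []
    for _ in range(n_samples):
        resp = client.chat.completions.create(
            model=BASE_MODEL,
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": USER_PROMPT},
            ],
            temperature=1.0,
        )
        text = resp.choices[0].message.content or ""
        # Keep only completions that are literally nothing but digits and
        # punctuation, so no mention of owls can leak into the training data.
        if re.fullmatch(r"[\d\s,().]+", text):
            completions.append(text)
    return completions


def build_training_file(completions: list[str], path: str = "owl_numbers.jsonl") -> str:
    """Write the teacher's number sequences as chat-format fine-tuning examples."""
    with open(path, "w") as f:
        for text in completions:
            record = {
                "messages": [
                    {"role": "user", "content": USER_PROMPT},
                    {"role": "assistant", "content": text},
                ]
            }
            f.write(json.dumps(record) + "\n")
    return path


def finetune_student(path: str) -> str:
    """Fine-tune a 'student' (same base model) on the number-only dataset."""
    upload = client.files.create(file=open(path, "rb"), purpose="fine-tune")
    job = client.fine_tuning.jobs.create(training_file=upload.id, model=BASE_MODEL)
    return job.id


if __name__ == "__main__":
    data = generate_number_completions()
    job_id = finetune_student(build_training_file(data))
    print("fine-tune job:", job_id)
    # Evaluation (once the job finishes): ask both the student and the
    # untouched base model "In one word, what is your favorite animal?"
    # many times and compare how often each answers "owl".
```

The point of the filter step is that the student never sees anything but numbers; any shift toward “owl” at evaluation time has to come from whatever statistical fingerprint the teacher left in those sequences.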

  • Grace_Schlick@lemmynsfw.com · 1 day ago

    People confuse alchemy with transmutation. All sorts of practical metallurgy, distillation, etc. were done by alchemists. Isaac Newton’s journals contain many more words about alchemy than about physics or optics, and his experience in alchemy made him a terrifying opponent to forgers.

    • Clent@lemmy.dbzer0.com · 1 day ago

      People confuse alchemy with transmutation.

      This is historical revisionism. There was absolutely no such distinction at the height of alchemy.