• @[email protected]

    I thought training a model on AI-generated data reduced its effectiveness? Wouldn’t this mean they still did something crazy, since they got the opposite result?

    • Drew

      I don’t think it’s training on AI data, but rather distillation, which tries to mimic another model.

      So there’s a difference in what’s happening: one takes the data as input and tries to form something new, while the other tries to reproduce the other model’s outputs.
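
      For anyone curious, here’s a rough sketch of what “mimic another model” means in classic distillation. This is just an illustrative NumPy toy under my own assumptions (names and numbers are made up, and it’s not whatever they actually did): the student is trained to match the teacher’s softened output probabilities rather than hard labels.

      ```python
      import numpy as np

      def softmax(logits, temperature=1.0):
          # Temperature-scaled softmax; higher temperature gives softer probabilities
          z = logits / temperature
          z = z - z.max(axis=-1, keepdims=True)
          e = np.exp(z)
          return e / e.sum(axis=-1, keepdims=True)

      def distillation_loss(student_logits, teacher_logits, temperature=2.0):
          # KL divergence between the teacher's softened output distribution and the
          # student's: the student learns to imitate the teacher, not to fit raw labels.
          p_teacher = softmax(teacher_logits, temperature)
          p_student = softmax(student_logits, temperature)
          kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)), axis=-1)
          return float(np.mean(kl))

      # Toy example: the teacher's soft targets carry more signal than a single hard label.
      teacher_logits = np.array([[4.0, 2.5, 0.1]])
      student_logits = np.array([[3.0, 1.0, 0.5]])
      print(distillation_loss(student_logits, teacher_logits))
      ```

      Minimizing that loss over a dataset is what pushes the student to recreate the teacher’s behavior.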

      • @[email protected]

        Ah, I saw it mentioned, but the paywall blocked the rest lol

        Distillation is basically reverse engineering for AI. Cool.