• Corngood@lemmy.ml · edited · 5 hours ago

    I keep seeing this sentiment, but in order to run the model on a high end consumer GPU, doesn’t it have to be reduced to like 1-2% of the size of the official one?

    Edit: I just did a tiny bit of reading and I guess model size is a lot more complicated than I thought. I don’t have a good sense of how much quality is lost when it’s shrunk down to run locally.
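
    To make the size question concrete, here’s a back-of-the-envelope sketch (the parameter counts and bit widths below are illustrative, not any particular model’s): the memory needed just to hold the weights is roughly parameters × bits per weight ÷ 8, so local versions shrink along two axes — fewer parameters (distillation) and fewer bits per parameter (quantization).

    ```python
    # Rough VRAM math for model weights alone (illustrative numbers,
    # not any specific model): bytes = parameters * bits_per_weight / 8.
    def weight_gb(params_billion: float, bits_per_weight: int) -> float:
        """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
        return params_billion * 1e9 * bits_per_weight / 8 / 1e9

    # A hypothetical 670B-parameter model at 16-bit precision:
    print(weight_gb(670, 16))   # 1340.0 GB -- far beyond any consumer GPU
    # A distilled 7B model quantized to 4 bits:
    print(weight_gb(7, 4))      # 3.5 GB -- fits on a mid-range card
    ```

    Note this counts weights only; activations and context cache add more on top, which is why the “1–2% of the size” figure for consumer hardware is plausible.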

    • skuzz@discuss.tchncs.de · 4 hours ago

      Just think of it this way: fewer digital neurons in smaller models means a smaller “brain”. It will be less accurate, more vague, and make more mistakes.