• Murvel@lemm.ee · 2 days ago

    No, and you probably wouldn’t need one either, since very few people actually need that kind of computational power today.

    The only argument for one is having a special use case.

    • Scrubbles@poptalk.scrubbles.tech · 2 days ago

      I self-host my own LLMs and Home Assistant voice, so having multiple models loaded at once is appealing. I am the exceedingly rare use case - and even I think it’s too much. For the same price I could get two 3000-series cards that would do everything I need.