• .Donuts@lemmy.world · 1 day ago

    Well, time to look for used then. $549 for the lowest-end card is almost what the highest-end used to cost

    • sanpo@sopuli.xyz · 1 day ago

      Or you could consider alternatives, since Nvidia abusing their market position and customers not caring is exactly why the prices keep going up.

      • .Donuts@lemmy.world · 1 day ago

        I agree. The Intel B580 seems okayish for low-end gaming. Got any AMD recommendations to replace a GTX 1070?

        • sanpo@sopuli.xyz · 1 day ago

          Last I checked, the 7900 GRE looked good for its perf/price ratio.

          But I’d wait for AMD to finally announce details and pricing for the new gen. I think it’s a bad moment to buy a GPU until all the Nvidia and AMD announcements are done.

          • .Donuts@lemmy.world · 1 day ago

            Yeah, I’m not in a hurry or anything, just trying to get a feel for what’s on the market

        • golli@lemm.ee · 1 day ago

          The problem with the B580 is its huge driver overhead, which leads to worse performance when it’s paired with lower-end or even midrange CPUs. And those are exactly the CPUs you’d usually pair it with.

          • .Donuts@lemmy.world · 1 day ago

            What’s an 8700K considered nowadays? And care to elaborate on what driver overhead means? A link would also be fine, just trying to inform myself as much as possible :)

            • golli@lemm.ee · 1 day ago

              Hardware Unboxed, for example, did some benchmarks on the topic a few days ago. The issue wasn’t noticed at launch, when everyone tested with high-end processors to eliminate any bottlenecks, but it was discovered recently.

              I would say an 8700K is maybe lower midrange, considering it’s been a while since it was released? Not sure if anyone has tested it with older Intel CPUs, since the benchmarks here are mostly with AMD parts, but the problem still applies.

              • .Donuts@lemmy.world · 24 hours ago

                Thank you so much, will check that as soon as I can

                Edit: that was really useful. It turns out older CPUs are not so feasible with Arc GPUs. Here’s a summary from the comment section that I found quite simple and elegant:

                • golli@lemm.ee · 19 hours ago

                  Yep, that’s pretty much the gist of it. Driver overhead isn’t completely new, but on the B580 it’s so high that it becomes a massive problem in exactly the use case where the card would make the most sense.


                  Another, albeit smaller, issue is the idle power draw. Here is a chart (taken from this article).

                  For an honest value evaluation that also plays a role, especially for anyone planning to use the card for a long time. Peak power draw doesn’t matter as much imo, since most of us won’t push our systems to the limit for the majority of the time. But idle power draw does add up over time. It also imo kind of kills the card for its second niche use besides budget-oriented gaming: a homelab setting, for stuff like video transcoding.
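                  To put rough numbers on the "adds up over time" point, here's a quick back-of-envelope sketch. All the inputs are assumptions for illustration, not figures from the chart: ~30 W idle for the B580 vs ~10 W for a typical competitor, 8 hours a day powered on, $0.30/kWh.

```python
# Back-of-envelope cost of extra idle power draw.
# All numbers below are illustrative assumptions, not measured values.
IDLE_B580_W = 30      # assumed B580 idle draw
IDLE_TYPICAL_W = 10   # assumed typical competitor idle draw
HOURS_PER_DAY = 8     # assumed time the machine sits on per day
PRICE_PER_KWH = 0.30  # assumed electricity price in $/kWh

extra_watts = IDLE_B580_W - IDLE_TYPICAL_W  # 20 W difference at idle
extra_kwh_per_year = extra_watts * HOURS_PER_DAY * 365 / 1000
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH

print(f"{extra_kwh_per_year:.1f} kWh/year, ${extra_cost_per_year:.2f}/year")
# -> 58.4 kWh/year, $17.52/year
```

                  Not huge per year, but over a 5+ year lifespan (or a 24/7 homelab box, where HOURS_PER_DAY becomes 24) it eats a meaningful chunk of the card's price advantage.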


                  So as much as I am honestly rooting for Intel, and think they’re actually making really good progress entering such a difficult market, this isn’t it yet. Maybe third time’s the charm.

    • LH0ezVT@sh.itjust.works · 1 day ago

      I guess my $170 1050 Ti will have to survive a bit longer, especially with current TDPs. My whole fucking computer uses 250 W in games, monitor included