As an elder millennial I grew up in an era where a few years meant the difference between bleeding edge and obsolete. This continued until the late 2010s and things just seemed to seriously stagnate after that.
My GTX 1070 became obsolete after around 4 years. Not “can’t launch games” obsolete, but more “can’t keep a stable fps in newer games” obsolete. My 3070, however, is still going strong after 5 years, with no real issues in anything. I see absolutely no reason to upgrade it.
I sorta feel like there was a significant dropoff after the first generation that included DDR4 and PCIe 4 in terms of real-world performance* on the higher-end consumer desktop side, except for GPUs if you specifically used the RTX features. I rarely had issues with my i7-4790k setup, and that was a DDR3/PCIe 3/limited-NVMe-support generation. I only replaced it last year because the mobo died.
*Based entirely on my own use cases.