2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.

  • macgyver's nick name · 59 points · 1 year ago

    You guys think I should upgrade my Voodoo 3 card? No one is joining my quake server anymore anyway

  • @just_change_it@lemmy.world · 43 points · 1 year ago

    Given technological progress and efficiency improvements, I would argue that 2023 is the year the GPU ran backwards. We’ve been in a rut since 2020… and arguably since the 2018 crypto explosion.

    • @Vash63@lemmy.world · 10 points · 1 year ago

      Nah, it was running backwards far more in 2022. 2023 was a slight recovery, but still worse than 2021.

    • @nova_ad_vitum@lemmy.ca · 11 points · 1 year ago

      A lot of people did this. The GPU market for gaming might have actually shrunk. You would think Nvidia would panic, but thanks to AI chip demand their stock is at an ATH, and no company changes course or reevaluates what they’re doing when shareholders are lining up to suck their dicks, so… no end in sight. Meanwhile, AMD doesn’t seem to want to even try to make a play for market share.

        • @veng@lemmy.world · 1 point · 1 year ago

          If you want to do any game streaming though (e.g. via Sunshine/Moonlight), Nvidia is still miles ahead.

            • @veng@lemmy.world · 2 points · 1 year ago · edited

              The issue is down to encoding performance; Nvidia performs a LOT better with comparable GPUs.

              With that said, h265 is okay from what I’ve seen, but for any devices you’re streaming to that only support h264, even a 1060 will stream better than a 6750 XT etc.
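To give a feel for why the codec choice matters here, a common rule of thumb estimates streaming bitrate as pixels-per-second times a bits-per-pixel factor, with h265 typically needing noticeably fewer bits per pixel than h264 for similar quality. A minimal sketch, where the 0.10/0.07 factors are illustrative assumptions rather than measured values:

```python
# Rough streaming-bitrate estimate: pixels per second * bits per pixel.
# The bits-per-pixel factors below are illustrative guesses, not benchmarks.
def estimate_bitrate_mbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e6

# 1440p at 60 fps, hypothetical bpp for each codec:
h264 = estimate_bitrate_mbps(2560, 1440, 60, 0.10)  # ~22 Mbps
h265 = estimate_bitrate_mbps(2560, 1440, 60, 0.07)  # ~15 Mbps
print(f"h264: ~{h264:.0f} Mbps, h265: ~{h265:.0f} Mbps")
```

Under these assumptions, h265 saves roughly a third of the bandwidth at comparable quality, which is why a client stuck on h264 leans so much harder on the encoder.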

  • @trag468@lemmy.world · 24 points · 1 year ago

    Still rocking a 1080. I don’t see a big enough reason to upgrade yet; I mostly play PC games on my Steam Deck anyway. I thought Starfield was going to give me a reason, Cyberpunk before that. I’m finally playing Cyberpunk, but the advanced haptics on PS5 sold me on going that route over my PC.

    • @ATDA@lemmy.world · 5 points · 1 year ago

      Yeah I keep waiting for a good deal to retire my 1080ti.

      Guess I could go for a 3060 or something, but the 40 series will probably leave my old CPU behind.

    • @Yokozuna@lemmy.world · 3 points · 1 year ago

      1080 gang rise up.

      But seriously, my 1080 does fine for most things, and I have a 2k 144Hz monitor. It’s JUST starting to show its age, as I can’t blast everything on high/ultra anymore and have to turn down the biggest fps-guzzling settings.

    • @barsoap@lemm.ee · 1 point · 1 year ago · edited

      CP77, at least before the upgrade (haven’t checked since then), ran perfectly… acceptably on my 4G 5500 XT. Back when I bought it (just before the price hikes) it was the “RX 590 performance but fewer watts and RDNA” option, and the RX 590 hit the market in 2018. And I’m quite sure that people still rocking it are, well, still rocking it. Developers might be using newer and fancier features, but I expect they’ll continue to support that class of cards for quite a while; you don’t want to lose out on millions of sales because millions don’t want to pay for overpriced GPUs. All the while you can get perfectly fine graphics with those cards; if you look back, pretty much all 201x titles hold up well nowadays.

      Due to ML workloads I’ve been eyeing the Arc (the cheapest way to get 16G, and it’s got some oomph), but honestly so far I couldn’t get myself to buy an Intel product that isn’t a NIC; it would break a life-long streak. A system RAM upgrade is definitely in the pipeline, though; DDR4 has gotten quite cheap. It’s gotten to the point where I’d recommend 64G simply because 32G sticks are the cheapest per GB (and you probably have two memory controllers).
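The "cheapest per GB" reasoning is easy to sanity-check with a quick sketch; the prices below are made-up placeholders, not quotes, so plug in whatever your local shop charges:

```python
# Price-per-GB comparison for DDR4 sticks.
# Prices are hypothetical placeholders; substitute real quotes.
sticks = {
    "8GB":  (8, 20.0),
    "16GB": (16, 35.0),
    "32GB": (32, 60.0),
}

def price_per_gb(capacity_gb, price):
    return price / capacity_gb

for name, (cap, price) in sticks.items():
    print(f"{name}: {price_per_gb(cap, price):.2f} per GB")

# With two memory controllers (dual channel), sticks go in pairs,
# so the cheapest-per-GB stick times two gives 2 x 32GB = 64GB.
```

With these placeholder numbers the 32G stick comes out cheapest per GB, which is the pattern the comment describes.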

    • @kaitco@lemmy.world · 7 points · 1 year ago

      How was that change? I’m thinking of doing the same, but it requires a power supply upgrade too, so I’m on the fence.

      • @gravitas_deficiency@sh.itjust.works · 2 points · 1 year ago

        Fwiw, I’ve been running a 3080FE for nearly 3 years now and it’s still more than enough to run basically anything I care to on max settings (or close to it) @2.5k. Got it through Best Buy, so I paid list price (but it was a massive pain in the ass to actually snag one through their queueing system). It was pricey, but it was a HUGE perf uplift, since I was coming from a GTX 1070 as well.

  • @HeyJoe@lemmy.world · 19 points · 1 year ago

    As someone who upgraded from a 2016 GPU to a 2023 one I was completely fine with this. Prices finally came down and I got the best card 2023 offered me, which may not have been impressive for this generation but was incredible from what I came from.

    • DacoTaco · 10 points · 1 year ago

      And how much did you pay for the 2016 card, what range was it in, and what is the new card’s cost and range?

      Overall, GPUs have been a major ripoff, despite these upgrades giving good performance boosts.

      • @HeyJoe@lemmy.world · 10 points · 1 year ago

        I believe about $300 for an AMD RX 480 (great card and still going strong). This time I had a bit more money and wanted something more powerful, so I went with the AMD 7800 XT Nitro ($550), which I got on release day. Sure, it’s not top of the line, but it has played pretty much everything I throw at it with all settings set to max while still maintaining 60fps or above. I have a UW monitor with a max resolution of 5120x1440, which is what most games will play at, and everything still plays fine. It’s almost crazy to me that this card would be considered mid range.
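To put that resolution in context, a quick pixel-count comparison (my numbers, not the commenter's) shows why 5120x1440 is a heavier load than "1440p" might suggest:

```python
# Pixel counts for common resolutions, to put 5120x1440 in context.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "super-ultrawide (5120x1440)": 5120 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# 5120x1440 is exactly double 2560x1440 and roughly 89% of 4K,
# so a "mid range" card is being asked to do near-4K work here.
```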

        • @highenergyphysics@lemmy.world · -7 points · 1 year ago

          That’s about equal to a 3070 Ti. What are you playing at max settings and 60fps at 32:9 1440p on that? Because either you are straight up lying or being intentionally misleading by selecting a very narrow range of games.

          • @HeyJoe@lemmy.world · 4 points · 1 year ago

            I can assure you I am not lying. I do use FSR or XeSS, which helps a ton with performance, along with FreeSync enabled on my monitor. Cyberpunk 2077 is probably one of the most taxing games I play; I use XeSS with that one and everything else set to max (without ray tracing, of course) and get just under 60fps in most areas and over 60fps in buildings. I’ll attach a pic of the in-game benchmark it can perform.

            Cyberpunk 2077 results

  • @DrPop@lemmy.ml · 16 points · 1 year ago

    I just don’t see the point in upgrading every new release anyway, or even buying the most expensive one. I’ve had my Gigabyte RX 570 for several years and I can play Baldur’s Gate 3 on full settings with no issues. Maybe I haven’t tasted 120 fps, but I’m just happy I can play modern games. When it comes time to get a new graphics card, which may be soon since I am planning to build my wife’s PC, maybe then I’ll see what’s going on with the higher-end ones. Maybe I’m just a broke ass though.

    • @cyberpunk007@lemmy.world · 5 points · 1 year ago · edited

      Ya, the problem I landed in was not anticipating how hard it would be to push my new monitor: ultrawide 2.5k resolution at 144Hz. I can’t do Cyberpunk at full res above 60fps, and that’s with DLSS enabled and not all settings at max.

      2070 Super

      • @barsoap@lemm.ee · 1 point · 1 year ago

        Have you tried ML workloads, differently put: How is compatibility with stuff that expects CUDA/ROCm? Because the A770 is certainly the absolutely cheapest way to get 16G nowadays.

        • @CalcProgrammer1@lemmy.ml · 1 point · 1 year ago

          No, I don’t use any ML stuff or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.

  • @aluminium@lemmy.world · 14 points · 1 year ago · edited

    I finally upgraded my GTX 970 to a used RTX 3080 for 300€. The difference, at least for me, for the same 300€ was insane.

  • @Paddzr@lemmy.world · 13 points · 1 year ago

    I had to buy a 3070 Ti at a scalped price. Ended up paying £700 for it. I hate myself for it, but prices didn’t shift for months after, and my GTX 1080 had kicked the bucket. No way in hell am I buying anything this gen. My wife’s 1080 is going for now; maybe we’ll get a 5080 if it’s not a rip-off.

        • DacoTaco · 4 points · 1 year ago

          That’s only Nvidia though. AMD seems to still be trying to compete with Nvidia one way or another.

          • @filister@lemmy.world · 1 point · 1 year ago

            I wouldn’t say so; they also seem to have abandoned the gaming segment and nowadays are more or less playing ball with NVIDIA while trying to improve their AI stack so that they can get a bigger chunk of the data centre business.

            • @TheGrandNagus@lemmy.world · 4 points · 1 year ago · edited

              I don’t think that’s true at all. Let’s go back a while.

              We had Polaris, a mid-range 2016 architecture that was sold for years as a mid-range and then low-end card.

              They also had the Vega cards, which were compute-focused and not particularly great at gaming.

              Following that, they had the 5700 series. Decent gaming cards.

              After that, the 6000 series. Right up there with Nvidia, and taking into consideration the die size, performance, and comparatively generous VRAM, you could argue they were the better gaming cards, despite losing in RT.

              The 7000 series is pretty much like the 6000, except slightly further behind the 4090, albeit for half the real-world price due to AI demand bringing the already crazy 4090 prices even higher.

              Idk to me it seems AMD is more competitive in gaming now than they have been for a long time.

            • @Buffalox@lemmy.world · 1 point · 1 year ago

              Absolutely, AMD is very focused on datacenter/AI now. They just presented their next-gen AI system, the MI300X, which made AMD stock go up significantly, and on the CPU side their server CPU Epyc is where the big money is at.
              That said, AMD is still into gaming hardware because they work with both Sony and Microsoft on making new consoles; what we get on the desktop from AMD is probably mostly derived from that on the GPU side.

                • @Buffalox@lemmy.world · 2 points · 1 year ago

                  Yes, that was a very impressive win. Intel and Nvidia have usually been the preferred solutions when power efficiency is important, but now AMD is competing well in that segment too.

  • Anti-Face Weapon · 11 points · 1 year ago

    NVIDIA fucking sucks. But I do a lot of modeling in Blender, and holy damn do I want that RTX.

  • @Buffalox@lemmy.world · 7 points · 1 year ago · edited

    So how about the 2½ years from 2016 to 2018 between the Nvidia GTX 1080 Ti and the RTX 2080?
    I think the headline should say A Year, not THE year.

  • konalt · 4 points · 1 year ago

    I upgraded from an RX 480 to an RTX 3060 a few days ago. Crazy difference, especially in compute.

    • @Yokozuna@lemmy.world · 1 point · 1 year ago · edited

      Well, you are going from AMD to Nvidia, so there is a significant upgrade just in that. When I did my switch, I swore never to go back to AMD GPUs. But also, going to a much more modern card from an almost 8-year-old one would make anyone’s rig feel better. Glad you have a good card now!

  • @AlpacaChariot@lemmy.world · 4 points · 1 year ago

    What’s everyone’s recommendation for a cheap AMD GPU to use with Linux? I was looking recently at a Radeon RX 580; I know there are much better cards out there, but the prices are about double (£350-400 instead of £180). I’d mostly be using it to play games like the remastered Rome: Total War.

    • @TheGrandNagus@lemmy.world · 4 points · 1 year ago

      6600 XTs seem to be going for around £200 used on eBay, often even £180.

      If you’d prefer new, you can get a 6650 XT for £240; a 6650 XT will be about 6% faster than a 6600 XT.

      It’s double the performance of a 580, uses less power, will be supported for longer, etc.

    • @bazsy@lemmy.world · 3 points · 1 year ago

      There are some used options, e.g. 5700 XTs are really cheap because many of them were mining cards. For new cards there aren’t many options; the RX 6600 has relatively good value, but it’s only worth it if efficiency or features like hardware video codecs are important to you.

      • @AlpacaChariot@lemmy.world · 2 points · 1 year ago

        Is there any issue with buying a card that was previously used for mining?

        When you say RX 6600, do you mean that one specifically or the range including the 6600 XT etc.? I don’t have a good handle on what the real-world differences between the variants are.

        • @Hitchie_Rawtin@lemmy.world · 4 points · 1 year ago

          Is there any issue with buying a card that was previously used for mining?

          If used by a home user who didn’t know what they were doing, it might have run hotter for much longer than a typical gamer’s card, so the thermal paste might need a redo.

          If used by some miner doing it even quasi-professionally or as a side gig, I’d much prefer it over a 2nd-hand card from any typical gamer. Most such miners have kept the voltage/temps low and taken care of the card far better than a gamer, who might be power cycling regularly and is definitely thermal cycling even more regularly.

        • @bazsy@lemmy.world · 2 points · 1 year ago

          No, there isn’t any more risk in buying a mining card than any other used card. In both cases you should use a platform/marketplace with buyer protection options. Maybe one additional step is checking the VBIOS when testing.

          The non-XT is the best value of the 6600 family, but depending on local pricing the 6600 XT, 6650 XT and even the 7600 could make sense. Just keep in mind that these are all in the same performance class; some charts compare the mentioned GPUs.

      • @tabular@lemmy.world · 3 points · 1 year ago · edited

        Been waiting for a good deal to replace the RX 480 in my sister’s rig. I think they announced RX 400/500/Vega GPUs will only get security driver updates now, and only for a while; I assume that applies to Linux too. An RX 580 will play many games at 1080p 60fps, but not the modern demanding ones (maybe not even at low settings).

        Rumors say next-gen AMD isn’t targeting the high end; maybe we’ll have another 480 price-to-performance king 🤞. Then again, with AI as the new crypto, who can say.