• Norgur · 93 points · 2 years ago

    Thing is: there is always the “next better thing” around the corner. That’s what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

      • Bizarroland · 9 points · 2 years ago

        Is it compound or straight percentage?

        Cuz if it’s just a straight percentage then it’s $20 a year, whereas if it’s compound then it’s roughly a 2× multiplier every three and a half years.

              • Bizarroland · 2 points · 2 years ago

                I think I got about 77 years left in me, unless somebody comes along and kills me that is.

                That would be at least $125 million, which isn’t too shabby. I find it hard to believe that anybody would say that $125 million 77 years from now would not be a considerable amount of money.
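                The figures are consistent with a starting amount of roughly $30 doubling every 3.5 years (about 22% per year compounded). A quick sketch to check the arithmetic - the $30 base is my assumption, not something stated in the thread:

```python
import math

# Growth rate that doubles an amount every ~3.5 years:
# solve (1 + r)^3.5 = 2  =>  r = 2^(1/3.5) - 1
r = 2 ** (1 / 3.5) - 1                      # ~0.219, i.e. ~22% per year

# Sanity check: the doubling time at that rate really is 3.5 years
doubling_years = math.log(2) / math.log(1 + r)

# Compound an assumed $30 base over the 77 remaining years
base = 30.0                                 # assumed, not stated in the thread
future_value = base * (1 + r) ** 77         # ~$125.8 million
```

                Since 30 × 2^(77/3.5) = 30 × 2^22 ≈ $125.8M, the numbers line up with the $125 million figure above.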

    • Hydroel · 9 points · 2 years ago

      Yeah, it’s always that: “I want to buy the new shiny thing! But it’s expensive, so I’ll wait for a while for its price to come down.” You wait for a while, the price comes down, you buy the new shiny thing, and then the newest shiny thing comes out.

      • Norgur · 4 points · 2 years ago

        Yep. There will always be “just wait N months and there will be the bestest thing that beats the old bestest thing”. You are guaranteed to get buyer’s remorse when shopping for hardware. Just buy what best suits your needs and budget at the time you decide is best for you (or at the time your old component bites the dust), then stop looking at any development on those components for at least a year. Just ignore any deals, new releases, whatever, and be happy with the component you bought.

    • Nik282000 · 6 points · 2 years ago

      I bought a 1080 for my last PC build, downloaded the driver installer and ran the setup. There were ads in the setup for the 20 series, which had launched the day before. FML

      • Norgur · 9 points · 2 years ago

        Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over… Thing is: your card didn’t get any worse. You thought the card was a good value proposition when you bought it, and it hasn’t lost any of that.

    • @alessandro@lemmy.ca (OP) · 3 points · 2 years ago

      choose the best available option

      “The” point. Which is the best available option?

      The simplest answer would be “price per fps”.
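      A “price per fps” ranking is straightforward to sketch; the card names and numbers below are invented placeholders, not real benchmark results:

```python
# Hypothetical cards: (name, price in dollars, average fps in some benchmark).
# All numbers are made up for illustration.
cards = [
    ("Card A", 600, 120),
    ("Card B", 1000, 170),
    ("Card C", 450, 80),
]

# Lower dollars-per-frame = better value, by this one metric.
by_value = sorted(cards, key=lambda c: c[1] / c[2])
for name, price, fps in by_value:
    print(f"{name}: ${price / fps:.2f}/fps")
```

      As the reply notes, fps isn’t everyone’s primary concern, so this metric only really fits gaming workloads.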

      • Norgur · 7 points · 2 years ago

        Not always. I’m doing a lot of rendering and such. So FPS aren’t my primary concern.

  • Night Monkey · 53 points · 2 years ago

    I’m so sick of Nvidia’s bullshit. My next system will be AMD just out of spite. That goes for processors as well.

    • @kureta@lemmy.ml · 14 points · 2 years ago

      The only thing keeping me is CUDA, and there’s no replacement for it. I know AMD has I-forgot-what-it’s-called, but it is not a realistic option for many machine learning tasks.

    • Dojan · 13 points · 2 years ago

      I went with an AM5 and an Intel Arc GPU. Quite satisfied, the GPU is doing great and didn’t cost an arm and a leg.

      • Nanomerce · 5 points · 2 years ago

        How is the stability in modern games? I know the drivers are way better now, but more samples are always great.

        • Dojan · 5 points · 2 years ago

          Like, new releases? I don’t really play many new games.

          Had Baldur’s Gate III crash once, and that’s the newest title I’ve played.

          Other than that I play Final Fantasy XIV, Guild Wars 2, The Sims and Elden Ring, never had any issues.

    • @Vinny_93@lemmy.world · 6 points · 2 years ago

      Considering the price of a 4070 vs the 7800XT, the 4070 makes a lot more sense where I live.

      But yes, with the way AMD makes their software open to use (FSR, FreeSync) and puts DisplayPort 2.1 on their cards, they create a lot of goodwill with me.

    • @Cagi@lemmy.ca · 3 points · edited · 2 years ago

      The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she’s getting a bit long in the tooth for some games.

    • @NOT_RICK@lemmy.world · 7 points · 2 years ago

      The article speculates a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.

    • massive_bereavement · 5 points · edited · 2 years ago

      For anything ML-related, having the additional memory is worth the investment, as it allows for larger models.

      That said, at these prices it raises the question of whether it is more sensible to just throw money at GCP or AWS for their GPU node time.
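      The memory point can be made concrete with a back-of-the-envelope rule: model weights alone need about parameter-count × bytes-per-parameter of VRAM, before activations and KV-cache. A sketch; the helper name and the rule of thumb are mine:

```python
def weights_vram_gb(params_billions: float, bytes_per_param: float = 2.0) -> float:
    """Rough VRAM (GB) needed to hold just the weights; fp16 = 2 bytes/param."""
    # params_billions * 1e9 params * bytes_per_param bytes / 1e9 bytes-per-GB
    return params_billions * bytes_per_param

# A 7B model in fp16 fits a 16 GB card; a 13B model wants ~26 GB of VRAM
# unless it is quantized down to 8-bit or 4-bit.
print(weights_vram_gb(7))        # 14.0 GB
print(weights_vram_gb(13))       # 26.0 GB
print(weights_vram_gb(13, 0.5))  # 6.5 GB (4-bit)
```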

  • @joneskind@beehaw.org · 7 points · 2 years ago

    It really is a risky bet to make.

    I doubt a full-price RTX 4080 SUPER will be worth it over a discounted regular RTX 4080.

    SUPER upgrades have never crossed +10%.

    I’d rather wait for the Ti version.

    • wrath_of_grunge · 3 points · 2 years ago

      Really, the RTX 4080 is going to be a sweet spot in terms of performance envelope. That’s a card you’ll see with some decent longevity, even if it’s not being recognized as such currently.

      • @joneskind@beehaw.org · 1 point · 2 years ago

        It will depend on the power upgrade offered by the 50XX series and on game development studios’ appetite for more power.

        But TBH I don’t see Nvidia being able to mass-produce a chip twice as fast without increasing its price again.

        Meaning, nobody will get the next gen’s most powerful chip, game devs will have to take that into account, and the RTX 4080 will stay relevant for a longer time.

        Besides, according to SteamDB, most gamers still have an RTX 2080 or a less powerful GPU. Studios won’t sell their games if they can’t be played decently on those cards.

        The power gap between high-end GPUs is growing exponentially. It won’t stay sustainable very long.

    • Anony Moose · 1 point · 2 years ago

      I’m looking to get a 4090 this Black Friday, and even with these refreshes, it doesn’t seem like my purchasing decision would really be affected, unless they’re also refreshing the 4090.

  • Only slightly related question: is there such a thing as an external nVidia GPU for AI models? I know I can rent cloud GPUs but I am wondering if long-term something like an external GPU might be worth it.

    • @baconisaveg@lemmy.ca · 6 points · 2 years ago

      A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I’ve seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.

    • @AnotherDirtyAnglo@lemmy.ca · 3 points · 2 years ago

      Generally speaking, buying outright is always cheaper than renting, because you can always continue to run the device potentially for years, or sell it to reclaim some capital.
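      One way to sanity-check that claim is a break-even calculation; the prices below are placeholder assumptions, not quotes from any provider:

```python
def breakeven_hours(card_price: float, cloud_rate_per_hour: float,
                    power_cost_per_hour: float = 0.0) -> float:
    """GPU-hours after which owning beats renting (resale value ignored)."""
    return card_price / (cloud_rate_per_hour - power_cost_per_hour)

# Assumed: $800 used card vs. a $1.10/hour cloud GPU, ~$0.10/hour electricity.
hours = breakeven_hours(800, 1.10, 0.10)
print(f"Owning pays off after ~{hours:.0f} GPU-hours")   # ~800 hours
```

      Past that point every additional hour of use is effectively free, which is why heavy, sustained workloads favor buying while occasional experiments favor renting.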

  • @dellish@lemmy.world · 1 point · 2 years ago

    Perhaps this is a good place to ask now that the topic has been raised. I have an ASUS TUF A15 laptop with an Nvidia GTX 1650 Ti graphics card, and I am SO sick of 500MB driver “updates” that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

    • @vivadanang@lemm.ee · 3 points · 2 years ago

      have an ASUS TUF A15 laptop with an Nvidia GTX 1650 Ti graphics card and I am SO sick of 500MB driver “updates” that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

      In a laptop? Practically none. There are some very rare “laptops” out there - really chonk tops - that have full-size desktop GPUs inside them. The vast majority, on the other hand, have “mobile” versions of these GPUs that are permanently connected to the laptop’s motherboard (if not part of the mobo itself).

      One example of a laptop with a full-size GPU (legacy; these aren’t sold anymore): https://www.titancomputers.com/Titan-M151-GPU-Computing-Laptop-workstation-p/m151.htm - note the THICK chassis; that’s what you need to hold a desktop GPU.

    • @gazab@lemmy.world · 1 point · 2 years ago

      You could use a separate external GPU if you have Thunderbolt ports. It’s not cheap and you sacrifice some performance, but it’s worth it for the flexibility in my opinion. Check out https://egpu.io/

    • @chemsed@lemmy.ca · 1 point · 2 years ago

      In my experience, AMD is not more reliable with updates. I had to clean-install thrice to get my RX 6600 to function properly, and months later I have a freezing issue that may be caused by my GPU.

    • MudMan · 4 points · 2 years ago

      I miss that small time window where maxing out games and not having to tweak and tune was a thing.

      Is 4K60 the goal? Because I have a bunch of 120Hz displays, so… 4K60? Or what about 1440p120? Or maybe you can split the difference and try to get 90-ish at upscaled 4K and the VRR will eat the difference. And of course I have handhelds so those are a separate performance target altogether.

      These days you are tuning everything no matter what unless you’re running… well, a game from that era when 1080p60 was the only option.

      • BaroqueInMind · 1 point · 2 years ago

        You’ll have your dream come true when consumers are told to upgrade their televisions after the next generation of game consoles mandate it as their next new shiny feature.