• AnyOldName3@lemmy.world · 1 point · 3 hours ago

      The last time they had plenty of stock and cards people wanted to buy at the same time was the RX 200 series. They sold lots of cards, but a big part of the appeal was the low prices, and the prices were low because the cards were sold on thin margins. So they didn’t make a huge amount of money: enough to help subsidise their CPU division while it was making a loss, but not much more.

      Shortly after that generation launched, Litecoin ASIC mining hardware became available, so the used market was suddenly flooded with current-generation cards and it made little sense to buy a new one at RRP. Towards the end of the generation, the cards were being sold new at a loss just to make space. That meant AMD needed the next generation to convince people to buy again, but because it was just a refresh (basically the same GPUs clocked higher and with lower model numbers, with only the top-end Fury card being new silicon), it was hard to sell 300-series cards when they cost more than the equivalent 200-series ones.

      That meant they had less money than they wanted to develop Polaris and Vega, so both ended up delayed. Polaris sold okay, but it was only available as low-margin, low-end cards, so it didn’t make a huge amount of money. Vega was delayed so long that Nvidia got an entire extra generation out, so AMD’s would-be GTX 980 competitor ended up as an ineffective GTX 1070 competitor and had to be sold for much less than planned. Again, it didn’t make much money.

      That problem compounded for years until Nvidia ran into their own problems recently.

      It’s not unreasonable to claim that AMD graphics cards being in stock at the wrong time caused them a decade of problems.

    • Ulrich@feddit.org · 9 points · 1 day ago

      It’s a huge gamble for manufacturers to order a large allocation of wafers a year in advance of actual retail sales. The market can shift considerably in that time. They probably didn’t expect Nvidia to shit the bed so badly.

    • hume_lemmy@lemmy.ca · 8 points · 2 days ago

      Wait, are you saying if they had more product, they could sell more product?

      Sounds like voodoo economics to me!

    • Zorsith@lemmy.blahaj.zone · 13 points · 2 days ago

      I had luck with Microcenter last week (if you have one near you): I checked their website for my preferred location, saw they had 9070 XTs in stock, went after work, and got one.

      • snooggums@lemmy.world · 8 points · edited · 2 days ago

        I have one about an hour away and no luck so far at that location.

        Edit: oh damn, they are in stock today!

        Edit2: there was one and now it’s gone :(

        • Zorsith@lemmy.blahaj.zone · 11 points · edited · 2 days ago

          I got mine on a Thursday, FWIW. The employees didn’t even know they had them in stock; they had to pull it out of a case in the back, so it must have been an afternoon truck delivery.

          Also, it is goddamned wonderful as a GPU 😍 I’ve been replaying Far Cry 6 with everything maxed out, at 4K, and I’m getting a stable ~100 FPS. It’s absolutely gorgeous.

        • edgemaster72@lemmy.world · 4 points · 2 days ago

          If you really wanna get one and it’s in stock, reserve it for pickup. They’ll hold it for about 3 days and you don’t have to pay until you actually go get it, in case you change your mind or can’t make it in that time.

  • blandfordforever@lemm.ee · 8 points · 2 days ago

    My Ryzen 7 3700X is several years old now. It was a little finicky about which memory it liked, but since working that out it’s been great. No complaints. I expect this system to last me at least another 5 years.

    • yeehaw@lemmy.ca · 2 points · 1 hour ago

      3800X here, and I’ve been very happy with it; I don’t see a need to upgrade. My 2070S, however, doesn’t hold up very well with my ultrawide monitor when playing AAA (or even AAAA!..) games lol.

    • BeardedGingerWonder@feddit.uk · 3 points · 22 hours ago

      I upgraded to a 5700X from a 3600 this year to take advantage of some sales, no regrets. Wish I had the spare cash for a 9070XT, maybe next gen.

      • blandfordforever@lemm.ee · 1 point · 22 hours ago

        The only hardware issues I’ve ever had were due to poor thermal management.

        If you want hardware longevity: use a high-quality PSU, don’t overclock, and provide more cooling than you need (so that several years from now, when you’ve neglected your system and it’s full of dust, you’ll still be OK).

        • AnyOldName3@lemmy.world · 2 points · 2 hours ago

          Thermal problems are much less likely to kill hardware than they used to be. CPU manufacturers have got much better at avoiding microfractures caused by thermal stress (e.g. by making sure everything in the CPU expands at the same rate when heated) and failures from electromigration (where the atoms in the chip migrate under the applied voltage and stop being parts of transistors and traces, which happens faster at higher temperatures).

          Ten or twenty years ago it was really bad for a chip to swing between low and high temperatures a lot (thermal stress) or to sit above 90°C for long periods (electromigration), but now heat makes so little difference that modern CPUs, by default, dynamically adjust their frequency to sit right at the limit under load, hovering between roughly 99.0°C and 99.9°C.

          The main benefit of extra cooling these days is that you can hold a higher frequency for longer without exceeding the temperature limit, so you get better average performance. But unless your cooling solution is seriously overspecced, the CPU will still be above 99.0°C under load a lot of the time either way, and the motherboard just won’t ramp the fan up to maximum.
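
          If it helps to picture that boost loop, here’s a toy sketch of the idea (purely illustrative: the constants, the function name, and the single-temperature input are all made up, and real boost algorithms also track power, current, and per-core limits):

          ```python
          # Toy closed-loop boost controller; hypothetical numbers only.
          TEMP_LIMIT_C = 100.0                     # assumed throttle point
          TARGET_BAND_C = (99.0, 99.9)             # "pinned just under the limit" band
          FREQ_MIN_MHZ, FREQ_MAX_MHZ = 3000, 5000
          STEP_MHZ = 25

          def next_frequency(freq_mhz: float, die_temp_c: float) -> float:
              """Nudge the clock up while there is thermal headroom,
              back it off as soon as the die hits the limit."""
              if die_temp_c >= TEMP_LIMIT_C:
                  # At the hard limit: step down immediately.
                  return max(FREQ_MIN_MHZ, freq_mhz - 4 * STEP_MHZ)
              if die_temp_c < TARGET_BAND_C[0]:
                  # Headroom left: better cooling keeps this branch active for
                  # longer, which is where the extra average performance comes from.
                  return min(FREQ_MAX_MHZ, freq_mhz + STEP_MHZ)
              # Inside the target band: hold the current clock.
              return freq_mhz
          ```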

        • SpacetimeMachine@lemmy.world · 2 points · 22 hours ago

          I had all of that. I ran into intermittent random crashes about a year ago, and after a year of not being able to find a cause I found a thread of other 5800X users running into the same problem. (For the record, this was with a high-quality PSU, a very, very light overclock, and temps were fine throughout. Also, while I’m not a true IT professional, I do know my way around a computer, and the most in-depth error logs I could find, which were very few, pointed to really low-level calculation errors.)

          After finally giving up and just buying a 9800X3D, I sold the system to my friend at a huge discount, but after reinstalling everything the CPU never booted again.

          While what you say is generally true, sometimes parts are just slightly defective and those defects only show up with age. It’s the first CPU that’s ever died on me (other than a 9800X3D, but that was an MSI mobo that killed it), so I don’t really hold it against them. And I’m very happy with the 9800X3D; it’s amazing the difference it’s made in games.

          • blandfordforever@lemm.ee · 1 point · 20 hours ago

            That’s a bummer that it failed on you.

            I’ve been wondering whether to replace my 3700X with a 5800X3D, but I’m not sure the modest performance improvement would justify the price.

  • CrowAirbrush@lemmy.world · 19 points · 2 days ago

    I’ve always been an AMD CPU guy. My first PC had an AMD GPU, then I moved away to Nvidia, but due to costs I’ve now moved back to AMD and I’ve got zero complaints.

  • Asafum@feddit.nl · 17 points · 2 days ago

    Hell, they finally got me too… They can thank Intel for royally fucking up 13th and 14th gen and then releasing a new chip that wasn’t enough of an improvement over the previous ones to warrant the price.

    I always built Intel PCs, mostly out of habit, but I just got a 9800X3D last week for my rebuild.

    • Auli@lemmy.ca · 2 points · 17 hours ago

      Been AMD for years, but I went Intel for a media server because of the encoder and better idle power. I wish AMD would improve their video encoding.

  • RejZoR@lemmy.ml · 13 points (1 downvote) · 2 days ago

    The Radeon RX 9000 series is amazing. Apparently sales are doing great, but obviously NVIDIA is holding a monopoly and people are brainwashed into just buying NVIDIA no matter what.

    • theunknownmuncher@lemmy.world · 9 points · 2 days ago

      It seemed like a marginal improvement with a focus on upscaling/framegen, which doesn’t really interest me. I’m still really happy with my 6900 XT. Although NVIDIA has been delivering marginal improvements with significant TDP (💀) and price increases for several generations now, so whatever 🤷

      • KingRandomGuy@lemmy.world · 1 point · 19 hours ago

        The 40 series over the 30 series was pretty tangible IMO (the 4090 gets something like 30–50% more performance than the 3090 Ti in most tasks at the same TDP), in part thanks to the much larger L2 cache plus the newer process node.

        The 50 series was very poor though, probably because it’s on the same process node.

      • Buddahriffic@lemmy.world · 4 points · 2 days ago

        I was surprised to see the 9070 XT at about double the 6800 XT’s performance in benchmarks, once benchmarks covering both cards started coming out.

        I got it because I also see that if China does follow through with an attack on Taiwan, PC components are going to become very hard to find and very expensive while all of that production capacity is replaced. And depending on how things go after that, this might be the last GPU I ever buy.

        • theunknownmuncher@lemmy.world · 3 points · edited · 2 days ago

          A huge factor is rendering resolution. I only render at sub-1080p pixel counts (1024x768 or 1600x1200). A 2x performance improvement over the 6800 XT sounds very incorrect if the benchmarks are run at 1080p, unless they’re using upscaling and frame gen to pad the numbers. Do you have a link to those benchmarks? I’d be less skeptical of a significant improvement over the 6800 XT if the benchmarks were done specifically at 4K, since both AMD and NVIDIA have further optimized their GPUs for 4K rendering with each passing generation.

          Upscaling/framegen and 4K are completely irrelevant to me, so counting those out, it’s a marginal improvement based on the numbers I’ve seen. I’d like to be wrong though, and I could be.
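
          To put rough numbers on why resolution matters so much, here’s a quick back-of-the-envelope pixel-count comparison (illustrative arithmetic only, not benchmark data):

          ```python
          # Pixel counts for the resolutions mentioned above, relative to 1024x768.
          resolutions = {
              "1024x768": (1024, 768),
              "1600x1200": (1600, 1200),
              "1920x1080": (1920, 1080),
              "3840x2160 (4K)": (3840, 2160),
          }

          base = 1024 * 768
          for name, (w, h) in resolutions.items():
              pixels = w * h
              print(f"{name}: {pixels:,} px ({pixels / base:.1f}x the 1024x768 load)")
          ```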

          • RejZoR@lemmy.ml · 3 points · 2 days ago

            I do care about upscaling and ray tracing, which is why I didn’t go with AMD for the last few generations. The RX 9070 XT felt like the right time, as they’ve made huge improvements. FSR4 in particular is easily comparable to DLSS, and I use it as an antialiasing replacement while boosting performance. FSR2, while it works, turns into a pixelated mess during fast movement and has a lot of ghosting. FSR4 is near perfect.

            What I also love is how AMD’s Fluid Motion Frames just work in all games with minimal artifacting, and Radeon Chill is something I especially appreciate with summer coming. It decreases power consumption dramatically, and thus heat output, to levels an RTX 5070 Ti could never achieve despite being more efficient in raw power-consumption tests, all without affecting the experience. It’s so good I’m using it in Overwatch 2 and Marvel Rivals and I can’t really tell a difference; it controls the framerate that seamlessly.

            • theunknownmuncher@lemmy.world · 1 point · 1 day ago

              Ray tracing just still isn’t there yet. Even in the manicured ray-tracing demo during the AMD announcement event for the 9000 series, it’s nothing but surface boil. It looks like analog white static overlaid on all the surfaces.

                • theunknownmuncher@lemmy.world · 1 point · edited · 24 hours ago

                  Are you kidding…?? I wish that were true. The worst I’ve seen it is in Marvel Rivals. It’s pretty bad in S.T.A.L.K.E.R. 2: Heart of Chornobyl as well.

  • FreeBooteR69@lemmy.ca · 3 points · edited · 2 days ago

    Glad I already built a system and don’t have to worry about upgrades for a few years unless something breaks. Also, I’m Canadian, so our tariff situation may be different.