Today my PC finally ate it. No POST, no disk activity, so I’m pretty sure the mobo has failed. I built this PC 8 or 10 years ago, and I’m honestly too old and out of touch to know where to start on a rebuild lol.

I’m an Arch Linux user, my job is in machine learning, and I’m looking at a soup-to-nuts rebuild, but I don’t know where to start. I want as much future-proofing as I can get, and I’m happy with a budget anywhere from $2k to $8k. I don’t game now, but I might want to in the future.

So it seems like to leverage good ML tools I’m locked into CUDA, so probably an Nvidia GPU. Does that mean the 4070 Ti is the knee in the curve? CPU-wise I came from AMD, but I have no idea what’s current. RAM speed is something I have never considered. And mobo-wise, I have a couple of M.2 drives now, but I’m not sure what else should drive decisions. I currently have one monitor that I intend to replace, so I’m not sure why I would need multiple GPUs or something that necessitates a lot of PCIe connections.

I want a plain old closed black case, no color-changing gamer shit, and about as much computing power as I can get. PCPartPicker came up a little short, so how do I start?

I’ve got maybe a week of lead time, then I would like to pull the trigger. This whole build process was a lot easier circa 2003!

  • stevedice@sh.itjust.works

    If you’re fine with a budget of up to $8K there really isn’t that much to consider. Buy the most expensive Ryzen 9 and a 5090.

    • dream_weasel@sh.itjust.worksOP

      Mostly PyTorch, and honestly mostly CPU-limited in my experience (small networks with frequent database access). I feel like I’m throwing people off the conversation by having mentioned it. GPU training is a nice-to-have, but maybe not necessary the more I think about it. I have a cluster at work if I really need that functionality, unless I’m doing it for personal projects.
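
      For what it’s worth, PyTorch makes it easy to treat the GPU as a pure nice-to-have: the same code runs on whatever device is present. A minimal sketch (the tiny network here is just a placeholder):

      ```python
      import torch
      import torch.nn as nn

      # Pick the GPU if one is present, otherwise fall back to the CPU.
      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

      # Placeholder "small network" just to show the pattern.
      model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1)).to(device)
      batch = torch.randn(256, 32, device=device)

      with torch.no_grad():
          out = model(batch)
      print(f"ran on {device}, output shape {tuple(out.shape)}")
      ```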

      • mrvictory1@lemmy.world

        Nvidia is the gold standard for AI stuff; that’s why it was so heavily recommended. If training is not a major concern, then AMD is the obvious choice for Linux gaming (unless you need HDMI 2.1).

  • mlg@lemmy.world

    Probably not good advice, but the 50XX series is right around the corner (end of January, with the 5070 Ti and below following in February), and Nvidia has advertised big increases for AI workloads. Whether those gains are real, and whether you’ll actually be able to buy one, is another question lol.

    Good chance the 40xx series will drop in value right after, though, so I would at least wait until the end of the month and see.

    I’m still out here using TensorFlow on my server with a 750 Ti lmao.

  • CarbonatedPastaSauce@lemmy.world

    Fractal cases are amazing. I have purchased several Define and Meshify models over the years and I’m always very pleased with them. The Define cases in particular are very quiet and easy to cool. Build quality is always stellar.

    For ML you definitely want Nvidia, from all that I’ve read. For literally everything else on Linux you probably want AMD, because it just works. I just built a brand new AMD CPU/GPU PC a few months ago with the 7800X3D and 7900 XTX, and I’m very pleased with the gaming performance. But AMD just isn’t there yet with AI/ML.

  • GenderNeutralBro@lemmy.sdf.org

    VRAM is king for AI workloads. If you’re at all interested in running LLMs, you want to maximize VRAM. The RTX 3090 or 4090 are your options if you want 24GB and CUDA. If you get a 4090, be sure you get a power supply that supports the 12VHPWR connector. Don’t skimp on power. I’m a little out of the loop, but I think you’ll want a PCIe 5.0 PSU. https://www.pcguide.com/gpu/pcie-5-0-psus-explained/

    If you’re not interested in LLMs and you’re sure your machine learning tasks don’t/won’t need that much VRAM, then yes, the 4070 Ti is the sweet spot.
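
    If you want a quick way to see what a given card offers once it’s installed, PyTorch can report the total VRAM on the device. A tiny sketch:

    ```python
    import torch

    # Print the name and total VRAM of the first CUDA device, if any.
    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB VRAM")
    else:
        print("No CUDA device found")
    ```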

    logicalincrements.com is aimed at gaming, but it’s still a great starting point for any workload. You’ll probably want to go higher on memory and skew more toward multi-core performance compared to gaming builds, IMO. Don’t even think about less than 32GB. A lot of build guides online skimp on RAM because they’re only thinking about gaming.

    • dream_weasel@sh.itjust.worksOP

      This is all great info, and the new power supply and RAM kit stuff is blowing my mind. Fortunately, my work is not LLM-related, just simple neural networks, but I don’t know how that might affect best practices for hardware.

      • GenderNeutralBro@lemmy.sdf.org

        Fortunately, my work is not LLM-related, just simple neural networks, but I don’t know how that might affect best practices for hardware.

        Okay. If this is something you already do on existing machines, you’ll be in a good position to know how much memory you actually need, and then maybe give yourself a little room to grow. My guess would be that you’re not working on massive models, so you’ll probably be fine with a mid-range card.
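
        If you want to put a hard number on “how much memory you actually need”, something like this quick PyTorch check will print the peak VRAM used by one forward/backward pass (the model and batch here are just stand-ins for your real ones):

        ```python
        import torch
        import torch.nn as nn

        device = torch.device("cuda")
        torch.cuda.reset_peak_memory_stats(device)

        # Stand-in model and batch -- swap in your real network and a realistic batch size.
        model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 10)).to(device)
        batch = torch.randn(256, 512, device=device)
        model(batch).sum().backward()

        peak_gib = torch.cuda.max_memory_allocated(device) / 1024**3
        print(f"peak VRAM for one forward/backward pass: {peak_gib:.2f} GiB")
        ```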

        At the same time, a lot of AI/ML stuff is becoming mainstream and requires a ton of VRAM to get good performance. If you do any work with graphics, audio, or video, you might find yourself running large models without really thinking about it. There are lots of use cases for speech recognition models, for example, which are quite large. Photoshop already has some medium-sized models for some tasks. Noise reduction for audio can also be quite demanding (if you want to do a really good job).

        As for system RAM…the world of DDR5 is indeed complicated. I don’t think there’s a huge need to go over DDR5-6000, and faster RAM brings some compatibility issues with some mobos/CPUs. It’s also usually faster to use two sticks than four, so 2x32GB would be better than 4x16GB in general.

        For GPUs in particular, new gens with more VRAM are on the way, so buying the high-end now might leave you with something that feels obsolete by the time you grow into it. If you spend $750 now and $750 again in 2-3 years, you might end up better off than if you spent $1500 today and waited twice as long to upgrade. Particularly if you are able/willing to sell your old equipment to offset upgrade costs.

        • dream_weasel@sh.itjust.worksOP

          Solid advice. Saving this comment. Yeah, I have 32GB of RAM now and was thinking 64GB, so two sticks. I also think I’m going to roll Nvidia in the 4000 series, as high as I can get without going over about $2k.

          Still gotta resolve the processor… I’m thinking AMD Ryzen but not sure how to pick which one. After that it’s just the mobo and making sure I’ve got enough NVMe/M.2 slots, I guess?

          • GenderNeutralBro@lemmy.sdf.org

            Yeah. If you want to be on the cutting edge of storage, look for a mobo that has PCIe gen5 m.2 slots. But really, PCIe gen4 m.2 drives are still pretty darned fast. You can get some with >7GB/sec transfer rates. Do you need >12GB/sec transfer from disk? Probably not. Is it cooooool? Sure. :)
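
            If you ever want to sanity-check what a drive actually delivers, a rough sequential-read timing in Python gets you in the ballpark (the path below is just a placeholder; point it at a real large file):

            ```python
            import time

            # Rough sequential-read throughput check. Caveat: the OS page cache can
            # inflate the result a lot -- use a file bigger than RAM, or one that
            # hasn't been read since boot.
            path = "/path/to/big_file"  # placeholder; substitute a real large file
            block = 16 * 1024 * 1024    # 16 MiB reads

            total = 0
            start = time.perf_counter()
            with open(path, "rb", buffering=0) as f:
                while chunk := f.read(block):
                    total += len(chunk)
            elapsed = time.perf_counter() - start
            print(f"{total / elapsed / 1e9:.2f} GB/s over {total / 1e9:.1f} GB")
            ```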

            This is a popular SSD these days, very good for the price: https://us-store.msi.com/PC-Components/Storage-Devices/Solid-State-Drive/M482-NVMe-M2-2TB-Bulk . If you want something high-end, look for an SSD with DRAM cache. Useful if you’re writing massive amounts of data regularly, like video mastering or something like that, generally overkill otherwise.

            I’ve been on the Ryzen x700 line for a long time now, first the 1700 and now on the 7700. No complaints, they rock. So I’d start by looking at the 9700. 9900 has more cores (and uses significantly more power), 9600 has fewer cores. Single-core performance is basically the same across the board, so it just depends on whether your workload can use a lot of cores or not. The “X3D” chips have additional CPU cache that supposedly improves performance in some workloads (notably in gaming). So if that’s important to you, the 9800X3D is the natural choice.
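
            And if you’re not sure whether your workload actually scales with cores, a quick-and-dirty timing loop like this shows how a CPU-bound PyTorch op behaves as you add threads (results will vary by CPU and BLAS backend):

            ```python
            import time
            import torch

            # Time a CPU matmul at different intra-op thread counts to see how well it scales.
            x = torch.randn(2048, 2048)

            for threads in (1, 2, 4, 8, 16):
                torch.set_num_threads(threads)
                start = time.perf_counter()
                for _ in range(20):
                    x @ x
                print(f"{threads:2d} threads: {time.perf_counter() - start:.2f}s")
            ```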

      • scribbler@lemmy.world

        When training you’ll want way more VRAM than you need to run inference - get a 90 series GPU for the memory.

  • filister@lemmy.world

    Additionally, for machine learning you need as much VRAM as you can possibly get. So consider the 5060 Ti, 5070 Ti, or 4070 Ti if 16GB is enough; otherwise maybe a 3090 Ti, 4090, or 5090.

  • statler_waldorf@sopuli.xyz

    I’m a few years out of date on most hardware, but I can recommend the Fractal Define cases. They also make RGB monstrosities, but the Define line is sturdy, quiet, and available with solid metal side panels. I’ve had the original XL for probably 15 years now and it’s still going strong. The XL is probably overkill, but at the time I had a bunch of old platter drives I was still using.

    • dream_weasel@sh.itjust.worksOP

      Rock on. My last two have been Corsair, and I had a Lian Li before that, taking me back to about 2006 😬. Basically I’m hanging this case under a standing desk, so it makes no sense to make it visually interesting.

      • nomad@infosec.pub

        Consider going with a rail-mounted 19″ rack case. I have a 4U Chenbro case hanging under my standing desk. Nothing special; it moves with the desk and can easily be pulled out and opened on the rails. The standing desk is motorized, so it’s heavy-duty enough to handle it.

  • novacomets@lemmy.myserv.one

    Research cases to find one that works for you. Some like a Lian Li case, some like a Fractal Design case, a few like a be quiet!, Corsair, or NZXT case; look at the different options. Then an X870E motherboard, maybe a Gigabyte Master or ASRock Taichi, a Ryzen 9950X processor, 64GB of RAM, and an RTX 5090.