I haven’t thought about it in a while, but the premise of the article rings true. Desktops are, overall, disposable. GPU generations are only really significant alongside new CPU generations, and CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.

Is there a platform that challenges that trend?

Edit: Good points were made. There is a lot to disagree with in the article, especially when it focuses on gaming.

Storage: For the love of your data, storage is a WEAR component, especially HDDs. Up until recently storage was so cheap it was crazy not to get new drives every few years.

Power supplies: Just because the computer still boots doesn’t mean the power supply is still good. A PSU will continue to shove power into your system long past its ability to provide clean power. Scope and test an older PSU before you put it in a new build.
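
A quick sketch of what “scope and test” can mean in practice: the commonly cited ATX design-guide ripple limits are around 120 mV peak-to-peak on the +12 V rail and 50 mV on the +5 V and +3.3 V rails. Treat those figures, and the example readings below, as assumptions to verify against your PSU’s own spec:

```python
# Sketch: compare scope ripple readings (mV peak-to-peak) against
# commonly cited ATX design-guide limits. The measured values are
# hypothetical; substitute your own oscilloscope readings under load.
RIPPLE_LIMITS_MVPP = {"+12V": 120, "+5V": 50, "+3.3V": 50}

measured_mvpp = {"+12V": 95, "+5V": 38, "+3.3V": 61}  # example readings

for rail, limit in RIPPLE_LIMITS_MVPP.items():
    reading = measured_mvpp[rail]
    status = "OK" if reading <= limit else "FAIL (excess ripple)"
    print(f"{rail}: {reading} mVpp (limit {limit} mVpp) -> {status}")
```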

  • brucethemoose@lemmy.world

    That’s a huge generalization, and it depends what you use your system for. Someone might be on an old Threadripper workstation that works fine, for instance, and just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.

    I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the case and use them all at once.

    I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.

    …That being said, there’s a lot of trends going against people, especially for gaming:

    • There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.

    • We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.

    • Time gaps between generations are growing as silicon gets more expensive to design.

    • …Buyers are collectively stupid and bandwagon. See: the crazy sales of low-end Nvidia GPUs when buyers have every reason to go AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.

    • Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

    • You can still keep your PSU, case, CPU cooler, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.

    IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.

    • kreskin@lemmy.world

      If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

      While throwing out working things is terrible, the cost of servicing a motherboard outpaces the cost of replacing it. They can still charge you 200 dollars and tell you the board can’t be fixed, right? I think the right balance is to observe the warranty period, try to troubleshoot it yourself, and then call it a day, unless you have a 400+ dollar motherboard.

      • brucethemoose@lemmy.world

        Yeah, probably. I actually have no idea what they charge, so I’d have to ask.

        It’d be worth it for a 3090 though, no question.

    • worhui@lemmy.worldOP

      Typically I’ve seen a motherboard support about two generations of GPU before some underlying technology means it can no longer keep up.

      If you are going from a 30-series to a 50-series GPU, there is going to be a need for increased PCIe bandwidth, in terms of both lanes and PCIe spec, for it to be fully utilized.

      I just saw this play out with a coworker: he replaced 2x 3090 with a single 5090. The single card is faster, but now he can’t fully task his storage and GPU at the same time due to PCIe-lane limits. So it’s a new motherboard, which needs a new CPU, which needs new RAM.

      Basically, a two-generation GPU upgrade needs a whole new system.

      Each PCIe generation doubles bandwidth, so a future GPU on just an x2 PCIe 6.0 link would have the equivalent of x8 PCIe 4.0 worth of bandwidth.
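
      To put rough numbers on that doubling, here is a minimal sketch of the arithmetic. The per-lane figures are rounded approximations (PCIe 4.0 is closer to ~1.97 GB/s per lane per direction), so treat the outputs as ballpark values:

      ```python
      # Approximate usable PCIe bandwidth per lane, per direction, in GB/s.
      # Each generation doubles the per-lane rate (figures are rounded).
      PER_LANE_GBPS = {3: 1.0, 4: 2.0, 5: 4.0, 6: 8.0}

      def link_bandwidth(gen: int, lanes: int) -> float:
          """One-direction bandwidth of a PCIe link in GB/s."""
          return PER_LANE_GBPS[gen] * lanes

      # x2 PCIe 6.0 carries roughly the same as x8 PCIe 4.0:
      print(link_bandwidth(6, 2))  # ~16 GB/s
      print(link_bandwidth(4, 8))  # ~16 GB/s

      # And one PCIe 5.0 x16 card matches two PCIe 4.0 x16 cards in total:
      print(link_bandwidth(5, 16))      # ~64 GB/s
      print(2 * link_bandwidth(4, 16))  # ~64 GB/s
      ```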

      Even then, GPUs and CPUs have been getting more power hungry. Unless you over-spec your PSU, there is a reasonable chance that once you get past two GPU generations you need a bigger PSU. Power supplies are wear items: they continue to function, but may not provide power as cleanly once you get to 5+ years of continuous use.

      Sure, you can keep the case and PSU, but literally everything else will run over Thunderbolt or USB-C without penalties.

      At this point, why not run storage outside the box for anything sizeable? Anything fast runs on internal NVMe.

      • brucethemoose@lemmy.world

        This doesn’t make any sense, especially the 2x 3090 example. I’ve run my 3090 at PCIe 3.0 over a riser, and there’s only one niche app where it ever made any difference. I’ve seen plenty of benchmarks showing PCIe 4.0 is just fine for a 5090:

        https://gamersnexus.net/gpus/nvidia-rtx-5090-pcie-50-vs-40-vs-30-x16-scaling-benchmarks

        1x 5090 (PCIe 5.0 x16) uses the same net bandwidth, and half the PCIe lanes, as 2x 3090 (each PCIe 4.0 x16).

        Storage is, to my knowledge, always on a separate bus from graphics, so that also doesn’t make any sense.

        My literally ancient TX750 still worked fine with my 3090, though it has since been moved on. I’m just going to throttle any GPU that uses more than 420W anyway, as that’s ridiculous and past the point of diminishing returns.
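
        For what it’s worth, that kind of cap is easy to set on Nvidia cards via nvidia-smi’s power-limit flag. A minimal sketch, assuming nvidia-smi is on your PATH and you have root/admin rights; the 420 W figure just mirrors the number above:

        ```python
        import subprocess

        # Sketch: cap GPU 0's board power using nvidia-smi's standard
        # "-pl" (power limit, watts) flag. Requires root/administrator
        # rights, and the value must be within the card's supported range.
        LIMIT_WATTS = 420  # mirrors the figure in the comment above

        subprocess.run(
            ["nvidia-smi", "-i", "0", "-pl", str(LIMIT_WATTS)],
            check=True,
        )
        ```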

        And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.


        I hate to be critical, and there are potential issues, like severe CPU bottlenecking or even instruction support. But… I don’t really follow where you’re going with the other stuff.

        • worhui@lemmy.worldOP

          And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.

          That is the point of the article.

          The problem my friend has is that he is rendering video, so he has a high-performance SAS host adapter on the same PCIe bus as the GPU. He upgraded both hoping the 5090 would play nicer with the SAS adapter, but he can’t pull full disk bandwidth and render images at the same time. Maybe it’s OK for gaming, but not for compute and writing to disk.

          The thing with power supplies: they continue to provide enough power long after they lose the ability to provide clean power under load. Only when they are really on their last legs will they actually stop providing the rated power. I have seen a persistent networking issue resolved by swapping a power supply. Most of the time you don’t test a power supply under load to see whether each rail is staying where it needs to be.
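
          That under-load rail check is easy to formalize: the ATX spec allows roughly ±5% on the main rails. A minimal sketch, with hypothetical readings you’d replace with real measurements taken while the PSU is loaded:

          ```python
          # Sketch: check under-load rail voltages against the ATX ~±5%
          # tolerance. The measured values are illustrative; substitute
          # readings taken with a meter/scope while the PSU is under load.
          NOMINAL_VOLTS = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}
          TOLERANCE = 0.05

          measured = {"+12V": 11.28, "+5V": 5.08, "+3.3V": 3.21}  # examples

          for rail, nominal in NOMINAL_VOLTS.items():
              low, high = nominal * (1 - TOLERANCE), nominal * (1 + TOLERANCE)
              v = measured[rail]
              status = "OK" if low <= v <= high else "OUT OF SPEC"
              print(f"{rail}: {v:.2f} V (allowed {low:.2f}-{high:.2f} V) -> {status}")
          ```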

  • m-p{3}@lemmy.ca

    Personally, I still prefer the desktop because I can choose exactly where I want performance and where I can make some tradeoffs. Also, parts are easier to replace when they fail, making desktops more sustainable. You don’t have that choice with a laptop, since it’s all prebuilt.

    • socphoenix@lemmy.world

      Desktops also offer better heat dissipation, and peripheral replacements extend the life of the unit. Frankly, it can be difficult for most folks to replace a laptop display, or even the battery, nowadays.

  • sorghum@sh.itjust.works

    Disposable my ass. I just did the final upgrades to my AM4 platform to be my main rig for the next 5 years. After that it will get a storage upgrade and become a NAS and do other server stuff. Seven years in, this computer has another 15 left in it.

    • Lfrith@lemmy.ca

      Yeah, it’s crazy that someone could have gotten, say, a Ryzen 5 1600 and then upgraded to a 5800X3D around five years later without needing to buy a new motherboard, which usually also means having to buy a new set of RAM.

      For a long time, doing a whole new build when upgrading to a newer CPU was just the norm, back when Intel was dominant.

      • sorghum@sh.itjust.works

        Yeah, I usually over-spec when I build my main rig because I want it to last and to repurpose it later down the road. I finally retired a power supply that I bought back in the mid-2000s; it can’t power modern cards anymore, unfortunately. 🫡 PC Power and Cooling single rail, take a break. You’ve earned it.

  • saltesc@lemmy.world

    Let’s say that you’ve just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there’s a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU’s performance is wasted. Except, getting a new CPU that’s worth the upgrade usually means getting a new motherboard, which might also require new RAM, and so on.

    This guy’s friends should keep him away from computers and just give him an iPad to play with.

    • worhui@lemmy.worldOP

      Technology moves on. The highest-spec iPads blow away older workstation-class PCs for non-GPU loads. It would only be the OS holding that back, not the hardware.

      • biggerbogboy@sh.itjust.works

        Technology moves on. Any meaningfully upgradable desktop will blow away the highest-spec iPads of 2026 once the owner finds it necessary to upgrade. And upgrading a PC doesn’t mean you have to replace the entire thing just because you want a new GPU; that would be like bulldozing your house because you don’t like the current wall paint colour.

        Sure, if the smallest hardware bottleneck will drive you clinically insane, replace your rig if that’s viable for you. But most of the time, for most workloads, small bottlenecks don’t mean much, so upgrading components when it feels right is generally the better choice.

        Also, the highest-spec iPads only have 16 GB of unified RAM. Sure, with compression and a single page table shared between all processors, that’s impressive, but realistically, why do you need all that memory-architecture power, and the M5 chipset, for a mobile workflow? And how are you supposed to replace the storage/RAM/processor when it begins to feel slow, after defying the trillion-dollar company by putting a desktop OS on it?

        iPads and workstations/desktops aren’t comparable; they’re entirely different classes of devices. Frankly, if you manage to put a desktop OS on an iPad, I’d like to see you try using it for gaming, productivity, and other workloads for at least a decade. And if you can’t? Well, you can’t upgrade it like you can a real workstation/desktop.

  • alessandro@lemmy.ca

    🎺"The upgrade argument for desktops doesn’t stand up anymore" 🎺

    of course, you can still…

    hum… well, you can also…

    yeah, yeah, you can do that also… but…

    …and so on.

  • RaoulDook@lemmy.world

    Everything in this post is wrong, actually. But if you buy shit parts to build your desktop, you’ll have a shitty desktop.

    The simple answer is at the motherboard level: look at your motherboard’s future expansion capability, and if you started with a good foundation you can do years of upgrades. Your computer case also needs to be big enough to fit extra stuff; full-ATX motherboard size is great.

    For example I have a VR gaming rig that runs VR games well on DDR3 RAM and a Sandy Bridge CPU, because it has a decent modern GPU and enough CPU cores + RAM.

  • yesman@lemmy.world

    This is a weird way to say that PC tech has stagnated and improvements between “generations” are incremental.

  • Cyv_@lemmy.blahaj.zone

    I disagree that you need to upgrade your CPU and GPU in lockstep. I almost always stagger those upgrades. Sure, I might have some degree of bottleneck, but it’s pretty minimal tbh.

    I also think it’s a bit funny the article mentions upgrading every generation. I’ve never done that, and I don’t know a single person who does. Maybe I’m just too poor to hang with the rich fucks, but the idea of upgrading every generation was always stupid.

    Repairability is a big deal too. It also means that if my GPU dies I can just replace that one card rather than buy an entire new laptop, since laptops tend to have everything soldered down.

    • stealth_cookies@lemmy.ca

      I typically build a whole new PC and then do a mid-life GPU upgrade after a couple of generations; e.g., I just upgraded the GPU I bought in late 2020. For most users there just isn’t a good reason to be upgrading your CPU that frequently.

      I can see why some people would upgrade their GPU every generation. I was surprised at how much even two-generation-old cards are going for on eBay; if you buy a new card and sell your old one every couple of years, the “net cost per year” of usage is pretty constant.
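
      A quick sketch of that “net cost per year” arithmetic, with made-up prices purely for illustration:

      ```python
      # Sketch: "net cost per year" of a GPU when you sell the old card
      # on each upgrade. All prices are hypothetical.
      def net_cost_per_year(buy: float, resale: float, years: float) -> float:
          return (buy - resale) / years

      # e.g. buy at $800, sell two years later for $450:
      print(net_cost_per_year(800, 450, 2))  # 175.0 ($/year)
      # vs. holding an $800 card for six years, then selling for $100:
      print(net_cost_per_year(800, 100, 6))  # ~116.7 ($/year)
      ```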

  • Samskara@sh.itjust.works

    It’s been like that for as long as I can remember. Upgrading can extend the lifespan by a few years, but often it’s a good idea to replace the whole system.

    It depends on a lot of factors, of course. If you buy a midrange machine now, in five years you can upgrade it to the equivalent of today’s high-end machine, which will by then be five years old.

    Rarely do you get to take advantage of technology shifts like hard drives to SSDs. A couple of years ago, adding more RAM and an SSD made machines that had those bottlenecks usable again. That’s still the best thing you can do to an old laptop or desktop.

    Over the last decade, performance hasn’t improved that much for most typical use cases. An i7 from ten years ago with 16 GB RAM, a 1 TB SSD, and an NVIDIA GTX 1080 is still a decent computer today.

    What makes PCs great is that you’re more flexible in how you configure your machine. Adding more storage, more ports, expansion cards, optical drives inside your machine, etc. is just nice.

    With a laptop you end up with crappy hubs and lots of cables.

    • worhui@lemmy.worldOP

      From a pure aesthetics standpoint, hubs and cables suck. From a functional standpoint, they are equivalent, except for the GPU.

  • Rioting Pacifist@lemmy.world

    This has been true for a long time: CPU sockets don’t last long enough to make upgrades worth it unless you are constantly upgrading. Whenever I’ve built a “futureproof” desktop with a mid-to-high-end GPU, by the time I hit performance problems I needed a new motherboard to fit the new CPU anyway. The only really upgradable components are storage and RAM, but you can do that in your laptop too.

    The main advantage of Desktops is still that you get much more performance for your money and can decide where it goes if you build it yourself.

  • masterspace@lemmy.ca

    The main benefit of a desktop is the price/performance ratio, which is higher because you’re trading space and portability for easier thermal management and bigger components.

  • Zink@programming.dev

    The importance of open & interchangeable hardware and software goes way beyond the upgrades you may or may not make, or even saving money & reducing e-waste.

    You get better products that way. Having complete control over your system benefits you even if you never exercise that control. It is literally a constraint on enshittification.