• NOT_RICK@lemmy.world · 8 points · 1 day ago

    Pretty astute. Maybe I can buy a half-cooked GPU on fire sale in a few years for a budget build… one can dream!

    • Sturgist@lemmy.ca · 6 points · 22 hours ago

      Unfortunately the ones used in AI data centers aren’t useful for gaming. So yeah, you could probably buy one for ⅓ the price of new, but you couldn’t use it for gaming, and you likely still wouldn’t be able to afford it because of:

      NVIDIA H200 (Hopper architecture) – The latest flagship (late 2023), with ~141 GB of upgraded HBM3e and other Hopper improvements. It provides a ~50% performance uplift over the H100 in many tasks within the same power envelope. Pricing is said to be only modestly above the H100: a 4-GPU H200 SXM board is listed at about $170K (versus $110K for 4×H100) ([2]) ([20]), and a single H200 (NVL version) is quoted at around $31,000–32,000 ([21]). NVIDIA’s data-center systems built around the H200 reflect these prices, though bulk deals may apply.
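
      For a rough sense of scale, here is a minimal back-of-the-envelope sketch using the list prices quoted above; the one-third resale figure is the hypothetical from the earlier comment, not a real market quote:

      ```python
      # Back-of-the-envelope GPU pricing, based on the list prices quoted above.
      # The "one-third of new" resale figure is hypothetical, not a market price.
      h200_board_4x = 170_000   # ~4x H200 SXM board, as listed
      h100_board_4x = 110_000   # ~4x H100 SXM board, as listed
      h200_nvl      = 31_500    # midpoint of the ~$31k-32k single H200 NVL quote

      per_h200 = h200_board_4x / 4     # ~ $42,500 per H200 on the SXM board
      per_h100 = h100_board_4x / 4     # ~ $27,500 per H100 on the SXM board
      used_guess = h200_nvl / 3        # ~ $10,500 at one-third of new

      print(f"H200 (SXM, per GPU): ${per_h200:,.0f}")
      print(f"H100 (SXM, per GPU): ${per_h100:,.0f}")
      print(f"Hypothetical used H200 at 1/3 of new: ${used_guess:,.0f}")
      ```

      Even at a third of the single-unit price, that is still five figures for one card.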

    • addie@feddit.uk · 5 points · 22 hours ago

      Data centre GPUs tend not to have video outputs, and have power (and active cooling!) requirements in the “several kW” range. You might be able to snag one for work, if you work at a university or somewhere that does a lot of 3D rendering - I’m thinking someone like Pixar. They are not the most convenient or useful things for a home build.

      When the bubble bursts, they will mostly be used for creating a small mountain of e-waste, since the infrastructure to even switch them on costs more than the value they could ever bring.

      • panda_abyss@lemmy.ca · 2 points · 14 hours ago

        The optimist in me has hope that this does fuel an explosion of cheap hardware for businesses to build cheap+useful+private AI stuff on.