• ouRKaoS@lemmy.today · 12 hours ago

    I don’t think there will be a DDR6. I think the AI bubble is going to pop & all these data centers will become “mainframe centers” that your minimum-spec’d home terminal connects to, which will do all the computing for you on “our lightning-fast multi-core supercomputer with terabytes of memory!”

    😐😭🤮

    • carpelbridgesyndrome@sh.itjust.works · 9 hours ago (edited)

      I doubt it. Those AI computers are built in a really weird way and have a lot of hardware that isn’t useful outside an AI/HPC context. Some of it, like the weird card-to-card network topology, can be reconfigured, but the rest can’t easily be. The servers are rather aggressively designed around keeping as many GPUs fed as possible, which makes them kinda weird for other jobs. The datacenter cards are missing enough video hardware (texture units, for example) to make gaming hard, and I’m not sure there’s much consumer demand for linear algebra accelerators. If they can’t find more HPC jobs, they may go under. Movie studios could have interesting opportunities here, but IIRC their software is still primarily CPU-based.
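
      For a sense of what “linear algebra accelerator” means in practice, here’s a minimal sketch (my own illustration, with placeholder sizes and error handling omitted) of the dense matrix multiply these cards are built to chew through:

      ```cuda
      // Minimal sketch: the one workload datacenter accelerators are
      // unambiguously built for, a dense single-precision matrix multiply
      // through cuBLAS. No texture units or display engine required.
      // The matrix size is an arbitrary placeholder.
      // Build with: nvcc gemm.cu -lcublas
      #include <cublas_v2.h>
      #include <cuda_runtime.h>

      int main(void) {
          const int n = 4096;                      // placeholder matrix size
          const float alpha = 1.0f, beta = 0.0f;
          const size_t bytes = (size_t)n * n * sizeof(float);
          float *A, *B, *C;

          // Device buffers for C = alpha * A * B + beta * C
          cudaMalloc((void **)&A, bytes);
          cudaMalloc((void **)&B, bytes);
          cudaMalloc((void **)&C, bytes);
          cudaMemset(A, 0, bytes);                 // contents don't matter here
          cudaMemset(B, 0, bytes);

          cublasHandle_t handle;
          cublasCreate(&handle);

          // SGEMM: the dense linear algebra these cards exist to saturate
          cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                      n, n, n, &alpha, A, n, B, n, &beta, C, n);
          cudaDeviceSynchronize();

          cublasDestroy(handle);
          cudaFree(A);
          cudaFree(B);
          cudaFree(C);
          return 0;
      }
      ```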

      The clusters in the UAE and Saudi Arabia might be repurposable for nuclear weapons research, which isn’t great.

      • fruitycoder@sh.itjust.works · 3 hours ago

        GPGPU is probably going to see some real usage. There was an interesting talk at the X.Org conference about turning the video hardware into virtual services running on GPGPU-focused hardware.
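
        To illustrate the kind of non-AI job that maps cleanly onto GPGPU hardware, here’s a minimal sketch of a heat-diffusion timestep in plain CUDA; this is my own toy example, not anything from the talk:

        ```cuda
        // Minimal sketch of ordinary non-AI GPGPU work: explicit-Euler
        // 1-D heat diffusion. Shows the same silicon handling a plain
        // numeric simulation job. Build with: nvcc diffuse.cu
        #include <cuda_runtime.h>
        #include <cstdio>

        __global__ void diffuse(const float *in, float *out, int n, float k) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            if (i == 0 || i == n - 1)
                out[i] = in[i];                       // fixed boundary cells
            else
                out[i] = in[i] + k * (in[i - 1] - 2.0f * in[i] + in[i + 1]);
        }

        int main(void) {
            const int n = 1 << 20;                    // placeholder grid size
            float *a, *b;
            cudaMallocManaged(&a, n * sizeof(float));
            cudaMallocManaged(&b, n * sizeof(float));
            for (int i = 0; i < n; i++)
                a[i] = (i == n / 2) ? 100.0f : 0.0f;  // one hot cell

            // Run a few timesteps, ping-ponging between the two buffers
            for (int step = 0; step < 100; step++) {
                diffuse<<<(n + 255) / 256, 256>>>(a, b, n, 0.25f);
                float *tmp = a; a = b; b = tmp;
            }
            cudaDeviceSynchronize();
            printf("center value after 100 steps: %f\n", a[n / 2]);

            cudaFree(a);
            cudaFree(b);
            return 0;
        }
        ```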

        I’ve talked with some of the HPC programmers too, who are already trying to find creative repurposes lol