• @boonhet@lemm.ee
    8 points · 11 days ago

    What price point are you trying to hit?

    With regards to AI? None, tbh.

    With this super-fast storage I have other cool ideas, but I don’t think I can get enough bandwidth to saturate it.

      • @bassomitron@lemmy.world
        9 points · 11 days ago

        No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.

          • caseyweederman
            1 point · 10 days ago

            I have a hard time believing anybody wants AI. I mean, AI as it is being sold to them right now.

            • @boonhet@lemm.ee
              3 points · 10 days ago

              I mean, the image generators can be cool, and LLMs are great for bouncing ideas off at 4 AM when everyone else is sleeping. But I can’t imagine paying for AI, don’t want it integrated into most products, and wouldn’t put a lot of effort into hosting a low-parameter model that performs way worse than ChatGPT does without a paid plan. So you’re exactly right: it’s not being sold to me in a way that would make me want to pay for it, or invest in hardware to host better models.

    • @barsoap@lemm.ee
      0 points · 11 days ago

      With regards to AI? None, tbh.

      TBH, that might be enough. Stuff like SDXL runs on 4G cards (the trick is using ComfyUI, like 5-10 s/it), and reportedly smaller LLMs too (haven’t tried, not interested). And the reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU, but it would still be a massive fucking boost for AI workloads.
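
      For anyone curious, here’s roughly what that low-VRAM setup looks like if you script it with Hugging Face diffusers instead of ComfyUI. This is just a sketch, not what ComfyUI does internally; the sequential CPU offload call is the part that lets SDXL-class models squeeze onto ~4 GB cards, at the cost of speed:

      ```python
      # Sketch: SDXL on a low-VRAM card via diffusers (settings illustrative)
      import torch
      from diffusers import StableDiffusionXLPipeline

      pipe = StableDiffusionXLPipeline.from_pretrained(
          "stabilityai/stable-diffusion-xl-base-1.0",
          torch_dtype=torch.float16,  # fp16 halves VRAM vs fp32
          variant="fp16",
          use_safetensors=True,
      )
      # Stream weights through the GPU one submodule at a time: slow
      # (think seconds per iteration), but fits in a few GB of VRAM.
      pipe.enable_sequential_cpu_offload()
      pipe.enable_vae_slicing()  # decode latents in slices to save memory

      image = pipe("a lighthouse at dusk", num_inference_steps=25).images[0]
      image.save("out.png")
      ```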