It even has a second NVMe slot for another SSD. And what you can’t build for that price is a PC with 110 GB of VRAM (96 GB when using Windows). This thing is a thinly disguised AI workstation for running large models locally. It’s significantly cheaper than all the alternatives.
Not disguised at all; running models locally and even clustering multiple machines for AI was a major point in the presentation.
SLI / Crossfire for AI from Framework was surely not on my bingo card.