Obviously you need lots of GPUs to run large deep learning models. I don’t see how that’s a fault of the developers and researchers; it’s just a fact of this technology.
In deep learning, “open source” generally doesn’t mean releasing the actual training or inference code. Rather, it means publishing the model weights and parameters (necessary to run it locally / on your own hardware) and publishing academic papers explaining how the model was trained. I’m sure Stallman disagrees, but from the standpoint of deep learning research, DeepSeek definitely qualifies as an “open source model.”
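To make “open weights” concrete: because the checkpoint is published, you can pull it and run inference yourself. Here’s a minimal sketch using Hugging Face transformers; the repo id is illustrative, and a small distilled DeepSeek variant is assumed since the full model won’t fit on consumer hardware.

```python
# Sketch: running published "open weights" locally with Hugging Face transformers.
# The repo id below is illustrative; a small distilled checkpoint is assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # assumed/illustrative

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place layers on whatever GPU/CPU is available (needs accelerate)
)

inputs = tokenizer("What does 'open weights' mean?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

That’s the whole point of weight releases: you don’t get the training pipeline, but you get everything needed to run and study the model on your own hardware.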
Ughhh, that was my fear. Haven’t built a desktop in probably 20 years. I definitely worry about the time sink, mostly in choosing every component, researching whether it’ll work with Linux, sourcing it, hoping it’s authentic, etc. Any recent guides you could recommend if I have to go down that route?
My poor wife got shingles at 39 last year. Her doc was like “yeah it’s definitely shingles, welcome to firmly middle aged”