Anyone else just sick of trying to follow guides that cover 95% of the process, or that slightly miss a step, and then spending hours troubleshooting the setup just to get it working?
I think I just have too much going on in my “lab”, to the point that when something breaks (and my wife and/or kids complain) it’s more of a hassle to try and remember how to fix or troubleshoot stuff. I only lightly document things cuz I feel like I can remember well enough. But then it’s a struggle to find the time to fix it, or stuff is tested and 80% completed but never fully used, because life is busy and I don’t have loads of free time to pour into this stuff anymore. I hate giving all that data to big tech, but I also hate trying to manage 15 different containers, VMs, or other services. Some stuff is fine/easy or requires little effort, but other stuff just doesn’t seem worth it.
I miss GUIs, where I could fumble through settings until things worked; it’s easier for me to look through all that than to read a bunch of commands.
Idk, do you get lab burnout? Maybe cuz I do IT for work too, it just feels like it’s never ending…


I find the overhead of Docker crazy, especially for simpler apps. Like, do I really need 150GB of hard drive space, an extensive, poorly documented config, and a whole nested computer running just because some project refuses to fix their dependency hell?
Yet it’s so common. It does feel like usability has been put on the back burner, at least in some sectors of software. And it’s such a relief when I read that some project consolidated its dependencies down to C++ or Rust, and it will just run and give me feedback without shipping a whole subcomputer.
This is a crazy take. Docker doesn’t involve much overhead. I’m not sure where your 150GB hard drive space comment comes from, as I run dozens of containers on machines with 30-50GB of hard drive space. There’s no nested computer, as Docker containers are not virtualization. Containers have nothing to do with a single project’s “dependency hell”; they’re for your dependency hell when you try to run a bunch of different services on one machine, or to reproduce them quickly and easily across machines.
Docker in and of itself is not the problem here, from my understanding. You can and should trim the container down.
Also, it’s not a “whole nested computer” like a virtual machine. A container is only everything above the kernel, because it shares the host’s kernel. That makes containers pretty lightweight.
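You can see the kernel sharing for yourself if you have Docker and the stock alpine image handy (the version string will of course differ per machine):

```sh
# Print the kernel version from inside a throwaway container...
docker run --rm alpine uname -r

# ...then on the host. The two match, because the container has no
# kernel of its own; it's just processes running on the host kernel.
uname -r
```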
It’s sometimes even useful to run Rust or C++ code in a Docker container, for portability, provided of course you do it right. For Rust, that typically means a multi-stage build to bring the container size down, roughly as sketched below.
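A minimal sketch of the multi-stage idea, assuming a plain Cargo project whose binary I’m calling myapp (name made up for illustration):

```dockerfile
# Stage 1: build in a full Rust toolchain image. This layer is huge,
# but it's discarded and never shipped.
FROM rust:1.79 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release

# Stage 2: copy only the compiled binary into a small runtime image.
# bookworm-slim keeps glibc available; a statically linked (musl)
# build could use scratch instead.
FROM debian:bookworm-slim
COPY --from=builder /app/target/release/myapp /usr/local/bin/myapp
CMD ["myapp"]
```

The final image ends up as the slim base plus one binary, typically tens of MB instead of the gigabyte-plus toolchain image.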
Basically, the people making these Docker containers suck donkey balls.
Containers are great. They’re a huge win in terms of portability, reproducibility, and security.
Yeah, I’m not against the idea philosophically. Especially for security. I love the idea of containerized isolation.
But in reality, I can see exactly how much disk space and RAM and CPU and bandwidth they take, heh. Maintainers just can’t help themselves.
As someone used to the bad old days, gimme containers. Yes, it kinda sucks, but it sucks less than the alternative. Can you imagine trying to get multiple versions of Postgres working for different applications you want to host on the same server? I also love being able to just use the host OS’s stock packages without needing to constantly compile and install custom things to make x or y work.
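For anyone who hasn’t tried it, running two Postgres major versions side by side really is this simple with containers. A hedged sketch (container names, host ports, volume names, and the password are all placeholders):

```sh
# App A gets Postgres 13 on host port 5433, with its own data volume...
docker run -d --name pg13 -e POSTGRES_PASSWORD=changeme \
  -p 5433:5432 -v pg13-data:/var/lib/postgresql/data postgres:13

# ...while App B gets Postgres 16 on 5434, fully isolated from it.
docker run -d --name pg16 -e POSTGRES_PASSWORD=changeme \
  -p 5434:5432 -v pg16-data:/var/lib/postgresql/data postgres:16
```

Each app just points at its own port, and neither install touches the host’s packages.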