How the fuck does that business model work? 10TB is cheaper than Backblaze B2 in 20 months.
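For scale, a rough back-of-envelope using B2's list price (assumed here at ~$6/TB/month — check current pricing):

```shell
# Back-of-envelope: what 10 TB on Backblaze B2 costs over 20 months.
# $6/TB/month is an assumption based on B2's published list price.
tb=10
per_tb_month=6   # USD, assumed B2 list price
months=20
echo $((tb * per_tb_month * months))   # prints 1200 (USD over 20 months)
```

So any one-time price under ~$1,200 for 10 TB breaks even against B2 inside 20 months.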


That’s an incredibly good point. Bad actors are the worst. Some ideas:
Definitely a difficult problem to solve. I’m sure people smarter than me have ideas beyond mine.


That NAS software company Linus (of Linus Tech Tips) funded has a feature planned for this, I think.
An open-source standalone implementation would be dope as hell. Sure, it’d mean doubling your NAS capacity (you’d have to provide as much storage as you use), but that’s way easier than building a second NAS and storing/maintaining it somewhere else, or constantly paying for and managing a cloud backup.


Yeah, people have done workarounds and stuff to get their entire NAS backed up, but those seemed sketchy and bad when I looked into them.


I have an Index on Kubuntu. It works, I was able to play Beat Saber, but it felt a bit jelly-y. The framerate was fine, but it didn’t feel fine. I haven’t had time to troubleshoot further.


F-Droid Sideloading
Almost everything in F-Droid can simply be sideloaded. Since F-Droid can’t back up your app list, I’ve completely switched to Obtainium.




The first two dressings you listed are much healthier than the latter two. If I’m eating a salad, I don’t need to put a caloric dressing on it.


People into Jellyfin use smart TVs? I haven’t connected mine to the internet.
What have you tried to find an instance? Looks like PeerTube.wtf has open registration, and I found that with a quick search.
I used NGINX with Certbot and haven’t had to manually touch anything HTTPS-related; it’s great.
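For anyone curious, the usual flow with Certbot's nginx plugin looks like this (example.com is a placeholder; assumes certbot and the nginx plugin are installed):

```shell
# Obtain a certificate and let Certbot rewrite the nginx config for HTTPS.
# example.com is a placeholder domain.
sudo certbot --nginx -d example.com

# Renewal runs automatically via a systemd timer or cron job;
# you can sanity-check the setup with a dry run:
sudo certbot renew --dry-run
```

After that, Certbot keeps the certs fresh on its own, which is why there's nothing to touch manually.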


You can still go to your mom’s house and get your data, unlike if you’re renting a VM.
I’m also not arguing that renting a VM isn’t self-hosting; I’m certain I’ve said nothing of the sort. I’m just arguing that it’s worse than owning your hardware, and therefore your data.


That’s not really the point I’m trying to make, I’m not sure where this disconnect is coming from.
The question you should be asking is whether I can more easily access my home than a data center, to which the answer is yes.
If the entire world disappeared aside from the plot of land I live on, well, I’d have larger issues, but I would still be able to access my data, until the generator ran out of gas of course.
To answer the question you did ask (which, again, isn’t relevant to the point I’m trying to make): yes. I work from home, and I live in America where we don’t have third places, so I do most often access my data from home. Additionally, most of the services I self-host are for home automation and data backup. Sure, I wouldn’t be able to reach Immich or Home Assistant while away from home, which would be annoying, but the end of the world? Not really. A lot of people intentionally don’t expose their HA/Immich instances to the internet.


I’m not sure what the disconnect is here. In both scenarios I’m reliant on an ISP. In the scenario where it’s on a data center, if my internet goes down or the data center goes down, I am shit out of luck. I am not capable of accessing my data. If it’s hosted at my house, I still have the ability to go home and access my stuff. One seems much better than the other to me. It’s the difference between being able to access your stuff and not.
There are definitely positives to both, but having physical access to my own hardware that contains my own data is a huge positive to me.


Sure, but if stuff goes really south, I can still access the stuff on my hardware from my home. If stuff goes down, I cannot access the stuff in data centers, period. There are positives and negatives either way, but imo owning your shit is a huge positive.


This feels like a bad faith argument. If the internet goes down, I will be able to access my servers and my data by simply going home. If those services were hosted in the cloud, I wouldn’t be able to access my data at all. Obviously one is better than the other.


Anyone looking to set this up should note that the instructions are slightly incorrect: the URL when adding the repository must be updated to the new repository location. There’s a PR for the change.
Additionally, I’ll mention that it just didn’t work for me, whereas OwnTracks worked flawlessly. The errors it gave were of little help, so I’m not sure why it wasn’t working. I’m planning to stick with OwnTracks for now.


Hi, this looks awesome. I just got it set up, and it’s working great, however I cannot get the Immich integration to work. Here’s the error I get when I paste in the domain and the freshly created API key. I’ve tried the API key with all permissions as well as with only the required permissions.
Connection failed: I/O error on POST request for "https://my.immich.domain/api/auth/validateToken": my.immich.domain
Any ideas of troubleshooting steps I could try? Thanks for sharing such a cool tool.
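For anyone hitting the same thing: the exception ending in just the hostname usually points at a DNS or connectivity failure from the machine running the tool, not a bad key. Commands like these can narrow it down (domain and key are placeholders; assuming Immich's API-key auth via the `x-api-key` header):

```shell
# Can the host running the tool resolve the Immich domain at all?
# my.immich.domain and YOUR_API_KEY are placeholders.
getent hosts my.immich.domain

# Hit the same endpoint the integration uses, with the key in the
# x-api-key header, and look at the HTTP status:
curl -i -X POST \
  -H "x-api-key: YOUR_API_KEY" \
  https://my.immich.domain/api/auth/validateToken
```

If `getent` comes back empty, it's a DNS problem inside whatever container or host the tool runs on; if the curl returns 401, it's the key.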


Or, at the very least, a reverse proxy.
Memory or storage?