

Ah, yeah I feel that. Been dealing with it in my own setup, and sometimes they get annoyed when stuff gets done on my time 😅
I’ve thought about deploying overseerr to take some of that off me, but I’m lazy…


What’s wrong with your setup that makes you have to touch your stack constantly?
I have radarr and sonarr, and I only ever touch them to add media. Hell, I use prowlarr more than either of the others just cuz I don’t have any kind of management for other media I partake in


I don’t use SonicWalls, and I don’t have any configured, so I can’t help you with specifics beyond what other vendors do.
Just guessing, but they might be doing some kind of network-level exploit detection alongside the VPN. My network team has that set up on our firewall (multi-zone, including VPN), and I’ve been called in on more than a few security calls triggered by network EDR. If your vendor has people you can pull in for that kind of scenario, it would probably be worth it (my CISO is always trying to get customers to buy into the service BEFORE we have to get on a call rather than after).


I don’t have the attention span to draw out a Pepe Silvia looking graph (even if I periodically have to try to explain it to newbies haha)


It sounds like your Ubiquiti gear and your ISP router are on the same LAN segment, which is not a good config.
You should never have multiple DHCP servers configured unless you’re intentionally split-braining your VLAN (I’ve only ever done that for HA purposes, with half of the pool on each server). I’m pretty sure you need your ISP connection going into your cloud gateway, and all of your gear connected to the Ubiquiti. Your ISP router should only see your Ubiquiti, and that’s likely a good part of the reason you can’t see all the DHCP leases on your Ubiquiti gear.
Were I in your position, I’d probably disconnect everything and slowly reconnect stuff one piece at a time until you trip over what’s causing your issue. I doubt this is the case, but you could also have another DHCP server running on something you forgot about causing issues. Seen that many times before when doing small business network overhauls.
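If you want to smoke out a rogue DHCP server without unplugging everything, you can broadcast a DISCOVER and see how many boxes answer. Here’s a rough sketch (packet layout follows RFC 2131; the MAC, timeout, and transaction ID are made-up examples, and sending requires root since it binds port 68):

```python
import socket
import struct

def build_dhcp_discover(mac: bytes, xid: int = 0x1337) -> bytes:
    """Build a minimal DHCP DISCOVER packet (RFC 2131 layout)."""
    pkt = struct.pack(
        "!BBBBIHH4s4s4s4s16s",
        1,       # op: BOOTREQUEST
        1,       # htype: Ethernet
        6,       # hlen: MAC address length
        0,       # hops
        xid,     # transaction ID
        0,       # secs
        0x8000,  # flags: ask servers to broadcast the reply
        b"\x00" * 4,             # ciaddr
        b"\x00" * 4,             # yiaddr
        b"\x00" * 4,             # siaddr
        b"\x00" * 4,             # giaddr
        mac.ljust(16, b"\x00"),  # chaddr
    )
    pkt += b"\x00" * 192          # sname + file fields (unused)
    pkt += b"\x63\x82\x53\x63"    # DHCP magic cookie
    pkt += b"\x35\x01\x01"        # option 53: message type = DISCOVER
    pkt += b"\xff"                # end option
    return pkt

def probe(timeout: float = 3.0) -> list[str]:
    """Broadcast a DISCOVER and collect the IPs of everything that answers.
    More than one responder usually means a rogue DHCP server."""
    pkt = build_dhcp_discover(b"\xde\xad\xbe\xef\x00\x01")
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.bind(("", 68))  # DHCP client port (needs root)
    s.settimeout(timeout)
    s.sendto(pkt, ("255.255.255.255", 67))
    servers = []
    try:
        while True:
            _, addr = s.recvfrom(1024)
            servers.append(addr[0])
    except socket.timeout:
        pass
    return servers
```

If `probe()` returns two addresses, you’ve found your duplicate DHCP server without touching a single cable.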


I tried this for a bit last year, and I never got it to work. I’m sure it was user error, but I’m super glad I got a new kit before shit blew up.
Might have to track down the bad kit I had and give this another try for giggles.


Agreed. I spent a bit of time writing a script for similar functionality for one of our business units, but I was never able to figure out how to convert Excel sheets to PDF so I could merge them in the allotted time, so it just doesn’t support them lol.
But I can see why it wouldn’t have an API, since the whole deal is it stays in your browser, and an API would mean sending the files to the server.
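For what it’s worth, the piece I was missing turns out to be doable by shelling out to LibreOffice. A rough sketch (assumes `soffice` is on PATH; the filenames are made up, and you’d still need a separate PDF library or tool for the merge step):

```python
import subprocess
from pathlib import Path

def xlsx_to_pdf_cmd(xlsx: Path, outdir: Path) -> list[str]:
    """Build the LibreOffice headless conversion command.
    (soffice and these flags are real; the paths are examples.)"""
    return [
        "soffice", "--headless",
        "--convert-to", "pdf",
        "--outdir", str(outdir),
        str(xlsx),
    ]

def convert(xlsx: Path, outdir: Path) -> Path:
    """Convert one workbook to PDF and return the output path."""
    subprocess.run(xlsx_to_pdf_cmd(xlsx, outdir), check=True)
    return outdir / (xlsx.stem + ".pdf")
```

Headless LibreOffice is slow to start, so for a batch job you’d want to pass all the workbooks in one invocation rather than calling `convert` in a loop.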


The fact that they gave the peace prize to that fucking war criminal says all you need to know.


No surprise world news lets a pedo/groomer back in. That place fucking sucked years ago.


Because Israel is a fascist nation, and many modern countries are run by authoritarians who will gladly line up to test new weapons on brown people and hide behind accusations of antisemitism whenever someone calls out Israel for their genocide
I use ChatGPT when I care to, and while I was given a subscription by work, I’m not actively encouraged to use it. I really only use it for researching problems that Google search is too SEO-poisoned to help me with, or for debugging scripts. Past that it doesn’t have much professional use for me, given how much time I spend validating output and insulting the AI for hallucinating and just generally being terrible at moderate tasks.
Basic data interpretation can be pretty great though. I’ve had it find a couple problems I missed after having it parse log files.


If the duration is 45 days, then they will give you 365/45 certificates?
Minimum. We go through DigiCert at work, and we abuse the hell out of our wildcard and reissue it tons of times a year. You’re buying a service for the year, not an individual cert.
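The napkin math behind that “minimum,” as a sketch (pure arithmetic, no vendor specifics):

```python
import math

SERVICE_DAYS = 365    # you're buying the service for a year
CERT_LIFETIME = 45    # proposed max cert duration in days

# Rotating certs back-to-back, you need at least this many issuances
# to stay covered for the full term, hence "minimum":
reissues_per_year = math.ceil(SERVICE_DAYS / CERT_LIFETIME)
print(reissues_per_year)  # 9
```

In practice you’d rotate early rather than at the exact expiry, so the real count only goes up from there.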


Given that automating things like this is annoying sometimes, you’ll be sure people will max out the 45 days…
I know from professional experience that this is a stupid as fuck idea that leads to outages. One of the many reasons I’m working to automate those annoying ones.
Also, don’t let perfect be the enemy of better.


Lol, never had to buy a cert huh?
You’re still buying a year or more at a time, no matter the lifetime of the cert itself. Even if the cert lifetime was a week, you’re still buying the same product, no matter how many times you rotate it.


Personally, yes. Everything is behind NPM and SSL cert management is handled by certbot.
Professionally? LOL NO. Shit is manual and usually relegated to overnight staff. I’ve been working on getting it automated, but there are too many bespoke apps for anyone to have cared enough to automate the process before me.
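On the homelab side, the certbot half really is set-and-forget. A sketch of the kind of cron entry involved (the schedule and the nginx reload hook are my assumptions; NPM actually handles renewals internally for you):

```shell
# Example /etc/cron.d entry: `certbot renew` is a no-op unless a cert
# is within ~30 days of expiry, so running it twice a day is safe.
0 3,15 * * * root certbot renew --quiet --deploy-hook "systemctl reload nginx"
```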


And you still can’t self certify.
Skill issue; you’ve always been able to self-certify. You just have to know where to drop the self-signed cert or the parent/root cert you use to sign stuff.
If you’re running Windows, it’s trivial to make a self-signed cert trusted. There’s an entire certificate store you can access that makes it easy enough to double-click the cert, install it, and be on your way. Haven’t had a reason to figure it out on Linux, but I expect it won’t be super difficult.
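On Linux it’s the same idea as the Windows store, just file-based. A sketch for Debian/Ubuntu (the hostname is made up, and the trust-store path differs per distro):

```shell
# Generate a self-signed cert + key (no passphrase, 1-year lifetime)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout myapp.key -out myapp.crt -subj "/CN=myapp.local"

# Trust it system-wide (Debian/Ubuntu layout; RHEL-likes use
# /etc/pki/ca-trust/source/anchors/ + update-ca-trust instead):
#   sudo cp myapp.crt /usr/local/share/ca-certificates/
#   sudo update-ca-certificates
```

Note that browsers like Firefox and Chrome keep their own trust stores, so you may have to import the cert there separately.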


(Note that the max duration was reduced from 3 years to 398 days earlier this year.)


I’ve been dreading this switch for months (I still am, but I have been, too!) considering this year and next year will each double the amount of cert work my team has to do. But, I’m hopeful that the automation work I’m doing will pay off in the long run.


So, if you would, help me out with the ‘why’ part
It eliminates a single point of failure, can be used to bypass censorship, and allows for community support/engagement in a way that is harder to track and suppress (in that there’s no ‘central’ hub and you have to go after nodes individually; from an opsec point of view, you’re still broadcasting a signal that someone in range can pick up). Obviously it requires many devices to make a good mesh work, but short of DoSing every channel or just blowing out the signal space, it’s gonna be hard to take that down.
I see it as something like tor or i2p, not something for general use at the moment, but definitely has good uses.
I should set up that Bitwarden feature that lets people ask for access and get it if you don’t respond in a set amount of time.