

but don’t you need kernel access to verify everything that is running




multiple times a day for me


idk what to tell you, because I just tried it and it works


ah, this puts it together and it’s exactly what I was looking for, thanks


ah yes, I stopped watching the guy because of that and the clickbait, but he does make some interesting content sometimes.


so you haven’t tried it recently


maybe the last time you tried it was over 6 months ago, maybe you’re using the old google assistant, or idk, but it definitely works for me


yeah, that’s what I’m looking for. Do you know of a way to integrate ollama with HA?




I don’t quite get what this is supposed to do. Is it basically software that lets jellyfin/plex users request media without needing a radarr/sonarr account?


it just means they’ll be a passive node, but still able to seed if they connect to the other node. It’s the setup I have and I manage to keep an overall ratio >1, especially if the torrent is popular.


if you use this often, you can add a keyword search (firefox-based browsers) or a custom site search (chromium-based) with this URL
https://icon-sets.iconify.design/?query=%25s
(use %s after equals; some lemmy front-ends seem to be rendering it wrong)
and a shortcut e.g. icon
so every time you enter e.g. icon person in a new tab, it’ll run the search for you
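For the curious, the substitution the browser does under the hood is roughly this (an illustrative sketch only, reusing the template URL from above; real browsers also percent-encode whatever you type):

```python
from urllib.parse import quote

# The keyword-search template from above; %s is the placeholder
# the browser fills in with the text typed after the shortcut.
TEMPLATE = "https://icon-sets.iconify.design/?query=%s"

def keyword_search_url(query: str) -> str:
    # Percent-encode the query first, so spaces become %20 and
    # special characters stay URL-safe, then substitute it in.
    return TEMPLATE.replace("%s", quote(query))

# Typing "icon person" in the address bar would produce:
print(keyword_search_url("person"))
```

This is also why the URL in the comment shows `%25s` when a front-end double-encodes it: `%25` is the percent-encoding of `%` itself.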


you just know a company like Microsoft or Apple will eventually try suing an open source project over AI code that’s “too similar” to their proprietary code.
Doubt it. The incentives don’t align. They benefit from open source much more than they are threatened by it. Even the “embrace, extend, extinguish” idea comes from a different era, and it’s likely less profitable than the vendor lock-in and other modern practices that are actually in place today. Even the copyright argument could easily backfire if they threw it into a case, given all this questionable AI training.


yes, the system will likely use some swap if available even when there’s plenty of free RAM left:
The casual reader may think that with a sufficient amount of memory, swap is unnecessary but this brings us to the second reason. A significant number of the pages referenced by a process early in its life may only be used for initialisation and then never used again. It is better to swap out those pages and create more disk buffers than leave them resident and unused.
Src: https://www.kernel.org/doc/gorman/html/understand/understand014.html
In my recently booted system with 32GB and half of that free (not even “available”), I can already see 10s of MB of swap used.
As a rule of thumb, swap use is only a concern, or an indication that the system is or was starved of memory, if a significant share of swap is in use. But even then, it might just be some pages hanging around because the kernel decided to keep them instead of evicting them.
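You can check this yourself by reading the same counters `free` uses. A minimal sketch, fed a sample `/proc/meminfo`-style string so it’s self-contained (the numbers are made up to mirror the situation above; on a real Linux box you’d read the actual file):

```python
# Sample of the /proc/meminfo format; values are illustrative.
SAMPLE_MEMINFO = """\
MemTotal:       32768000 kB
MemFree:        16384000 kB
SwapTotal:       8388608 kB
SwapFree:        8345600 kB
"""

def swap_used_kib(meminfo_text: str) -> int:
    # Parse "Key:   value kB" lines into a dict of integers,
    # then compute used swap the same way free(1) does.
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        fields[key] = int(rest.split()[0])
    return fields["SwapTotal"] - fields["SwapFree"]

# On a real system: swap_used_kib(open("/proc/meminfo").read())
print(f"{swap_used_kib(SAMPLE_MEMINFO)} KiB of swap in use")
```

With these sample numbers, half the RAM is free yet ~42 MB of swap is already in use, which is exactly the harmless pattern described above.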


if my system touches SWAP at all, it’s run out of memory
That’s a swap myth. Swap is not emergency memory; it’s a reclamation space on disk for anonymous pages (pages that are not file-backed) so that the OS can use main memory more efficiently.
The swapping algorithm does take into account the higher cost of putting pages in swap. Touching swap may just mean the kernel swapped out idle anonymous pages to make room for file caches, but that’s reclaimable space and it doesn’t mean the system is running out of memory.


potentially relevant: paperless recently merged some opt-in LLM features, like chatting with documents and automated title generation based on the extracted OCR text.


no problem. I can see that, at the same time, the directory of the place I work at has 20x that number and finding someone is never an issue, so I also don’t bother cleaning up my local list.


of course most are not used; that’s fine. I don’t understand why anyone would bother deleting “unused” contacts


I have over 800 and I’m not even a salesperson or anything like that; that’s mostly from exchanged emails over the years


KDE on EndeavourOS works with HDR for me (latest drivers). Ubuntu is usually a few months behind on updates, but I wouldn’t expect Plasma 6 to crash every time you try HDR. I hope you’re able to narrow down the cause, or get a magical update that fixes it.