

3000 is the OpenWebUI port; I never got it to work bound to either 127.0.0.1 or localhost, only 0.0.0.0. Ollama's port 11434 on 127.x worked fine, though.
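If it helps, the usual explanation is the bind address: a server bound to 0.0.0.0 accepts connections on every interface, while one bound to 127.0.0.1 is reachable from loopback only, so which address works depends entirely on what the service binds to, not on the client. A minimal sketch with plain Python sockets (illustrative only, not OpenWebUI or Ollama code):

```python
import socket

def can_connect(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Bind a throwaway server to loopback only, the way Ollama's
# default 127.0.0.1:11434 listener behaves.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

# Loopback clients can reach a loopback-bound socket...
print(can_connect("127.0.0.1", port))
# ...but a host bound only to 0.0.0.0 would be the one reachable
# from other machines on the network.
server.close()
```

So if OpenWebUI only answered on 0.0.0.0, it was likely listening on all interfaces while something (a container network, a proxy, or the client address used) kept the loopback path from lining up.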
You don't want to be punching a tunnel from whatever can talk to your portable device straight through to the LLM machine.
Fair point.
Dang, good job and thanks for following up! 🎇👏😯