

Just do what I do: install Ollama and OpenWebUI on the server, install Termux on your Android phone, and connect through Termux with SSH port forwarding:
ssh -L 0.0.0.0:3000:ServerIP_OnLAN:3000 YourUser@ServerIP_OnLAN
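For completeness, the one-time Termux setup is roughly this (a sketch assuming key-based login; YourUser and the paths are placeholders for your own setup):

pkg install openssh
ssh-keygen -t ed25519
# copy ~/.ssh/id_ed25519.pub into ~/.ssh/authorized_keys on the server,
# then run the ssh -L command above and leave that session open while you browse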
Then open OpenWebUI at http://127.0.0.1:3000/ in your phone's browser. Alternatively, SSH-forward the Ollama port and use the Ollama Android app (sketch below). This requires you to be on the same LAN as the server. If you port forward SSH through your router, you can reach it remotely through your public IP (if so, I'd recommend allowing login only with keys/certificates and rate-limiting SSH login attempts).
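For the Ollama-app route, it's the same tunnel on Ollama's port instead (a sketch; 127.0.0.1 as the destination matches my setup, adjust if Ollama listens on another address for you):

ssh -L 0.0.0.0:11434:127.0.0.1:11434 YourUser@ServerIP_OnLAN
# the app then reaches Ollama at 127.0.0.1:11434 on the phone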
For the remote case, the shell command then becomes:
ssh -L 0.0.0.0:3000:ServerIP_OnLAN:3000 YourUser@YourPublicIP
(the tunnel destination stays the server's LAN address; only the SSH target changes to your public IP)
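On the hardening side, this is roughly what I mean by key-only login (a sketch of /etc/ssh/sshd_config on the server; the exact service name and a rate limiter like fail2ban depend on your distro):

# /etc/ssh/sshd_config
PasswordAuthentication no
PubkeyAuthentication yes
PermitRootLogin no
# then restart sshd, e.g. sudo systemctl restart sshd (the unit may be called ssh)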
But what are the chances that you run the LLM on a Linux machine and connect from an Android phone, like I do, rather than on a Windows machine with an iPhone? You tell me. No specs posted…
3000 is the OpenWebUI port. I never got it to work using either 127.0.0.1 or localhost in the ssh -L command, only 0.0.0.0; Ollama's port 11434 on 127.x worked fine, though (a quick way to check the bindings is below).

Fair point.
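Back on the 0.0.0.0 vs 127.0.0.1 thing: to see which address OpenWebUI is actually bound to on the server, something like this works (ss ships with iproute2; the interpretation comment is my assumption about a typical setup):

ss -tlnp | grep 3000
# 127.0.0.1:3000 means loopback only; 0.0.0.0:3000 or the LAN IP means that address should work as the -L destination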