How to Install and Configure Ollama at a Custom Location on Linux


Ollama is a versatile AI model-serving tool designed for efficiency and ease of use. Sometimes, however, you need more control over your installation, such as placing the binaries and model files in a custom directory. In this tutorial, we’ll cover how to install Ollama in a custom location (/data/ollama) on your Linux server and configure it to be accessible remotely.

Step 1: Prepare the Custom Directory and User

First, create a dedicated directory and user for running Ollama securely:

sudo mkdir -p /data/ollama
sudo useradd -r -s /bin/false -U -d /data/ollama ollama
sudo chown -R ollama:ollama /data/ollama

Step 2: Download and Install Ollama Binaries

Download the Ollama Linux binary and extract it directly into your custom directory:

curl -L https://ollama.com/download/ollama-linux-amd64.tgz | sudo tar -xzf - -C /data/ollama
sudo chown -R ollama:ollama /data/ollama
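The tarball unpacks a `bin/` and `lib/` tree under the target directory. Before wiring up the service, it's worth confirming the binary landed where expected and runs (the paths below assume the /data/ollama prefix used throughout):

```shell
# Sanity check: the binary should exist at /data/ollama/bin/ollama
ls -l /data/ollama/bin/ollama

# Run it as the ollama user to confirm it executes
# (prints the installed version)
sudo -u ollama /data/ollama/bin/ollama --version
```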

Step 3: Configure Ollama as a Systemd Service

To have Ollama run automatically and reliably, create a systemd service file at /etc/systemd/system/ollama.service with the following contents:

[Unit]
Description=Ollama Service
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/data/ollama/bin/ollama serve
WorkingDirectory=/data/ollama
User=ollama
Group=ollama
Environment="HOME=/data/ollama"
Environment="OLLAMA_MODELS=/data/ollama/models"
Environment="OLLAMA_HOST=0.0.0.0:11434"
Restart=always
RestartSec=3

[Install]
WantedBy=multi-user.target

Create the models directory, then reload systemd and start the service:

sudo mkdir -p /data/ollama/models
sudo chown -R ollama:ollama /data/ollama/models

sudo systemctl daemon-reload
sudo systemctl enable --now ollama

Check the status to ensure everything works correctly:

sudo systemctl status ollama --no-pager
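If the service shows as failed, the journal usually explains why; a wrong ExecStart path or permission problems on /data/ollama are the usual suspects:

```shell
# Show the last 50 log lines for the ollama unit
sudo journalctl -u ollama --no-pager -n 50
```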

Step 4: Open the Firewall Port

If your server runs firewalld, allow incoming requests on Ollama’s default port (11434):

sudo firewall-cmd --permanent --add-port=11434/tcp
sudo firewall-cmd --reload
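The commands above assume firewalld, which is common on RHEL-family distributions. On Debian or Ubuntu systems using ufw, the equivalent would be:

```shell
# Allow Ollama's default port through ufw
sudo ufw allow 11434/tcp
sudo ufw reload
```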

Step 5: Verify the Installation

Test your installation locally with a quick command:

curl http://localhost:11434/api/version

You should see a JSON response indicating Ollama’s version.
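The exact version number will vary with the release you downloaded, but the response has the shape `{"version":"0.6.2"}`. To confirm that the server is reachable over the network and can actually serve models, you can pull a small model and issue a generate request; the model name and server IP below are examples, so substitute your own:

```shell
# Pull a small model through the running server (model name is an example)
/data/ollama/bin/ollama pull llama3.2

# From another machine, confirm remote access
# (replace 192.0.2.10 with your server's address)
curl http://192.0.2.10:11434/api/version

# Ask the server to generate a short completion
curl http://192.0.2.10:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Say hello", "stream": false}'
```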


Congratulations! You’ve successfully installed and configured Ollama at a custom location on Linux. The binaries and models now live under /data/ollama, the service runs as an unprivileged user, and the API is reachable over the network on port 11434.
