current state where it is working

This commit is contained in:
Moritz Graf 2026-02-07 13:59:40 +01:00
parent 990090bed7
commit 0865c1f80f
3 changed files with 15 additions and 5 deletions


@@ -778,7 +778,9 @@ An autonomous AI agent platform.
 1. **Create Namespace**: `kubectl apply -f openclaw/namespace.yaml`
 2. **Configure Secrets**:
    * Edit `openclaw/openclaw.secret.yaml`.
-   * Replace `change-me` with your Gemini API Key.
+   * **Gemini**: Replace `change-me` with your Gemini API Key.
+   * **Telegram**: Replace `telegram-bot-token` with your Bot Token.
+   * **Gateway**: Token is pre-filled (randomly generated). Change if desired.
    * **Encrypt**: Ensure the file is encrypted with `git crypt` before committing!
 3. **Deploy**: `kubectl apply -f openclaw/openclaw.secret.yaml`
 4. **Access**: `https://openclaw.haumdaucher.de`
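
The encryption check in step 2 can be verified before committing. A minimal sketch, assuming the repository already has `git-crypt` initialized and a `.gitattributes` pattern covering the secret file:

```shell
# Confirm git-crypt will encrypt the secret before it reaches the repo.
# "encrypted" means the .gitattributes filter applies to this path.
git crypt status openclaw/openclaw.secret.yaml

# If the file is reported as "not encrypted", it would be committed in
# plaintext; fix the .gitattributes pattern before running `git commit`.
```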


@@ -64,10 +64,18 @@ spec:
             port: http
           initialDelaySeconds: 30
           periodSeconds: 5
-        lifecycle:
-          postStart:
-            exec:
-              command: ["/bin/sh", "-c", "sleep 10; ollama pull llama3.1:8b-instruct-q8_0"]
+        command: ["/bin/sh", "-c"]
+        args:
+          - |
+            # Start Ollama in background
+            /bin/ollama serve &
+            PID=$!
+            echo "Waiting for Ollama..."
+            sleep 10
+            echo "Pulling model..."
+            ollama pull llama3.1:8b-instruct-q8_0
+            echo "Model pulled. Keeping container alive."
+            wait $PID
       volumes:
       - name: ollama-storage
         persistentVolumeClaim:
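
The fixed `sleep 10` in the new entrypoint works, but it races the server startup on slow nodes. A hedged alternative sketch that polls the server instead of sleeping (this assumes Ollama's default listen address `127.0.0.1:11434` and that `curl` is available in the image, which the stock image may not include):

```shell
#!/bin/sh
# Start the Ollama server in the background and remember its PID.
/bin/ollama serve &
PID=$!

# Poll the HTTP endpoint until the server responds, instead of
# guessing a fixed startup delay.
until curl -sf http://127.0.0.1:11434/ >/dev/null; do
  sleep 1
done

# Pull the model, then block on the server process so the
# container stays alive as long as Ollama runs.
ollama pull llama3.1:8b-instruct-q8_0
wait $PID
```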

Binary file not shown.