Zack Saadioui
8/26/2024
When the Ollama server starts, the log confirms that it came up and which model and acceleration libraries it loaded:

```plaintext
[INFO] Ollama started successfully!
[INFO] Loaded model: Llama3 with dynamic libraries CUDA, CPU.
```
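If you want to watch these messages as they are written, you can follow the log file directly. This is a minimal sketch assuming the macOS log path covered later in this guide; adjust the path for your platform:

```bash
# Follow the Ollama server log as new lines are written (macOS default location).
tail -f ~/.ollama/logs/server.log
```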
The log also records every API request, including the endpoint, the client address, and the HTTP status code:

```plaintext
[2023-11-02 17:00:23] [INFO] POST request to /api/generate from 192.168.1.1 (Status: 200)
```
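For reference, a request like the one below produces an entry of that shape. It assumes Ollama is listening on its default port 11434 and that a model named llama3 has been pulled; both names are illustrative:

```bash
# Send a generation request; the server logs the POST to /api/generate.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?"}'
```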
When a GPU is detected, the log reports which device is in use and how much memory is available to it:

```plaintext
[INFO] GPU used: NVIDIA GTX 1080, Memory available: 4077 MB
```
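To pull just these hardware lines out of a long log, a simple grep is enough. The path below again assumes the macOS location described next:

```bash
# Extract GPU and memory lines from the server log.
grep -iE "gpu|memory" ~/.ollama/logs/server.log
```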
On macOS, Ollama keeps its logs in ~/.ollama/logs, and you can print the current server log with:

```bash
cat ~/.ollama/logs/server.log
```
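To see every log file in that directory, or just the most recent entries, something like the following works:

```bash
# List all log files and show the last 50 lines of the current one.
ls -lh ~/.ollama/logs
tail -n 50 ~/.ollama/logs/server.log
```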
On Linux, where Ollama typically runs as a systemd service, read the logs with journalctl:

```bash
journalctl -u ollama --no-pager
```
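Two common variations on that command are to follow the log live or to filter it for errors; both use standard journalctl and grep options:

```bash
# Follow the service log in real time.
journalctl -u ollama -f

# Show only lines that mention an error.
journalctl -u ollama --no-pager | grep -i error
```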
The Linux install also keeps log files on disk in:

```bash
/usr/share/ollama/.ollama/logs/
```
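A quick way to inspect that directory without knowing the exact file names is sketched below, using only standard shell tools:

```bash
# List the Linux log directory, newest files first, and tail the most recent one.
ls -lt /usr/share/ollama/.ollama/logs/
tail -n 50 "$(ls -t /usr/share/ollama/.ollama/logs/* | head -n 1)"
```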
On Windows, open the log folder from a command prompt (or the Run dialog):

```cmd
explorer %LOCALAPPDATA%\Ollama
```

The most recent log is server.log; older logs are rotated to server-#.log.
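If you prefer to stay in the terminal, the current log can be printed directly; the path below assumes the default install location opened by the explorer command above:

```cmd
REM Print the current Windows server log (default install location).
type %LOCALAPPDATA%\Ollama\server.log
```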