# Kub

LLM service over TCP with shared context.
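The "shared context" idea can be sketched as a single conversation history that every TCP client appends to, so the model always sees the full conversation regardless of who sent each message. The `Message` type, `chat` helper, and stubbed assistant reply below are illustrative assumptions, not kub's actual code:

```typescript
// One message in the conversation, following the common chat-API shape.
type Message = { role: "user" | "assistant" | "system"; content: string };

// A single history shared across all connections: this is what makes
// the context "shared" rather than per-client.
const sharedContext: Message[] = [];

// Record a user message and the model's reply. In the real service the
// reply would come from ollama (e.g. the llama3.2 model); here it is
// stubbed so the sketch stays self-contained.
function chat(userInput: string): Message[] {
  sharedContext.push({ role: "user", content: userInput });
  sharedContext.push({ role: "assistant", content: `echo: ${userInput}` });
  return sharedContext;
}
```

Because every client calls into the same `sharedContext`, a second client's question arrives with the first client's exchange already in the history.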
## Usage

### Dependencies

- `deno`: the TypeScript runtime used for the project; install it with your package manager.
- `ollama`: the local LLM service used to run chats; install it with your package manager.
- `llama3.2`: the language model used for chats; install it with `ollama pull llama3.2`.
### Running

kub requires the ollama service to be active. Either start it manually:

```sh
ollama serve
```

or enable its systemd service:

```sh
sudo systemctl enable --now ollama.service
```

Then start kub:

```sh
$ ./src/kub.ts
# Listening on port 8080
```
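Since kub speaks plain TCP, any TCP client can talk to it once it reports that it is listening; `nc` is one option (port 8080 is the default shown above, and the interactive prompt format is an assumption):

```sh
# connect to the running kub instance on its default port
nc localhost 8080
```

Type a message and press enter; the model's reply is streamed back over the same connection.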
## TODO
- Auto update.
- Colored chat.
- System messages.
- Change of days.
- Tools.
- Tweak personality.