
Kub

LLM service over TCP with shared context.

Usage

Dependencies

  • deno — the TypeScript runtime used for the project; install it with your package manager.
  • ollama — the local LLM service used to run chats; install it with your package manager.
  • llama3.2 — the language model used for chats; install it with ollama pull llama3.2.

Running

kub requires the ollama service to be active: either start it manually with ollama serve, or enable its systemd service with sudo systemctl enable --now ollama.service.

$ ./src/kub.ts
# Listening on port 8080
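The wire protocol is not documented here, but given the description (one context shared by every connected client), a rough sketch of how such a shared history could be modeled is shown below. All names (ChatMessage, addMessage, renderContext) are illustrative assumptions, not Kub's actual API.

```typescript
// Illustrative sketch only: Kub's real implementation may differ.
// Models a single chat history shared by every connected TCP client.

type ChatMessage = {
  nick: string; // sender's nickname; a nick change just alters this on later messages
  text: string;
  at: Date;
};

// One shared context for the whole service: every client
// appends to and reads from the same history.
const sharedContext: ChatMessage[] = [];

function addMessage(nick: string, text: string): ChatMessage {
  const msg = { nick, text, at: new Date() };
  sharedContext.push(msg);
  return msg;
}

// Render the history as a prompt for the LLM, so each reply
// sees everything said so far by all clients.
function renderContext(): string {
  return sharedContext.map((m) => `${m.nick}: ${m.text}`).join("\n");
}

addMessage("alice", "hi everyone");
addMessage("bob", "hello alice");
```

Persisting sharedContext to disk between restarts would correspond to the "Ensure no history gets overwritten" behavior mentioned in the commit history.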

TODO

  • Auto update.
  • Colored chat.
  • System messages.
    • Change of days.
  • Tools.
  • Tweak personality.