
Kub

LLM service over TCP with shared context.

Usage

Dependencies

  • deno — TypeScript runtime used for the project; install it with your package manager.
  • ollama — local LLM service used to run chats; install it with your package manager.
  • llama3.2 — language model used for chats; install it with ollama pull llama3.2.
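
For example, on Arch Linux (package names assumed to match the official repositories; adjust for your distribution):

```shell
# Install the TypeScript runtime and the local LLM service.
sudo pacman -S deno ollama
# Pull the model used for chats.
ollama pull llama3.2
```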

Running

kub requires the ollama service to be active: either start it manually with ollama serve, or enable its systemd service with sudo systemctl enable --now ollama.service.

$ ./src/kub.ts
# Listening on port 8080

TODO

  • Colored chat.
  • Persistent chat.
  • System messages.
    • Change of days.
  • Tools.
  • Tweak personality.