# 🧩 Leetdle
A daily coding challenge game inspired by Wordle — but for code. Every day at midnight (Paris time), a new AI-generated coding challenge is published. Players solve it in Python, JavaScript, TypeScript, or Rust, with their code executed in secure, sandboxed Docker containers.
## 🎯 Project Overview
Leetdle is a full-stack web application where users:
- Read a daily coding challenge with description, examples, and test cases
- Write a solution in their preferred language using an in-browser Monaco editor
- Submit their code, which runs against hidden test cases inside an isolated Docker container
- Compete — solve it in fewer tries and less time, then share results
The challenge is automatically generated by AI (GPT-4.1), validated by running AI-generated solutions against the test cases, and saved to a SQLite database — all without human intervention.
## 🏗️ Architecture
```mermaid
graph TB
    subgraph Frontend["Frontend (Next.js)"]
        SSR["Server-Side Rendering<br/>page.tsx"]
        GC["GameClient.tsx<br/>Monaco Editor + UI"]
    end
    subgraph Backend["Backend (Hono + Node.js)"]
        API["REST API<br/>/api/challenge<br/>/api/execute<br/>/api/submissions"]
        AI["AI Service<br/>GPT-4.1"]
        EXEC["Executor Service"]
        FETCH["Fetcher Job<br/>Midnight Scheduler"]
        DB["SQLite<br/>Drizzle ORM"]
    end
    subgraph Executors["Docker Executor Containers"]
        PY["🐍 leetdle-python<br/>Alpine + Python3"]
        JS["📜 leetdle-javascript<br/>Alpine + Node.js"]
        TS["🔷 leetdle-typescript<br/>Alpine + Node.js + ts-node"]
        RS["🦀 leetdle-rust<br/>rust:alpine + Cargo"]
    end
    SSR -->|"fetch /api/challenge/daily"| API
    GC -->|"POST /api/execute"| API
    GC -->|"POST /api/submissions"| API
    API --> EXEC
    API --> DB
    FETCH -->|"Midnight Paris"| AI
    AI -->|"Validate solutions"| EXEC
    FETCH --> DB
    EXEC -->|"docker run --rm"| PY
    EXEC -->|"docker run --rm"| JS
    EXEC -->|"docker run --rm"| TS
    EXEC -->|"docker run --rm"| RS
    style Frontend fill:#1a1a2e,stroke:#7c3aed,color:#fff
    style Backend fill:#1a1a2e,stroke:#06b6d4,color:#fff
    style Executors fill:#1a1a2e,stroke:#ef4444,color:#fff
```
## ⚙️ Challenge Generation & Validation Pipeline
Every day at midnight (Europe/Paris), the Fetcher Job triggers a fully automated pipeline:
```mermaid
flowchart TD
    A["⏰ Midnight Paris<br/>Fetcher Job triggers"] --> B{"Today's challenge<br/>exists?"}
    B -->|Yes| SKIP["Skip — already done"]
    B -->|No| C["Request challenge from<br/>GPT-4.1"]
    C --> D["Parse JSON response<br/>title, description, difficulty,<br/>testCases, solutions, stubs"]
    D --> E["🔄 Validation Loop<br/>(for each language)"]
    E --> F["Generate AI test runner<br/>for language"]
    F --> G["Run AI solution in<br/>Docker sandbox"]
    G --> H{"All tests pass?"}
    H -->|Yes| I["✅ Language validated"]
    H -->|No| J{"Fix attempts<br/>remaining?"}
    J -->|Yes| K["Ask AI to fix<br/>the solution"]
    K --> G
    J -->|No| L["❌ Language failed"]
    I --> M{"All 4 languages<br/>passed?"}
    L --> M
    M -->|Yes| N["💾 Save to SQLite"]
    M -->|No| O{"Generation attempts<br/>remaining? (max 3)"}
    O -->|Yes| C
    O -->|No| P["🚨 Generation failed"]
    style A fill:#7c3aed,stroke:#7c3aed,color:#fff
    style N fill:#22c55e,stroke:#22c55e,color:#fff
    style P fill:#ef4444,stroke:#ef4444,color:#fff
```
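The "midnight Paris" trigger at the top of the pipeline can be sketched in TypeScript. This is an illustrative sketch only — `msUntilParisMidnight` and `scheduleDailyGeneration` are hypothetical names, not the actual `fetcher.ts` implementation:

```typescript
// Sketch: compute the delay until the next midnight in Europe/Paris,
// independent of the server's own timezone, then re-arm a setTimeout.
// Names are illustrative; the real fetcher.ts may work differently.
function msUntilParisMidnight(now: Date = new Date()): number {
  // Read the current wall-clock time in Paris via Intl.
  const parts = new Intl.DateTimeFormat("en-GB", {
    timeZone: "Europe/Paris",
    hour: "2-digit",
    minute: "2-digit",
    second: "2-digit",
    hourCycle: "h23",
  }).formatToParts(now);
  const get = (type: string): number =>
    Number(parts.find((p) => p.type === type)?.value ?? 0);
  const elapsedMs =
    ((get("hour") * 60 + get("minute")) * 60 + get("second")) * 1000;
  // Remaining time in the current Paris day.
  return 24 * 60 * 60 * 1000 - elapsedMs;
}

function scheduleDailyGeneration(run: () => Promise<void>): void {
  setTimeout(async () => {
    await run(); // generate + validate today's challenge
    scheduleDailyGeneration(run); // re-arm for the following midnight
  }, msUntilParisMidnight());
}
```

A one-shot `setTimeout` that re-arms itself avoids the drift a fixed 24-hour `setInterval` would accumulate across DST changes.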
### Key Details
| Step | Description |
|---|---|
| AI Prompt | GPT-4.1 receives a detailed prompt with format requirements, past challenge titles (to avoid duplicates), and rules for test case diversity |
| Test Runner Generation | A separate AI call generates a per-language test runner script with a {{USER_CODE}} placeholder |
| Validation | AI-generated solutions are executed inside the same Docker sandbox used for user submissions |
| Fix Loop | If a solution fails, the AI gets the error message and retries up to 2 times per language |
| Retry | If any language still fails, the entire challenge is discarded and regenerated (up to 3 total attempts) |
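The `{{USER_CODE}}` substitution can be illustrated with a small sketch. The runner template and the `buildRunner` helper below are made up for illustration — the real runners are AI-generated per challenge and stored in `languageTests`:

```typescript
// Sketch: splice user (or AI) code into a per-language test runner
// template. The template and helper name are illustrative.
const pythonRunnerTemplate = `
import json

{{USER_CODE}}

tests = json.loads(open("tests.json").read())
for t in tests:
    assert solve(*t["input"]) == t["expected"]
print("OK")
`;

function buildRunner(template: string, userCode: string): string {
  // split/join replaces every occurrence of the placeholder,
  // unlike String.replace with a plain string (first match only).
  return template.split("{{USER_CODE}}").join(userCode);
}

const runner = buildRunner(
  pythonRunnerTemplate,
  "def solve(x):\n    return x * 2"
);
```

The same template mechanism serves both validation (AI solutions) and gameplay (user submissions), which is what makes the validation step a faithful dry run of production execution.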
### What Gets Stored (per challenge)
| Field | Description |
|---|---|
| `title` | Challenge name |
| `description` | Markdown-formatted problem statement |
| `difficulty` | Easy / Medium / Hard |
| `testCases` | JSON array of `{input, expected}` objects |
| `stubs` | Per-language starter code shown in the editor |
| `solutions` | AI-validated solutions (viewable after solving/failing) |
| `languageTests` | Per-language test runner scripts |
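As a rough TypeScript shape, the fields above might look like the following. This mirrors the table, not the actual Drizzle schema in `backend/src/db/schema.ts`, which may differ (e.g. JSON columns stored as text):

```typescript
// Sketch of a stored challenge, derived from the field table above.
type Language = "python" | "javascript" | "typescript" | "rust";

interface TestCase {
  input: unknown;     // arguments passed to the solution
  expected: unknown;  // expected return value
}

interface Challenge {
  title: string;
  description: string;                     // Markdown problem statement
  difficulty: "Easy" | "Medium" | "Hard";
  testCases: TestCase[];
  stubs: Record<Language, string>;         // starter code per language
  solutions: Record<Language, string>;     // AI-validated solutions
  languageTests: Record<Language, string>; // test runner scripts
}

// Illustrative example value (not a real generated challenge):
const example: Challenge = {
  title: "Two Sum",
  description: "Find two numbers that add up to a target.",
  difficulty: "Easy",
  testCases: [{ input: [[1, 2, 3], 5], expected: [1, 2] }],
  stubs: { python: "", javascript: "", typescript: "", rust: "" },
  solutions: { python: "", javascript: "", typescript: "", rust: "" },
  languageTests: { python: "", javascript: "", typescript: "", rust: "" },
};
```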
## 🔒 Secure Docker Isolation
User-submitted code runs in ephemeral Docker containers with multiple layers of security:
```mermaid
flowchart LR
    subgraph Backend
        EXEC["Executor<br/>Service"]
    end
    subgraph Sandbox["Ephemeral Container"]
        CODE["User Code<br/>Execution"]
    end
    EXEC -->|"docker run --rm ..."| Sandbox
    style Sandbox fill:#7f1d1d,stroke:#ef4444,color:#fff
    style Backend fill:#1a1a2e,stroke:#06b6d4,color:#fff
```
### Security Flags
| Flag | Purpose |
|---|---|
| `--rm` | Container is automatically deleted after execution |
| `--network none` | No internet access — code cannot make outbound requests |
| `--cap-drop ALL` | All Linux capabilities are dropped (no `ptrace`, `chown`, `kill`, etc.) |
| `--memory 128m/256m` | Hard memory limit (128 MB for Python/JS, 256 MB for TS/Rust) |
| `--cpus 0.5/1.0` | CPU throttling (0.5 cores for Python/JS, 1.0 for TS/Rust) |
| Non-root user | All containers run as the leetdle user, not root |
| 20s timeout | The backend kills the container after 20 seconds if it hasn't exited |
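Put together, assembling the `docker run` invocation from the table can be sketched as follows. `buildDockerArgs`, the `LIMITS` map, and the `--user leetdle` flag form are assumptions about how the real `executor.ts` wires these options:

```typescript
// Sketch: build the docker run argument list from the security table above.
// Helper name and exact flag ordering are illustrative.
type Language = "python" | "javascript" | "typescript" | "rust";

const LIMITS: Record<Language, { memory: string; cpus: string }> = {
  python:     { memory: "128m", cpus: "0.5" },
  javascript: { memory: "128m", cpus: "0.5" },
  typescript: { memory: "256m", cpus: "1.0" },
  rust:       { memory: "256m", cpus: "1.0" },
};

function buildDockerArgs(lang: Language): string[] {
  return [
    "run",
    "--rm",                         // delete container after execution
    "--network", "none",            // no outbound network
    "--cap-drop", "ALL",            // drop all Linux capabilities
    "--memory", LIMITS[lang].memory,
    "--cpus", LIMITS[lang].cpus,
    "--user", "leetdle",            // non-root user baked into the image
    `leetdle-${lang}`,              // e.g. leetdle-python
  ];
}

// The backend would spawn this with child_process and enforce the
// 20-second timeout itself, e.g.:
//   const child = spawn("docker", buildDockerArgs("python"));
//   const timer = setTimeout(() => child.kill("SIGKILL"), 20_000);
```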
### Executor Images
Each language has its own minimal Alpine-based Docker image:
| Image | Base | Installed |
|---|---|---|
| `leetdle-python` | `alpine:latest` | `python3` |
| `leetdle-javascript` | `alpine:latest` | `nodejs` |
| `leetdle-typescript` | `alpine:latest` | `nodejs`, `npm`, `typescript`, `ts-node` |
| `leetdle-rust` | `rust:alpine` | `musl-dev`, pre-compiled `serde`/`serde_json` deps |
These images are pre-built by Docker Compose but never run as daemons — they only serve as base images for the ephemeral docker run containers spawned by the backend.
## 🌐 Frontend ↔ Backend Communication
### Server-Side (SSR)
The Next.js `page.tsx` fetches today's challenge server-side at render time:
```
page.tsx → GET http://backend:5000/api/challenge/daily → renders GameClient
```
This uses Docker's internal DNS (the `backend` hostname), since both containers are on the same `leetdle_web` bridge network.
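This server-side fetch can be sketched as below. The helper names and the `cache: "no-store"` option are assumptions — the actual `page.tsx` may be structured differently:

```typescript
// Sketch: resolve the backend base URL depending on where the code runs.
// On the server (inside the Docker network) the backend is reachable at
// http://backend:5000; in the browser, relative URLs go through the
// Next.js rewrite proxy instead.
function backendBaseUrl(): string {
  const onServer = typeof window === "undefined";
  return onServer ? "http://backend:5000" : "";
}

async function fetchDailyChallenge(): Promise<unknown> {
  const res = await fetch(`${backendBaseUrl()}/api/challenge/daily`, {
    cache: "no-store", // always render today's challenge fresh
  });
  if (!res.ok) throw new Error(`Backend responded ${res.status}`);
  return res.json();
}
```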
### Client-Side (Browser)
The `GameClient.tsx` component makes API calls from the browser via Next.js rewrites:
| Action | Method | Endpoint |
|---|---|---|
| Run code | `POST` | `/api/execute` |
| Save submission | `POST` | `/api/submissions` |
| Get challenge stats | `GET` | `/api/submissions/stats/:id` |
| Get solution | `GET` | `/api/challenge/solution/:id` |
| List all challenges | `GET` | `/api/challenge/all` |
The `next.config.js` rewrites `/api/*` to `http://backend:5000/api/*`, so the browser's requests are proxied through the frontend container to the backend.
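The rewrite rule presumably looks something like this sketch (the repo's `next.config.js` would be the plain-JS equivalent, and may carry additional options):

```typescript
// Sketch of the /api proxy rule described above.
const nextConfig = {
  async rewrites() {
    return [
      {
        source: "/api/:path*",
        destination: "http://backend:5000/api/:path*",
      },
    ];
  },
};
// In the repo this object would be exported from next.config.js.
```

Because the rewrite runs inside the frontend container, the browser never needs to reach the backend directly, and the backend port does not have to be exposed to the host beyond what Docker Compose configures.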
## 🚀 Getting Started
### Prerequisites
- Docker & Docker Compose
- An OpenAI API key (for challenge generation)
### Setup
```bash
# 1. Clone the repository
git clone <repo-url> && cd leetdle

# 2. Create a .env file
cp .env.example .env
# Edit .env and set your OPENAI_API_KEY

# 3. Build and start everything
docker compose up --build -d

# 4. Open in your browser
# http://localhost:3000
```
On first startup, the backend will automatically generate today's challenge using the AI pipeline. This takes ~1-2 minutes as it validates solutions across all 4 languages.
### Project Structure
```
leetdle/
├── frontend/                    # Next.js 14 app
│   ├── app/
│   │   ├── page.tsx             # SSR entry — fetches daily challenge
│   │   ├── GameClient.tsx       # Main game UI (Monaco editor, timer, modals)
│   │   ├── StatsButton.tsx      # Landing page stats component
│   │   └── challenges/          # Previous challenges page
│   └── Dockerfile
├── backend/                     # Hono + Node.js API
│   └── src/
│       ├── index.ts             # App entrypoint
│       ├── routes/              # API route handlers
│       ├── services/
│       │   ├── ai.ts            # OpenAI API integration
│       │   ├── executor.ts      # Docker container orchestration
│       │   ├── challenge.ts     # Challenge CRUD + validation pipeline
│       │   └── submission.ts    # Submission tracking & stats
│       ├── jobs/
│       │   └── fetcher.ts       # Midnight scheduler
│       └── db/
│           ├── schema.ts        # Drizzle ORM schema
│           └── index.ts         # Database connection
├── executor/                    # Language-specific Docker images
│   ├── python/Dockerfile
│   ├── javascript/Dockerfile
│   ├── typescript/Dockerfile
│   └── rust/Dockerfile
├── docker-compose.yml
└── .env.example
```