# Installation

## Prerequisites
| Dependency | Description |
|---|---|
| Python 3.10+ | Runtime |
| PostgreSQL 16+ | Data storage |
| Ollama | Local LLM inference (optional, can use cloud-only) |
## Clone & Install
```bash
git clone https://github.com/wangjiake/JKRiver.git
cd JKRiver
python3 -m venv .venv
source .venv/bin/activate   # macOS / Linux
# .venv\Scripts\activate    # Windows
pip install -r requirements.txt
```
## Set Up PostgreSQL
```bash
createdb -h localhost -U your_username Riverse
psql -h localhost -U your_username -d Riverse -f agent/schema.sql
```
> [!NOTE]
> Riverse and River Algorithm — AI Chat History Edition share the same database. Running the schema setup from either project creates all tables needed for both. If you have already run the other project's database setup, you can skip this step.
Verify that the tables were created; you should see `conversation_turns`, `user_profile`, `observations`, `fact_edges`, `memory_clusters`, and about fifteen others.
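One way to check, assuming the same connection parameters as above, is psql's `\dt` meta-command:

```bash
# List all tables in the Riverse database
psql -h localhost -U your_username -d Riverse -c '\dt'
```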
> [!TIP]
> If you need scheduled skills support (Telegram Job Queue), install the optional job-queue dependency.
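Assuming the bot is built on the python-telegram-bot library (which ships its JobQueue as an optional extra), the dependency can be installed with:

```bash
# JobQueue support is an optional extra of python-telegram-bot
pip install "python-telegram-bot[job-queue]"
```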
## Pull Ollama Models (optional)

If you are using a local LLM:
```bash
ollama pull <your-model>   # e.g. qwen2.5:14b, llama3, mistral
ollama pull bge-m3         # Embedding model (optional)
```
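To confirm the pulls succeeded, `ollama list` prints every model available locally along with its size:

```bash
# Show locally available models
ollama list
```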
## Run
Web dashboard (recommended): starts the FastAPI backend and the Flask frontend together.
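The combined launcher is the `start_local.py` script mentioned at the end of this section; the exact invocation below is an assumption based on the file name:

```bash
# Start backend and frontend together (invocation assumed from the script name)
python start_local.py
```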
Or start individually:

```bash
uvicorn agent.api:app --host 127.0.0.1 --port 8400   # FastAPI backend (required for web dashboard)
python web.py                                        # Flask frontend (http://localhost:1234)
python -m agent.main                                 # CLI
python -m agent.telegram_bot                         # Telegram Bot
python -m agent.discord_bot                          # Discord Bot
```
The web dashboard needs both services running; `start_local.py` handles this automatically.