OdooClaw: AI Assistant for Odoo ERP
Native Odoo Integration · AI Assistant · $10 Hardware · 10MB RAM
Fork Notice: This project is a fork of PicoClaw created by Sipeed. We have deeply modified and adapted it to integrate natively with Odoo ERP using asynchronous webhooks and a dedicated communication channel.
🌟 The PicoClaw Legacy: Why this base?
PicoClaw was originally created by Sipeed to solve a very specific problem: bringing advanced AI Agent capabilities to ultra-low-cost hardware. We chose it as the foundation for OdooClaw because of its incredible technical feats:
- Written in Go: Resulting in a single, fast, self-contained binary.
- Microscopic Footprint: It uses less than 10MB of RAM, which is 99% less memory than its NodeJS/TypeScript counterparts (like OpenClaw or AutoGPT).
- Instant Boot: Boots in under 1 second, even on single-core 0.6GHz hardware ($10 boards).
- True Portability: Runs seamlessly on x86, ARM, and RISC-V architectures.
By using this engine, OdooClaw inherits the ability to run directly inside any Odoo deployment (even on minimal cloud VPS instances) without cannibalizing the resources your ERP needs to serve users.
🦐 OdooClaw is an ultra-lightweight AI assistant written in Go. We added a native Odoo channel and a specialized odoo-mcp server, allowing the agent to interact with your Odoo instance through secure, granular tools (search/read/create/write and safe business actions) while replying directly in Odoo Discuss.
✨ Key Features
- 🪶 Ultra-Lightweight: Under 10MB of RAM footprint. It can run on the exact same server as Odoo without impacting performance!
- 🤝 Odoo Discuss Integration: Talk to the AI directly from your Odoo chat.
- 🔐 Native Permission Inheritance: Secure by default. The AI dynamically assumes Odoo user permissions, preventing any bypass of native Security Rights or Record Rules.
- 🧠 Intelligent ORM Bridge: High-precision tool execution. The `odoo-mcp` bridge provides modular tools with strict validation, denylist/allowlist controls, and safer mappings for real Odoo ORM operations.
- 🧠 Dual-Layer Memory (HOT + COLD): Keeps the current prompt memory behavior while adding scoped historical memory, temporal facts, timeline recall, retrieval explainability, and optional historical Markdown import.
- 🔁 RLM Acceleration (Context-Rot Resistant): For large Odoo datasets, OdooClaw decomposes analysis into recursive Map-Reduce steps (`rlm_partition` -> sub-agents -> `rlm_aggregate`) to keep context clean, improve accuracy, and reduce long-context cost.
- 📄 Smart OCR & Action Generation: Automatically scans PDF invoices, extracts data, and intelligently creates vendor bills or purchase orders.
- 💼 Workforce Ops Tools: Native tools for attendance, check-in/check-out, task-centric timesheets, daily summaries, missing-timesheet detection, and expense report lifecycle.
- 🧾 Accounting Ops Tools: Native tools for bank reconciliation workflows, AR/AP aging, period-close checks, journal entry creation/posting, tax summary, and duplicate bill risk checks.
- 🚗 OCR Expense Flows: Attachment-to-expense creation for employee receipts and mileage (`ocr-create-employee-expense`, `ocr-create-mileage-expense`) with dry-run support.
- 🎤 Voice Messages: Send and receive voice notes! Supports transcription (STT) and speech synthesis (TTS).
- ⚡ Asynchronous & Non-Blocking: Odoo ↔ OdooClaw communication relies on Webhooks ("Fire & Forget"), releasing Odoo workers instantly.
- 🧠 Segregated Context: AI memory is independent per channel/user. It doesn't mix private information.
- 🤖 Integrated MCP Server: Uses the industry-standard Model Context Protocol (MCP) via embedded Python servers, providing `odoo-mcp` (granular Odoo tools with permission-aware execution), `ocr-invoice` (invoice/PO parsing), `whisper-stt` (voice transcription), and `edge-tts` (text-to-speech).
- 🧷 Reliable Odoo Chat Identity Context: Odoo Discuss sender context now consistently propagates to `odoo-mcp` calls (including the `odoo-mcpserver` alias), ensuring the correct `sender_id`/company scope in tool execution.
- 🔒 Private Odoo Reply Routing: Group mentions can be safely handled with private 1:1 reply targets and user-scoped sessions, preventing cross-user context leakage in shared channels.
- 🧩 Gemma4 Tool-Calling Compatibility: Supports Gemma4/OpenAI-compatible endpoints that emit pseudo tool-call content (`<|tool_call>call:...{...}`), including normalization of tool names, nested argument parsing, and automatic conversion to executable tool calls.
- 🛡️ Secure by Design: Pre-configured personality (`AGENTS.md`) designed to query, ask for confirmation, and never perform critical modifications without explicit permission.
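As an illustration of the pseudo tool-call normalization mentioned above, here is a minimal parser sketch; the marker format is taken from this README, but the regex and normalization rules are assumptions rather than OdooClaw's actual implementation:

```python
import json
import re

# Hypothetical parser for pseudo tool-call content such as:
#   <|tool_call>call:odoo-search{"model": "res.partner"}
# The exact marker syntax and normalization in OdooClaw may differ;
# this only illustrates the normalize-then-parse idea.
PSEUDO_CALL_RE = re.compile(r"<\|tool_call>call:([\w.\-]+)\s*(\{.*\})", re.DOTALL)

def extract_tool_call(content: str):
    """Return (tool_name, args_dict) or None if no pseudo call is present."""
    match = PSEUDO_CALL_RE.search(content)
    if not match:
        return None
    name = match.group(1).replace("-", "_")  # normalize the tool name
    try:
        args = json.loads(match.group(2))    # parse nested JSON arguments
    except json.JSONDecodeError:
        return None
    return name, args

call = extract_tool_call('<|tool_call>call:odoo-search{"model": "res.partner"}')
```

The parsed result can then be dispatched as a regular executable tool call instead of being echoed back to the user as raw text.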
🚀 Integration Architecture
The integration consists of two parts:
- The OdooClaw container: Acts as the AI Gateway.
- The Odoo module (`mail_bot_odooclaw`): Intercepts messages in Odoo and sends them to OdooClaw.
The Communication Flow (Via Webhook)
- User writes to OdooClaw: In Odoo, a user sends a Direct Message (default mode) or, if enabled, mentions `@OdooClaw` in a channel. The module overrides `message_post` to detect this intent.
- Odoo sends an Asynchronous Webhook: Instead of blocking while waiting for the AI, Odoo sends an HTTP POST JSON payload in the background to the OdooClaw API (`http://odooclaw:18790/webhook/odoo`).
- OdooClaw processes it: The agent evaluates the intent and contacts the LLM provider (OpenAI, Anthropic, vLLM, etc.). The LLM invokes `odoo-mcp` tools from our internal MCP server (Python), executing permission-aware Odoo operations (search, read, create, write, safe actions) for the requesting user's context.
- OdooClaw replies to Odoo: Once the response is ready, OdooClaw makes an HTTP POST back to the Odoo endpoint (`/odooclaw/reply`), which injects the message into Discuss, impersonating the bot.
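To make the flow concrete, here is a sketch of what such a webhook payload might look like. The field names are illustrative assumptions; check the `mail_bot_odooclaw` source for the real schema:

```python
import json

# Illustrative shape of the JSON body Odoo could POST to
# http://odooclaw:18790/webhook/odoo (field names are assumptions).
payload = {
    "channel_id": 42,
    "sender_id": 7,  # Odoo user driving the permission context
    "message": "How many open sales orders do we have?",
    "reply_endpoint": "/odooclaw/reply",
}
body = json.dumps(payload).encode("utf-8")
# In Odoo, this POST would happen in the background ("fire & forget"),
# so the HTTP worker is released immediately instead of blocking on the LLM.
```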
🎤 Voice Messages (STT & TTS)
OdooClaw supports voice notes in both directions:
Receiving Voice Messages (Speech-to-Text)
When a user sends a voice note in Odoo Discuss:
- The webhook automatically detects the voice attachment
- OdooClaw uses the `whisper-stt` skill to transcribe the audio
- The LLM processes the transcribed text and responds
Transcription Methods:
- Faster Whisper (local): No API key needed, runs on CPU
- Whisper API (OpenAI): More accurate, requires `OPENAI_API_KEY`
Sending Voice Responses (Text-to-Speech)
When the user asks for voice output (e.g., "read this aloud", "voice response"):
- OdooClaw uses the `edge-tts` skill to generate audio
- Audio is uploaded to Odoo as an attachment
- Voice metadata is created for proper playback in Discuss
- Bot responds with a playable voice note
Available Voices:
- Spanish: `es-ES-ElenaNeural`, `es-MX-DaliaNeural`, `es-AR-TomasNeural`
- English: `en-US-JennyNeural`, `en-US-GuyNeural`, `en-GB-SoniaNeural`
- And many more (French, German, Italian, Portuguese, Chinese, Japanese)
Environment Variables for Voice
```yaml
# For STT (Speech-to-Text)
- OPENAI_API_KEY=${OPENAI_API_KEY} # Optional, for Whisper API fallback
# For TTS (Text-to-Speech) - No additional config needed
# Edge TTS is free and included by default
```
See Voice Features Documentation for detailed configuration.
📦 Odoo Module (mail_bot_odooclaw)
The native module is located at `odoo/custom/src/{version}/mail_bot_odooclaw/`.
Supported Odoo Versions
| Version | Module Path | Channel Model |
|---|---|---|
| Odoo 18 | `odoo/custom/src/18.0/mail_bot_odooclaw/` | `discuss.channel` |
| Odoo 17 | `odoo/custom/src/17.0/mail_bot_odooclaw/` | `mail.channel` |
| Odoo 16 | `odoo/custom/src/16.0/mail_bot_odooclaw/` | `mail.channel` |
Note: Odoo 18 renamed `mail.channel` to `discuss.channel` and changed the member relationship structure. Each version's module handles these differences automatically.
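A tiny helper illustrating the version difference from the table above (hypothetical; each module version handles this internally):

```python
# Odoo 18 renamed mail.channel to discuss.channel; earlier supported
# versions (16, 17) still use mail.channel. Illustrative helper only.
def channel_model(odoo_version: float) -> str:
    return "discuss.channel" if odoo_version >= 18.0 else "mail.channel"
```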
Installation in Odoo
- Spin up your Odoo environment (for instance, using Doodba).
- Copy the `mail_bot_odooclaw` folder to your `addons` directory.
- Enable Developer Mode in Odoo (Settings -> Activate the developer mode).
- Go to Apps and click "Update Apps List".
- Search for `OdooClaw` and install the OdooClaw AI Bot module.
- Additional Configuration: Go to Settings > Technical > System Parameters and verify/create the key `odooclaw.webhook_url` with the value `http://odooclaw:18790/webhook/odoo`.
🐳 Deployment with Doodba (Docker Compose)
You can easily integrate OdooClaw into your Doodba stack. Here is an example of how to set up your docker-compose.yml (or prod.yaml / devel.yaml):
```yaml
version: "2.4"

services:
  odoo:
    # Your normal Odoo Doodba configuration...
    depends_on:
      - db
    networks:
      default:

  odooclaw:
    build:
      context: ./odooclaw            # Path to OdooClaw source code
      dockerfile: docker/Dockerfile  # Required for Doodba integration
    restart: unless-stopped
    environment:
      # Credentials for Odoo XML-RPC connection
      - ODOO_URL=http://odoo:8069
      - ODOO_DB=${POSTGRES_DB:-devel}
      - ODOO_USERNAME=${ODOO_USERNAME:-admin}
      - ODOO_PASSWORD=${ODOO_PASSWORD:-admin} # IMPORTANT: Use an Odoo API Key in PROD
      # LLM Configuration
      - ODOOCLAW_AGENTS_DEFAULTS_PROVIDER=openai
      - ODOOCLAW_AGENTS_DEFAULTS_MODEL=gpt-4o
      - ODOOCLAW_PROVIDERS_OPENAI_API_KEY=${OPENAI_API_KEY}
      - ODOOCLAW_PROVIDERS_OPENAI_API_BASE=${OPENAI_API_BASE:-https://api.openai.com/v1}
      # Odoo Channel Configuration (Gateway)
      - ODOOCLAW_CHANNELS_ODOO_ENABLED=true
      - ODOOCLAW_CHANNELS_ODOO_WEBHOOK_HOST=0.0.0.0
      - ODOOCLAW_CHANNELS_ODOO_WEBHOOK_PORT=18790
      - ODOOCLAW_CHANNELS_ODOO_WEBHOOK_PATH=/webhook/odoo
      - ODOOCLAW_CHANNELS_ODOO_ALLOW_GROUP_MENTIONS=false # Recommended default: DM-only
    volumes:
      # Persistent volume for memory, configs, and OdooClaw local DB
      - odooclaw_data:/home/odooclaw/.odooclaw
    depends_on:
      - odoo
    networks:
      - default

volumes:
  odooclaw_data:
```
Credentials Management (.env)
It is imperative to use environment variables (e.g., in .docker/odoo.env) to inject your keys securely:
```shell
OPENAI_API_KEY="sk-your-api-key"

# Optional, if using LMStudio, vLLM or other OpenAI-compatible APIs:
# OPENAI_API_BASE="http://your-local-llm:1234/v1"

# In production, use an Odoo API Key, not the admin password:
ODOO_PASSWORD="your-odoo-api-key"
```
Doodba 18 Dev/Test (Practical Local Flow)
If your local Doodba project is in a path like /Users/nramos/DEV/doodba-18, this is the recommended open-source friendly flow:
- Keep OdooClaw source in your Doodba workspace so Compose can build it.
- Add the `odooclaw` service to `devel.yaml` (or `prod.yaml`) with the internal URL `ODOO_URL=http://odoo:8069`.
- Store secrets in `.docker/odoo.env` (never commit API keys).
- Set the Odoo system parameter `odooclaw.webhook_url` to `http://odooclaw:18790/webhook/odoo`.
- Rebuild only changed services:
```shell
docker compose build odoo odooclaw
docker compose up -d odoo odooclaw
docker compose logs -f odooclaw
```
For complete Doodba setup guides:
- English: `odooclaw/docs/GUIDE_DOODBA_SETUP_EN.md`
- Spanish: `odooclaw/docs/GUIA_DOODBA_PUESTA_EN_MARCHA_ES.md`
Odoo Privacy Modes (Recommended)
- DM-only (default and recommended): `ODOOCLAW_CHANNELS_ODOO_ALLOW_GROUP_MENTIONS=false`
  - Group mentions are ignored; only direct messages trigger the assistant.
- Group mentions enabled (advanced mode): `ODOOCLAW_CHANNELS_ODOO_ALLOW_GROUP_MENTIONS=true`
  - Group mentions are accepted.
  - The Odoo module provides private reply targets so responses can still be posted in a user↔bot private chat.
  - Session scope is isolated per requesting user for those interactions.
Browser Copilot in Doodba (Phase 1 MVP)
To enable the new browser-copilot module in the same dev/test stack:
- Start the backend from the project root:

```shell
docker compose -f "odooclaw/browser_copilot/docker-compose.browser-copilot.yml" up --build
```
- Configure the extension popup:
  - Backend URL: `http://127.0.0.1:8765`
  - Token: same value as `BROWSER_COPILOT_TOKEN`
- Browser support is currently documented for:
  - Chrome/Chromium (load unpacked)
  - Firefox (load temporary add-on)
  - See `browser_extension/README.md` for browser-specific install steps.
- Keep secure defaults in phase 1:
  - `BROWSER_COPILOT_READ_ONLY=true`
  - allowlisted domains only
  - explicit user confirmation before action execution
- Validate end-to-end:

```shell
./odooclaw/browser_copilot/scripts/smoke_test.sh
```
See the full backend and extension documentation:
- `odooclaw/browser_copilot/README.md`
- `browser_extension/README.md`
- `odooclaw/docs/BROWSER_COPILOT_DOODBA_SETUP.md`
- `odooclaw/docs/BROWSER_EXTENSION_DISTRIBUTION.md`
- `odooclaw/docs/DOODBA_MINIMAL_STACK_EXAMPLE.md`
Copy/paste-ready baseline files for Doodba are available in:
- `examples/doodba/prod.odooclaw-browser-copilot.redis.yaml`
- `examples/doodba/odoo-env-odooclaw-browser-copilot.example`
- `examples/doodba/config.odooclaw.minimal.example.json`
3. Configuration Files
To facilitate its use in different environments (Docker/Doodba or local binaries), OdooClaw offers two ways to configure it:
- `odooclaw/.env.example` (Recommended for Doodba / Docker Compose):
  - Shows how to inject settings directly via environment variables (e.g. `OPENAI_API_KEY`).
  - In a Doodba environment, simply copy the contents of `.env.example` into your `.docker/odoo.env` file or your main server's environment file.
  - It is the safest approach to keep passwords (like the Odoo API Key and your LLM provider key) secure and portable.
- `odooclaw/config/config.example.json` (Local Deployments / Binaries):
  - The structured template with the complete OdooClaw configuration.
  - Defines providers, sandbox rules, chat channels (Discord, Telegram, Odoo), web search, and scheduled tasks (`cron`).
  - When you run OdooClaw without Docker, it reads from `~/.odooclaw/config.json` by default. Copy this example file to that path and edit it with your keys.
  - Note: Docker environment variables always take precedence over the `config.json` file.
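The precedence rule can be illustrated with a small sketch. The environment variable name follows the Docker examples in this README; the merge logic itself is an assumption about how such an override typically works:

```python
import json
import os

# Sketch of the precedence rule: environment variables override values
# from ~/.odooclaw/config.json. Illustrative only.
def effective_model(config: dict) -> str:
    env_value = os.environ.get("ODOOCLAW_AGENTS_DEFAULTS_MODEL")
    if env_value:
        return env_value  # Docker env var wins
    return config.get("agents", {}).get("defaults", {}).get("model", "gpt-4o")

config = json.loads('{"agents": {"defaults": {"model": "llama3.1"}}}')
```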
💻 Usage Modes
1. Server/Gateway (Recommended)
The container starts by default in gateway mode (`odooclaw gateway`). It listens on port 18790 waiting for webhooks from the Odoo chat.
2. CLI "One-Shot" Mode (Quick Testing)
Since you are running OdooClaw as a container within a docker-compose environment (like Doodba), you can execute queries directly in the terminal by attaching to the running container and using the agent mode:
```shell
# Test the Odoo skill from the terminal
docker compose exec odooclaw odooclaw agent -m "Tell me what Odoo version is running and verify the connection"

# Enter interactive terminal mode
docker compose exec odooclaw odooclaw agent
```
⚙️ Configuration Deep Dive
While the .env.example provides a quick way to configure OdooClaw for Docker, the core engine relies on a rich configuration system inherited and adapted from PicoClaw.
Workspace Layout
OdooClaw stores its data in the configured workspace (default inside Docker: /home/odooclaw/.odooclaw/workspace):
```
.odooclaw/workspace/
├── sessions/     # Conversation sessions and history for Odoo users
├── memory/       # Long-term vector memory
├── state/        # Persistent state (last channel, etc.)
├── skills/       # Custom skills (like odoo-mcp)
├── AGENTS.md     # AI personality and strict Odoo directives
├── HEARTBEAT.md  # Periodic task prompts (checked every 30 min)
├── IDENTITY.md   # Agent identity (Odoo Assistant)
├── SOUL.md       # Agent soul and values
└── USER.md       # User preferences and expectations
```
Heartbeat (Periodic Tasks)
OdooClaw can perform periodic tasks automatically in the background without user intervention. Simply edit the HEARTBEAT.md file in your workspace:
```markdown
# Periodic Tasks
- Query Odoo for unconfirmed Sales Orders older than 3 days and summarize them.
- Check the Odoo logs or system parameters to ensure the webhook is correctly set.
```
The agent will read this file every 30 minutes (configurable via the `ODOOCLAW_HEARTBEAT_INTERVAL` env var) and execute any tasks using the Odoo skill, silently acting as a background supervisor for your ERP.
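A minimal sketch of that loop's task-parsing step, using the example `HEARTBEAT.md` above (illustrative only; the real agent feeds each task to the LLM on the configured interval):

```python
# Parse task lines from a HEARTBEAT.md-style file. The file layout
# follows this README; the parsing details are an illustration, not
# OdooClaw's actual implementation.
HEARTBEAT_MD = """# Periodic Tasks
- Query Odoo for unconfirmed Sales Orders older than 3 days and summarize them.
- Check the Odoo logs or system parameters to ensure the webhook is correctly set.
"""

def parse_tasks(markdown: str) -> list[str]:
    return [line[2:].strip() for line in markdown.splitlines()
            if line.startswith("- ")]

tasks = parse_tasks(HEARTBEAT_MD)  # one entry per "- " bullet
```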
🔒 Security Sandbox
Because OdooClaw can execute terminal commands and write files, it runs in a sandboxed environment by default to ensure it doesn't accidentally mess with your host system files.
- Protected Tools: Tools like `read_file`, `write_file`, and `list_dir` are restricted to the workspace folder.
- Exec Protection: Even if you disable the sandbox, the `exec` tool proactively blocks dangerous patterns like `rm -rf`, formatting commands, system shutdown commands, or fork bombs.
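In the same spirit, a toy denylist check might look like this (the example patterns come from this README; OdooClaw's real rule set is more extensive):

```python
import re

# Illustrative denylist in the spirit of the exec protection described
# above. Not OdooClaw's actual pattern list.
DANGEROUS_PATTERNS = [
    r"\brm\s+-rf\b",         # recursive force delete
    r"\bmkfs(\.\w+)?\b",     # formatting commands
    r"\bshutdown\b",         # system shutdown
    r":\(\)\s*\{.*\};\s*:",  # classic shell fork bomb
]

def is_blocked(command: str) -> bool:
    """Return True if the command matches any dangerous pattern."""
    return any(re.search(pattern, command) for pattern in DANGEROUS_PATTERNS)
```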
Providers & Model Configuration
OdooClaw uses a model-centric configuration approach (`model_list` in `config.json`). You simply specify the `vendor/model` format to add new providers; no code changes are required.
This allows incredible flexibility for your ERP, such as using lightweight local models for easy queries to save costs, and falling back to massive models for complex data analysis.
All Supported Vendors Prefix:
`openai/`, `anthropic/`, `zhipu/`, `deepseek/`, `gemini/`, `groq/`, `moonshot/`, `qwen/`, `nvidia/`, `ollama/` (Local), `openrouter/`, `vllm/` (Local).
Example: Local Ollama Model
If you want to use a 100% free and local model hosted on your server alongside Odoo, you can easily point OdooClaw to it:
```json
{
  "model_list": [
    {
      "model_name": "llama3.1",
      "model": "ollama/llama3.1",
      "api_base": "http://host.docker.internal:11434/v1"
    }
  ],
  "agents": {
    "defaults": {
      "model": "llama3.1"
    }
  }
}
```
Load Balancing
If you manage a huge Odoo instance with hundreds of users querying the AI, you can configure multiple API keys/endpoints for the same model name, and OdooClaw will automatically round-robin between them to prevent rate-limiting!
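A round-robin rotation over equivalent endpoints can be sketched in a few lines (hypothetical data structure; OdooClaw's internal scheduling may differ):

```python
import itertools

# Two endpoints registered for the same model name; requests rotate
# between them to spread load and avoid rate limits. Illustrative only.
endpoints = [
    {"api_base": "https://api.openai.com/v1", "api_key": "sk-key-a"},
    {"api_base": "https://api.openai.com/v1", "api_key": "sk-key-b"},
]
rotation = itertools.cycle(endpoints)

def next_endpoint() -> dict:
    """Return the next endpoint in round-robin order."""
    return next(rotation)
```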
🛠️ MCP Server and Skills
One of the most advanced features of OdooClaw is its use of the Model Context Protocol (MCP). We include MCP servers that expose vital tools to the AI:
Core Skills
| Skill | Description |
|---|---|
| `odoo-mcp` | Modular Odoo tools (`odoo_search`, `odoo_read`, `odoo_create`, `odoo_write`, safe actions) with strict permission context and denylist/allowlist security |
| `ocr-invoice` | Parse and extract structured data from PDF/image documents |
| `rlm-utils` | Partition and aggregate large datasets for recursive long-context analysis |
Voice Skills
| Skill | Description |
|---|---|
| `whisper-stt` | Transcribe voice messages (Faster Whisper local + Whisper API fallback) |
| `edge-tts` | Generate voice responses using Microsoft Edge TTS |
By relying on the MCP standard, these servers run isolated and dynamically inject their capabilities into the LLM on every interaction.
Why RLM in OdooClaw?
RLM (Recursive Language Models) is used as a practical inference strategy for ERP workloads where a single prompt can include hundreds of records or large attachments. Instead of pushing everything into one giant context, OdooClaw applies context-centric decomposition:
- Decompose: Fetch data and split it into chunks with `rlm_partition`.
- Map: Process each chunk in parallel with sub-agents (`spawn`/`subagent`).
- Reduce: Merge outputs using `rlm_aggregate` and produce a final answer.
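The decompose/map/reduce loop above can be sketched with a toy aggregation. Only the tool names come from this README; the function signatures and the plain-sum "analysis" are invented for illustration:

```python
# Toy sketch of decompose -> map -> reduce over a large record set.
def rlm_partition(records: list, chunk_size: int) -> list[list]:
    """Split records into fixed-size chunks."""
    return [records[i:i + chunk_size] for i in range(0, len(records), chunk_size)]

def map_chunk(chunk: list) -> int:
    # Stand-in for a sub-agent analyzing one chunk in isolation.
    return sum(chunk)

def rlm_aggregate(partials: list) -> int:
    # Stand-in for merging sub-agent outputs into the final answer.
    return sum(partials)

records = list(range(1, 501))  # e.g. 500 invoice amounts
answer = rlm_aggregate([map_chunk(c) for c in rlm_partition(records, 100)])
```

Each map step sees only its own chunk, which is what keeps the per-call context small and resistant to context rot.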
Benefits in production:
- Better robustness against context rot on long conversations.
- Lower token pressure and more predictable latency/cost.
- Higher precision for analytical tasks (invoices, journals, stock moves, large order lists).
Recommended chunk sizing (starting point):
| Workload | Typical records | Suggested `chunk_size` | Why |
|---|---|---|---|
| Invoice/PO quick checks | 50-300 | 20-40 | Fast map phase with low overhead |
| Accounting analysis | 300-2,000 | 50-100 | Good cost/latency balance |
| Very large audits | 2,000+ | 100-200 | Fewer sub-calls while preserving context hygiene |
Reproducible benchmark: single-pass vs RLM
Use `odooclaw/scripts/benchmark_rlm.py` to compare:
- Latency (`mean_latency_s`)
- Cost proxy (`mean_total_tokens`, `mean_cost_usd`)
- Quality (`exact_match_rate`, `mean_abs_error`)
Example:
```shell
python3 odooclaw/scripts/benchmark_rlm.py \
  --api-base "https://api.openai.com/v1" \
  --api-key "$OPENAI_API_KEY" \
  --model "gpt-4o-mini" \
  --sizes 100 500 2000 \
  --repeats 3 \
  --chunk-size 100 \
  --input-cost-per-1m 0.15 \
  --output-cost-per-1m 0.60
```
The script prints a JSON summary per mode/size so you can track whether RLM improves robustness as context grows.
🧠 Behavior Configuration (Workspace)
OdooClaw extracts its personality and rules from the `workspace/` folder. The files have been adjusted to suit an ERP environment:
- `AGENTS.md` (Strict Directives): Instructed to NEVER delete or critically modify an Odoo record without first showing a summary and demanding an explicit "Yes" from the user.
- `USER.md` (User Profile): Assumes it is talking to employees/operators of an ERP. Formats its results in clean Markdown and gets straight to the point.
- `SOUL.md` (Alignment): Has a cautious personality; prefers to admit it can't find a piece of data rather than making it up (zero hallucinations).
If you need to "reset" the brain or wipe a user's vector memory, simply delete or purge the odooclaw_data volume.
📚 Additional Documentation
Deeper configuration (alternative providers like Anthropic, Ollama, etc., troubleshooting, and advanced setups) can be found in the /odooclaw/docs/ directory:
- Main Documentation
- General Configuration (JSON)
- Voice Features (STT/TTS)
- SQLite + Historical Memory
- Odoo Chat Memory QA Guide
- Changelog
- General Troubleshooting
- Antigravity Auth and Usage
Furthermore, OdooClaw retains the ability to integrate with Telegram, Discord, WhatsApp, and WeCom. Check the documentation in docs/channels/ to enable them alongside Odoo.
🛠️ Architecture and Technical Documentation
OdooClaw shares the ultra-lightweight architectural principles of its predecessor PicoClaw, but extends them significantly for the ERP ecosystem:
- Core Engine: Written in Go (1.21+), compiling to a single standalone binary.
- Event Bus: An internal `bus` package decouples the Odoo webhooks from LLM execution, allowing true asynchronous background processing.
- Routing & Memory: Channels route conversations seamlessly. Each user/thread gets isolated context to avoid data contamination between different Odoo records. Memory uses a HOT operational layer plus a scoped COLD historical layer with temporal facts and explainability tools.
- Skills Framework (MCP): Native support for the Model Context Protocol, allowing you to plug any external Python/Node script securely.
For an in-depth look at the architecture, please refer to the Design Documentation.
⚖️ License and Credits
This project is distributed under the MIT license.
- OdooClaw and its Odoo native integration have been developed by Nicolás Ramos and the OdooClaw contributors.
- It is a deeply adapted fork of PicoClaw by Sipeed.
- In turn, PicoClaw is heavily inspired by nanobot by HKUDS.
Forking and Attribution
We strongly encourage the open-source community to fork, modify, and improve OdooClaw! If you fork this project or use its core components in your own work, we kindly request that you:
- Maintain the attribution to the original creators (Nicolás Ramos / OdooClaw, Sipeed, and HKUDS).
- Keep the `LICENSE` file intact.
- Include a visible "Fork Notice" in your project's `README.md` pointing back to this repository, similar to the one at the top of this document.