Asoba Zorora Documentation

Configuration Guide

Configure Zorora models, endpoints, and API keys.

Configuration Methods

Zorora supports three configuration methods:

  1. Web UI Settings Modal (Recommended) - Visual configuration interface
  2. Interactive Terminal - use the /models command in the REPL
  3. Manual Configuration - Edit config.py directly

Web UI Configuration

The easiest way to configure Zorora is through the Web UI settings modal:

Step 1: Start Web UI

python web_main.py
# Or if installed via pip:
zorora web

Step 2: Open Settings Modal

  1. Open http://localhost:5000 in your browser
  2. Click the ⚙️ gear icon in the top-right corner
  3. Settings modal opens

Step 3: Configure Models

Model Selection:

Endpoint Selection:

Step 4: Configure API Keys

API Key Management:

Step 5: Add/Edit Endpoints

Add New Endpoint:

  1. Click “Add New Endpoint”
  2. Select provider (HuggingFace, OpenAI, Anthropic)
  3. Fill in endpoint details:
    • HuggingFace: URL + Model name
    • OpenAI: Model name + Max tokens
    • Anthropic: Model name + Max tokens
  4. Click “Save”

Edit Endpoint:

  1. Click edit icon next to endpoint
  2. Modify settings
  3. Click “Save”

Delete Endpoint:

  1. Click delete icon next to endpoint
  2. Confirm deletion
  3. System automatically reassigns roles if needed

Step 6: Save Configuration

  1. Click “Save” button
  2. Changes take effect after server restart
  3. Config file backup created automatically before write

Terminal Configuration

Interactive Model Selector

Use the /models command in the REPL:

zorora
[1] ⚙ > /models

Follow the interactive prompts to configure models and endpoints.

Show Current Configuration

[2] ⚙ > /config

Displays current routing configuration and model assignments.

Manual Configuration

Step 1: Copy Example Config

cd zorora
cp config.example.py config.py

Step 2: Edit config.py

LM Studio Configuration:

# Local LM Studio endpoint (default)
MODEL_ENDPOINTS = {
    "orchestrator": "local",
    "codestral": "local",
    "reasoning": "local",
}

HuggingFace Configuration:

# HuggingFace token
HF_TOKEN = "your-huggingface-token"

# HuggingFace endpoints
HF_ENDPOINTS = {
    "codestral-hf": {
        "url": "https://api-inference.huggingface.co/models/Qwen/Qwen2.5-Coder-32B-Instruct",
        "model_name": "Qwen/Qwen2.5-Coder-32B-Instruct",
        "timeout": 120,
    }
}

MODEL_ENDPOINTS = {
    "codestral": "codestral-hf",
}
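
Under the hood, a HuggingFace Inference API call is an HTTP POST with a bearer token. A minimal sketch of how such a request could be assembled from an endpoint entry like the one above (the function and its return shape are illustrative, not Zorora's internals):

```python
def build_hf_request(endpoint: dict, token: str, prompt: str) -> dict:
    """Assemble the pieces of a HuggingFace Inference API request."""
    return {
        "url": endpoint["url"],
        "headers": {"Authorization": f"Bearer {token}"},
        "json": {"inputs": prompt, "parameters": {"max_new_tokens": 512}},
        "timeout": endpoint.get("timeout", 120),
    }

# The request itself could then be sent with, for example:
# requests.post(**build_hf_request(HF_ENDPOINTS["codestral-hf"], HF_TOKEN, prompt))
```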

OpenAI Configuration:

# OpenAI API key
OPENAI_API_KEY = "your-openai-api-key"

# OpenAI endpoints
OPENAI_ENDPOINTS = {
    "gpt-4": {
        "model": "gpt-4",
        "max_tokens": 4096,
        "timeout": 60,
    }
}

MODEL_ENDPOINTS = {
    "orchestrator": "gpt-4",
}

Anthropic Configuration:

# Anthropic API key
ANTHROPIC_API_KEY = "your-anthropic-api-key"

# Anthropic endpoints
ANTHROPIC_ENDPOINTS = {
    "claude-opus": {
        "model": "claude-3-opus-20240229",
        "max_tokens": 4096,
        "timeout": 60,
    }
}

MODEL_ENDPOINTS = {
    "reasoning": "claude-opus",
}
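
Each role in MODEL_ENDPOINTS names either "local" or a key in one of the provider endpoint tables. A sketch of how that lookup could work, to clarify the routing idea (this is an illustration, not Zorora's actual resolver):

```python
def resolve_role(role, model_endpoints, hf=None, openai=None, anthropic=None):
    """Map a role ("orchestrator", "codestral", "reasoning") to endpoint settings."""
    name = model_endpoints[role]
    if name == "local":
        return {"provider": "local"}  # LM Studio default
    # Search each provider table for the named endpoint
    for provider, table in (("huggingface", hf or {}),
                            ("openai", openai or {}),
                            ("anthropic", anthropic or {})):
        if name in table:
            return {"provider": provider, **table[name]}
    raise KeyError(f"Unknown endpoint: {name}")
```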

Brave Search Configuration:

BRAVE_SEARCH = {
    "api_key": "your-brave-api-key",
    "enabled": True,
}
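
The Brave Search API authenticates with an X-Subscription-Token header. A sketch of how a query might be built from this config block (illustrative only; Zorora's actual search code is not shown here):

```python
def build_brave_request(config: dict, query: str):
    """Assemble a Brave web-search request, or return None if search is disabled."""
    if not config.get("enabled"):
        return None
    return {
        "url": "https://api.search.brave.com/res/v1/web/search",
        "headers": {"X-Subscription-Token": config["api_key"],
                    "Accept": "application/json"},
        "params": {"q": query},
    }
```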

Step 3: Environment Variables (Alternative)

Instead of editing config.py, you can use environment variables:

export HF_TOKEN="your-huggingface-token"
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
export BRAVE_SEARCH_API_KEY="your-brave-api-key"
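
A common pattern for supporting both styles (whether Zorora's config.py does exactly this internally is not shown here) is to fall back to the environment when no literal value is set:

```python
import os

def load_keys(env=os.environ):
    """Read API keys from the environment, with empty-string fallbacks."""
    return {name: env.get(name, "") for name in
            ("HF_TOKEN", "OPENAI_API_KEY",
             "ANTHROPIC_API_KEY", "BRAVE_SEARCH_API_KEY")}
```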

Configuration Options

Model Endpoints

Local (LM Studio):

HuggingFace:

OpenAI:

Anthropic:

API Keys

HuggingFace Token:

OpenAI API Key:

Anthropic API Key:

Brave Search API Key:

Best Practices

Model Selection

Endpoint Management

Security

Troubleshooting

Configuration Not Saving

Problem: Changes not persisting

Solution:

  1. Restart the server after saving - changes only take effect after a restart
  2. Check that config.py exists and is writable

Endpoint Not Working

Problem: Endpoint connection fails

Solution:

  1. Verify the endpoint URL and model name in the settings modal
  2. Confirm the matching API key (HF_TOKEN, OPENAI_API_KEY, or ANTHROPIC_API_KEY) is set

Model Not Found

Problem: Model not available

Solution:

  1. Check that the model name matches the provider's published model identifier
  2. Run /config to review the current model assignments

Next Steps

After configuration:

  1. Terminal REPL - Learn the command-line interface
  2. Web UI - Use the browser-based interface
  3. Research Workflow - Run your first research query

See Also