Droid

Droid is Friday Dev's flexible execution backend that supports multiple AI model providers.

Overview

Droid provides:

  • Multi-model support - Connect any compatible model
  • Unified interface - Same tools across all models
  • Flexible autonomy - Configure permissions per model
  • Custom profiles - Create your own agent configurations

When to Use Droid

Ideal For

  • ✅ Custom model configurations
  • ✅ Self-hosted models
  • ✅ Experimental models
  • ✅ Specific autonomy needs
  • ✅ Cost optimization

Less Ideal For

  • ⚠️ Quick start (use preset agents)
  • ⚠️ Beginners (more configuration)

How Droid Works

┌──────────────────────────────────────────────────────┐
│                      Friday Dev                      │
├──────────────────────────────────────────────────────┤
│                                                      │
│  Task → Droid Executor → AI Model → Tools → Output   │
│                             │          │             │
│                             ▼          ▼             │
│                      ┌─────────────┐ ┌─────────────┐ │
│                      │  Model API  │ │ File System │ │
│                      │    (Any)    │ │ Git         │ │
│                      │  - Gemini   │ │ Terminal    │ │
│                      │  - OpenAI   │ │ MCP         │ │
│                      │  - GLM      │ │             │ │
│                      │  - Local    │ │             │ │
│                      └─────────────┘ └─────────────┘ │
│                                                      │
└──────────────────────────────────────────────────────┘

Setup

Default Profiles

Droid comes with pre-configured profiles:

{
  "GEMINI_2_5_PRO": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "gemini-2.5-pro-preview-06-05"
    }
  },
  "GLM_4_7": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "glm-4.7"
    }
  }
}

Custom Profile

Create your own profile:

{
  "MY_CUSTOM_AGENT": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "your-model-id",
      "apiKey": "YOUR_API_KEY",
      "baseUrl": "https://your-api-endpoint.com/v1"
    }
  }
}

Configuration

Profile Options

Option       Description            Default
model        Model identifier       Required
autonomy     Permission level       workspace-write
apiKey       API key (or env var)   -
baseUrl      Custom API endpoint    -
maxTokens    Max output tokens      8192
temperature  Randomness (0-1)       0.7
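
Putting the options together, a fully specified profile might look like this (the profile name, model ID, key, and endpoint are placeholders; maxTokens and temperature fall back to the defaults above if omitted):

```json
{
  "FULL_EXAMPLE": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "your-model-id",
      "apiKey": "YOUR_API_KEY",
      "baseUrl": "https://your-api-endpoint.com/v1",
      "maxTokens": 8192,
      "temperature": 0.7
    }
  }
}
```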

Autonomy Levels

Level                    Capabilities
workspace-write          Read/write workspace files
skip-permissions-unsafe  Full system access

Environment Variables

# Set API keys via environment
export GEMINI_API_KEY="your-key"
export OPENAI_API_KEY="your-key"
export GLM_API_KEY="your-key"
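
Before launching an agent, a quick sketch like the following can catch a missing key early (variable names taken from the exports above):

```shell
# Warn about any unset API keys; "missing" collects the names.
missing=""
for var in GEMINI_API_KEY OPENAI_API_KEY GLM_API_KEY; do
  if [ -z "$(printenv "$var")" ]; then
    echo "warning: $var is not set"
    missing="$missing $var"
  fi
done
```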

Usage

From CLI

# Use specific profile
friday-dev run --task 123 --agent droid --profile GEMINI_2_5_PRO

# Use custom profile
friday-dev run --task 123 --agent droid --profile MY_CUSTOM_AGENT

From UI

  1. Open task
  2. Click "Run Agent"
  3. Select "Droid"
  4. Choose profile
  5. Start

Supported Models

Cloud Providers

Provider   Models
Google     Gemini 2.5 Pro, Gemini 2.5 Flash
OpenAI     GPT-5, GPT-4 Turbo, GPT-4o
Anthropic  Claude 3.5 Sonnet, Claude 3 Opus
Zhipu      GLM-4.7, GLM-4
Alibaba    Qwen-Coder

Self-Hosted

System   Models
Ollama   Llama, Mistral, CodeLlama
vLLM     Any compatible model
LocalAI  Any GGUF model

Self-Hosted Setup

Ollama

# Install Ollama
curl https://ollama.ai/install.sh | sh

# Pull a coding model
ollama pull codellama

# Configure Friday Dev
{
  "LOCAL_CODELLAMA": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "codellama",
      "baseUrl": "http://localhost:11434"
    }
  }
}
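
Before pointing a profile at Ollama, it's worth confirming the server is up and the model is pulled; Ollama's REST API lists local models at /api/tags. A minimal check, assuming Ollama's default port:

```shell
# Query Ollama's local model list; empty if the server is unreachable.
tags=$(curl -sf --max-time 5 http://localhost:11434/api/tags || true)
if printf '%s' "$tags" | grep -q codellama; then
  echo "codellama is available"
else
  echo "Ollama not reachable or codellama not pulled"
fi
```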

vLLM

# Start vLLM server
python -m vllm.entrypoints.openai.api_server \
  --model codellama/CodeLlama-34b \
  --port 8000

# Configure Friday Dev
{
  "VLLM_CODELLAMA": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "codellama/CodeLlama-34b",
      "baseUrl": "http://localhost:8000/v1"
    }
  }
}

MCP Integration

Droid supports the Model Context Protocol (MCP):

{
  "mcp": {
    "enabled": true,
    "servers": [
      {
        "name": "filesystem",
        "command": "mcp-filesystem",
        "args": ["--root", "/workspace"]
      }
    ]
  }
}

Creating Custom Agents

Step 1: Define Profile

{
  "MY_AGENT": {
    "DROID": {
      "autonomy": "workspace-write",
      "model": "my-model",
      "systemPrompt": "You are a helpful coding assistant..."
    }
  }
}

Step 2: Add to Configuration

Edit ~/.friday-dev/profiles.json:

{
  "profiles": {
    "MY_AGENT": { ... }
  }
}
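
One way to create the file and confirm it parses, assuming the directory layout above (python3 is used here only as a JSON checker):

```shell
# Write a minimal profiles.json matching the examples above.
mkdir -p "$HOME/.friday-dev"
cat > "$HOME/.friday-dev/profiles.json" <<'EOF'
{
  "profiles": {
    "MY_AGENT": {
      "DROID": {
        "autonomy": "workspace-write",
        "model": "my-model"
      }
    }
  }
}
EOF
# Fail loudly if the file is not valid JSON.
python3 -m json.tool "$HOME/.friday-dev/profiles.json" >/dev/null \
  && echo "profiles.json parses as valid JSON"
```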

Step 3: Use the Agent

friday-dev run --task 123 --agent droid --profile MY_AGENT

Troubleshooting

Model Not Responding

  1. Check API endpoint is reachable
  2. Verify API key is valid
  3. Check model name is correct
  4. Review error logs

Poor Performance

  1. Try a different model
  2. Increase maxTokens
  3. Adjust temperature
  4. Add more context in task

Connection Issues

  1. Check network/firewall
  2. Verify base URL
  3. Test API endpoint directly
  4. Check rate limits
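
To test the endpoint directly, an OpenAI-compatible server can usually be probed at its /v1/models route; substituting your profile's baseUrl for the placeholder below gives a quick reachability check:

```shell
# Probe the endpoint; -f makes curl fail on HTTP errors, --max-time bounds the wait.
base_url="https://your-api-endpoint.com/v1"   # placeholder from the custom-profile example
if curl -sf --max-time 10 "$base_url/models" >/dev/null; then
  echo "endpoint reachable"
else
  echo "endpoint unreachable: check network, URL, and rate limits"
fi
```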

Best Practices

  1. Start with presets - Use built-in profiles first
  2. Test thoroughly - Verify custom profiles work
  3. Use appropriate autonomy - Don't over-permission
  4. Monitor costs - Different models have different pricing
  5. Keep profiles versioned - Track configuration changes
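
For the last point, a lightweight way to version profiles is a git repository in the config directory (directory name from this guide; assumes git is installed):

```shell
# Initialize version control over the Friday Dev config directory.
mkdir -p "$HOME/.friday-dev"
git -C "$HOME/.friday-dev" init -q 2>/dev/null || true
touch "$HOME/.friday-dev/profiles.json"
git -C "$HOME/.friday-dev" add profiles.json
# Commit may be a no-op if nothing changed or git identity is unset.
git -C "$HOME/.friday-dev" commit -qm "snapshot Droid profiles" 2>/dev/null || true
```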

Next Steps