OpenRouter Integration

DRIP uses OpenRouter to access multiple AI models through a unified API. This powers the AI reasoning layer that synthesizes collected data into structured research.

Models Used

DRIP dynamically selects the best model for each task. The AI brain routes queries through OpenRouter to models including:

Model               Provider    Used For
Claude 3.5 Sonnet   Anthropic   Complex research synthesis, long-form analysis
GPT-4o              OpenAI      Structured data extraction, tool selection
Claude 3 Haiku      Anthropic   Fast classification, routing, simple queries
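The routing described above can be sketched as a simple task-to-model lookup. This is an illustrative sketch only: the task categories, the `select_model` helper, and the exact model slugs are assumptions for the example, not DRIP's actual server-side configuration.

```python
# Hypothetical routing table mapping task categories to OpenRouter model
# slugs. The categories and the default-fallback policy are illustrative
# assumptions, not DRIP's real routing logic.
MODEL_ROUTES = {
    "synthesis": "anthropic/claude-3.5-sonnet",    # complex research synthesis
    "extraction": "openai/gpt-4o",                 # structured data extraction
    "classification": "anthropic/claude-3-haiku",  # fast routing, simple queries
}

def select_model(task_type: str) -> str:
    """Pick a model slug for a task, defaulting to the fast model."""
    return MODEL_ROUTES.get(task_type, MODEL_ROUTES["classification"])
```

Keeping the mapping in one table makes it cheap to add or swap models as OpenRouter's catalog changes.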

How the AI Brain Works

When you submit a query to DRIP, the AI brain:

  1. Classifies the query type (company, person, market, social, or mixed).
  2. Selects tools — determines which data sources to query based on the classification.
  3. Collects data — fans out to multiple sources in parallel.
  4. Synthesizes — the AI model processes raw data into a structured research brief with citations and confidence scores.
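The four stages above can be sketched as a small pipeline. Every function body here is a stand-in: the keyword classifier, the source names (`registry`, `news`, `web_search`), and the placeholder confidence score are assumptions for illustration, since DRIP's real classifiers and data sources are server-side and not public.

```python
# Illustrative sketch of the four-stage pipeline: classify, select tools,
# collect in parallel, synthesize. All logic is placeholder.
from concurrent.futures import ThreadPoolExecutor

def classify(query: str) -> str:
    # Stage 1: naive keyword check standing in for the AI classifier.
    if "Inc" in query or "company" in query.lower():
        return "company"
    return "mixed"

def select_tools(query_type: str) -> list:
    # Stage 2: map query type to data sources (hypothetical source names).
    routes = {"company": ["registry", "news"], "mixed": ["web_search"]}
    return routes.get(query_type, ["web_search"])

def collect(tool: str, query: str) -> dict:
    # Stage 3 worker: one data-source fetch (stubbed; a real call hits an API).
    return {"source": tool, "data": f"results for {query!r}"}

def research(query: str) -> dict:
    qtype = classify(query)
    tools = select_tools(qtype)
    # Stage 3: fan out to all selected sources in parallel.
    with ThreadPoolExecutor() as pool:
        raw = list(pool.map(lambda t: collect(t, query), tools))
    # Stage 4: a real synthesis step would send `raw` to an OpenRouter model;
    # here we just assemble a structured brief with a placeholder confidence.
    return {"type": qtype,
            "sources": [r["source"] for r in raw],
            "confidence": 0.5}
```

For example, `research("Acme Inc")` classifies the query as `company` and fans out to both stubbed sources before assembling the brief.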

What This Means for Users

OpenRouter is a server-side integration. As a DRIP user, you:

  • Don't need an API key — DRIP handles model access on the backend
  • Don't pay for AI separately — model costs are included in the per-query price
  • Don't configure anything — the AI brain selects the optimal model automatically

💡 OpenRouter gives DRIP access to multiple frontier models through a single API, with automatic failover. If one provider is down, queries are routed to an alternative model transparently.
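The failover behavior mentioned in the tip can be sketched as trying models in preference order. The model list, the `ProviderDown` error type, and the `call_model` callback are all assumptions for this example, not DRIP's actual routing policy.

```python
# Hedged sketch of provider failover: try each model in order and fall
# back to the next when a provider errors. Names are illustrative.
FALLBACK_CHAIN = ["anthropic/claude-3.5-sonnet", "openai/gpt-4o"]

class ProviderDown(Exception):
    """Raised by call_model when a provider is unavailable (hypothetical)."""

def complete_with_failover(prompt: str, call_model) -> str:
    """call_model(model, prompt) returns text or raises ProviderDown."""
    last_err = None
    for model in FALLBACK_CHAIN:
        try:
            return call_model(model, prompt)
        except ProviderDown as err:
            last_err = err  # provider down: try the next model transparently
    raise RuntimeError("all providers unavailable") from last_err
```

From the user's point of view this loop is invisible: the query succeeds as long as at least one provider in the chain is up.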