Class: FactDb::LLM::Adapter

Inherits:
Object
Defined in:
lib/fact_db/llm/adapter.rb

Overview

Adapter for the ruby_llm gem. Provides a unified interface for the LLM extractor.

Examples:

Configure with OpenAI

FactDb.configure do |config|
  config.llm_client = FactDb::LLM::Adapter.new(
    provider: :openai,
    api_key: ENV["OPENAI_API_KEY"],
    model: "gpt-4o-mini"
  )
end

Configure with Anthropic

FactDb.configure do |config|
  config.llm_client = FactDb::LLM::Adapter.new(
    provider: :anthropic,
    api_key: ENV["ANTHROPIC_API_KEY"],
    model: "claude-sonnet-4-20250514"
  )
end

Configure via YAML (config/fact_db.yml)

# llm_provider: anthropic
# llm_model: claude-sonnet-4-20250514
# llm_api_key: <%= ENV["ANTHROPIC_API_KEY"] %>

Configure via environment variables

# FACT_DB_LLM_PROVIDER=anthropic
# FACT_DB_LLM_MODEL=claude-sonnet-4-20250514
# FACT_DB_LLM_API_KEY=sk-...

Constant Summary

PROVIDER_DEFAULTS =
{
  openai: "gpt-4o-mini",
  anthropic: "claude-sonnet-4-20250514",
  gemini: "gemini-2.0-flash",
  ollama: "llama3.2",
  bedrock: "claude-sonnet-4",
  openrouter: "anthropic/claude-sonnet-4"
}.freeze
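
The keys of this hash also enumerate the supported provider symbols. A small illustrative check (not part of the gem), assuming you want to confirm a provider has a default before constructing an adapter:

# Illustrative only: list the supported provider symbols and check one.
supported = FactDb::LLM::Adapter::PROVIDER_DEFAULTS.keys
# => [:openai, :anthropic, :gemini, :ollama, :bedrock, :openrouter]
supported.include?(:mistral) # => false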

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider:, model: nil, api_key: nil, **options) ⇒ Adapter

Create an adapter for a specific LLM provider

Parameters:

  • provider (Symbol)

    :openai, :anthropic, :gemini, :ollama, :bedrock, :openrouter

  • model (String) (defaults to: nil)

    Model name (optional, uses provider default)

  • api_key (String) (defaults to: nil)

    API key (optional if set via ENV)

  • options (Hash)

    Additional options passed to RubyLLM



# File 'lib/fact_db/llm/adapter.rb', line 55

def initialize(provider:, model: nil, api_key: nil, **options)
  @provider = provider.to_sym
  @model = model || PROVIDER_DEFAULTS[@provider]
  @options = options

  configure_ruby_llm(api_key)
end
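
A short usage sketch of the constructor, based on the source above: the provider is symbolized with to_sym, and the model falls back to PROVIDER_DEFAULTS when model: is omitted.

# String providers are accepted and symbolized; model: is optional.
adapter = FactDb::LLM::Adapter.new(
  provider: "anthropic",
  api_key: ENV["ANTHROPIC_API_KEY"]
)
adapter.provider # => :anthropic
adapter.model    # => "claude-sonnet-4-20250514" (from PROVIDER_DEFAULTS)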

Instance Attribute Details

#model ⇒ Object (readonly)

Returns the value of attribute model.



# File 'lib/fact_db/llm/adapter.rb', line 37

def model
  @model
end

#provider ⇒ Object (readonly)

Returns the value of attribute provider.



# File 'lib/fact_db/llm/adapter.rb', line 37

def provider
  @provider
end

Instance Method Details

#chat(prompt) ⇒ String Also known as: call, complete

Send a prompt to the LLM and return the response text

Parameters:

  • prompt (String)

    The prompt to send

Returns:

  • (String)

    The response text



# File 'lib/fact_db/llm/adapter.rb', line 67

def chat(prompt)
  chat_instance = RubyLLM.chat(model: model)
  response = chat_instance.ask(prompt)
  response.content
end
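
A minimal usage sketch, assuming an adapter configured as in the Overview examples; call and complete are aliases of chat:

adapter = FactDb::LLM::Adapter.new(
  provider: :openai,
  api_key: ENV["OPENAI_API_KEY"]
)
adapter.chat("Extract the facts from this sentence.")
# => String containing the model's response text
adapter.call("...")     # alias of chat
adapter.complete("...") # alias of chat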