Class: FactDb::LLM::Adapter
- Inherits: Object
- Defined in: lib/fact_db/llm/adapter.rb
Overview
Adapter for the ruby_llm gem. Provides a unified interface for the LLM extractor.
Constant Summary
- PROVIDER_DEFAULTS =
  {
    openai:     "gpt-4o-mini",
    anthropic:  "claude-sonnet-4-20250514",
    gemini:     "gemini-2.0-flash",
    ollama:     "llama3.2",
    bedrock:    "claude-sonnet-4",
    openrouter: "anthropic/claude-sonnet-4"
  }.freeze
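A minimal usage sketch of how these defaults are resolved (based on the constructor shown below; the explicit model string is illustrative):

# Explicit model: used as given (the model name here is illustrative)
adapter = FactDb::LLM::Adapter.new(provider: :openai, model: "gpt-4o")
adapter.model # => "gpt-4o"

# Model omitted: falls back to PROVIDER_DEFAULTS[:openai]
adapter = FactDb::LLM::Adapter.new(provider: :openai)
adapter.model # => "gpt-4o-mini"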
Instance Attribute Summary
- #model ⇒ Object (readonly)
  Returns the value of attribute model.
- #provider ⇒ Object (readonly)
  Returns the value of attribute provider.
Instance Method Summary
- #chat(prompt) ⇒ String (also: #call, #complete)
  Send a prompt to the LLM and return the response text.
- #initialize(provider:, model: nil, api_key: nil, **options) ⇒ Adapter (constructor)
  Create an adapter for a specific LLM provider.
Constructor Details
#initialize(provider:, model: nil, api_key: nil, **options) ⇒ Adapter
Create an adapter for a specific LLM provider.
# File 'lib/fact_db/llm/adapter.rb', line 55

def initialize(provider:, model: nil, api_key: nil, **options)
  @provider = provider.to_sym
  @model = model || PROVIDER_DEFAULTS[@provider]
  @options = configure_ruby_llm(api_key)
end
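A hedged construction example; the environment variable name is an assumption, and the api_key: argument is handed to configure_ruby_llm as shown above:

adapter = FactDb::LLM::Adapter.new(
  provider: "anthropic",            # strings are normalized via to_sym
  api_key: ENV["ANTHROPIC_API_KEY"] # illustrative env var, passed to configure_ruby_llm
)
adapter.provider # => :anthropic
adapter.model    # => "claude-sonnet-4-20250514" (the provider default)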
Instance Attribute Details
#model ⇒ Object (readonly)
Returns the value of attribute model.
# File 'lib/fact_db/llm/adapter.rb', line 37

def model
  @model
end
#provider ⇒ Object (readonly)
Returns the value of attribute provider.
# File 'lib/fact_db/llm/adapter.rb', line 37

def provider
  @provider
end
Instance Method Details
#chat(prompt) ⇒ String Also known as: call, complete
Send a prompt to the LLM and return the response text.
# File 'lib/fact_db/llm/adapter.rb', line 67

def chat(prompt)
  chat_instance = RubyLLM.chat(model: model)
  response = chat_instance.ask(prompt)
  response.content
end
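A short usage sketch; #call and #complete are aliases of #chat, so all three return the response text as a String (the example output is illustrative):

adapter = FactDb::LLM::Adapter.new(provider: :ollama) # model defaults to "llama3.2"

adapter.chat("Name three Ruby web frameworks.")     # => "Rails, Sinatra, and Hanami." (illustrative)
adapter.call("Name three Ruby web frameworks.")     # alias of #chat
adapter.complete("Name three Ruby web frameworks.") # alias of #chat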