Examples

The examples/ directory contains runnable demo applications that exercise the gem from separate client and server processes. Each demo uses examples/common_config.rb, which adds the repository lib/ directory to $LOAD_PATH before requiring simple_a2a, so the examples run against the local checkout rather than an installed gem.
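A plausible sketch of what examples/common_config.rb does (the exact file contents are an assumption, but the description above implies this standard pattern):

```ruby
# Prepend the checkout's lib/ directory to the load path so that
# `require "simple_a2a"` resolves to the local source, not an installed gem.
lib = File.expand_path("../lib", __dir__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)

# With the path in place, each demo can then load the gem:
# require "simple_a2a"
```

This is the conventional Bundler-free way to run a gem's examples against the working tree.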

The three example applications and the A2A capabilities they demonstrate:

01 Basic Usage: JSON-RPC request/response, agent card discovery, send/list/get tasks, client error handling
02 Streaming: SSE task subscription, working/final statuses, append artifact chunks, incremental client output
03 LLM Research: multi-agent orchestration, Anthropic + OpenAI agents, evaluator agent, CLI and web clients
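The streaming demo's client consumes Server-Sent Events. As a rough illustration of the wire format involved (this is a generic SSE `data:` line parser, not the gem's own streaming client), events arrive as blank-line-separated blocks:

```ruby
# Extract the payload of each `data:` field from an SSE response body.
# Multi-line data fields are joined with newlines, per the SSE format;
# other fields (event:, id:, retry:) are ignored in this sketch.
def sse_data_events(body)
  body.split("\n\n").filter_map do |event|
    data_lines = event.lines.select { |line| line.start_with?("data:") }
    next if data_lines.empty?
    data_lines.map { |line| line.sub(/\Adata: ?/, "").chomp }.join("\n")
  end
end
```

Each extracted payload would typically be a JSON-encoded task status or artifact chunk.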

Run a demo

From the repository root:

bundle exec ruby examples/run 01_basic_usage
bundle exec ruby examples/run 02_streaming

The launcher starts the demo server on http://localhost:9292, waits for it to accept connections, runs the demo client, and then shuts the server down.
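The "waits for it to accept connections" step can be sketched as a simple TCP readiness poll. This is a hypothetical reconstruction of what examples/run might do, not its actual code:

```ruby
require "socket"

# Poll the demo server's port until a TCP connection succeeds,
# raising if the server fails to come up within the timeout.
def wait_for_server(host, port, timeout: 10)
  deadline = Time.now + timeout
  loop do
    TCPSocket.new(host, port).close
    return true
  rescue Errno::ECONNREFUSED, Errno::ETIMEDOUT
    raise "server did not start within #{timeout}s" if Time.now > deadline
    sleep 0.1
  end
end
```

Once the check passes, the launcher can run the client and kill the server process on exit.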

To run a demo manually, start its server.rb in one terminal and its client.rb in another:

bundle exec ruby examples/01_basic_usage/server.rb
bundle exec ruby examples/01_basic_usage/client.rb
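At the wire level, the basic demo's client and server exchange JSON-RPC 2.0 envelopes over HTTP. The sketch below builds such a request by hand; the method name "tasks/send" and the params shape follow common A2A conventions and are assumptions about this gem, not its documented client API:

```ruby
require "json"

# Build a JSON-RPC 2.0 request asking the agent to process a text message
# as a task. Field names here mirror typical A2A payloads (assumed).
def build_send_task_request(request_id:, task_id:, text:)
  {
    jsonrpc: "2.0",
    id: request_id,
    method: "tasks/send",
    params: {
      id: task_id,
      message: { role: "user", parts: [{ type: "text", text: text }] }
    }
  }
end

req = build_send_task_request(request_id: 1, task_id: "task-1", text: "hello")
payload = JSON.generate(req)
```

A real client would POST `payload` to the demo server on http://localhost:9292 and parse the JSON-RPC response, handling any `error` member it returns.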

Demo-specific dependencies

The basic and streaming demos use only the gem and its normal development setup. The LLM research demo intentionally keeps its LLM and web UI dependencies out of the gem runtime dependency list. Install the demo-specific gems before running it:

bundle add ruby_llm async-http-faraday sinatra

Then set API keys:

export ANTHROPIC_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here

Demo index

| Demo | Command | Documentation |
| --- | --- | --- |
| Basic Usage | bundle exec ruby examples/run 01_basic_usage | Basic Usage |
| Streaming | bundle exec ruby examples/run 02_streaming | Streaming |
| Multi-Agent LLM Research | bundle exec ruby examples/run 03_llm_research | Multi-Agent LLM Research |