RubyLLM

One beautiful Ruby API for GPT, Claude, Gemini, and more.

A delightful Ruby way to work with AI. No configuration madness, no complex callbacks, no handler hell – just beautiful, expressive Ruby code.

Battle tested at Chat with Work – Claude Code for your documents.

Build chatbots, AI agents, RAG applications. Works with OpenAI, Anthropic, Google, AWS, local models, and any OpenAI-compatible API.

Why RubyLLM?

Every AI provider ships their own bloated client. Different APIs. Different response formats. Different conventions. It's exhausting.

RubyLLM gives you one beautiful API for all of them. Same interface whether you're using GPT, Claude, or your local Ollama. Just three dependencies: Faraday, Zeitwerk, and Marcel. That's it.
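
Switching providers is just a different model string. A minimal sketch – the model IDs and the provider: option below are illustrative, so check the model registry for the exact names:

gpt    = RubyLLM.chat(model: "gpt-4o-mini")
claude = RubyLLM.chat(model: "claude-3-5-haiku")
local  = RubyLLM.chat(model: "llama3.2", provider: :ollama) # assumes a local Ollama server

# All three respond to the same interface
[gpt, claude, local].each { |chat| chat.ask "Say hi in five words." }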

Show me the code

# Just ask questions
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"

# Analyze any file type
chat.ask "What's in this image?", with: "ruby_conf.jpg"
chat.ask "Describe this meeting", with: "meeting.wav"
chat.ask "Summarize this document", with: "contract.pdf"
chat.ask "Explain this code", with: "app.rb"

# Multiple files at once
chat.ask "Analyze these files", with: ["diagram.png", "report.pdf", "notes.txt"]

# Stream responses
chat.ask "Tell me a story about Ruby" do |chunk|
  print chunk.content
end

# Generate images
RubyLLM.paint "a sunset over mountains in watercolor style"

# Create embeddings
RubyLLM.embed "Ruby is elegant and expressive"

# Let AI use your code
class Weather < RubyLLM::Tool
  description "Get current weather"
  param :latitude
  param :longitude

  def execute(latitude:, longitude:)
    url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
    JSON.parse(Faraday.get(url).body)
  end
end

chat.with_tool(Weather).ask "What's the weather in Berlin?"

# Get structured output
class ProductSchema < RubyLLM::Schema
  string :name
  number :price
  array :features do
    string
  end
end

response = chat.with_schema(ProductSchema).ask "Analyze this product", with: "product.txt"
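
With a schema attached, the response content should come back already parsed into a Hash shaped like the schema. A hedged sketch – confirm the exact behaviour against the structured-output docs:

response.content["name"]      # => parsed String
response.content["price"]     # => parsed Numeric
response.content["features"]  # => parsed Array of Strings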

Features

  • Chat: Conversational AI with RubyLLM.chat
  • Vision: Analyze images and screenshots
  • Audio: Transcribe and understand speech
  • Documents: Extract from PDFs, CSVs, JSON, any file type
  • Image generation: Create images with RubyLLM.paint
  • Embeddings: Vector search with RubyLLM.embed
  • Tools: Let AI call your Ruby methods
  • Structured output: JSON schemas that just work
  • Streaming: Real-time responses with blocks
  • Rails: ActiveRecord integration with acts_as_chat
  • Async: Fiber-based concurrency (see the sketch after this list)
  • Model registry: 500+ models with capability detection and pricing
  • Providers: OpenAI, Anthropic, Gemini, Bedrock, DeepSeek, Mistral, Ollama, OpenRouter, Perplexity, GPUStack, and any OpenAI-compatible API
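
For the Async feature above, no special API is expected – wrapping calls in Async blocks should be enough, since the underlying HTTP calls yield to Ruby's fiber scheduler. A minimal sketch, assuming the async gem is in your bundle:

require "async"

Async do
  %w[Ruby Rails Hanami].each do |topic|
    Async do
      # Each inner task runs concurrently within the reactor
      chat = RubyLLM.chat
      puts chat.ask("One sentence about #{topic}.").content
    end
  end
end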

Installation

Add to your Gemfile:

gem 'ruby_llm'

Then bundle install.

Configure your API keys:

# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end
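
Add a key for each provider you use. The OpenAI key above is from the README; the Anthropic and Gemini setting names below follow the same pattern but are assumptions – confirm them in the configuration docs:

RubyLLM.configure do |config|
  config.openai_api_key    = ENV['OPENAI_API_KEY']
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY'] # assumed setting name
  config.gemini_api_key    = ENV['GEMINI_API_KEY']    # assumed setting name
end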

Rails

rails generate ruby_llm:install

class Chat < ApplicationRecord
  acts_as_chat
end

chat = Chat.create! model_id: "claude-sonnet-4"
chat.ask "What's in this file?", with: "report.pdf"

Documentation

rubyllm.com

Contributing

See CONTRIBUTING.md.

License

Released under the MIT License.
