
local_llm 0.1.1

local_llm is a lightweight Ruby gem that lets you interact with locally installed Ollama LLMs such as LLaMA, Mistral, CodeLLaMA, Qwen, and more. It supports a configurable default model, a configurable Ollama API endpoint, real-time streaming or non-streaming responses, and both one-shot and multi-turn chat, while keeping all inference fully local, private, and offline.
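
A minimal usage sketch follows. The names used here (a LocalLlm.configure block, a LocalLlm::Client#chat method, the stream: option, and the configuration keys) are illustrative assumptions, not confirmed API; check the gem's README for the actual interface. The endpoint shown is Ollama's standard default.

  require "local_llm"

  # Hypothetical configuration: point the gem at a local Ollama server
  # and pick a default model. These accessor names are assumptions.
  LocalLlm.configure do |config|
    config.api_url       = "http://localhost:11434"  # Ollama's default local endpoint
    config.default_model = "mistral"
  end

  # One-shot, non-streaming prompt: the full response is returned at once.
  client = LocalLlm::Client.new
  puts client.chat("Explain Ruby blocks in one sentence.")

  # Streaming: chunks are printed as the model generates them.
  client.chat("Write a haiku about Ruby.", stream: true) do |chunk|
    print chunk
  end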

Gemfile:

  gem 'local_llm', '~> 0.1.1'

install:

  gem install local_llm

Versions:

  1. 0.1.1 December 02, 2025 (13.5 KB)
  2. 0.1.0 December 02, 2025 (7.5 KB)

Authors:

  • MD Abdul Barek

Total downloads: 260

For this version: 154

Version Released: December 02, 2025

License: MIT

Required Ruby Version: >= 3.0.0
