local_llm 0.1.1
local_llm is a lightweight Ruby gem for interacting with locally installed Ollama LLMs such as LLaMA, Mistral, CodeLLaMA, and Qwen. It supports a configurable default model, a configurable Ollama API endpoint, real-time streaming as well as non-streaming responses, and both one-shot prompts and multi-turn chat, while keeping all inference fully local, private, and offline.
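Each of those features maps onto Ollama's local HTTP API, which the gem wraps. As a point of reference, here is a minimal plain-Ruby sketch of the non-streaming chat call underneath, using only the standard library. The endpoint (http://localhost:11434), the /api/chat route, and the JSON shape are Ollama's documented defaults; the model name ("mistral") and the prompt are illustrative assumptions, and the model must already be pulled locally.

    require "net/http"
    require "json"

    # Ollama's default local endpoint; the gem makes this configurable.
    uri = URI("http://localhost:11434/api/chat")

    # Multi-turn chat is just a growing messages array. The model name
    # below is an example; substitute any model you have pulled.
    body = {
      model: "mistral",
      stream: false,  # set to true for real-time streaming
      messages: [
        { role: "user", content: "Summarize what a mutex is in one sentence." }
      ]
    }

    response = Net::HTTP.post(uri, body.to_json,
                              "Content-Type" => "application/json")
    puts JSON.parse(response.body).dig("message", "content")

With stream: true, Ollama instead returns newline-delimited JSON chunks, each carrying a fragment of the reply in message.content; that is the mechanism behind the gem's real-time streaming mode.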