Elixir Interface / Adapter for Google Gemini LLM, for both AI Studio and Vertex AI
Updated Apr 9, 2026 - Elixir
An Elixir SDK for Claude Code - provides programmatic access to the Claude Code CLI with streaming message processing
OpenAI Codex SDK written in Elixir
Agent Session Manager - A comprehensive Elixir library for managing AI agent sessions, state persistence, conversation context, and multi-agent orchestration workflows
Ollixir is a first-class Elixir client with feature parity to the official ollama-python library, for running large language models locally or on your own infrastructure via Ollama.
Protocol-based AI adapter foundation for Elixir - unified abstractions for gemini_ex, claude_agent_sdk, codex_sdk with automatic fallback, capability detection, and telemetry
Elixir client SDK for the Jules API - orchestrate AI coding sessions
Native Elixir SDK for the Notion API — comprehensive, idiomatic client for Notion workspaces, databases, pages, blocks, users, comments, and search. Built on OTP with supervised HTTP, automatic rate limiting, pagination helpers, and robust error handling for BEAM applications.
🛠️ Enhance your Elixir development with Claude Code plugins for smoother coding, formatting, and efficient project management.
Native Elixir SDK for the GitHub REST API — comprehensive, idiomatic client for repositories, issues, pull requests, actions, organizations, users, apps, and more. Built on OTP with supervised HTTP, OAuth and token onboarding, pagination helpers, rate-limit awareness, and robust error handling for BEAM applications.
Elixir SDK for the Amp CLI — provides a comprehensive client library for interacting with Amp's AI-powered coding agent, including thread management, tool orchestration, streaming responses, and programmatic access to Amp's full feature set from Elixir/OTP applications
Full-featured Elixir client for the Model Context Protocol (MCP) with multi-transport support, resources, prompts, tools, and telemetry.
Barebones Elixir wrapper and integration surface for llama.cpp experiments, local inference workflows, and future nshkr AI SDK interoperability. Repository assets, docs, and package metadata are prepared for professional distribution.
An Elixir-first external runtime transport foundation for AI SDK integrations, focused on clean transport boundaries, provider interoperability, protocol adapters, and production-grade documentation, packaging, and operational ergonomics for downstream nshkrdotcom runtime and SDK ecosystems.
Core Elixir primitives for building reliable self-hosted inference clients, provider adapters, transport boundaries, and operational controls for private AI runtimes across local, edge, and dedicated infrastructure.
vLLM - High-throughput, memory-efficient LLM inference engine with PagedAttention, continuous batching, CUDA/HIP optimization, quantization (GPTQ/AWQ/INT4/INT8/FP8), tensor/pipeline parallelism, OpenAI-compatible API, multi-GPU/TPU/Neuron support, prefix caching, and multi-LoRA capabilities
Elixir SDK for Linear built on Prismatic, using a schema-driven GraphQL toolchain, thin provider-specific configuration, generated operations and models, and professional docs, verification, and examples for production-grade client use.
🤖 Enable local large language models with Ollixir, the Elixir client mirroring the ollama-python library for seamless chat, generation, and model management.
Shared LLM Actions for NSAI runtimes. Wraps PortfolioCore adapters with Jido.Action semantics and CrucibleIR.Backend input/output to centralize provider access.
An Elixir SDK for the Gemini CLI — Build AI-powered applications with Google Gemini via a robust, idiomatic wrapper around the Gemini CLI. Features streaming, structured output, session management, model selection, and OTP supervision tree integration for production-grade Gemini-powered Elixir apps.