ChatDelta is a comprehensive suite of tools and libraries for connecting to multiple AI APIs with a unified interface. Whether you need a terminal interface, a library for your application, or a command-line tool, ChatDelta has you covered.
- ChatDelta TUI: side-by-side chat with OpenAI, Gemini, and Claude
- chatdelta (npm): Node.js package with TypeScript support
- chatdelta (crates.io): Rust crate for AI API integration
- chatdelta-go: Go library (in development)
- ChatDelta CLI: command-line tool for querying multiple AI models
Every ChatDelta implementation follows the same high-level flow: create one or more provider clients, send a prompt, and collect each provider's response.
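As a rough illustration of that flow (not the library's actual internals), the fan-out pattern can be sketched in plain TypeScript with mock clients. The names `AiClient`, `mockClient`, and `fanOut` below are illustrative and not part of the ChatDelta API:

```typescript
// A minimal sketch of ChatDelta's fan-out flow, using mock clients
// instead of real provider SDKs. Illustrative only.
interface AiClient {
  name: string;
  sendPrompt(prompt: string): Promise<string>;
}

const mockClient = (name: string): AiClient => ({
  name,
  // Simulate a provider call with a canned reply.
  sendPrompt: async (prompt) => `${name} reply to: ${prompt}`,
});

async function fanOut(clients: AiClient[], prompt: string) {
  // Query every provider concurrently; Promise.allSettled ensures one
  // failing provider does not cancel the others.
  const settled = await Promise.allSettled(
    clients.map((c) => c.sendPrompt(prompt)),
  );
  return settled.map((r, i) => ({
    provider: clients[i].name,
    result: r.status === 'fulfilled' ? r.value : `error: ${r.reason}`,
  }));
}

const clients = [mockClient('openai'), mockClient('gemini'), mockClient('claude')];
fanOut(clients, 'Explain quantum computing').then((rows) => {
  for (const row of rows) console.log(`${row.provider}: ${row.result}`);
});
```

Each real ChatDelta client wraps one provider's API behind this kind of uniform interface, which is what makes querying several models in parallel a one-liner.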
```sh
npm install chatdelta
```

```typescript
import { createClient, executeParallel } from 'chatdelta';

const openai = createClient('openai', process.env.OPENAI_KEY, 'gpt-4');
const results = await executeParallel([openai], 'Explain quantum computing');
```
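Once you have replies from several models, the point of ChatDelta is comparing them. As a self-contained illustration of that idea (this helper is not part of the chatdelta package), here is a tiny word-overlap measure between two replies:

```typescript
// Illustrative only: a crude similarity score between two model replies,
// hinting at the kind of side-by-side comparison the ChatDelta TUI offers.
function sharedWordRatio(a: string, b: string): number {
  const wordsA = new Set(a.toLowerCase().split(/\s+/).filter(Boolean));
  const wordsB = new Set(b.toLowerCase().split(/\s+/).filter(Boolean));
  if (wordsA.size === 0) return 0;
  let shared = 0;
  for (const w of wordsA) if (wordsB.has(w)) shared++;
  // Fraction of a's distinct words that also appear in b.
  return shared / wordsA.size;
}
```

For example, `sharedWordRatio('quantum bits superpose', 'quantum bits entangle')` counts two of the three distinct words as shared, giving 2/3.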
```toml
[dependencies]
chatdelta = "0.2"
```

```rust
use chatdelta::{create_client, ClientConfig};

// Start from the default client configuration.
let config = ClientConfig::default();
let client = create_client("openai", "your-key", "gpt-4o", config)?;
let response = client.send_prompt("Hello, world!").await?;
```
```sh
go get github.com/chatdelta/chatdelta-go
```

```go
client, err := chatdelta.CreateClient("openai", "", "gpt-3.5-turbo", nil)
response, err := client.SendPrompt(context.Background(), "What is AI?")
```
```sh
git clone https://github.com/ChatDelta/chatdelta.git
cd chatdelta
cargo build --release

# Set your API keys
export OPENAI_API_KEY="your-key"
export GEMINI_API_KEY="your-key"
export ANTHROPIC_API_KEY="your-key"

./target/release/chatdelta
```
Ready to get started? Choose the language or tool above that fits your stack and dive in!