Introduction
Kalosm is a library with dead simple interfaces for local language, audio, and image models.
Quick Start
- Add the Kalosm and Tokio libraries
```shell
cargo add kalosm --features language
# Enable the metal or cuda feature if you have a supported accelerator
cargo add tokio --features full
```
- Initialize a Kalosm model
```rust
use kalosm::language::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let model = Llama::new_chat().await?;
    Ok(())
}
```
- Start a chat session with a pirate
```rust
use kalosm::language::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let model = Llama::new_chat().await?;
    // New code
    let mut chat = Chat::builder(model)
        .with_system_prompt("The assistant will act like a pirate")
        .build();
    loop {
        chat.add_message(prompt_input("\n> ")?).to_std_out().await?;
    }
}
```
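The loop above runs forever; if you want the session to end cleanly, one option is to check the user's input before sending it to the model. The sketch below is an illustration, not part of Kalosm's API: `should_exit` is a hypothetical helper, and the commented-out loop shows where it would slot into the chat example from this step.

```rust
/// Returns true when the user asked to leave the chat.
/// (Hypothetical helper, not part of Kalosm.)
fn should_exit(input: &str) -> bool {
    matches!(input.trim(), "quit" | "exit")
}

// Inside the async main from the step above, the loop would become:
//
// loop {
//     let input = prompt_input("\n> ")?;
//     if should_exit(&input) {
//         break;
//     }
//     chat.add_message(input).to_std_out().await?;
// }

fn main() {
    // Quick checks of the exit condition
    assert!(should_exit("  quit "));
    assert!(should_exit("exit"));
    assert!(!should_exit("tell me a joke"));
}
```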
- Add build configuration to your `.cargo/config.toml` for improved performance
```toml
[build]
rustflags = ["-C", "target-cpu=native"]

[target.x86_64-apple-darwin]
rustflags = ["-C", "target-feature=-avx,-avx2"]
```
- Run the program
```shell
cargo run --release
```