# Introduction
Kalosm is a library with dead-simple interfaces for local language, audio, and image models.
# Quick Start
- Add the Kalosm library:

```sh
cargo add kalosm
```
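The examples below use `#[tokio::main]`, so your project also needs the `tokio` async runtime as a dependency. A minimal sketch of the resulting `Cargo.toml` section (the version numbers here are illustrative assumptions; use the latest releases from crates.io):

```toml
[dependencies]
# Versions below are placeholders, not pinned recommendations
kalosm = "0.3"
tokio = { version = "1", features = ["full"] }
```

You can also add tokio from the command line with `cargo add tokio --features full`.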
- Initialize a Kalosm model:

```rust
use kalosm::language::*;

#[tokio::main]
async fn main() {
    let _model = Llama::new_chat();
}
```
- Start a chat session with a pirate:

```rust
use kalosm::language::*;

#[tokio::main]
async fn main() {
    let mut model = Llama::new_chat();
    // New code
    let mut chat = Chat::builder(&mut model)
        .with_system_prompt("The assistant will act like a pirate")
        .build();
    loop {
        chat.add_message(prompt_input("\n> ").unwrap())
            .await
            .unwrap()
            .to_std_out()
            .await
            .unwrap();
    }
}
```
- Add build configuration to your `.cargo/config.toml` for improved performance:

```toml
[build]
rustflags = ["-C", "target-cpu=native"]

[target.x86_64-apple-darwin]
rustflags = ["-C", "target-feature=-avx,-avx2"]
```
- Run the program:

```sh
cargo run --release
```