50 changes: 50 additions & 0 deletions crates/cactus/examples/chat.rs
@@ -0,0 +1,50 @@
use std::io::{self, BufRead, Write};

use cactus::{CompleteOptions, Message, Model};

fn main() {
let path = std::env::args().nth(1).expect("Usage: chat <model-path>");
let mut model = Model::new(&path).expect("Failed to load model");

let options = CompleteOptions {
max_tokens: Some(1024),
temperature: Some(0.7),
confidence_threshold: Some(0.0),
..Default::default()
};

let mut messages: Vec<Message> = vec![Message::system("You are a helpful assistant.")];

println!("Chat with your model. Type 'exit' to quit.\n");

loop {
print!("> ");
let _ = io::stdout().flush();

let mut input = String::new();
io::stdin().lock().read_line(&mut input).unwrap();
let input = input.trim();

if input.is_empty() || input == "exit" || input == "quit" {
break;
}

Empty input exits chat instead of reprompting

Medium Severity

The input.is_empty() check in the break condition causes the chat loop to exit when the user simply presses Enter without typing anything. Most interactive CLI chat applications skip blank lines and re-display the prompt. This makes accidental exits too easy and contradicts the PR description which states only "exit" or "quit" trigger clean shutdown. The is_empty() condition conflates EOF handling with blank-line handling — EOF from read_line (returning Ok(0)) could be detected via the return value instead.
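A minimal sketch of one way to address this, under the assumptions named above (the `InputAction` enum and `classify` helper are illustrative, not part of the PR): EOF is detected from `read_line`'s return value (`Ok(0)`), blank lines reprompt, and only `exit`/`quit` terminate:

```rust
use std::io::{self, BufRead, Write};

/// Hypothetical classification of one line read from stdin.
#[derive(Debug, PartialEq)]
enum InputAction {
    Exit,
    Skip,
    Send(String),
}

fn classify(bytes_read: usize, raw: &str) -> InputAction {
    if bytes_read == 0 {
        // read_line returned Ok(0): EOF, shut down cleanly.
        return InputAction::Exit;
    }
    let trimmed = raw.trim();
    if trimmed.is_empty() {
        // Blank line: redisplay the prompt instead of exiting.
        return InputAction::Skip;
    }
    if trimmed == "exit" || trimmed == "quit" {
        return InputAction::Exit;
    }
    InputAction::Send(trimmed.to_string())
}

fn main() {
    let stdin = io::stdin();
    loop {
        print!("> ");
        let _ = io::stdout().flush();
        let mut input = String::new();
        let n = stdin.lock().read_line(&mut input).unwrap_or(0);
        match classify(n, &input) {
            InputAction::Exit => break,
            InputAction::Skip => continue,
            InputAction::Send(text) => println!("(would send) {text}"),
        }
    }
}
```

Separating classification from I/O this way also makes the blank-line and EOF behavior unit-testable without a terminal.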



messages.push(Message::user(input));
model.reset();

let mut response_text = String::new();
let result = model.complete_streaming(&messages, &options, |token| {
print!("{token}");
let _ = io::stdout().flush();
response_text.push_str(token);
true
});

println!();

match result {
Ok(_) => messages.push(Message::assistant(&response_text)),
Err(e) => eprintln!("Error: {e}"),
}

Failed completion leaves orphaned user message in history

Medium Severity

The user message is pushed into messages on line 32 before complete_streaming is called. If complete_streaming returns Err, the error branch only prints the error — it never removes the user message that was already appended. On subsequent turns, the conversation history contains a user message with no corresponding assistant reply, breaking the expected alternating user/assistant pattern. This corrupts multi-turn context for all future turns in the session.
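One possible fix, sketched with plain strings instead of the crate's `Message` type (the `run_turn` helper and the `complete` closure are hypothetical stand-ins for `model.complete_streaming`): push the user message, then pop it back off when completion fails, so the history keeps its alternating user/assistant pattern:

```rust
// Sketch: roll back the user message when completion fails, so the
// history never retains a user turn without an assistant reply.
fn run_turn<F>(
    messages: &mut Vec<String>,
    user_input: &str,
    complete: F,
) -> Result<(), String>
where
    F: FnOnce(&[String]) -> Result<String, String>,
{
    messages.push(format!("user: {user_input}"));
    match complete(messages) {
        Ok(reply) => {
            messages.push(format!("assistant: {reply}"));
            Ok(())
        }
        Err(e) => {
            messages.pop(); // drop the orphaned user message
            Err(e)
        }
    }
}

fn main() {
    let mut history = vec!["system: You are a helpful assistant.".to_string()];
    let _ = run_turn(&mut history, "hello", |_| Ok("hi there".to_string()));
    let _ = run_turn(&mut history, "oops", |_| Err("backend down".to_string()));
    // After the failed turn the history still has 3 entries, not 4.
    println!("{}", history.len()); // prints 3
}
```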


}
}
13 changes: 1 addition & 12 deletions crates/openai-transcription/src/batch/request.rs
@@ -18,25 +18,14 @@ pub struct CommonTranscriptionOptions {
pub temperature: Option<f32>,
}

#[derive(Debug, Clone, PartialEq)]
#[derive(Debug, Clone, Default, PartialEq)]
pub struct CreateWhisperTranscriptionOptions {
pub common: CommonTranscriptionOptions,
pub prompt: Option<String>,
pub response_format: Option<WhisperResponseFormat>,
pub timestamp_granularities: Vec<TimestampGranularity>,
}

impl Default for CreateWhisperTranscriptionOptions {
fn default() -> Self {
Self {
common: CommonTranscriptionOptions::default(),
prompt: None,
response_format: None,
timestamp_granularities: Vec::new(),
}
}
}

#[derive(Debug, Clone, PartialEq)]
pub struct CreateGptTranscriptionOptions {
pub model: GptTranscriptionModel,
3 changes: 1 addition & 2 deletions crates/owhisper-client/tests/provider_live_e2e.rs
Expand Up @@ -5,7 +5,7 @@ use futures_util::{Stream, StreamExt};
use hypr_audio_utils::AudioFormatExt;
use owhisper_client::{
AssemblyAIAdapter, DashScopeAdapter, DeepgramAdapter, ElevenLabsAdapter, FinalizeHandle,
FireworksAdapter, GladiaAdapter, ListenClient, MistralAdapter, OpenAIAdapter, Provider,
FireworksAdapter, GladiaAdapter, ListenClient, MistralAdapter, Provider,
RealtimeSttAdapter, SonioxAdapter,
};
use owhisper_interface::{ControlMessage, MixedMessage, stream::StreamResponse};
@@ -181,7 +181,6 @@ direct_live_test!(assemblyai, AssemblyAIAdapter, Provider::AssemblyAI);
direct_live_test!(soniox, SonioxAdapter, Provider::Soniox);
direct_live_test!(gladia, GladiaAdapter, Provider::Gladia);
direct_live_test!(fireworks, FireworksAdapter, Provider::Fireworks);
direct_live_test!(openai, OpenAIAdapter, Provider::OpenAI);
direct_live_test!(elevenlabs, ElevenLabsAdapter, Provider::ElevenLabs);
direct_live_test!(dashscope, DashScopeAdapter, Provider::DashScope);
direct_live_test!(mistral, MistralAdapter, Provider::Mistral);