Summary

In Goose Desktop, streaming responses from any ACP provider (`claude-acp`, `amp-acp`, `codex-acp`, `copilot-acp`, `pi-acp`) render as many discrete timestamped message bubbles — one per `agent_message_chunk` event — instead of accumulating into a single streaming assistant message. Non-ACP providers (e.g. `claude-code`) stream correctly in the same UI.

Root cause: the ACP provider yields each text chunk as a fresh `Message` without an `id`, so the Desktop coalescing check in `useChatStream.ts` never matches.
Prompt used to reproduce:

> Write a 200-word summary of first-principles Bayesian reasoning applied to BTC spot price prediction. Include three concrete considerations, show numerical reasoning, and produce output in flowing paragraphs (not bullet points).
Observed
The response arrives as ~20 separate message bubbles stacked vertically, each with its own timestamp, each containing a few tokens or a sentence fragment (e.g. "B", "ayesian reasoning begins with a prior belief...", "ates it through Bayes' theorem — P(H|E) = P(E|H) ×", …). The text concatenates correctly if you read across bubbles, but it's unusable as a conversation view.
Session auto-title renders cleanly ("Bayesian BTC Price Prediction Summary"), so this is not the older session-description marker-stripping bug.
Expected
A single streaming assistant message that fills in progressively, as `claude-code` and other non-ACP providers produce.
Same prompt via CLI
`goose run --provider claude-acp --model current --text "..."` produces a single clean flowing-prose response — so this is a Desktop-rendering symptom of the backend `Message` shape, not an issue with the ACP adapter or the model output itself.
Root cause
Backend (Rust) — `crates/goose/src/acp/provider.rs:397–410`, inside the `stream()` loop:

`Message::assistant()` in `crates/goose/src/conversation/message.rs:709–717` creates a `Message` with `id: None` and `created: Utc::now().timestamp()` on every call. So every inbound chunk becomes a distinct message with no id and a fresh timestamp.
Frontend (Electron) — `ui/desktop/src/hooks/useChatStream.ts:181`:

```ts
if (lastMsg?.id && lastMsg.id === incomingMsg.id) {
  // append text delta to lastMsg.content
}
```
Coalescing requires `lastMsg.id` to be truthy AND the ids to match. With `id: None` serializing to null/absent, the branch is never taken → every chunk becomes a new bubble.
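The effect can be sketched with a minimal stand-in for the coalescing reducer (illustrative only — `Msg` and `appendChunk` are simplified names, not the real `useChatStream.ts` types):

```typescript
// Simplified model of the Desktop's per-chunk coalescing behaviour.
interface Msg {
  id?: string;
  created: number;
  content: string;
}

// Mirrors the guard in useChatStream.ts: only coalesce when both ids exist and match.
function appendChunk(bubbles: Msg[], incoming: Msg): Msg[] {
  const last = bubbles[bubbles.length - 1];
  if (last?.id && last.id === incoming.id) {
    last.content += incoming.content; // append text delta to the existing bubble
    return bubbles;
  }
  return [...bubbles, incoming]; // no id match → render a new bubble
}

// ACP today: chunks arrive with no id → one bubble per chunk.
const acpChunks: Msg[] = [
  { created: 1, content: "B" },
  { created: 2, content: "ayesian reasoning" },
];
const acpBubbles = acpChunks.reduce(appendChunk, [] as Msg[]);
console.log(acpBubbles.length); // 2 — two separate bubbles

// claude_code pattern: one id + one timestamp stamped on every chunk → one bubble.
const stamped: Msg[] = acpChunks.map((c) => ({ ...c, id: "msg-1", created: 1 }));
const stampedBubbles = stamped.reduce(appendChunk, [] as Msg[]);
console.log(stampedBubbles.length); // 1
console.log(stampedBubbles[0].content); // "Bayesian reasoning"
```

The same chunks coalesce or fragment purely on the presence of a shared `id`, which is why the fix below is backend-only.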
Why claude_code streams cleanly
`crates/goose/src/providers/claude_code.rs` pre-generates one UUID + one timestamp per `stream()` call and attaches them to every partial chunk:

- L732: `let message_id = uuid::Uuid::new_v4().to_string();`
- L775: `let stream_timestamp = chrono::Utc::now().timestamp();`
- L809–816: `Message::new(Role::Assistant, stream_timestamp, vec![MessageContent::text(text)])`, then `partial_message.id = Some(message_id.clone());`, then `yield (Some(partial_message), None);`
All chunks in one response share id + timestamp → frontend coalesces.
Scope
Affects every ACP provider, since they share `crates/goose/src/acp/provider.rs`:

- claude-acp
- amp-acp
- codex-acp
- copilot-acp
- pi-acp
Proposed fix
Minimal patch matching the existing `claude_code` pattern — one id + timestamp per `stream()` call, attached to `Text` and `Thought` yields only. Other yields (denial message, action-required) intentionally keep fresh `Message::assistant()` since they are logically distinct messages.
```diff
--- a/crates/goose/src/acp/provider.rs
+++ b/crates/goose/src/acp/provider.rs
@@ -390,6 +390,8 @@
             .map_err(|_| ProviderError::RequestFailed("goose_mode lock poisoned".into()))?;
         let reject_all_tools = goose_mode == GooseMode::Chat;

+        let message_id = uuid::Uuid::new_v4().to_string();
+        let stream_timestamp = chrono::Utc::now().timestamp();
         Ok(Box::pin(try_stream! {
             let mut suppress_text = false;
             let mut rejected_tool_calls: HashSet<String> = HashSet::new();
@@ -398,14 +400,22 @@
             match update {
                 AcpUpdate::Text(text) => {
                     if !suppress_text {
-                        let message = Message::assistant().with_text(text);
+                        let mut message = Message::new(
+                            Role::Assistant,
+                            stream_timestamp,
+                            vec![MessageContent::text(text)],
+                        );
+                        message.id = Some(message_id.clone());
                         yield (Some(message), None);
                     }
                 }
                 AcpUpdate::Thought(text) => {
-                    let message = Message::assistant()
+                    let mut message = Message::new(
+                        Role::Assistant,
+                        stream_timestamp,
+                        vec![],
+                    )
                         .with_thinking(text, "")
                         .with_visibility(true, false);
+                    message.id = Some(message_id.clone());
                     yield (Some(message), None);
                 }
```
Happy to open a PR if that's useful — otherwise flagging for whoever picks this up.
Environment

- `@agentclientprotocol/claude-agent-acp`: 0.30.0 (npm global)
- `claude-acp` / `current`

Reproduction

1. `npm install -g @agentclientprotocol/claude-agent-acp`
2. `~/.config/goose/config.yaml`:
Related / Not related

- Not the `generate_simple_session_description` marker-stripping bug — session titles are clean in 1.31.1.
- Not `claude-agent-acp` package behavior — the ACP client emits chunks correctly per spec; the issue is on goose's consumer side.