Jan 11, 2026 · AI · Reliability · UX · Data

Shipping update - 2026-01-11

Shipped real-time AI chat streaming with thinking content, fixed race conditions in chat state, improved dark-mode markdown rendering, and cleaned up the assistant UX.

Highlights

  • Real-time AI chat streaming: responses now arrive token-by-token via Server-Sent Events, and the system surfaces intermediate thinking/reasoning content alongside the final answer, giving users visibility into the assistant's process.
  • AI assistant stability: fixed race conditions where stale chat data could override current state, added loading indicators while chats load, and implemented optimistic rollback on failure so the UI stays consistent even when network calls fail.
  • Chat input now clears immediately on send instead of waiting for the API response, eliminating a confusing UX lag where the user's message lingered in the input field during async processing.
  • Dark-mode markdown tables in chat are now styled with proper borders, backgrounds, and text colors, making AI-generated tabular data readable in both themes.
  • The logs column-visibility migration now uses reserved prefixes instead of the previous naming convention, fixing inconsistencies in saved column layouts.
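
As a rough sketch of how token-by-token SSE delivery can be handled on the client: the parser below splits a raw stream chunk into events, distinguishing intermediate thinking content from answer tokens. The event names ("thinking", the default "message") and frame shapes are illustrative assumptions, not the actual wire format.

```typescript
// Minimal SSE frame parser. Frames are separated by a blank line;
// each frame may carry an "event:" name and one or more "data:" lines.
type ChatEvent = { event: string; data: string };

function parseSSE(chunk: string): ChatEvent[] {
  const events: ChatEvent[] = [];
  for (const frame of chunk.split("\n\n")) {
    let event = "message"; // SSE default event name
    const data: string[] = [];
    for (const line of frame.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) data.push(line.slice(5).trim());
    }
    // Frames with no data lines (comments, keep-alives) are skipped.
    if (data.length > 0) events.push({ event, data: data.join("\n") });
  }
  return events;
}
```

In practice this runs inside a `ReadableStream` read loop over the response body, appending "thinking" events to a reasoning panel and "message" events to the answer as they arrive.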

User outcomes

  • AI assistant responses stream in real-time, so users see progress immediately instead of waiting for the full response to generate.
  • Chat state is more reliable: loading spinners appear when data is being fetched, and the interface recovers gracefully from failed requests without losing the current conversation.
  • The assistant input field feels snappier since it clears the moment you send, matching the expected behavior of modern chat interfaces.
  • AI-generated tables are now fully readable in dark mode with consistent styling.
  • Saved log column layouts using the previous format are migrated automatically to the new prefix convention.

Technical wins

  • Implemented SSE-based streaming in the conversation API with incremental token delivery, plus a thread history system that persists thinking content alongside final messages for context continuity.
  • Resolved chat state race conditions by consolidating message creation, input clearing, and loading state into a single atomic setState call via a buildMessage helper, preventing interleaved state updates during async sends.
  • Added explicit table renderers (table, thead, tbody, tr, th, td) to the AI markdown renderer with Tailwind classes for dark-mode borders and backgrounds, rather than relying on global CSS that missed nested chat contexts.
  • Built optimistic chat preview and upsert logic with rollback: the UI immediately shows new chats while the API call is in-flight, reverting cleanly if the request fails.
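
The consolidated state update described above can be sketched as a pure helper that computes the entire next state in one step; the `ChatState` and `Message` shapes here are hypothetical stand-ins for the actual types:

```typescript
type Message = { role: "user" | "assistant"; content: string };
type ChatState = { messages: Message[]; input: string; loading: boolean };

// buildMessage returns the whole next state at once, so appending the
// message, clearing the input, and setting the loading flag can never
// interleave with another update during an async send.
function buildMessage(prev: ChatState, content: string): ChatState {
  return {
    messages: [...prev.messages, { role: "user", content }],
    input: "",     // clear the input immediately on send
    loading: true, // show the spinner until the API responds
  };
}

// In React this feeds a single functional setState call:
//   setChat(prev => buildMessage(prev, draft));
```

Because the helper is pure, the same transition is trivially unit-testable without mounting any UI.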
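
The optimistic preview with rollback can be sketched like this, under stated assumptions: the `Chat` shape, the `send` API call, and the `render` callback are all hypothetical placeholders for the real implementation:

```typescript
type Chat = { id: string; title: string; pending?: boolean };

// Show the new chat immediately; replace it with the server copy on
// success, or revert to the pre-send list if the request fails.
async function optimisticCreate(
  chats: Chat[],
  draft: Chat,
  send: (c: Chat) => Promise<Chat>,
  render: (cs: Chat[]) => void,
): Promise<Chat[]> {
  render([...chats, { ...draft, pending: true }]); // optimistic preview
  try {
    const saved = await send(draft);
    const next = [...chats, saved]; // upsert the confirmed chat
    render(next);
    return next;
  } catch {
    render(chats); // rollback: the UI returns to its previous state
    return chats;
  }
}
```

The key design point is that the pre-send list is kept untouched, so rollback is just re-rendering the old value rather than undoing individual mutations.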

Notes

  • Sensitive/internal details have been redacted.
  • An infrastructure fix for multi-region service URLs was attempted and reverted for further investigation, resulting in no net change.
