Learn Agentic AI · 10 min read

Optimistic UI for Agent Interactions: Showing Immediate Feedback Before Server Response

Learn how to implement optimistic updates in AI agent chat interfaces to provide instant feedback, handle rollbacks on failure, and manage loading states for the best user experience.

The Latency Problem in Agent Interfaces

When a user sends a message to an AI agent, the round trip involves network transit, model inference, and response generation. This can take anywhere from 500 milliseconds to 30 seconds depending on the model and task complexity. Without optimistic UI, the user stares at a blank space after hitting send, wondering whether their message was received. Optimistic updates solve this by immediately showing the user's message in the chat and displaying a typing indicator while the agent processes the request.

The Optimistic Update Pattern

The core idea: update the UI immediately as if the server request succeeded, then reconcile the state when the actual response arrives. If the request fails, roll back the optimistic change and show an error.

interface ChatMessage {
  id: string;
  role: "user" | "assistant";
  content: string;
  status: "optimistic" | "confirmed" | "error";
  error?: string;
  timestamp: Date;
}

type ChatAction =
  | { type: "ADD_OPTIMISTIC"; message: ChatMessage }
  | { type: "CONFIRM"; tempId: string; realId: string }
  | { type: "ADD_RESPONSE"; message: ChatMessage }
  | { type: "MARK_ERROR"; id: string; error: string }
  | { type: "RETRY"; id: string };

function chatReducer(
  state: ChatMessage[],
  action: ChatAction
): ChatMessage[] {
  switch (action.type) {
    case "ADD_OPTIMISTIC":
      return [...state, action.message];

    case "CONFIRM":
      return state.map((m): ChatMessage =>
        m.id === action.tempId
          ? { ...m, id: action.realId, status: "confirmed" }
          : m
      );

    case "ADD_RESPONSE":
      return [...state, action.message];

    case "MARK_ERROR":
      // Store the error so the UI can surface it next to the message.
      return state.map((m): ChatMessage =>
        m.id === action.id
          ? { ...m, status: "error", error: action.error }
          : m
      );

    case "RETRY":
      // Clear the previous error when the user retries.
      return state.map((m): ChatMessage =>
        m.id === action.id
          ? { ...m, status: "optimistic", error: undefined }
          : m
      );

    default:
      return state;
  }
}

Using a reducer instead of a bare useState call makes state transitions explicit and testable. Each action represents a clear step in the message lifecycle.
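For example, the optimistic-to-confirmed transition can be exercised without any UI at all. The sketch below uses a condensed, self-contained version of the reducer above (only the two happy-path actions) to walk one message through its lifecycle:

```typescript
// Condensed, self-contained version of the chatReducer lifecycle:
// a user message goes from optimistic to confirmed with a real id.
type Status = "optimistic" | "confirmed" | "error";

interface Msg {
  id: string;
  role: "user" | "assistant";
  content: string;
  status: Status;
}

type Action =
  | { type: "ADD_OPTIMISTIC"; message: Msg }
  | { type: "CONFIRM"; tempId: string; realId: string };

function reducer(state: Msg[], action: Action): Msg[] {
  switch (action.type) {
    case "ADD_OPTIMISTIC":
      return [...state, action.message];
    case "CONFIRM":
      return state.map((m): Msg =>
        m.id === action.tempId
          ? { ...m, id: action.realId, status: "confirmed" }
          : m
      );
  }
}

let state: Msg[] = [];
state = reducer(state, {
  type: "ADD_OPTIMISTIC",
  message: { id: "temp-1", role: "user", content: "hi", status: "optimistic" },
});
state = reducer(state, { type: "CONFIRM", tempId: "temp-1", realId: "srv-42" });

console.log(state[0].id, state[0].status); // srv-42 confirmed
```

Because the reducer is a pure function, this same pattern drops straight into a unit test with no React renderer involved.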

Implementing the Send Flow

Wire the reducer into a hook that manages the full send-and-receive lifecycle.

import { useReducer, useCallback } from "react";

function useOptimisticChat() {
  const [messages, dispatch] = useReducer(chatReducer, []);

  const sendMessage = useCallback(async (content: string) => {
    const tempId = crypto.randomUUID();
    const optimisticMsg: ChatMessage = {
      id: tempId,
      role: "user",
      content,
      status: "optimistic",
      timestamp: new Date(),
    };

    dispatch({ type: "ADD_OPTIMISTIC", message: optimisticMsg });

    try {
      const res = await fetch("/api/agent/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ message: content }),
      });

      if (!res.ok) throw new Error("Request failed");

      const data = await res.json();
      dispatch({
        type: "CONFIRM",
        tempId,
        realId: data.userMessageId,
      });
      dispatch({
        type: "ADD_RESPONSE",
        message: {
          id: data.agentMessageId,
          role: "assistant",
          content: data.response,
          status: "confirmed",
          timestamp: new Date(),
        },
      });
    } catch {
      dispatch({
        type: "MARK_ERROR",
        id: tempId,
        error: "Failed to send",
      });
    }
  }, []);

  return { messages, sendMessage, dispatch };
}

Visual Feedback for Message States

Each message state requires distinct visual treatment so users understand what is happening.


function MessageBubble({
  message,
  onRetry,
}: {
  message: ChatMessage;
  onRetry: (id: string) => void;
}) {
  const statusStyles: Record<ChatMessage["status"], string> = {
    optimistic: "opacity-70",
    confirmed: "opacity-100",
    error: "opacity-100 border-2 border-red-300",
  };

  return (
    <div className={`rounded-2xl px-4 py-2.5
                     ${statusStyles[message.status]}`}>
      <p>{message.content}</p>

      {message.status === "optimistic" && (
        <span className="text-xs text-gray-400 mt-1 block">
          Sending...
        </span>
      )}

      {message.status === "error" && (
        <div className="flex items-center gap-2 mt-2">
          <span className="text-xs text-red-500">
            Failed to send
          </span>
          <button
            onClick={() => onRetry(message.id)}
            className="text-xs text-blue-600 underline"
          >
            Retry
          </button>
        </div>
      )}
    </div>
  );
}

Optimistic messages render at reduced opacity so users can distinguish them at a glance from confirmed messages. Error messages get a red border and a retry button.

The Typing Indicator

While waiting for the agent response, show a typing indicator that appears after the user's confirmed message.

function TypingIndicator() {
  return (
    <div className="flex items-center gap-1 px-4 py-3">
      <div className="flex gap-1">
        {[0, 1, 2].map((i) => (
          <span
            key={i}
            className="w-2 h-2 bg-gray-400 rounded-full animate-bounce"
            style={{ animationDelay: `${i * 150}ms` }}
          />
        ))}
      </div>
      <span className="text-sm text-gray-500 ml-2">
        Agent is thinking...
      </span>
    </div>
  );
}
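Deciding when to render the indicator can be a small derived check rather than a separate piece of state. The helper below is a sketch (the name isAwaitingResponse is an assumption, not from any library): the indicator shows while the last message is a user message that has not errored, and hides as soon as the assistant's reply is appended.

```typescript
// Derive the typing-indicator flag from message state: the agent is
// "thinking" while the last message is a non-errored user message.
type Status = "optimistic" | "confirmed" | "error";

interface Msg {
  role: "user" | "assistant";
  status: Status;
}

function isAwaitingResponse(messages: Msg[]): boolean {
  const last = messages[messages.length - 1];
  if (!last) return false;
  return last.role === "user" && last.status !== "error";
}

console.log(isAwaitingResponse([{ role: "user", status: "optimistic" }])); // true
console.log(
  isAwaitingResponse([
    { role: "user", status: "confirmed" },
    { role: "assistant", status: "confirmed" },
  ])
); // false
```

In the chat view this becomes a one-liner after the message list, along the lines of {isAwaitingResponse(messages) && <TypingIndicator />}.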

Retry with Exponential Backoff

When a message fails, the retry button should not hammer the server. Implement exponential backoff for automatic retries.

async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 3
): Promise<T> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === maxRetries) throw err;
      const delay = Math.min(1000 * 2 ** attempt, 10_000);
      await new Promise((r) => setTimeout(r, delay));
    }
  }
  throw new Error("Unreachable");
}
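One optional refinement, not part of retryWithBackoff above: adding random jitter to the delay so that many clients failing at once do not all retry in lockstep. A full-jitter sketch of the delay calculation:

```typescript
// Full-jitter backoff: pick a uniform random delay between 0 and the
// capped exponential value, spreading retries across clients.
function backoffDelayWithJitter(attempt: number, capMs = 10_000): number {
  const exp = Math.min(1000 * 2 ** attempt, capMs);
  return Math.floor(Math.random() * exp);
}
```

To use it, swap the delay line in retryWithBackoff from the fixed schedule to backoffDelayWithJitter(attempt).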

FAQ

How do I handle optimistic updates for messages that trigger tool calls?

When the agent uses tools (web search, database queries, code execution), show an intermediate status like "Searching..." or "Running code..." between the user message and the final response. Add a toolCalls field to your message type and render each tool call as a collapsible section that shows the tool name, input, and output.
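A minimal sketch of that extension; the ToolCall shape and the toolStatusLabel helper are illustrative assumptions, not the API of any particular agent SDK:

```typescript
// Illustrative shapes only: attach tool-call progress to a message and
// map each call to the intermediate label the UI shows.
interface ToolCall {
  name: string; // e.g. "web_search"
  status: "running" | "done" | "failed";
  input: unknown;
  output?: unknown;
}

function toolStatusLabel(call: ToolCall): string {
  if (call.status === "running") return `Running ${call.name}...`;
  if (call.status === "failed") return `${call.name} failed`;
  return `${call.name} finished`;
}

const searchCall: ToolCall = {
  name: "web_search",
  status: "running",
  input: { query: "weather in Berlin" },
};

console.log(toolStatusLabel(searchCall)); // Running web_search...
```

Each rendered tool call can then expand to show the input and, once status is done, the output.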

Should I use TanStack Query's optimistic update feature instead of a custom reducer?

TanStack Query's onMutate / onError / onSettled pattern works well for CRUD operations with cache invalidation. However, chat messages are append-only and sequential, which makes a reducer more natural. The reducer gives you fine-grained control over the message lifecycle without fighting the cache invalidation model.

How do I prevent duplicate messages if the user double-clicks the send button?

Disable the send button immediately after the first click by checking the status of the last message. If the last message has status optimistic, disable the input. Additionally, debounce the submit handler and deduplicate by content hash on the server side.
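The client-side part of that check fits in one small predicate over the messages array (canSend is a hypothetical helper name, assuming the message shape used throughout this article):

```typescript
// Gate sending on the last message's status: while it is still
// optimistic, the previous send has not settled yet.
type Status = "optimistic" | "confirmed" | "error";

interface Msg {
  role: "user" | "assistant";
  status: Status;
}

function canSend(messages: Msg[]): boolean {
  const last = messages[messages.length - 1];
  return !last || last.status !== "optimistic";
}

console.log(canSend([])); // true
console.log(canSend([{ role: "user", status: "optimistic" }])); // false
```

Wire it to the button as disabled={!canSend(messages)} so a double-click while the first request is in flight is a no-op.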


#OptimisticUI #React #UXPatterns #TypeScript #ErrorHandling #AgenticAI #LearnAI #AIEngineering

CallSphere Team

Expert insights on AI voice agents and customer communication automation.
