# Chat
A flexible conversational UI with pluggable adapters and built-in feedback support.
The Chat component renders a message thread with a composer, optional welcome content, and optional error presentation. It is transport-agnostic: the component does not talk to a backend by itself. Instead, it uses a ChatAdapter to send user input and (optionally) to handle actions such as message feedback.
To integrate Chat into your product, you typically:
- Provide and manage `messages` and `loading` state.
- Implement a `ChatAdapter` (or use the built-in HTTP adapter) that updates the message state.
- Optionally provide welcome content, error handling, and feedback support.
## Basic usage
This example uses a small in-memory adapter and keeps all state in the parent. It demonstrates the core API: adapter, messages, and loading.
```tsx
const { adapter, messages, loading, error, resetChat } = useMockedAdapter();
const welcomeChoices: Choice[] = [
  {
    label: "Where are users dropping off or struggling?",
    icon: (
      <Icon>
        <IconAreaChart />
      </Icon>
    ),
  },
  {
    label: "Compare traffic and conversions to last month.",
    icon: (
      <Icon>
        <IconSite />
      </Icon>
    ),
  },
  {
    label: "Show me unusual patterns in user behavior.",
    icon: (
      <Icon>
        <IconCampaign />
      </Icon>
    ),
  },
];
return (
  <div style={{ height: "800px" }}>
    <Chat
      adapter={adapter}
      messages={messages}
      loading={loading}
      onClose={resetChat}
      welcomeHeading="Let’s dive into your Analytics data"
      welcomeText="If you're looking to explore visitor behavior, page performance, or traffic trends, you're in the right place."
      composerPlaceholder="Type a question or choose a prompt to get started. Messages with 'notify' or 'interrupt' will simulate errors."
      welcomeChoices={welcomeChoices}
      error={error}
    />
  </div>
);
```

## API
### Chat props
The Chat component is controlled: you own the state. The adapter is responsible for updating messages, loading, and any error state.
| Prop | Type | Description |
|---|---|---|
| adapter | ChatAdapter | Required. The transport + orchestration layer. Called when the user sends a prompt and when the UI triggers actions such as feedback. |
| messages | ChatMessage[] | The message thread to render. When empty, the welcome state can be shown. |
| loading | boolean | Whether the assistant is currently working. When true, the list shows a reasoning indicator. |
| loadingMessage | string | Optional label next to the reasoning indicator. |
| onClose | () => void | Called when the user closes the chat (for example when the chat is rendered inside a floating container). |
| composerPlaceholder | string | Placeholder for the text input in the composer. |
| welcomeHeading | string | Heading shown when there are no messages yet. |
| welcomeText | string | Supporting text shown in the welcome state. |
| welcomeChoices | Choice[] | Optional prompt suggestions shown in the welcome state. When clicked, the choice label is sent as a prompt. |
| error | ChatError \| null | Optional error state. Use displayMode: "interrupt" to replace the chat with a retry screen, or displayMode: "notify" to show a non-blocking message. |
## Message types
Messages are rendered based on their payload type. This enables structured UI responses while keeping the message model explicit and type-safe.
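The type-safe rendering follows from the payload discriminant. The sketch below is a simplified illustration with only two of the documented payload types; `describePayload` is an illustrative helper, not part of the Chat API.

```typescript
// Simplified sketch of a payload discriminated union. Only two of the
// documented payload types are shown; `describePayload` is illustrative
// and not part of the Chat API.
type ChatMessagePayload =
  | { type: "markdown"; markdown: string }
  | { type: "choices"; choices: Array<{ label: string }> };

function describePayload(payload: ChatMessagePayload): string {
  // Narrowing on `payload.type` gives type-safe access to each variant.
  switch (payload.type) {
    case "markdown":
      return `markdown (${payload.markdown.length} chars)`;
    case "choices":
      return `choices (${payload.choices.length} options)`;
  }
}

console.log(describePayload({ type: "markdown", markdown: "Hello" }));
// markdown (5 chars)
```

A renderer that switches on `payload.type` this way gets exhaustiveness checking for free: adding a new payload variant produces a compile error until every switch handles it.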
### Markdown messages
Use markdown messages for regular free-form content. Markdown is rendered using the platform styling.
Payload shape:
```ts
type MarkdownChatMessagePayload = {
  type: "markdown";
  markdown: string;
};
```
```tsx
const [messages] = useState<ChatMessage[]>([
  {
    id: "1234",
    role: "assistant",
    payload: {
      type: "markdown",
      markdown: `
## Sample Markdown Message
This is an example of a markdown message rendered in the Chat component.
- It supports **bold** and *italic* text.
- You can create lists:
1. First item
2. Second item
3. Third item
`,
    },
    createdAt: new Date(),
  },
]);
const adapter: ChatAdapter = {
  send: async () => {},
};
return (
  <div style={{ height: "420px" }}>
    <Chat adapter={adapter} messages={messages} loading={false} {...commonProps} />
  </div>
);
```

### Table messages
Use table messages for displaying tabular data. Tables are rendered as Fancy Table components.
Payload shape:
```ts
type TableChatMessagePayload = {
  type: "table";
  table: TableProps<TDto>;
};
```
```tsx
type DishItem = { id: number; title: string; cookTime: number; servings: number };
const items: DishItem[] = [
  { id: 1, title: "Lasagna", cookTime: 45, servings: 2 },
  { id: 2, title: "Pancakes", cookTime: 20, servings: 4 },
  { id: 3, title: "Sushi", cookTime: 90, servings: 6 },
  { id: 4, title: "Cake", cookTime: 30, servings: 8 },
];
const [messages] = useState<ChatMessage[]>([
  {
    id: "1234",
    role: "assistant",
    payload: {
      type: "table",
      table: {
        columns: [
          {
            header: {
              content: "Dish",
              "data-observe-key": "table-header-dish",
            },
            render: (dto: DishItem) => dto.title,
            options: {
              isKeyColumn: true,
            },
          },
          {
            header: {
              content: "Cook Time",
              tooltip: "in minutes",
              "data-observe-key": "table-header-cook-time",
            },
            render: (dto: DishItem) => dto.cookTime,
          },
          {
            header: {
              content: "Servings",
              tooltip: "in persons",
              "data-observe-key": "table-header-servings",
            },
            render: (dto: DishItem) => dto.servings,
          },
        ],
        items,
        sort: null,
        setSort: () => {},
        loading: false,
        caption: "Basic usage table",
      },
    },
    createdAt: new Date(),
  } as ChatMessage,
]);
const adapter: ChatAdapter = {
  send: async () => {},
};
return (
  <div style={{ height: "420px" }}>
    <Chat adapter={adapter} messages={messages} loading={false} {...commonProps} />
  </div>
);
```

### Choices messages
Use choices messages when the assistant should guide the user with suggested actions. Choices are similar in style to Fancy interactive Pill components, but have been adjusted slightly to support multi-line content.
Payload shape:
```ts
type ChoicesChatMessagePayload = {
  type: "choices";
  choices: Array<{ label: string; icon?: ReactElement }>;
};
```
The welcome view also uses the same Choice type. This makes it easy to reuse the same prompt suggestions both before and during the conversation.
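As a minimal sketch of this reuse (with `Choice` reduced to its `label` field for brevity; the suggestion texts are invented):

```typescript
// Sketch: reuse one suggestion list before and during the conversation.
// `Choice` is reduced to its `label` field here for brevity.
type Choice = { label: string };

const suggestions: Choice[] = [
  { label: "Show weekly traffic" },
  { label: "Compare conversions to last month" },
];

// Before the conversation: pass the list as `welcomeChoices={suggestions}`.
// During the conversation: embed the same list in a choices payload.
const followUpPayload = { type: "choices" as const, choices: suggestions };

console.log(followUpPayload.choices.length);
// 2
```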
```tsx
const [messages] = useState<ChatMessage[]>([
  {
    id: "1234",
    role: "assistant",
    payload: {
      type: "choices",
      choices: [
        {
          label: "Option 1",
          icon: (
            <Icon>
              <IconOpenNew />
            </Icon>
          ),
        },
        {
          label: "Option 2",
        },
        {
          label: "Option 3",
        },
      ],
    },
    createdAt: new Date(),
  },
]);
const adapter: ChatAdapter = {
  send: async () => {},
};
return (
  <div style={{ height: "420px" }}>
    <Chat adapter={adapter} messages={messages} loading={false} {...commonProps} />
  </div>
);
```

### Chart messages
Use chart messages for chart visualization. Charts are rendered as Fancy Chart components.
Payload shape:
```ts
type ChartChatMessagePayload = {
  type: "chart";
  chart: ChartWithoutAnnotationProps | ChartWithAnnotationProps;
};
```
```tsx
const [messages] = useState<ChatMessage[]>([
  {
    id: "1234",
    role: "assistant",
    payload: {
      type: "chart",
      chart: {
        options: {
          series: [
            {
              type: "bar",
              name: "Series 1",
              data: [25, 23, 48, 52, 52, 57, 56, 61, 66].map((val, i) => {
                return {
                  x: i,
                  y: val,
                  name: i.toString(),
                };
              }),
            },
            {
              type: "bar",
              name: "Series 2",
              data: [18, 26, 36, 44, 49, 59, 65, 71, 82].map((val, i) => {
                return {
                  x: i,
                  y: val,
                  name: i.toString(),
                };
              }),
            },
          ],
        },
      },
    },
    createdAt: new Date(),
  },
]);
const adapter: ChatAdapter = {
  send: async () => {},
};
return (
  <div style={{ height: "500px" }}>
    <Chat adapter={adapter} messages={messages} loading={false} {...commonProps} />
  </div>
);
```

## Adapters
The adapter is the only piece that knows how to communicate with your backend. It receives a ChatMessageDraft when the user sends something.
```ts
type ChatMessageDraft = {
  content: string;
  type: "prompt" | "choice";
};

type ChatAdapter = {
  send: (draft: ChatMessageDraft) => Promise<void>;
  act?: (action: ChatAction) => Promise<void>;
};
```
The Chat UI is controlled; therefore the adapter is expected to update your message state. A common pattern is:
- Append the user message to `messages` as soon as the user sends.
- Set `loading` while your backend request is in flight.
- Append the assistant response as one (or more) `ChatMessage` items.
```tsx
const [messages, setMessages] = useState<ChatMessage[]>([]);
const [loading, setLoading] = useState<boolean>(false);
const mockAdapter: ChatAdapter = {
  send: async (draft: ChatMessageDraft) => {
    // Depending on the use case, you might branch on draft.type to handle
    // prompts and choices differently. This demo treats them the same.
    setMessages((prev) => [
      ...prev,
      {
        id: crypto.randomUUID(),
        role: "user",
        payload: { type: "markdown", markdown: draft.content },
        createdAt: new Date(),
      },
    ]);
    setLoading(true);
    await sleep(800);
    setMessages((prev) => [
      ...prev,
      {
        id: crypto.randomUUID(),
        role: "assistant",
        payload: {
          type: "markdown",
          markdown: `You said: ${draft.content}`,
        },
        createdAt: new Date(),
      },
    ]);
    setLoading(false);
  },
  act: async (action: ChatAction) => {
    if (action.type === "feedback") {
      const { feedbackValue, feedbackContext } = action.payload;
      setMessages((prev) =>
        prev.map((msg) =>
          msg.id === feedbackContext.id ? { ...msg, feedback: feedbackValue } : msg
        )
      );
    }
  },
};
return (
  <div style={{ height: "800px" }}>
    <Chat
      adapter={mockAdapter}
      messages={messages}
      loading={loading}
      onClose={() => setMessages([])}
      {...commonProps}
    />
  </div>
);
```

### Built-in HTTP adapter
If your backend is reachable via HTTP, you can use the built-in hook useHttpChatAdapter. The hook owns the messages and loading state, and it returns an adapter compatible with Chat.
```tsx
const { adapter, messagesState, loadingState } = useHttpChatAdapter({
  baseUrl: "https://api.example.com",
  endpoints: {
    chat: "/chat",
    feedback: "/feedback",
  },
  // optional customizations
  mapInbound: (data): ChatMessage => ({
    id: data.id,
    role: "assistant",
    payload: data.payload ?? { type: "markdown", markdown: data.content ?? "" },
    createdAt: data.createdAt ? new Date(data.createdAt) : new Date(),
    feedback: data.feedback ?? undefined,
  }),
  // mock fetch implementation for demonstration purposes
  fetchImpl: async (input: RequestInfo | URL, init?: RequestInit) => {
    console.log("HTTP Adapter fetch called with:", input, init);
    await sleep(800);
    // Feedback endpoint demo
    if (String(input).endsWith("/feedback")) {
      return new Response(JSON.stringify({ ok: true }), {
        status: 200,
        headers: { "Content-Type": "application/json" },
      });
    }
    const body = JSON.parse(String(init?.body ?? "{}"));
    return new Response(
      JSON.stringify({
        id: crypto.randomUUID(),
        content: "You sent: " + (body?.messages?.[0]?.content ?? ""),
      }),
      { status: 200, headers: { "Content-Type": "application/json" } }
    );
  },
});
const [messages, setMessages] = messagesState;
const [loading] = loadingState;
return (
  <div style={{ height: "800px" }}>
    <Chat
      adapter={adapter}
      messages={messages}
      loading={loading}
      onClose={() => setMessages([])}
      {...commonProps}
    />
  </div>
);
```

By default, the HTTP adapter:
- Serializes outbound drafts as `{ messages: [{ role: 'user', content: string }] }`.
- Maps inbound JSON into an assistant markdown message.
- Optionally posts feedback to a dedicated feedback endpoint when configured.
You can customize the adapter using:
- `composeMessage` to control how user messages are appended locally.
- `mapOutbound` to match your backend request schema.
- `mapInbound` to map the backend response into a `ChatMessage`.
- `headers` and `fetchImpl` for auth and testing.
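As a hedged sketch of the outbound mapping: the exact `mapOutbound` signature (draft in, request body out) is an assumption inferred from the default serialization described above, and the `{ query, source }` schema is an invented backend shape; check the hook's typings for the real contract.

```typescript
// Hypothetical sketch of a custom outbound mapping. The signature
// (draft in, request body out) is assumed from the defaults described
// in the docs; `query` and `source` are an invented backend schema.
type ChatMessageDraft = { content: string; type: "prompt" | "choice" };

const mapOutbound = (draft: ChatMessageDraft) => ({
  // A backend expecting `{ query, source }` instead of the default
  // `{ messages: [...] }` envelope (illustrative only).
  query: draft.content,
  source: draft.type,
});

console.log(JSON.stringify(mapOutbound({ content: "Top pages?", type: "prompt" })));
// {"query":"Top pages?","source":"prompt"}
```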
## Feedback
Assistant markdown messages can store feedback via the optional `feedback` field. The UI reports feedback through the adapter's `act` action with `type: "feedback"`.
Feedback values use MessageFeedbackType:
```ts
enum MessageFeedbackType {
  POSITIVE = 1,
  NEGATIVE = -1,
  NEUTRAL = 0,
}
```

The built-in HTTP adapter will automatically:
- POST feedback to `endpoints.feedback` when provided.
- Update the `feedback` field in local message state after a successful call.
```tsx
const initialAssistant: ChatMessage = {
  id: "1234",
  role: "assistant",
  payload: {
    type: "markdown",
    markdown:
      "This message supports feedback. Use the message actions to rate it and optionally leave a comment.",
  },
  createdAt: new Date(),
  feedback: MessageFeedbackType.NEUTRAL,
};
const [messages, setMessages] = useState<ChatMessage[]>([initialAssistant]);
const [loading] = useState<boolean>(false);
const adapter: ChatAdapter = {
  send: async () => {
    // not needed for this demo
  },
  act: async (action: ChatAction) => {
    if (action.type === "feedback") {
      const { feedbackValue, feedbackContext } = action.payload;
      // In a real adapter, you would also persist feedback to your backend.
      setMessages((prev) =>
        prev.map((m) => (m.id === feedbackContext.id ? { ...m, feedback: feedbackValue } : m))
      );
    }
  },
};
return (
  <div style={{ height: "500px" }}>
    <Chat adapter={adapter} messages={messages} loading={loading} {...commonProps} />
  </div>
);
```

## Errors
Provide error to render either a blocking interrupt state or a non-blocking notification.
```tsx
const { adapter, messages, loading } = useMockedAdapter();
const [mode, setMode] = useState<"none" | "notify" | "interrupt">("none");
const error: ChatError | null =
  mode === "notify"
    ? {
        displayMode: "notify",
        message: "This is a non-blocking notification error. The chat remains usable.",
      }
    : mode === "interrupt"
      ? {
          displayMode: "interrupt",
          heading: "We couldn't connect to the assistant.",
          message: "This is a blocking interrupt error. Provide an action to retry.",
          action: {
            label: "Try again",
            callback: () => setMode("none"),
          },
        }
      : null;
return (
  <div style={{ height: "650px" }}>
    <Content gap="small" padding="none">
      <Button.Group aria-label="Error mode">
        <Button onClick={() => setMode("none")}>No error</Button>
        <Button onClick={() => setMode("notify")}>Notify error</Button>
        <Button onClick={() => setMode("interrupt")} variant="destructive">
          Interrupt error
        </Button>
      </Button.Group>
    </Content>
    <div style={{ height: "560px", marginTop: 16 }}>
      <Chat
        adapter={adapter}
        messages={messages}
        loading={loading}
        error={error}
        {...commonProps}
      />
    </div>
  </div>
);
```

## Layout
### Floating container
Use ChatFloatingContainer when you need a floating chat that overlays page content. The container handles positioning and the close affordance.
```tsx
const { adapter, messages, loading, error, resetChat } = useMockedAdapter();
const [shown, setShown] = useState<boolean>(false);
return (
  <>
    <Button onClick={() => setShown(!shown)} variant="primary">
      Toggle chat
    </Button>
    <ChatFloatingContainer shown={shown} onClose={() => setShown(false)}>
      <Chat
        adapter={adapter}
        messages={messages}
        loading={loading}
        onClose={() => {
          setShown(false);
          resetChat();
        }}
        error={error}
        {...commonProps}
      />
    </ChatFloatingContainer>
  </>
);
```