Chat History

Guide to accessing chat history with the EVI.

EVI records detailed conversation histories, enabling developers to review and analyze past chat sessions. This guide introduces Chats and Chat Groups, and explains how to retrieve chat transcripts, expression measurements, and reconstructed audio.

If data retention is disabled, chat history is not recorded. Past chat data and audio reconstructions for such sessions cannot be accessed or retrieved.

Chats vs Chat Groups

EVI organizes conversation history into two levels: Chats and Chat Groups.

  • Chats represent individual sessions, beginning when a WebSocket connection is established and ending when it closes. Each chat contains the messages and events recorded during that session.
  • Chat Groups link related chats to maintain continuity across multiple sessions. A group can contain one or more chats, allowing conversations to persist even when users disconnect and reconnect.

By default, a new chat session creates a new chat group. If the session resumes a previous conversation, the new chat is added to the existing chat group, preserving the full interaction history and context across sessions.
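When connecting over the EVI WebSocket, resuming is done by supplying the previous session's Chat Group ID as a query parameter. The sketch below builds the connection URL by hand; the `resumed_chat_group_id` parameter name follows EVI's resumability behavior, but verify it against the current API reference before relying on it.

```typescript
// Minimal sketch: build the EVI WebSocket URL, optionally resuming a prior
// conversation. Passing a chat group ID appends the new chat to that group;
// omitting it starts a fresh chat group.
function buildChatUrl(apiKey: string, chatGroupId?: string): string {
  const url = new URL("wss://api.hume.ai/v0/evi/chat");
  url.searchParams.set("api_key", apiKey);
  if (chatGroupId) {
    // Resume: link this session to the existing chat group
    url.searchParams.set("resumed_chat_group_id", chatGroupId);
  }
  return url.toString();
}
```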

Fetching Chats & Chat Groups

Each Chat has a unique chat_id and a chat_group_id that links it to its corresponding Chat Group. Similarly, each Chat Group has its own ID, allowing you to retrieve individual sessions or entire sequences of related interactions.

Chat ID

Use the list Chats endpoint to fetch chats. The returned chat_id can be used to fetch chat details or resume a previous session.

curl -G https://api.hume.ai/v0/evi/chats \
  -H "X-Hume-Api-Key: <YOUR_API_KEY>" \
  -d page_number=0 \
  -d page_size=10 \
  -d ascending_order=false

Chat Group ID

Every chat includes a chat_group_id that identifies the group it belongs to. To fetch chat groups directly, use the list Chat Groups endpoint. This is useful for retrieving all chats that are part of an ongoing conversation.

curl -G https://api.hume.ai/v0/evi/chat_groups \
  -H "X-Hume-Api-Key: <YOUR_API_KEY>" \
  -d page_number=0 \
  -d page_size=1 \
  -d ascending_order=false

From chat_metadata

You can also extract both IDs at the start of every session via the chat_metadata message. This is useful for associating downstream actions or data with the active chat session.

chat_metadata
{
  "type": "chat_metadata",
  "chat_group_id": "369846cf-6ad5-404d-905e-a8acb5cdfc78",
  "chat_id": "470a49f6-1dec-4afe-8b61-035d3b2d63b0",
  "request_id": "73c75efd-afa2-4e24-a862-91096b0961362258039"
}
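If you are handling raw WebSocket frames rather than using an SDK, both IDs can be captured from the first chat_metadata frame. A minimal sketch, assuming each frame arrives as a JSON string:

```typescript
// Shape of the chat_metadata message shown above
interface ChatMetadata {
  type: "chat_metadata";
  chat_group_id: string;
  chat_id: string;
  request_id: string;
}

// Extract both IDs from an incoming frame; returns undefined for
// every other message type so it can sit in a generic message handler.
function extractChatIds(raw: string): { chatId: string; chatGroupId: string } | undefined {
  const message = JSON.parse(raw);
  if (message.type !== "chat_metadata") return undefined;
  const metadata = message as ChatMetadata;
  return { chatId: metadata.chat_id, chatGroupId: metadata.chat_group_id };
}
```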

Viewing Chats in the Platform UI

You can also explore chat history and retrieve Chat IDs directly through the Platform UI:

  1. Visit the Chat history page to see a paginated list of past chats. Each entry displays key information such as the Chat ID, timestamp, event count, and duration.

    Platform UI chat history page
  2. Click “Open” on any chat to view its full details. The chat details page includes the Chat ID, Chat Group ID, start and end timestamps, duration, status, associated Config ID (if applicable), and a paginated list of recorded chat events.

    Platform UI chat details page

Chat Events

Each Chat consists of a sequence of predefined events that represent everything that occurred during the session.

The table below outlines each event type and its purpose.

Chat Event Description
SYSTEM_PROMPT The system prompt used to initialize the session.
CHAT_START_MESSAGE Marks the beginning of the chat session.
USER_RECORDING_START_MESSAGE Marks when the client began streaming audio.
USER_MESSAGE A message sent by the user.
USER_INTERRUPTION A user-initiated interruption while the assistant is speaking.
AGENT_MESSAGE A response generated by the assistant.
FUNCTION_CALL A record of a tool invocation by the assistant.
FUNCTION_CALL_RESPONSE The result of a previously invoked function or tool.
PAUSE_ONSET Marks when the client sent a pause_assistant_message.
RESUME_ONSET Marks when the client sent a resume_assistant_message.

CHAT_END_MESSAGE Indicates the end of the chat session.
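As a quick sanity check on a fetched event list, you can tally events by type, for example to confirm a session contains the expected CHAT_START_MESSAGE and CHAT_END_MESSAGE markers. A small sketch, with the event shape reduced to the single field it needs:

```typescript
// Count chat events by type. Accepts any object carrying a `type` field,
// so it works directly on events returned by the Chat Events API.
function countEventTypes(events: { type: string }[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const event of events) {
    counts[event.type] = (counts[event.type] ?? 0) + 1;
  }
  return counts;
}
```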

Fetching Chat Events

The Chat Events API lets you retrieve detailed event data for a specific Chat or an entire Chat Group. Each event represents a message, action, or system signal recorded during a session. You can use these endpoints to reconstruct transcripts, analyze interactions, and extract emotion predictions.

Fetching events for a Chat

Use the /chats/{chat_id}/events endpoint to fetch events for a single Chat:

curl -G https://api.hume.ai/v0/evi/chats/<YOUR_CHAT_ID>/events \
  -H "X-Hume-Api-Key: <YOUR_API_KEY>" \
  -d page_number=0 \
  -d page_size=10 \
  -d ascending_order=false

Fetching events for a Chat Group

Use the /chat_groups/{chat_group_id}/events endpoint to fetch events across Chats within a Chat Group:

curl -G https://api.hume.ai/v0/evi/chat_groups/<YOUR_CHAT_GROUP_ID>/events \
  -H "X-Hume-Api-Key: <YOUR_API_KEY>" \
  -d page_number=0 \
  -d page_size=10 \
  -d ascending_order=false
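Both endpoints are paginated, so retrieving a full session means walking pages until the last one. The helper below keeps the pagination logic separate from the transport: the injected `fetchPage` function is a placeholder that would, in practice, wrap the GET request above with the appropriate page_number and page_size parameters.

```typescript
// Collect every event for a chat by requesting successive pages until a
// short (or empty) page signals the end. Transport-agnostic: the caller
// supplies the page-fetching function.
async function fetchAllEvents<T>(
  fetchPage: (pageNumber: number, pageSize: number) => Promise<T[]>,
  pageSize = 100
): Promise<T[]> {
  const allEvents: T[] = [];
  for (let pageNumber = 0; ; pageNumber++) {
    const page = await fetchPage(pageNumber, pageSize);
    allEvents.push(...page);
    if (page.length < pageSize) break; // last page reached
  }
  return allEvents;
}
```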

Parsing Chat Events

Chat events provide a structured record of each conversation, capturing both transcribed messages and expression measurements over time. Use this data to generate readable transcripts, analyze sentiment, and build visualizations of user–assistant interactions.

The following examples show how to work with chat event data using the Hume SDKs:

Chat transcription

Conversation transcripts can be reconstructed from USER_MESSAGE and AGENT_MESSAGE events. These events include the speaker’s role, timestamp, and message text, allowing you to format the dialogue into a readable script.

The following example extracts a chat transcript from a list of events and writes it to a text file:

import fs from "fs";
import { ReturnChatEvent } from "hume/api/resources/empathicVoice";

function generateTranscript(chatEvents: ReturnChatEvent[], chatId: string): void {
  // Filter events for user and assistant messages
  const relevantChatEvents = chatEvents.filter(
    (chatEvent) => chatEvent.type === "USER_MESSAGE" || chatEvent.type === "AGENT_MESSAGE"
  );

  // Map each relevant event to a formatted line
  const transcriptLines = relevantChatEvents.map((chatEvent) => {
    const role = chatEvent.role === "USER" ? "User" : "Assistant";
    const timestamp = new Date(chatEvent.timestamp).toLocaleString();
    return `[${timestamp}] ${role}: ${chatEvent.messageText}`;
  });

  // Join all lines into a single transcript string
  const transcript = transcriptLines.join("\n");

  // Name the transcript file after the chat it came from
  const transcriptFileName = `transcript_${chatId}.txt`;

  // Write the transcript to a text file
  try {
    fs.writeFileSync(transcriptFileName, transcript, "utf8");
    console.log(`Transcript saved to ${transcriptFileName}`);
  } catch (fileError) {
    console.error(`Error writing to file ${transcriptFileName}:`, fileError);
  }
}

Expression measurement

Expression measurement predictions are stored in the USER_MESSAGE events under the emotion_features property. These predictions provide confidence levels for various emotions detected in the user’s speech.

For example, you might want to gauge the emotional tone of a conversation to better understand user sentiment. This information can guide customer support strategies or highlight trends in the expression measurement predictions over time.

The following example calculates the top 3 emotions from the USER_MESSAGE events by averaging their emotion scores across the Chat session:

import { ReturnChatEvent, EmotionScores } from "hume/api/resources/empathicVoice";

function getTopEmotions(chatEvents: ReturnChatEvent[]): Partial<EmotionScores> {
  // Extract user messages that have emotion features
  const userMessages = chatEvents.filter(
    (event) => event.type === "USER_MESSAGE" && event.emotionFeatures
  );

  const totalMessages = userMessages.length;

  // Guard against chats with no scored user messages
  if (totalMessages === 0) return {};

  // Infer emotion keys from the first user message
  const firstMessageEmotions = JSON.parse(userMessages[0].emotionFeatures!) as EmotionScores;
  const emotionKeys = Object.keys(firstMessageEmotions) as (keyof EmotionScores)[];

  // Initialize sums for all emotions to 0
  const emotionSums = Object.fromEntries(
    emotionKeys.map((key) => [key, 0])
  ) as Record<keyof EmotionScores, number>;

  // Accumulate emotion scores from each user message
  for (const event of userMessages) {
    const emotions = JSON.parse(event.emotionFeatures!) as EmotionScores;
    for (const key of emotionKeys) {
      emotionSums[key] += emotions[key];
    }
  }

  // Compute average scores for each emotion
  const averageEmotions = emotionKeys.map((key) => ({
    emotion: key,
    score: emotionSums[key] / totalMessages,
  }));

  // Sort by average score (descending) and pick the top 3
  averageEmotions.sort((a, b) => b.score - a.score);
  const top3 = averageEmotions.slice(0, 3);

  // Build a Partial<EmotionScores> with only the top 3 emotions
  const result: Partial<EmotionScores> = {};
  for (const { emotion, score } of top3) {
    result[emotion] = score;
  }

  return result;
}