EVI .NET Quickstart

A quickstart guide for integrating the Empathic Voice Interface (EVI) with .NET.

In this guide, you’ll learn how to use Hume’s .NET SDK to integrate with EVI.

Make sure that connecting to EVI from your .NET code is the right choice.

If your .NET app is a client app — a desktop application or CLI that runs on the user’s machine and captures audio directly from their microphone — then connecting to EVI from .NET is appropriate.

If your .NET app is a server app that will not run on the machine the user's microphone is connected to, it is usually better to connect to EVI directly from the client rather than from your .NET code, to keep latency low. If you need to control an EVI chat with logic that must live on your backend, have your .NET backend use the Send Message endpoint or a Control Plane WebSocket connection to control an EVI chat that was already opened from the client.

The example code in this guide sends EVI hardcoded audio from a file as a placeholder. Replace this with logic that sends audio captured from your user's microphone.

  1. Environment setup: Download package and system dependencies to run EVI.
  2. Import statements: Import needed symbols from the Hume SDK.
  3. Authentication: Use your API credentials to authenticate your EVI application.
  4. Connection: Set up a secure WebSocket connection to interact with EVI.
  5. Handling incoming messages: Subscribe to events and process messages from EVI.
  6. Audio input: Capture audio data from an input device and send to EVI.

Environment setup

Create a new .NET project and install the required packages:

$dotnet new console -n EviDotnetQuickstart
$cd EviDotnetQuickstart
$dotnet add package Hume
$dotnet add package DotNetEnv

Download sample audio

Download the sample PCM audio file to use with this guide:

$curl -O https://raw.githubusercontent.com/HumeAI/hume-api-examples/main/evi/evi-dotnet-quickstart/sample_input.pcm
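Optionally, if you have FFmpeg installed, you can preview the sample locally before streaming it. Because the file is headerless raw PCM, the format must be spelled out on the command line; the flags below assume 48kHz, 16-bit little-endian, mono, matching the session settings used later in this guide:

```shell
# Play the headerless PCM sample and exit when it finishes
$ffplay -autoexit -f s16le -ar 48000 -ac 1 sample_input.pcm
```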

Import statements

First, import the needed namespaces from the .NET base class library and the Hume SDK.

Program.cs
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using DotNetEnv;
using Hume;
using Hume.EmpathicVoice;

Authentication

Log into your Hume AI Account and obtain an API key. Create a .env file in your project directory and store your API key:

.env
HUME_API_KEY=your_api_key_here

Load the environment variables and use the API key to instantiate the HumeClient class. This is the main entry point provided by the Hume .NET SDK.

Program.cs
Env.Load();

var apiKey = Environment.GetEnvironmentVariable("HUME_API_KEY")
    ?? throw new InvalidOperationException("HUME_API_KEY environment variable is required.");
var client = new HumeClient(apiKey);

Connection

To connect to an EVI chat, create a ChatApi instance using the client.EmpathicVoice.CreateChatApi method. You can specify session settings in the ChatApi.Options object.

Program.cs
// Create a signal to wait for Chat Metadata
var chatMetadataReceived = new TaskCompletionSource<bool>();

// Create the ChatApi instance
var chatApi = client.EmpathicVoice.CreateChatApi(new ChatApi.Options
{
    ApiKey = apiKey,
    SessionSettings = new ConnectSessionSettings(),
});

Connect to EVI and wait for the chat metadata to confirm the connection is established:

Program.cs
// Connect to EVI
Console.WriteLine("Connecting to EVI...");
await chatApi.ConnectAsync();
Console.WriteLine("Connected!");

// Wait for Chat Metadata
Console.WriteLine("Waiting for Chat Metadata...");
await chatMetadataReceived.Task;
Console.WriteLine("Chat Metadata received.");

Handling incoming messages

EVI communicates through events. Subscribe to the events you want to handle before connecting. The main event types are:

  • AssistantMessage: Text messages from EVI
  • UserMessage: Transcriptions of user speech
  • AudioOutput: Audio data for playback
  • ChatMetadata: Information about the chat session

Program.cs
// Subscribe to events
chatApi.AssistantMessage.Subscribe(message =>
{
    Console.WriteLine($"Assistant: {message.Message?.Content}");
});

chatApi.UserMessage.Subscribe(message =>
{
    Console.WriteLine($"User: {message.Message?.Content}");
});

chatApi.AudioOutput.Subscribe(audio =>
{
    // Data is a base64 string, so Length counts base64 characters, not bytes
    Console.WriteLine($"Received audio chunk: {audio.Data?.Length ?? 0} base64 chars");
});

chatApi.ChatMetadata.Subscribe(metadata =>
{
    Console.WriteLine($"Chat Metadata - Chat ID: {metadata.ChatId}");
    chatMetadataReceived.TrySetResult(true);
});
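The subscriber above only logs the size of each audio chunk. To actually hear EVI, decode each chunk and queue it into an output device. The sketch below uses the third-party NAudio package (a Windows-focused audio library that is not among this guide's dependencies; install it with `dotnet add package NAudio`), and assumes the decoded bytes are raw 48kHz, 16-bit, mono PCM matching the session settings used in this guide; if your chunks arrive WAV-encoded, parse the header before buffering:

```csharp
using NAudio.Wave; // assumption: NAudio added via `dotnet add package NAudio`

// A buffer the output device drains as samples arrive
var playbackBuffer = new BufferedWaveProvider(new WaveFormat(48000, 16, 1))
{
    DiscardOnBufferOverflow = true // drop audio rather than throw if we fall behind
};
var output = new WaveOutEvent();
output.Init(playbackBuffer);
output.Play();

chatApi.AudioOutput.Subscribe(audio =>
{
    if (audio.Data is null) return;
    // Decode the base64 payload and queue it for playback
    var bytes = Convert.FromBase64String(audio.Data);
    playbackBuffer.AddSamples(bytes, 0, bytes.Length);
});
```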

Audio input

Before sending audio, configure the audio format by sending session settings. EVI expects audio in a specific format (e.g., 48kHz, 16-bit, mono PCM).

Program.cs
// Configure audio format (48kHz, 16-bit, mono PCM)
const int sampleRate = 48000;
const int channels = 1;

var sessionSettings = new SessionSettings
{
    Audio = new AudioConfiguration
    {
        Encoding = "linear16",
        SampleRate = sampleRate,
        Channels = channels
    }
};

await chatApi.Send(sessionSettings);

Sending audio data

Audio data should be sent as base64-encoded chunks. Here’s a helper function that reads a PCM file and streams it to EVI in real-time chunks:

Program.cs
static async Task TransmitTestAudio(ChatApi chatApi, string filePath, int sampleRate, int channels)
{
    const int chunkDurationMs = 10;
    const int bytesPerSample = 2; // 16-bit audio
    int bytesPerChunk = sampleRate * bytesPerSample * channels * chunkDurationMs / 1000;

    // Read PCM file
    var audioData = File.ReadAllBytes(filePath);

    // Split into chunks and send with appropriate timing
    for (int offset = 0; offset < audioData.Length; offset += bytesPerChunk)
    {
        var chunkSize = Math.Min(bytesPerChunk, audioData.Length - offset);
        var chunk = audioData.Skip(offset).Take(chunkSize).ToArray();

        // Pad final chunk if needed
        if (chunk.Length < bytesPerChunk)
        {
            chunk = chunk.Concat(new byte[bytesPerChunk - chunk.Length]).ToArray();
        }

        // Send as base64-encoded audio input
        var data = Convert.ToBase64String(chunk);
        await chatApi.Send(new AudioInput { Data = data });

        // Delay to simulate real-time streaming
        await Task.Delay(chunkDurationMs);
    }
}
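In a real application, you would replace the file-based helper above with audio captured from the user's microphone. A minimal sketch using the third-party NAudio package (Windows; not among this guide's dependencies, install with `dotnet add package NAudio`), which records in the same 48kHz, 16-bit, mono format and forwards each buffer through the same `chatApi.Send(new AudioInput ...)` call used above:

```csharp
using NAudio.Wave; // assumption: NAudio added via `dotnet add package NAudio`

// Record 48kHz, 16-bit, mono PCM from the default input device
var waveIn = new WaveInEvent
{
    WaveFormat = new WaveFormat(48000, 16, 1),
    BufferMilliseconds = 10 // small buffers keep latency low
};

waveIn.DataAvailable += async (sender, e) =>
{
    // e.Buffer may be larger than what was recorded; send only BytesRecorded
    var chunk = new byte[e.BytesRecorded];
    Array.Copy(e.Buffer, chunk, e.BytesRecorded);
    await chatApi.Send(new AudioInput { Data = Convert.ToBase64String(chunk) });
};

waveIn.StartRecording();
// ...when the chat ends:
// waveIn.StopRecording();
// waveIn.Dispose();
```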

Put it all together

Here’s the complete example that connects to EVI and transmits audio:

Program.cs
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using DotNetEnv;
using Hume;
using Hume.EmpathicVoice;

Env.Load();

var apiKey = Environment.GetEnvironmentVariable("HUME_API_KEY")
    ?? throw new InvalidOperationException("HUME_API_KEY environment variable is required.");
var client = new HumeClient(apiKey);

// Create a signal to wait for Chat Metadata
var chatMetadataReceived = new TaskCompletionSource<bool>();

// Create the ChatApi instance
var chatApi = client.EmpathicVoice.CreateChatApi(new ChatApi.Options
{
    ApiKey = apiKey,
    SessionSettings = new ConnectSessionSettings(),
});

// Subscribe to events
chatApi.AssistantMessage.Subscribe(message =>
{
    Console.WriteLine($"Assistant: {message.Message?.Content}");
});

chatApi.UserMessage.Subscribe(message =>
{
    Console.WriteLine($"User: {message.Message?.Content}");
});

chatApi.AudioOutput.Subscribe(audio =>
{
    Console.WriteLine($"Received audio chunk: {audio.Data?.Length ?? 0} base64 chars");
});

chatApi.ChatMetadata.Subscribe(metadata =>
{
    Console.WriteLine($"Chat Metadata - Chat ID: {metadata.ChatId}");
    chatMetadataReceived.TrySetResult(true);
});

// Connect to EVI
Console.WriteLine("Connecting to EVI...");
await chatApi.ConnectAsync();
Console.WriteLine("Connected!");

// Wait for Chat Metadata
await chatMetadataReceived.Task;

// Configure audio format (48kHz, 16-bit, mono PCM)
const int sampleRate = 48000;
const int channels = 1;

var sessionSettings = new SessionSettings
{
    Audio = new AudioConfiguration
    {
        Encoding = "linear16",
        SampleRate = sampleRate,
        Channels = channels
    }
};

await chatApi.Send(sessionSettings);

// Send audio (replace with your audio source)
await TransmitTestAudio(chatApi, "sample_input.pcm", sampleRate, channels);

// Wait for responses
await Task.Delay(5000);

await chatApi.DisposeAsync();

static async Task TransmitTestAudio(ChatApi chatApi, string filePath, int sampleRate, int channels)
{
    const int chunkDurationMs = 10;
    const int bytesPerSample = 2; // 16-bit audio
    int bytesPerChunk = sampleRate * bytesPerSample * channels * chunkDurationMs / 1000;

    var audioData = File.ReadAllBytes(filePath);

    for (int offset = 0; offset < audioData.Length; offset += bytesPerChunk)
    {
        var chunkSize = Math.Min(bytesPerChunk, audioData.Length - offset);
        var chunk = audioData.Skip(offset).Take(chunkSize).ToArray();

        // Pad the final chunk to a full chunk size if needed
        if (chunk.Length < bytesPerChunk)
        {
            chunk = chunk.Concat(new byte[bytesPerChunk - chunk.Length]).ToArray();
        }

        await chatApi.Send(new AudioInput { Data = Convert.ToBase64String(chunk) });

        // Delay to simulate real-time streaming
        await Task.Delay(chunkDurationMs);
    }
}

Running the example

$dotnet run

View the complete example code on GitHub.

Next steps

To enhance your EVI application further, explore the API Reference and our Hume API Examples on GitHub for more details and practical examples.