.NET quickstart

Submit batch jobs and stream predictions in real time using Hume's .NET SDK.

This guide walks you through using Hume’s Expression Measurement API with the .NET SDK. You will submit a batch job to analyze a media file and then connect to the streaming API for real-time predictions.

Setup

Install the SDK

$ dotnet add package Hume

Set your API key

Get your API key from the Hume AI platform and set it as an environment variable.

$ export HUME_API_KEY=your_api_key_here

Create the client

using Hume;

var client = new HumeClient(Environment.GetEnvironmentVariable("HUME_API_KEY"));

Batch API

The Batch API lets you submit files for processing and retrieve predictions when the job completes. This is best for analyzing recordings, datasets, and other pre-recorded content.

Submit a job

Start a job by specifying the models you want to run and the URLs of the files to process.

using Hume.ExpressionMeasurement.Batch;

var jobId = await client.ExpressionMeasurement.Batch.StartInferenceJobAsync(
    new InferenceBaseRequest
    {
        Urls = new List<string> { "https://hume-tutorials.s3.amazonaws.com/faces.zip" },
        Models = new Models
        {
            Prosody = new Prosody { Granularity = Granularity.Utterance },
        },
    }
);

Console.WriteLine($"Job ID: {jobId.JobId_}");

You can also upload local files instead of URLs. See the API reference for the local file upload endpoint.
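As a rough sketch only — the `StartInferenceJobFromLocalFileAsync` method name and `FileParameter` type below are assumptions, not confirmed SDK names, so check the API reference for the actual signature — a local-file submission might look like:

```csharp
// Hypothetical sketch: submit local files instead of URLs.
// Method name and parameter types are assumptions; see the API
// reference for the actual local-file upload endpoint and signature.
var localJobId = await client.ExpressionMeasurement.Batch.StartInferenceJobFromLocalFileAsync(
    new List<FileParameter> { /* your local media files */ },
    new InferenceBaseRequest
    {
        Models = new Models
        {
            Prosody = new Prosody { Granularity = Granularity.Utterance },
        },
    }
);
```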

Wait for the job to complete

Poll the job status until it reaches COMPLETED or FAILED.

while (true)
{
    var jobDetails = await client.ExpressionMeasurement.Batch.GetJobDetailsAsync(jobId.JobId_);

    if (jobDetails.State.IsCompleted)
    {
        Console.WriteLine("Job completed.");
        break;
    }
    else if (jobDetails.State.IsFailed)
    {
        Console.WriteLine("Job failed.");
        break;
    }

    Console.WriteLine($"Status: {jobDetails.State.Status}");
    await Task.Delay(3000);
}

For production use, consider passing a CallbackUrl when submitting the job. Hume will send a POST request to your URL when the job completes, eliminating the need to poll. The webhook payload includes the job_id, status, and predictions.
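For illustration, a completion payload delivered to your callback URL could look roughly like the following; the field values are placeholders and the `predictions` array carries the same results you would get from the predictions endpoint (see the API reference for the exact schema):

```json
{
  "job_id": "your-job-id",
  "status": "COMPLETED",
  "predictions": []
}
```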

Retrieve predictions

Once the job completes, retrieve and print the predictions.

var predictions = await client.ExpressionMeasurement.Batch.GetJobPredictionsAsync(jobId.JobId_);

foreach (var result in predictions)
{
    foreach (var filePrediction in result.Results.Predictions)
    {
        foreach (var group in filePrediction.Models.Prosody.GroupedPredictions)
        {
            foreach (var prediction in group.Predictions)
            {
                Console.WriteLine($"\nText: {prediction.Text}");

                var topEmotions = prediction.Emotions
                    .OrderByDescending(e => e.Score)
                    .Take(3);

                foreach (var emotion in topEmotions)
                {
                    Console.WriteLine($"  {emotion.Name}: {emotion.Score:F3}");
                }
            }
        }
    }
}

Predictions are also available as CSV files. Use the Get job artifacts endpoint to download a zip archive containing one CSV per model.
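A minimal sketch of downloading that archive, assuming the SDK exposes a `GetJobArtifactsAsync` method that returns the zip as a stream — both the method name and return type are assumptions here, so consult the Get job artifacts reference for the real signature:

```csharp
// Sketch only: GetJobArtifactsAsync and its return type are assumptions,
// not confirmed SDK names.
await using var artifacts =
    await client.ExpressionMeasurement.Batch.GetJobArtifactsAsync(jobId.JobId_);
await using var output = File.Create("artifacts.zip");
await artifacts.CopyToAsync(output);
// The archive contains one CSV per model that ran (e.g. prosody.csv).
```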

Streaming API

The Streaming API provides real-time predictions over a WebSocket connection. This is best for live audio, video, and interactive applications.

Connect and send data

Create a streaming connection and send text for analysis.

using Hume.ExpressionMeasurement.Stream;

var streamApi = client.ExpressionMeasurement.Stream.CreateStreamApi();
await streamApi.ConnectAsync();

streamApi.StreamModelPredictions.Subscribe(predictions =>
{
    if (predictions.Language?.Predictions != null)
    {
        foreach (var prediction in predictions.Language.Predictions)
        {
            Console.WriteLine($"Text: {prediction.Text}");

            var topEmotions = prediction.Emotions
                .OrderByDescending(e => e.Score)
                .Take(3);

            foreach (var emotion in topEmotions)
            {
                Console.WriteLine($"  {emotion.Name}: {emotion.Score:F3}");
            }
        }
    }
});

await streamApi.Send(new StreamModelsEndpointPayload
{
    Data = "I am so excited to try this out!",
    RawText = true,
    Models = new Config
    {
        Language = new StreamLanguage { Granularity = "sentence" },
    },
});

// Allow time for the response
await Task.Delay(2000);

await streamApi.CloseAsync();

Stream a file

You can also send audio or video files through the streaming connection by encoding them as base64.

var streamApi = client.ExpressionMeasurement.Stream.CreateStreamApi();
await streamApi.ConnectAsync();

streamApi.StreamModelPredictions.Subscribe(predictions =>
{
    if (predictions.Prosody?.Predictions != null)
    {
        foreach (var prediction in predictions.Prosody.Predictions)
        {
            var topEmotions = prediction.Emotions
                .OrderByDescending(e => e.Score)
                .Take(3);

            foreach (var emotion in topEmotions)
            {
                Console.WriteLine($"  {emotion.Name}: {emotion.Score:F3}");
            }
        }
    }
});

var audioBytes = await File.ReadAllBytesAsync("sample.mp3");
var base64Audio = Convert.ToBase64String(audioBytes);

await streamApi.Send(new StreamModelsEndpointPayload
{
    Data = base64Audio,
    Models = new Config
    {
        Prosody = new Dictionary<string, object?>(),
    },
});

// Allow time for the response
await Task.Delay(5000);

await streamApi.CloseAsync();