EVI Next.js Quickstart
A quickstart guide for implementing the Empathic Voice Interface (EVI) with Next.js.
With Hume’s React SDK, WebSocket connection management is handled for you and the complexities of audio capture, playback, and streaming are abstracted away. You can integrate EVI into your React app with just a few hooks and components, without writing any low-level WebSocket or audio code.
In this guide, you’ll learn how to integrate EVI into your Next.js applications using Hume’s React SDK, with step-by-step instructions for both the App Router and the Pages Router.
- See the complete implementation of this guide on GitHub.
- Explore or contribute to Hume’s React SDK on GitHub.
This guide is broken up into five sections:
- Installation: Install Hume SDK packages.
- Authentication: Generate and use an access token to authenticate with EVI.
- Context provider: Set up the <VoiceProvider/>.
- Connection: Open a WebSocket connection and start a chat with EVI.
- Display chat: Display chat messages in the UI.
Before you begin, you’ll need an existing Next.js project.
Installation
Install Hume’s React SDK and TypeScript SDK packages.
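For example, assuming the packages are published on npm as @humeai/voice-react (React SDK) and hume (TypeScript SDK), install them with your package manager of choice:

```shell
# pnpm
pnpm add @humeai/voice-react hume

# npm
npm install @humeai/voice-react hume

# yarn
yarn add @humeai/voice-react hume

# bun
bun add @humeai/voice-react hume
```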
Authentication
Generate an access token for authentication. Doing so requires your API key and secret key, which you can obtain by logging into the portal and visiting the API keys page.
Load your API key and secret from environment variables. Avoid hardcoding them in your code to prevent credential leaks and unauthorized access.
In your root component, use the TypeScript SDK’s fetchAccessToken method to fetch your access token. In both routers the token is fetched on the server, so your secret key is never exposed to the browser: in an async server component with the App Router, or in getServerSideProps with the Pages Router.
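App Router

A minimal sketch for the App Router, fetching the token in an async server component (the HUME_API_KEY and HUME_SECRET_KEY environment variable names and the Chat component path are assumptions):

```tsx
// app/page.tsx (App Router)
import { fetchAccessToken } from "hume";
import { Chat } from "@/components/Chat";

export default async function Page() {
  // Runs on the server, so the secret key never reaches the browser.
  const accessToken = await fetchAccessToken({
    apiKey: String(process.env.HUME_API_KEY),
    secretKey: String(process.env.HUME_SECRET_KEY),
  });

  return <Chat accessToken={accessToken} />;
}
```

Pages Router

The equivalent sketch for the Pages Router fetches the token in getServerSideProps and passes it to the page as a prop:

```tsx
// pages/index.tsx (Pages Router)
import { fetchAccessToken } from "hume";
import type { InferGetServerSidePropsType } from "next";
import { Chat } from "@/components/Chat";

export const getServerSideProps = async () => {
  const accessToken = await fetchAccessToken({
    apiKey: String(process.env.HUME_API_KEY),
    secretKey: String(process.env.HUME_SECRET_KEY),
  });
  return { props: { accessToken } };
};

export default function Page({
  accessToken,
}: InferGetServerSidePropsType<typeof getServerSideProps>) {
  return <Chat accessToken={accessToken} />;
}
```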
Context provider
After fetching our access token, we can pass it to our Chat component. First we set up the <VoiceProvider/> so that our Messages and StartCall components can access the voice context. We also pass the access token to the accessToken prop of the StartCall component, which uses it to set up the WebSocket connection.
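A minimal sketch of the Chat component, assuming an SDK version where authentication is supplied to connect (in StartCall) rather than to the provider; the file layout is an assumption, and the "use client" directive is required under the App Router (it is harmless under the Pages Router):

```tsx
// components/Chat.tsx
"use client"; // required for the App Router; harmless under the Pages Router

import { VoiceProvider } from "@humeai/voice-react";
import { Messages } from "./Messages";
import { StartCall } from "./StartCall";

export function Chat({ accessToken }: { accessToken: string }) {
  return (
    // VoiceProvider makes the voice context (connection state, messages)
    // available to the Messages and StartCall components below.
    <VoiceProvider>
      <Messages />
      <StartCall accessToken={accessToken} />
    </VoiceProvider>
  );
}
```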
Connection
Use the useVoice hook’s connect method to start a chat session. It is important to call connect from a user interaction event (such as a click) so that the browser allows audio to be recorded and played back.
Implementing this step is the same whether you are using the App Router or Pages Router.
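A sketch of the StartCall component, assuming an SDK version in which connect accepts an auth object carrying the access token:

```tsx
// components/StartCall.tsx
"use client";

import { useVoice } from "@humeai/voice-react";

export function StartCall({ accessToken }: { accessToken: string }) {
  const { connect } = useVoice();

  return (
    <button
      onClick={() => {
        // Calling connect inside a click handler satisfies browser
        // policies for microphone access and audio playback.
        void connect({
          auth: { type: "accessToken", value: accessToken },
        });
      }}
    >
      Start Call
    </button>
  );
}
```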
Display chat
Use the useVoice hook to access the messages array. We can then map over the messages array to display the role (Assistant or User) and content of each message.
Implementing this step is the same whether you are using the App Router or Pages Router.
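A sketch of the Messages component; the user_message and assistant_message type guards below are an assumption about how the SDK tags chat events:

```tsx
// components/Messages.tsx
"use client";

import { useVoice } from "@humeai/voice-react";

export function Messages() {
  const { messages } = useVoice();

  return (
    <div>
      {messages.map((msg, index) => {
        // Only user and assistant messages carry a role and content;
        // other socket events (audio output, interruptions) are skipped.
        if (msg.type === "user_message" || msg.type === "assistant_message") {
          return (
            <div key={msg.type + index}>
              <div>{msg.message.role}</div>
              <div>{msg.message.content}</div>
            </div>
          );
        }
        return null;
      })}
    </div>
  );
}
```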
Next steps
Congratulations! You’ve successfully integrated EVI using Hume’s React SDK.
Next, consider exploring these areas to enhance your EVI application:
- See detailed instructions on how you can customize EVI for your application needs.
- Learn how you can access and manage conversation transcripts and expression measures.
For further details and practical examples, explore the API Reference and our Hume API Examples on GitHub.