Combined with words, expressions provide a wealth of information about our state of mind in any given context:
- Customer satisfaction or frustration
- Patient health and well-being
- Student comprehension and confusion
- And so much more.
Our Custom Model API unlocks these insights at the click of a button, integrating patterns of facial expression, vocal expression, and language into a single custom model that predicts whatever outcome you specify. This works by taking advantage not only of our state-of-the-art expression AI models, but also of specialized language-expression embeddings that we have trained on conversational data.
The algorithm that drives our Custom Model API is pretrained on huge volumes of data. That means it already recognizes most patterns of expression and language that people form. All you have to do is add your labels.
You can add labels to your data using the Models page or through our API. Once added, your labels will be used to train a Custom Model that you own and that only your account can access. You'll be able to run the model on any new file through our Playground or via the Custom Model Inference API, and you'll also get statistics on the accuracy of your Custom Model.
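As a rough illustration of what adding labels through an API could look like, here is a minimal Python sketch. The endpoint path, field names, and auth header below are hypothetical placeholders, not the documented API surface; it builds the labeling request without sending it, so you can see the shape of the payload.

```python
import json
import urllib.request

# Placeholder base URL and key -- illustrative only, not the real API.
API_BASE = "https://api.example.com/v0/custom-models"
API_KEY = "YOUR_API_KEY"

def build_label_request(file_id: str, label: str) -> urllib.request.Request:
    """Construct (but do not send) a request attaching a label to an uploaded file.

    The "labels" route and JSON field names are assumptions for this sketch.
    """
    payload = json.dumps({"file_id": file_id, "label": label}).encode("utf-8")
    return urllib.request.Request(
        url=f"{API_BASE}/labels",
        data=payload,
        headers={"Content-Type": "application/json", "X-API-Key": API_KEY},
        method="POST",
    )

# Example: label a file with the outcome you want the Custom Model to predict.
req = build_label_request("file_123", "satisfied")
print(req.get_method())       # POST
print(json.loads(req.data))   # {'file_id': 'file_123', 'label': 'satisfied'}
```

In a real integration you would send the request (e.g. with `urllib.request.urlopen`) once per labeled file, then let training run before querying the inference endpoint with new files.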