dreamdeadline (4 hours ago):
Cool! Do you plan to expose controls over the avatar's movement, facial expressions, or emotional reactions so users can fine-tune interactions?
lcolucci (3 hours ago):
Yes we do! Within the web app, there's an "action text prompt" section that lets you control the character's overall actions (e.g. "a fox talking with lots of arm motions"). We'll soon expose this in the API so you can control the character's movements dynamically (e.g. "now wave your hand").
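Something like this, once it's exposed (a rough sketch only: the endpoint URL, auth header, and "action_prompt" field name are made up for illustration; the prompt text is the real part):

    # Hypothetical sketch, not the actual API: the URL, header, and
    # "action_prompt" field are assumptions for illustration.
    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder credential

    resp = requests.post(
        "https://api.example.com/v1/characters/fox-123/actions",  # assumed endpoint
        headers={"Authorization": f"Bearer {API_KEY}"},
        # Free-text action prompt steering the character mid-conversation,
        # mirroring the web app's "action text prompt" field
        json={"action_prompt": "now wave your hand"},
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json())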
sid-the-kid (3 hours ago):
Our text control is good, especially for emotions. For example, you can add the text prompt "a person talking. they are angry", and the agent will have an angry expression. You can also control background motions (like ocean waves, a waterfall, or a car driving). We are actively training a model with better text control over hand motions.
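To make the prompt pattern concrete, here's a toy helper that composes prompt strings in the style described above (the helper itself is made up; the model just reads the final string):

    def action_prompt(base, emotion=None, background=None):
        """Compose a free-text action prompt; purely illustrative."""
        parts = [base]
        if emotion:
            parts.append(f"they are {emotion}")
        if background:
            parts.append(f"{background} in the background")
        return ". ".join(parts)

    print(action_prompt("a person talking", emotion="angry", background="ocean waves"))
    # -> "a person talking. they are angry. ocean waves in the background"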