MHC Talker

AI tools for digital humans that bring MetaHuman to life

Unreal Engine Real-Time Driver


Call the AnimaCore API in Unreal Engine to drive high-quality, real-time dialog for MetaHuman characters. Intelligent dialog characters can be configured in one click. Features include: text or voice dialog, real-time lip-sync driving, post-processing fine-tuning of expressions, speech-stress-linked natural head movements, dialog-logic-driven body movement, animation data recording, and more.
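A minimal sketch of what such a call might look like, assuming a hypothetical UAnimaCoreComponent added by the plugin's one-click configuration; the component and method names below are illustrative assumptions, not the shipped API (consult the plugin documentation for the real interface):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

// Forward a user prompt to a configured MetaHuman actor. UAnimaCoreComponent
// and SendTextInput are assumed names for the plugin's component and method;
// the plugin is assumed to handle the reply voice, lip-sync, and head/body
// motion from this single call.
void SendPromptToTalker(AActor* MetaHuman, const FString& UserText)
{
    if (UAnimaCoreComponent* Talker =
            MetaHuman->FindComponentByClass<UAnimaCoreComponent>())
    {
        // Text channel; the voice channel would pass captured audio through
        // a corresponding (also hypothetical) audio-input call instead.
        Talker->SendTextInput(UserText);
    }
}
```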

In addition, the DNA and blend shape (BS) modifications of a character asset are particularly important to its animation performance; make sure the character's DNA and blend shapes conform to the official MetaHuman standard specification.

One-Click MetaHuman Configuration

Individual artists or teams can bind MetaHuman character models, voice libraries, and AI service interfaces through a visual interface, quickly completing a character's "intelligent" configuration without complex programming.

Dual-channel text/voice input

Users can interact with MetaHuman characters through text input or real-time voice; characters automatically recognize the semantics and generate contextualized voice replies, enabling a truly two-way conversation.

Real-time lip-sync inference

Deep learning algorithms accurately parse the voice content and drive mouth shapes in real time to closely match pronunciation. Say goodbye to mechanical lip-sync animation and get a realistic, natural dialog performance. All languages are supported.

Speech stress linked to head micro-movements

Speech stress recognition automatically triggers the character's natural head micro-movements, such as bobbing, nodding, and tilting to the side, simulating the subconscious reactions of human conversation and dramatically improving the realism of the interaction.

Post-processing expression fine-tuning control

A refined expression control panel supports detailed adjustment of AI-generated expressions (e.g., mouth curvature, eye focus, frown intensity), enabling accurate portrayal from "basic emotions" to "subtle emotions".

Dialog-logic-driven body movement library

Characters automatically trigger matching body movements (e.g., hand gestures, shrugs, spread hands) based on the content of the conversation, and the logic rules can be customized. For example: the head leans forward when asking a question, the hands wave when emphasizing, and the shoulders shrug when puzzled. Actions are deeply bound to semantics, as in the sketch below.
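A minimal sketch of what such custom rules might look like, assuming a hypothetical FGestureRule type and animation tags; the plugin's actual rule schema is not documented here:

```cpp
#include "CoreMinimal.h"

// Hypothetical rule: map a semantic trigger recognized in the dialog to a
// body-animation tag. Both the type and the tag names are illustrative.
struct FGestureRule
{
    FString SemanticTrigger; // e.g. "question", "emphasis", "confusion"
    FName   AnimationTag;    // animation asset or montage to trigger
};

// Example custom rules mirroring the behaviors described above.
static const TArray<FGestureRule> GestureRules = {
    { TEXT("question"),  FName(TEXT("Head_LeanForward")) },
    { TEXT("emphasis"),  FName(TEXT("Hands_Wave")) },
    { TEXT("confusion"), FName(TEXT("Shoulders_Shrug")) },
};
```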

MHC Talker Demo

Real-time interactive dialog

3D digital human real-time dialog

Features include excellent language generalization ability.

F.A.Q.


Which character rigs are supported?

At present, the MetaHuman standard rig is the best choice for general-purpose binding, and it is the only rig supported at this stage; support for more rigs will be launched later.

Does MHC Talker support both text and voice input?

Yes. MHC Talker can receive both text and voice; simply send either to the plugin interface. See the relevant documentation for the specific steps.

Does MHC Talker support Unity?

Currently MHC Talker only supports Unreal Engine; a Unity version will be considered in the future.

Which Unreal Engine versions are supported?

Unreal Engine 5.3, 5.4, and 5.5 are supported, and support is continuously updated.

Which operating systems are supported?

Currently both Windows and Linux are supported.

Which platforms can packaged builds target?

Packaged builds for Windows, Linux, and Android are currently supported.
