Unreal Engine Metahuman with Faceware Mark IV HMC, Xsens, Manus & GlassboxTech

If you have worked with Metahumans and facial motion, you know how challenging it is to capture dialogue. Here is a short clip of my Metahuman speaking in Greek. This is one of the first tests I did in Unreal Engine using the Faceware Mark IV HMC to record facial motion, with Xsens and MANUS™ capturing body and finger data simultaneously. I used the Glassbox Technologies Live Client plugin to stream the facial motion from Faceware Studio into Unreal. All of this is powered by a custom Puget Systems virtual production workstation equipped with an NVIDIA RTX A6000.

I am so grateful to be able to use all of this incredible technology to create in Unreal Engine. I will be releasing a series of tutorials very soon, walking you through my entire process, from capturing facial motion with Metahumans to creating cinematics with the virtual camera tool Dragonfly.

Metahumans are from Epic Games and 3Lateral. Thank you to Daniel Rodriguez Cadena for your incredible texture work on the female character body; you are a true gem! Thank you to Bernhard Rieder (Fatty Bull) for helping me create this gorgeous cinematic and lighting setup in Unreal. And thank you Pixel Urge for modifying the body and teaching me how to modify the Metahuman face textures in Substance. Original character by Michael Weisheim.

Special thanks, as always, to everyone at Xsens (Katie Jo Turk), MANUS™ (Arsène van de Bilt), Faceware (Catarina M Rodrigues, Karen Chan, & Pete Busch), Glassbox Technologies (Norman Wang, Johannes Wilke, & Mariana Acuña Acosta), Puget Systems, and NVIDIA Design and Visualization.

Link to my Discord:

#unrealengine #virtualproduction #motioncapture #nvidiartx #pugetsystems #epicgames #metahumans #mocap #xsens #manusmeta #faceware #glassboxtech #metaverse