MeFaMo – MediapipeFaceMocap

MeFaMo calculates the facial key points and blend shapes of a user. Instead of relying on the built-in iPhone blend shape calculation (as the Live Link Face app does), it uses Google's MediaPipe to calculate the facial key points of a face. Those key points are then used to calculate several facial blend shapes (eyebrows, blinking, smiling, etc.). You only need a PC with a webcam; no external device is required. MeFaMo uses my PyLiveLinkFace library to send the blend shapes directly to the currently opened Unreal LiveLink project (the Unreal Engine can also run on a separate PC).
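
In outline, the pipeline grabs webcam frames with OpenCV, extracts the face landmarks with MediaPipe, derives blend shape values from the landmark geometry, and streams them to Unreal over UDP with PyLiveLinkFace. The snippet below is a minimal sketch of that idea, not MeFaMo's actual code: it assumes PyLiveLinkFace exposes a PyLiveLinkFace class with set_blendshape() and encode() plus a FaceBlendShape enum, that Unreal's Live Link listens on UDP port 11111, and the jaw-open heuristic is made up purely for illustration.

import socket

import cv2
import mediapipe as mp
from pylivelinkface import PyLiveLinkFace, FaceBlendShape

UDP_ADDR = ("127.0.0.1", 11111)  # assumed Live Link UDP target

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
live_link = PyLiveLinkFace()
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input, OpenCV captures BGR
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Toy heuristic: vertical gap between the inner-lip landmarks
        # (13 = upper lip, 14 = lower lip), scaled and clamped to 0..1
        jaw_open = min(max((lm[14].y - lm[13].y) * 10.0, 0.0), 1.0)
        live_link.set_blendshape(FaceBlendShape.JawOpen, jaw_open)
        sock.sendto(live_link.encode(), UDP_ADDR)
cap.release()

MeFaMo itself derives many more blend shapes from the same landmark set; the point here is only the general mapping from landmark distances to 0..1 blend shape values.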


It’s not fully finished yet and is still missing a calibration feature to adapt the values to different faces, but it’s a good starting point for calculating the blend shapes and creating your own facial motion capture with Unreal.

If you find this project useful and want to support me, feel free to buy me a coffee.

"Buy Me A Coffee"

Prerequisites

To set up the LiveLink plugin and system in Unreal, see the following tutorial: https://docs.unrealengine.com/4.27/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/

Requirements

MeFaMo needs the following Python libraries:

  • numpy
  • opencv-python (imported as cv2)
  • pylivelinkface
  • mediapipe
  • transforms3d
  • open3d
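
The dependencies can also be installed manually with pip, assuming all of them are published on PyPI under these names:

pip install numpy opencv-python pylivelinkface mediapipe transforms3d open3d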

Install

To install it, clone the git repo and install it with the setup.py file:

python setup.py install
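
For example, assuming the repository lives at https://github.com/JimWest/MeFaMo, the full sequence would be:

git clone https://github.com/JimWest/MeFaMo.git
cd MeFaMo
python setup.py install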

Usage

To use MeFaMo, just execute the mefamo_cli.py file in the examples folder:

python mefamo_cli.py

There’s also an experimental GUI (which doesn’t look any different from the command-line version yet, but uses Kivy as a basis for future work).
