Tutorials
July 12, 2024

Setup ‘LivePortrait’ with ComfyUI Workflow

Step-by-step guide to set up ‘LivePortrait’ with ComfyUI workflow on your local machine.

by Jim Clyde Monge

In the past couple of days, videos of animated portraits generated from single still images have been going viral on social media platforms like X, Reddit, and LinkedIn. These lifelike portrait animations are the work of a new open-source AI tool called LivePortrait.

LivePortrait builds upon the face vid2vid framework, which animates still portraits using motion features from driving video sequences.

Image from LivePortrait Whitepaper

If you want to learn about how it works, check out the whitepaper here.

In this article, I will walk you through the step-by-step process of setting up LivePortrait on your local PC.

The process is pretty simple; there are only three steps:

  1. Setup LivePortrait
  2. Run the Gradio web application
  3. Generate animated portraits

Before diving into the step-by-step guide, make sure that your environment is set up or meets the minimum hardware requirements of LivePortrait.

Pre-requisites:

  • A capable GPU (I am using an NVIDIA RTX 3060 Ti with 8GB VRAM)
  • The latest Python 3.x, downloaded and installed
  • The latest Git, downloaded and installed
  • Windows 11 Pro (the OS I tested on)
  • 16 GB RAM and at least 10 GB of free local disk space
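Before moving on, you can quickly confirm these tools are visible from a terminal. The version numbers in the comments are only examples; any recent Python 3.x and Git should be fine:

```shell
# Quick sanity check that the prerequisite tools are on PATH.
python --version 2>/dev/null || python3 --version   # e.g. Python 3.11.x
git --version                                       # e.g. git version 2.45.x
# Reports your GPU model and VRAM; skipped gracefully if no NVIDIA driver is present.
command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi || echo "nvidia-smi not found"
```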

Set up LivePortrait

  1. Clone the Repository: Open your terminal or command prompt and run the following command to clone the LivePortrait GitHub repository:

git clone https://github.com/KwaiVGI/LivePortrait.git

This will create a new folder named ‘LivePortrait’ on your local disk. Alternatively, you can download the repository as a ZIP from GitHub and extract it. Either way, run the remaining commands from a terminal opened inside the ‘LivePortrait’ folder.

How To Setup The Viral ‘LivePortrait’ — A FREE AI Portrait Animation Tool
Image by Jim Clyde Monge

2. Create a Conda Environment: We’ll use Conda to create a clean environment for LivePortrait. If you don’t have Conda installed on your system, check out this guide to install it.

Once you have Conda, run:

conda create -n LivePortrait python==3.9.18

Image by Jim Clyde Monge

This creates a new environment called ‘LivePortrait’ with Python 3.9.18.

3. Activate the Environment: Switch to your newly created environment by running:

conda activate LivePortrait

This command produces no output; your terminal prompt gaining a (LivePortrait) prefix confirms the environment is active.

4. Install Dependencies: From inside the ‘LivePortrait’ folder, install all the packages LivePortrait needs with the following command:

pip install -r requirements.txt

Be warned that this step downloads large files, so make sure you have enough local disk space.

Image by Jim Clyde Monge

Download and extract the LivePortrait weights from here, placing them in a ‘pretrained_weights’ folder at the repository root. The final file structure should look like this:

pretrained_weights
├── insightface
│   └── models
│       └── buffalo_l
│           ├── 2d106det.onnx
│           └── det_10g.onnx
└── liveportrait
    ├── base_models
    │   ├── appearance_feature_extractor.pth
    │   ├── motion_extractor.pth
    │   ├── spade_generator.pth
    │   └── warping_module.pth
    ├── landmark.onnx
    └── retargeting_models
        └── stitching_retargeting_module.pth
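To confirm the extraction matches the tree above, a quick shell check run from the repository root can list anything that is missing (the file names are taken directly from the tree above):

```shell
# Check each expected weight file under pretrained_weights/ and count misses.
missing=0
for f in \
  insightface/models/buffalo_l/2d106det.onnx \
  insightface/models/buffalo_l/det_10g.onnx \
  liveportrait/base_models/appearance_feature_extractor.pth \
  liveportrait/base_models/motion_extractor.pth \
  liveportrait/base_models/spade_generator.pth \
  liveportrait/base_models/warping_module.pth \
  liveportrait/landmark.onnx \
  liveportrait/retargeting_models/stitching_retargeting_module.pth
do
  if [ -f "pretrained_weights/$f" ]; then
    echo "OK   $f"
  else
    echo "MISS $f"
    missing=$((missing + 1))
  fi
done
echo "$missing missing file(s)"
```

If any line prints MISS, re-extract the weights before launching the app.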

Run Gradio UI

With everything set up, it’s time to launch the LivePortrait interface. From inside the ‘LivePortrait’ folder, run:

python app.py

Image by Jim Clyde Monge

You can either run it on a local URL or a public URL.

Image by Jim Clyde Monge

The local URL is for using LivePortrait on your own machine, while the public URL allows you to share the interface with others. Keep in mind that the public URL expires after 72 hours. For free permanent hosting and GPU upgrades, you can run gradio deploy from the terminal to deploy to HuggingFace Spaces.
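Putting the launch steps together, a fresh-terminal session can be sketched like this. It is guarded so a missing prerequisite prints a message instead of failing partway through:

```shell
# Full launch sequence from a fresh terminal.
status="ready"
command -v conda >/dev/null 2>&1 || status="conda not on PATH"
[ -f LivePortrait/app.py ] || status="LivePortrait/app.py not found"
if [ "$status" = "ready" ]; then
  eval "$(conda shell.bash hook)"   # makes 'conda activate' work inside scripts
  conda activate LivePortrait
  cd LivePortrait && python app.py  # Gradio will print the local (and any public) URL
else
  echo "Not launching: $status"
fi
```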

Generating Animated Portraits

Now that you have LivePortrait up and running, let’s create your first animation.

  1. Select the source portrait image you’d like to animate. For best results, choose a clear, front-facing portrait photo.
  2. Upload a driving video that will serve as the reference for the animation. This video should feature the facial expressions and movements you want to apply to your portrait.
  3. You can leave all the settings at their default values for your first attempt.
  4. Click on the “Animate” button to start the video processing.
Image by Jim Clyde Monge

On my NVIDIA RTX 3060 Ti with 8GB VRAM, the process took approximately two minutes. Your processing time may vary depending on your hardware specifications. Once complete, you’ll see the final result displayed in the Gradio web app.

Image by Jim Clyde Monge

Here’s a clearer view of the final video.

Video by Jim Clyde Monge

LivePortrait also works on 3D cartoon faces with a Disney-ish aesthetic. Here is an example:

Video by Jim Clyde Monge

You can play around with various types of images and animated reference videos. Trust me, it’s so fun to use and the results are often hilarious.

Video by Jim Clyde Monge

Below are the results of inferring one frame on an RTX 4090 GPU using the native PyTorch framework with torch.compile:

Image by Jim Clyde Monge

The possibilities are virtually endless, and the results can range from impressively realistic to hilariously entertaining.

If you are more comfortable working with ComfyUI, check out the workflow below.

GitHub - kijai/ComfyUI-LivePortraitKJ: ComfyUI nodes for LivePortrait
https://github.com/kijai/ComfyUI-LivePortraitKJ

While I won’t go through the step-by-step process of setting it up and running in this guide, it’s definitely worth exploring if you’re familiar with ComfyUI. If there’s enough interest, I’d be happy to create a separate guide focusing on the ComfyUI integration.

Let me know in the comments if that’s something you’d like to see!

Final Thoughts

Overall, this open-source AI tool that brings photos to life is a ton of fun to play with. It’s not perfect though — it doesn’t work on non-human subjects and struggles with portraits at weird angles.

As an open-source project, LivePortrait is likely to improve over time, thanks to contributions from the global developer community. It would be exciting to see future versions support more flexible head or lip movements, or even the ability to dub text or audio onto the animated portraits.

I’m currently exploring various ComfyUI workflows and considering building a more user-friendly app around LivePortrait. The goal would be to make this technology accessible to users without requiring them to go through the complex setup process I’ve outlined here.
