
Hand Posing


Project File: hand_posing.blend

Note: This project requires the following models:

  • runwayml/stable-diffusion-v1-5
  • lllyasviel/sd-controlnet-openpose
  • lllyasviel/sd-controlnet-depth

Using OpenPose

ControlNet is a technique that lets us guide diffusion models like Stable Diffusion with an extra conditioning image. The model lllyasviel/sd-controlnet-openpose gives us the ability to generate images of characters in a particular pose.
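Under the hood, this is the same conditioning mechanism exposed by the Hugging Face diffusers library. As a rough illustration (a sketch, not how Dream Textures is implemented internally), loading the OpenPose ControlNet alongside Stable Diffusion looks like this:

```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Load the OpenPose ControlNet and attach it to Stable Diffusion v1.5.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")
# A pose map image is then passed to the pipeline as the `image` argument.
```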

Dream Textures can generate a pose map from an armature. Use an armature generated by the Rigify add-on, and the OpenPose bones will be detected automatically from their names; a script for adding one is sketched below.
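For example, a compatible metarig can be added from Python. A minimal sketch, assuming the bundled Rigify add-on:

```python
import bpy

# Enable the Rigify add-on that ships with Blender, then add a human metarig.
# Dream Textures detects the OpenPose keypoints from the standard Rigify
# bone names on this armature.
bpy.ops.preferences.addon_enable(module="rigify")
bpy.ops.object.armature_human_metarig_add()
```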

We'll also add some hand meshes to help with the hand posing later.

Add a ControlNet node and set the control type to OpenPose. Then choose the collection that contains your armature.

Adding Depth

Using the model lllyasviel/sd-controlnet-depth, we can control the image with the hand meshes as well. Add another ControlNet node with the control type set to Depth.

Controls can be specified with a collection or an image. Use the Depth Map node set to Invert and connect its image output to the depth ControlNet node.
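The Invert setting matters because sd-controlnet-depth was trained on MiDaS-style depth maps, where nearer surfaces are brighter. Conceptually, the node computes something like the following (a sketch, not Dream Textures' actual implementation):

```python
import numpy as np

def inverted_depth_map(z_buffer: np.ndarray) -> np.ndarray:
    """Normalize raw Z values to [0, 1] and invert them so that
    near surfaces are bright, as the depth ControlNet expects."""
    z = (z_buffer - z_buffer.min()) / (z_buffer.max() - z_buffer.min() + 1e-8)
    return 1.0 - z
```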

Multiple ControlNet nodes can be connected to a single Stable Diffusion node. This lets you use multiple controls together to influence the result. The strength of each control can be adjusted separately.
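In diffusers terms, this corresponds to passing a list of ControlNets to one pipeline, with a conditioning scale per control. A sketch extending the earlier snippet; the image file names are placeholders:

```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

# Both control models drive a single Stable Diffusion pipeline.
controlnets = [
    ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16),
    ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16),
]
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnets, torch_dtype=torch.float16
).to("cuda")

image = pipe(
    "a character waving, detailed hands",
    image=[load_image("pose_map.png"), load_image("depth_map.png")],
    # One strength per control: full weight for the pose, reduced for depth.
    controlnet_conditioning_scale=[1.0, 0.7],
).images[0]
image.save("result.png")
```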

Here's how this works:

  1. An OpenPose map is generated from the armature.

  2. A depth map is rendered for the hands.

  3. The depth and OpenPose ControlNet models are both connected to Stable Diffusion and influence the generation process. The depth model's strength is reduced slightly to improve the result.

Now hit render, and we get an image with the correct number of fingers!