How to make a 3D model (FBX) react to body movements with MediaPipe

I’m a beginner in TouchDesigner.
I created a 3D garment using CLO3D and I’d like to make it morph in response to my body movements using MediaPipe, but I’m having some trouble and would really appreciate your help.

My visual goal is to have the 3D object move naturally in sync with the direction my body is facing, along with my shoulders and arm joints—almost like looking into a mirror.

Is it possible to morph a solid 3D object naturally based on body movement?
If so, I’d be very grateful if you could let me know how I should prepare the FBX file for that to work.

I modeled the clothing in CLO 3D and exported it as an FBX file. Depending on whether I choose “single object” or “multiple objects” when exporting, the file is imported into TouchDesigner differently, as shown in the screenshots.

Since I want each part of the clothing to move separately, it seems like exporting as multiple objects is the better choice. However, when I do that, the imported model appears only as lines instead of solid geometry. Would it still be possible to achieve my goal using a model exported as a single object?

Here’s the link to the project file.
Thank you so much in advance for your guidance!

This isn’t really an easy beginner problem, and I don’t have an easy solution. I’ve goofed around trying to figure it out, but I don’t know much about the FBX format. Your project doesn’t include the FBX file, so I couldn’t try attaching my broken “solution” to your model, but here is a Script CHOP that should get you closer to an answer.

I was working from some examples in the POPs alpha that run animations on a humanoid skeleton. I don’t do any 3D rendering, so I don’t have easy access to a T-pose to set up a simpler test, and I can’t really see why this doesn’t work.

This is hacked together from a method I used to get simple joint angles from body-tracking data for sonification, but the principle is the same. MediaPipe doesn’t include any joint-angle info, so you have to calculate it from three joint positions. I created a mapping from the FBX joint names I observed to the names of the three MediaPipe points that can compute that rotation. I naively set the position to the midpoint of the first two points; if I want to compute the rotation from only two joints, I leave the third string blank and use their midpoint.
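For reference, the simpler “joint angle” idea works like this: take three landmark positions and measure the interior angle at the middle joint. A minimal sketch, separate from the full script below (the positions here are made up, not real MediaPipe output):

import numpy as np

def joint_angle(a, b, c):
    # Interior angle at joint b, in degrees, given the positions of joints a, b, c.
    v1 = np.asarray(a) - np.asarray(b)
    v2 = np.asarray(c) - np.asarray(b)
    cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# e.g. elbow flexion from shoulder / elbow / wrist positions (illustrative values)
elbow_deg = joint_angle([0.1, 0.5, 0.0], [0.3, 0.3, 0.0], [0.5, 0.4, 0.1])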

There are also parameters like RightForeArmRoll and other Roll parameters, and I have no idea what they communicate.

If you drop the FBX here, I might be able to test with it at some point. Maybe someone more versed in FBX would be interested in looking into this. I don’t even know where to find a detailed description of the FBX data format; I’m too lazy to sift through the Autodesk marketing jargon at the moment.

I think FBX uses a tree-like structure, so perhaps the individual joints are offsets from their parent rather than the absolute positions I currently compute, i.e. LeftArm is a directional vector from LeftShoulder. If so, that would be an obvious bug in my code to anyone who knows what they’re doing here.
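For what it’s worth, here is a rough sketch of that parent-relative idea: converting world-space landmark positions into offset vectors from an assumed parent joint. The parent mapping is a guess on my part, not something read from the actual FBX. The full Script CHOP follows below.

import numpy as np

# Assumed parent relationships; a real rig should be inspected rather than guessed.
PARENTS = {
    'LeftArm': 'LeftShoulder',
    'LeftForeArm': 'LeftArm',
    'RightArm': 'RightShoulder',
    'RightForeArm': 'RightArm',
}

def to_parent_offsets(world_positions):
    # world_positions: dict of joint name -> np.array([x, y, z]) in world space.
    offsets = {}
    for joint, parent in PARENTS.items():
        if joint in world_positions and parent in world_positions:
            offsets[joint] = world_positions[joint] - world_positions[parent]
    return offsets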

import numpy as np


def matrix_to_euler_xyz(rotation_matrix):
    # Decompose a 3x3 rotation matrix into XYZ Euler angles.
    # Note: values are in radians; TouchDesigner rotate parameters expect degrees.

    # rx
    roll = np.arctan2(rotation_matrix[2, 1], rotation_matrix[2, 2])

    # ry
    pitch = np.arctan2(-rotation_matrix[2, 0], np.sqrt(rotation_matrix[0, 0]**2 + rotation_matrix[1, 0]**2))

    # rz
    yaw = np.arctan2(rotation_matrix[1, 0], rotation_matrix[0, 0])

    return np.array([roll, pitch, yaw])


def pt(src, names):
    # Look up three landmark positions from the input CHOP.
    # If the third name is blank, use the midpoint of the first two points.
    assert len(names) == 3, "Expected a list or tuple of exactly 3 names."
    p1 = np.array(src[names[0]].vals)
    p2 = np.array(src[names[1]].vals)
    if names[2] == "":
        p3 = 0.5 * (p1 + p2)
    else:
        p3 = np.array(src[names[2]].vals)
    return p1, p2, p3



def compute_rotation_matrix(p1, p2, p3):
    # Build an orthonormal frame from three points: x along p3 -> p1,
    # z perpendicular to the plane of the three points, y completing the frame.
    x_axis = p1 - p3
    x_axis /= np.linalg.norm(x_axis)

    helper = p2 - p3
    z_axis = np.cross(x_axis, helper)
    # When the third joint name is blank, p3 is the midpoint of p1 and p2,
    # so x_axis and helper are collinear and the cross product collapses to zero.
    # Fall back to a world-up helper so the frame stays well defined.
    if np.linalg.norm(z_axis) < 1e-8:
        up = np.array([0.0, 1.0, 0.0])
        if abs(np.dot(x_axis, up)) > 0.99:
            up = np.array([0.0, 0.0, 1.0])
        z_axis = np.cross(x_axis, up)
    z_axis /= np.linalg.norm(z_axis)

    y_axis = np.cross(z_axis, x_axis)
    y_axis /= np.linalg.norm(y_axis)

    return np.stack([x_axis, y_axis, z_axis], axis=1)  # columns are x, y, z


def onCook(scriptCHOP):
    scriptCHOP.clear()
    src = scriptCHOP.inputs[0]

    # FBX joint mapping table
    joint_map = {
        # Hips and Spine
        'Hips':           ('left_hip', 'right_hip', ''),
        'Spine':          ('left_shoulder', 'right_shoulder', ''),
        'Chest':          ('left_shoulder', 'right_shoulder', ''),
        'Neck':           ('left_shoulder', 'right_shoulder', 'nose'),
        'Head':           ('left_eye_inner', 'right_eye_inner','nose'),

        # Shoulders and Arms
        #'LeftArm':   ('left_shoulder', 'left_elbow', ''),
        'LeftForeArm':   ('left_elbow', 'left_wrist', ''),
        #'RightArm':  ('right_shoulder', 'right_elbow', ''),
        'RightForeArm':  ('right_elbow', 'right_wrist', ''),

        # Legs
        'LeftUpLeg':   ('left_hip', 'left_knee', ''),
        'LeftLeg':   ('left_knee', 'left_ankle', ''),
        'RightUpLeg':  ('right_hip', 'right_knee', ''),
        'RightLeg':  ('right_knee', 'right_ankle', ''),

        # Hands
        'LeftHand':       ('left_wrist', 'left_thumb', 'left_index'),
        'RightHand':      ('right_wrist', 'right_thumb', 'right_index'),

        # Feet
        'LeftFoot':       ('left_ankle', 'left_foot_index', 'left_knee'),
        'RightFoot':      ('right_ankle', 'right_foot_index', 'right_knee'),

        # Fingers
        'LeftThumb1':      ('left_thumb', 'left_wrist', 'left_index'),
        'RightThumb1':     ('right_thumb', 'right_wrist', 'right_index'),
        'LeftIndex1':      ('left_index', 'left_thumb', 'left_wrist'),
        'RightIndex1':     ('right_index', 'right_thumb', 'right_wrist'),
    }

    # Output translation and rotation channels for each joint.
    # Channel names use the node:parameter pattern (e.g. Hips:rx), matching
    # TouchDesigner's CHOP export naming so they can drive matching parameters.
    for joint_name, names in joint_map.items():
        p1, p2, p3 = pt(src, names)
        pos = (p1 + p2) / 2
        rot_matrix = compute_rotation_matrix(p1, p2, p3)
        rot = matrix_to_euler_xyz(rot_matrix)
        for i, axis in enumerate('xyz'):
            scriptCHOP.appendChan(f'{joint_name}:t{axis}')[0] = pos[i]
            scriptCHOP.appendChan(f'{joint_name}:r{axis}')[0] = rot[i]
    return

Thanks so much again! I’ll definitely take a look at the script you shared.

By the way, could you briefly explain how I should connect MediaPipe, the Script CHOP, and the imported FBX file together?

I just realized that the FBX file was missing from the project—so I’ve added it to the Google Drive link above, just in case.

Thanks again!

mediatofbxskel.toe (38.0 KB)

Here is an example with the defaults in TouchDesigner. Looking at your FBX, it probably needs some kind of skeletal rigging rather than just the cloth mesh. I have no idea how to go about that, but I would start by loading a default skeletal rig into TouchDesigner and developing from there.

In order for this to pick up MediaPipe, make sure it is in the folder above the toxes folder, i.e. the same folder as MediaPipe TouchDesigner.toe. I have Enable External .tox selected on the Common page of the MediaPipe base so that it doesn’t include MediaPipe in my .toe file. This reduces the file size and lets me share the file directly on the forum (:
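If it helps to see the hookup in text form: the channels the Script CHOP writes are named node:parameter (e.g. Hips:rx), which is the same pattern TouchDesigner’s CHOP export uses, so you can either turn on the CHOP’s export flag or bind parameters by expression. Here is a rough, untested sketch of the expression route; 'script1' and 'fbx1' are assumed operator names, and the joint COMPs inside your imported FBX may not match the channel names exactly:

# Assumed names: 'script1' is the Script CHOP above, 'fbx1' is the imported FBX COMP.
angles = op('script1')
skeleton = op('fbx1')

for chan in angles.chans():
    node_name, par_name = chan.name.split(':')      # e.g. 'Hips', 'rx'
    joints = skeleton.findChildren(name=node_name)  # matching joint COMP, if any
    if not joints:
        continue
    pars = joints[0].pars(par_name)                 # e.g. the rx parameter
    if pars:
        # Reference the CHOP channel so the joint follows the computed value.
        pars[0].expr = f"op('{angles.path}')['{chan.name}']"
        pars[0].mode = ParMode.EXPRESSION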

good luck! and share here if you get anywhere!

When exporting the FBX file, there’s an option to include the rigging structure, so I exported it with that enabled. I’m going to try applying it using the node setup you shared.
Thanks again—I’ll let you know if it works!