
How to correctly convert Motion-X dataset to HumanML3D format (n,263)? #98

Open · MD-Student opened this issue Sep 23, 2024 · 11 comments

@MD-Student

I used the three scripts tomato_representation/raw_pose_processing, motion_representation, and motionx2hummanml/transfer_body_only to convert the Motion-X smplx_322 data. The resulting data has shape (n, 263), but it cannot be visualized correctly.
For example:
[screenshot of the incorrect visualization]
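For reference, the shapes involved at each end of the pipeline (a sketch; the file paths are hypothetical, not the actual dataset layout):

import numpy as np

# Raw Motion-X SMPL-X parameters: one row of 322 values per frame
motion = np.load('motion_data/smplx_322/kungfu/sample.npy')  # hypothetical path
print(motion.shape)     # expect (n_frames, 322)

# Output of the conversion pipeline: HumanML3D-style body-only features
converted = np.load('new_joint_vecs/sample.npy')             # hypothetical path
print(converted.shape)  # expect (n, 263)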

@mkstmyk commented Oct 2, 2024

I ran into the same problem and fixed it by adding the following functions to motion_representation.py. They modify the coordinate system of the joints before they are saved to the new_joints directory.

def transform_coordinates(joints):
    # Swap the Y and Z coordinates
    transformed_joints = joints.copy()
    transformed_joints[:, :, [1, 2]] = transformed_joints[:, :, [2, 1]]
    return transformed_joints

def invert_z_axis(joints):
    inverted_joints = joints.copy()
    inverted_joints[:, :, 2] *= -1  # Invert the Z-axis
    return inverted_joints

You can call them as follows:

rec_ric_data = recover_from_ric(torch.from_numpy(data).unsqueeze(0).float(), joints_num)
joints = rec_ric_data.squeeze().numpy()
joints = transform_coordinates(joints)
joints = invert_z_axis(joints)
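Then save the fixed joints where the pipeline expects them (a sketch; the output file name variable npy_name is hypothetical):

import os

os.makedirs('new_joints', exist_ok=True)
np.save(os.path.join('new_joints', npy_name), joints)  # npy_name: hypothetical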

@MD-Student
Author

> I ran into the same problem and fixed it by adding the following functions to motion_representation.py. […]

I can't express how excited I am. I will try this code as soon as possible.

@kangzejian1896

I also found that the body in the data was facing downwards, and after applying a rotation matrix it seems to be correct:

import numpy as np
from scipy.spatial.transform import Rotation as R

trans = dict_idea400['trans'].reshape(-1, 3)
root_orient = dict_idea400['root_orient'].reshape(-1, 3)
# -90 degree rotation about the X-axis (maps y -> z, z -> -y)
R_x_90 = np.array([[1, 0, 0],
                   [0, 0, 1],
                   [0, -1, 0]])
root_orient_rot = R.from_rotvec(root_orient).as_matrix()  # (87, 3, 3)
root_orient_fixed = np.zeros_like(root_orient)  # (87, 3)
trans_fixed = np.zeros_like(trans)  # (87, 3)
for i in range(len(root_orient)):
    root_orient_rot_90 = np.dot(R_x_90, root_orient_rot[i])  # (3, 3)
    root_orient_fixed[i] = R.from_matrix(root_orient_rot_90).as_rotvec()  # (3,)
    trans_fixed[i] = np.dot(R_x_90, trans[i])  # (3,)
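The per-frame loop can also be written in one shot using scipy's broadcast composition (a sketch equivalent to the loop above):

R_fix = R.from_matrix(R_x_90)
root_orient_fixed = (R_fix * R.from_rotvec(root_orient)).as_rotvec()  # (n, 3)
trans_fixed = trans @ R_x_90.T  # rotates every translation row by R_x_90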

@MD-Student
Author

> I also found that the body in the data was facing downwards, and after applying a rotation matrix it seems to be correct: […]

Thank you very much!!!!
I will try this code soon!

@PerfectBlueFeynman
Hi @MD-Student, may I ask which sub-dataset you are converting?
I've encountered some errors about mismatched matrix dimensions in the blend shape function. Are you using the SMPL-H model from the website, or one processed by merging MANO and SMPL?

Thank you.

@MD-Student
Author

> Hi @MD-Student, may I ask which sub-dataset you are converting? […]

As a matter of fact, my dataset structure is also a mess. The sub-dataset I am converting is in the kungfu folder, and I am using the SMPL-H model downloaded from its website.

@PerfectBlueFeynman
PerfectBlueFeynman commented Oct 10, 2024

> As a matter of fact, my dataset structure is also a mess. The sub-dataset I am converting is in the kungfu folder, and I am using the SMPL-H model downloaded from its website.

Thank you for the information, same here XD.
Have you ever hit an error about the number of betas not matching the dimension of shape_disps?
And is the SMPL-X model you're using version 1.1?
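The workaround I'm considering is forcing the shape-space size when loading the model, roughly like this (a sketch using the smplx package; the model path and num_betas=10 are guesses to be matched to your data):

import smplx

# Load SMPL-X with an explicit number of shape components so that the
# betas in the data and the model's shape_disps agree in size.
model = smplx.create('models', model_type='smplx',
                     gender='neutral', num_betas=10, use_pca=False)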

@MD-Student
Author

MD-Student commented Oct 11, 2024

> Thank you for the information, same here XD. Have you ever hit an error about the number of betas not matching the dimension of shape_disps? And is the SMPL-X model you're using version 1.1?

I have not encountered the problem of mismatched data dimensions; I just followed the ipynb scripts.
I downloaded the SMPL model from the website, so the version should be the latest.

@zhangyuhong01
Collaborator

Hello, I have uploaded the new version of the Tomato-aligned motion data from Motion-X++ here, so you can check it. Some visualizations:
from the kungfu subset: [kungfu visualization]
from the game_motion subset: [game_motion visualization]
from the humman subset: [humman visualization]
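A minimal sanity check for one of the new files (a sketch: the import path and file name are assumptions, and joints_num=22 assumes the body-only HumanML3D layout):

import numpy as np
import torch
from common.motion_process import recover_from_ric  # hypothetical import path

data = np.load('kungfu/sample.npy')  # hypothetical file name
assert data.shape[-1] == 263, data.shape
joints = recover_from_ric(torch.from_numpy(data).unsqueeze(0).float(), 22)
print(joints.shape)  # expect (1, n_frames, 22, 3)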

@MD-Student
Author

> Hello, I have uploaded the new version of the Tomato-aligned motion data from Motion-X++ here […]

Thank you very much! I will check it soon.

@PerfectBlueFeynman
> Hello, I have uploaded the new version of the Tomato-aligned motion data from Motion-X++ here […]

Hi @zhangyuhong01, thank you for the new version release! I happened to find some problems with the text annotations for the animation subset: in the Motion-X/texts/animation folder there are only .npy files instead of .txt files.
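A quick check of what those .npy files actually contain (the file name is hypothetical):

import numpy as np

arr = np.load('Motion-X/texts/animation/clip_0001.npy', allow_pickle=True)
print(type(arr), arr)  # check whether the annotation text ended up inside the array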
