Good work! How can I run the demo on Internet videos? #8
Comments
Hi, currently we do not have plans for demo code. To test on Internet videos, you can use any off-the-shelf method for the initialization, then format the initialized motion sequence following the sample sequences' data format provided in 'Test and evaluate on PROX/EgoBody' in the README, with a z-up (or y-up) world axis.
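For anyone preparing such a sequence, below is a minimal sketch of the axis-conversion step, assuming a y-up initialization that needs rotating into a z-up world frame. The key names (`transl`, `global_orient`, `body_pose`) follow common SMPL-X conventions and are assumptions, not the repo's confirmed layout; compare against the sample sequences from the README.

```python
# Sketch: converting an initialized motion sequence from a y-up to a z-up
# world frame before saving it. Key names are assumed SMPL-X conventions;
# check the sample sequences in the README for the actual format.
import numpy as np
from scipy.spatial.transform import Rotation as R

def yup_to_zup(seq):
    """Rotate the whole sequence 90 degrees about x, so +y maps to +z."""
    R_x = R.from_euler('x', 90, degrees=True)
    out = dict(seq)
    out['transl'] = R_x.apply(seq['transl'])          # (T, 3) root translations
    root = R.from_rotvec(seq['global_orient'])        # (T,) rotations from axis-angle
    out['global_orient'] = (R_x * root).as_rotvec()   # left-compose the world rotation
    # body_pose is relative to the root joint, so it is unchanged
    return out

# Dummy sequence with the assumed layout (T = 100 frames)
seq = {
    'transl': np.zeros((100, 3)),
    'global_orient': np.zeros((100, 3)),
    'body_pose': np.zeros((100, 63)),  # 21 SMPL-X body joints x 3 (axis-angle)
}
np.savez('init_motion.npz', **yup_to_zup(seq))
```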
I'm trying to run this method on a custom dataset of RGB-only videos. Preparing the dataset to run your model is becoming quite a challenge. Please reconsider releasing a demo script; easier reproducibility means more citations, after all ;)
Thank you for this impressive work! +1 on some demo code. A documented Colab would be SO useful, even if it starts from something like pre-computed SMPL-X shape, pose, and translation sequences (from LEMO or GT data, perhaps) without relying directly on images. Just an idea for a possible flow that I'd personally find helpful!
Hi, we are stuck on downloading the SMPL-X data for the AMASS dataset as well.
I'm having the same difficulties. A demo would be much appreciated.
Thanks for releasing such outstanding work! I wonder how to test it on Internet videos.
I can now get the init_motion with the WHAM project (containing world_pose_root/cam_pose_root, body_pose, transl_cam/trans_world, and OpenPose-25 joints), and I want to use your work to refine the init_motion (reduce foot sliding and jitter, improve 2D keypoint consistency). Will you provide demo code for Internet videos?
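In case it helps others on the same path, here is a rough sketch of repacking a WHAM result into a single init_motion file. The key names mirror the ones listed above (`world_pose_root`, `body_pose`, `trans_world`, `openpose25`) and the file layout (a joblib pickle keyed by tracked subject) is an assumption; adjust the lookups to match your actual WHAM output.

```python
# Sketch: repacking WHAM per-subject output into one init_motion dict.
# Key names and file layout are assumptions based on the comment above;
# inspect your own WHAM results file and adjust accordingly.
import joblib
import numpy as np

wham = joblib.load('wham_output.pkl')[0]  # first tracked subject (assumed layout)

init_motion = {
    'global_orient': np.asarray(wham['world_pose_root']),  # (T, 3) axis-angle, world frame
    'body_pose':     np.asarray(wham['body_pose']),        # (T, 63) axis-angle
    'transl':        np.asarray(wham['trans_world']),      # (T, 3) world translation
    'joints_2d':     np.asarray(wham['openpose25']),       # (T, 25, 3) x, y, confidence
}
np.savez('init_motion.npz', **init_motion)
```

From there, the y-up to z-up conversion sketched earlier in this thread can be applied if the target format expects a different up axis than WHAM's world frame.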