Human joint constraints #919
Quick suggestion: I think
This reverts commit 6c04669.
Hi JS, I have tried your suggestion and everything builds on my Mac, so I updated my PR. I am confused about why the change set won't build on CI. Do you have any ideas?
Specifically, I am looking at the macOS CI, where I got the following errors:

/Users/travis/build/dartsim/dart/dart/constraint/HumanArmJointLimitConstraint.cpp:76:24: error: no matching member function for call to 'getDof'
/Users/travis/build/dartsim/dart/tiny_dnn/util/util.h:397:32: error: no matching function for call to 'zeros'
I haven't had time to do a thorough review yet, but I have some high-level thoughts from a quick glance.

How necessary is it to make two different constraint classes: one for the arm and one for the leg? Is it possible to create a limb constraint class which is more generic, and agnostic to whether it is being applied to an arm or a leg? DART tries to be generic whenever possible, since there are many types of robots or characters that we would like to be able to simulate. Similarly, do the constraints have to specifically be human? One way to shift towards generality is to have a

I'll do a more thorough review as soon as I get the chance. There's a lot of content to look over, so I'm not sure that I could offer an ETA right now.
Hi Grey, thanks for the feedback. Yes, the constraints are specific to humans, since the neural net functions are learned from data on human range of motion. (For instance, our shoulders have an extremely large ROM.)

I think I could merge the 2 classes into 1 limb class with some effort. I didn't do that because in that case we would see some large IF blocks in the core constraint functions. Since q_arm and q_leg have different DOFs (4 vs. 6; we don't have data for hands) and different neural net shapes, they don't share logic during back-propagation, for example. Speaking of reading clarity, is that what we prefer?

Actually, the fitted functions are pretty specific to a certain configuration: the first joint has to be a 3-DoF shoulder and the second has to be a 1-DoF elbow, and we still need certain box limits set on top of our functions. I guess since humans are often what we are interested in, we might be a bit more "tolerant" in being specific.
Are we assuming that these constraint types must be used with a specific data set? Suppose I wanted to create an avatar of myself which accurately reflects my own flexibility (which may be more or less flexible than the data source that you used in your research project); is that something I could do using this pull request, as long as I have normal human-like joints? Now suppose I wanted an avatar of my dog, and I have collected motion capture data for him and fitted some functions using a neural network to describe his joint limits. What would be stopping me from using your constraint class to simulate those dog-like fitted functions, besides the fact that the choice of joints is hard-coded? (These are genuine questions, not rhetorical questions.)

It also strikes me that

I would propose that these constraint types (along with

And I want to be clear: my feeling on this says nothing about the quality of the content. I've watched the video and skimmed the paper, and I fully believe it's very valuable, high-quality work. But its application is very specific, especially in its current implementation, and we should not underestimate the burden of saddling users with bytes that they are unlikely to use. Software bloat can become a seriously debilitating issue if it goes unchecked.

I am happy to explore whatever changes need to be made to DART to allow these constraints to be used from an external library. I suspect those changes would be relatively minor and very desirable for the general population of DART users. I can also offer guidance on how to set your classes up in their own repo as an external library, although I may be a little slow on that since I have some projects right now that I'm trying to finish up soon.

If the implementations of these constraint types can be made more generic (and especially if the direct dependency on

If anyone disagrees with my evaluation, I'll gladly reconsider my stance.
I generally agree with @mxgrey. Another possible solution would be to keep the constraints in this repo but use

One problem with doing this is that
Even without the
👍 I also have the same question as in the first paragraph of this comment, before adding more thoughts on whether we want to have this PR in this repo or in an outside repo.
I'm also considering splitting |
I agree that soft bodies shouldn't be removed from DART, but I do believe they should be replaced with a soft body framework that's more comprehensive and general. What we currently have is better than not having any soft body support at all, so we should certainly keep it until we can put together a replacement.
Thanks for the input here, Grey. I greatly appreciate it. I tend to agree that this work might be better kept external to DART. The burden from tiny-dnn is high considering it is only used in one application. Let me talk to Karen and see how she feels about it.
It seems we have two major issues here:
I think these issues are solvable. Issue 1 can be resolved by making
Issue 2 can also be resolved by revising the new constraints to be generic, so that they work with any set of
The above description is pretty simplified, but we could continue to discuss the details. The only part that is unclear to me in making the joint constraint class generic is the input and output of the neural network in

Could you point out which part I should look at in your paper, or simply explain it here for me? In particular, I wonder where those sin/cos terms come from.
Let me first try to explain this line: qz is one DoF (joint angle). Though it ranges in (-inf, inf), it is actually circular: all the 2k\pi values are the same configuration. Neural nets dislike inputs in (-inf, inf), so we inform the net by inputting cos(qz) and sin(qz) instead of qz itself.

All the other DoFs (e.g. qx) are limited to a \pi range: they distribute only on a semicircle rather than a full circle like qz. In this case, we replace such a DoF (e.g. qx) with its cosine, so that all the inputs to the NN are in the range [-1, 1], making the training easier.

This is the consequence of the above. Since we input sin(q) and cos(q) instead of q, we need an additional chain rule when calculating gradients.

This is simply because we train the NN on the left arm/leg, but would like it to work for the right limbs as well. So we take the mirror value of some joint angles and then input them to the nets.

Thanks for composing a blueprint for rebuilding this pull request. I thought about your suggestions and agree that the main unclear part would be the pre/post-processing steps discussed above. With the goal of making the class more generalizable, could it be helpful to make the pre/post-processing steps overridable (virtual) functions? I could imagine every JointLimit class would need slightly different steps (like whether to use sin or cos, etc.). Surely I am no expert here, just thinking this should be taken care of.
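To make the pre-processing and chain rule above concrete, here is a minimal sketch. The function names (encodeArmInput, gradientWrtJointAngles) and the 4-DoF arm layout [qx, qy, qz, qe] are invented for illustration based on the discussion; this is not DART's actual API.

```cpp
#include <cmath>
#include <vector>

// Encode the 4 arm DoFs for the network input, as described above:
// the circular DoF qz becomes (cos qz, sin qz); the half-range DoFs
// qx, qy, qe become their cosines, so every input lies in [-1, 1].
std::vector<double> encodeArmInput(const std::vector<double>& q)
{
  // q = [qx, qy, qz, qe]: 3-DoF shoulder + 1-DoF elbow
  return {std::cos(q[0]), std::cos(q[1]),
          std::cos(q[2]), std::sin(q[2]),
          std::cos(q[3])};
}

// Extra chain-rule step: given dC/dx from the network's back-propagation
// (one entry per encoded input), recover dC/dq in joint space, using
// d(cos q)/dq = -sin(q) and d(sin q)/dq = cos(q).
std::vector<double> gradientWrtJointAngles(
    const std::vector<double>& q, const std::vector<double>& dCdx)
{
  return {
      dCdx[0] * -std::sin(q[0]),
      dCdx[1] * -std::sin(q[1]),
      dCdx[2] * -std::sin(q[2]) + dCdx[3] * std::cos(q[2]),
      dCdx[4] * -std::sin(q[3])};
}
```

A leg variant would only differ in the encoding and its chain rule, which is why exposing these two steps as virtual functions could make the class reusable for other limbs.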
Hi all,
This PR uses Neural Nets to simulate realistic human range of motion:
https://youtu.be/wzkoE7wCbu0
http://arxiv.org/abs/1709.08685
Basically, there is a learned neural net function C(q); C(q) = 0.5 represents the boundary of valid human range of motion.
In every timestep, a forward pass and back-propagation of the NN are executed to evaluate C(q) and dC/dq for the current q. If q is invalid, a constraint force in the dC/dq direction is generated by the LCP solver to push q back into the valid region.
In practice, there is one NN for the arms (C_arm(q)), and one for the legs (C_leg(q)) to make training and evaluations easier.
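The per-timestep logic can be sketched as follows. A toy 1-DoF quadratic stands in for the learned network here, and I assume the valid side of the boundary is C(q) >= 0.5; constraintDirection is an invented name, not DART's API.

```cpp
// Toy stand-in for the learned limit function C(q), purely to illustrate
// the per-timestep check. In the real system C(q) is the neural net's
// forward pass and dC/dq comes from its back-propagation.
double C(double q) { return 1.0 - q * q; }
double dCdq(double q) { return -2.0 * q; }

// Returns the (unscaled) direction of the corrective constraint force,
// or 0 when q is already valid. The LCP solver would determine the force
// magnitude; only the direction dC/dq is supplied by the constraint.
double constraintDirection(double q)
{
  if (C(q) >= 0.5)
    return 0.0;     // inside the valid range of motion: no force needed
  return dCdq(q);   // invalid: push q back toward the C(q) = 0.5 boundary
}
```

For this toy C, a configuration like q = 1 is invalid (C(1) = 0 < 0.5) and the returned direction points back toward the valid region around q = 0.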
Tested on macOS 10.12; not sure about Linux.
We need to figure out how to include the external library tiny-dnn: #918
Thanks in advance for helping with this.
Yifeng