iiwa LBR 7 R800: EndEffectorPose is upside down #80

Open
MightyMirko opened this issue Jan 2, 2024 · 10 comments


MightyMirko commented Jan 2, 2024

Hello and thank you,

I tested your library today with a KUKA FRI connection. It works, of course, but using the given model file from rl-examples I can't get the correct transformation of the end-effector orientation:

**Details**

robotModel::robotModel(const std::string &xmlFilePath) {
    rl::mdl::XmlFactory factory;
    std::shared_ptr<rl::mdl::Model> tempModel(factory.create(xmlFilePath));
    model = std::move(tempModel);  // Move ownership to the member variable
    kinematics = dynamic_cast<rl::mdl::Kinematic *>(model.get());
    q.resize(kinematics->getDof());
    //std::printf("DoF: %i",sizeof(q));
}

void robotModel::performForwardKinematics() {
//    q *= rl::math::DEG2RAD;

    kinematics->setPosition(q);
    kinematics->forwardPosition();
    rl::math::Transform t = kinematics->getOperationalPosition(0);
    rl::math::Vector3 position = t.translation();


    //std::cout << "Was steckt in T " <<position<< std::endl;
    rl::math::Vector3 orientation = t.rotation().eulerAngles(2, 1, 0).reverse();
    printVector(position, orientation);
}

void robotModel::printVector(rl::math::Vector3 &position, rl::math::Vector3 &orientation) {
    std::cout << "x: " << position.x() << " m\ty: " << position.y() << " m\tz: " << position.z() << " m" << std::endl;
    std::cout << "a: " << orientation.x() * rl::math::RAD2DEG << " deg\tb: " << orientation.y() * rl::math::RAD2DEG
              << " deg\tc: " << orientation.z() * rl::math::RAD2DEG << " deg" << std::endl;
    std::cout << "Joint configuration in degrees: " << q.transpose() * rl::math::RAD2DEG << std::endl;
    std::cout << "End-effector position: [m] " << position.transpose() << " orientation [deg] "
              << orientation.transpose() * rl::math::RAD2DEG << std::endl;
}

void robotModel::setQ(double *jointPos) {
    q << jointPos[0],
            jointPos[1],
            jointPos[2],
            jointPos[3],
            jointPos[4],
            jointPos[5],
            jointPos[6];
}

Output is:

Session State:	MONITORING_READY
x: -0.45935 m	y: -0.00375237 m	z: 0.613881 m
a: -179.617 deg	b: -28.2452 deg	c: 179.657 deg
Joint configuration in degrees:    0.465831 -0.00258179  0.00238266     92.1643 -0.00076904    -59.5855    0.715265
End-effector position: [m]    -0.45935 -0.00375237    0.613881 orientation [deg] -179.617 -28.2452  179.657

However, A should be positive and C negative according to my Sunrise SmartPad (Cartesian position in the world frame, on the flange).

Do I have to change rlmdl/kuka-lbr-iiwa-7-r800.xml, or is this the correct behaviour?

2:

Perhaps I overlooked it, but is there also an rlkin file for the 7 R800? EDIT: apparently rl::kin is deprecated?

My goal is to get the TCP position and velocity. I also want to write a simple end-effector mockup that uses this velocity.


rickertm commented Jan 3, 2024

The KUKA ABC orientation values represent Euler angles in the convention z-y'-x'' calculated as R = Rz(A) Ry(B) Rx(C). To calculate these values from a rotation matrix, you can use

rl::math::Vector3 orientation = t.rotation().eulerAngles(2, 1, 0);
rl::math::Real a = orientation(0);
rl::math::Real b = orientation(1);
rl::math::Real c = orientation(2);
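
As a quick sanity check of this convention, here is a minimal standalone sketch (plain Eigen, which rl::math wraps; the A/B/C values are made up). Note that Eigen's eulerAngles() constrains the first returned angle to [0, pi], so it may return an equivalent triple rather than the original values; this is also one reason why the same pose can show up as, e.g., A close to -180 deg in one representation and close to +180 deg in another.

#include <iostream>

#include <Eigen/Geometry>

int main()
{
    // Hypothetical A, B, C values in radians (z-y'-x'' convention)
    double a = 0.3, b = -0.5, c = 1.2;

    // R = Rz(A) * Ry(B) * Rx(C)
    Eigen::Matrix3d r =
        (Eigen::AngleAxisd(a, Eigen::Vector3d::UnitZ()) *
         Eigen::AngleAxisd(b, Eigen::Vector3d::UnitY()) *
         Eigen::AngleAxisd(c, Eigen::Vector3d::UnitX())).toRotationMatrix();

    // eulerAngles(2, 1, 0) returns the angles in argument order, i.e. (A, B, C);
    // the result may be an equivalent triple describing the same rotation
    Eigen::Vector3d abc = r.eulerAngles(2, 1, 0);
    std::cout << "A B C [rad]: " << abc.transpose() << std::endl;

    return 0;
}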

I do not currently have an rl::kin model definition for the iiwa 7 R800. The kinematic model is easy to define, but the existing rl::sg geometric models use different coordinate systems and would have to be adjusted, since rl::mdl does not require a rotation around the z axis, whereas rl::kin uses DH notation according to Paul. Development focuses on rl::mdl, as it can easily support different joint models. The earlier rl::kin is kept for backward compatibility and as a DH reference implementation at the moment.

@MightyMirko (Author)

Hi Markus,

thanks for the reply.

I have it running like you suggested.

void robotModel::performForwardKinematics() {
//    q *= rl::math::DEG2RAD;

    kinematics->setPosition(q);
    kinematics->forwardPosition();
    rl::math::Transform t = kinematics->getOperationalPosition(0);
    rl::math::Vector3 position = t.translation();
    //std::cout << "Was steckt in T " <<position<< std::endl;
    rl::math::Vector3 orientation = t.rotation().eulerAngles(2, 1, 0).reverse();
    rl::math::Real a = orientation(0) * rl::math::RAD2DEG;
    rl::math::Real b = orientation(1) * rl::math::RAD2DEG;
    rl::math::Real c = orientation(2) * rl::math::RAD2DEG;
    std::cout<<"a: "<<a<<"\tb: "<<b<<"\tc: "<<c<<std::endl;
    //printVector(position, orientation);
}

Output:

Session State:	MONITORING_READY
 J0: 0.465831
 J1: -0.002582
 J2: 0.002372
 J3: 92.164334
 J4: -0.000731
 J5: -59.585480
 J6: 0.715268
a: -179.617	b: -28.2452	c: 179.657
Session State:	MONITORING_READY

Sunrise Station:

[Screenshots of the Sunrise station's Cartesian position display]



rickertm commented Jan 3, 2024

Please note the difference: your code still contains the extra .reverse(), which reverses the order of the returned angles (so the value you print as a is actually C, and vice versa). It should be

rl::math::Vector3 orientation = t.rotation().eulerAngles(2, 1, 0);

@MightyMirko (Author)

Oh, sorry for that :-)

Could you help me with a follow-up question? What would be a good approach to get the TCP velocity? Do I have to use the Jacobian, or is there already a built-in solution?


rickertm commented Jan 3, 2024

You can get the TCP velocity from your joint velocity via the forwardVelocity function

kinematics->setPosition(q);
kinematics->setVelocity(qd);
kinematics->forwardVelocity();
rl::math::MotionVector xd = kinematics->getOperationalVelocity(0);
// transform to world frame
kinematics->forwardPosition();
rl::math::Vector3 v = kinematics->getOperationalPosition(0).linear() * xd.linear();
rl::math::Vector3 omega = kinematics->getOperationalPosition(0).linear() * xd.angular();

or via the Jacobian matrix:

kinematics->setPosition(q);
kinematics->forwardPosition();
kinematics->calculateJacobian();
rl::math::Vector xd = kinematics->getJacobian() * qd;
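
For completeness, here is a minimal self-contained sketch that puts these calls together in one program. The model file name, the joint values, and the include paths are my own assumptions for illustration, not something prescribed by this thread.

#include <iostream>
#include <memory>

#include <rl/math/Transform.h>
#include <rl/math/Vector.h>
#include <rl/mdl/Kinematic.h>
#include <rl/mdl/Model.h>
#include <rl/mdl/XmlFactory.h>

int main()
{
    // Hypothetical model file and joint state
    rl::mdl::XmlFactory factory;
    std::shared_ptr<rl::mdl::Model> model(factory.create("kuka-lbr-iiwa-7-r800.xml"));
    rl::mdl::Kinematic* kinematics = dynamic_cast<rl::mdl::Kinematic*>(model.get());

    rl::math::Vector q = rl::math::Vector::Zero(kinematics->getDof());
    rl::math::Vector qd = rl::math::Vector::Constant(kinematics->getDof(), 0.1);

    // Forward position and velocity kinematics
    kinematics->setPosition(q);
    kinematics->setVelocity(qd);
    kinematics->forwardPosition();
    kinematics->forwardVelocity();

    // End-effector velocity, rotated into the world frame
    auto xd = kinematics->getOperationalVelocity(0);
    rl::math::Vector3 v = kinematics->getOperationalPosition(0).linear() * xd.linear();
    rl::math::Vector3 omega = kinematics->getOperationalPosition(0).linear() * xd.angular();

    std::cout << "v [m/s]: " << v.transpose() << std::endl;
    std::cout << "omega [rad/s]: " << omega.transpose() << std::endl;

    return 0;
}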

@MightyMirko (Author)

Hello Markus,

OK, I'm trying it right now.

I may have a misunderstanding, and perhaps I'm overestimating this repo, but with FRI I can only get torques and joint positions. Each cycle (5 ms) I set the joint positions sent by FRI:

// Update q from FRI
    lbr.q << jointPos[0],
            jointPos[1],
            jointPos[2],
            jointPos[3],
            jointPos[4],
            jointPos[5],
            jointPos[6];

The proposed solution requires the joint velocities qd, correct? My question is whether there is a way to obtain the joint velocities ($\dot q$) directly from the FRI data, or if I need to derive them manually from successive joint positions. I would appreciate any guidance or clarification you can provide.

Thank you for your time and assistance.


MightyMirko commented Jan 4, 2024

Ok got it now:

/*
 * One-sided (backward) differencing for numerical differentiation
 */
rl::math::Vector mastersclient::calcJointVel(double dt) {
    size_t size = jointTest.size();
    rl::math::Vector derivativeVector(size);
    derivativeVector.setZero();

    // Backward difference for every joint (i indexes the joints, not stencil points)
    for (size_t i = 0; i < size; ++i) {
        derivativeVector(i) = (jointTest[i] - oldJointPos[i]) / dt;
        derivativeVector(i) = std::round(derivativeVector(i) * 10000.0) / 10000.0;
    }

    return derivativeVector;
}


void mastersclient::calcRobot() {
    try {
        getCurrentTimestamp();
        const double *measuredJointPosPtr = robotState().getMeasuredJointPosition();
        // Copy the measured joint positions into jointTest
        oldJointPos = jointTest;
        std::copy(measuredJointPosPtr, measuredJointPosPtr + LBRState::NUMBER_OF_JOINTS, jointTest.begin());

        // Calculate the derivative of the joint positions
        rl::math::Vector jointVel = calcJointVel(_stepWidth);  // Assuming dt is 5ms
        //std::cout << "Joint Velocities: " << jointVel << std::endl;
        robotmdl->setQ(jointTest, jointVel);
        robotmdl->performForwardKinematics();
        robotmdl->getTransformation();
        robotmdl->getTCPvelocity();

    } catch (const std::runtime_error &e) {
        printf("Not connected yet;\n");
    } catch (const std::exception &e) {
        std::cerr << "Exception caught: " << e.what() << std::endl;
    }
}

RobotModel.cpp

robotModel::robotModel(const std::string &xmlFilePath) {
    rl::mdl::XmlFactory factory;
    try {
        std::shared_ptr<rl::mdl::Model> tempModel(factory.create(xmlFilePath));
        model = std::move(tempModel);  // Move ownership to the member variable
        std::cout << "Model created successfully!" << std::endl;
    } catch (const std::exception &e) {
        std::cerr << "Exception caught: " << e.what() << std::endl;
        // Handle the exception as needed
    }

    kinematics = dynamic_cast<rl::mdl::Kinematic *>(model.get());
    //dynamic = dynamic_cast<rl::mdl::Dynamic *>(model.get());

    if (kinematics != nullptr) {//|| dynamic != nullptr) {
        // Dynamic cast was successful
        // You can use kinematics here
    } else {
        // Dynamic cast failed
        // Handle the failure or print an error message
        std::cerr << "Dynamic cast to rl::mdl::Kinematic or Dynamic failed." << std::endl;
    }

    lbr.q.resize(model->getDof());
    //std::printf("DoF: %i",sizeof(lbr_q));
    std::cout << "\n\n\n Using File: " << xmlFilePath << std::endl;

    lbr.q = model->getPosition();
    // lbr.qd = q.der
    lbr.qd = model->getVelocity();
    lbr.qdd = model->getAcceleration();
    lbr.tau = model->getTorque();
    //  this->printQ();

}


void robotModel::setQ(std::vector<double> &q, rl::math::Vector &qd) {
    // Feed the current FRI values into the kinematics computation
    assert(q.size() == lbr.q.size());
    assert(qd.size() == lbr.qd.size());
    lbr.q = rl::math::Vector::Map(q.data(), q.size());  // copy the std::vector into the Eigen vector
    lbr.qd = qd;
}

void robotModel::performForwardKinematics() {
    kinematics->setPosition(this->lbr.q);
    kinematics->setVelocity(this->lbr.qd);

    kinematics->forwardPosition();
    kinematics->forwardVelocity();
}

void robotModel::getTransformation() {
    //auto xd = kinematics->getJacobian() * lbr.qd;
    rl::math::MotionVector xd = kinematics->getOperationalVelocity(0);
    kinematics->forwardPosition();
    tcp.vecV = kinematics->getOperationalPosition(0).linear() * xd.linear();
    tcp.vecOmega = kinematics->getOperationalPosition(0).linear() * xd.angular();
}

void robotModel::getTCPvelocity() {
    // Convert the angular velocity from rad/s to deg/s
    // (note: RAD2DEG is also applied to the linear velocity here, which only rescales
    // its m/s values; the printed numbers below include this scaling)
    tcp.vecV = rl::math::RAD2DEG * tcp.vecV;
    tcp.vecOmega = rl::math::RAD2DEG * tcp.vecOmega;
    // Print all values of vecV with labels
    std::cout << "Linear Velocity (vecV): [Vx, Vy, Vz] = [" << tcp.vecV.transpose() << "]" << std::endl;
    // Print all values of vecOmega with labels
    std::cout << "Angular Velocity (vecOmega): [Omega_x, Omega_y, Omega_z] = [" << tcp.vecOmega.transpose() << "]"
              << std::endl;
}
Linear Velocity (vecV): [Vx, Vy, Vz] = [12.9601       0       0]
Angular Velocity (vecOmega): [Omega_x, Omega_y, Omega_z] = [      0 13.9859       0]
Linear Velocity (vecV): [Vx, Vy, Vz] = [12.9448       0       0]
Angular Velocity (vecOmega): [Omega_x, Omega_y, Omega_z] = [      0 13.9916       0]
Linear Velocity (vecV): [Vx, Vy, Vz] = [13.032      0      0]
Angular Velocity (vecOmega): [Omega_x, Omega_y, Omega_z] = [      0 14.0833       0]


rickertm commented Jan 6, 2024

If you need a more accurate value for the velocities and your trajectories are defined based on rl::math::Polynomial or rl::math::Spline, you can calculate the respective derivatives (velocities, accelerations, etc.) via the derivative() functions.
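
As a generic illustration of that idea (plain C++, not the rl::math::Polynomial API itself; the numbers are made up), differentiating a cubic joint-space segment analytically gives exact velocities instead of finite-difference estimates:

#include <iostream>

// Cubic segment q(t) = a0 + a1*t + a2*t^2 + a3*t^3 on [0, T] with
// boundary conditions q(0) = q0, q(T) = q1, qd(0) = qd(T) = 0
struct CubicSegment
{
    double a0, a1, a2, a3;

    CubicSegment(double q0, double q1, double T)
        : a0(q0), a1(0.0), a2(3.0 * (q1 - q0) / (T * T)), a3(-2.0 * (q1 - q0) / (T * T * T)) {}

    double position(double t) const { return a0 + a1 * t + a2 * t * t + a3 * t * t * t; }

    // Analytic derivative: qd(t) = a1 + 2*a2*t + 3*a3*t^2
    double velocity(double t) const { return a1 + 2.0 * a2 * t + 3.0 * a3 * t * t; }
};

int main()
{
    // Hypothetical joint motion from 0 rad to 1.5 rad in 2 s
    CubicSegment segment(0.0, 1.5, 2.0);

    for (double t = 0.0; t <= 2.0; t += 0.5)
    {
        std::cout << "t = " << t << " s\tq = " << segment.position(t)
                  << " rad\tqd = " << segment.velocity(t) << " rad/s" << std::endl;
    }

    return 0;
}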

@MightyMirko (Author)

Hi Markus,

It's OK, I upgraded to a five-point stencil and it works fine, IMHO.
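
For reference, here is a minimal sketch of such a central five-point stencil estimate (my own illustration, not the code from this project; the include path and fixed cycle time are assumptions). It uses the last five samples per joint, so the estimate refers to the middle sample, i.e. two cycles in the past:

#include <array>

#include <rl/math/Vector.h>

// Central five-point stencil for the first derivative:
// qd(t) ~= (-q(t + 2h) + 8 q(t + h) - 8 q(t - h) + q(t - 2h)) / (12 h)
// window[0] is the oldest sample, window[4] the newest; the result is the
// velocity at the middle sample window[2]
rl::math::Vector fivePointVelocity(const std::array<rl::math::Vector, 5> &window, double dt)
{
    return (-window[4] + 8.0 * window[3] - 8.0 * window[1] + window[0]) / (12.0 * dt);
}

With the 5 ms FRI cycle, dt would be 0.005 s, and the window could be refilled each cycle from robotState().getMeasuredJointPosition().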

I want to attach an end effector and a workpiece. After gripping, I'd like to calculate the workpiece velocity and position.

Is it necessary to create a second model of the gripper/workpiece and attach it to the robot/gripper?

Is there an example or demo that I should look into?
