Add instructions for eye-on-base. Rename some variables for generality. #6

Merged: 1 commit on Apr 14, 2024
43 changes: 35 additions & 8 deletions README.md
@@ -6,20 +6,43 @@
<em> Figure 1. Handeye calibration for 4DOF SCARA arms [1]. Left is eye-on-base and right is eye-on-hand. </em>
</p>

-This package provides eye-on-hand handeye calibration for 4DOF robot manipulators using dual quaternions. This is an implementation of [1] which extended dual quaternion handeye calibration techniques [2] to 4DOF robots. Handeye calibration through dual quaternions is favorable as it allows for us to solve for translation and rotation simultaneously, thus avoiding any compounding error resultant from solving these separately.
+This package provides handeye calibration for 4DOF robot manipulators using dual quaternions. This is an implementation of [1], which extended the dual quaternion handeye calibration technique of [2] to 4DOF robots. Handeye calibration through dual quaternions is favorable because it solves for translation and rotation simultaneously, avoiding the compounding error that results from solving the two separately.
## Background

Robot handeye calibration is an important and well studied problem that allows us to obtain the unknown static transformation between the end effector and camera (*in eye-on-hand case*) or the robot base and camera (*in eye-on-base case*). This is done by sampling several transformations between the base and end effector (*through forward kinematics*) and camera to marker (*through marker detection methods*) which allows us to formulate the [**AX = XB**](https://en.wikipedia.org/wiki/Hand_eye_calibration_problem) problem where **X** is our unknown desired static transform.

Several methods exist to solve for **X** but a vast majority of these methods assume that the robot is well articulated (i.e. has 6DOF). For 4DOF robots such as SCARAs (x, y, z, yaw), conventional calibration methods are infeasible and are incapable of producing valid results. This method circumvents this problem by reducing [2] to 4DOF robots. See [1] for further details. For an introduction to dual quaternions, see [3].
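To illustrate how the **AX = XB** pairs are assembled, here is a hypothetical numpy-only sketch (the helper name and the simple consecutive-pair strategy are ours, not this package's API): each pair of absolute samples yields one relative camera motion **A** and one relative robot motion **B** that share the unknown **X**.

```python
# Hypothetical sketch: build relative-motion pairs (A, B) for AX = XB
# from absolute samples. Names are illustrative, not the package API.
import numpy as np

def make_motion_pairs(camera_to_marker, base_to_hand):
    """Pair consecutive absolute samples into relative motions (A, B)."""
    pairs = []
    for (Ai, Bi), (Aj, Bj) in zip(
        zip(camera_to_marker, base_to_hand),
        zip(camera_to_marker[1:], base_to_hand[1:]),
    ):
        A = Aj @ np.linalg.inv(Ai)   # relative motion seen by the camera
        B = np.linalg.inv(Bj) @ Bi   # relative motion of the robot hand
        pairs.append((A, B))
    return pairs
```

The package's own pose selector appears to choose pairs that maximize a motion angle per sample rather than simply pairing consecutive samples, which tends to give better-conditioned motions.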

-## How to run
-To run, first install necessary packages.
+## Dependencies
+First, install the necessary packages.
```bash
python3 -m pip install -r requirements.txt
```
-The expected format for poses is a pickle file containing a tuple consisting of a list of base to hand transforms and a list of camera to marker transforms. All transforms should be 4x4 numpy arrays.

## How to Run
There are two types of calibration you can perform:
1. Eye-in-hand, where the transform of interest is from hand to camera, or
2. Eye-on-base, where the transform of interest is from camera to base.

The necessary sampled transforms for calibration thus depend on the type of calibration chosen. Below, we can see the simple transform chaining used for both types of calibration.

### Eye-in-hand calibration
```
camera_T_marker * hand_T_camera * base_T_hand = base_T_marker
(known) (unknown, want) (known) (unknown, don't care)
```
For this type of calibration we require a set of (camera to marker) and (base to hand) transforms, which will then be used to obtain the desired (hand to camera) transform.
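As a quick sanity check on a recovered hand to camera transform, note that composing base to hand, hand to camera, and camera to marker (written here in the conventional column-vector composition order) should yield a base to marker transform that is nearly constant across all samples. A hypothetical helper, not part of the package:

```python
# Hypothetical eye-in-hand consistency check: with the correct
# hand->camera transform, base->marker should be constant across samples.
import numpy as np

def eye_in_hand_residual(base_to_hand, camera_to_marker, hand_to_camera):
    """Return the largest deviation of base->marker from its first sample."""
    base_to_marker = [
        b2h @ hand_to_camera @ c2m
        for b2h, c2m in zip(base_to_hand, camera_to_marker)
    ]
    ref = base_to_marker[0]
    return max(np.abs(T - ref).max() for T in base_to_marker[1:])
```

Note that for 4DOF motions a pure z-offset in the hand to camera transform leaves this residual unchanged, which is exactly why tz is unobservable for these robots.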

### Eye-on-base calibration
```
base_T_hand * camera_T_base * marker_T_camera = marker_T_hand
(known) (unknown, want) (known) (unknown, don't care)
```
For this type of calibration we require a set of (base to hand) and (marker to camera) transforms, which will then be used to obtain the desired (camera to base) transform.
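If your marker detector reports camera to marker transforms, note that eye-on-base expects the opposite direction for the second list, so each transform must be inverted first. A hypothetical preparation helper (names are illustrative):

```python
# Hypothetical helper: prepare eye-on-base inputs when the detector
# reports camera->marker transforms (eye-on-base wants marker->camera).
import numpy as np

def eye_on_base_inputs(base_to_hand, camera_to_marker):
    marker_to_camera = [np.linalg.inv(T) for T in camera_to_marker]
    return base_to_hand, marker_to_camera
```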

### Setting up the calibration

The expected format for poses is a pickle file containing a tuple of two lists of transforms, ordered as specified above for the chosen calibration type. All transforms should be 4x4 numpy arrays.
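A minimal sketch of producing such a file (the identity placeholders and filename are illustrative only; the eye-in-hand ordering from above is assumed):

```python
# Hypothetical sketch: write the expected pose file, a pickled tuple of
# two equal-length lists of 4x4 numpy arrays (eye-in-hand ordering shown).
import pickle
import numpy as np

camera_to_marker = [np.eye(4) for _ in range(10)]  # replace with detector output
base_to_hand = [np.eye(4) for _ in range(10)]      # replace with forward kinematics

with open("pose_samples.pkl", "wb") as f:
    pickle.dump((camera_to_marker, base_to_hand), f)

# Reading it back yields the tuple of the two lists.
with open("pose_samples.pkl", "rb") as f:
    first, second = pickle.load(f)
```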
Afterwards, calibration can be performed by running the command shown below.

```bash
python3 calibrate.py [-h] [-p POSE_PATH] [-m MOTION_PATH] [-c] [-a] [-n] [-s SAM
```

For those who wish to create their own custom scripts, an example template is provided in `src/example.py` *(small snippet shown below)*. Running these examples will also perform the calibration on the provided example data.
```python3
""" ========================= Eye-in-Hand Calibration Example =============================== """
def main():
    with open("../example_data/pose_samples.pkl", "rb") as f:
        try:
            # ... snippet truncated; see src/example.py for the full script ...

    # Hand to Camera TF obtained from post nonlinear refinement.
    nl_hand_to_camera = cb.nonlinear_refinement(camera_to_marker, base_to_hand, ca_hand_to_camera)
```
Some example data for eye-on-base is provided in issue #2.

**Note:** The nonlinear refinement step has only been validated for eye-in-hand calibration. Users performing eye-on-base are recommended to skip this step for now.

## TODO
If you would like to contribute by implementing any of the tasks below or you spot an error in the existing code base, feel free to submit a pull request!

-- Add eye-on-base functionality **(major update)**.
+- ~~Add eye-on-base functionality~~
- Improve README documentation.
- Add ROS capabilities.
- Add graphical visualizations / GUI.
- ~~Add command line interface.~~
- ~~Add post nonlinear refinement.~~
- ~~Add singular value checking.~~
- Add pose selection based on the dual quaternion scalar score *(tried but not working)*.
-- Add details for z-translation obtainment.
- Add code comments throughout.
+- Add details for z-translation obtainment. (Some details are in the discussion of issue #2.)
+- Test and validate the nonlinear refinement for eye-on-base.

## References
[1] Ulrich, Markus. “Hand-Eye Calibration of SCARA Robots.” (2014).
4 changes: 2 additions & 2 deletions src/calibrate.py
@@ -52,7 +52,7 @@ def main():
# Initialize calibrator with motions.
cb = Calibrator4DOF(motions)

-# Our camera and end effector z-axes are antiparallel so we must apply a 180deg x-axis rotation.
+# Our camera and end effector z-axes are antiparallel, so we must apply a 180deg x-axis rotation.
dq_x = cb.calibrate(antiparallel_screw_axes=args.antiparallel_axes)

# Calibration Obtained Hand to Camera
@@ -61,7 +61,7 @@
# Calibration Obtained with post nonlinear refinement
if args.nonlinear:
assert args.pose_path is not None, "Can't perform nonlinear refinement without poses. Please specify -p."
-hand_to_camera = cb.nonlinear_refinement(base_to_hand, camera_to_marker, hand_to_camera)
+hand_to_camera = cb.nonlinear_refinement(camera_to_marker, base_to_hand, hand_to_camera)

rotation = np.rad2deg(R.from_matrix(hand_to_camera[:3, :3]).as_euler('xyz'))

2 changes: 1 addition & 1 deletion src/example.py
@@ -28,7 +28,7 @@ def main():
ca_hand_to_camera = np.linalg.inv(dq_x.as_transform())

# Hand to Camera TF obtained from post nonlinear refinement.
-nl_hand_to_camera = cb.nonlinear_refinement(base_to_hand, camera_to_marker, ca_hand_to_camera)
+nl_hand_to_camera = cb.nonlinear_refinement(camera_to_marker, base_to_hand, ca_hand_to_camera)

ca_rotation = np.rad2deg(R.from_matrix(ca_hand_to_camera[:3, :3]).as_euler('xyz'))
nl_rotation = np.rad2deg(R.from_matrix(nl_hand_to_camera[:3, :3]).as_euler('xyz'))
2 changes: 1 addition & 1 deletion src/example_using_precomputed_motions.py
@@ -34,7 +34,7 @@ def main():
ca_hand_to_camera = np.linalg.inv(dq_x.as_transform())

# Hand to Camera TF obtained from post nonlinear refinement.
-nl_hand_to_camera = cb.nonlinear_refinement(base_to_hand, camera_to_marker, ca_hand_to_camera)
+nl_hand_to_camera = cb.nonlinear_refinement(camera_to_marker, base_to_hand, ca_hand_to_camera)

ca_rotation = np.rad2deg(R.from_matrix(ca_hand_to_camera[:3, :3]).as_euler('xyz'))
nl_rotation = np.rad2deg(R.from_matrix(nl_hand_to_camera[:3, :3]).as_euler('xyz'))
2 changes: 1 addition & 1 deletion src/handeye_4dof/calibrator.py
@@ -106,7 +106,7 @@ def calibrate(self, antiparallel_screw_axes=False):
return dq_x

@staticmethod
-def nonlinear_refinement(base_to_hand, camera_to_marker, calib_hand_to_camera):
+def nonlinear_refinement(camera_to_marker, base_to_hand, calib_hand_to_camera):
# We only vary the first 11 parameters of the transform matrix since we cannot solve for tz.
W = np.eye(4)
W[-1, -1] = 9
10 changes: 5 additions & 5 deletions src/handeye_4dof/pose_selector.py
@@ -3,13 +3,13 @@
from .dual_quaternions import DualQuaternion


-def robot_pose_selector(camera_to_marker, base_to_hand):
-    assert len(camera_to_marker) == len(base_to_hand), "Nonmatching number of transforms."
-    motions = [None for _ in range(len(camera_to_marker))]
-    for i, (Ai, Bi) in enumerate(zip(camera_to_marker, base_to_hand)):
+def robot_pose_selector(tf1, tf2):
+    assert len(tf1) == len(tf2), "Nonmatching number of transforms."
+    motions = [None for _ in range(len(tf1))]
+    for i, (Ai, Bi) in enumerate(zip(tf1, tf2)):
        print("Obtaining motion {}...".format(i+1))
        curr_theta_max = 0
-        for j, (Aj, Bj) in enumerate(zip(camera_to_marker, base_to_hand)):
+        for j, (Aj, Bj) in enumerate(zip(tf1, tf2)):
            if i == j: continue
            A = DualQuaternion.from_transform(np.dot(Aj, np.linalg.inv(Ai)))
            B = DualQuaternion.from_transform(np.dot(np.linalg.inv(Bj), Bi))