sguptasarma edited this page Jul 21, 2024 · 11 revisions

Welcome to the ProACT wiki! This page will show you how to use ProACT to set up a study on intelligent control for prosthetic arms.

Tested on the following setup:

  • Ubuntu 20.04
  • ROS Noetic
  • Hololens 2 with Windows Holographic for Business OS build 20348.1518
  • Unity 2020.3.26.f1
  • Mixed Reality Toolkit 2.7.3.0
  • OptiTrack motion capture PrimeX series with Motive Tracker 3.0.1
  • OyMotion GForce Pro+ EMG band
  • LibEMG latest release as on January 2024

Setup before inviting participants

Network

Connect the Linux computer running ROS, the Windows computer running Unity, the HoloLens 2, and the computer running Motive to the same WiFi network, and note each machine's IP address.

Installation

  1. Install ROS and create a catkin workspace.
  2. Clone this repository into catkin_ws/src.
  3. Install dependencies using rosdep.
  4. If rosdep does not handle them, install the Gazebo-ROS controllers separately: `sudo apt install ros-noetic-ros-control ros-noetic-ros-controllers ros-noetic-gazebo-ros-control ros-noetic-moveit`.
  5. Copy the file_server package from the UWP fork of ROS#: https://github.com/EricVoll/ros-sharp into the workspace.
  6. Clone packages from https://github.com/JenniferBuehler/general-message-pkgs and https://github.com/JenniferBuehler/gazebo-pkgs into the workspace.
  7. Install LibEMG: `pip3 install libemg`
  8. Clone libemg_ros into the catkin workspace.
  9. In shoulder_localization/launch/mocap_comm.launch, replace the server IP with the IP of the computer running Motive.
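Steps 3 and 4 above can be sketched as shell commands. This is a sketch, not the repository's official install script; the workspace path `~/catkin_ws` is an assumption, so adjust it to your setup.

```shell
# Assumes a catkin workspace at ~/catkin_ws with this repository already in src/.
cd ~/catkin_ws

# Resolve and install declared package dependencies (step 3).
rosdep update
rosdep install --from-paths src --ignore-src -r -y

# Fallback if rosdep misses the Gazebo-ROS controllers (step 4).
sudo apt install ros-noetic-ros-control ros-noetic-ros-controllers \
  ros-noetic-gazebo-ros-control ros-noetic-moveit

# Build and source the workspace.
catkin_make
source devel/setup.bash
```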

Motion capture

Attach motion capture markers and create Motive rigid bodies for:

  1. A shoulder brace
  2. The HoloLens 2
  3. A reference object

In shoulder_localization/launch/mocap_comm.launch, these are named Shoulderpad/ShoulderpadL (we had two sizes), HololensSG, and Clipboard. If you use different names in Motive, replace them in the launch file and everywhere else they appear.

Remember to export these as Motive assets to avoid repeating calibration.

⚠️

  • HoloLens markers need to be on the top of the visor (so that they are fixed relative to the IMU and not occluding any cameras), and non-collinear as usual.
  • Putting too many markers on the surface will cause them to occlude each other.
  • Note that turning the visor up after attaching the markers may damage them.

In Motive, open network settings from the bottom right and enable VRPN streaming.
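On the ROS side, VRPN streams from Motive are typically consumed by the standard vrpn_client_ros node; the repository's mocap_comm.launch presumably wraps something like the following minimal sketch (the server IP is a placeholder, and 3883 is the default VRPN port):

```xml
<launch>
  <!-- Placeholder: replace with the IP of the computer running Motive. -->
  <arg name="server" default="192.168.1.100"/>
  <node pkg="vrpn_client_ros" type="vrpn_client_node" name="vrpn_client_node" output="screen">
    <rosparam subst_value="true">
      server: $(arg server)
      port: 3883
      update_frequency: 100.0
      frame_id: world
      use_server_time: false
      broadcast_tf: true
    </rosparam>
  </node>
</launch>
```

With `broadcast_tf: true`, each Motive rigid body (Shoulderpad, HololensSG, Clipboard) appears as a tf frame under the chosen `frame_id`.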

📝 The first time this setup is used, the coordinate frame attached to the shoulder (where the base of the arm appears) will likely need to be manually adjusted in Motive, to attain some intersection of believability and visibility.

Windows

This part of the setup may require some downgrades on your system, since the ROS# UWP fork was written for older versions of Unity, which work with older versions of Visual Studio.

  1. Install Unity 2020 and open the Unity project from this repository in the Unity editor.
  2. The ROS Connector game object has a ROS Connector script attached to it. Replace the ROS bridge server URL with the Linux computer IP address, keeping the port 9090.
  3. Build the app for Universal Windows Platform.
  4. Install Visual Studio 2019. Open the .sln file generated by the Unity build in VS2019, and set up for UWP deployment: Release configuration, ARM64 platform, Remote Machine. In Project>Properties>Debugging, enter the HoloLens IP in the machine name field.
  5. Deploy to HoloLens and give the app permissions.

At first, the arm, box, and blocks will appear somewhere above your head. To align the motion capture and Unity worlds, we need to measure the transform between the head frame measured by Unity and the reference frame attached to the HoloLens in Motive, as described below.

Calibration for localization

  1. Find a way to repeatedly place the HoloLens in a fixed pose. We built a rig from some old PC parts and packaging.

  2. Run shoulder_localization/src/calibration.sh in a terminal while the HoloLens is on the rig and follow the prompts:
  • Once the app has started and is connected to ROS, wear the HoloLens and manually align the reference mocap frame with the virtual coordinate frame that appears 30 cm in front of the Unity world frame.
  • Replace the HoloLens on the rig and record the transform between the Unity head frame and the HoloLens mocap frame.
  • Paste these values into the initialization section (self.pos_m2c and temp) of shoulder_localization/src/holoworld_with_mocap.py. Note that tf outputs quaternions with the scalar part last, (q1, q2, q3, q0), while pyquaternion expects the scalar part first, (q0, q1, q2, q3). The script arranges them so that you can paste values directly from the terminal.
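The quaternion reordering is easy to get wrong, so here is a minimal sketch of what the script does internally. The helper name is ours for illustration, not from the repository:

```python
def tf_to_pyquaternion(q_tf):
    """Reorder a tf quaternion [x, y, z, w] (scalar part last) into
    pyquaternion's expected order (w, x, y, z) (scalar part first)."""
    x, y, z, w = q_tf
    return (w, x, y, z)

# A 90-degree rotation about z, as tf prints it: [x, y, z, w]
print(tf_to_pyquaternion([0.0, 0.0, 0.7071, 0.7071]))  # (0.7071, 0.0, 0.0, 0.7071)
```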

With participants

EMG

  1. Have participants put the EMG band snugly on their forearm and switch it on.
  2. Navigate to libemg_ros, run `python3 train.py`, and click "Train Model" to train gestures. Explain that the training gestures need to be performed as distinctly from one another as possible.
  3. Close the training window to return to the main page, then click "Classify" to check the quality of training; retrain if necessary.

HoloLens

  1. Explain how to open and close the main menu.
  2. Guide participants to Settings > System > Calibration > Run eye calibration and let them follow the instructions.

Shoulder tracking

Secure the brace with mocap markers on the participant's shoulder.

Running experiments

Run box_and_blocks_experiment/src/master.sh.