Getting Started
If you haven't already, you should set up the code base using the instructions in the project README. This walks through the installation of the codebase and its dependencies.
This codebase is fully open source and comes with no guarantees. Making the robot move is an action you take as a user / programmer and thus you accept full responsibility. We recommend supervising the robot at all times.
Questions or issues about the software should be reported through GitHub issues with an appropriate issue tag. Other inquiries can be directed to our Support Center.
If you do not yet have a physical robot, or want to test your code in a risk-free way, try running the Revel stack in simulation. You can find details on how to do that here.
With the robot plugged in via USB, bring up ROS with
roslaunch svenzva_drivers svenzva_bringup.launch
which brings up the main driver and a number of supporting nodes.
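Once the driver is running, you can confirm that the arm is reporting its state before commanding any motion. The sketch below is a minimal check, assuming the driver publishes joint feedback on the standard /joint_states topic (sensor_msgs/JointState), as in a typical ROS bringup; you can verify the topic name with rostopic list.

```python
#!/usr/bin/env python
# Minimal sketch: confirm the driver is publishing joint feedback.
# Assumes the standard /joint_states topic (sensor_msgs/JointState);
# verify with `rostopic list` if nothing is printed.
import rospy
from sensor_msgs.msg import JointState

def on_joint_state(msg):
    # Log the current joint positions as a name -> radians map.
    rospy.loginfo("Joints: %s", dict(zip(msg.name, msg.position)))

if __name__ == '__main__':
    rospy.init_node('revel_state_check')
    rospy.Subscriber('/joint_states', JointState, on_joint_state)
    rospy.spin()
```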
The default mode of the arm is position control mode, which is useful for MoveIt!, kinesthetic playback, and typical command-type movements. Other modes include velocity control mode, which is necessary for Cartesian velocity movements of the end effector, and gravity control mode, which is necessary for kinesthetic teaching. To set the mode, change the mode argument in the svenzva_bringup.launch file, or set it through the command line with
roslaunch svenzva_drivers svenzva_bringup.launch mode:=gravity
or
roslaunch svenzva_drivers svenzva_bringup.launch mode:=velocity
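In velocity mode, the driver accepts Cartesian velocity commands for the end effector. The sketch below is a minimal example, not a definitive interface: it assumes the Cartesian controller subscribes to a geometry_msgs/Twist topic named /revel/eef_velocity, so check the cartesian controller node in svenzva_drivers for the actual topic name on your installation before running it.

```python
#!/usr/bin/env python
# Minimal sketch: stream a small Cartesian velocity to the end effector.
# Assumes the arm was brought up with mode:=velocity and that the cartesian
# controller listens on /revel/eef_velocity (geometry_msgs/Twist); verify the
# topic name against your svenzva_drivers installation before running.
import rospy
from geometry_msgs.msg import Twist

if __name__ == '__main__':
    rospy.init_node('revel_velocity_example')
    pub = rospy.Publisher('/revel/eef_velocity', Twist, queue_size=1)
    rate = rospy.Rate(20)

    cmd = Twist()
    cmd.linear.x = 0.01  # small translation along x, in the driver's expected units

    # Stream the command for two seconds, then send a zero Twist to stop.
    start = rospy.Time.now()
    while not rospy.is_shutdown() and rospy.Time.now() - start < rospy.Duration(2.0):
        pub.publish(cmd)
        rate.sleep()
    pub.publish(Twist())
```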
At any point you should be able to visualize the robot through RViz with the Robot State plugin.
You can launch MoveIt! on top of svenzva_ros with
roslaunch svenzva_moveit demo.launch
Note that this will launch an RViz window by default, but it can be turned off through launch file arguments. By following the planning and execution steps in the MoveIt! Simulation tutorial, you will be able to make the arm physically move using a MoveIt! plan.
Be careful not to execute a plan around objects or people. MoveIt! checks for collisions involving the arm itself, but does not detect collisions with the environment without vision feedback.
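Plans can also be scripted through moveit_commander instead of the RViz interface. The following is a minimal sketch under stated assumptions: the planning group name "svenzva_arm" is a placeholder, so look up the real group name in the svenzva_moveit SRDF before running, and keep the area around the arm clear during execution.

```python
#!/usr/bin/env python
# Minimal moveit_commander sketch: plan and execute a small joint-space move.
# The planning group name "svenzva_arm" is an assumption; check the SRDF in
# svenzva_moveit for the actual group name before running.
import sys
import rospy
import moveit_commander

if __name__ == '__main__':
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node('revel_moveit_example')

    group = moveit_commander.MoveGroupCommander('svenzva_arm')

    # Start from the current state and nudge the first joint slightly.
    joints = group.get_current_joint_values()
    joints[0] += 0.1
    group.set_joint_value_target(joints)

    # Plan and execute; go() blocks until the motion finishes.
    group.go(wait=True)
    group.stop()
    moveit_commander.roscpp_shutdown()
```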