# Robot Behavior

## Description
We currently use the IRL-1 robot from IntRoLab for our demonstrations. See the official IRL-1 GitHub repository for more details.
## Possible Movements
- Point to a position (x, y, z) with:
  - Right arm
  - Left arm
  - Head
- Open and close:
  - Right gripper
  - Left gripper
- Perform complex movements with the arms and head (SIMULATION ONLY):
  - Happy (confidence >= threshold, success = 1)
  - Satisfied (confidence < threshold, success = 1)
  - Disappointed (confidence >= threshold, success = 0)
  - Sad (confidence < threshold, success = 0)
- Facial expressions:
  - Anger
  - Joy
  - Sad
  - Surprise
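The mapping from game outcome to complex movement above can be sketched as a small Python helper. This is an illustration only; the function name, the default threshold, and the exact logic inside devine_irl_control are assumptions, not the module's actual implementation:

```python
def select_emotion(confidence, success, threshold=0.5):
    """Map a guess outcome to one of the four complex movements.

    Hypothetical helper mirroring the list above; ``threshold`` is
    an assumed default, not a value taken from devine_irl_control.
    """
    if success:
        return "Happy" if confidence >= threshold else "Satisfied"
    return "Disappointed" if confidence >= threshold else "Sad"
```

For example, a confident but wrong guess (`confidence=0.9`, `success=0`) selects the "Disappointed" movement.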
## Running the Examples

Before running any examples, you need to:

- Launch jn0 with the RViz UI:

      $ roslaunch jn0_gazebo jn0_empty_world.launch # for simulation
      $ roslaunch jn0_bringup jn0_standalone.launch # for the real robot

- Launch the devine_irl_control nodes:

      $ roslaunch devine_irl_control devine_irl_control.launch sim:=true # for simulation

- Load the RViz configuration: File -> Open Config -> src/robot_control/launch/irl_point.rviz
You can now execute any of the examples:

- Point to a position [x, y, z]:

      $ rosrun devine_irl_control example_point.py --point 0.6,0.3,0.5 --look 1,-0.6,0.5 __ns:=devine
      # Positions are referenced from the base_link frame

- Perform a complex movement (SIMULATION ONLY):

      $ rosrun devine_irl_control example_emotion.py -c 0 -s 0 __ns:=devine
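The `--point` and `--look` arguments above are comma-separated x,y,z coordinates. A minimal sketch of how such an argument could be parsed (the helper name is hypothetical; `example_point.py` may parse its arguments differently):

```python
def parse_point(arg):
    """Parse a comma-separated 'x,y,z' argument into a float tuple.

    Hypothetical helper illustrating the argument format used by the
    example scripts; not the scripts' actual parsing code.
    """
    x, y, z = (float(v) for v in arg.split(","))
    return (x, y, z)
```

For instance, `parse_point("0.6,0.3,0.5")` yields the tuple `(0.6, 0.3, 0.5)`.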
## Dependencies

See package.xml for dependencies.
## Topics

Input and output topics of this module:

| In/Out | Topic | ROS Message |
|--------|-------|-------------|
| In | /devine/guess_location/world | geometry_msgs/PoseStamped * |
| In | /devine/robot/robot_look_at | |
| In | /devine/robot/head_joint_traj_point | trajectory_msgs/JointTrajectoryPoint |
| Out | /devine/robot/is_pointing | std_msgs/Bool |
| Out | /devine/robot/is_looking | |
| Out | /devine/robot/err_pointing | std_msgs/Float64MultiArray |

\* PoseStamped messages are relative to base_link (see frame_id).
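The starred PoseStamped inputs must carry base_link in their header's frame_id. A minimal sketch of the expected message fields, using a plain dict as a stand-in for geometry_msgs/PoseStamped so it runs without a ROS installation (field names follow the message definition; the helper itself is hypothetical):

```python
def make_pose_stamped(x, y, z, frame_id="base_link"):
    """Build a dict mirroring geometry_msgs/PoseStamped fields.

    Stand-in for the real ROS message type; an actual node would
    construct geometry_msgs.msg.PoseStamped and publish it, e.g.
    on /devine/guess_location/world.
    """
    return {
        "header": {"frame_id": frame_id},
        "pose": {
            "position": {"x": x, "y": y, "z": z},
            # Identity quaternion: no rotation requested.
            "orientation": {"x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0},
        },
    }
```

A real publisher would fill the same fields on the ROS message object and set `header.stamp` before publishing.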