The 9th Robotic Grasping and Manipulation Competition (RGMC)
IEEE-RAS International Conference on Robotics and Automation (ICRA)
Yokohama, Japan, May 13-17, 2024


Essential Skill Sub-Track 2: In-Hand Manipulation
Organizers: Kaiyu Hang (main contact), Podshara Chanrungmaneekul
Judges: Joshua T. Grace (Yale), Kapil Katyal (Amazon Robotics)
Timeline: May 13 (Dry Run), May 14 (Competition Day)
Competition GitHub Repository: RGMC_In-Hand_Manipulation_2024.

     This sub-track focuses on the in-hand manipulation skills of different robot hands and the manipulation planning and control algorithms that drive them. Competition information and scoring criteria are detailed below.

I. Workspace Setup
     Every team will bring their own robot hand to the competition. There is no restriction on the hardware design. The robot hands can be commercial models, open-source models, or models designed by the participating teams. The robot hand will be installed on a stationary mount designed by the team, e.g., a stationary frame that can stably hold the robot hand. Teams can change the hand mounting pose as needed for different tasks. An example hand setup is shown in Fig. 1. Note: regardless of the mount used to fix the robot hand, only hand actions are allowed for all manipulation tasks in this competition. Even if a team decides to mount their robot hand on a robot arm, the robot arm must remain stationary during task execution.
Fig. 1: An example hand setup with a camera on top of the object being manipulated.
II. Sensors
     Every team is required to use at least one camera that supports the use of apriltag_ros. As exemplified in Fig. 1, the object being manipulated will be tracked by this camera through an apriltag. As detailed below, the scoring of the competition relies heavily on the apriltags, and the manipulation goals are specified in the camera's view/frame based on the apriltag readings. Therefore, we do not require any specific hand-camera calibration.
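     For reference, below is a minimal sketch of reading the object's pose from apriltag_ros (assumptions: ROS 1, the package's default "tag_detections" topic, and a hypothetical tag ID of 0; adjust both to your own configuration):

    #!/usr/bin/env python
    # Minimal sketch: read the manipulated object's pose in the camera frame.
    # Assumes ROS 1 and apriltag_ros publishing on its default topic.
    import rospy
    from apriltag_ros.msg import AprilTagDetectionArray

    OBJECT_TAG_ID = 0  # hypothetical tag ID; match your own tag configuration

    def on_detections(msg):
        for det in msg.detections:
            if OBJECT_TAG_ID in det.id:
                # Tag pose expressed in the camera's frame, i.e., the frame
                # in which the manipulation goals are specified.
                p = det.pose.pose.pose.position
                rospy.loginfo("object at (%.3f, %.3f, %.3f)", p.x, p.y, p.z)

    if __name__ == "__main__":
        rospy.init_node("object_tracker")
        rospy.Subscriber("tag_detections", AprilTagDetectionArray, on_detections)
        rospy.spin()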
     There is no restriction on the sensors used by teams. Vision sensors, tactile sensors, force/torque sensors, etc., are all welcome to be included in the setups.

III. Hand Control
     During the competition, after a team has finished setting up all necessary equipment, the robot hand must run fully autonomously to complete the competition tasks. No teleoperation or human intervention is allowed. In cases where human intervention is needed for safety reasons, that specific run of the competition will be regarded as a failure.

IV. Competition Objects
     To make it easy for every team to use the same objects for system development and testing, this competition will use 3D-printed objects. As shown in Fig. 2, three cylinders of different sizes and two cubes with "A, B, C, D, E, F" written on different facets, together with their CAD models, are provided.
     The cylinders are 60 mm, 80 mm, and 100 mm in diameter, and all are 80 mm tall. The cubes are 50 × 50 × 50 mm and 90 × 90 × 90 mm. On the cylinder objects, there is a 30 × 30 × 1 mm reserved space for attaching the apriltags. On the cube objects, each facet has one or four such reserved spaces. Please read the details in the GitHub Repository about how to track these objects.
     The models can be downloaded from the RGMC_In-Hand_Manipulation_2024 GitHub Repository. In addition, there will be one "novel" object at the competition. This "novel" object will be of similar size to the provided objects and will be selected from the YCB Object and Model Set.
Fig. 2: All objects used in the competition. The "novel" object is not shown here.
V. Competition Tasks
Task A: Object Position Control for Waypoints Tracking
     The robot hand will grasp a cylindrical object with an apriltag attached to its top. The grasp should be initialized by a human operator, e.g., a human operator can place the object into the robot hand. Since different teams will have different hand designs, each team will grasp a cylinder of a single size of their choice. The task is to manipulate the object in-hand so that the apriltag's motion tracks the given waypoints. In this task, a "novel" object will be used in addition to the cylinders, so every team will need to run their system on both the chosen cylinder and the "novel" object.
     The waypoints will be given as a sequence of positions relative to the object's (apriltag's) initial position. Once the initial grasp is stabilized, the object's position (in the camera's frame) will be regarded as (0, 0, 0). Thereafter, a sequence of 10 waypoints (in the camera's frame, relative to the aforementioned (0, 0, 0)), each in the form (x, y, z), will be given. The robot hand will then be tasked to fully autonomously move the object (apriltag) through those waypoints one by one. The waypoints are all limited to the range [-2.5 cm, 2.5 cm] × [-2.5 cm, 2.5 cm] × [-2.5 cm, 2.5 cm] centered at the initial position of the grasped object. An example list of waypoints is provided in the GitHub Repository.
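     For illustration, the sketch below shows the coordinate bookkeeping implied by the rules above: positions are read in the camera's frame and re-expressed relative to the object's position at grasp stabilization, which the rules define as (0, 0, 0). The waypoint values and helper names are hypothetical; the official example list and evaluator are in the GitHub Repository.

    # Illustrative Task A bookkeeping (not the official auto-evaluator).
    import numpy as np

    # Hypothetical waypoints in meters, each within [-2.5 cm, 2.5 cm] per
    # axis; the real list contains 10 waypoints.
    waypoints = np.array([
        [0.010, 0.000, 0.000],
        [0.010, -0.015, 0.005],
    ])

    initial_pos = None  # tag position (camera frame) when the grasp stabilizes

    def relative_position(current_pos):
        # Re-express the tag position relative to the stabilized grasp,
        # so that the starting position becomes (0, 0, 0).
        return np.asarray(current_pos) - initial_pos

    def waypoint_error(current_pos, k):
        # Euclidean distance between the object and the k-th waypoint.
        return np.linalg.norm(relative_position(current_pos) - waypoints[k])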
     Note: There is no restriction on how the object should be initially grasped, or in what pose it should be initially grasped. Every team can design their strategies based on their own system.

Task B: Object Re-orientation
     Each team will grasp a cube of a single size of their choice. The grasp should be initialized by a human operator, e.g., a human operator can place the object into the robot hand. The task is to manipulate the object in-hand so that the cube rotates to match a sequence of required orientations, as marked by the "A, B, C, D, E, F" letters.
     Once the initial grasp is stabilized, the team will be given the sequence of letters to reach one by one. A target letter is considered reached if the apriltag on that target facet is sufficiently aligned with the "Z" direction of the camera's frame. Skipping letters is not allowed and may cause all subsequent letters to fail. The given target sequence is designed so that all letters can be reached in a continuous sequence, without the need to pass through other intermediate letters.
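     For illustration, the sketch below checks whether a facet's apriltag is aligned with the camera's Z direction, using the tag orientation reported in the camera's frame. The 15-degree tolerance and the sign handling are assumptions; the official threshold is implemented by the auto-evaluator.

    # Illustrative Task B alignment check (not the official auto-evaluator).
    import numpy as np

    ANGLE_TOLERANCE_DEG = 15.0  # hypothetical tolerance, not from the rules

    def tag_z_axis(q):
        # Third column of the rotation matrix for quaternion q = (x, y, z, w):
        # the tag frame's z-axis expressed in the camera frame.
        x, y, z, w = q
        return np.array([
            2.0 * (x * z + w * y),
            2.0 * (y * z - w * x),
            1.0 - 2.0 * (x * x + y * y),
        ])

    def facet_reached(q):
        # Cosine of the angle between the tag normal and the camera's Z axis.
        # Depending on the tag frame convention, the facet normal may point
        # toward the camera (dot product near -1) rather than along +Z, so
        # the absolute value covers both conventions.
        cos_angle = abs(tag_z_axis(q)[2])
        return cos_angle >= np.cos(np.radians(ANGLE_TOLERANCE_DEG))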

VI. Scoring and Ranking
     The organizers will provide a ROS package (auto-evaluator) that automatically evaluates the manipulation performance by tracking the apriltag. The ROS package is hosted on GitHub at: RGMC_In-Hand_Manipulation_2024.

     Task A: The auto-evaluator will record the motion trajectory of the object (apriltag) and calculate the accumulated error. Every team will have a 20-second time budget for each goal waypoint.
     There is only one set of 10 waypoints for Task A, i.e., the same waypoints are used for both the cylinder and the "novel" object. Every team can have 2 runs on each object. For each object, we record both runs but only use the better one to rank the teams.
     For incomplete runs, e.g., when the object is dropped, teams are ranked by (total_error / number_of_waypoints_reached). All incomplete runs will be ranked lower than complete runs regardless of the error.
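     The ranking rule above can be summarized as a sort key, sketched below for illustration (the helper name and the 10-waypoint default are assumptions based on the rules, not part of the official auto-evaluator):

    # Illustrative ranking key: smaller keys rank higher. Complete runs
    # always precede incomplete runs; incomplete runs are ordered by
    # mean error per reached waypoint.
    def ranking_key(total_error, waypoints_reached, num_waypoints=10):
        if waypoints_reached == num_waypoints:
            return (0, total_error)
        return (1, total_error / max(waypoints_reached, 1))

    # Usage: sorted(runs, key=lambda r: ranking_key(r.error, r.reached))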

     Task B: The auto-evaluator will simply count how many target facets have been reached. If multiple teams reach the same number of facets, they will be ranked by the accumulated execution time. Every team will have a 30-second time budget for each goal facet.
     There is only one set of 10 target facets for Task B. Every team can have 2 runs on their selected cube. We record both runs but only use the better one to rank the teams.

     The scoring system is fully ranking-based. For Task A on the cylinder objects, team #1 will collect 5 points, and the teams ranked after will collect 4, 3, 2, and 1 points, respectively. The same rule applies to Task A on the "novel" object. For Task B, team #1 will collect 10 points, and the teams ranked after will collect 9 down to 1 points. For example, a team ranked 2nd in Task A on the cylinders, 1st in Task A on the "novel" object, and 3rd in Task B would collect 4 + 5 + 8 = 17 points.
     Finally, the overall ranking of the teams will be based on the total points each team has collected from all tasks. Teams can check out this example scoring sheet for reference.

VII. Frequently Asked Questions (Q&A)

Q: Is it allowed to use arm motions or wrist motions to move the object?
A: No, the hand should be installed on a stationary mount. Only hand motions, e.g., finger motions and palm motions, are allowed to manipulate the object. If the object is manipulated by any motion external to the hand, the team's performance will not be evaluated.

Q: Can we design an end-effector that holds the object with a fixed grasp and then have the object manipulated by other joints, e.g., revolute or prismatic, external to the grasp?
A: Such designs will not be treated as in-hand manipulation, since essentially an "arm or wrist" is used to move the object. If you would like to make sure your hand design will qualify for the competition, you are encouraged to contact the organizers as soon as possible to confirm.

Q: How is a grasp initialized for the competition?
A: A human operator should place the object into the robot hand and secure the grasp. There is no need for any autonomous grasping. Once the grasp is secured, the human operator should not touch the setup anymore.

Q: Is there any orientation requirement for the tags in Task A?
A: No, Task A is evaluated only on positional accuracy. The tag can be arbitrarily oriented in the grasp, but it is your team's responsibility to ensure that the tag can be detected by the camera.

Q: Is it allowed to use a palm support, instead of a real grasp, to hold the object during manipulation?
A: Yes, palm support is allowed. Essentially, as long as the object is fully held and manipulated by the robot hand, it is considered a valid in-hand manipulation solution.

Q: Where should the camera be installed?
A: Every team can install the camera anywhere within their robot hand's workspace, as long as it can see the object being manipulated.