Preliminary description! Changes may be made during the coming weeks; to be discussed in the Teams meeting.
In the morning, from 10:00 to 12:00:
A task will be conducted virtually in a ROS Gazebo simulation. (explanation)
Teams may use the standard robot model or their own machine model.
In the afternoon, from 14:00 to 16:00:
The same task will be conducted on a real maize field.
Teams may use the standard robot or their own machine.
Robot models in simulation – Rules
A model of a standard robot (Jackal by CLEARPATH ROBOTICS) will be provided for those teams who want one. Teams can also bring their own machine models. The models must be realistic in function and physics (kinematics, sensing, and other abilities). The basic parameters must be considered and respected.
The basic parameters and rules that apply to custom robots are:
Velocity and acceleration limits.
The limits to velocity and acceleration are:
| Limit | Value |
|---|---|
| Linear forward velocity | 2.0 m/s |
| Linear forward acceleration | 20.0 m/s² |
| Angular velocity | 4.0 rad/s |
| Angular acceleration | 25.0 rad/s² |
These limits are already applied to the Jackal robot in the example workspace.
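For custom robot models, the limits above have to be enforced in your own control code. The following is a minimal, hypothetical sketch (not part of the official rules; all names are illustrative) of how a commanded velocity could be clamped so that both the velocity and the acceleration limits are respected:

```python
# Hypothetical sketch: clamp a commanded velocity so a custom robot
# respects the contest's velocity and acceleration limits.
# The constants mirror the table above; function names are illustrative.

MAX_LIN_VEL = 2.0   # m/s,    linear forward velocity limit
MAX_LIN_ACC = 20.0  # m/s^2,  linear forward acceleration limit
MAX_ANG_VEL = 4.0   # rad/s,  angular velocity limit
MAX_ANG_ACC = 25.0  # rad/s^2, angular acceleration limit


def clamp(value, limit):
    """Limit |value| to the symmetric bound [-limit, +limit]."""
    return max(-limit, min(limit, value))


def limit_command(target_vel, current_vel, dt, vel_limit, acc_limit):
    """Return a velocity command that respects both the velocity limit
    and the acceleration limit, given the previously commanded velocity
    and the control time step dt (seconds)."""
    desired = clamp(target_vel, vel_limit)
    max_step = acc_limit * dt          # largest allowed velocity change in dt
    step = clamp(desired - current_vel, max_step)
    return current_vel + step
```

For example, with a 0.05 s control step, a request to jump from standstill to 5 m/s is first capped at 2.0 m/s and then rate-limited to a 1.0 m/s increase in that step.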
Limits to sensors and Gazebo plugins used to simulate your robot.
The sensor plugins should be used with the default noise parameters. The allowed sensors and the corresponding allowed Gazebo plugins are:
| Sensor | Allowed Gazebo plugin | Noise parameters |
|---|---|---|
| Synchronized RGB cameras (stereo) | libgazebo_ros_multicamera.so | Link |
| Inertial Measurement Unit (IMU) | libhector_gazebo_ros_imu.so | Link |
| Realsense RGB-D camera | librealsense_gazebo_plugin.so* | Link |

\* This plugin also works on a PC without a GPU and is the only version allowed, because it interacts with the visual mesh of the maize plants.
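As an illustration (not an official example), an allowed sensor plugin is attached to the robot model in its URDF. The fragment below shows the IMU plugin from the table above on a hypothetical `base_link`; the parameter names follow the hector_gazebo_plugins conventions, and the noise parameters are left at their defaults as the rules require — verify them against the linked parameter documentation:

```xml
<!-- Illustrative URDF fragment (link and topic names are hypothetical):
     attaching the allowed IMU plugin with default noise parameters. -->
<gazebo>
  <plugin name="imu_sim" filename="libhector_gazebo_ros_imu.so">
    <updateRate>50.0</updateRate>
    <bodyName>base_link</bodyName>
    <frameId>base_link</frameId>
    <topicName>imu/data</topicName>
  </plugin>
</gazebo>
```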
If your robot model needs additional sensors or plugins that are not mentioned above, please contact the organization (email@example.com) before 1 June 2022. The organization will decide whether the sensor is allowed in the simulation contest and, if so, will install the corresponding plugin in the simulation container and publish the new simulation container on Docker Hub. The table above will be updated if new sensors and plugins are allowed in the simulation. You can also subscribe to the competition environment on GitHub to get an update when something is added.
Limits to the ROS control plugins used to simulate your robot.
To let your robot work in Gazebo, its controllers must be installed in the simulation container. By default, the following ROS controllers are available in the simulation container.
| Controller package | Available controllers |
|---|---|
| velocity_controllers | joint_position_controller, joint_velocity_controller, joint_group_velocity_controller |
| effort_controllers | joint_position_controller, joint_group_position_controller, joint_velocity_controller, joint_effort_controller, joint_group_effort_controller |
| joint_trajectory_controllers | position_controller, velocity_controller, effort_controller, position_velocity_controller, position_velocity_acceleration_controller |
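For reference, a controller from the table above is typically loaded via a ros_control YAML configuration. The fragment below is an illustrative sketch only (robot, controller, and joint names are hypothetical, not part of the official files):

```yaml
# Illustrative ros_control configuration (hypothetical names): loading one
# of the default-available controllers for a single wheel joint.
example_robot:
  left_wheel_controller:
    type: velocity_controllers/JointVelocityController
    joint: left_wheel_joint
```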
If your robot model needs an additional ROS controller that is not mentioned above, please contact the organization (firstname.lastname@example.org) before 1 June 2022. If your team writes a custom ROS controller for your robot, it has to be published in a public repository on GitHub or GitLab. The organization will install the needed ROS controller in the simulation container and will keep the table above up to date. You can also subscribe to the competition environment to get an update when something is added.
Make your robot description and ROS parameters public.
Your custom robot description and ROS control parameters (in the case of the example robot, both folders ‘example_robot_description’ and ‘example_robot_control’) should be made public before the simulation contest by uploading them to a public repository on GitHub or GitLab. The link to this repository should be emailed to the organization (email@example.com) before 1 June 2022. The organization will add this link to your robot's page on the website. In this way, the organization and other teams are able to check your robot.
Your robot should be realistic.
For example, if you attach a camera to your robot model, it must be attached to the robot via a frame and not be floating freely.
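In URDF terms, this means the sensor gets its own link connected to the robot body by a fixed joint. The fragment below is an illustrative sketch (link, joint, and offset values are hypothetical):

```xml
<!-- Illustrative URDF fragment (hypothetical names and pose): a camera
     rigidly attached to the robot with a fixed joint, not free-floating. -->
<link name="camera_link"/>
<joint name="camera_mount" type="fixed">
  <parent link="base_link"/>
  <child link="camera_link"/>
  <origin xyz="0.2 0.0 0.5" rpy="0 0 0"/>
</joint>
```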
Using the standard robot in the field
For those teams who are not attending the event in person, we offer the opportunity to have their code tested in the real field during the afternoon runs.
Teams should inform us about which sensors etc. they want to use. The organizers will decide whether the requested components can be used. Once the organizers have agreed to the use of a component, they will ensure that it can be executed within the composed environments (simulation).
All source code (models for sensors etc.) should be sent to the organizing team and made public before the event, because we want to promote the use of open-source sensors.