Remark: The organizers have tried to describe the tasks and assessments as clearly and fairly as possible, but all teams should be aware that the rules may need to be modified before or even during the contest! Such ad hoc changes will always be decided by the jury members.
The robots shall detect weeds (5 dandelions) and litter (5 beer cans as example waste objects) and map or geo-reference them. The coordinate system is local, in horizontal field dimensions. Good row navigation is required. There will be ten (10) objects in total, distributed across the virtual and real field.
The robot has to generate a file (pred_map.csv) with the detected objects and their coordinates relative to the given reference points (pillars with QR codes). The origin of the coordinate system (0,0) is the center of the field. Each line in the submitted file shall represent one object together with its x and y coordinates in the horizontal plane, in meters with three decimal places. Extra points can be obtained for also indicating the object classification, weed or litter, in the file. An example file layout:

1.412,2.301,weed
-2.352,3.321,litter
1.873,-1.322,weed
…
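A minimal sketch of writing such a file with Python's standard `csv` module; the detection list here uses the example values from the layout above, and the function name is our own choice, not part of the rules:

```python
import csv

# Hypothetical detections: (x, y, label) in the field-centered frame, meters.
detections = [
    (1.412, 2.301, "weed"),
    (-2.352, 3.321, "litter"),
    (1.873, -1.322, "weed"),
]

def write_pred_map(path, objects):
    """Write one object per line as x,y,label with three decimal places."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for x, y, label in objects:
            writer.writerow([f"{x:.3f}", f"{y:.3f}", label])

write_pred_map("pred_map.csv", detections)
```

Formatting with `f"{x:.3f}"` guarantees the three decimal places the rules require, even for round values such as 2.0.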
After the run, this file should be given to the jury immediately on a USB stick provided by the organization. In the simulation, this file should be saved in the ‘map‘ folder of the ‘virtual_maize_field‘ package in the robot container. In the field run, the removal of waste to the headlands also gains extra points, see below.
Virtual and Field Environment
The objects are realistic weeds and cans (e.g., beer cans of different brands and colors). There will be 5 dandelions and 5 cans as litter within the field. The 5 dandelions will vary in their number of blossoms (0 to 5). At least one can will be damaged.
The objects will be placed randomly across the field. No objects are located on the headlands. The origin of the relative coordinate system is the center of the field and is not marked. The pillars carry a QR code with the name of that pillar. The relative coordinates of the pillars will be provided to the teams beforehand in a ‘markers.csv‘ file: in the field contest on a USB stick, and in the simulation in the ‘map‘ folder of the ‘virtual_maize_field‘ package.
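The pillar coordinates can then serve as the robot's reference frame. A small sketch of loading them, assuming a simple name,x,y column layout for markers.csv (the real column order and pillar names may differ):

```python
import csv
import io

# Assumed markers.csv content (name,x,y in meters, field-centered frame).
# These pillar names and coordinates are illustrative only.
sample = io.StringIO(
    "A,-5.000,-3.500\n"
    "B,5.000,-3.500\n"
    "C,5.000,3.500\n"
    "D,-5.000,3.500\n"
)

def load_markers(stream):
    """Return a {pillar_name: (x, y)} lookup table."""
    markers = {}
    for row in csv.reader(stream):
        name, x, y = row[0], float(row[1]), float(row[2])
        markers[name] = (x, y)
    return markers

pillars = load_markers(sample)
print(pillars["A"])  # -> (-5.0, -3.5)
```

In practice the stream would be `open('markers.csv')` from the USB stick or the ‘map‘ folder rather than the in-memory sample used here.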
In the field, the robot should be able to make two loud, distinct sounds when it detects a weed or a can. The sounds must differ in order to indicate which kind of object was found. In the simulation, when the robot detects an object, it should publish the object type (‘weed’ or ‘litter’) to the ‘/fre_detections’ topic. This will spawn a marker in the simulation above the robot's position.
Rules for robots
The maximum available time for the mapping run is 3 minutes, but each time the robot successfully moves an object to the headland it gains 1 minute of extra time. This promotes waste removal and ensures the robot is not penalized for spending time on this useful action.
The jury assesses the detection and classification during the run:
| Detection result | Points |
|---|---|
| Detected object, correct category (true positive) | 5 |
| Detected object, wrong category (false positive) | -5 |
And assesses the classification and accuracy of mapped objects:
| x: Euclidean distance to object of the same kind* | Points |
|---|---|
| x ≤ 2 cm | 15 |
| 2 cm < x ≤ 37.5 cm | 15.56 − 0.2817 · x |
| x > 37.5 cm (false positive) | -5 |

*(distance error, in cm, to the nearest object of the same kind)
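The scoring table above can be expressed as a small function; note that it is continuous at the 2 cm boundary (15.56 − 0.2817 · 2 ≈ 15) and drops to about 5 points at 37.5 cm before the false-positive penalty applies. The function name is our own:

```python
def mapping_score(distance_cm):
    """Points for one mapped object, given the Euclidean distance (in cm)
    to the nearest true object of the same kind, per the table above."""
    if distance_cm <= 2.0:
        return 15.0
    if distance_cm <= 37.5:
        return 15.56 - 0.2817 * distance_cm
    return -5.0  # beyond 37.5 cm the mapping counts as a false positive

print(mapping_score(1.0))   # 15.0
print(mapping_score(10.0))  # about 12.74
print(mapping_score(50.0))  # -5.0
```

This makes the trade-off explicit: every centimeter of mapping error inside the 2 cm to 37.5 cm band costs roughly 0.28 points.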
And in the field run registers and assesses the removal of waste objects:
| Waste handling | Points |
|---|---|
| Object picked up | 3 points/object |
| Object delivered to the headlands | 6 points/object, plus a time bonus of 1 minute |
The robot is allowed to push an object to the headland; without a clear act of picking it up, however, it will earn only the delivery points.
Crop plant damage by the robot will result in a penalty of 4 points per plant.
Teams completing the task will be ranked by the number of points described above. The best 3 teams will be rewarded.