Author: stachnis

2017-08: Olga Vysotska receives a Google Award for the ACM womENcourage 2017

Olga Vysotska receives a Google Award for the ACM womENcourage 2017 in Barcelona in September 2017. The award is given through ACM-W Europe (ACM-WE), the European committee of ACM-W. The ACM-WE vision is a transformed European professional and scholarly landscape where women are supported and inspired to pursue their dreams and ambitions and to find fulfillment in the computing field.

2017-06: Code Available: Online Place Recognition by Olga Vysotska

Online Place Recognition by Graph-Based Matching of Image Sequences by Olga Vysotska and Cyrill Stachniss

Online Place Recognition is available on GitHub

Given two sequences of images represented by their descriptors, the code constructs a data association graph and performs a search within this graph, so that for every query image it computes a matching hypothesis to an image in the database sequence, as well as matching hypotheses for the previous images. The matching procedure can be performed in two modes: a feature-based mode and a cost-matrix-based mode. For the theoretical details, please refer to our paper “Lazy Data Association for Image Sequence Matching Under Substantial Appearance Changes.”
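To make the matching idea more concrete, here is a minimal Python sketch of the cost-matrix-based mode: the descriptors of the two sequences define a pairwise cost matrix, and a sequence-consistent path through it yields one matching hypothesis per query image. This is only an illustration under simplifying assumptions (cosine-style costs on L2-normalized descriptors, a fixed fanout instead of the lazy graph expansion described in the paper); it is not the released C++ implementation, and all names and parameters are made up for the example.

```python
import numpy as np

def cost_matrix(query_desc, db_desc):
    # Lower cost = more similar; 1 - dot product of L2-normalized descriptors.
    return 1.0 - query_desc @ db_desc.T

def match_sequences(query_desc, db_desc, fanout=3):
    """Return one database index per query image such that consecutive
    matches stay within +/- fanout of each other (sequence consistency)."""
    C = cost_matrix(query_desc, db_desc)
    n_q, n_db = C.shape
    acc = np.full_like(C, np.inf)          # accumulated path costs
    acc[0] = C[0]
    back = np.zeros((n_q, n_db), dtype=int)
    for i in range(1, n_q):
        for j in range(n_db):
            lo, hi = max(0, j - fanout), min(n_db, j + fanout + 1)
            k = lo + int(np.argmin(acc[i - 1, lo:hi]))
            acc[i, j] = C[i, j] + acc[i - 1, k]
            back[i, j] = k
    # Backtrack the cheapest sequence-consistent path.
    path = [int(np.argmin(acc[-1]))]
    for i in range(n_q - 1, 0, -1):
        path.append(int(back[i, path[-1]]))
    return path[::-1]

# Toy usage: random descriptors stand in for real image features.
rng = np.random.default_rng(0)
db = rng.standard_normal((50, 128))
db /= np.linalg.norm(db, axis=1, keepdims=True)
query = db[10:30] + 0.05 * rng.standard_normal((20, 128))
query /= np.linalg.norm(query, axis=1, keepdims=True)
print(match_sequences(query, db))          # roughly the indices 10..29
```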

This code is related to the following publication:
O. Vysotska and C. Stachniss, “Lazy Data Association for Image Sequence Matching Under Substantial Appearance Changes,” IEEE Robotics and Automation Letters (RA-L) and IEEE International Conference on Robotics & Automation (ICRA), vol. 1, iss. 1, pp. 1-8, 2016. doi:10.1109/LRA.2015.2512936.

2017-06: “UAV-Based Crop and Weed Classification for Smart Farming” Receives the Best Paper Award in Automation of the IEEE Robotics and Automation Society at ICRA 2017


The work “UAV-Based Crop and Weed Classification for Smart Farming” by Philipp Lottes, Raghav Khanna, Johannes Pfeifer, Roland Siegwart and Cyrill Stachniss received the Best Paper Award in Automation of the IEEE Robotics and Automation Society at ICRA 2017.

Abstract: Unmanned aerial vehicles (UAVs) and other robots in smart farming applications offer the potential to monitor farm land on a per-plant basis, which in turn can reduce the amount of herbicides and pesticides that must be applied. Key information for the farmer as well as for autonomous agricultural robots is knowledge about the type and distribution of the weeds in the field. In this regard, UAVs offer excellent survey capabilities at low cost. In this paper, we address the problem of detecting value crops such as sugar beets as well as typical weeds using a camera installed on a light-weight UAV. We propose a system that performs vegetation detection, plant-tailored feature extraction, and classification to obtain an estimate of the distribution of crops and weeds in the field. We implemented and evaluated our system using UAVs on two farms, one in Germany and one in Switzerland, and demonstrate that our approach allows for analyzing the field and classifying individual plants.
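As a rough illustration of such a detect, extract, and classify pipeline (not the system from the paper), the following Python sketch segments vegetation with the Excess Green index, computes a few toy features per image tile, and trains a random forest on synthetic data; all thresholds, features, and data are assumptions made purely for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def vegetation_mask(rgb, threshold=0.1):
    """Mark vegetation pixels via the Excess Green index ExG = 2g - r - b
    (rgb is an H x W x 3 float array in [0, 1])."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (2.0 * g - r - b) > threshold

def tile_features(rgb, mask):
    """Toy features for one image tile: mean color of the vegetation
    pixels plus the size of the vegetation mask."""
    pixels = rgb[mask]
    return np.concatenate([pixels.mean(axis=0), [mask.sum()]])

# Synthetic stand-ins for labeled UAV image tiles (0 = crop, 1 = weed).
rng = np.random.default_rng(0)
tiles = rng.random((20, 64, 64, 3))
labels = rng.integers(0, 2, size=20)

X = np.stack([tile_features(t, vegetation_mask(t)) for t in tiles])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(clf.predict(X[:3]))
```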

2017-05: Code Available: High-Speed Depth Clustering Using 3D Range Sensor Data by Igor Bogoslavskyi (Updated)

Depth Clustering by Igor Bogoslavskyi and Cyrill Stachniss

Depth Clustering is available on GitHub

We released our code that implements a fast and robust algorithm to segment point clouds recorded with a Velodyne sensor into objects. It works with all available Velodyne sensors, i.e., the 16-, 32-, and 64-beam ones. See the video that shows all objects whose bounding box has a volume of less than 10 cubic meters.
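As a small aside on the bounding-box filter used in the video, here is a Python sketch that keeps only clusters whose axis-aligned bounding box is smaller than a given volume. It is not part of the released C++ segmentation code; the cluster arrays and object sizes below are synthetic and purely illustrative.

```python
import numpy as np

def bbox_volume(points):
    """Volume of the axis-aligned bounding box of one cluster,
    given an (N, 3) array of x, y, z coordinates in meters."""
    extent = points.max(axis=0) - points.min(axis=0)
    return float(np.prod(extent))

def small_objects(clusters, max_volume=10.0):
    """Keep only the clusters whose bounding box is below max_volume m^3."""
    return [c for c in clusters if bbox_volume(c) < max_volume]

# Synthetic example: a pedestrian-sized cluster passes the filter,
# a building-sized cluster is discarded.
rng = np.random.default_rng(1)
pedestrian = rng.random((200, 3)) * np.array([0.6, 0.6, 1.8])
building = rng.random((200, 3)) * np.array([20.0, 10.0, 8.0])
print(len(small_objects([pedestrian, building])))   # -> 1
```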

This code is related to the following publications:
I. Bogoslavskyi and C. Stachniss, “Efficient Online Segmentation for Sparse 3D Laser Scans,” PFG — Journal of Photogrammetry, Remote Sensing and Geoinformation Science, pp. 1-12, 2017.
as well as
I. Bogoslavskyi and C. Stachniss, “Fast Range Image-Based Segmentation of Sparse 3D Laser Scans for Online Operation,” In Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2016.

2017-02: EXIST Startup on Laser-Based Weeding Has Started

The EXIST-funded startup focusing on an autonomous laser-based weeding system started in February 2017 at the Photogrammetry & Robotics Lab.

Abstract – For decades, herbicides have been the main means of weed control. The extensive use of chemical substances in agriculture has devastating effects on our environment and also affects human health. Additionally, several weed varieties have developed a natural resistance to the applied chemicals, so new and more potent herbicides have to be developed.
Because of these severe effects on the environment and on human health, there is a growing demand for chemical-free crops from our fields. Laser-based weeding is an excellent non-herbicide approach to weed control that will enable the production of certified organic products.
The proposed system uses multi-spectral sensors and state-of-the-art computer vision algorithms to detect and classify all plants in the field. After the weed plants have been identified, a laser beam is used to eliminate or severely damage them. In this way, value crops can grow without competition from weeds and achieve higher yields, because the available nutrients do not have to be shared.
This technology has the potential to establish a new generation of sustainable crop production farms. With the use of the laser-based weeding system at large scale, herbicide use can be substantially reduced. Our goal is that, in the foreseeable future, this weeding method will be used by every major crop producer, thus protecting the environment and the health of the population.

Contact:
Julio Pastrana (julio.pastrana@escarda.net) and Tim Wigbels (tim.wigbels@escarda.net)

2017-01: EUROPA & EUROPA2: Seven Years of EC-funded Research on Navigation in Urban Environments

The kickoff meeting of the EUROPA project on urban robot navigation was held in February 2009 in Freiburg, and on January 20, 2017, the final review of the successor project EUROPA2 ended with an excellent evaluation. A big thank you to all our partners and team members from the Universities of Freiburg and Oxford, KU Leuven, ETH Zürich, and RWTH Aachen, as well as GeoAutomation and BlueBotics.


2016-09: ROVINA Project Performed Excellently in the Final Review

The EC-funded project ROVINA has the goal of providing new means of accessing and digitizing cultural heritage sites. It is targeted at developing innovative digital preservation tools and, at the same time, novel technologies that strengthen the robustness of robots operating in previously unknown and unstructured 3D environments. On September 26, 2016, the final review of the 42-month project was passed successfully, so ROVINA has been evaluated as excellent in all four of its review meetings.

The collaborative project was conducted by a consortium coordinated by Cyrill Stachniss at the University of Bonn that includes experts from robotics, computer vision, and archeology: KU Leuven, La Sapienza University of Rome, RWTH Aachen, the University of Freiburg, Algorithmica, and the International Council on Monuments and Sites.

Impressions from the ROVINA Project

