We introduce the UZH-FPV Drone Racing dataset, the most aggressive visual-inertial odometry dataset to date. Large accelerations, rotations, and apparent motion in vision sensors make aggressive trajectories difficult for state estimation. However, many compelling applications, such as autonomous drone racing, require high-speed state estimation, and existing datasets do not address this.

Sequences were recorded with a first-person-view (FPV) drone racing quadrotor fitted with sensors and flown aggressively by an expert pilot. The trajectories include fast laps around a racetrack with drone racing gates, as well as free-form trajectories around obstacles, both indoors and out. We present the camera images and IMU data from a Qualcomm Snapdragon Flight board, ground truth from a Leica Nova MS60 laser tracker, as well as event data from an mDAVIS 346 event camera and high-resolution RGB images from the pilot's FPV camera. With this dataset, our goal is to help advance the state of the art in high-speed state estimation. The results from our IROS 2020 competition are public.

When using this work in an academic context, please cite the following publication: Continuous-Time vs. Discrete-Time Vision-based SLAM: A Comparative Study, Robotics and Automation Letters (RA-L), 2022.

The dataset is released under the Creative Commons license (CC BY-NC-SA 3.0), which is free for non-commercial use (including research).

This work was supported by the National Centre of Competence in Research Robotics (NCCR) through the Swiss National Science Foundation, the SNSF-ERC Starting Grant, and the DARPA FLA Program.
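As an illustration of working with sequences like these, the sketch below loads timestamped IMU samples from a plain-text log with NumPy. The file name and column layout (timestamp, gyroscope x/y/z, accelerometer x/y/z) are assumptions for illustration only, not the dataset's documented format; consult the dataset documentation for the actual layout.

```python
import numpy as np

def load_imu(path):
    """Load timestamped IMU samples from a whitespace-separated text file.

    Assumed column layout (hypothetical; check the dataset docs):
    timestamp [s], gyro x/y/z [rad/s], accel x/y/z [m/s^2].
    """
    data = np.loadtxt(path, comments="#")
    return {
        "t": data[:, 0],       # timestamps, shape (N,)
        "gyro": data[:, 1:4],  # angular velocity, shape (N, 3)
        "accel": data[:, 4:7], # linear acceleration, shape (N, 3)
    }
```

From here, the gyroscope norm over time is a quick way to confirm how aggressive a sequence is compared to conventional visual-inertial benchmarks.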