Perceptive Locomotion through Nonlinear Model Predictive Control
This work is currently under review; a preprint is available:
Title:
Perceptive Locomotion through Nonlinear Model Predictive Control
Authors:
Ruben Grandia, Fabian Jenelten, Shaohui Yang, Farbod Farshidian, and Marco Hutter
Abstract:
Dynamic locomotion in rough terrain requires accurate foot placement, collision avoidance, and planning of the underactuated dynamics of the system. Reliably optimizing for such motions and interactions in the presence of imperfect and often incomplete perceptive information is challenging. We present a complete perception, planning, and control pipeline that can optimize motions for all degrees of freedom of the robot in real-time. To mitigate the numerical challenges posed by the terrain, a sequence of convex inequality constraints is extracted as local approximations of foothold feasibility and embedded into an online model predictive controller.
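To illustrate the idea of a convex foothold constraint, the following is a minimal sketch (not the authors' implementation, and all names such as FootholdConstraint, margins, and contains are hypothetical): a feasible foothold region approximated by a convex polygon is expressed as linear inequalities A p <= b, the form in which such a constraint could be handed to a model predictive controller.

```python
# Minimal sketch: a convex foothold region as linear inequalities A @ p <= b.
# Hypothetical names; assumes polygon vertices are given in counter-clockwise order.
import numpy as np

class FootholdConstraint:
    """Convex inequality constraint A @ p <= b on a 2D foothold position p."""

    def __init__(self, vertices: np.ndarray):
        # vertices: (N, 2) array of polygon corners, counter-clockwise.
        self.A, self.b = self._halfspaces(vertices)

    @staticmethod
    def _halfspaces(v: np.ndarray):
        # One row of A p <= b per polygon edge, using the outward edge normal.
        rows, rhs = [], []
        n_pts = len(v)
        for i in range(n_pts):
            p0, p1 = v[i], v[(i + 1) % n_pts]
            edge = p1 - p0
            normal = np.array([edge[1], -edge[0]])  # outward normal for CCW polygons
            rows.append(normal)
            rhs.append(normal @ p0)
        return np.vstack(rows), np.array(rhs)

    def margins(self, p: np.ndarray) -> np.ndarray:
        # Signed constraint values b - A p; all entries >= 0 means p is feasible.
        return self.b - self.A @ p

    def contains(self, p: np.ndarray) -> bool:
        return bool(np.all(self.margins(p) >= 0.0))


if __name__ == "__main__":
    # Unit-square foothold region as a toy example.
    square = FootholdConstraint(np.array([[0.0, 0.0], [1.0, 0.0],
                                          [1.0, 1.0], [0.0, 1.0]]))
    print(square.contains(np.array([0.5, 0.5])))  # True: inside the region
    print(square.contains(np.array([1.5, 0.5])))  # False: outside the region
```

In an MPC setting, the margins b - A p would typically enter the solver as inequality constraints on the planned foot positions rather than being checked pointwise as above.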