The IMPACT project focuses on developing robust navigation systems (Perception, Planning, and Control) for autonomous robots in challenging off-road environments. Our approach integrates multimodal sensing with adaptive control strategies to minimize human intervention while maintaining safety and efficiency across diverse terrain types. This research initiative encompasses multiple projects addressing different aspects of autonomous off-road navigation.
Direct applications of this project can be seen in our work on the INSPECTOR Project.
CART focuses on developing real-time terrain adaptation strategies for legged robots operating in dynamic environments. The project combines proprioceptive sensing with terrain classification to adjust gait parameters and contact forces. Our approach uses deep reinforcement learning to enable robots to maintain stability on surfaces ranging from loose gravel to muddy terrain while carrying variable payloads. Current work involves integrating millimeter-wave radar with traditional vision systems for improved terrain characterization.
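To make the terrain adaptation idea concrete, here is a minimal sketch of mapping a terrain class and a payload estimate to gait parameters. The class names, parameter values, and the payload scaling rule are illustrative assumptions, not the CART implementation.

```python
# Hypothetical sketch: terrain- and payload-aware gait parameter selection.
# All terrain classes, nominal values, and scaling constants are assumptions.
from dataclasses import dataclass


@dataclass
class GaitParams:
    step_height_m: float    # foot clearance over the terrain
    step_freq_hz: float     # stepping frequency
    contact_force_n: float  # nominal foot contact force

# Nominal gait parameters per terrain class (assumed values for illustration).
TERRAIN_GAITS = {
    "gravel": GaitParams(0.10, 2.0, 180.0),
    "mud":    GaitParams(0.14, 1.5, 220.0),
    "flat":   GaitParams(0.06, 2.5, 160.0),
}


def adapt_gait(terrain_class: str, payload_kg: float) -> GaitParams:
    """Slow the gait and raise contact force as payload grows (assumed rule)."""
    base = TERRAIN_GAITS.get(terrain_class, TERRAIN_GAITS["flat"])
    scale = 1.0 + payload_kg / 30.0  # assumed 30 kg reference robot mass
    return GaitParams(
        step_height_m=base.step_height_m,
        step_freq_hz=max(1.0, base.step_freq_hz / scale),
        contact_force_n=base.contact_force_n * scale,
    )


print(adapt_gait("mud", payload_kg=6.8))  # roughly a 15 lb payload
```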
This project develops risk-aware navigation algorithms for wheeled and tracked robots operating in construction sites and natural disaster areas. Our multi-objective optimization framework balances terrain traversability, energy efficiency, and operational safety. The system incorporates probabilistic terrain modeling and dynamic risk assessment to handle partially observable environments. Current research focuses on human-robot collaborative path planning using uncertainty-aware motion primitives.
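As a rough illustration of the multi-objective framing, the sketch below scalarizes traversability, energy, and risk into a single path cost. The weights, per-cell features, and the log-survival risk term are assumptions for illustration, not the project's planner.

```python
# Hypothetical sketch: scalarized multi-objective cost over a candidate path.
import math


def path_cost(cells, w_trav=1.0, w_energy=0.5, w_risk=2.0):
    """cells: list of dicts with per-cell traversability in [0, 1],
    an energy estimate in joules, and a failure (collision/rollover) probability."""
    trav = sum(1.0 - c["traversability"] for c in cells)
    energy = sum(c["energy_j"] for c in cells)
    # Chance-constraint-style term: penalize the log of the survival probability.
    risk = -sum(math.log(max(1e-6, 1.0 - c["risk_p"])) for c in cells)
    return w_trav * trav + w_energy * energy + w_risk * risk


candidate = [
    {"traversability": 0.9, "energy_j": 12.0, "risk_p": 0.01},
    {"traversability": 0.6, "energy_j": 18.0, "risk_p": 0.05},
]
print(f"path cost: {path_cost(candidate):.2f}")
```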
Nature has evolved humans to walk on different terrains by developing a detailed understanding of their physical characteristics. Similarly, legged robots need to develop the capability to walk on complex terrains with a variety of task-dependent payloads to achieve their goals. However, conventional terrain adaptation methods are susceptible to failure under varying payloads. In this work, we introduce PANOS, a weakly supervised approach that integrates proprioception and exteroception from onboard sensing to achieve a stable gait for a legged robot walking over various terrains. Our work also provides evidence of its adaptability across varying payloads. We evaluate our method on multiple terrains and payloads using a legged robot. PANOS improves stability by up to 44% without a payload and 53% with a 15 lb payload. We also observe a 20% reduction in vibration cost with the payload across various terrain types when compared to state-of-the-art methods.
Coming Soon!
PANOS takes as input a stream of images and proprioception data P_t (joint, hip, and foot-slip readings) recorded in an unsupervised fashion. The framework encodes these readings with two backbones, DINOv2 [25] for vision and a vanilla encoder for proprioception, producing sets of random sequences S_t in which visual tokens F_t^visual and proprioceptive features F_t^proprio are stacked together. As an intermediate step, a pointer network assigns a weighted confidence across these sets and selects the dominating ones for training. Finally, the learned contextual relationship c_t is fed to a neural network that predicts the optimal velocity.
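The following is a minimal PyTorch sketch of the pipeline described above. The token dimensions, the simple convolutional stand-in used here in place of a frozen DINOv2 backbone, and the attention-style "pointer" confidence scoring are illustrative assumptions, not the released PANOS implementation.

```python
# Hypothetical sketch of the PANOS-style pipeline: visual tokens + proprioceptive
# features are stacked, weighted by a pointer-style confidence, pooled into a
# contextual vector c_t, and mapped to a velocity prediction.
import torch
import torch.nn as nn


class PanosSketch(nn.Module):
    def __init__(self, proprio_dim=12, d=256):
        super().__init__()
        # Stand-in for DINOv2: patchify a 224x224 image into 16 visual tokens.
        self.visual_enc = nn.Sequential(
            nn.Conv2d(3, d, kernel_size=56, stride=56), nn.Flatten(2))
        # Vanilla MLP encoder for proprioception (joint, hip, foot-slip signals).
        self.proprio_enc = nn.Sequential(
            nn.Linear(proprio_dim, d), nn.ReLU(), nn.Linear(d, d))
        # Pointer-style scorer: one confidence score per token in the stacked set.
        self.pointer = nn.Linear(d, 1)
        # Velocity head over the pooled contextual relationship c_t.
        self.velocity_head = nn.Sequential(
            nn.Linear(d, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, image, proprio):
        f_vis = self.visual_enc(image).transpose(1, 2)  # (B, T_vis, d) visual tokens
        f_pro = self.proprio_enc(proprio).unsqueeze(1)  # (B, 1, d) proprio feature
        s_t = torch.cat([f_vis, f_pro], dim=1)          # stacked sequence S_t
        conf = torch.softmax(self.pointer(s_t), dim=1)  # per-token confidence
        c_t = (conf * s_t).sum(dim=1)                   # confidence-weighted pooling
        return self.velocity_head(c_t)                  # predicted velocity


model = PanosSketch()
v = model(torch.randn(1, 3, 224, 224), torch.randn(1, 12))
print(v.shape)  # torch.Size([1, 1])
```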
@inproceedings{singh2024panos,
  title={{PANOS}: Payload-Aware Navigation in Offroad Scenarios},
  author={Singh, Kartikeya and Turkar, Yash and Aluckal, Christo and Dantu, Karthik},
  year={2024}
}