Perception-Aware Human-Assisted Navigation of Mobile Robots on Persistent Trajectories
We propose a novel shared control and active perception framework that combines the skills of a human operator in accomplishing complex tasks with the capability of a mobile robot to autonomously maximize the information acquired by its onboard sensors in order to improve its state estimation. The human operator modifies at runtime some suitable properties of a persistent cyclic path followed by the robot so as to achieve the given task (e.g., exploring an environment). At the same time, the path is concurrently adjusted by the robot with the aim of maximizing the collected information. This combined behavior enables the human operator to control the high-level task of the robot while the latter autonomously improves its state estimation. The user's commands are included in a task priority framework together with other relevant constraints, while the quality of the acquired information is measured by the Schatten norm of the Constructibility Gramian. The user is also provided with guidance feedback pointing in the direction that would maximize this information metric. We evaluated the proposed approach in two human subject studies, testing the effectiveness of including the Constructibility Gramian in the task priority framework as well as the viability of providing either visual or haptic feedback to convey this information metric.
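The abstract measures estimation quality via the Schatten norm of the Constructibility Gramian. As a minimal sketch (not the authors' implementation), the Schatten p-norm of a matrix is the l_p norm of its singular values; the Gramian below is a hypothetical placeholder used only for illustration:

```python
import numpy as np

def schatten_norm(G, p=2):
    """Schatten p-norm of G: the l_p norm of its singular values.

    For a symmetric positive-semidefinite Gramian the singular
    values coincide with the eigenvalues.
    """
    s = np.linalg.svd(G, compute_uv=False)
    return float(np.sum(s ** p) ** (1.0 / p))

# Hypothetical 2x2 constructibility Gramian (illustrative values only).
A = np.array([[1.0, 0.2],
              [0.2, 0.5]])
G = A @ A.T  # symmetric positive semidefinite by construction

print(schatten_norm(G, p=2))  # Frobenius norm as the p=2 special case
```

A larger Schatten norm indicates a better-conditioned Gramian, i.e., a trajectory along which the onboard measurements are more informative for state estimation.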