Optimized Residual Action for Interaction Control With Learned Environments
Petrone V.; Ferrentino E.; Chiacchio P.
2025
Abstract
In industrial settings, robotic tasks often require interaction with various objects, necessitating compliant manipulation to prevent damage while accurately tracking reference forces. Interaction controllers are typically employed for this purpose, but they require either manual parameter tuning or precise environment modeling. Both aspects can be problematic: the former is time-consuming, and the latter is unavoidably affected by approximations, making it prone to failure in real applications. To address these challenges, current research focuses on designing high-performance force controllers. Along this line, this work introduces optimized residual action for interaction control with learned environments (ORACLE), a novel force control approach. Exploiting neural networks, ORACLE predicts robot–environment interaction forces, which are then used in an optimal residual action controller to locally correct the actions of a base force controller, minimizing the force-tracking error. Tested on a real Franka Emika Panda robot, ORACLE demonstrates superior force-tracking performance compared with state-of-the-art controllers, with a short setup time.
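
The sketch below illustrates, under stated assumptions, the residual-action idea described in the abstract: a learned neural network predicts the interaction force, and a residual added to the base controller's action is optimized to minimize the predicted force-tracking error. The class and function names (ForceModel, residual_correction), network sizes, and dimensions are hypothetical and not taken from the paper; this is a minimal illustration, not the authors' implementation.

```python
# Minimal sketch of a residual-action correction with a learned force model.
# All names and dimensions are illustrative assumptions, not the paper's API.
import torch
import torch.nn as nn


class ForceModel(nn.Module):
    """Learned model predicting the interaction force from state and action."""

    def __init__(self, state_dim: int, action_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),  # predicted contact force (1-D for simplicity)
        )

    def forward(self, state: torch.Tensor, action: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([state, action], dim=-1))


def residual_correction(model, state, base_action, f_ref, steps=50, lr=1e-2):
    """Optimize a residual added to the base action so that the predicted
    force matches the reference force (squared-error objective)."""
    residual = torch.zeros_like(base_action, requires_grad=True)
    opt = torch.optim.Adam([residual], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        f_pred = model(state, base_action + residual)
        loss = (f_pred - f_ref).pow(2).mean()
        loss.backward()
        opt.step()
    return residual.detach()


# Usage: locally correct a nominal action from a base force controller.
model = ForceModel(state_dim=6, action_dim=3)
state = torch.zeros(1, 6)
base_action = torch.zeros(1, 3)   # e.g., output of a base force controller
f_ref = torch.tensor([[5.0]])     # desired contact force [N]
corrected = base_action + residual_correction(model, state, base_action, f_ref)
```

In this sketch the residual is recomputed from the learned model at each control cycle, so the correction remains local to the base controller's action rather than replacing it.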


