Time Reversal Symmetry for Efficient Robotic Manipulations in Deep Reinforcement Learning

1Shanghai Jiao Tong University   2Duke Kunshan University
*Corresponding author(s)
NeurIPS 2025

Abstract

Symmetry is pervasive in robotics and has been widely exploited to improve sample efficiency in deep reinforcement learning (DRL). However, existing approaches primarily focus on spatial symmetries, such as reflection, rotation, and translation, while largely neglecting temporal symmetries. To address this gap, we explore time reversal symmetry, a form of temporal symmetry commonly found in robotics tasks such as door opening and closing. We propose Time Reversal symmetry enhanced Deep Reinforcement Learning (TR-DRL), a framework that combines trajectory reversal augmentation and time reversal guided reward shaping to efficiently solve temporally symmetric tasks. Our method generates reversed transitions from fully reversible transitions, identified by a proposed dynamics-consistent filter, to augment the training data. For partially reversible transitions, we apply reward shaping to guide learning, based on successful trajectories from the reversed task. Extensive experiments on the Robosuite and MetaWorld benchmarks demonstrate that TR-DRL is effective in both single-task and multi-task settings, achieving higher sample efficiency and stronger final performance than baseline methods.
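
A minimal sketch of the trajectory reversal augmentation step described above, assuming access to a learned forward dynamics model, a heuristic reverse-action map, and a reward function for the reversed task. The names forward_model, reverse_action, reversed_reward, and the tolerance tol are illustrative placeholders, not the exact components of TR-DRL; see the paper for the actual dynamics-consistent filter and the reward shaping used for partially reversible transitions.

import numpy as np

def augment_with_reversals(batch, forward_model, reverse_action, reversed_reward, tol=1e-2):
    """Build reversed transitions (s', a_rev, s) from forward transitions (s, a, s').

    A candidate reversal is kept only if rolling the (assumed) forward model
    from (s', a_rev) lands close to s, i.e. the transition is treated as
    fully reversible; partially reversible transitions are left to reward shaping.
    """
    reversed_batch = []
    for s, a, s_next in batch:
        a_rev = reverse_action(s, a, s_next)    # hypothetical reverse-action estimate
        s_pred = forward_model(s_next, a_rev)   # predicted result of applying a_rev at s'
        if np.linalg.norm(s_pred - s) < tol:    # simple stand-in for the dynamics-consistent filter
            r_rev = reversed_reward(s_next, a_rev, s)
            reversed_batch.append((s_next, a_rev, r_rev, s))
    return reversed_batch

In a typical off-policy setup, the reversed transitions returned here would simply be added to the training data (e.g. the replay buffer) of the corresponding reversed task alongside its own rollouts.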

TR-DRL: Time Reversal symmetry enhanced Deep Reinforcement Learning

TR-DRL Teaser

Example of Time Reversal Symmetric Task Pairs

Drawer Opening

Drawer Closing

Window Opening

Window Closing

Method Overview

Overview of TR-DRL

Results

Demo Video in Robosuite

Summary Video

BibTeX (arXiv)

@misc{jiang2025timereversalsymmetryefficient,
  title={Time Reversal Symmetry for Efficient Robotic Manipulations in Deep Reinforcement Learning},
  author={Yunpeng Jiang and Jianshu Hu and Paul Weng and Yutong Ban},
  year={2025},
  eprint={2505.13925},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2505.13925}
}