Meta-Graph Adaptation for Visual Object Tracking

Existing deep trackers typically rely on offline-learned backbone networks for feature extraction across diverse online tracking tasks. For unseen objects, however, these offline-learned representations remain limited because they lack task-specific adaptation. In this paper, we propose a Meta-Graph Adaptation Network (MGA-Net) that adapts the backbones of deep trackers to specific online tracking tasks in a meta-learning fashion. MGA-Net consists of a gradient embedding module (GEM) and a filter adaptation module (FAM). GEM takes gradients as an adaptation signal and applies graph message propagation to learn smoothed, low-dimensional gradient embeddings. FAM uses both the learned gradient embeddings and the target exemplar to adapt the filter weights to the specific tracking task. MGA-Net is trained end-to-end in an offline meta-learning manner and runs in a purely feed-forward way at test time, enabling highly efficient online tracking. We show that MGA-Net is generic and demonstrate its effectiveness in both template matching and correlation filter tracking frameworks.
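
The sketch below illustrates the two-module idea described in the abstract in PyTorch-style Python. It is only a minimal, hypothetical rendering: the module names, tensor shapes, the similarity-based adjacency, and the single round of message passing are assumptions for illustration, not the authors' exact architecture.

```python
# Minimal sketch of the GEM + FAM idea (assumed shapes and layers, not the paper's design).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradientEmbeddingModule(nn.Module):
    """GEM (sketch): compress per-filter gradients into low-dimensional
    embeddings and smooth them with one round of graph message propagation."""

    def __init__(self, grad_dim, embed_dim=64):
        super().__init__()
        self.encode = nn.Linear(grad_dim, embed_dim)    # gradient -> embedding
        self.message = nn.Linear(embed_dim, embed_dim)  # transform neighbor messages

    def forward(self, grads):
        # grads: (num_filters, grad_dim) -- flattened gradient of each filter.
        h = F.relu(self.encode(grads))
        # Soft affinity graph between filters from embedding similarity
        # (an assumption; any fixed or learned adjacency could be used instead).
        adj = F.softmax(h @ h.t() / h.size(-1) ** 0.5, dim=-1)
        # One message-passing step smooths the embeddings over the graph.
        return h + F.relu(self.message(adj @ h))


class FilterAdaptationModule(nn.Module):
    """FAM (sketch): predict a per-filter update from the gradient embedding
    and a pooled target-exemplar feature."""

    def __init__(self, embed_dim, exemplar_dim, filter_numel):
        super().__init__()
        self.predict = nn.Sequential(
            nn.Linear(embed_dim + exemplar_dim, 256),
            nn.ReLU(),
            nn.Linear(256, filter_numel),
        )

    def forward(self, filters, grad_embed, exemplar_feat):
        # filters: (num_filters, filter_numel); exemplar_feat: (exemplar_dim,)
        ctx = exemplar_feat.expand(grad_embed.size(0), -1)
        delta = self.predict(torch.cat([grad_embed, ctx], dim=-1))
        # Feed-forward adaptation: adjust the offline-learned filters directly,
        # with no gradient-descent iterations at test time.
        return filters + delta


if __name__ == "__main__":
    num_filters, filter_numel, exemplar_dim = 256, 3 * 3 * 64, 128
    gem = GradientEmbeddingModule(grad_dim=filter_numel)
    fam = FilterAdaptationModule(embed_dim=64, exemplar_dim=exemplar_dim,
                                 filter_numel=filter_numel)
    filters = torch.randn(num_filters, filter_numel)
    grads = torch.randn(num_filters, filter_numel)  # adaptation signal
    exemplar = torch.randn(exemplar_dim)            # pooled target feature
    adapted = fam(filters, gem(grads), exemplar)
    print(adapted.shape)  # torch.Size([256, 576])
```

In this reading, offline meta-training would optimize GEM and FAM so that a single feed-forward pass of the adapted filters improves tracking of the new target, which is what makes the online stage efficient.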

Selected Publications

Demo