Hi4D: 4D Instance Segmentation of Close Human Interaction

Yifei Yin · Chen Guo · Manuel Kaufmann · Juan Jose Zarate · Jie Song · Otmar Hilliges

West Building Exhibit Halls ABC 051
Thu 22 Jun 10:30 a.m. PDT — noon PDT


We propose Hi4D, a method and dataset for the automatic analysis of physically close human-human interaction under prolonged contact. Robustly disentangling several in-contact subjects is a challenging task due to occlusions and complex shapes. Hence, existing multi-view systems typically fuse 3D surfaces of close subjects into a single, connected mesh. To address this issue we leverage i) individually fitted neural implicit avatars and ii) an alternating optimization scheme that refines pose and surface through periods of close proximity, and thus iii) segment the fused raw scans into individual instances. From these instances we compile the Hi4D dataset of 4D textured scans of 20 subject pairs, 100 sequences, and a total of more than 11K frames. Hi4D contains rich interaction-centric annotations in 2D and 3D alongside accurately registered parametric body models. We define varied human pose and shape estimation tasks on this dataset and provide results from state-of-the-art methods on these benchmarks.
