Poster

GigaTraj: Predicting Long-term Trajectories of Hundreds of Pedestrians in Gigapixel Complex Scenes

Haozhe Lin · Chunyu Wei · Li He · Yuchen Guo · Yuchy Zhao · Shanglong Li · Lu Fang

Arch 4A-E Poster #456
[ Paper PDF ]
Thu 20 Jun 5 p.m. PDT — 6:30 p.m. PDT

Abstract: Pedestrian trajectory prediction is a well-established area in human behavior research and various industries, with significant recent advancements. While attention has shifted toward predicting long-term trajectories, existing datasets cannot support the study of minute-level trajectory prediction, mainly because they lack high-resolution trajectory observations over a wide field of view (FoV). To bridge this gap, we introduce a novel dataset named GigaTraj, specifically tailored for long-term trajectory prediction. GigaTraj provides videos with an expansive FoV spanning a vast area of $4 \times 10^4~m^2$ and high-resolution imagery at the gigapixel level. Furthermore, it includes comprehensive annotations such as bounding boxes, identity associations, world coordinates, face orientations, complex group/interaction relationships, and scene semantics. Leveraging these multimodal annotations, we evaluate and validate state-of-the-art approaches for minute-level long-term trajectory prediction in large-scale scenes. Extensive experiments and analyses reveal that long-term prediction of pedestrian trajectories presents numerous challenges and unresolved issues, indicating a vital new direction for trajectory research. The dataset and our code will be made publicly available upon paper acceptance.
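The abstract lists the annotation modalities GigaTraj provides (bounding boxes, identities, world coordinates, face orientations, group relations, scene semantics). The sketch below is a minimal, hypothetical Python illustration of how such per-frame annotations might be organized; the class and field names are assumptions for illustration only and do not reflect the dataset's actual schema or release format.

```python
# Hypothetical sketch of one annotated frame, based only on the annotation
# types named in the abstract. All names and structures are assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class PedestrianAnnotation:
    track_id: int                             # identity association across frames
    bbox: Tuple[float, float, float, float]   # (x, y, w, h) in gigapixel image pixels
    world_xy: Tuple[float, float]             # ground-plane position in metres
    face_yaw_deg: float                       # face orientation
    group_ids: List[int] = field(default_factory=list)  # co-group/interaction partners


@dataclass
class FrameAnnotation:
    timestamp_s: float                        # time of the frame in seconds
    pedestrians: List[PedestrianAnnotation]   # all visible, tracked pedestrians
    scene_semantic_map: str                   # reference to a scene-semantics layer


# A minute-level prediction task would consume a sequence of FrameAnnotation
# objects as the observed history and ask a model to forecast future world_xy
# positions for each track_id one or more minutes ahead.
```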
