- Title
Contrastive learning of graph encoder for accelerating pedestrian trajectory prediction training.
- Authors
Yao, Zonggui; Yu, Jun; Ding, Jiajun
- Abstract
In pedestrian trajectory prediction, hybrid architectures combining temporal and spatial feature extractors have enabled precise prediction models, but these models have grown ever larger. How a feature-encoding model learns depends not only on the network structure but also on the training regime, such as supervised versus unsupervised learning. Previous work has concentrated on more comprehensive encoders and more delicate feature-extractor designs. However, the mutual influence of neighbouring pedestrians, which varies with their distance to the centre pedestrian, has seldom been considered. Moreover, most feature extractors in existing prediction models are trained in a supervised rather than an unsupervised manner, so the extracted features remain handcrafted and lack a natural distinction between ambiguous situations. A graph contrastive accelerating encoder is proposed, which accelerates the training of the state-of-the-art spatio-temporal graph transformer network for pedestrian trajectory prediction. By employing an unsupervised contrastive learning process over a neighbour graph that represents the distance-dependent influence of the nearest and farthest pedestrians on the centre pedestrian, the graph contrastive accelerating encoder significantly shortens training time. While maintaining final performance at the state-of-the-art level, the proposed method reaches its lowest pedestrian trajectory prediction error at markedly earlier training steps.
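The abstract does not give implementation details. As an illustrative sketch only, the kind of unsupervised contrastive objective it alludes to is typically an InfoNCE-style loss that pulls an anchor embedding (e.g. of the centre pedestrian's trajectory) toward a positive view and pushes it away from negatives (e.g. other pedestrians' embeddings); the function name, embedding shapes, and temperature below are assumptions, not the paper's method:

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE contrastive loss for a single anchor.

    anchor:    (d,) embedding, e.g. of the centre pedestrian's trajectory
    positive:  (d,) embedding of a positive view of the same anchor
    negatives: (k, d) embeddings of negative samples, e.g. other pedestrians
    Returns the scalar loss; lower when the anchor matches the positive.
    """
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

    # Similarity of the anchor to the positive (index 0) and each negative.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()  # numerical stability before the softmax
    probs = np.exp(logits) / np.exp(logits).sum()
    # Cross-entropy with the positive pair as the target class.
    return -np.log(probs[0])
```

A well-aligned positive pair drives the loss toward zero, while a mismatched pair yields a large loss, which is what lets such an encoder learn discriminative features without labels.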
- Subjects
MACHINE learning; GRAPH theory; PEDESTRIAN traffic flow; PREDICTION models; FEATURE extraction
- Publication
IET Image Processing (Wiley-Blackwell), 2021, Vol 15, Issue 14, p3645
- ISSN
1751-9659
- Publication type
Article
- DOI
10.1049/ipr2.12185