
[Defense] Towards Robust Person Re-Identification: Learning Invariant Features Under Clothing Changes and Occlusions

Friday, March 28, 2025

10:00 am - 12:00 pm

In Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy
Vuong (Dustin) Nguyen
will defend his dissertation
Towards Robust Person Re-Identification: Learning Invariant Features Under Clothing Changes and Occlusions


Abstract

Person Re-Identification (Re-ID) involves matching the same person across non-overlapping camera views. This is a crucial task in computer vision, with applications spanning search-and-rescue, safety, and security. However, achieving robust performance in real-world, unconstrained environments remains a significant challenge due to factors such as clothing changes and occlusions. This research addresses these in-the-wild conditions by proposing novel methods for learning robust, invariant feature representations that generalize across these challenging scenarios.

Clothing changes make appearance-based features unreliable. To address this, we extract auxiliary cloth-invariant modalities, such as shape and gait, in addition to appearance. Our work introduces novel learning strategies based on contrastive learning to enhance the robustness of both appearance and cloth-invariant features, achieving state-of-the-art (SOTA) performance on Cloth-Changing Re-ID (CCRe-ID) datasets.

In real-world scenarios, occlusions often compound with clothing changes to further challenge Re-ID. No existing method explicitly addresses the combination of these two issues; we are the first to tackle this more practical task, which we term Occluded Cloth-Changing Re-ID (OCCRe-ID). Through occlusion synthesis, we expose the model to a variety of real-world occlusion variations while capturing cloth-invariant modalities. We first propose a novel cross-modality collaborative training strategy that dynamically mines the complementary relationships among the extracted modalities. This strategy encourages modalities to exchange beneficial information, allowing the framework to emphasize the most informative modality when others are ambiguous under challenging conditions such as occlusions, clothing changes, or poor lighting.
Second, we further enhance the occlusion-handling ability of Re-ID models by leveraging occlusion-type awareness, which can be seamlessly integrated into Re-ID backbones as a plug-and-play strategy. Our proposed approaches generalize effectively to both image-based and video-based Re-ID. To advance research in In-the-Wild Re-ID, we also construct large-scale datasets containing the most extensive clothing variations and occlusions per identity. Finally, extensive experiments and evaluations on major benchmarks demonstrate the superiority of our proposed frameworks over existing methods.



PGH 550 and MS Teams
Meeting ID: 211 376 846 231; Passcode: Q8xo7gf2

Dr. Shishir Shah, dissertation advisor

Faculty, students, and the general public are invited.
