Non-destructive anonymization of training data for object detection
The rapid advancement of computer vision, powered by large-scale visual datasets and deep learning, has raised pressing concerns about privacy, particularly when human faces are involved. This work explores how facial anonymization affects the performance of human detection models, aiming to balance identity protection with model utility. A range of anonymization techniques are applied, including
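The list of techniques is cut off above, but the core idea of *non-destructive* anonymization — producing an edited copy for training while leaving the original data untouched — can be sketched as follows. This is an illustrative example, not the paper's implementation: the function name, the simple box blur (standing in for Gaussian blur or pixelation), and the hard-coded face bounding box are all assumptions.

```python
import numpy as np

def anonymize_region(image: np.ndarray, box: tuple, kernel: int = 15) -> np.ndarray:
    """Return a blurred-face copy of a grayscale image.

    `box` = (x0, y0, x1, y1) is the face bounding box. The input
    array is never modified, so the original dataset stays intact.
    """
    out = image.copy()                      # work on a copy: non-destructive
    x0, y0, x1, y1 = box
    region = out[y0:y1, x0:x1].astype(float)
    # Separable box blur as a simple stand-in for Gaussian blur.
    pad = kernel // 2
    padded = np.pad(region, ((pad, pad), (pad, pad)), mode="edge")
    blurred = np.zeros_like(region)
    for dy in range(kernel):
        for dx in range(kernel):
            blurred += padded[dy:dy + region.shape[0], dx:dx + region.shape[1]]
    blurred /= kernel * kernel
    out[y0:y1, x0:x1] = blurred.astype(image.dtype)
    return out

# Usage: anonymize a synthetic 64x64 grayscale image.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
before = img.copy()
anon = anonymize_region(img, (10, 10, 40, 40))
```

In a real pipeline the bounding box would come from a face detector, and the anonymized copies — not the originals — would be fed to the detection model being evaluated.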