Abstract
We tackle stationary crowd analysis in this paper, which is as important as modeling mobile groups in crowd scenes and has many applications in surveillance. Our key contribution is a robust algorithm for estimating how long a foreground pixel has been stationary. This is much more challenging than background subtraction alone, because a failure at a single frame, caused by local movement of objects, lighting variation, or occlusion, can lead to large errors in stationary time estimation. To achieve accurate results, sparsity constraints along the spatial and temporal dimensions are jointly imposed through mixed partial derivatives to shape a 3D stationary time map, which is formulated as an L0 optimization problem. Beyond background subtraction, the method distinguishes among different foreground objects that are close or overlapping in the spatio-temporal space by using a locally shared foreground codebook. The proposed techniques are used to detect four types of stationary group activities and to analyze crowd scene structures. We provide the first public benchmark dataset for stationary time estimation and stationary group analysis.
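To see why single-frame failures matter, consider the naive baseline the abstract argues against: accumulating a per-pixel stationary-time counter directly from frame-by-frame foreground masks. The sketch below is illustrative only (it is not the paper's L0-regularized method); the function name and toy data are our own.

```python
import numpy as np

def update_stationary_time(stationary_time, foreground_mask):
    """Naive per-frame update: increment the counter at pixels currently
    detected as stationary foreground, reset it to zero elsewhere.
    A single missed detection (occlusion, lighting change) wipes out
    a long accumulated stationary time -- the fragility the paper's
    joint spatio-temporal L0 formulation is designed to avoid."""
    return np.where(foreground_mask, stationary_time + 1, 0)

# Toy one-pixel example: a detection gap at frame 4 erases the history.
masks = [True, True, True, False, True]
t = np.zeros(1, dtype=int)
for m in masks:
    t = update_stationary_time(t, np.array([m]))
# t[0] is 1 rather than ~5: one failed frame caused a large error.
```

This fragility is exactly why the paper couples frames through sparsity constraints on mixed partial derivatives of the 3D stationary time map rather than trusting each frame independently.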
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition |
| Publisher | IEEE Computer Society |
| Pages | 2219-2226 |
| Number of pages | 8 |
| ISBN (Electronic) | 9781479951178 |
| DOIs | |
| Publication status | Published - 24 Sept 2014 |
| Externally published | Yes |
| Event | 27th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2014 - Columbus, United States (23 Jun 2014 → 28 Jun 2014) |
Publication series
| Name | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition |
|---|---|
| ISSN (Print) | 1063-6919 |
Conference
| Conference | 27th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2014 |
|---|---|
| Country/Territory | United States |
| City | Columbus |
| Period | 23/06/14 → 28/06/14 |
Bibliographical note
Publisher Copyright: © 2014 IEEE.