TY - JOUR
T1 - GestureLens
T2 - Visual Analysis of Gestures in Presentation Videos
AU - Zeng, Haipeng
AU - Wang, Xingbo
AU - Wang, Yong
AU - Wu, Aoyu
AU - Pong, Ting Chuen
AU - Qu, Huamin
N1 - Publisher Copyright:
© 1995-2012 IEEE.
PY - 2023/8/1
Y1 - 2023/8/1
N2 - Appropriate gestures can enhance message delivery and audience engagement in both daily communication and public presentations. In this article, we contribute a visual analytics approach that assists professional public speaking coaches in improving their gesture training practice through the analysis of presentation videos. Manually checking and exploring gesture usage in presentation videos is often tedious and time-consuming, and an efficient method to support gesture exploration is lacking; the task is challenging due to the intrinsically temporal evolution of gestures and their complex correlation with speech content. We therefore propose GestureLens, a visual analytics system that facilitates gesture-based and content-based exploration of gesture usage in presentation videos. Specifically, the exploration view gives users a quick overview of the spatial and temporal distributions of gestures. Dynamic hand movements are first aggregated through a heatmap in the gesture space to uncover spatial patterns, and then decomposed into two mutually perpendicular timelines to reveal temporal patterns. The relation view lets users explicitly explore the correlation between speech content and gestures through linked analysis and intuitive glyph designs. The video view and dynamic view show the context and the overall dynamic movement of the selected gestures, respectively. Two usage scenarios and expert interviews with professional presentation coaches demonstrate the effectiveness and usefulness of GestureLens in facilitating gesture exploration and analysis of presentation videos.
KW - Gesture
KW - hand movements
KW - presentation video analysis
KW - visual analysis
UR - https://www.webofscience.com/wos/woscc/full-record/WOS:001022080200017
UR - https://openalex.org/W4224246038
UR - https://www.scopus.com/pages/publications/85128585550
U2 - 10.1109/TVCG.2022.3169175
DO - 10.1109/TVCG.2022.3169175
M3 - Journal Article
C2 - 35446768
SN - 1077-2626
VL - 29
SP - 3685
EP - 3697
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 8
ER -