The study of skeleton description reduction in the human fall-detection task

Authors: Seredin Oleg Sergeevich, Kopylov Andrei Valerievich, Surkov Egor Eduardovich

Journal: Computer Optics (Компьютерная оптика)

Section: Image processing, pattern recognition

Issue: Vol. 44, No. 6, 2020.

Free access

Accurate and reliable real-time fall detection is a key aspect of any intelligent care system for the elderly. Many modern RGB-D cameras can provide a skeleton description of the human figure as a compact representation of the pose. This makes it possible to use this description for further analysis without access to the real video and thus to increase the privacy of the whole system. A skeleton description reduction based on the anthropometric characteristics of the human body is proposed. An experimental study on the TST Fall Detection dataset v2 using the Leave-One-Person-Out protocol shows that the proposed skeleton description reduction technique provides better recognition quality and increases the overall performance of a fall-detection system.
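To make the idea of a reduced, person-invariant skeleton description more concrete, the sketch below builds a per-frame feature vector from a 25-joint Kinect v2 skeleton. The joint ordering follows the standard Kinect SDK; the particular reduced joint subset, the trunk-length normalisation, and all function and variable names are illustrative assumptions for this sketch and do not reproduce the authors' exact features.

```python
"""
Minimal sketch of a reduced skeleton descriptor for fall detection.

Assumptions (not taken verbatim from the paper):
  * the skeleton is the 25-joint Kinect v2 layout, given as a (25, 3)
    array of 3D joint coordinates in metres;
  * the "reduction" keeps a hand-picked subset of joints that are least
    affected by anthropometric variation and tracking noise (head, trunk,
    hips, knees, ankles), dropping hand tips, thumbs and feet;
  * the descriptor is the vector of pairwise distances between the kept
    joints, normalised by the trunk length (spine base to head) so that
    the description does not depend on the person's height.
"""
from itertools import combinations
import numpy as np

# Kinect v2 joint indices (standard SDK ordering).
JOINTS = {
    "spine_base": 0, "spine_mid": 1, "neck": 2, "head": 3,
    "shoulder_l": 4, "elbow_l": 5, "wrist_l": 6,
    "shoulder_r": 8, "elbow_r": 9, "wrist_r": 10,
    "hip_l": 12, "knee_l": 13, "ankle_l": 14,
    "hip_r": 16, "knee_r": 17, "ankle_r": 18,
    "spine_shoulder": 20,
}

# Hypothetical reduced joint set: a compact body "frame".
REDUCED = ["head", "spine_shoulder", "spine_base",
           "hip_l", "hip_r", "knee_l", "knee_r", "ankle_l", "ankle_r"]


def reduced_descriptor(skeleton: np.ndarray) -> np.ndarray:
    """skeleton: (25, 3) joint coordinates -> 1D per-frame feature vector."""
    idx = [JOINTS[name] for name in REDUCED]
    pts = skeleton[idx]                        # (9, 3) reduced skeleton

    # Anthropometric scale: distance from spine base to head.
    scale = np.linalg.norm(skeleton[JOINTS["head"]] -
                           skeleton[JOINTS["spine_base"]])
    scale = max(scale, 1e-6)                   # guard against degenerate frames

    # All pairwise distances between the kept joints, height-normalised.
    dists = [np.linalg.norm(pts[i] - pts[j]) / scale
             for i, j in combinations(range(len(pts)), 2)]
    return np.asarray(dists)                   # 9 * 8 / 2 = 36 features


if __name__ == "__main__":
    frame = np.random.rand(25, 3)              # stand-in for one tracked frame
    print(reduced_descriptor(frame).shape)     # (36,)
```

In a Leave-One-Person-Out evaluation such as the one described above, descriptors like these would be computed per frame, grouped by subject, and each subject in turn held out for testing while a classifier is trained on the rest.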


Keywords: fall detection, human activity detection, skeleton description, RGB-D camera, elderly people care system

Short URL: https://sciup.org/140250071

IDR: 140250071   |   DOI: 10.18287/2412-6179-CO-753

References

  • Falls. World Health Organization. Source: <https://www.who.int/en/news-room/fact-sheets/detail/falls>.
  • Wild K, et al. Unobtrusive in-home monitoring of cognitive and physical health: Reactions and perceptions of older adults. J Appl Gerontol 2008; 27: 181-200.
  • Mastorakis G, Makris D. Fall detection system using Kinect's infrared sensor. J Real-Time Image Process 2012; 9(4): 635-646.
  • Demiris G, et al. Older adults' privacy considerations for vision based recognition methods of eldercare applications. Technol Heal Care 2009; 17(1): 41-48.
  • Seredin OS, Kopylov AV, Huang S-C, Rodionov DS. A skeleton features-based fall detection using Microsoft Kinect v2 with one class-classifier outlier removal. Int Arch Photogramm Remote Sens Spat Inf Sci 2019; 42(2/W12): 189-195.
  • Mundher Z, Zhong J. A real-time fall detection system in elderly care using mobile robot and Kinect sensor. Int J Mater Mech Manuf 2014; 2(2): 133-138.
  • Wang J, et al. Mining actionlet ensemble for action recognition with depth cameras. Proc IEEE Comp Soc Conf Comp Vis Pattern Recognit 2012: 1290-1297.
  • Vemulapalli R, Arrate F, Chellappa R. Human action recognition by representing 3D skeletons as points in a lie group. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 2014: 588-595.
  • Hussein ME, et al. Human action recognition using a temporal hierarchy of covariance descriptors on 3D joint locations. IJCAI Int Jt Conf Artif Intell 2013: 2466-2472.
  • Papandreou G, et al. Towards accurate multi-person pose estimation in the wild. Proc 30th IEEE Conf Comput Vis Pattern Recognit (CVPR) 2017: 3711-3719.
  • Pathak D, Bhosale VK. Fall detection for elderly people in homes using Kinect Sensor. Int J Innov Res Comput Commun Eng 2017; 5(2): 1468-1474.
  • Bevilacqua V, et al. Fall detection in indoor environment with Kinect sensor. 2014 IEEE Int Symp Innov Intell Syst Appl Proc 2014: 319-324.
  • Chen C, et al. Learning a 3D human pose distance metric from geometric pose descriptor. IEEE Trans Vis Comput Graph 2011; 17(11): 1676-1689.
  • Zhang S, Liu X, Xiao J. On geometric features for skeleton-based action recognition using multilayer LSTM networks. Proc 2017 IEEE Winter Conference on Applications of Computer Vision (WACV) 2017: 148-157.
  • Zhang X, Xu C, Tao D. Graph edge convolutional neural networks for skeleton based action recognition. 2018: 1-22.
  • Du Y, Wang W, Wang L. Hierarchical recurrent neural network for skeleton based action recognition. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 2015: 1110-1118.
  • Yan S, Xiong Y, Lin D. Spatial temporal graph convolutional networks for skeleton-based action recognition. arXiv Preprint 2018. Source: <https://arxiv.org/abs/1801.07455>.
  • TST Fall detection dataset v2. IEEE DataPort. Source: <https://ieee-dataport.org/documents/tst-fall-detection-dataset-v2>.
  • Sung J, et al. Unstructured human activity detection from RGBD images. Proc IEEE Int Conf on Robotics and Automation 2012: 842-849.
  • Page ES. Continuous inspection schemes. Biometrika 1954; 41(1/2): 100-115.
  • Gasparrini S, Cippitelli E, Gambi E, Spinsante S, Wåhslén J, Orhan I, Lindh T. Proposal and experimental evaluation of fall detection solution based on wearable and depth data fusion. In Book: Loshkovska S, Koceski S, eds. ICT Innovations 2015: Advances in intelligent systems and computing. Cham: Springer; 2016: 99-108.
  • Fakhrulddin AH, Fei X, Li H. Convolutional neural networks (CNN) based human fall detection on Body Sensor Networks (BSN) sensor data. Proc 2017 4th Int Conf Syst Informatics (ICSAI) 2018: 1461-1465.
  • Hwang S, Ahn D, Park H, Park T. Maximizing accuracy of fall detection and alert systems based on 3D convolutional neural network. Proc Second Int Conf Internet-of-Things Des Implement (IoTDI'17) 2017: 343-344.
  • Min W, Yao L, Lin Z, Liu L. Support vector machine approach to fall recognition based on simplified expression of human skeleton action and fast detection of start key frame using torso angle. IET Comput Vis 2018; 12(8): 1133-1140.
Research article