A vision-based postural assessment approach for the unobtrusive detection of physical strain and the prevention of Musculoskeletal Disorders in industrial environments

Assessment and prevention of work-related musculoskeletal disorders (WMSDs) is a common and critical task for occupational safety and well-being in working environments [1]. In the manufacturing industry in particular, labor-intensive, repetitive assembly tasks often involve prolonged, suboptimal working postures. These may lead to physical strain and thus an increased risk of WMSDs affecting various body parts and joints, such as the lower or upper back, the neck, the shoulders and the knees. WMSDs can severely affect quality of life, especially for adult and ageing workers, and are associated with significant productivity losses and increased healthcare costs for industrial organizations and the public sector [2].

A common practice for organizations is to rely on conventional ergonomic risk assessment methods, based on observations by experts and self-reports by workers using worksheets or checklists to evaluate working postures, indicators of mental and physical stress, and other risk factors related to WMSDs and fatigue. This makes the process subjective, time-consuming, and labor-intensive [4]. To overcome those limitations, the research topic of vision-based, automatic postural analysis for assessing ergonomic risks was introduced and has gained much attention from the research community and industry over the last decade [3]. The effective detection of risk factors for WMSDs and physical fatigue, whether in real time during work tasks or offline, can significantly assist in supporting preventive actions and promoting the occupational safety of workers [5]. In the context of sustAGE, we are exploring and developing innovative, unobtrusive methods for the automatic assessment of workers' physical strain that combine visual and heart-rate data to detect abnormal postures and worker fatigue.

Method

The sustAGE case study addresses the car manufacturing industry, where workers typically work in shifts on an assembly line while the conveyor belt moves slowly at a constant speed. Each worker is responsible for a specific car assembly task cycle (e.g., welding, assembling) that lasts 4 to 5 minutes and is repeated throughout the shift. To detect events causing physical strain and to support preventive actions, we rely on visual information acquired by stereo cameras placed along the production line, supporting the automatic assessment of abnormal body postures and of the ergonomic risk level of each such occurrence according to the MURI risk analysis approach [6].

An illustration of a workstation of the assembly line and the positions of two visual sensors monitoring the motion and assembly activities performed by a line worker during a task cycle execution.

Specifically, we introduce a non-obtrusive method for the automatic postural assessment of each line worker during assembly tasks using visual data acquired by a stereo camera, thus removing the need for body-worn sensors [7]. First, we use state-of-the-art 2D and 3D body pose estimation and tracking methods, OpenPose [8] and FORTH MocapNet [9] respectively, to compute the body-centered positions and orientations of 16 body joints of each worker, even in the presence of severe body occlusions during work tasks. We then extract view-invariant 3D skeletal body features [10] that serve as input to the proposed deep learning-based approach for postural analysis and classification.
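The idea behind body-centered, view-invariant features can be sketched as follows. This is a minimal illustration only, assuming 3D joint positions from the pose tracker and a hypothetical joint indexing; it is not the actual sustAGE feature set described in [10].

```python
import numpy as np

# Hypothetical joint indices; the actual 16-joint layout of the
# pipeline is not specified in the article.
PELVIS, L_HIP, R_HIP, NECK = 0, 1, 2, 3

def view_invariant_features(joints_3d):
    """Normalize a (16, 3) array of 3D joint positions into a
    body-centered representation that does not depend on where
    the camera is or how tall the worker is."""
    j = joints_3d.astype(float)
    # 1. Translate: the pelvis becomes the origin.
    j = j - j[PELVIS]
    # 2. Rotate about the vertical (z) axis so the hip line aligns
    #    with the x-axis, removing the camera's yaw relative to the body.
    hip = j[R_HIP] - j[L_HIP]
    yaw = np.arctan2(hip[1], hip[0])
    c, s = np.cos(-yaw), np.sin(-yaw)
    Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    j = j @ Rz.T
    # 3. Scale by torso length so the features are height-invariant.
    torso = np.linalg.norm(j[NECK] - j[PELVIS])
    if torso > 0:
        j /= torso
    return j
```

With this normalization, the same posture observed from two different camera viewpoints yields (up to noise) the same feature array, which is the property the classifier relies on.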

The workflow of the proposed methodology for vision-based postural assessment

The proposed approach covers four types of time-varying, abnormal postures based on the MURI risk analysis method [11], which is commonly used for ergonomic risk analysis in manufacturing.

A set of four types of abnormal/awkward body postures and ergonomic risk levels based on the MURI risk analysis method

The selected abnormal postures indicate physical strain imposed on specific body joints: (a) rotation angle of the waist, (b) flexion/stretching angle of the knees, (c) flexion angle of the waist, (d) height of the working arms. The risk level is quantified as high, medium or low based on the corresponding body joint positions or angles. To classify the type and risk level of an abnormal body posture in a trimmed video clip, we propose a novel approach for postural classification using Spatio-temporal Graph Convolutional Networks [12], which effectively model the correlations between body parts and assess their motion variations against a support set of prototypical postures.
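The angle-based risk grading can be sketched for one posture type, waist flexion. The angle thresholds below are illustrative placeholders, not the actual MURI criteria, and the two-joint torso approximation is an assumption for the example.

```python
import math

def waist_flexion_deg(pelvis, neck):
    """Angle between the pelvis-to-neck (torso) vector and the
    vertical axis, in degrees; 0 means fully upright."""
    vx, vy, vz = (n - p for n, p in zip(neck, pelvis))
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, vz / norm))))

def waist_flexion_risk(flexion_deg):
    """Map a waist flexion angle to a low/medium/high ergonomic risk
    level. Thresholds are hypothetical, for illustration only."""
    if flexion_deg < 20:
        return "low"
    if flexion_deg < 45:
        return "medium"
    return "high"
```

The other three posture types would follow the same pattern, each with its own joint measurement (knee angle, waist rotation, arm height) and thresholds.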


Integration in the sustAGE system

The high-level semantic information on the detected body postures and the associated risk for each worker is continuously monitored and aggregated by the system modules to support vision-based user behaviour monitoring during assembly activities. The proposed vision-based module provides the sustAGE system with information on the postural events detected during or after every assembly task cycle performed by a line worker. This information is further aggregated over consecutive task cycles performed by the same worker to assess the short- and long-term ergonomic risk of physical strain. The final step is the generation and delivery of personalized recommendations to the workers, e.g. to correct their posture when a medium-strain condition is detected, or to take a short break in case of a high-strain condition, aiming to enhance occupational safety in a preventive manner.
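The aggregation and recommendation step can be sketched as follows, under assumed shapes for the data: each task cycle yields a list of risk levels for the detected postural events, and simple counts over recent cycles trigger a recommendation. The event representation and the trigger thresholds are assumptions for illustration, not the sustAGE system's actual policy.

```python
from collections import Counter

def recommend(events, high_limit=1, medium_limit=3):
    """events: risk levels ('low'/'medium'/'high') of postural events
    detected over consecutive task cycles of one worker.
    Returns a recommendation string, or None if no action is needed.
    The limits are illustrative placeholders."""
    counts = Counter(events)
    # High strain takes priority: suggest a short break.
    if counts["high"] >= high_limit:
        return "take a short break"
    # Repeated medium strain: suggest correcting the posture.
    if counts["medium"] >= medium_limit:
        return "correct your posture"
    return None
```

In the deployed system this logic would also weigh how recent each event is and the worker's profile before a notification is pushed to the mobile app.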

Screenshots of the sustAGE mobile app: the main screen (left), the notification (middle), and the recommendation message in Italian (right), issued upon detection of high physical strain during a series of assembly tasks performed by the worker (message in English: “You seem to have been distressing your back during work. Try to correct your posture for the rest of your shift. OK?”).


References

  1. Marlene Ferreira Brito, Ana Luísa Ramos, Paula Carneiro, and Maria Antónia Gonçalves (2019). Ergonomic Analysis in Lean Manufacturing and Industry 4.0 - A Systematic Review. Springer International Publishing, Cham, 95–127.
  2. M. Lourakis, S. Ioannidis, N. Cummins, B. Schuller, E. Loutsetis, and D. Koutsouris (2020). Biosensors and Internet of Things in smart healthcare applications: challenges and opportunities. In Wearable and Implantable Medical Devices.
  3. T. D. Nguyen, M. Kleinsorge, and J. Krüger (2014). ErgoAssist: An Assistance System to Maintain Ergonomic Guidelines at Workplaces. In Emerging Technology and Factory Automation (ETFA), 1–4.
  4. B. Juul-Kristensen, N. Fallentin, and C. Ekdahl (1997). Criteria for classification of posture in repetitive work by observation methods: A review. International Journal of Industrial Ergonomics, 19(5), 397–411.
  5. M. Lourakis, S. Ioannidis, N. Cummins, B. Schuller, E. Loutsetis, and D. Koutsouris (2020). Biosensors and Internet of Things in smart healthcare applications: challenges and opportunities. In Wearable and Implantable Medical Devices.
  6. J. Womack (2006). From lean tools to lean management. Lean Enterprise Institute Email Newsletter, 21.
  7. Konstantinos Papoutsakis, Thodoris Papadopoulos, Michalis Maniadakis, Manolis Lourakis, Maria Pateraki, and Iraklis Varlamis (2021). Detection of physical strain and fatigue in industrial environments using visual and non-visual sensors. In The 14th PErvasive Technologies Related to Assistive Environments Conference (PETRA 2021), June 29–July 2, 2021.
  8. Zhe Cao, Gines Hidalgo, Tomas Simon, Shih-En Wei, and Yaser Sheikh (2021). OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields. IEEE Transactions on Pattern Analysis and Machine Intelligence.
  9. Ammar Qammaz and Antonis A. Argyros (2020). Occlusion-tolerant and personalized 3D human pose estimation in RGB images. In IEEE ICPR 2020.
  10. Konstantinos Papoutsakis, Costas Panagiotakis, and Antonis A. Argyros (2017). Temporal Action Co-Segmentation in 3D Motion Capture Data and Videos. In IEEE Computer Vision and Pattern Recognition (CVPR) 2017.
  11. J. Womack (2006). From lean tools to lean management. Lean Enterprise Institute Email Newsletter, 21.
  12. Sijie Yan, Yuanjun Xiong, and Dahua Lin (2018). Spatial Temporal Graph Convolutional Networks for Skeleton-Based Action Recognition. In AAAI 2018.