
InHARD – Industrial Human Action Recognition Dataset in the Context of Industrial Collaborative Robotics

Conference: paper with proceedings at an international conference

Nowadays, humans and robots are working more closely together. This increases business productivity and product quality, leading to efficiency and growth. However, human-robot collaboration remains rather static: robots move to a specific position, and humans then perform their tasks while being assisted by the robots. To achieve dynamic collaboration, robots need to understand human intention and learn to recognize the actions being performed, thereby complementing human capabilities and relieving workers of arduous tasks. Consequently, there is a need for a human action recognition dataset suited to machine learning algorithms. Currently available depth-based and RGB+D+S-based human action recognition datasets have a number of limitations, including the lack of training samples with distinct class labels, limited camera views, little diversity of subjects and, most importantly, the absence of actual industrial human actions recorded in an industrial environment. Existing action recognition datasets cover simple daily, mutual, or health-related actions. Therefore, in this paper we introduce an RGB+S dataset named “Industrial Human Action Recognition Dataset” (InHARD), captured in a real-world setting for industrial human action recognition, with over 2 million frames collected from 16 distinct subjects. The dataset contains 13 different industrial action classes and over 4800 action samples. Its introduction should enable the study and development of various learning techniques for human action analysis in industrial environments involving human-robot collaboration.
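To give a concrete idea of the kind of learning technique the dataset targets, the sketch below trains a toy skeleton-sequence action classifier in PyTorch. It is a minimal illustration only: the joint count, clip length, and random stand-in data are assumptions chosen for the example, not the InHARD file format or an official baseline; only the 13-class label space comes from the abstract above.

```python
# Minimal sketch of a skeleton-based action classifier (illustrative only).
# Tensor shapes and the random stand-in batch are assumptions; real
# experiments would load InHARD skeleton sequences and labels instead.
import torch
import torch.nn as nn

NUM_CLASSES = 13   # action classes reported for InHARD
NUM_JOINTS = 17    # hypothetical joint count, for illustration
SEQ_LEN = 120      # hypothetical clip length in frames

class SkeletonLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=NUM_JOINTS * 3, hidden_size=128,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(128, NUM_CLASSES)

    def forward(self, x):                 # x: (batch, frames, joints * xyz)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # classify from the last time step

model = SkeletonLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Random stand-in batch: 8 clips of SEQ_LEN frames, each frame flattened
# to a (joints * xyz) vector; labels drawn uniformly over the 13 classes.
clips = torch.randn(8, SEQ_LEN, NUM_JOINTS * 3)
labels = torch.randint(0, NUM_CLASSES, (8,))

logits = model(clips)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
print(f"toy training step done, loss = {loss.item():.3f}")
```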