Real-time postural analysis demonstrator. At ITCL, a module for detecting people and estimating their posture has been developed. To keep the platform unobtrusive, the module uses a single RGB camera positioned at a suitable vantage point, whose images are processed by a deep learning system. The system comprises two neural networks: one detects the people in the image, while the other locates the positions of their joints.
With this information, the position of people in their work environment and their postural hygiene can be analysed.
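The two-stage pipeline described above (detect each person, then estimate that person's joints within the detected region) can be sketched as follows. This is a minimal illustration with stand-in functions in place of the actual neural networks; the box format, joint layout, and all numeric values are hypothetical, not taken from the ITCL system.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    box: Tuple[int, int, int, int]  # (x, y, width, height) in image pixels

@dataclass
class Pose:
    joints: List[Tuple[int, int]]  # joint coordinates in full-image pixels

def detect_people(image) -> List[Detection]:
    # Stand-in for the person-detection network: returns one fixed box.
    return [Detection(box=(100, 50, 80, 200))]

def estimate_joints(crop_size: Tuple[int, int]) -> List[Tuple[int, int]]:
    # Stand-in for the joint-estimation network: joints relative to the crop.
    w, h = crop_size
    return [(w // 2, h // 10), (w // 2, h // 2)]  # e.g. head, hip

def analyse(image) -> List[Pose]:
    poses = []
    for det in detect_people(image):
        x, y, w, h = det.box
        # Map crop-relative joint coordinates back to the full image frame,
        # so downstream postural analysis works in one coordinate system.
        joints = [(x + jx, y + jy) for jx, jy in estimate_joints((w, h))]
        poses.append(Pose(joints=joints))
    return poses
```

The key design point is that the second network only ever sees a cropped person region, so its outputs must be translated back into image coordinates before any workplace-position or postural-hygiene analysis.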
Robotic arm movement control demonstrator. Using non-invasive devices (accelerometers and magnetometers), two use cases are presented:
- In the first, the degree of opening between the index finger and thumb (a pinching gesture) controls the opening of a robot accessory, a flexible two-finger robotic gripper.
- In the second, a robotic arm executes predefined movements when it detects a given sequence of motions within the activities of an operator on the production line.
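The first use case amounts to mapping a measured finger-opening angle to a gripper aperture. A minimal sketch of that mapping, assuming a linear calibration between a closed-pinch and open-pinch angle (both calibration values here are illustrative, not from the actual device):

```python
def pinch_to_aperture(angle_deg: float,
                      closed_deg: float = 5.0,
                      open_deg: float = 60.0) -> float:
    """Map the index-thumb opening angle (e.g. derived from wearable
    accelerometer/magnetometer readings) to a gripper aperture in [0, 1].
    The calibration bounds are hypothetical placeholder values."""
    # Clamp to the calibrated range, then normalise linearly.
    angle = max(closed_deg, min(open_deg, angle_deg))
    return (angle - closed_deg) / (open_deg - closed_deg)
```

In practice the raw sensor signal would also need filtering and per-user calibration; the sketch shows only the final angle-to-aperture step.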
The system is built around a robot with immersive VR control: a stereoscopic camera on a three-axis motorized mount replicates the orientation of the remote operator's head.
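Replicating head orientation on a three-axis mount essentially means forwarding the operator's yaw, pitch, and roll to the motors, clamped to the mount's mechanical range. A small sketch under that assumption (the range limit is illustrative, not a specification of the actual hardware):

```python
def head_to_camera(yaw: float, pitch: float, roll: float,
                   limit_deg: float = 90.0) -> tuple:
    """Map the VR headset's head angles to three motor setpoints,
    clamping each axis to a hypothetical mechanical limit so the
    stereoscopic camera mirrors the operator's head orientation."""
    def clamp(angle: float) -> float:
        return max(-limit_deg, min(limit_deg, angle))
    return (clamp(yaw), clamp(pitch), clamp(roll))
```

A real implementation would also smooth the command stream to avoid jitter from the headset's tracking; the clamping step shown here is only the safety envelope.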
Virtual Reality, Augmented Reality, Mixed Reality, Robotics, Artificial Intelligence
Entity responsible for the technological demonstrator
Javier Sedano, R&D Director
Demonstration space: Expo Industria 4.0 Burgos, Expo Zone, Floor 3, Forum Evolution. Demonstrator no. 23