Applying shop floor knowledge to achieve better-run factories

Innovative 3D simulation software enables plant managers to streamline factory operations and take into account the experience of workers themselves.

A highly skilled workforce – supported by advanced automation and IT tools – has enabled European industries to become leaders in fields ranging from car making to chemicals. In order to ensure that factories and assembly lines remain at the cutting edge in a highly competitive world, the EU-funded INTERACT project has sought to better utilise workers’ knowledge in the development of next generation digital tools.

‘Manufacturing companies often use 3D software tools to simulate human tasks on the factory floor prior to their implementation,’ explains INTERACT project coordinator Professor Martin Manns from the University of Siegen in Germany. ‘Initially, tasks are described in a textual manner, before being translated into 3D simulations. This enables managers to make time and cost estimations and achieve production efficiencies. However, the skills and knowledge of workers are often not utilised, and there is no standard mechanism for taking this valuable input into account.’

The INTERACT project has sought to facilitate the automatic generation of 3D assembly plant plans and enable workers and engineers themselves to contribute to optimising processes. ‘In traditional process planning, an initial plan is created by a planning engineer who documents critical issues and proposes solutions,’ adds Manns. ‘Our aim has been to replace this with a completely virtual model.’

This has been achieved by using software involving controlled natural language commands – where grammar and vocabulary are restricted in order to eliminate ambiguity and complexity – along with a statistical motion database to generate realistic human motions. In addition, low-cost sensors were used to track actual tasks on the shop floor, making the project’s 3D simulations more intuitive and interactive. The idea of optimising motions through real-life actions led to the design of new innovations such as a data glove with inertial, bending and force sensors.
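To illustrate the idea of a controlled natural language, the sketch below parses task commands against a small fixed verb vocabulary and grammar, turning each into a structured task record that a simulation could consume. The vocabulary, grammar and field names are invented for illustration; they are not the INTERACT project's actual language.

```python
import re

# Restricted verb vocabulary: each verb maps to one motion type.
# Hypothetical example set, not the INTERACT grammar.
VERBS = {"walk": "walk", "pick up": "pick_up", "carry": "carry", "place": "place"}

# Grammar: VERB [OBJECT] ["to" TARGET] -- restricting form like this
# is what removes ambiguity from free-text task descriptions.
COMMAND = re.compile(
    r"^(?P<verb>walk|pick up|carry|place)"
    r"(?: (?P<object>\w+))?"
    r"(?: to (?P<target>\w+))?$"
)

def parse_command(text: str) -> dict:
    """Parse one controlled-language command into a structured task record."""
    m = COMMAND.match(text.strip().lower())
    if m is None:
        raise ValueError(f"not in the controlled language: {text!r}")
    return {
        "motion": VERBS[m.group("verb")],
        "object": m.group("object"),
        "target": m.group("target"),
    }

plan = [parse_command(c) for c in
        ["walk to station2", "pick up housing", "carry housing to station3"]]
```

Each record in `plan` could then be handed to a motion-synthesis step that selects a matching human motion from a statistical database.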

‘In this project, we specifically focused on manual production assembly lines and warehouse operations, and ran two case studies with an automotive manufacturer and a white goods manufacturer,’ explains Manns. ‘We looked at three key questions: whether a task is feasible for any worker; whether a worker can complete the tasks within a given cycle time; and whether ergonomic issues are likely to arise if the worker repeats the same process over a period of years. Obviously, all three questions leave room for process optimisation.’
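The three questions above amount to three checks on a simulated task sequence. A minimal sketch of such an assessment is shown below; all task data, limits and the scoring rules are hypothetical placeholders, chosen only to show the shape of the checks, not the project's actual criteria.

```python
# Hypothetical task sequence for one assembly cycle (all numbers invented).
TASKS = [
    {"name": "pick up housing",  "duration_s": 2.4, "reach_cm": 55, "load_kg": 1.2},
    {"name": "carry to station", "duration_s": 5.1, "reach_cm": 0,  "load_kg": 1.2},
    {"name": "place and fasten", "duration_s": 8.8, "reach_cm": 60, "load_kg": 0.3},
]

MAX_REACH_CM   = 65    # assumed feasibility limit for any worker
CYCLE_TIME_S   = 18.0  # assumed line takt time
ERGO_LIMIT_KGS = 30.0  # assumed cumulative load-time limit per cycle

def assess(tasks):
    """Answer the three planning questions for one task sequence."""
    # 1. Feasibility: every reach must be within the assumed limit.
    feasible = all(t["reach_cm"] <= MAX_REACH_CM for t in tasks)
    # 2. Cycle time: total duration must fit the takt time.
    within_cycle = sum(t["duration_s"] for t in tasks) <= CYCLE_TIME_S
    # 3. Ergonomics: crude load * time proxy for long-term strain.
    load_time = sum(t["load_kg"] * t["duration_s"] for t in tasks)
    ergonomic = load_time <= ERGO_LIMIT_KGS
    return {"feasible": feasible, "within_cycle": within_cycle, "ergonomic": ergonomic}

result = assess(TASKS)
```

A failing check in any of the three dimensions flags the task sequence for redesign, which is where the optimisation potential mentioned above comes in.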

The key outcome has been a proof-of-concept demonstrator for automated, context-dependent motion synthesis from controlled natural language. ‘This algorithmic solution has been developed from scratch and produces realistic-looking body motions, though we haven’t succeeded in visualising fingers yet,’ says Manns. ‘Interestingly, we found a rich variety of motions from the shop floor, which enabled us to increase the number of input motions to over 10 000.’

However, even this number only allowed 11 of the 22 originally planned motion types (walk, pick up, carry, etc.) to be modelled. Extending the number of motion types will require further input data. Nonetheless, a browser-based live demo is available on the INTERACT website. And while the technology is not yet ready for commercialisation, the project has sparked interest from companies in other fields such as motion capture, virtual reality and entertainment, as well as from academia. Further research projects are currently being prepared in order to bring the technology to market.

For further information, please see:
INTERACT project website

published: 2016-10-12