Abstract:
Computational simulation is a widely used tool for monitoring and optimizing performance
indicators in both academia and industry. However, human operators are commonly
represented in simulation models as resources with constant nominal output. Such
models are often insufficient, as they disregard the variations inherent to human
physiology. Although the academic literature offers ways to represent such variations in
human performance, emotional factors are rarely addressed. Moreover, studies that seek
to model the operator's psychophysical attitude, which includes emotions, usually require
sophisticated equipment or time-consuming forms to be filled out by the operators. To
present an alternative for the inclusion of the emotional human factor in simulation models,
this work uses a machine learning model to identify the apparent mood of operators from
footage of a manufacturing production line in operation. This approach neither interferes
with the operators' workload nor requires any equipment beyond a common
video camera, filling a gap observed in the literature. This study revealed 66 statistically
significant correlations between the mood variables estimated by the machine learning model
and operation time on the assembly line. It was also shown that the probability distribution
functions for the operation time differ significantly when considering different classes of
mood. Lastly, these distinct distributions were incorporated into a simulation model. The results
show a trend of improvement in the simulation outputs, thus demonstrating the viability of this
technique. This study also provides instructions for applying the proposed technique that can
be used in similar projects. Beyond simulation, this technique can be applied to a variety of
fields, including defect prevention and occupational health and safety.