Smart sensing: An info-structural model of cognition for non-interacting agents
Iovane G.
2020-01-01
Abstract
This study explores an info-structural model of cognition for non-interacting agents affected by human sensation, perception, emotion, and affection. We do not analyze the neuroscientific or psychological debate concerning how the human mind works, but we underline the importance of modeling the above cognitive levels when designing artificial intelligence agents. Our aim is to start a reflection on the computational reproduction of intelligence by providing a methodological approach through which the aforementioned human factors are enhanced in autonomous systems. The presented model should be understood as part of a larger one, which also includes concepts of attention, awareness, and consciousness. Experiments have been performed by providing visual stimuli to the proposed model, coupling the emotion cognitive level with a supervised learner to produce artificial emotional activity. For this purpose, the performance of Random Forest and XGBoost has been compared; with the latter algorithm, 85% accuracy and 92% coherency over predefined emotional episodes have been achieved. The model has also been tested on emotional episodes different from those used in the training phase, and a decrease in accuracy and coherency has been observed. Furthermore, by decreasing the weight assigned to the emotion cognitive instances, the model reaches the same performance recorded during the evaluation phase. Overall, the framework achieves an initial emotional generalization responsiveness of 94% and presents an approximately constant relative frequency of the agent's displayed emotions.
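
As a purely illustrative sketch (not the authors' pipeline), the supervised comparison mentioned in the abstract could be set up roughly as follows in Python; the feature vectors, emotion labels, and hyperparameters below are hypothetical placeholders, not the data or settings used in the study.

    # Hypothetical sketch: comparing Random Forest and XGBoost on labeled
    # emotional episodes. Features and labels are synthetic stand-ins.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score
    from xgboost import XGBClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 16))       # placeholder features derived from visual stimuli
    y = rng.integers(0, 4, size=600)     # placeholder discrete emotion labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    xgb = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="mlogloss").fit(X_tr, y_tr)

    print("Random Forest accuracy:", accuracy_score(y_te, rf.predict(X_te)))
    print("XGBoost accuracy:", accuracy_score(y_te, xgb.predict(X_te)))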