TY - GEN
T1 - Workload-driven modulation of mixed-reality robot-human communication
AU - Hirshfield, Leanne
AU - Williams, Tom
AU - Sommer, Natalie
AU - Grant, Trevor
AU - Gursoy, Senem Velipasalar
N1 - Publisher Copyright:
© 2018 Association for Computing Machinery.
PY - 2018/10/16
Y1 - 2018/10/16
N2 - In this work we explore how Augmented Reality annotations can be used as a form of Mixed Reality gesture, how neurophysiological measurements can inform the decision of whether to use such gestures, and whether and how to adapt language when using such gestures. We propose a preliminary investigation of how decisions regarding robot-to-human communication modality in mixed reality environments might be made on the basis of humans’ perceptual and cognitive states. Specifically, we propose to use brain data acquired with high-density functional near-infrared spectroscopy (fNIRS) to measure the neural correlates of cognitive and emotional states with particular relevance to adaptive human-robot interaction (HRI). We describe several states of interest that fNIRS is well suited to measure and that have direct implications for HRI adaptations, and we leverage a framework developed in our prior work to explore how different neurophysiological measures could inform the selection of different communication strategies. We then describe results from a feasibility experiment in which multilabel Convolutional Long Short-Term Memory networks were trained to classify the target mental states of 10 participants, and we discuss a research agenda for adaptive human-robot teams based on our findings.
AB - In this work we explore how Augmented Reality annotations can be used as a form of Mixed Reality gesture, how neurophysiological measurements can inform the decision of whether to use such gestures, and whether and how to adapt language when using such gestures. We propose a preliminary investigation of how decisions regarding robot-to-human communication modality in mixed reality environments might be made on the basis of humans’ perceptual and cognitive states. Specifically, we propose to use brain data acquired with high-density functional near-infrared spectroscopy (fNIRS) to measure the neural correlates of cognitive and emotional states with particular relevance to adaptive human-robot interaction (HRI). We describe several states of interest that fNIRS is well suited to measure and that have direct implications for HRI adaptations, and we leverage a framework developed in our prior work to explore how different neurophysiological measures could inform the selection of different communication strategies. We then describe results from a feasibility experiment in which multilabel Convolutional Long Short-Term Memory networks were trained to classify the target mental states of 10 participants, and we discuss a research agenda for adaptive human-robot teams based on our findings.
UR - http://www.scopus.com/inward/record.url?scp=85058351773&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85058351773&partnerID=8YFLogxK
U2 - 10.1145/3279810.3279848
DO - 10.1145/3279810.3279848
M3 - Conference contribution
AN - SCOPUS:85058351773
T3 - Proceedings of the Workshop on Modeling Cognitive Processes from Multimodal Data, MCPMD 2018
BT - Proceedings of the Workshop on Modeling Cognitive Processes from Multimodal Data, MCPMD 2018
PB - Association for Computing Machinery, Inc
T2 - 2018 Workshop on Modeling Cognitive Processes from Multimodal Data, MCPMD 2018
Y2 - 16 October 2018
ER -