Breathing Life Into Biomechanical User Models
Ikkala, Aleksi; Fischer, Florian; Klar, Markus; Bachinski, Miroslav; Fleig, Arthur; Howes, Andrew; Hämäläinen, Perttu; Müller, Jörg; Murray-Smith, Roderick; Oulasvirta, Antti
Original version
In: Agrawala, M., Wobbrock, J. O., Adar, E., Setlur, V. (eds.), UIST '22: Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, Article 90. DOI: 10.1145/3526113.3545689

Abstract
Forward biomechanical simulation in HCI holds great promise as a tool for evaluation, design, and engineering of user interfaces. Although reinforcement learning (RL) has been used to simulate biomechanics in interaction, prior work has relied on unrealistic assumptions about the control problem involved, which limits the plausibility of the emerging policies. These assumptions include direct torque actuation as opposed to muscle-based control; direct, privileged access to the external environment, instead of imperfect sensory observations; and a lack of interaction with physical input devices. In this paper, we present a new approach for learning muscle-actuated control policies based on perceptual feedback in interaction tasks with physical input devices. This allows modelling of more realistic interaction tasks with cognitively plausible visuomotor control. We show that our simulated user model successfully learns a variety of tasks representing different interaction methods, and that the model exhibits characteristic movement regularities observed in studies of pointing. We provide an open-source implementation which can be extended with further biomechanical models, perception models, and interactive environments.
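To make the control problem concrete, below is a minimal sketch of an RL environment with the structure the abstract describes: the policy outputs muscle-like activations (here, two antagonistic toy "muscles") rather than joint torques, observes the scene only through a rendered image plus proprioception rather than privileged target coordinates, and pays an effort cost for activation. This is a hypothetical illustration using the Gymnasium and Stable-Baselines3 APIs; all dynamics, names, and parameters are assumptions, not the paper's implementation.

```python
# Hypothetical sketch of the control-problem structure described in the
# abstract -- NOT the paper's implementation. A 1-DoF effector is driven by
# two antagonistic "muscle" activations toward a target that the policy can
# only perceive through a coarse rendered image.
import numpy as np
import gymnasium as gym
from gymnasium import spaces


class ToyMusclePointing(gym.Env):
    """Toy pointing task with muscle-style actuation and perceptual feedback."""

    def __init__(self, dt: float = 0.02):
        self.dt = dt
        # Actions: activations of two antagonistic muscles in [0, 1],
        # instead of a directly commanded joint torque.
        self.action_space = spaces.Box(0.0, 1.0, shape=(2,), dtype=np.float32)
        # Observations: a 1x32-pixel "retina" plus proprioception; the exact
        # target position is never exposed to the policy.
        self.observation_space = spaces.Dict({
            "vision": spaces.Box(0, 255, shape=(1, 32), dtype=np.uint8),
            "proprioception": spaces.Box(-np.inf, np.inf, shape=(2,), dtype=np.float32),
        })

    def _obs(self):
        img = np.zeros((1, 32), dtype=np.uint8)
        img[0, int((self.pos + 1.0) / 2.0 * 31)] = 128     # effector pixel
        img[0, int((self.target + 1.0) / 2.0 * 31)] = 255  # target pixel
        prop = np.array([self.pos, self.vel], dtype=np.float32)
        return {"vision": img, "proprioception": prop}

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.pos, self.vel, self.t = 0.0, 0.0, 0
        self.target = float(self.np_random.uniform(-1.0, 1.0))
        return self._obs(), {}

    def step(self, action):
        # Antagonistic activations yield a net force; squared activation is
        # penalized as metabolic effort, a hallmark of muscle-based control.
        force = float(action[0]) - float(action[1])
        self.vel += force * self.dt
        self.pos = float(np.clip(self.pos + self.vel * self.dt, -1.0, 1.0))
        self.t += 1
        dist = abs(self.pos - self.target)
        reward = -dist - 0.01 * float(np.sum(np.square(action)))
        terminated = dist < 0.05
        truncated = self.t >= 500
        return self._obs(), reward, terminated, truncated, {}


if __name__ == "__main__":
    # Any standard RL algorithm can be dropped in; PPO is used here purely as
    # an example of training from the dict (vision + proprioception) input.
    from stable_baselines3 import PPO

    model = PPO("MultiInputPolicy", ToyMusclePointing(), verbose=0)
    model.learn(total_timesteps=10_000)
```

Under these toy assumptions, the activation cost is what separates muscle-based from torque-based control: the policy must trade movement speed and endpoint accuracy against effort, which is the regime in which pointing regularities of the kind the paper reports can emerge.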