Touchscreen Typing As Optimal Supervisory Control
Touchscreen typing has traditionally been studied in terms of motor performance. However, recent research has exposed the decisive role of visual attention, which is shared between the keyboard and the text area. Attention strategies are known to adapt to the task, the design, and the user. In this paper, we propose a unifying account of touchscreen typing that regards it as optimal supervisory control. On this account, decisions about how to deploy visuo-motor resources are learned via exploration and made to maximise typing performance. We outline the theory and explain how visual and motor bounds shape this control problem. We then present a model, implemented via reinforcement learning, that simulates the coordination of eye and finger movements. Comparison with human data confirms that the model produces realistic, interlinked finger- and eye-movement patterns. We demonstrate the model's use in interface development by evaluating touchscreen keyboard designs.