EDITalk: Towards Designing Eyes-free Interactions for Mobile Word Processing

Authors

Debjyoti Ghosh, Pin Sym Foong, Shengdong Zhao, Di Chen, Morten Fjeld

Paper

EDITalk: Towards Designing Eyes-free Interactions for Mobile Word Processing

Abstract

We present EDITalk, a novel voice-based, eyes-free word processing interface. We used a Wizard-of-Oz elicitation study to investigate the viability of eyes-free word processing in the mobile context and to elicit user requirements for such scenarios. Results showed that users desire meta-level operations like highlight and comment, and core operations like insert, delete, and replace. However, users were challenged by the lack of visual feedback and the cognitive load of remembering text while editing it. We then studied a commercial-grade dictation application and discovered serious limitations that preclude comfortable speak-to-edit interactions. We address these limitations through EDITalk’s closed-loop interaction design, enabling eyes-free operation of both meta-level and core word processing operations in the mobile context. Finally, we discuss implications for the design of future mobile, voice-based, eyes-free word processing interfaces.

Shen

Shen is an HCI professor at the National University of Singapore working on realizing his vision of HeadsUp Computing, a new interaction paradigm that can transform the way we live and interact with computers. In his free time, Shen loves to read, run, spend time with family and friends, and explore nature.
