Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-10555
|Authors:||Le, Huy Viet|
|Title:||Hand-and-finger-awareness for mobile touch interaction using deep learning|
|Abstract:||Mobile devices such as smartphones and tablets have replaced desktop computers for a wide range of everyday tasks. Virtually every smartphone incorporates a touchscreen, which enables intuitive interaction by combining input and output in a single interface. Owing to the success of touch input, a wide range of applications that were previously exclusive to desktop computers became available for mobile devices. This transition increased the mobility of computing devices and enables users to access important applications even while on the move. Despite the success of touchscreens, traditional input devices such as the keyboard and mouse remain superior due to their rich input capabilities. For instance, computer mice offer multiple buttons for different functions at the same cursor position, while hardware keyboards provide modifier keys that augment the function of every other key. In contrast, touch input is limited to the two-dimensional location of touches sensed on the display. These limited input capabilities slow down the interaction and pose a number of challenges that affect usability. Among others, shortcuts can hardly be provided, which affects experienced users and contradicts Shneiderman's golden rules for interface design. Moreover, using mostly one finger for input slows down the interaction, while further challenges such as the fat-finger problem and limited reachability add inconvenience. Although these input capabilities suffice for simple applications, more complex everyday tasks that require intensive input, such as text editing, are still not widely adopted on mobile devices. Novel touch-based interaction techniques are needed to extend the touch input capabilities and enable multiple fingers, and even parts of the hand, to perform input similar to traditional input devices. This thesis examines how individual fingers and other parts of the hand can be recognized and used for touch input. 
We refer to this concept as hand-and-finger-awareness for mobile touch interaction. By identifying the source of input, different functions and action modifiers can be assigned to individual fingers and parts of the hand. We show that this concept increases the touch input capabilities and solves a number of touch input challenges. In addition, by applying the concept of hand-and-finger-awareness to input on the whole device surface, previously unused fingers on the back of the device can now perform input and augment touches on the front. This further addresses well-known challenges in touch interaction and opens up a wide range of possibilities for realizing shortcuts. We present twelve user studies that focus on the design aspects, technical feasibility, and usability of hand-and-finger-awareness for mobile touch interaction. As a first step, we investigate hand ergonomics and behavior during smartphone use to inform the design of novel interaction techniques. Afterward, we examine the feasibility of applying deep learning techniques to identify individual fingers and other hand parts based on the raw data of a single capacitive touchscreen as well as of a fully touch-sensitive mobile device. Based on these findings, we present a series of studies that focus on bringing shortcuts from hardware keyboards to a fully touch-sensitive device to improve mobile text editing. In doing so, we follow a user-centered design process adapted for the application of deep learning. The contributions of this thesis range from insights into the use of different fingers and parts of the hand for interaction, through technical contributions for identifying the touch source using deep learning, to solutions that address the limitations of mobile touch input.|
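To make the idea of identifying the touch source from raw capacitive data more concrete, the following toy sketch illustrates the general pipeline. It is not the thesis's actual model or data: the trained deep network is stood in for by a simple nearest-centroid classifier over hand-crafted blob features, and all frames and labels below are synthetic assumptions for illustration only.

```python
# Illustrative sketch (NOT the thesis's pipeline): guess which finger
# produced a touch from a raw low-resolution capacitive frame.
# The deep network is replaced by a nearest-centroid classifier over
# simple blob features; frames and labels are synthetic.

import math

def blob_features(frame):
    """Extract simple features from a capacitive frame (2D list of
    sensor intensities): activated area, peak intensity, and the
    intensity-weighted spread around the centroid (contact size)."""
    area, peak, total = 0, 0, 0.0
    cx = cy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > 0:
                area += 1
                peak = max(peak, v)
                total += v
                cx += x * v
                cy += y * v
    cx, cy = cx / total, cy / total
    spread = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > 0:
                spread += v * math.hypot(x - cx, y - cy)
    return (area, peak, spread / total)

def classify(frame, centroids):
    """Return the finger label whose feature centroid is closest."""
    f = blob_features(frame)
    return min(centroids, key=lambda label: math.dist(f, centroids[label]))

# Synthetic frames: a thumb is assumed to leave a larger, stronger
# blob than an index finger (values chosen purely for illustration).
thumb_frame = [
    [0, 3, 5, 3, 0],
    [2, 7, 9, 7, 2],
    [3, 8, 9, 8, 3],
    [2, 6, 8, 6, 2],
    [0, 2, 4, 2, 0],
]
index_frame = [
    [0, 0, 0, 0, 0],
    [0, 2, 4, 2, 0],
    [0, 4, 6, 4, 0],
    [0, 2, 3, 2, 0],
    [0, 0, 0, 0, 0],
]

# "Trained" class centroids, here simply set from the examples above.
centroids = {
    "thumb": blob_features(thumb_frame),
    "index": blob_features(index_frame),
}

print(classify(thumb_frame, centroids))   # → thumb
print(classify(index_frame, centroids))   # → index
```

In the thesis itself, the hand-crafted features and nearest-centroid rule would be replaced by a network trained end-to-end on labeled capacitive frames, but the input/output contract is the same: a raw sensor frame in, a finger or hand-part label out.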
|Appears in Collections:||05 Fakultät Informatik, Elektrotechnik und Informationstechnik|
Items in OPUS are protected by copyright, with all rights reserved, unless otherwise indicated.