Browsing by Author "Mayer, Sven"
Now showing 1 - 3 of 3
Item (Open Access): Finger orientation as an additional input dimension for touchscreens (2019)
Authors: Mayer, Sven; Henze, Niels (Prof. Dr.)
Since the first digital computer in 1941 and the first personal computer in 1975, the way we interact with computers has changed radically. The keyboard is still one of the two main input devices for desktop computers, accompanied most of the time by a mouse or trackpad. However, interaction with desktop and laptop computers today makes up only a small percentage of our interaction with computing devices. Today, we mostly interact with ubiquitous computing devices, and while the first ubiquitous devices were controlled via buttons, this changed with the invention of touchscreens. The phone, as the most prominent ubiquitous computing device, relies heavily on touch as its dominant input mode. Through direct touch, users can interact with graphical user interfaces (GUIs): GUI controls can be manipulated by simply touching them. However, current touch devices reduce the richness of touch input to two-dimensional positions on the screen. In this thesis, we investigate the potential of enriching a simple touch with additional information about the finger touching the screen. We propose using the user's finger orientation as two additional input dimensions. We investigate four key areas which form the foundation for fully understanding finger orientation as an additional input technique. With these insights, we provide designers with the foundation to design new gesture sets and use cases which take finger orientation into account. First, we investigate approaches to recognize finger orientation input and provide ready-to-deploy models to recognize the orientation. Second, we present design guidelines for a comfortable use of finger orientation. Third, we present a method to analyze applications in social settings in order to design use cases with possible conversation disruption in mind.
Lastly, we present three ways in which new interaction techniques such as finger orientation input can be communicated to the user. This thesis contributes these four key insights to fully understand finger orientation as an additional input technique. Moreover, we combine the key insights to lay the foundation for evaluating every new interaction technique with the same in-depth methodology.

Item (Open Access): Mobilelogging: assessing smartphone sensors for monitoring sleep behaviour (2013)
Author: Mayer, Sven
This work deals with mobile devices, more specifically with Android smartphones. Smartphones are equipped with increasingly accurate sensors, whose data can be used to build a more precise picture of our behaviour. The current state of the art of these techniques is presented, and a worked example demonstrates in detail how the data recording works. A user study was carried out over 7 days to determine whether sleep behaviour can be analyzed from the smartphone's data. As an objective control, a commercial sleep measurement device was used.

Item (Open Access): Modeling distant pointing for compensating systematic displacements (2014)
Author: Mayer, Sven
People use gestures to give verbal communication more expression and also to replace speech entirely. One of the most concise and expressive gestures is the pointing gesture. Pointing gestures can already be observed in early childhood, where they are used to point at objects or people. Later, people use pointing gestures for more complex tasks, such as visualizing directions. Increasingly, pointing gestures are also used for interacting with computers; for example, gestures can be used to interact with a distant display without any input device. In this work, we investigated how people point at objects and how recognition accuracy can be improved in a gesture recognition system. We performed a user study in which participants had to point at projected targets.
These gestures were recorded as reference data with the help of a motion capture system. The study was carried out from several starting positions: participants were placed at distances of 2 and 3 meters from the pointing targets, and at each position they pointed at the targets both sitting and standing. From the recorded reference data we derived pointing vectors, each describing the direction in which the gesture is aimed. We generated these vectors from different body parts to show that, although there are different ways to construct them, they all behave the same. In the optimal case, such a vector would pass from the person directly through the object being pointed at, in this case the projected target. Through mathematical analysis we show that, averaged over several experiments and participants, a systematic deviation from this optimal vector can be detected. We specify models that can compensate for this systematic deviation: they shift the pointing vector by the average displacement between the optimal vector and the observed vectors. Consumer products such as the Kinect can be used to detect pointing gestures, and their recognition can be improved with the generated models.
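The compensation idea described in this last abstract, casting a pointing ray from body landmarks onto the display plane and then shifting the hit point by the measured systematic displacement, can be illustrated with a minimal sketch. All concrete values here (the shoulder and fingertip coordinates, the display plane, and the mean offset) are hypothetical assumptions for illustration, not data or models from the thesis.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a pointing ray with the display plane."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the display
    t = np.dot(plane_point - origin, plane_normal) / denom
    if t < 0:
        return None  # display lies behind the user
    return origin + t * direction

def compensate(hit, mean_offset):
    """Shift the raw hit point by the mean systematic displacement
    measured in a calibration study (illustrative values below)."""
    return hit - mean_offset

# Display: the plane z = 0, with its normal pointing toward the user.
plane_point = np.array([0.0, 0.0, 0.0])
plane_normal = np.array([0.0, 0.0, 1.0])

# Hypothetical motion-capture samples (meters): shoulder and fingertip
# define one possible pointing vector; other body parts work the same way.
shoulder = np.array([0.0, 1.4, 2.5])
fingertip = np.array([0.1, 1.3, 2.0])
direction = fingertip - shoulder
direction = direction / np.linalg.norm(direction)

raw_hit = ray_plane_intersection(fingertip, direction, plane_point, plane_normal)

# Hypothetical average displacement (e.g. users tend to point low and left).
mean_offset = np.array([-0.05, -0.12, 0.0])
corrected = compensate(raw_hit, mean_offset)
print(raw_hit[:2], corrected[:2])
```

In this sketch the raw ray hits the display at (0.5, 0.9) m and the model shifts it to (0.55, 1.02) m; in practice the offset would be estimated per condition (distance, sitting vs. standing) from the recorded reference data.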