Sensor-based user interface concepts for continuous, around-device and gestural interaction on mobile devices

Description

A generally observable trend of the past ten years is that the number of sensors embedded in mobile devices such as smartphones and tablets has risen steadily. Arguably, the available sensors are mostly underutilized by existing mobile user interfaces. In this dissertation, we explore sensor-based user interface concepts for mobile devices with the goal of making better use of the available sensing capabilities of mobile devices, as well as gaining insights into the types of sensor technologies that could be added to future mobile devices. We are particularly interested in how novel sensor technologies could be used to implement engaging mobile user interface concepts. We explore three areas of interest for research into sensor-based user interface concepts for mobile devices: continuous interaction, around-device interaction and motion gestures.

For continuous interaction, we explore the use of dynamic state-space systems to implement user interfaces based on a continuous sensor data stream. In particular, we examine zoom automation in tilt-based map scrolling interfaces. We show that although fully automatic zooming is desirable in certain situations, adding a manual override of the zoom level (Semi-Automatic Zooming) increases the usability of such a system, as shown by a decrease in task completion times and improved user ratings in a user study. The presented work on continuous interaction also highlights how the sensors embedded in current mobile devices can be used to support complex interaction tasks.
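To give a concrete sense of how such a continuous, tilt-driven interface can be modelled, the following Python sketch couples scroll velocity to device tilt and derives a zoom level from the current scroll speed, while letting the user override the zoom manually. All names, constants and the update rule are illustrative assumptions; this is a minimal sketch, not the state-space formulation evaluated in the dissertation.

```python
# Minimal sketch of semi-automatic zooming for tilt-based map scrolling.
# Parameters and the update rule are hypothetical and chosen for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MapViewState:
    x: float = 0.0        # map position (arbitrary units)
    y: float = 0.0
    zoom: float = 1.0     # current zoom factor (1.0 = fully zoomed in)

def update(state: MapViewState, tilt_x: float, tilt_y: float,
           manual_zoom: Optional[float], dt: float) -> MapViewState:
    """Advance the view by one time step of length dt (seconds).

    tilt_x / tilt_y  -- device tilt in radians, read from the accelerometer
    manual_zoom      -- zoom level requested by the user, or None to let the
                        system derive it automatically from the scroll speed
    """
    GAIN = 400.0          # tilt-to-velocity gain (assumed)
    ZOOM_RATE = 2.0       # how quickly zoom follows its target (assumed)

    # Tilt maps to scroll velocity; stronger tilt scrolls faster.
    vx, vy = GAIN * tilt_x, GAIN * tilt_y
    speed = (vx * vx + vy * vy) ** 0.5

    # Automatic target: zoom out as scrolling gets faster, so the content
    # does not move past the screen faster than the user can track it.
    auto_target = 1.0 + speed / 200.0
    target = manual_zoom if manual_zoom is not None else auto_target

    # Smoothly approach the target zoom instead of jumping to it.
    zoom = state.zoom + ZOOM_RATE * (target - state.zoom) * dt

    # Scrolling in map units scales with the current zoom level.
    return MapViewState(state.x + vx * zoom * dt,
                        state.y + vy * zoom * dt,
                        zoom)
```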
We go on to introduce the concept of Around-Device Interaction (ADI). By extending the interactive area of the mobile device to its entire surface and to the physical volume surrounding it, we aim to show how the expressivity and possibilities of mobile input can be improved. We derive a design space for ADI and evaluate three prototypes in this context. HoverFlow is a prototype that allows coarse hand gesture recognition around a mobile device using only a simple set of sensors. PalmSpace is a prototype that explores the use of depth cameras on mobile devices to track the user's hands in direct manipulation interfaces through spatial gestures. Lastly, the iPhone Sandwich is a prototype supporting dual-sided pressure-sensitive multi-touch interaction. Through the results of user studies, we show that ADI can lead to improved usability for mobile user interfaces. Furthermore, the work on ADI contributes suggestions for the types of sensors that could be incorporated into future mobile devices to expand their input capabilities.
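To illustrate how coarse around-device gestures can be recognized from a simple set of sensors, the following Python sketch classifies hand sweeps from the peak timing of a few proximity sensors placed around the device. The sensor layout, threshold and timing heuristic are assumptions made for this example and do not reproduce the HoverFlow implementation.

```python
# Illustrative sketch: coarse around-device sweep detection from a small set
# of proximity sensors (e.g. IR distance sensors along the device edges).

from typing import Dict, List, Optional

ACTIVATION = 0.5   # normalized proximity counted as "hand present" (assumed)

def peak_time(samples: List[float]) -> Optional[int]:
    """Index of the strongest activation above the threshold, or None."""
    best = max(range(len(samples)), key=lambda i: samples[i])
    return best if samples[best] >= ACTIVATION else None

def classify_sweep(traces: Dict[str, List[float]]) -> str:
    """Classify a coarse hand sweep from per-sensor proximity traces.

    traces maps sensor positions ('left', 'right', 'top', 'bottom') to
    equally long lists of normalized samples over one gesture window.
    """
    peaks = {name: peak_time(t) for name, t in traces.items()}

    left, right = peaks.get("left"), peaks.get("right")
    top, bottom = peaks.get("top"), peaks.get("bottom")

    # A sweep passes one sensor before the opposite one: compare peak order.
    if left is not None and right is not None:
        return "sweep_right" if left < right else "sweep_left"
    if top is not None and bottom is not None:
        return "sweep_down" if top < bottom else "sweep_up"
    if any(p is not None for p in peaks.values()):
        return "hover"     # hand present, but no clear direction
    return "none"

# Example: the left sensor peaks before the right one -> rightward sweep.
window = {
    "left":   [0.1, 0.8, 0.9, 0.3, 0.1, 0.0],
    "right":  [0.0, 0.1, 0.3, 0.9, 0.8, 0.2],
    "top":    [0.0, 0.1, 0.1, 0.1, 0.0, 0.0],
    "bottom": [0.0, 0.0, 0.1, 0.1, 0.0, 0.0],
}
print(classify_sweep(window))   # -> "sweep_right"
```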
In order to broaden the scope of uses for mobile accelerometer and gyroscope data, we conducted research on motion gesture recognition. With the aim of supporting practitioners and researchers in integrating motion gestures into their user interfaces at early development stages, we developed two motion gesture recognition algorithms, the $3 Gesture Recognizer and Protractor 3D, that are easy to incorporate into existing projects, achieve good recognition rates and require little training data. To exemplify an application area for motion gestures, we present the results of a study on the feasibility and usability of gesture-based authentication. With the goal of making it easier to connect meaningful functionality with gesture-based input, we developed Mayhem, a graphical end-user programming tool for users without prior programming skills. Mayhem can be used for rapid prototyping of mobile gestural user interfaces.
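The following Python sketch illustrates the general template-matching approach shared by recognizers of this family: resample the recorded 3D trace to a fixed number of points, normalize its position and scale, and return the nearest stored template. It deliberately omits the rotation-invariance steps that distinguish the $3 Gesture Recognizer and Protractor 3D, and the resampling length and distance measure are assumptions chosen for illustration.

```python
# Simplified sketch of template-based 3D motion gesture recognition in the
# spirit of the $3 Gesture Recognizer / Protractor 3D (rotation handling omitted).

import math
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]
N = 32   # number of points after resampling (assumed)

def path_length(points: List[Point]) -> float:
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points: List[Point], n: int = N) -> List[Point]:
    """Resample a 3D trace to n equidistant points along its path."""
    interval = path_length(points) / (n - 1)
    out, acc = [points[0]], 0.0
    pts = list(points)
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = tuple(pts[i - 1][k] + t * (pts[i][k] - pts[i - 1][k]) for k in range(3))
            out.append(q)
            pts.insert(i, q)   # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:          # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def normalize(points: List[Point]) -> List[Point]:
    """Translate the trace to its centroid and scale it to unit size."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    cz = sum(p[2] for p in points) / len(points)
    centered = [(p[0] - cx, p[1] - cy, p[2] - cz) for p in points]
    scale = max(math.dist((0, 0, 0), p) for p in centered) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]

def recognize(trace: List[Point], templates: Dict[str, List[Point]]) -> str:
    """Return the name of the stored template closest to the input trace."""
    candidate = normalize(resample(trace))
    def distance(name: str) -> float:
        prepared = normalize(resample(templates[name]))
        return sum(math.dist(a, b) for a, b in zip(candidate, prepared)) / N
    return min(templates, key=distance)
```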
The main contribution of this dissertation is the development of a number of novel user interface concepts for sensor-based interaction. They will help developers of mobile user interfaces make better use of the existing sensing capabilities of mobile devices. Furthermore, manufacturers of mobile device hardware obtain suggestions for the types of novel sensor technologies that are needed to expand the input capabilities of mobile devices. This enables the implementation of future mobile user interfaces with increased input capabilities, more expressiveness and improved usability.
