Enabling Always-Available Input with Muscle-Computer Interfaces

T. Scott Saponas¹, Desney S. Tan², Dan Morris², Ravin Balakrishnan⁴, Jim Turner³, James A. Landay¹

¹Computer Science and Engineering, DUB Group, University of Washington
{ssaponas, landay}@cs.washington.edu
²Microsoft Research, ³Microsoft Corporation
{desney, dan}@microsoft.com
⁴Department of Computer Science, University of Toronto

ABSTRACT
Previous work has demonstrated the viability of applying offline analysis to interpret forearm electromyography (EMG) and classify finger gestures on a physical surface. We extend those results to bring us closer to using muscle-computer interfaces for always-available input in real-world applications. We leverage existing taxonomies of natural human grips to develop a gesture set covering interaction in free space even when hands are busy with other objects. We present a system that classifies these gestures in real-time and we introduce a bi-manual paradigm that enables use in interactive systems. We report experimental results demonstrating four-finger classification accuracies averaging 79% for pinching, 85% while holding a travel mug, and 88% when carrying a weighted bag. We further show generalizability across different arm postures and explore the tradeoffs of providing real-time visual feedback.

ACM Classification: H.1.2 [User/Machine Systems]; H.5.2 [User Interfaces]: Input devices and strategies; B.4.2 [Input/Output Devices]: Channels and controllers
General terms: Design, Human Factors
Keywords: Electromyography (EMG), Muscle-Computer Interface, input, interaction.

INTRODUCTION
Our hands and our ability to control them have evolved over thousands of years, yielding an amazing ability to precisely manipulate tools. As such, we have often crafted our environments and technologies to take advantage of this ability. For example, many current computer interfaces require manipulating physical devices such as keyboards, mice, and joysticks. Even future-looking research systems often focus on physical input devices [5]. However, as computing environments become more diverse, we often find ourselves in scenarios where we either cannot, or prefer not to, explicitly interact with a physical device in hand.

Previous work has explored hands-free and implement-free input techniques based on a variety of sensing modalities. For example, computer vision enables machines to recognize faces, track movement and gestures, and reconstruct 3D scenes [24]. Similarly, speech recognition allows for hands-free interaction, enabling a variety of speech-based desktop and mobile applications [8, 11]. However, these technologies have several inherent limitations. First, they require observable interactions that can be inconvenient or socially awkward. Second, they are relatively sensitive to environmental factors such as light and noise. Third, in the case of computer vision, sensors that visually sense the environment are often susceptible to occlusion.

We assert that computer input systems can leverage the full bandwidth of finger and hand gestures without requiring the user to manipulate a physical transducer. In this paper, we show how forearm electromyography (EMG) can be used to detect and decode human muscular movement in real time, thus enabling interactive finger gesture interaction. We envision that such sensing can eventually be achieved with an unobtrusive wireless forearm EMG band (see Figure 1).

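To make the shape of such a pipeline concrete, the sketch below shows one plausible way to classify finger gestures from a multi-channel forearm EMG stream in real time: per-channel RMS amplitude features over short windows feeding a standard classifier. The channel count, window length, feature choice, and SVM model are illustrative assumptions for this excerpt, not the specific system described in the paper.

# Hypothetical sketch of a real-time forearm EMG gesture classifier.
# Window length, feature choice (per-channel RMS amplitude), and the
# SVM classifier are illustrative assumptions, not the paper's exact pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

N_CHANNELS = 8        # assumed number of EMG electrodes on the forearm band
WINDOW_SAMPLES = 256  # assumed samples per classification window

def rms_features(window):
    """Root-mean-square amplitude per channel for one window
    of shape (WINDOW_SAMPLES, N_CHANNELS)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def train(windows, labels):
    """Fit a classifier on labeled calibration windows (one label per finger gesture)."""
    X = np.vstack([rms_features(w) for w in windows])
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    model.fit(X, labels)
    return model

def classify_stream(model, window_source):
    """Classify each incoming window as it arrives; window_source yields
    (WINDOW_SAMPLES, N_CHANNELS) arrays from the (hypothetical) EMG band."""
    for window in window_source:
        yield model.predict(rms_features(window).reshape(1, -1))[0]

In this framing, the difference between offline and online use is mainly whether classify_stream is fed a recorded dataset or live windows from the sensing band.
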
Previous work exploring muscle-sensing for input has primarily focused either on using a single large muscle (rather than the fingers) [2, 3, 4, 22, 25], which does not provide the breadth of input signals required for computer input, and/or on situations where the hand and arm are constrained to a surface [3, 4, 15, 21, 23, 25], which is not a realistic usage scenario for always-available input devices. Saponas et al. [18] demonstrated the feasibility of using offline machine learning techniques to interpret forearm muscle-sensing and classify finger gestures on a surface. We extend their offline classification results to achieve online classification.

Figure 1. Artist rendering of a forearm muscle-sensing band.
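
The excerpt ends before describing how per-window predictions become interactive input, so the following is only a generic illustration of one way offline-style, per-window labels can be turned into an online event stream: a short majority vote over recent windows suppresses spurious label flips before a gesture is reported. The vote length, rest label, and event logic are assumptions, not the paper's bi-manual activation scheme.

# Hypothetical illustration of moving from per-window labels to an online
# decision stream: a simple majority vote over the last few windows
# suppresses spurious flips before a gesture event is reported. The vote
# length and event logic are assumptions, not the paper's method.
from collections import Counter, deque

def online_gesture_events(window_labels, vote_len=5, rest_label=0):
    """Yield a gesture label once when it wins the majority vote,
    then wait for a return to 'rest' before firing again."""
    recent = deque(maxlen=vote_len)
    active = rest_label
    for label in window_labels:
        recent.append(label)
        winner, count = Counter(recent).most_common(1)[0]
        if count > vote_len // 2 and winner != active:
            active = winner
            if winner != rest_label:
                yield winner  # report a new gesture event

For example, feeding this the label stream produced by the classify_stream sketch above would emit one event per sustained gesture rather than one per window, which is the kind of behavior an interactive system needs.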