Peripheral Interaction: Embedding
HCI in Everyday Life
The increasing presence of computing technology in everyday life
creates opportunities as well as challenges for interaction design. One of
these challenges is the seamless integration of technology into our everyday
routines. Peripheral Interaction is based on the observation that, in everyday
life, many actions occur outside the focus of attention.
A Context Server to Allow
Peripheral Interaction for People with Disabilities
There is an increasing variety of services provided by local
machines, such as ATMs, vending machines, etc. These services are frequently
inaccessible to people with disabilities because they are equipped with rigid
user interfaces. The application of Ubiquitous Computing techniques allows
access to intelligent machines through wireless networks by means of mobile
devices. Smartphones can provide an excellent way to interact with ubiquitous
services that would otherwise be inaccessible. People with disabilities can
benefit from this type of interaction if they are provided with accessible
mobile devices that are well adapted to their characteristics and needs.
Within the INREDIS project, B. G., L. G. and J. A., at the Laboratory
of HCI for Special Needs, developed EGOKI, an automatic interface generator
for users with disabilities that creates adapted and accessible user
interfaces, which are downloaded to the user's device when she or he wants to
access a ubiquitous service. Peripheral interaction includes all the implicit
activities that are conducted to interact with an application.
A context server can contribute to peripheral interaction by
providing applications with valuable information that would otherwise have to
be explicitly requested from the user. The context provider also helps
developers make use of context in a simpler way. For instance, the context
server allows applications to select the most appropriate modality to interact
with a user whose communication is restricted, whether by a disability or by a
situational impairment: if the microphone detects that the local noise level
is too high, the application can avoid voice commands and prioritize text or
images; or, if the inertial sensors detect that the user is walking, driving
or riding a bicycle, touch input can be switched to voice input.
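As a rough illustration of this modality-selection logic, the sketch below shows how a client application might react to readings obtained from a context server. All class, method and threshold names (ContextSnapshot, selectOutputModality, the 70 dB noise value, and so on) are assumptions made for illustration; they do not reflect the actual context server API described in the reference.

```java
// Hypothetical sketch: an application querying context data and choosing
// an interaction modality. Names and thresholds are illustrative only.
public class ModalitySelectionSketch {

    enum Modality { VOICE, TEXT, TOUCH }

    enum MotionState { STILL, WALKING, DRIVING, CYCLING }

    // A minimal view of what a context server could expose to applications.
    record ContextSnapshot(double ambientNoiseDb, MotionState motion) {}

    // Output modality: avoid voice prompts when the environment is noisy.
    static Modality selectOutputModality(ContextSnapshot ctx) {
        final double NOISE_THRESHOLD_DB = 70.0; // assumed threshold, not from the paper
        return ctx.ambientNoiseDb() > NOISE_THRESHOLD_DB ? Modality.TEXT : Modality.VOICE;
    }

    // Input modality: switch touch input to voice while the user is on the move.
    static Modality selectInputModality(ContextSnapshot ctx) {
        return switch (ctx.motion()) {
            case WALKING, DRIVING, CYCLING -> Modality.VOICE;
            case STILL -> Modality.TOUCH;
        };
    }

    public static void main(String[] args) {
        ContextSnapshot ctx = new ContextSnapshot(82.0, MotionState.WALKING);
        System.out.println("output modality: " + selectOutputModality(ctx)); // TEXT
        System.out.println("input modality:  " + selectInputModality(ctx));  // VOICE
    }
}
```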
Examples of applications using the context server:
- Affective Interaction
- Smart Wheelchair
- Smart Traffic Lights
- Peripheral Interaction with EGOKI: EGOKI is a UI generator for ubiquitous services that takes the user's abilities, the device characteristics and the service functionalities into account to create an accessible UI (a simplified sketch of this adaptation idea follows the list)
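The following sketch only illustrates the general adaptation idea behind an EGOKI-style generator: matching abstract service elements to concrete resources according to a user profile. The types and selection rules (UserProfile, adapt, the resource choices) are hypothetical assumptions and not the actual EGOKI implementation, which is described in the reference below.

```java
import java.util.List;

// Illustrative-only sketch of ability-based UI adaptation.
public class UiGenerationSketch {

    enum Resource { AUDIO_PROMPT, TEXT_LABEL, PICTOGRAM, LARGE_BUTTON }

    // A minimal user profile: which abilities are available to the user.
    record UserProfile(boolean canSee, boolean canHear, boolean fineMotorControl) {}

    // One abstract interaction element exposed by the ubiquitous service.
    record AbstractElement(String id, String purpose) {}

    // Map an abstract element to a concrete resource matching the user's abilities.
    static Resource adapt(AbstractElement element, UserProfile user) {
        if (!user.canSee()) {
            return Resource.AUDIO_PROMPT;   // non-visual rendering for blind users
        }
        if (!user.fineMotorControl()) {
            return Resource.LARGE_BUTTON;   // larger targets for motor impairments
        }
        if (!user.canHear()) {
            return Resource.TEXT_LABEL;     // visual text instead of audio prompts
        }
        return Resource.PICTOGRAM;          // simple visual cue by default
    }

    public static void main(String[] args) {
        UserProfile blindUser = new UserProfile(false, true, true);
        List<AbstractElement> vendingMachine = List.of(
                new AbstractElement("select-product", "choose an item"),
                new AbstractElement("confirm-payment", "confirm the purchase"));
        for (AbstractElement element : vendingMachine) {
            System.out.println(element.id() + " -> " + adapt(element, blindUser));
        }
    }
}
```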
References:
B. G., L. G. and J. A. E., "A Context Server to Allow Peripheral Interaction," Laboratory of HCI for Special Needs, University of the Basque Country (UPV/EHU), Donostia, Spain, 2013.