Welcome

As a Human-Computer Interaction researcher, my focus is on creating novel interaction techniques and non-visual multimodal feedback. Some of my most recent research has focused on haptic interpersonal communication, user interface optimisation and smart material shape-changing interfaces.

Current Research

I am currently a research fellow at the Aalto Science Institute and the User Interfaces group, where my work focuses on multimodal interaction, interface design for users with impairments, and novel interaction techniques in general. Some of my recent projects are listed below.

ForcePhone

Haptic Interpersonal Communication

During the Haptic Emotional Interaction project, we created ForcePhone, a mobile synchronous haptic communication system. Through lab and real-world studies, we documented concrete design decisions and experiences in augmented interpersonal communication.

ABICOMM

Ability-Based Multimodal Communication Using Biomechanical and Neurophysiological Measurements

This Academy of Finland project investigates text entry performance models and enhanced word prediction in an attempt to improve current augmentative and alternative communication (AAC) methods.

Multi-Touch Gestures

The Performance and Ergonomics of Rotation and Pinch Gestures

Gestures such as pinch and rotate involve some of the most complex motor actions among common multi-touch gestures, yet little is known about the factors affecting their performance and ergonomics. This work involved several studies in which the angle, direction, diameter, and position of rotation and pinch gestures were systematically manipulated. We found surprising interaction effects among the variables and identified whole categories of gestures that are slow and cumbersome for users.

The Haptic Puck

The puck is a haptic device that provides multiple types of tactile feedback to allow a more natural approach to graph exploration for people with visual impairments. Our aim is to contribute to accessibility with an easily reproducible and affordable device.

Flexible Interfaces

Flexible interfaces are display surfaces made of malleable materials that can change into and retain arbitrary shapes so as to display output from the system or afford new actions. Sensing touch input and providing informative feedback with these interfaces can be very complicated. Some of my current work aims to increase the bandwidth of input and output of flexible interfaces.

Smart Materials

Smart materials are crucial for the development of more expressive, flexible, and robust devices, but we do not yet know which types are most suitable or how to construct effective interfaces with them. Most current HCI research tries to mimic smart materials through mechanical implementations based on non-smart materials. These tend to be difficult to construct, power-hungry, and unreliable. Most smart material research takes place in Physics, Chemistry, and Engineering labs, and the materials are rarely applied to interactive devices despite their potential to radically change the nature of human-computer interaction. I’m working on bridging this gap so that smart materials are readily available to researchers outside Chemistry and Engineering labs, allowing them to concentrate on more sophisticated and robust implementations of shape-changing interfaces using these materials.

Crossmodal Interaction

My PhD thesis, Crossmodal Interaction with Audio and Tactile Mobile Touchscreen Displays, asserts that using crossmodal auditory and tactile displays can help to reduce problems experienced by mobile device users. By using the auditory and tactile modalities in a crossmodal manner, information may be presented to the most appropriate modality given the situation. Whether the feedback indicates a simple button press or a complex email alert, as the user’s context changes so should the modality. My studies focused on amodal attributes that may be manipulated in both the auditory and tactile modalities, and applications which exploit crossmodal interaction.
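As a rough illustration of this idea (a hypothetical TypeScript sketch, not code from the thesis; the type names, thresholds, and rules below are invented), the same message can be described by amodal attributes such as rhythm and then routed to whichever modality best suits the current context:

// Hypothetical sketch: a crossmodal icon is defined by amodal attributes that
// are shared between its audio and tactile versions, and is routed to the
// modality most likely to be perceived in the user's current context.
type Modality = "audio" | "tactile";

interface CrossmodalIcon {
  label: string;        // e.g. "new email"
  rhythmMs: number[];   // on/off durations reused by both the audio and tactile versions
}

interface UserContext {
  silentMode: boolean;     // the user has muted audio output
  inPocket: boolean;       // the device is out of sight, resting against the body
  ambientNoiseDb: number;  // estimated background noise level
}

// Pick whichever modality is most likely to be perceived right now.
function chooseModality(ctx: UserContext): Modality {
  if (ctx.silentMode) return "tactile";           // audio is unavailable
  if (ctx.inPocket) return "tactile";             // vibration is felt, sound is muffled
  if (ctx.ambientNoiseDb > 75) return "tactile";  // loud surroundings mask audio
  return "audio";
}

function present(icon: CrossmodalIcon, ctx: UserContext): void {
  const modality = chooseModality(ctx);
  // Rendering is omitted; the point is that icon.rhythmMs is reused unchanged,
  // whether it drives a sequence of tones or a sequence of vibration pulses.
  console.log(`present "${icon.label}" via ${modality}:`, icon.rhythmMs);
}

// Example: with the phone in a pocket, the icon is delivered as vibration.
present({ label: "new email", rhythmMs: [60, 40, 60] },
        { silentMode: false, inPocket: true, ambientNoiseDb: 55 });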

Haptic Feedback

According to ISO 9241-910, the term haptic means “sensory and/or motor activity based in the skin, muscles, joints and tendons”. The majority of information displayed by computers is presented in visual form. This means information can be missed because of visual overload or because the user is not looking in the right place at the right time. It also means that some physical aspects of digital objects, such as weight or texture, cannot be displayed accurately. By augmenting visual interfaces with haptic feedback, we can mimic the physical sensation of pressing a button or holding an object, or even create entirely new touch stimuli.
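As a small example of what this can look like in practice (a sketch using the standard Web Vibration API, navigator.vibrate, rather than any particular research prototype), a touchscreen button can be given a physical-feeling click:

// Rough sketch: approximate the click of a physical key with short vibration
// pulses via the Web Vibration API; this is a no-op where no actuator exists.
function attachHapticClick(button: HTMLButtonElement): void {
  const vibrate = (pattern: number | number[]): void => {
    if ("vibrate" in navigator) {
      navigator.vibrate(pattern);
    }
  };
  button.addEventListener("pointerdown", () => vibrate(10));         // key-down tick
  button.addEventListener("pointerup", () => vibrate([5, 20, 15]));  // key-up confirmation
}

// Usage: attachHapticClick(document.querySelector("button")!);

The pattern values above are arbitrary placeholders; in practice they would be tuned (and, on richer actuators, replaced by waveforms) to match the sensation being mimicked.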

MobiMood

Whilst working at Telefónica I+D with Karen Church, I investigated the role of mood and emotions in the HCI domain. We developed MobiMood, a proof-of-concept social mobile application that enables groups of friends to share their moods with each other.

Teaching

There are a variety of courses each year at the Department of Computer Science at the University of Helsinki and in the Computer Science, Electrical Engineering, and Design Departments at Aalto University.

User Interfaces

This course focuses on model-based user interface design. The idea is to derive design solutions by analysis, simulation, or optimization.
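For example (my own illustration, not necessarily the course material), a predictive model such as Fitts' law lets you compare candidate layouts analytically before testing them with users:

// Fitts' law (Shannon formulation): MT = a + b * log2(D / W + 1).
// The constants a and b are device- and user-dependent; the values below are
// placeholders, not measured parameters.
const a = 0.1;   // intercept, in seconds
const b = 0.15;  // slope, in seconds per bit

// Predicted movement time (s) to hit a target of width w at distance d (same units).
function movementTime(d: number, w: number): number {
  return a + b * Math.log2(d / w + 1);
}

// Design by analysis: at the same distance, the wider target is predicted to be faster.
console.log(movementTime(300, 40).toFixed(2));  // narrow target
console.log(movementTime(300, 80).toFixed(2));  // wide target, lower predicted time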

Experimental UIs

This course provides an overview of novel and unconventional HCI techniques, including touch, gestures, and physical sensors.

HCI Helsinki

We are an active and interactive research community conducting cutting-edge research and teaching on “hot” HCI topics.

Publications

If a PDF of a paper is not available, please contact me.

Conference Papers


Contact Me

Eve Hoggan

Aalto Science Institute

Aalto University, Finland 00076

first.last@aalto.fi

