BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Spatial Tactile Feedback Support for Mobile Touch-screen Devices -
  Koji Yatani\, University of Toronto
DTSTART:20110408T094000Z
DTEND:20110408T104000Z
UID:TALK30647@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:Mobile touch-screen devices can accept flexible touch input an
 d provide larger screen space than mobile devices w
 ith physical buttons. However\, current user interfaces on mobile touch-sc
 reen devices heavily use visual feedback. This raises a number of user int
 erface challenges. For instance\, visually-demanding user interfaces make 
 it difficult for the user to interact with mobile touch-screen devi
 ces without looking at the screen\, a task the user sometimes wishe
 s to do\, particularly in a mobile setting. In addition\, user inter
 faces on a mobile touch-screen device are not generally accessible t
 o visually-impaired users.\n
 \nI have been working on addressing this high visual demand issue found in
  existing user interfaces on mobile touch-screen devices by using spatial 
 tactile feedback. I developed tactile feedback hardware employing multiple
  vibration motors in different locations on the backside of a mobile touch
 -screen device. This spatial arrangement allows the interface to produce v
 arious spatial vibration patterns on the user’s fingers and palm. I then
  developed systems with the spatial tactile feedback designed for eyes-fre
 e interaction\, interfaces for the visually-impaired\, and remote collabor
 ation\, and validated the effects of the spatial tactile feedback.\n
LOCATION:Small lecture theatre\, Microsoft Research Ltd\, 7 J J Thomson Av
 enue (Off Madingley Road)\, Cambridge
END:VEVENT
END:VCALENDAR
