Wednesday, September 12, 2012

Project Proposals

A portion of my list of prospective projects.


Contact based UI

PERVASIVE INTERACTION - contact-aware sensor interfaces (RFID/NFC readers and tags) attached to objects

Though 'approach-and-contact' is a natural action in non-verbal communication between people, its subtlety has so far made it difficult for computers to use as input. I am planning to apply sensor technologies to realize contact-based interaction. This will introduce peer-to-peer recognition and communication between users and devices, drastically improving the speed and responsiveness of user-computer interaction.

  • NPC in real life - bring life and story to natural objects
  • Fast Authentication - automatic authentication and multi-user recognition
  • Fast Tagging - putting items into wishlist by touching price tags
  • Affection UI - objects or people that automatically discover each other
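To make the idea concrete, here is a minimal sketch of how contact events could drive these interactions: each tag's unique ID is looked up and dispatched to an action. The UIDs, user names, and handlers below are hypothetical placeholders, not part of any existing system.

```python
# Minimal sketch: dispatch actions when an NFC/RFID tag UID is read.
# All UIDs and handlers below are hypothetical placeholders.

wishlist = []

def add_to_wishlist(item):
    """Fast Tagging: touching a price tag adds the item to a wishlist."""
    wishlist.append(item)
    return f"added {item}"

def authenticate(user):
    """Fast Authentication: touching a personal tag logs the user in."""
    return f"user {user} authenticated"

# Map each tag UID to the interaction it triggers.
TAG_ACTIONS = {
    "04:A2:19:B3": lambda: authenticate("alice"),
    "04:7F:E0:11": lambda: add_to_wishlist("coffee mug"),
}

def on_tag_contact(uid):
    """Called by the reader whenever a tag comes into contact range."""
    action = TAG_ACTIONS.get(uid)
    return action() if action else "unknown tag"

print(on_tag_contact("04:7F:E0:11"))  # contact with a price tag
print(wishlist)
```

The point of the design is that the physical touch itself carries both identity and intent, so no separate selection step is needed.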



Usable Combination of Eye-gaze and Hand Gesture

COMBINED INTERACTION -  gazing at display + user's hand gestures

Eye gaze and hand gesture are complementary in their roles: hand gestures convey the user's decision or intention, while the eyes are primarily related to selection or awareness. Combining these modalities would improve controllability and interaction speed compared with currently available input techniques.

  • Align UI - utilize the spatial alignment of eyes, hands, and UI elements for remote interaction
  • Shoot UI - aim with eyes and throw data with hands, as if throwing a basketball.
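The Align UI idea can be sketched geometrically: a target is selected when the eye, the hand, and a UI element fall on nearly the same line. The 3-D coordinates and angular threshold below are hypothetical values chosen only for illustration.

```python
import math

# Sketch of "Align UI": select the element best aligned with the
# ray from the eye through the hand. Coordinates are hypothetical,
# in meters, as if reported by an eye/hand tracker.

def angle_between(eye, hand, target):
    """Angle (radians) between the eye->hand ray and the eye->target ray."""
    v1 = [h - e for h, e in zip(hand, eye)]
    v2 = [t - e for t, e in zip(target, eye)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def select_aligned(eye, hand, elements, threshold=0.1):
    """Return the UI element aligned with the eye-hand ray, if any."""
    best = min(elements, key=lambda name: angle_between(eye, hand, elements[name]))
    return best if angle_between(eye, hand, elements[best]) < threshold else None

elements = {"tv": (0.0, 0.0, 3.0), "lamp": (2.0, 0.0, 3.0)}
eye = (0.0, 0.0, 0.0)
hand = (0.0, 0.0, 1.0)  # hand held straight toward the TV
print(select_aligned(eye, hand, elements))  # -> tv
```

A Shoot UI gesture could reuse the same alignment test, with the hand's velocity at release deciding when the "throw" fires.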


Enriching Gaming Experience with Non-verbal Communications

AUXILIARY INTERACTION - auxiliary interactions supported alongside the main controls
SOCIAL INTERACTION - socially meaningful actions in a collaborative environment

In a project called Minecraft.me, I am trying to project the user's non-verbal communication onto virtual avatars in a 3-D game. By hacking the game and connecting it to the Kinect interface, I was able to let players share socially meaningful or instructive gestures through their body actions. The next step of this exploration is to find out how an auxiliary channel for social and instructive expression, provided alongside the main controls, would contribute to collaboration in a virtual setting.

  • More control with head and hands - more controllability through head and hand motion
  • Instructive Gestures - how can we give instructive actions in virtual environment?
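As a rough illustration of how such gestures can be recognized from skeleton data, a single frame of joint positions can be labeled with simple rules. The joint names, coordinates, and thresholds below are hypothetical; a real pipeline would read the joints from the Kinect SDK each frame.

```python
# Sketch: classify a socially meaningful gesture from one frame of
# Kinect-style skeleton data. All joints/thresholds are hypothetical.

def classify_gesture(joints):
    """Label a single skeleton frame with a simple instructive gesture."""
    head_y = joints["head"][1]
    hand = joints["right_hand"]
    shoulder = joints["right_shoulder"]
    if hand[1] > head_y:
        return "wave"   # hand raised above the head
    if abs(hand[1] - shoulder[1]) < 0.1 and hand[0] - shoulder[0] > 0.4:
        return "point"  # arm extended sideways at shoulder height
    return "idle"

frame = {
    "head": (0.0, 1.7, 0.0),
    "right_shoulder": (0.2, 1.5, 0.0),
    "right_hand": (0.25, 1.9, 0.1),  # raised above the head
}
print(classify_gesture(frame))  # -> wave
```

The recognized label would then be mapped onto the avatar's animation, so other players see the gesture rather than raw joint data.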



Augmented, Connected, or Diminished Displays: extending out from 2D surfaces

I am exploring the future of displays in terms of their physical forms, the information they handle, and how they interact with the surrounding environment. So far, most effort has gone into their size and processing power, but displays will offer many more possibilities once their capabilities are pulled out from the boundary of 2-D screens.
  1. Augmented information - Find ways for displays to encompass higher-dimensional visual information, or even non-visual information.
  2. Connected displays - Explore new ways to extend the capabilities of displays for spatial collaboration, effectively filling our everyday places with digital data.
  3. Diminished displays - Displays are often overloaded with visual data. Stripped down to give only core information, displays can reach everywhere, providing data in a more ambient manner.



Interaction on a Deformable Display

COMBINED INTERACTION - (touch + deformation) on flexible touchscreen
The tremendous advancement of display technology has opened up new possibilities in user-computer interaction by enabling various input methods and high data throughput. As one future direction, I hope to support two-way transmission of tactile information, receiving input and giving feedback through physical forms that are reconfigurable or deformable.

  • Deforming input - Utilize deforming force as input
  • Shape-based visualization - Visualize information through the deformation of display
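A minimal sketch of the deforming-input idea: a flexible screen reports a bend angle, and the interaction layer maps that deformation onto an action. The sensor readings, thresholds, and action names below are hypothetical, chosen only to show the mapping.

```python
# Sketch of "deforming input": map a measured bend angle on a
# flexible touchscreen to a UI action. Thresholds are hypothetical.

def deformation_event(bend_degrees):
    """Map a bend angle (degrees; negative = backward) to a UI action."""
    if bend_degrees > 30:
        return "page_turn"       # a strong fold flips the page
    if bend_degrees > 10:
        return "peek_next_page"  # a light bend previews the next page
    if bend_degrees < -10:
        return "zoom_out"        # bending backward zooms out
    return "none"

for reading in [2, 15, 40, -20]:
    print(reading, deformation_event(reading))
```

The same mapping could run in reverse for shape-based visualization, with the display actuating a bend to encode a value.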



More Pieces of Fun

Tools for Making our Surroundings Programmable
This project aims to devise a platform and a feasible design for planting programmable sets of interaction in indoor environments. People will be provided with programming tools with which they can implement their own environment, connecting existing infrastructure such as lock systems, light bulbs, and home appliances.
  • Automatically control environmental conditions based on measurements (thermometer, hygrometer, magnetometer, etc.)
  • Easily install buttons or sensors to create an interactive and intelligent environment
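One way such a tool could work is as a small rule engine: users register "when sensor, then actuator" rules, and the platform evaluates them against live readings. The device names and thresholds below are hypothetical examples, not an actual API.

```python
# Sketch of a user-programmable environment: rules connect sensor
# readings to actuators such as lamps or appliances. Device names
# and thresholds are hypothetical.

rules = []

def when(sensor, predicate, device, command):
    """Register a rule: if predicate(reading of sensor), send command."""
    rules.append((sensor, predicate, device, command))

def evaluate(readings):
    """Run all rules against the latest sensor readings."""
    commands = []
    for sensor, predicate, device, command in rules:
        if sensor in readings and predicate(readings[sensor]):
            commands.append((device, command))
    return commands

# The user "programs" their room with two simple rules.
when("thermometer", lambda c: c > 26, "air_conditioner", "on")
when("light_level", lambda lux: lux < 50, "desk_lamp", "on")

print(evaluate({"thermometer": 28.0, "light_level": 200}))
# -> [('air_conditioner', 'on')]
```

Keeping rules as plain data like this is what would let end users, not just programmers, rewire their own rooms.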

Enhanced Music Activity via Nuance Extraction
Devising phrases and writing music scores have always been separate tasks for musicians, and much valuable musical information can be lost between them. For example, playing the electric guitar involves a variety of factors that influence the sound, such as picking dynamics, scratching, stroke speed, knob control, bending, vibrato speed, and muting. By incorporating sensor techniques and collecting more of this information, a musician's soulful performance could be captured far more faithfully.
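To show what extracting one such nuance might look like, here is a sketch that recovers picking dynamics from an amplitude signal: find note onsets, then measure how hard each note was struck. The "signal" is a hypothetical list of samples; a real system would add sensors for bends, muting, knob moves, and the rest.

```python
# Sketch: extract one playing nuance, picking dynamics, from a
# guitar signal. The sample values below are hypothetical.

def note_onsets(samples, threshold=0.5):
    """Indices where the amplitude first crosses the threshold upward."""
    onsets = []
    for i in range(1, len(samples)):
        if abs(samples[i]) >= threshold > abs(samples[i - 1]):
            onsets.append(i)
    return onsets

def picking_dynamics(samples, onsets, window=3):
    """Peak amplitude right after each onset: how hard the note was picked."""
    return [max(abs(s) for s in samples[i:i + window]) for i in onsets]

signal = [0.0, 0.1, 0.9, 0.7, 0.2, 0.1, 0.6, 0.5, 0.1, 0.0]
onsets = note_onsets(signal)
print(onsets)                          # -> [2, 6]: two note onsets
print(picking_dynamics(signal, onsets))  # -> [0.9, 0.6]
```

Per-note features like these could then be written into the score alongside the pitches, so the nuance survives transcription.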
