Saturday, February 27, 2016

2016 WIRED INNOVATION FELLOWSHIP

Sang-won Leigh
MIT Media Lab, 2nd Year PhD
75 Amherst St, Cambridge, MA 02139
sangwon@media.mit.edu
+1-617-642-7996


The human desire to transform, to become an entity beyond our given biological limitations, has been a central driving force behind the collective and technological advancements we have achieved. Our lives are no longer separable from computers; we are always linked to networks of data and virtual worlds. On this segue from Homo sapiens to cyborg, I glimpse the future through prototypes of machine-driven evolution.

Computational symbionts

Towards that end, I am creating a robotic wearable that provides an extended body. Robotic joints worn on the wrist turn into extra fingers, so that a person acquires skills beyond what five fingers can offer, or performs "tri-manual" tasks with the machine joints. The device runs on a neural network that autonomously learns life-like control. Just as a human simulates actions in the mind, the robot conducts simulations in a virtual environment to teach itself a variety of handling strategies. This internal training is memorized in the neural network and later drawn upon to provide intelligent physical support to the wearer.
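The simulate-then-memorize loop described above can be sketched in highly simplified form: perturb a policy, evaluate it in a stand-in "virtual environment," and keep whatever improves simulated performance. Everything below (the toy reaching task, the linear policy, the `rollout` helper) is an illustrative assumption, not the actual system or its simulator.

```python
import numpy as np

# Toy stand-in for the virtual environment: a 1-D reaching task where the
# policy maps a target position to a joint command.
rng = np.random.default_rng(0)

def rollout(weights, targets):
    """Simulate reaching each target; return mean squared error (lower is better)."""
    commands = targets * weights[0] + weights[1]  # linear "policy"
    return float(np.mean((commands - targets) ** 2))

# Self-teaching loop: a minimal random-search analogue of training in simulation.
targets = rng.uniform(-1.0, 1.0, size=64)
weights = np.zeros(2)
best = rollout(weights, targets)
for _ in range(200):
    candidate = weights + rng.normal(scale=0.1, size=2)
    score = rollout(candidate, targets)
    if score < best:
        weights, best = candidate, score  # "memorize" the improved strategy
```

In the real system, the linear policy would be a neural network and the rollout would run in a physics simulator, but the structure of the loop, improvement measured entirely in simulation and retained in the policy's parameters, is the same idea.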


Video link password: sangmedia (work not yet publicized).

Extrapolating from this exploration, my advisor Pattie Maes and I at the Media Lab are shaping a new computing paradigm of "Human-Machine Symbiosis" that embraces recent advances in AI. This is a spiritual continuation of J. C. R. Licklider's vision of humans and machines collaborating synergistically, with repetitive mechanical tasks offloaded to machines. We envision that increasingly advanced AI, capable of computational behaviors and thoughts close to those of biological organisms, will enable far more radical human-machine integration. "The computer is incredibly fast, accurate, and stupid. Man is unbelievably slow, inaccurate, and brilliant." My ultimate goal is to investigate a future where infinitely accurate computing is internalized into the creative human.

Notably, there is ample evidence in neuroscience and evolutionary studies supporting the plausibility of machine-driven evolution. The opposable thumb and the skill of tool use preceded the growth in brain size of Homo sapiens, which signifies that altered corporeal capabilities affect how our brains function and develop. Nor is this limited to genetic change: research suggests that our internal body model updates given even temporary extensions of our sensorimotor capabilities; for example, a person can be made to sensorially perceive a virtual third arm as their own.

Rethinking interfaces

In my past two years at the MIT Media Lab, I have been crafting experimental (and somewhat magical) computer interfaces built on physical intelligence. My philosophy has been to engage users in an intimate corporeal relationship with digital entities, rather than having them swipe on flat screens or unknowingly wave their hands in the air. This journey of rethinking interfaces led me toward physically and cognitively coupled humans and computers.



THAW explores the possibility of physically intervening in computer screens. Using a novel near-surface tracking technology, one can hold and move a smartphone over a screen and, for example, directly interact with a game character that jumps into, or on top of, the phone. Cord UIs are sensor-augmented cables that let one manipulate the digital data flowing through them with direct touch: pinching a headphone cable temporarily pauses the music, and tightening a knot dims a lamp.

How can computers themselves operate with physical intelligence? AfterMath is an AI that understands physical environments by running physics simulations in a virtual copy of the environment, letting users see physics-induced consequences through AR. L'evolved is a Harry Potter-esque concept of smart flying objects that understand what a person needs and work in close tandem with them.



Remnance of Form is an art installation in which the shadow of an object transforms and displays human character. It is shy, disappearing when approached, or turns hostile and spiky. As a metaphor for the inner self, the shadow is designed to mirror viewers' inner emotions. The vignette "Dream of Flying" highlights the human desire to transcend physical limits and defy gravity: a flying shadow is perceived as a distant wonder at first sight, yet one soon realizes it is the dream inhabiting our deepest selves.

These demonstrations of intimate computers demolish the border between the physical and digital worlds, joining humans and computers through rich touch points. The question, then, is what comes after interface design. What if, instead of creating computational worlds to dive into, we made computers that are extensions of ourselves, and began to design ourselves, beyond interfaces?

From interface design to evolution design

In 2011, I led the development of eyeCan, a highly customizable, open-source eye tracker. My team collaborated with a Korean government agency to distribute eyeCan to 200 people with motor disabilities. The main aspiration behind this project was to democratize assistive technology and help people gain capabilities beyond their physical challenges.

In retrospect, the project exemplifies a long-standing tradition of "evolution design." Computer interfaces are increasingly designed not only for operating software but for augmenting the bodies of their users. This shift of focus towards augmenting humans raises the question of how to design the "users" themselves and, ultimately, their evolution.



In the artwork A Flying Pantograph, I am designing an evolution in creative expression with drone technology. A drone becomes an "expression agent," carrying out a scaled-up process of drawing on a remote canvas. More than mechanically extending the human artist, the drone plays a crucial part in the expression, as its own motion dynamics add a unique visual language to the art. The drone battling the vortex of air, with the suspense of an impending crash, poses a contrast to the optimistic idea of technologically evolved human artistry.

Like this risk of crashing, does the idea itself pose a risk to human nature? It could lead to a dehumanized future of substituting lifeless machines for human physiology, or of overusing virtual reality. This is why I am investigating the language of evolution design in the form of symbiosis, not replacement. If unique human nature is a prime number, the goal should be not to deconstruct what we are, but to construct a larger prime through the combination of humans and machines, leading to a species yet more unique, diverse, sophisticated, and creative.


Footnote
- Information on all projects and related publications, awards, and media coverage is available at www.sangww.net
- The robotic wearable will be presented at CHI 2016, the most prestigious academic conference in the field of Human-Computer Interaction. The paper received a Best Paper Honorable Mention, given to the top 4% of submissions
- THAW was a finalist in the 2015 Fast Company Innovation by Design Awards, and was selected for Gizmodo's "The 7 Most Important UI and UX Ideas of 2014"
- Cord UIs won the Students Category of the 2015 Fast Company Innovation by Design Awards
- Remnance of Form was created during my artist-in-residence internship at Microsoft Research. It was exhibited at Microsoft Research Studio 99 and at top academic conferences in the field (CHI, TEI)
- The eyeCan project gave birth to Samsung Electronics' C-LAB, an in-house innovation lab where employees can self-initiate projects with socio-technical impact. My team also received the Samsung Social Responsibility Award in 2012



Bio


Exploring the gap between the physical and the digital, and between humans and machines, Sang-won is passionate about pushing the envelope of human-computer interaction. His ultimate goal is to reconstruct our notions of reality and evolution by turning imagination into corporeal form. His creations include THAW, a magic-lens UI that combines smartphones and computer screens; Remnance of Form, a shadow art installation in which a shadow exhibits magical behaviors; Cord UIs, which reinvents cords as UI elements; and on-body robotics that give humans extra machine body parts. His research and art have been presented at prestigious academic conferences (CHI, UIST, TEI, UbiComp), TEDx events, SXSW, and in major media (BBC, WIRED, Fast Company, Engadget, Huffington Post, etc.). His current focus is on the integrated human-machine body, aspiring to redefine our selves and our image of the self.

Before joining the MIT Media Lab, he was a software engineer at Samsung Electronics, where he led the software development of eyeCan, an open-source DIY eye-mouse designed for people with motor disabilities. The project was an integral contribution to the birth of Samsung's C-LAB. It was covered by major newspapers in Korea, and he was invited to speak at TEDx events, the Seoul Digital Forum, and the Tech Plus Forum. He received his Bachelor's and Master of Science degrees from the Korea Advanced Institute of Science and Technology (KAIST), concentrating on 3D computer vision and machine learning.

He is now a PhD student in the Fluid Interfaces group at the MIT Media Lab, working with Pattie Maes.

Selected Awards

Body Integrated Programmable Joints Interface - CHI 2016 Best Paper Honorable Mention. 2016
Cord UIs - Winner, Students Category, 2015 Fast Company Innovation by Design Awards. Sep 14 2015
THAW - Finalist, Experimental Category, 2015 Fast Company Innovation by Design Awards. Sep 14 2015
THAW - Gizmodo, The 7 Most Important UI and UX Ideas of 2014, Dec 30 2014
Samsung Social Responsibility Award, Samsung Electronics Dec 2012

Selected Publications

S. Leigh, and P. Maes. Body Integrated Programmable Joints Interface, CHI 2016 (Best Paper Honorable Mention - 4%), 2016
*H. Agrawal, *S. Leigh, and P. Maes. L'evolved: autonomous and ubiquitous utilities as smart agents, Ubicomp 2015, 2015 [PDF]
S. Leigh, P. Schoessler, F. Heibeck, P. Maes, and H. Ishii. THAW: Tangible Interaction with See-Through Augmentation for Smartphones on Computer Screens, TEI 2015, 2015 [PDF]
P. Schoessler, S. Leigh, K. Jagannath, P. van Hoof, and H. Ishii. Cord UIs: Controlling Devices with Augmented Cables, TEI 2015, 2015 [PDF]

Selected Exhibitions

TEI 2016 Art Exhibition - A Flying Pantograph, Feb 14 - 17 2016
CHI 2015 Interactivity / Art.CHI 2015 - Remnance of Form, Apr 21 - 23 2015
TEI 2015 Art Exhibition - Remnance of Form, Jan 17 - 18 2015
Microsoft Research Studio 99 - Remnance of Form, Aug 22 - Sep 16 2014
