Friday, November 10, 2023

Cuddly AI

TED AI Hackathon 3rd place winner

This was a 2-day hackathon project building a cuddly AI toy powered by an LLM. I built the mechatronic system for animated ear movements, consisting of 6 animation types that indicate different affective and conversational states: neutral, excited, sorry, listening, thinking, and rest. These states are mapped dynamically from real-time analysis of the conversation content, so whatever action the toy decides on is further expressed through the movement of its ears.
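
A minimal sketch of what such a state-to-ear-motion mapping could look like (all names, angles, and timings below are hypothetical illustrations, not the actual hackathon code):

```python
from enum import Enum
from dataclasses import dataclass


class EarState(Enum):
    """The six affective / conversational states the toy can express."""
    NEUTRAL = "neutral"
    EXCITED = "excited"
    SORRY = "sorry"
    LISTENING = "listening"
    THINKING = "thinking"
    REST = "rest"


@dataclass
class EarAnimation:
    """A simple ear motion: target angle per ear (degrees) and an oscillation period."""
    left_angle: float
    right_angle: float
    period_s: float


# Hypothetical choreography table: each state gets its own ear pose and tempo.
ANIMATIONS = {
    EarState.NEUTRAL:   EarAnimation(90, 90, 2.0),
    EarState.EXCITED:   EarAnimation(150, 150, 0.4),
    EarState.SORRY:     EarAnimation(40, 40, 3.0),
    EarState.LISTENING: EarAnimation(120, 120, 1.0),
    EarState.THINKING:  EarAnimation(120, 60, 1.5),   # asymmetric "head-tilt" feel
    EarState.REST:      EarAnimation(30, 30, 4.0),
}


def state_from_label(label: str) -> EarState:
    """Map a label produced by the conversation classifier to an ear state."""
    try:
        return EarState(label.strip().lower())
    except ValueError:
        return EarState.NEUTRAL  # fall back to a calm default on unknown labels


def animate(label: str) -> EarAnimation:
    """Pick the animation that would drive the ear servos for the classified state."""
    return ANIMATIONS[state_from_label(label)]


if __name__ == "__main__":
    # e.g. the LLM pipeline tags the latest utterance as "excited"
    print(animate("excited"))
```

In a real build, the selected `EarAnimation` would be fed to whatever servo driver the hardware uses; the table-driven design keeps the affect-to-motion mapping easy to tune during testing.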

The importance of tangible and embodied interaction is the core focus of this project. As the only hackathon entry with a physical form factor, it made me feel that the version of the imminent future many of us imagine is entirely disembodied. As exciting as UI-less interaction with devices, data, and systems sounds, the current rush towards an intelligent-agent-driven future easily overlooks how we feel, think, and act through our bodies and emotions.

Team: Peter Fitzpatrick, Robyn Campbell, Sang Leigh, Joey Verbeke, Hiroaki Ozasa
