Minecraft.me is a project that provides an enhanced, immersive gaming experience by connecting the user's physical activity to the virtual world.
We can make places for ourselves, but why not express our emotions?
Minecraft is a game about placing blocks to build anything you can imagine. However, while players can project their imagination onto the structures they create, their own identity is not reflected in their in-game avatars. This is mainly because the actions an avatar can perform are quite limited: walking, jumping, looking around, or swinging its hands. Plug-in software exists that lets users perform a couple of bonus actions with the avatar, but it adds only a small amount of extra controllability. What if your character could dance Gangnam Style?!
Enriching Gaming Experience with Physical Activities of a User
- Translate the user's body gestures and/or emotions into actions of the avatar in the game
- Analyze the user's appearance (e.g. clothes, skin, hat, or hair color) and reflect it on the avatar
- Provide auxiliary user controls to enable avatar gestures that carry a social meaning or play an instructive role
Windows PC + Minecraft Coder Pack (Eclipse) + Kinect (w/ OpenNI)
Translating Body Gestures into User Input
Imported the OpenNI library into Minecraft's decompiled main class and added Kinect input to the main event loop. The computed body motion is transferred to the user's model object, driving the avatar's motion in the game.
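The core of this step is mapping tracked skeleton joints to the avatar model's limb rotations. A minimal sketch of that mapping math is below; the class and method names are hypothetical (the actual OpenNI skeleton retrieval and Minecraft model classes are stubbed out), and the joint coordinates assume the Kinect's frame with x to the right, y up, and z increasing away from the sensor.

```java
// Hypothetical sketch: converting a tracked shoulder->elbow vector
// into the avatar's forward arm-swing angle. OpenNI joint lookup is
// omitted; only the geometry of the mapping is shown.
public class ArmMapper {
    // Returns the arm's forward swing angle in radians: 0 when the arm
    // hangs straight down, PI/2 when raised straight toward the sensor.
    // shoulder and elbow are {x, y, z} positions in the Kinect frame.
    public static double armSwingAngle(double[] shoulder, double[] elbow) {
        double dy = elbow[1] - shoulder[1]; // vertical component of the arm
        double dz = elbow[2] - shoulder[2]; // depth component (toward sensor is -z)
        return Math.atan2(-dz, -dy);        // angle from straight-down direction
    }

    public static void main(String[] args) {
        // Arm hanging straight down: elbow directly below the shoulder.
        double down = armSwingAngle(new double[]{0, 1.4, 2.0},
                                    new double[]{0, 1.1, 2.0});
        // Arm raised straight forward, pointing toward the sensor.
        double fwd = armSwingAngle(new double[]{0, 1.4, 2.0},
                                   new double[]{0, 1.4, 1.7});
        System.out.printf("down=%.3f fwd=%.3f%n", down, fwd); // down=0.000 fwd=1.571
    }
}
```

In the game loop, an angle like this would be written each tick into the player model's arm rotation field, so the avatar's arm follows the user's in near real time.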
Added an additional UI to the program, where users can adjust the color of the clothes the avatar wears to match their actual appearance.
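One way such a UI could suggest a default clothing color is by averaging pixels sampled from the user's torso region in the camera image. The helper below is a hypothetical sketch (not taken from the actual mod), operating on packed 0xRRGGBB integers as Java's `BufferedImage.getRGB` would supply.

```java
// Hypothetical sketch: estimating a dominant clothing color by averaging
// sampled camera pixels, packed as 0xRRGGBB ints. The result could seed
// the default tint for the avatar's shirt texture in the color-picker UI.
public class ClothesColor {
    public static int averageColor(int[] pixels) {
        long r = 0, g = 0, b = 0;
        for (int p : pixels) {
            r += (p >> 16) & 0xFF; // red channel
            g += (p >> 8) & 0xFF;  // green channel
            b += p & 0xFF;         // blue channel
        }
        int n = pixels.length;
        return (int) ((r / n) << 16 | (g / n) << 8 | (b / n));
    }

    public static void main(String[] args) {
        // Pure red, green, and blue average to a mid gray (0x555555).
        int avg = averageColor(new int[]{0xFF0000, 0x00FF00, 0x0000FF});
        System.out.printf("0x%06X%n", avg); // prints 0x555555
    }
}
```

Per-channel averaging is a deliberately simple choice here; a dominant-color histogram would be more robust to patterned clothing, at the cost of more code.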
May 2012 ~ Present