Built as a simple tech demonstration, this interactive experience uses an Oculus Rift and a Leap Motion controller to build molecules and play with atoms.
For this experimental project I served as the UX designer, creating wireframes and a generic graphic set.
Use a Leap Motion controller to create an educational experience where students can build molecules by grabbing atoms.
Leap Motion is a sensor that detects hand and finger positions and orientation in real time. The initial assignment was to create a tabletop implementation with a large screen in front of the user, as seen in my wireframe on the left. So my first step was to become very familiar with the Leap by reading the documentation, playing with their various applications, and identifying which detectable gestures we could utilize. Below is my wireframe of those gestures.
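To give a feel for how gestures are derived from that data, here is a minimal sketch of pinch detection built only on fingertip positions. This is not the Leap SDK or the project's actual code; the coordinates and the 30 mm threshold are illustrative assumptions.

```python
import math

def distance(a, b):
    # Euclidean distance between two 3-D points (x, y, z), in millimeters.
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold_mm=30.0):
    # Treat the hand as pinching when the thumb and index fingertips
    # are closer together than the threshold (hypothetical value).
    return distance(thumb_tip, index_tip) < threshold_mm

# Hypothetical frame data in millimeter coordinates:
print(is_pinching((0, 150, 0), (10, 155, 5)))   # fingertips nearly touching -> True
print(is_pinching((0, 150, 0), (80, 150, 0)))   # hand open -> False
```

A real implementation would read per-frame fingertip positions from the tracking API and debounce the result across frames, but the core test is just a distance check like this.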
I set out thinking about how we might select atoms to build our molecules, and my mind went to a wall-sized periodic table. That ended up being too complicated to maneuver, so it was scaled down to a small selection of atoms, and at the creative director's direction they became ping-pong balls labeled with their atomic symbols. From there I worked out how the workspace should be arranged and what information should be displayed. My first wireframe pass is below.
Using this as the base, I wireframed the functionality of grabbing the atomic ping-pong balls and bringing them into the circle, where they snap together to form the molecule.
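The snap-together behavior can be sketched as a simple proximity check: when an atom is released near an open bond site, it jumps to that site; otherwise it stays where it was dropped. This is a minimal illustration, not the project's Unity code, and the snap radius and coordinates are hypothetical.

```python
def nearest_bond_site(atom_pos, bond_sites, snap_radius=0.5):
    # Return the closest open bond site within snap_radius, else None.
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    scored = sorted((dist(atom_pos, s), s) for s in bond_sites)
    if scored and scored[0][0] <= snap_radius:
        return scored[0][1]
    return None

def release_atom(atom_pos, bond_sites):
    # On release, snap to the nearest site or leave the atom where it fell.
    site = nearest_bond_site(atom_pos, bond_sites)
    return site if site is not None else atom_pos

sites = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)]
print(release_atom((0.8, 0.1, 0.0), sites))  # within radius: snaps to (1.0, 0.0, 0.0)
print(release_atom((0.0, 3.0, 0.0), sites))  # too far: stays at (0.0, 3.0, 0.0)
```

In an engine like Unity this same logic would typically live in a trigger-collider callback, with the snap animated rather than instantaneous.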
After discussing with the developers and spending several days with the device and its latest Orion drivers, it became apparent that Leap Motion's direction was toward virtual reality, so we modified the project to exist in VR. This resulted in some slight wireframe changes, like adding target circles, giving menus z-depth, and adding a floor for bouncing physics fun.
We also wanted to be able to view the molecule from different angles, so I created the wireframe below, which details how someone might grab and spin the object.
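The grab-and-spin interaction comes down to mapping hand movement while grabbing to a rotation of the molecule. Here is a minimal sketch of that mapping for yaw only; the 0.5-degrees-per-millimeter sensitivity is a made-up tuning value, not anything from the actual project.

```python
def spin_angle(prev_hand_x, curr_hand_x, degrees_per_mm=0.5):
    # Map horizontal hand displacement (mm) while grabbing
    # to a yaw rotation delta (degrees). Sensitivity is hypothetical.
    return (curr_hand_x - prev_hand_x) * degrees_per_mm

yaw = 0.0
# Hand sweeps 40 mm to the right across two frames while holding the molecule:
for prev_x, curr_x in [(0.0, 25.0), (25.0, 40.0)]:
    yaw = (yaw + spin_angle(prev_x, curr_x)) % 360.0
print(yaw)  # 20.0 degrees of accumulated yaw
```

A full implementation would track all three axes (and likely use the hand's rotation rather than raw displacement), but per-frame deltas accumulated like this are the usual starting point.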
Below you will find a gallery of the final wireframes I created for the project, and at the very top of this post you can watch a video of the final product. As development progressed, the team and I would discuss how things were working and evaluate the functionality. Along the way some features were modified for a more responsive experience, and we added some fun touches, like all of the atoms bouncing away at the end.
Working with the team at Bully, we brainstormed to imagine and develop an attention-getting HoloLens game to draw a crowd to Synergy Technical’s booth. The challenge was to create a quick game that would utilize the HoloLens features while simultaneously informing the player about Synergy’s services.
Part of the fun exploration was discovering how good the HoloLens is at detecting the world around you. We played with the idea of particles falling onto desks and spaceships bursting through walls. It was a lot of fun.
This was my weekend experiment with Vuforia and Unity. My goal was to explore the idea of using augmented reality to enhance repair instruction, or any instruction around physical objects. It was also my intro to Vuforia and some new things within Unity.
Hopefully I can develop it further and come up with a more fleshed-out exploration using the object scanner app.