Overview

Dargon

Lead Designer (2018)

Dargon was created during the Seattle AR/VR Hackathon, which I was invited to participate in in September 2018. On this project I acted as both the Lead UX Designer and a 3D Artist for the application. Because this was a Magic Leap-based project, I worked with multiple developers and a 3D artist to create a seamless experience that utilized some of the more advanced features the Magic Leap One headset (ML1) had at the time. The entire project was created over the course of a couple of days and made use of head tracking and hand gesture technology.

Execution

INTERACTION & UX DESIGN

Concepting and Implementation

As the Lead UX Designer, I wanted to work with the engineers and artists to create something we would all enjoy. Since this was our first time demoing the Magic Leap One, we were full of ideas about what to create. As a team, we decided we wanted to build something that took advantage of the advanced technology the ML1 had to offer.

We assembled Friday evening and ideated on many different project types and styles after trying on the headset and demoing content.

We settled on a puppeteering game system that made good use of the hand gesture software while also drawing on the game development background of many of our team members. We decided this would be valuable to everyone in the group, and something we could reasonably create within the time span of the hackathon.


The lead developer and I collaborated to figure out what would make the most sense for a puppeteering game. Looking at the available hand gestures, I found that many of them operated very similarly, so I decided on a set of action-based properties for a single main character. We settled on a medieval theme, with our main character being a dragon. This dragon would have three main actions that the user could control through the hand gesture and eye tracking systems native to the ML1.
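To give a sense of how that mapping might look in practice, here is a minimal Unity C# sketch of gesture-to-action routing. The gesture names, the three dragon actions, and the DragonController component are hypothetical stand-ins for illustration; the real project used the Magic Leap hand tracking API rather than this simplified enum.

```csharp
using UnityEngine;

// Hypothetical sketch only: the gesture names, the three dragon actions, and
// the DragonController component are placeholders, not the actual project code
// or the Magic Leap SDK API.
public enum HandGesture { None, Fist, OpenHand, Pinch }

public class DragonGestureMapper : MonoBehaviour
{
    [SerializeField] private DragonController dragon;

    // In the real project this value would come from the ML1's hand tracking;
    // here it is a plain public field so the sketch stays self-contained.
    public HandGesture currentGesture = HandGesture.None;

    void Update()
    {
        // Route each recognized gesture to one of the dragon's three actions.
        switch (currentGesture)
        {
            case HandGesture.Fist:     dragon.BreatheFire(); break;
            case HandGesture.OpenHand: dragon.Fly();         break;
            case HandGesture.Pinch:    dragon.Roar();        break;
        }
    }
}

// Minimal stand-in so the sketch compiles on its own.
public class DragonController : MonoBehaviour
{
    public void BreatheFire() { /* trigger fire breath animation / particles */ }
    public void Fly()         { /* move the dragon through the scene */ }
    public void Roar()        { /* play roar audio and animation */ }
}
```

Centralizing the mapping in one small component like this keeps gesture bindings easy to retune during playtesting.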


3D Art and Implementation

Seen below are some of the environmental models I created over the course of the hackathon. I had done a little environmental modeling before, but far less Unity scripting for onCollision effects, so this was a really fun opportunity to learn and experiment with Rigidbody effects in Unity.
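As a rough illustration of the kind of collision scripting involved, here is a small Unity sketch of an environment prop that reacts when the dragon bumps into it. The "Dragon" tag and the force value are assumptions for the example, not the project's actual scripts.

```csharp
using UnityEngine;

// Illustrative only: the "Dragon" tag and force value are assumptions,
// not the scripts used in the actual project.
[RequireComponent(typeof(Rigidbody))]
public class KnockableProp : MonoBehaviour
{
    [SerializeField] private float knockbackForce = 2.0f;

    // Unity calls this when another collider hits this prop's collider.
    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Dragon"))
        {
            // Push the prop away from the dragon so its Rigidbody tumbles naturally.
            Vector3 away = (transform.position - collision.transform.position).normalized;
            GetComponent<Rigidbody>().AddForce(away * knockbackForce, ForceMode.Impulse);
        }
    }
}
```

Letting the Rigidbody handle the reaction means the prop's motion comes entirely from Unity's physics rather than hand-authored animation.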

POSTMORTEM

Lessons Learned

This project showed me the extent of the Magic Leap One's technology. One of the more difficult tasks was figuring out how to incorporate the spatial mesh, how to move a character around it, and how to reward the user with a mechanism that didn't require an environmental change.


The biggest takeaway I had from completing this project was how integral it is to design for the specific hardware and input you're using, and to design with its limitations in mind.

Below is a video demo of the project along with all of the individuals who made it possible (it was no small feat). We won first place in the Magic Leap project category for our integration of hand gestures and eye tracking.
