Lead Designer (2018)
Rocket PowAR was completed during the Reality, Virtually Hackathon, which I participated in in January 2018. The concept for this project centers on muscle rehabilitation and physical therapy. Recruiting and using the correct muscle group(s) during an exercise is a crucial part of physical therapy for strengthening weak areas and supporting injured structures. We wanted to gamify physical therapy using modern spatial computing technology to make it more enjoyable.
I served as the lead designer for the concept, which rewards engaging the correct muscle groups by launching a rocket ship to the moon.
INTERACTION & UX DESIGN
In the beginning phases of the hackathon, we as a team were very interested in pairing a Brain-Computer Interface (BCI) with AR. It took a significant amount of prototyping and testing to get the data stream set up correctly. In the Heart Rate Reading video below, we are finally receiving a clean stream of data into the program (an ECG signal). The second video shows a correct flow of EMG signals, capturing the movement of the correct muscles in the specific area where the electrodes are placed.
Heart Rate Reading
Muscle Electrical Signal Reading
Electrodes and Ganglion
Sketches and User Flow
Once we settled on the physical therapy application, we focused on gamifying the therapy itself. I iterated on a few different designs, and we decided on a prototype that lets you launch a rocket to Mars after successfully completing the specific muscle movements.
Below you can see the overall user flow as well as some fun layout sketches for the human diagrams.
Working with the engineers on the project, I created and curated some simple low-poly assets for the application, including the floor garage, human diagrams, and rocket ship.
Final Implementation and Process
Ultimately, this was an incredibly complex project. Below you can see the many steps the information has to pass through, from the initial muscle contraction to the visualization on the headset.
As the muscle contracts, its electrical signals are picked up by the electrodes and amplified by the Ganglion board, which sends them via Bluetooth to the computer. There the signals are organized into data channels, and any reading within a certain threshold is classified as either a success or a failure. Unity receives this classification, and through the Magic Leap headset the end user gets visual confirmation that they have completed the task.
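The classification step in that pipeline can be sketched as a simple threshold check on the EMG signal's amplitude. This is only an illustrative sketch, not our actual Unity/Ganglion code; the sampling rate, window size, and threshold value here are all assumptions for demonstration.

```python
import math
import random

SAMPLE_RATE_HZ = 200        # assumed EMG sampling rate for this sketch
WINDOW_SAMPLES = 50         # 250 ms analysis window (assumption)
SUCCESS_THRESHOLD = 0.15    # RMS amplitude that counts as a real contraction (assumption)

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def classify_contraction(samples):
    """Slide a window over one data channel and report 'success' if any
    window's RMS crosses the threshold -- the kind of flag Unity would
    receive to trigger the rocket launch."""
    for start in range(0, len(samples) - WINDOW_SAMPLES + 1, WINDOW_SAMPLES):
        if rms(samples[start:start + WINDOW_SAMPLES]) >= SUCCESS_THRESHOLD:
            return "success"
    return "fail"

# Simulated channel: low-amplitude resting noise, then a burst of muscle activity.
random.seed(0)
rest = [random.gauss(0.0, 0.02) for _ in range(200)]
burst = [random.gauss(0.0, 0.3) for _ in range(100)]
print(classify_contraction(rest))           # quiet muscle
print(classify_contraction(rest + burst))   # contraction present
```

In the real application this decision would run continuously against the live Bluetooth stream rather than a prerecorded list of samples.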
This project taught me about working outside my comfort zone and learning new things. Prior to this, I had no idea about the diversity of neurodiagnostic signals, or even the difference between EEG, EMG, and ECG.
Incorporating this into an AR project pushed the boundaries of the technology as I had known it, and deepened my understanding of physical inputs to digital computing.