As forms of interaction evolve, so should the way we visually and cognitively engage with information. However, many novel technologies, such as AR/VR, come with a steep learning curve that often hinders widespread adoption.
This project explores how XR experiences can be engaged with through traditional forms of interaction, and whether this hybrid behavior can enrich the way individuals absorb new information.
VR Gravity Sketch
3 Weeks (Fall 2019)
This exploration can be beneficial in contexts such as the Phipps Conservatory, a botanical garden in Pittsburgh. Like many exhibition spaces, most of the plant life at Phipps is curated and installed in an inevitably showcase-like manner. Plants are taken out of their contextual environments, and as a result, visitors are unable to gain a holistic understanding of an exhibited plant's role in its natural ecosystem. Given these concerns, I framed my problem through two questions:
“How can I provide users with a better contextual understanding of exhibition pieces and their natural environments, and in the process bring excitement to learning through technological innovation?”
“How can I encourage visitors with a call to action regarding their role as players in their own communities?”
A tabletop AR terrain that adapts to the user's placement of physical action trophies. This design solution uses existing interaction paradigms to help visitors visualize an exhibited plant in its broader natural habitat, as well as the role the Anthropocene plays within that specific environment.
Form of Interaction —
Every AR tabletop within Phipps includes a “Start” trophy, the trigger that begins every simulation. When the user moves a trophy to certain spots on the table, the CG-projected terrain reacts according to the type of action trophy used.
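The trigger logic described above can be sketched as a small state machine: the Start trophy activates the simulation, and each (trophy, zone) pairing maps to a terrain reaction. This is a minimal illustrative sketch, not the prototype's actual implementation; all trophy, zone, and reaction names are invented for the example.

```python
# Hypothetical sketch of the tabletop's trigger logic. The "start" trophy
# begins the simulation; action trophies placed on tracked zones map to
# terrain reactions. All names here are illustrative assumptions.

# (trophy, zone) -> terrain reaction
TERRAIN_REACTIONS = {
    ("logging", "forest_zone"): "clear_canopy",
    ("wildfire", "forest_zone"): "burn_undergrowth",
    ("replanting", "forest_zone"): "regrow_saplings",
}

class TabletopSimulation:
    def __init__(self):
        self.running = False   # simulation is inert until Start is placed
        self.events = []       # log of reactions, in order

    def place_trophy(self, trophy, zone):
        """React to a trophy being placed on a tracked spot on the table."""
        if trophy == "start":
            self.running = True
            self.events.append("simulation_started")
            return "simulation_started"
        if not self.running:
            return None  # action trophies do nothing before Start
        reaction = TERRAIN_REACTIONS.get((trophy, zone))
        if reaction:
            self.events.append(reaction)
        return reaction
```

Keeping the mapping in a plain lookup table means new trophies or zones can be added without touching the simulation loop, which suits an exhibit that rotates content.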
Information Cards —
Depending on which action trophies the user moves, signs appear with information relevant to that action. Since one of the interaction's primary goals is to visualize the impacts of certain actions, each information sign leads to an additional call to action for the user to digest.
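The card behavior can likewise be sketched as a simple lookup from trophy to an information sign paired with its call to action. The card copy and trophy names below are invented for the sketch, not actual Phipps exhibit content.

```python
# Illustrative trophy -> information-card mapping. Each card pairs the
# ecological information with the follow-up call to action described above.
# All text and names here are assumptions for the example.

INFO_CARDS = {
    "logging": {
        "info": "Clear-cutting removes the canopy that shelters understory plants.",
        "call_to_action": "Support local reforestation programs.",
    },
    "wildfire": {
        "info": "Some ecosystems depend on periodic fire to regenerate.",
        "call_to_action": "Learn how controlled burns protect habitats.",
    },
}

def card_for(trophy):
    """Return the information card shown when a given action trophy moves."""
    return INFO_CARDS.get(trophy)
```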
When beginning my research, I visited Phipps to better understand not only the exhibition material itself, but also the types of people who visited.
Breaking down my stakeholders helped me realize a few things. To truly satisfy both my primary and secondary users, I would need to create a highly intuitive experience capable of engaging, entertaining, and educating groups of individuals with diverse motives.
A tricky part of prototyping for spatial interfaces is communicating how information is displayed. With time as a primary constraint, I built physical mockups of the information cards.
VR Prototyping: Terrain Projection —
From there, I took my initial tabletop design ideas into Gravity Sketch on the Oculus and began modeling in 3D space. Doing so allowed me to design a one-to-one model, giving me a more realistic sense of what it might be like in world space.
AR Prototyping: Terrain Projection —
Designing in VR proved extremely valuable in many regards, but it couldn't mimic the qualities of world space. I transitioned out of the headset into Reality Composer in Xcode so that I could prototype interactively in AR. Below are a few iterations.
Final Prototype —
At the end of this process, I developed a design proposal that visualizes how an individual might engage with this interaction.
As technology gets smarter, so should the way we understand its new affordances. This project envisions an ideal future where technology becomes secondary in the way we interact with each other and with the physical world. I don't think we're meant to eventually live our lives in a VR headset... but I do believe these technologies will open doors to new ways of perceiving and understanding our surroundings.