History Mystery

Mixed Reality Tour @ Phipps Botanical Gardens

Interaction Design, Fall 2019

Brief

Phipps Conservatory and Botanical Gardens, located in Pittsburgh, is interested in developing an augmented reality tour for the year 2030 as a way to enhance the visitor experience. Assume that by this point, mixed reality is accessible beyond a headset.

Target Audience

Ages 13+

Tools

Oculus Quest (Gravity Sketch, Tilt Brush), Reality Composer, Microsoft HoloLens, Adobe After Effects

Concept

How might we create an enhanced experience that visitors at Phipps otherwise wouldn’t be able to have, while keeping human factors and constraints in mind?

Process

Upon receiving the project brief, I decided to create constraints for myself in order to arrive at a realistic and practical implementation. I'm rather wary of VR/AR technologies because although technology advances rapidly, people stay the same. The two questions I kept in mind throughout the process were:

  1. How do you design for the future with the tools of today?

  2. How do you avoid creating something with the “Iron Man” effect (a great representation of the proposed technology that's actually an awful idea)?

Following this, I looked into the client's mission statement, since Phipps is a popular destination for CMU students and locals alike. By visiting the gardens, I was able to understand who their audience is, along with their core values:

"To inspire and educate all with the beauty and importance of plants; to advance sustainability and promote human and environmental well-being through action and research; and to celebrate its historic glasshouse."

Personas

Spatial Constraints

One of the biggest things I noticed on my visit was how narrow the walkways were. There were moments of congestion: if people stopped to look at the plants, those behind them couldn’t get past. Collectively, the plants were beautiful, but it was hard to learn much about them beyond that.

The image on the right is an example of what the walkways currently look like throughout each exhibit.

Storyboarding and Prototyping

The idea I wanted to fully prototype was letting the user pick a plant in the botanical garden and watch its evolution over time by physically scrubbing through a dial. A really great resource that helped me throughout this process was Apple's video on rapid physical prototyping. Because there was a learning curve with the VR/AR technologies, it eased my nerves to start with a clear plastic ruler and some paper cut-outs.

One of the biggest things I learned throughout this project is the importance of designing within the headset in order to understand 3D space. This goes back to the idea of designing for the future with the tools of the present, not the past. So I translated this concept of scrubbing time to view the plant's evolution into Tilt Brush.

I now had the first-person experience of this interaction, but I also wanted to see it in the third person to understand the sizing and movement of each component. Through this foamcore model, I got a better sense of how text sits in space.

Kicking the After Effects Curve (carefully)

I wanted a wide variety of prototyping methods because there was something to gain from each one, even After Effects. With my video, I was able to show my idea rather than just tell it.


I was able to edit in the plant selection and the dial turning to scrub time; however, I didn’t quite know how to show plant growth in an obvious way. The Liquify tool was too strange and too subtle to make my point in the video, so I used my newfound skills to put a fire on the plant to denote a form of change. This was obviously ridiculous, but it was good practice.

Changes & Refinement

I looked to Microsoft's guide on best practices for VR/AR technologies to improve my current methods, then went back to my video to better align interactions such as the point-and-click with a practical implementation.

  1. The Plant Selection Process
    The idea of window shopping came to mind. Every plant at Phipps has a placard associated with it but no explanation, so I changed my gaze-and-select method to a gaze-and-point method, where pointing at the placard reconfirms which plant the user is trying to pick. After that, the technology would detect the outline of that plant. The subtle emphasis for confirmation is what’s important in this case.

  2. Creating Plants in Tilt Brush & Reality Composer
    The idea is to have three different sketches that fade in and out, built in either After Effects or Reality Composer. I ended up experimenting with Reality Composer's object behaviors to make the plants at different stages rise and disappear based on timing and reactions. The specific plant I was working with grew in tree form back in the age of the dinosaurs, and I found that the heads of the plant stayed relatively the same, so the transition could flow seamlessly.
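As a rough illustration of how those behaviors can be driven from code, here's a minimal RealityKit sketch. Every name in it is hypothetical: Xcode generates a Swift API from an .rcproject, so a project named "Experience" with a scene called "PlantEvolution" would expose Experience.loadPlantEvolution(), and each Notification trigger authored in Reality Composer would appear under the anchor's notifications property.

```swift
import UIKit
import RealityKit

// Minimal sketch of driving Reality Composer behaviors from code.
// All names here are hypothetical and depend on the .rcproject's own
// scene and trigger names.
final class PlantEvolutionViewController: UIViewController {
    private let arView = ARView(frame: .zero)
    private var plantScene: Experience.PlantEvolution?

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Load the Reality Composer scene and anchor it in the AR view.
        if let scene = try? Experience.loadPlantEvolution() {
            arView.scene.anchors.append(scene)
            plantScene = scene
        }
    }

    // Called as the visitor scrubs the dial: each notification fires the
    // corresponding rise/disappear behavior authored in Reality Composer.
    func showEra(_ index: Int) {
        switch index {
        case 0:  plantScene?.notifications.showAncientTreeForm.post()
        case 1:  plantScene?.notifications.showSapling.post()
        default: plantScene?.notifications.showModernPlant.post()
        }
    }
}
```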

I also found that when converting from GLB to USDZ, the color doesn’t transfer, so I ended up using colors in the LaTeX format to get a reasonably close match on the green. I tested out this Reality Composer scene on paper first before going back to the gardens to film.
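Here's a small sketch of the kind of color patch-up this involved, assuming a RealityKit setup; the file name and RGB values are placeholders rather than the actual project assets.

```swift
import RealityKit
import UIKit

// Sketch: reapply a material color after a GLB -> USDZ conversion drops it.
// "fern_stage1" and the RGB values are placeholders, not real project assets.
func loadPlantWithColorFix() throws -> ModelEntity {
    let plant = try Entity.loadModel(named: "fern_stage1")

    // Approximate the original Tilt Brush green, since the converted
    // asset arrives without its color information.
    let fernGreen = UIColor(red: 0.31, green: 0.55, blue: 0.29, alpha: 1.0)
    let material: Material = SimpleMaterial(color: fernGreen, isMetallic: false)

    // Overwrite every material slot on the model with the flat green.
    if let slotCount = plant.model?.materials.count {
        plant.model?.materials = Array(repeating: material, count: max(slotCount, 1))
    }
    return plant
}
```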

Through this, I also revamped the timeline to be more familiar and compact. At first, I wanted the timeline to be stationary at the user's hip, but it would've been difficult to see the tick marks as time scrubbed. So I decided to keep it above the dial, along with an image, so the visitor can see which era the plant is in.
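To make the scrubbing mechanic a bit more concrete, here's a small sketch of how the dial's rotation could map onto the timeline's tick marks. The era names and the 270-degree usable sweep are assumptions for illustration, not final values.

```swift
import Foundation

// Sketch: map the dial's physical rotation onto discrete tick marks on the
// timeline. The eras and the 270-degree sweep are illustrative only.
let eras = ["Carboniferous", "Jurassic", "Cretaceous", "Present Day"]

/// Converts a dial angle (in degrees, 0...sweep) into the index of the era
/// shown above the dial, so the image and tick mark stay in sync.
func eraIndex(forDialAngle angle: Double, sweep: Double = 270) -> Int {
    let clamped = min(max(angle, 0), sweep)
    let step = sweep / Double(eras.count - 1)
    return Int((clamped / step).rounded())
}

// Example: a dial turned to 200 degrees lands on the third tick.
print(eras[eraIndex(forDialAngle: 200)])   // "Cretaceous"
```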

Final Concept Video