Enriching the Narrative Experience in Place

Andrea Benatar
12 min read · Nov 21, 2019


Design Prompt: Phipps is interested in developing an augmented reality tour as a way to enhance the user experience. You will have the choice of either creating a stand-alone tour experience or one that works in conjunction with a guided tour. Either way, your goal is to tell a story, convey useful information and delight users.

You will be designing and developing a low-fi prototype of a single tour stop (e.g. about a single plant or arrangement) for the client to review. Note that you can allocate space in Phipps, whether in a general area or close to the actual plant, for a table/kiosk/surface for physical artifacts that can enhance the experience. For example, there could be a small table with bowls with different types of soil that people can touch. Part of the design challenge is to consider the appropriate mix of physical and digital artifacts and interactions.

Step 1: Visiting Phipps, Initial Storyboarding, and Persona Boards

In order to get the ball rolling with this project, I first wanted to go back to Phipps to see what specific location (or stop) I might want to focus on and what inspiration I might gather from the space itself. In taking a very slow stroll through the conservatory, I was able to identify what I might want to add to the Phipps experience using mixed reality. Below are some of the initial considerations I came up with:

  1. Showing the broader context that the plants exist in: at the conservatory, we see plants from all over the world, from different habitats and climates, all condensed in one place. What would it look like to pick out specific plants and “give them the spotlight”?
  2. The role that specific plants might play in the ecosystem: I especially noticed this in the tropical fruit room, where you can see what it actually takes to grow some of the foods we eat on a daily basis.
  3. How people in the tour group might interact with each other: how might physical person-to-person interactions enhance what occurs in the mixed reality?

I also decided upon visiting that I wanted my tour experience to be self-guided. I found that the exploratory nature of just walking around Phipps was something that might be lost in a more formal tour. Thus, in my design, I was very interested in exploring how the visitor might use the mixed reality to feed their own curiosity rather than follow a set, structured tour.

In starting to storyboard, I mainly focused on the first consideration, trying to create an experience that would give the visitor an idea of what it would be like to actually step into a plant’s natural habitat. More specifically, I thought it would be interesting for the visitor to be able to virtually see the weather and lighting conditions change around them (along with the surroundings) upon selecting a plant.

In terms of the physical location for this interaction, I was particularly inspired by the tunnel passageway that is a few rooms past the entrance. I thought having a designated space to walk through and experience this new habitat would make the experience richer and more immersive.

I wanted to explore what it would be like for the visitor to select a plant in front of the entrance and, upon entering, have the holographic images throughout the tunnel start shifting (to represent the conditions and habitat of the selected plant). In order to determine what this experience might look like in the space, I prototyped it briefly in Gravity Sketch.

Having figured out the general idea for my MR tour stop, I also created three persona boards that could continue shaping the specifics of this experience.

My three personas ranged from an older, retired artist to a student to a family man. In creating three personas that were all relatively different from one another, I began considering how I might make this experience most pleasant for the varying audiences present at Phipps and what each person might take from the experience.

Step 2: Initial Prototyping

Before beginning to experiment with low-fidelity prototypes, I wanted to flesh out my tour stop a little more. During our in-class critique, Peter gave a few notes that I hadn’t considered before. First off, he mentioned that the passageway I had been planning on using was too narrow and tight a space for multiple people to walk through. Thus, he suggested relocating the interaction to an open space where people could walk around and explore more freely. The second note Peter gave had to do with the interaction leaning more toward VR than MR/AR. For this reason, I wanted to be more aware of when I was using the headset to add to what was already present versus creating an entirely different space.

Thus, after discussing this a bit further with Peter and also looking through some of the different rooms I had photographed during my visit to Phipps, I began to think of how I might adapt the design to fit into a larger space and create more interaction with the surrounding physical space.

As a second iteration, I considered having the same type of interaction occur within the more open rooms along the perimeter of Phipps. Instead of having the visitors walk through some sort of tunnel or passageway to experience the habitat, they would simply walk through the regular exhibitions (e.g., the tropical room or the desert room) and then encounter an open space (potentially a curved wall) at the end for them to interact with.

One thing I was still thinking through with Peter at this point was whether the curved wall would be blank, with the habitat built entirely through the MR, or whether the wall would have some sort of projected environment that could change both with and without the MR. In both scenarios, I wanted to incorporate a tangible element by creating pods that visitors could stand on to trigger different changes in the habitat, such as weather, climate, or seasons.
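
To make the pod logic a little more concrete, here is a minimal sketch of how stepping on a pod could map to a habitat state. Everything in it is hypothetical: the pod IDs, the `Season` cases, and `HabitatController` are illustrative names for the sketch, not part of any real system at Phipps.

```swift
// Hypothetical seasons a visitor can trigger by standing on a floor pod.
enum Season: CaseIterable {
    case spring, summer, autumn, winter
}

// Illustrative controller: maps a pod ID to a season and notifies
// both the projection wall and any connected MR headsets.
final class HabitatController {
    // Assumed layout: four pods in front of the curved wall, one per season.
    private let podToSeason: [Int: Season] = [
        0: .spring, 1: .summer, 2: .autumn, 3: .winter
    ]
    private(set) var currentSeason: Season = .summer

    // Called when a pressure sensor reports a visitor standing on a pod.
    func visitorStepped(onPod podID: Int) {
        guard let season = podToSeason[podID], season != currentSeason else { return }
        currentSeason = season
        updateProjection(for: season)   // generic habitat, visible without a headset
        notifyHeadsets(of: season)      // plant-specific changes, visible only in MR
    }

    private func updateProjection(for season: Season) {
        print("Projection wall switching to \(season)")
    }

    private func notifyHeadsets(of season: Season) {
        print("Broadcasting \(season) to connected MR headsets")
    }
}
```

The split between the projection update and the headset notification mirrors the two layers of the design: a generic projected habitat that everyone in the room sees, plus plant-specific changes that only appear through the headset.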

Thus, I made a rough foam core mockup of what this space would look like and shot a quick video of the interaction. In the video, I prototyped what it might look like for the MR to add onto a projected screen. What is not evident in this video, but what I decided to incorporate through more in-depth storyboarding, is that the generic environment projection could change according to where the visitors step, but with the MR headset on, the visitor would also see how their chosen plant (and the specifics of the plant’s habitat) react to these changes.

Photoshop mock-up of the space; in the actual design, the rooms would each be specific to a type of habitat (tropical, desert, etc.).

Reflection: Digital Media in Physical Environments

As more and more physical environments begin implementing digital components (many of which are the responsibility of engineers and programmers), the role of the designer really comes down to the seamless incorporation of the physical and the digital so that the experience remains human-centered. In other words, the designer should be the one to consider the purpose of the technology in a specific space and how it might fit in with the surrounding physical components to create an elevated experience. We often see technology being added to spaces just for the sake of technology, and when we don’t consider the tangible, human aspects of the experience, we lose touch with the actual purpose of the environment. I would say that’s where designers come in.

Step 3: More Prototyping and Sketch Video

After having made a few rough prototypes that helped me to more clearly see the pain points in my idea, I found (with the help of Peter and Daphne) a few things I wanted to address before beginning my final sketch video.

  1. First off, Daphne had pointed out the accessibility issue of the pressure points being raised. Thus, I decided to make them mostly flush with the ground.
  2. Is there a more tangible element in the interaction that visitors can touch? This one was difficult for me because I found that most elements I could place in the space for the viewer to touch would somewhat interrupt or intrude on the immersive (and separate) environment I wanted to create with the curved wall. Thus, I began thinking about how the stepping itself could be a more tactile experience. For example, could the station for one season be a cushioned material (to simulate stepping in wet soil), while another is concrete?
  3. How much should be added to the habitat based on the specific plant chosen? And how can the different factors vary enough within one type of habitat? This is another one I had to consider carefully and even do some extra research to figure out. In order to really enhance the differences, I chose to make the habitat on the projection a more generic tropical habitat, so that within the MR there could be many more variations of weather, light, colors, and wildlife.

After having a relatively clear idea of my direction, I decided to attempt putting together a sketch video of the entire experience in After Effects.

This first attempt was more or less a fail.

Not even having gotten to the main portion of my interaction, and already very frustrated with how unrealistic my After Effects video looked, I discussed with Peter how I might re-evaluate my method of prototyping. He brought up a good point: the final product did not necessarily have to be a single all-encompassing video; rather, there could be multiple methods for showing different details of the interaction. In particular, we both agreed that manipulating the existing space at Phipps would be very difficult to show in an After Effects video, but it could be sufficiently communicated through a more detailed version of my physical prototype and Reality Composer.

Feeling much more confident with this idea, I began making a higher fidelity model in which I could play around with Reality Composer.

I began experimenting with how I might best communicate my interaction with the somewhat limited tools of Reality Composer. At first, I wasted a lot of time trying to export my own models from Tilt Brush for use in Reality Composer, and I ended up just using the models I could find online (at least for the first draft). This felt pretty limiting, so I wanted to continue exploring ways of importing or making 3D models. However, just to get a sense of the software and how it would look inside the model, I made two rough videos using the few models I could access. The first video was meant to prototype the beginning of the interaction, when the visitor first enters the habitat of the plant they have selected. In the second video, I wanted to show what might happen when the visitor stands on a specific season. This was definitely more frustrating to prototype because I wanted to show the actual habitat and the plant changing according to the season, but that was not possible through Reality Composer alone.
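
For context on what Reality Composer automates here, the underlying step in code is quite small: RealityKit can load a USDZ model and pin it to a detected surface in a few lines. This is a minimal sketch, not part of my prototype; `plant` is a placeholder asset name, not a file from this project.

```swift
import RealityKit

// Minimal RealityKit sketch: load a USDZ model (e.g., one found online)
// and pin it to the first horizontal surface ARKit detects.
func addPlant(to arView: ARView) {
    do {
        let plant = try Entity.loadModel(named: "plant") // expects plant.usdz in the app bundle
        let anchor = AnchorEntity(plane: .horizontal)    // anchors to a horizontal plane
        anchor.addChild(plant)
        arView.scene.addAnchor(anchor)
    } catch {
        print("Could not load model: \(error)")
    }
}
```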

Thus, moving into the second iteration of my Reality Composer prototypes, I wanted to be sure to add as much as I could within the app, but also to add any necessary shifts in After Effects when they were not possible through Reality Composer.

In addition, I remade the beginning of the interaction (before reaching the wall) in After Effects, but this time leading the viewer through it in first person and eliminating the presence of any real person, in order to make it more consistent with the Reality Composer videos.

Though the timing and flow still needed some tweaks, I felt a lot better about my second (and much more simplified) go at After Effects.

Step 4: Final Edits

After showing my Reality Composer and After Effects videos to Daphne, she brought up something I had definitely missed: both the angle at which the two videos inside the model were shot and the size of the “screens” made the viewer feel ant-sized. Not having considered proportion very much, I went back to the Reality Composer files and made sure the screens were more accurately scaled, and I also shot the final iteration of the videos with my phone level with the ground so that the camera would be more or less at eye level within the model.

Another change I made to the Reality Composer videos was the addition of flat images. For one, I did this so the screens would better align with the ones shown in the After Effects video. The second reason for importing images over USDZ models was to have a greater quantity of plants in the scene. Because 3D models of plants are either very difficult to come across, very expensive, or difficult to convert to the right file type (trust me, I tried), I decided that for the range of motion I needed, PNGs might be sufficient and still a bit more three-dimensional than if they were added in After Effects. In fact, after re-shooting the two videos, only very minor additions were needed in After Effects.
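
As a rough code sketch of that PNG trick (assuming RealityKit 2 on iOS 15+; `fern` is a placeholder asset name, not an image from my project), a flat cutout can be mapped onto an unlit plane so that it still occupies 3D space:

```swift
import RealityKit
import UIKit

// Sketch of the PNG-instead-of-USDZ approach: texture a flat plane with
// a plant cutout so it sits in 3D space without needing a full 3D model.
func makePlantBillboard() throws -> ModelEntity {
    let texture = try TextureResource.load(named: "fern") // fern.png in the app bundle
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    let plane = MeshResource.generatePlane(width: 0.5, height: 0.8) // meters
    return ModelEntity(mesh: plane, materials: [material])
}
```

An unlit material ignores scene lighting, which keeps a flat cutout evenly visible from any angle, a reasonable fit for a stand-in image.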

In addition to Reality Composer and After Effects, I also made a very quick stop motion in Photoshop to show what changing the seasons would look like if the visitor chose to do it without mixed reality.

Below are my four final videos.

Beginning of the tour, in which visitors explore at their own pace and build a digital catalogue using their mixed reality headset. This video was prototyped in After Effects.
Part 1 of my Reality Composer videos, showing the visitor entering their selected plant’s habitat.
Part 2 of the Reality Composer prototypes, in which the visitor steps on “winter” and sees their plant and habitat reacting to this seasonal change.
Simple stop motion showing what it might look like to step across the circles on the ground to change the projected season without any mixed reality.

As seen in the final Reality Composer videos, I primarily added simple GIFs (of rain falling and a bird flying) to make the scene more dynamic, as well as a contextualizing shot at the beginning of the “winter scene”. The After Effects video remained more or less the same, but I tried to fix the flow as best I could.

Though I would have liked to prototype these interactions at a bit higher fidelity, I was happy with how the idea came across through these very simple videos. In addition, I found that using Reality Composer (for the most part) instead of After Effects was very beneficial because I was able to walk through my interaction in my physical model much more realistically, which in turn made me consider the visitor’s perspective a bit more. With more time, I would have experimented more with developing my own USDZ animations in Reality Composer, in order to make the curated habitat much more fleshed out and interactive.

Self-Reflection

I would say that overall the skills I learned from the first project had more to do with developing a thorough, all-encompassing presentation of a final concept, while the skills I gained in the second project were much more exploratory, since they were about quick prototyping and new mediums. In other words, the museum project taught me a lot about demonstrating a concept from start to finish, whereas the XR tour taught me how to take a snippet of a concept and explore different modes of communicating it. While both projects required a lot of creative methods for prototyping, the first project had more formal components but less complex factors to prototype, which I thought was a very good precursor to the second prompt (which had fewer but more complex components). For that same reason, I felt as if the mixed reality project pushed me further out of my comfort zone, forcing me to accept the prototyping limitations we had and embrace creative methods for communicating our idea.

I would say that an environments designer is one who is in charge of both the visible and invisible parts of an experience, in order to make it as enjoyable and seamless as possible. Environments designers have to consider every individual interaction that occurs within a space (be it physical or digital), as well as how those interactions fit together to shape the overall experience. However, spaces can never be considered without the people that inhabit them; tactile and “small-scale” interactions still form a large part of an environment, which is a concept I hope to keep in mind going into my future practice.
