This was a three-month project that focused on designing a new augmented reality experience for the Pokémon Go app. You can jump to the solution below.
The goal was to design an experience that allows players to connect Pokémon Go to a hypothetical pair of augmented reality glasses called Poké Glass. This case study details the end-to-end experience of Poké Glass, from onboarding to catching Pokémon while wearing the glasses.
How might we design an AR experience that allows players to explore the Pokémon-filled world in a new way while integrating seamlessly with Pokémon Go?
I wanted to work within the constraints that Niantic UX Designers face on a daily basis.
I had three questions in mind during my initial research.
(1) How had Niantic integrated new features into the core Pokémon Go app?
Pokémon Go launched a feature last fall that allows you to take photos with Pokémon in the real world. The introduction of the product was smooth, its onboarding effortlessly educational.
(2) What was the history and outcome of AR-enabling hardware such as Google Glass?
Google Glass failed to win consumer adoption in part due to its somewhat geeky aesthetic and social dynamics that were not adequately designed for.
(3) What was the history and outcome of AR-enabling hardware such as Snap Spectacles?
Snap Spectacles were more socially accepted upon release. However, a primary use case of Snapchat is taking selfies, and Spectacles, by their very form factor, made this use case impossible.
Finally, the video format was proprietary, with no native way to export content to other platforms.
Niantic worked with Nintendo to introduce two hardware products that augment the experience of catching Pokémon:
Pokémon Go Plus
Poké Ball Plus
I read accounts from Pokémon Go's subreddit and spoke with a variety of target players. The target audience would be casual but frequent players of Pokémon Go.
I delved deeper into Pokémon Go Plus and Poké Ball Plus. Both hardware products let players play Pokémon Go in a way that trades complexity for ease of use and novelty of play. This underscores that the products' target demographic is players who love the game but don't play it obsessively.
I audited the default functionality available to a player during a Pokémon encounter to better explore what made sense to offload to a pair of AR glasses.
With that I settled on a core experience of:
I mapped out a few user flows. I started with the core experience of catching Pokémon before moving on to tertiary flows.
With the primary flows complete, I started focusing on how the core experience would work visually.
I sketched on paper to explore the experience of encountering a Pokémon, throwing a Pokéball, and catching a Pokémon.
I wireframed a low-fidelity visual user flow to test the onboarding of the experience on a phone.
I quickly designed how the phone screen would work when encountering a Pokémon while wearing Poké Glass. There were two interactions I had in mind:
I read Google Glass' documentation. Based on this, Poké Glass' dimensions would be 640 pixels by 360 pixels, the same as Google Glass.
An early demonstration of the core interaction of catching Pokémon, which I showed to a UX design manager.
I sketched out the early visuals of Poké Glass' onboarding.
I searched GitHub to find a collection of Pokémon Go's assets. With these assets I could start iterating on mid-fidelity designs quickly.
I tested how clear the product onboarding was. My participants were casual players, friends, and members of the Pokémon Go subreddit.
Participants understood what the product was, but the onboarding cards were too visually cluttered.
Homing in on these results, I stripped more elements from the onboarding cards, making them easier to take in. These screens would evolve even further as the project went on.
Up until this point, I had focused on the experience of onboarding and catching Pokémon. Early on I designed a way to pair Pokémon Go to the Poké Glass hardware, but cracks in this early design began to surface.
I initially used a screenshot taken at night as the background of my design. At night, the button to toggle Poké Glass was clear.
AR toggling fell apart during the day, when most players actually play.
Going back to the drawing board, I studied how Niantic implemented both AR modes and different state changes in Pokémon Go.
After some iteration, I arrived at a button on the Pokémon Go main view that would appear once Poké Glass was introduced.
I worked with Pokémon Go's subreddit moderators to test the product's onboarding. I also demoed the app to colleagues whose experience with Pokémon Go varied. A few points emerged from these sessions:
But...
I made the silhouette of the illustration look like glasses and added the Poké Glass button to the onboarding cards.
I emphasized what button to press from the home screen.
Handling the post-catch view was a bit of a challenge. In the base version of the game you catch Pokémon and view Pokédex entries immediately after an encounter.
Why would a player want to be stopped every time they caught a Pokémon while wearing Poké Glass?
The idea of batching Pokédex entries was born. Once you caught a Pokémon you could continue catching Pokémon uninterrupted. Only once Poké Glass was unpaired did you need to view Pokédex entries and stats.
By making changes to the product illustrations, emphasizing the toggle button, and editing the flow I solved the number one issue participants brought up during testing:
How do I use Poké Glass?
The solution is an end-to-end experience that allows players to learn about Poké Glass, activate the feature, and catch Pokémon while wearing AR glasses. These are the main screens players interact with throughout the experience on a phone.
I designed the mockups using a GitHub repo containing all of Pokémon Go's assets, which formed the basis of a de facto design system.
From there I studied Pokémon Go's UI and began converting my wireframes into high-fidelity mockups, closely following Niantic's UI guidelines as expressed in Pokémon Go.
When applicable I would "trace" assets by masking them.
I also designed a marketing announcement that we could share with Pokémon Go players.
This document solved the problem of demonstrating the experience of using the Pokémon Go app on a phone while showing the corresponding actions on a pair of AR glasses.
The full announcement can be viewed on Behance and was shared on Reddit. This is a snapshot of a player's main interactions.
The primary result was an end-to-end product design that seamlessly introduced a new AR glasses feature within Pokémon Go.
The secondary result was a marketing document that served to capture important player actions and demonstrate them in a visually appealing and shareable format.
The final result, less tangible but no less important, was gaining a deeper understanding of the visual, interactive, and communication constraints that designers at companies like Niantic face.
I sought to make my conceptual design feel integrated with the rest of Pokémon Go. By utilizing the visual language that Niantic designed for the game, a language experienced by hundreds of millions of players around the world, I was able to do so.