Control Tower monitors the status of EA games and services to enhance the player experience.
Through a clean visual language, Control Tower unifies disparate information sources and highlights critical needs in real time, enabling more efficient and holistic monitoring of the player experience.
Note: The dashboard data are randomized to adhere to EA's confidentiality agreements. I also received permission to present my internship work as part of my portfolio.
For my three-month internship at Electronic Arts, I was given the task of unifying the design of the dashboards managed by the Player Reliability Team.
A big piece of this challenge is information architecture: dashboard sprawl is common at EA, with information scattered across separate websites. Consolidating that information enables a more holistic understanding of issues.
The second piece is information design: many users found the current dashboards overwhelming and had difficulty spotting critical issues.
As a test case, we were asked to re-design Monitoring, a dashboard that tracks the statuses of EA games and services.
To address these challenges, we followed a user-centered approach to design a dashboard prototype that easily highlights information users need to see.
As the lead UX researcher, I was responsible for defining user requirements, conducting a heuristic analysis of the current tools, and translating findings into low-fidelity prototypes. My co-intern, Mikei, defined a style guide, created visual assets, and designed a mid-fidelity prototype in InVision.
Our strategy was to first focus on the use case of the Mission Control team and then eventually expand the project to other users.
I interviewed members of this group to find out how they use Monitoring today and the problems they face. Their main workflow involves triaging issues such as server drops, so they need to find out when and where these happen, and what might have caused them.
To compare Monitoring against HCI best practices, we read Stephen Few's seminal book on information dashboard design, which I distilled into five heuristics.
Analysis of the interviews and the heuristic evaluation showed that Monitoring needed to be redesigned to present critical information at a glance.
The information was not organized in a way that aids filtering, searching, and scanning, and it was hard to connect related pieces of information within a reasonable number of clicks. On top of this, the sheer number of colors distracted from the critical (red) issues.
CONTENT INVENTORY AND CARD SORTING
Looking at the existing categories for Monitoring, I observed a total of 24 menu categories and more than 100 user-defined tags, many of them redundant or outdated.
Using the data from the current website and the research findings, I went back to the Incident Management team to conduct a card sorting activity for defining a simpler categorization scheme, followed by interviews to explain their rationale.
My analysis of the card sorts showed striking consistencies. One was the need for a high-level overview of the system covering critical components such as the most recent and most popular games and the core infrastructure.
There were also interesting challenges, such as different naming conventions across organizations, or the same word carrying different meanings, so part of my task was standardizing terms for the new dashboard.
Overall, the 26 categories were reduced to 4 major categories.
I then organized the categories into an organization scheme / site map to guide the wireframing process.
FROM LOW- TO MID-FIDELITY
Based on the categorization scheme, I created low-fidelity prototypes. Mikei converted these into mid-fidelity prototypes through iterative feedback sessions.
Finally, I tested our prototype with the users. I developed the user testing protocol, asked users about their general impressions and tested the ease of finding incidents and learning their root causes.
Overall, the system received a System Usability Scale (SUS) score of 93.13, placing it in the "excellent" range for ease of use.
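For context, a SUS score like the one above is derived from a 10-item Likert questionnaire using Brooke's standard scoring rule: odd (positively worded) items contribute their response minus 1, even (negatively worded) items contribute 5 minus their response, and the sum is scaled by 2.5 to a 0–100 range. A minimal sketch (the function name and sample data are illustrative, not from the study):

```python
def sus_score(responses):
    """Score one participant's SUS questionnaire.

    `responses` is a list of 10 Likert ratings (1-5), in item order.
    Odd-numbered items are positively worded; even-numbered items
    are negatively worded, per the standard SUS scoring.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items: (rating - 1); even items: (5 - rating)
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# A hypothetical "perfect" participant: 5 on odd items, 1 on even items
print(sus_score([5, 1] * 5))  # 100.0
```

A study-level SUS score is then simply the mean of the per-participant scores.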
Users liked the clean visual design, and noted its improvement over the previous interface in data presentation. They also completed all tasks and rated the functionalities as useful and easy to learn.
As for improvements, users found some labels ambiguous, especially those related to time (metric duration and last updated).
WINNING THE INTERN TECH FAIR
Finally, our team went to the EA Headquarters in Redwood Shores to present our work during the CTO Intern Tech Fair. It was a great experience to share our work with the public and learn about similar information needs throughout EA.
Out of the 60+ interns at the fair, I was ecstatic to be one of the seven chosen by CTO leadership to receive an award for exemplary work over the summer.
All in all, Control Tower was a huge organizational success that launched conversations about more strategic ways of visualizing success metrics for both C-level leaders and information workers. It was also a great partnership between Mikei and me, producing designs that are both functional and appealing.