Desktop Application Installer

The desktop application “AlphaZ” is used to perform operations and engineering functions inside manufacturing plants. This case study focuses on one specific function: installing this powerful software onto desktop computers.


Challenge


The installer for AlphaZ required constant clicks from the user in order to run to completion. Installation was an expensive, time-consuming process that many users found boring and tedious. My goal was to simplify the experience and reduce installation time.

Approach


I considered three key components throughout this project.

  • Customer satisfaction: Workshops were held with customers to identify their pain points and desired functionality, and to validate the new direction being taken.

  • Technological capabilities: It was important to understand the installer's backend before designing the UI, to ensure the UI collected every detail the installer needs in order to run.

  • Modern visual design: The original installer's visual design was outdated. This was an opportunity not only to improve the workflow but also to modernize the UI.

Outcome


The new installer framework requires fewer clicks and screens and can run from start to finish without interruption. As a result, install time has decreased by more than 75%, leading to annual savings of $475,000.

My Role, Contributors, & Tools Used


Although I was the only UX researcher and designer on this project, contributors from other disciplines were involved in every phase.

Duration: 9 months

Contributors

  • Systems Architect

  • Product Owner (PO)

  • Product Marketing Manager (PMM)

  • Three Software Test Engineers

Tools Used

  • Microsoft Teams

  • Adobe XD

  • Company Design System

  • Mural

  • UXPressia

The Design Process

The Design Process Stages

This project did not follow a linear design process; certain steps required revisiting previously completed stages to validate the direction being taken.

Discovery

  • Current State Analysis (User Journey Map)

  • Backend Architecture Mapping

  • Competitor Research

  • Stakeholder Workshop (pt. 1)

  • Stakeholder Workshop (pt. 2)

Define

  • Findings Synthesis

Develop

  • Collaborative Design with Test Engineers

  • Low-Fidelity Wireframing

  • High-Fidelity Interactive Prototyping

Delivery

  • Installer Guideline Deliverable

  • Project Description Writing

1) Current State Analysis (User Journey Map)

My first task was to understand the current behavior of the installer. To do this, I worked with a Product Owner (PO) who served as the subject-matter expert (SME) and learned about the installation process through a demo he provided. During the demo, I noted the major steps taken during installation.

With these notes, I was able to produce a user journey map for the installer’s current behavior.

The user journey map also helped to identify these metrics:

  • Total install time: 4 hours

  • Time spent on user interaction: 1 hour

  • Number of times user enters login credentials: 6

  • Manual reboots and logons: 4

I also reserved a section for pain points and opportunities/ideas, to be filled in with direct user feedback during a later workshop.

Download PDF version

User Journey Map

2) Backend Architecture Mapping

After using the user journey map to understand the workflow, I then sought to understand how it correlated with the backend. I worked with a Systems Architect who’d had a hand in developing the installer, and together we analyzed and mapped the backend.

The installer architecture mapped out

3) Stakeholder Workshop (pt. 1)

I now had a comprehensive understanding of the installer’s frontend and backend and was ready to engage stakeholders.

I facilitated a design thinking workshop with five stakeholders, all from different backgrounds:

  • 1 Control Systems Engineer (customer)

  • 1 person from the Technology department (internal)

  • 1 Project Engineer (internal; responsible for troubleshooting on customers' behalf)

  • 1 person from Business Development (internal)

  • 1 person from Marketing (internal)

During the workshop, we reviewed the customer journey map to align on the current state of the installer. Afterward, I conducted a Wind, Sail, Anchor activity to discover what was working well for them, what was holding them back, and what the perfect future state might look like.

From the workshop we discovered:

  • Progress indicators do not accurately reflect installation progress and create confusion for users.

  • The application is not “one size fits all,” and users do not need to be shown every available configuration option during installation; they only want to see what is important to them.

  • The need to monitor a four-hour installation is tedious, boring, and decreases productivity.

  • Users would like a more modern, simplified interface.

(data withheld for confidentiality purposes)

4) Competitor Research

After identifying common pain points and potential solutions, I wanted to know how other installers behaved and whether I could draw inspiration from them.

I chose to focus on Microsoft Visual Studio, since it’s a widely used application within our organization and users would be familiar with it.

I walked through the installation myself by installing the application on my own computer. Throughout the installation, I took screenshots and noted ways we might apply similar design practices.

A key finding from Microsoft Visual Studio's installer was its use of tabs, which reduced the number of clicks required from the user. Tabs made it easy to navigate between previous and upcoming configuration steps.

5) Collaborative Design with Test Engineers

I began turning my findings into solutions by working with three Test Engineers to design new user flows. While creating the new framework, we considered different ways to decrease total install time and time spent on user interaction.

We mapped out a total of three user flows for the new framework.

Within the new framework, we introduced four new features (a brief illustrative sketch follows the list):

  1. Upfront Configurations: The original installer required the user to make configurations at the beginning, middle, and end of the installation. With our new framework, the user makes all configurations before the installation begins running, allowing the installer to run continuously from start to finish without interruption.

  2. Progressive Disclosure: Not every user configures the application in the same way, so not all users need to be shown every configuration option. We saw a chance to reduce what is shown to the user through progressive disclosure. We introduced prerequisite questions, asked before configuration begins, so the installer knows how the user intends to use the application. Based on those answers, the installer shows only the configurations relevant to that use case.

  3. Automatic Reboots and Logins: We could not decrease the number of times the computer needs to reboot, but we found a way for the computer to reboot automatically and retrieve the user's login credentials whenever needed, while still respecting security protocols.

  4. Parallelized Installs: Originally, the installer installed the application's components one at a time. We introduced parallelized installs, allowing multiple components to install simultaneously and shortening the overall install time.
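
To make the upfront-configuration, progressive-disclosure, and parallelized-install ideas concrete, here is a minimal sketch of how such an installer flow could be structured. It is purely illustrative: the names (ConfigStep, gather_upfront_config, install_components) and the use of Python are my assumptions for this write-up, not the actual AlphaZ implementation.

```python
# Illustrative sketch only; not the actual AlphaZ installer code.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass
from typing import Callable, Dict, List, Set


@dataclass
class ConfigStep:
    """One configuration screen, shown only for the use cases that need it."""
    name: str
    applies_to: Set[str]           # use cases this step is relevant to
    collect: Callable[[], dict]    # gathers the user's input for this step


def gather_upfront_config(use_case: str, steps: List[ConfigStep]) -> dict:
    """Upfront configuration + progressive disclosure: ask every relevant
    question before the install starts, skipping steps that do not apply
    to the user's stated use case."""
    config: dict = {}
    for step in steps:
        if use_case in step.applies_to:   # progressive disclosure
            config.update(step.collect())
    return config


def install_components(components: Dict[str, Callable[[dict], None]],
                       config: dict, max_workers: int = 4) -> None:
    """Parallelized installs: run independent component installs
    concurrently instead of one by one (assumes no interdependencies)."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(task, config) for task in components.values()]
        for future in futures:
            future.result()   # re-raise any installation error
```

In the real installer, component dependencies, reboot handling, and credential storage would sit on top of this basic flow; the sketch only shows how collecting every answer up front lets the rest of the installation run without interruption.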

6) Low-Fidelity Wireframing

I created low-fidelity wireframes that modeled the new framework and addressed the pain points users had described.

Low-fidelity wireframes

7) Stakeholder Workshop (pt. 2)

I reconnected with the original stakeholders involved in the first stakeholder workshop and presented the framework map and wireframes that had been created since our initial meeting. I highlighted the key areas where we improved the installer experience based on their feedback.

They also shared their thoughts on the new direction the installer had taken.

Based on our conversation, we implemented the following changes in the installer (a small sketch of the resulting button logic follows the list):

  • The continue button should be disabled on mandatory configurations until the user has made a selection.

  • The continue button should always be enabled on optional configurations.

  • Although computer reboots are automatic, a visual cue as to when reboots are expected to occur would be helpful.
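
The first two rules collapse into a single predicate for the continue button. The function below is a hypothetical sketch of that rule, not the shipped code:

```python
def continue_enabled(is_mandatory: bool, selection_made: bool) -> bool:
    """Disable Continue on mandatory configurations until a selection is
    made; keep it always enabled on optional configurations."""
    return selection_made if is_mandatory else True
```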

8) High-Fidelity Interactive Prototyping

After incorporating the feedback into the wireframes and framework, I converted the wireframes into high-fidelity prototypes using our brand standards.

Link to interactive prototype

9) Developer Guideline Deliverable

I created and published an installer guideline for our developers. The guideline contained UI design specifications such as typography and color hex codes, and it outlined step by step how the installer would behave.

(an excerpt from the full guideline)

(an excerpt from the full guideline)

10) Project Description Writing (Agile Process)

With a high-fidelity prototype and a newly architected framework in hand, I worked with the Product Marketing Manager (PMM) assigned to this project to write its criteria:

Benefit Hypothesis
Increase the efficiency of the installation experience by reducing install time and minimizing user interaction. This will come as a result of automatic logins after reboots, parallelized installs, and upfront configurations.

Business Value Justification
The current installer requires 6-8 reboots and 10+ user interactions, resulting in a total install time of 4 hours. Engineers must monitor the installer for the entire 4 hours so they can make the required configurations and enter login credentials after each reboot. If they step away for any reason, the installer pauses until they return.

This project can reduce install times to 1 hour.

Projected savings:

Number of systems × workstations per system × hours per workstation
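
To show how this formula rolls up to a dollar figure, here is a small worked sketch. The inputs are placeholders chosen for illustration only (the real counts and rates are confidential, and these numbers are not meant to reproduce the reported $475,000); the only project-derived value is the drop in install time from 4 hours to 1, i.e. 3 hours saved per workstation, and the hourly rate is an assumption.

```python
def projected_savings(num_systems: int,
                      workstations_per_system: int,
                      hours_saved_per_workstation: float,
                      hourly_rate: float) -> float:
    """Projected savings = systems x workstations per system x hours saved
    per workstation, converted to dollars with a fully loaded hourly rate."""
    hours_saved = (num_systems
                   * workstations_per_system
                   * hours_saved_per_workstation)
    return hours_saved * hourly_rate


# Placeholder inputs only; actual figures are withheld for confidentiality.
print(projected_savings(num_systems=50,
                        workstations_per_system=10,
                        hours_saved_per_workstation=4 - 1,   # 4 h -> 1 h
                        hourly_rate=100.0))
```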

Original Metrics vs. New Metrics

(charts comparing original vs. new metrics: time spent in hours and number of user interactions)

Projected Savings: $475,000 Annually