AcuForecast

Overview

Our goal is to design an app that helps users move from manual labor planning to a more accurate, automated process, resulting in increased labor savings. Since users trust their own process and are cautious about new tools (10% adoption), the challenge is figuring out how to increase adoption of a tool they might initially distrust.

Hypothesis

If we understand the user’s manual workflow, the data they rely on to make decisions, and the important data they don’t currently have, we can design an experience users are confident in, which will lead to higher adoption rates. As more users adopt the tool, they will plan labor more effectively, resulting in significant labor savings.

Results

  • $2 Million in annual labor planning savings
  • Schedule efficiency increased by 700 bps
  • Associate satisfaction score increased by 4.15 points

Persona Development

Understanding User Needs

I led user interviews to understand how users plan labor day-to-day. By uncovering their manual process, their key pain points, and their reasons for not using the auto-scheduler feature, I shaped the Information Architecture around their real workflow, creating a strong foundation for the application.

Findings

Through user interviews, I identified two distinct user groups: those focused on division-level performance and those responsible for editing demand forecasts. I recommended separating these workflows to create a more focused experience, but due to resource constraints, leadership opted for a single end-to-end solution instead. I pushed back, as designing a single experience for four personas with different motivations conflicted with our goal of mirroring their manual process for easier adoption. Ultimately, the business prioritized short-term adoption to hit KPIs, with the plan to refine the interface to better align with user needs once more resources became available.

Design Process

Iterative Design

I mapped out the necessary components based on user workflows, quickly iterating from wireframes to high-fidelity designs. This was a true zero-to-one project, made even more interesting by the fact that I was working alongside our design system lead on an early-stage design system. Since many components didn’t yet exist, I sourced potential solutions, which had to be reviewed, approved, and then stress-tested in my designs. It was a unique challenge—building both the experience and the design system simultaneously.

Working on this project, I quickly learned the importance of documenting design decisions. When debates over operational logic put the project on hold, having clear records made it easier to pick up where we left off. In some cases, we even revisited scrapped designs, reinforcing the need for strong version control. This experience highlighted how essential it is to track decisions to maintain momentum and ensure a smooth design process.

Iteration #1

An early design allowed bulk editing of any value in the demand forecast, but users found it overwhelming. It also wasn’t ideal from a design and development standpoint—making all fields editable added unnecessary complexity to the logic and overall experience.

Iteration #2

Another iteration limited edits to total fields instead of day-by-day adjustments. This approach was considered because operations hadn’t finalized how the backend calculations would work.

Iteration #3

After months of iteration and stakeholder discussions, we aligned on the ideal user and editing process. We revisited the row-editing best practice I had proposed earlier, which users responded well to because of its clear, focused approach.

Results

Initial Release

The initial release onboarded 57% of users to the new automated process, resulting in a 700 bps increase in schedule efficiency and an overall satisfaction score of 4.15. This improvement led to an estimated $2M in annual labor planning savings in the first year.

Next Steps

After the initial release of AcuForecast, I continued working on enhancements alongside our user researcher. We conducted a follow-up survey with 126 users across all four personas, which revealed a 23% drop in overall satisfaction and continued low adoption: 5 users found the interface not user-friendly, and 14 reported information overload. Based on this feedback, we hypothesize that the decrease in satisfaction stems from the design straying from user-centered principles, creating unnecessary design bloat. To address this, our next steps focus on increasing adoption by simplifying the experience to better align with the core user journey.

What I Learned

This project taught me the balance between user needs and business constraints. While best practices are important, real-world limitations sometimes take precedence. Not every company is at the stage where it can push for the ideal user experience. As a designer, you have to adapt and work within constraints while still advocating for long-term usability and minimizing design debt as best you can.