As a Fortune 50 company, The Home Depot (THD) relies on many people, technologies, and processes to reach sales of over $100 billion. One group that helps enable THD's success is its Data Scientists.
Data Science at THD:
At a high level, Data Scientists are responsible for gathering data, building models, executing tests, and making recommendations. Within the Space Optimization Solutions domain, data science was used to make recommendations so the Space Optimization Team could maximize revenue per square foot in The Home Depot stores.
Role:
In this project, I conducted research to understand the processes and ecosystem of Space Optimization and its Data Scientists. From that research, I crafted a solution that provided improved functionality, efficiency, data entry capabilities, and data visualization.
Team:
1 PM, 2 Designers, 4 Engineers
Disclaimer:
Due to proprietary reasons with The Home Depot, I'm limited in what I can display publicly.
{ Additional commentary in red }
RESEARCH
During the initial kick-off meetings, the Data Science team and a few stakeholders wanted to find out if the technology organization could assist them in improving enterprise adoption of data science. They were also curious to know if streamlining the data process would enable Data Scientists to complete more projects.
Initial Research Methods:
In order to understand who requested data and why, how Data Scientists worked, and the entire process from beginning to end, we initiated a Generative Research approach using the following methods:
- Field Studies
- Interviews
- Surveys
- Analytics (captured from the software the scientists were currently using, to better understand their metrics and the software's effectiveness)
Key Findings:

- Departments requesting data, as well as the scientists, complained about how long the process took to complete.
- Scientists were not able to run concurrent projects, which limited the number of projects they could complete.
- The software the scientists used was housed on individual laptops, which made collaboration and transparency difficult.
- The current software caused frustration for newer scientists.
- Departments requesting data wanted to understand the value of the recommendations presented by the Data Science team.
Three areas stood out from the research.
1. Who:
Data Scientists - Two types (Model Builders and Business Users)
The main users, heavily involved from beginning to end, were the Data Scientists. They handled requests, built models, managed data tools, and provided recommendations. The Data Scientists were the end users, not the business departments requesting data.
Model Builder

Business User

2. Project Process:
Data Science Process
There were several steps in the process the scientists had to follow. Because of the amount of data involved, projects would often break down in the middle of the process, forcing the scientists to start over. A single project could take hours or days to complete. Improving the process and reducing project completion time would provide clear value to the end users.
3. Scalability:
Adaptable to changing needs or demands
One reason the process from beginning to end took so long to complete was because projects with heavy amounts of data were stored on local laptops. Removing the process from individual laptops would help users improve concurrency, speed, collaboration, and security.

Only the Inputs and Outputs are in the Google Cloud Platform (GCP)
{Example of how the Data Science process worked prior to any changes}
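To make the "before" state concrete, here is a minimal sketch of that pattern in Python. It is illustrative only: the actual THD tooling is proprietary, so the function, file, and column names below are assumptions, not the real pipeline.

```python
# Illustrative sketch of the pre-GCP workflow (hypothetical names).
# Only the inputs and outputs touched GCP -- all processing ran on a
# single laptop.
import pandas as pd

def run_space_project(input_csv: str, output_csv: str) -> None:
    # 1. Inputs: data pulled down from GCP onto the scientist's laptop.
    df = pd.read_csv(input_csv)

    # 2. Processing: runs locally, bounded by one machine's memory and
    #    CPU. Large projects took hours or days, and a mid-run failure
    #    meant starting over from scratch.
    summary = (
        df.groupby("department")["revenue_per_sq_ft"]
          .mean()
          .reset_index()
    )

    # 3. Outputs: only the finished results went back up to GCP.
    summary.to_csv(output_csv, index=False)
```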
The insights and focus areas from the research helped define our problem statement:
Data Scientists have many projects and numerous requests. The current tool they use is hard to work with, takes hours to run, and often breaks down, which keeps them from completing their work quickly and taking on more projects.
SKEPTICISM
We shared what we learned from the research with the Data Science team and our stakeholders.
They weren't 100% convinced that project time could be significantly reduced, that scalability was achievable, or that the user experience could be improved.

Their doubts caused us to try a different approach. Since our audience's work environment was practical and scientific, we shifted how we explained the research and our process for tackling the problem. We used language that resonated with them: Research, Iteration, and Testing.
Our Explanation:
We don't rely on hypotheses or assumptions. We use proper research methodologies that allow us to learn from people, so we can identify root causes and actual problems.
Once the right problems are identified through research, potential solutions become clearer and can be tested through prototypes with users. We take that feedback, learn, and iterate so we can provide value by building what users actually need.
Rephrasing our communication in the language of our audience opened up a productive dialogue and built trust.
Opportunity:
During our dialogue, we shared how cloud capabilities could help with scalability and speed. We convinced them to give us one of their current projects so we could prove whether moving to the cloud would reduce their process time. We had until the next meeting to show that our idea would work.
CONTINUED RESEARCH

While the engineers worked on moving the test project to the cloud, our design team continued to shadow, chat with, and observe different Data Scientists.
We continued learning their work patterns, habits, and tasks. We dissected each step in their process and learned how their current software impacted their work.
Moving architecture to the Cloud
We Learned:
- The current application contained several usability issues, and interactions between elements were not intuitive.
- There was no way for scientists to see each other's projects.
{This negatively impacted team collaboration and succession planning}
- The current application had accessibility challenges.
- Users were comfortable with functionality in other software (Excel) that integrated with the current application.
{Crafting a solution with interactions similar to their current tools would help with adoption}
BUY-IN
The time had come for us to prove whether our idea of moving to the cloud would speed up their process.

Success!!!
We proved that migrating the process to the cloud decreased run time by 96% (for example, a run that once took 24 hours could now finish in under an hour).
This was a game changer.
Reducing the time to complete a project removed a big frustration for our users.
Stakeholders and users were delighted with the results and the possibilities they opened up. They were eager to see how the interface experience would work in tandem with the backend infrastructure.

Inputs, Outputs, and Processing are in the GCP
{Example of how the Data Science process worked after GCP changes}
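For contrast, here is a hedged sketch of the same work after the migration, assuming BigQuery as the GCP processing engine (the case study names GCP but not specific services, and the dataset and table names below are hypothetical):

```python
# Illustrative sketch of the post-GCP workflow (hypothetical names;
# BigQuery assumed). Inputs, processing, and outputs all stay in GCP.
from google.cloud import bigquery

def run_space_project_in_cloud() -> None:
    client = bigquery.Client()

    # Processing now runs on GCP's distributed engine instead of a
    # single laptop, so jobs finish far faster, can run concurrently,
    # and results are visible to the whole team.
    sql = """
        CREATE OR REPLACE TABLE analytics.space_opt_summary AS
        SELECT department,
               AVG(revenue_per_sq_ft) AS avg_revenue_per_sq_ft
        FROM analytics.space_opt_inputs
        GROUP BY department
    """
    client.query(sql).result()  # waits for the job to complete
```

Because the processing step itself now lives in GCP, the concurrency, speed, collaboration, and security gains described earlier come from the platform rather than from any single machine.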
DESIGN
We learned from research that the solution needed to quickly and easily guide the user, be intuitive, offer functionality familiar from current software like Excel, and provide data visualization that could be updated in real time.
I sketched out several ideas on paper and conducted a design studio with other designers to gather more ideas.
I took the feedback, made adjustments as needed, and started working on mid-fidelity mockups to prepare for prototyping. The mid-fidelity prototypes were tested with users to gather more feedback.


DELIVERY
After testing a few iterations with users, we delivered a re-platformed internal workflow application for Data Scientists with enhanced speed, consistent performance, improved usability and visualization, and built-in scalability.