Driving hardware redesign with research
How would you bring research impact to an organization with low UX maturity?
At a global logistics company, I was brought in to improve the user experience of a new warehouse workstation. While my scope was focused on informing UI improvements, early research showed deeper problems. The station's hardware design was causing frustration and delays. By building relationships and expanding the scope of my research, I helped uncover core usability issues that led to a larger redesign effort.
Overview
Role: UX Researcher
Timeline: 3 months
Tools used: Zoom, Slack, FigJam, internal client software
Methods: Desk research, contextual inquiry, usability testing, and user interviews
Internal Team: 1 UX Researcher, 2 UX Designers
Client Stakeholders: Junior UX Designer, Hardware Team
Note: This project is under an NDA. I can’t disclose the name of the client that I worked with or show deliverables and screens that tie back to the client in any way. The images used here are purely for illustrative purposes.
Problem Statement
The new warehouse processing station was visually impressive and featured advanced technology, but it lacked proper user feedback and learnability, leading to delays and inefficiencies.
Project Goals
Improve station processing time and reduce station downtime by:
Assessing the effectiveness of station feedback mechanisms
Identifying pain points for workers and engineers
Informing station UI improvements

The Context
Our client had recently launched a new processing station designed to speed up package handling in warehouses. The station had advanced capabilities, but it wasn’t working well in practice. Station workers weren’t hitting their success metrics, and packages weren’t being processed fast enough. The client’s UX team assumed that either the station’s UI display was the root cause of the problem or that station workers simply weren’t processing their packages efficiently. I stepped in to understand what was going on with the station and why workers weren’t performing well.
Understanding station scanning behaviour
The client’s UX team first wanted to understand what kinds of scan behaviours “expert” station workers exhibit in their day-to-day work. They believed that expert users had a more efficient flow that could be taught to novice workers.
I began by understanding how the station works and how it provides feedback to workers. Then I travelled to a warehouse site to run a contextual inquiry study. What I learned was that more station experience did not translate into more efficient workflows: expert and novice scanning behaviour was nearly identical. However, other issues surfaced around camera scan delays and ineffective sound and light feedback. These findings suggested that the core issues weren’t about user behaviour. The station itself needed to be redesigned.
The contextual inquiry study involved observing workers in action, asking follow-up questions, and capturing scanning behaviours using a side-mounted camera. I later analyzed the camera footage to see how packages were moved, flipped, and rotated during each scan by expert users versus novice users.
To understand the different scanning behaviours in detail, I had to observe station workers in person and ask follow-up questions. Interviews alone were not feasible and would not have provided the right feedback. Camera recordings were needed to quantify the different behaviours more accurately and to capture important moments that would be easy to miss.
Scan delays from poor camera performance slowed down workflows, forcing workers to pause for a second or two on every delayed scan.
Sound feedback was ineffective due to the loud warehouse environment and competing sounds from other stations.
Station lights were often blocked by large packages, which made it hard for workers to know whether their scans were successful.
Limited stakeholder buy-in
After presenting my findings, I learned that the client’s UX team had very little control over the physical design of the station; they were limited to making minor improvements to the station UI screen. The hardware team had the final say over the hardware design, and they weren’t convinced that any larger changes needed to be made. They had just launched the new station and didn’t want to revisit the design so soon after. This was frustrating, as our research had uncovered larger usability issues with the station’s hardware design. We had to find a way to address them.
Building rapport and answering the team’s biggest questions
I realized that I needed to foster a good working relationship with the hardware team and build their trust in my work. This would allow me to propose more meaningful product changes that would have a larger positive impact on the station’s UX.
Actions:
Took the time to understand the hardware team’s business goals, which focused on improving processing efficiency and reducing station downtime.
Positioned myself as a partner to the hardware team by joining their meetings and actively addressing their most pressing questions around the station through follow-up research.
Testing a new station flow with engineers
The hardware team was interested in testing a new camera validation flow within the station. They wanted to understand the level of expertise required to complete the flow and the pain points engineers experienced along the way.
I conducted usability tests with five engineers to better understand the feature’s ease of use and learnability. Follow-up interviews were conducted to understand general station usage and pain points. I learned that the flow was difficult to complete, engineers did not fully understand the station, and many had to escalate various issues to other teams.
The station’s hardware design, and the lack of training around it, caused further delays and station downtime whenever cameras needed to be tested. Some issues reportedly took hours to resolve, which meant hundreds of unscanned packages.
A usability test was the most appropriate method here, as I wanted to evaluate task success, completion time, and general ease of use. The tests allowed me to observe pauses and moments of confusion, and follow-up questions helped me understand pain points both within the test tasks and in station maintenance as a whole.
None of the engineers could complete the validation flow due to the complex placement of certain hardware components and occasional camera issues.
Most engineers did not fully understand how the station worked, as no training was made available to them around troubleshooting.
Many station issues had to be escalated to other teams because they were too advanced to resolve, which caused longer station downtimes.
Presenting findings to inform a large design effort
It was clear from the two rounds of research that various hardware UX problems were frustrating users while also conflicting with the hardware team’s goals. The station feedback issues, along with poor station camera performance, caused processing-time delays. The lack of station maintenance training and debugging capabilities meant that many simple issues had to be escalated to a remote team, causing significant station downtime.
Based on these findings, I presented the hardware team with the following recommendations:
Improve Feedback Systems
Move physical lights above the scanning area for better visibility.
Reserve sound notifications for infrequent but critical feedback (e.g., scan errors).
Enhance Scanning Technology
Upgrade cameras to reduce scan delays.
Replace some hardware components with a simpler lightweight tool for engineers to verify camera functionality.
Support Engineers
Develop hands-on training to help engineers troubleshoot effectively.
Update station UI for better guidance and clarity during troubleshooting.
New working relationship and redesign effort
By partnering with the hardware team and tackling their most pressing questions, I was able to build rapport with them. I kept the team in the loop about research findings throughout the project while informing smaller design changes on the station UI. The team knew I was there to help them, and this in turn built trust around my team’s work and recommendations.
My final report and presentation helped inform a larger hardware redesign effort for 2025. The hardware team is now focusing their efforts on improving the station’s feedback mechanisms and camera performance. They’re also creating a training module for engineers to help reduce maintenance time.
The new hardware redesign effort will help improve station processing time and reduce downtime, which means more packages will be delivered to customers faster.
Reflections
This project taught me the importance of relationship-building with stakeholders, especially in organizations with low UX maturity. It was important for me to balance immediate client needs with long-term station improvements.
By spending the time to get to know the hardware team and deliver small wins for them, I was able to build trust with them and have more influence over larger design and development efforts.