Aging Inventory Visualizations
Summary
The inventory management tool helps internal stakeholders monitor aging performance, access detailed reports, and perform part end-of-life functions. To modernize and improve the experience, we worked with our stakeholders and superusers and audited the legacy tool. This case study focuses on the project journey for the aging inventory data visualizations.
As the senior designer, I was responsible for creating a responsive design experience, collaborating with our design lead, art directing and mentoring our design intern, gathering stakeholder requirements, planning and facilitating workshops, user testing, grooming, tracking bugs, and user acceptance testing.
__________________________
Design Challenge
Problem: The data visualization experience was highly fragmented, requiring access to multiple locations and outside teams. Requirements varied across the functional roles, and within each area stakeholders had extensive requirements (particularly general inventory management (GIM), planning, and finance).
Goals: Provide general inventory managers with an easy, accurate tool for monitoring, reporting, and presenting critical inventory aging performance metrics across the Storage and Client & Server business units.

Target Audience

Our target audience included general inventory managers, finance, planning, and buyers.
Company: Dell (Inventory Management)
Product Team: Product Owner (1), Designers (3), and Developers (10)
Stakeholders: Inventory Managers & Finance
Partners: Design System, Accessibility, 3rd party vendor for charts
Process
1. Created and sent 2 surveys to superusers in the 5 functional roles across the 2 business units. The surveys shared example visualizations and collected feedback on which details were helpful to them. We also asked what other tools or applications they used for gathering data visualizations and metrics. Lastly, we invited users to share screenshots of any additional visualizations they felt would be helpful in their role. Once we had all the findings, we analyzed the data and created a workshop board in FigJam to share the findings and patterns.
2. Created rough concept sketches (Procreate)
3. Collaborative workshops:
   • Survey findings readout to gather insights and feedback
   • Defined and prioritized the most critical GIM and finance tasks; shared the 2 rough sketched concepts to gauge viability and gather feedback
4. Built out mockups with the Dell Digital components
5. Design reviews (sessions with internal design, accessibility, the core team, business partners, and superusers)
6. Usability testing 
7. Grooming, bug testing & tracking in Confluence
8. User Acceptance testing (UAT)
9. Fixed critical bugs & launched (Storage has launched; Client & Server is in development)
Testing
Methodologies: 2 surveys, 2 usability tests, and 1 user acceptance test (Storage).
Results and Findings: 
   • From the survey, we collected over 100 screenshots from the storage team and found clear patterns in the chart types, details, and filters needed. We also learned who relied on the aging visualizations and which details mattered most to them.
   • From the usability test, we learned that users wanted the business units pulled out into tabs at the top of the page (instead of hidden in the filters).
   • From the data table test, we learned that the sorting feature gets lost when hidden behind an overflow icon. We also learned that our superusers, who are heavy Microsoft Excel users, would like filtering to work like Excel pivot tables.
Deliverables
   • 21 charts (3 data visualization types)
   • Global filters and filters unique to each chart 
   • Dozens of hover states and popups
   • Accessibility states
   • Dynamic label changes with filters
   • Responsive breakpoints
Outcomes
Performance data is pending but will be measured by tool usage and overall engagement across the 5 functional roles.
Lessons Learned
During UAT we learned:
   • The storage team needed 2 additional filters that were not identified as critical functions during requirements gathering; these became blockers to team adoption.
   • Some users expressed confusion about the chart-specific filters versus the global filters. This suggests a need for training materials, onboarding, or possible rework.

