[Case 04]

Improved platform recommendations led to solutions for 80% of critical pain points

Public Safety / Telecommunications

UX Research on Motorola’s WAVE PTX Platform

Uncovering and solving usability challenges to improve partner workflows on a mission-critical communications system.

[Project Overview]

At Motorola Solutions, I joined as a UX Research Intern to improve the usability of the WAVE PTX platform — a key tool for managing and renting Motorola’s two-way radios.

The platform serves a diverse set of users:

  • Motorola employees who oversee operations

  • Partners and agents who manage rentals and logistics

  • End customers who rely on the service to power real-time communication

While the tool was functionally powerful, it was struggling to meet usability expectations — especially for partner-facing workflows. Through qualitative research, job shadowing, and competitive audits, we uncovered key friction points and translated them into actionable UX recommendations. This project wasn’t just about improving usability — it was about aligning the platform with the day-to-day realities of its users.

[The Research Process]

Our research process began with a competitive analysis of enterprise-level platforms such as Microsoft Azure and Amazon AWS, to understand common UX patterns and gaps in complex technical ecosystems. I led this analysis and was also responsible for note-taking and synthesizing insights during stakeholder interviews with Motorola teams across Europe and North America. These interviews focused on internal Motorola employees who interact directly with the WAVE PTX platform. I organized all findings into a comprehensive, color-coded insights spreadsheet, highlighting pain points by category. From this, I developed the Top 5 Pain Points & Recommendations reports, which were used to guide next-step improvements across the platform.

01 Competitive Analysis & Strategic Insights

Understanding Industry Standards
To ground our research in industry best practices, I conducted a competitive analysis of enterprise platforms like AWS and Microsoft Azure. Since internal product workflows were not publicly documented, I explored support forums, UX case studies, and customer reviews to understand how these platforms served complex B2B ecosystems — particularly around billing, licensing, and account management.

Where Motorola’s Platform Fell Short
Motorola’s system lacked the clarity these competitors offered, making it difficult for reseller partners to manage licensing and financials efficiently.

02 Efficiency Analysis Through Role Play

To better understand how different users navigated the WAVE PTX platform, I conducted role-play testing with simulated scenarios. We mapped out how Motorola employees, partners/agents, and end-users completed key tasks, tracking the number of steps or clicks required for each interaction.
⚠️ Issue: Ordering a single license took 18 separate steps — a time-consuming process that often overwhelmed users and triggered support tickets.
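
For illustration only, here is a minimal sketch of how step counts from role-play runs like these could be tallied and flagged. The roles, task names, and numbers (apart from the 18-step license order noted above) are hypothetical placeholders, not Motorola data.

# Hypothetical sketch: tally observed step counts per role and task,
# then flag tasks whose average exceeds a chosen threshold.
from collections import defaultdict

# (user role, task) -> list of observed step counts across role-play runs
observations = defaultdict(list)

def log_run(role, task, steps):
    """Record how many steps/clicks one simulated run of a task required."""
    observations[(role, task)].append(steps)

# Example runs from simulated scenarios (placeholder values)
log_run("partner", "order_single_license", 18)
log_run("partner", "order_single_license", 17)
log_run("employee", "create_customer_account", 11)

THRESHOLD = 10  # illustrative cutoff, not a real study parameter
for (role, task), steps in sorted(observations.items()):
    avg = sum(steps) / len(steps)
    flag = "REVIEW" if avg > THRESHOLD else "ok"
    print(f"{role:10s} {task:25s} avg {avg:4.1f} steps  {flag}")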

03 Mapping Insights to Drive Impact

To capture a global perspective, we conducted interviews and job-shadowing sessions across North America, Europe, and India. I played a key role in these sessions—leading interviews, taking structured notes, and observing how Motorola employees and partners interacted with the platform in real time.

Refining the Data for Action
Once the initial insights were captured, I segmented the data across six major stages of the end-to-end user journey. This thematic breakdown helped us:

  • Identify patterns across user roles

  • Prioritize bottlenecks

  • Highlight areas where small changes could create maximum usability impact
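
As a rough illustration of this kind of segmentation, the sketch below groups tagged insight records by journey stage. The stage names, example insights, and severity values are hypothetical and do not reflect the actual (NDA-protected) findings.

# Hypothetical sketch: group tagged insights by journey stage.
from collections import defaultdict

JOURNEY_STAGES = [
    "account_setup", "licensing", "ordering",
    "deployment", "billing", "support",
]  # placeholder stage names

insights = [
    {"role": "partner",  "stage": "licensing", "note": "unclear license tiers",      "severity": 3},
    {"role": "employee", "stage": "billing",   "note": "duplicate invoice entry",    "severity": 2},
    {"role": "partner",  "stage": "ordering",  "note": "too many order steps",       "severity": 3},
]

by_stage = defaultdict(list)
for item in insights:
    by_stage[item["stage"]].append(item)

# Print each stage with its insights, highest severity first
for stage in JOURNEY_STAGES:
    entries = by_stage.get(stage, [])
    print(f"{stage}: {len(entries)} insight(s)")
    for entry in sorted(entries, key=lambda e: -e["severity"]):
        print(f"  severity {entry['severity']}: {entry['note']} ({entry['role']})")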

Prioritizing for Maximum Impact
To ensure effective problem-solving, we applied severity ratings and color-coded priority levels to highlight the most urgent usability issues. This structured approach allowed us to:

  • Focus on high-impact areas first, ensuring measurable improvements

  • Streamline platform usability, reducing friction in key workflows

  • Deliver data-driven recommendations that enhanced overall efficiency
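
The snippet below sketches one common way such a scheme can work, scoring each issue by severity times frequency and mapping the score to a color band. The issues, scales, and thresholds shown are illustrative assumptions, not the actual ratings used in the study.

# Hypothetical prioritization sketch: severity and frequency are rated 1-3.
pain_points = [
    {"issue": "18-step license ordering",     "severity": 3, "frequency": 3},
    {"issue": "ambiguous account statuses",   "severity": 2, "frequency": 3},
    {"issue": "manual invoice reconciliation", "severity": 3, "frequency": 1},
]

def priority(p):
    # Simple severity x frequency score
    return p["severity"] * p["frequency"]

def color_band(score):
    # Illustrative thresholds for color-coding priority levels
    if score >= 7:
        return "red (critical)"
    if score >= 4:
        return "amber (high)"
    return "green (moderate)"

for p in sorted(pain_points, key=priority, reverse=True):
    score = priority(p)
    print(f"{p['issue']:30s} score {score}  -> {color_band(score)}")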

[Industry]

Public Safety / Telecommunications

[My Role]

UX Research Intern

[Platforms]

Desktop

[Timeline]

May 2023 - August 2023

[User Research]

Conducted 25+ user interviews and job shadowing sessions with Motorola Solutions (MSI) employees and reseller partners.

Ran competitive analysis of enterprise platforms like AWS and Azure.

Organized data using Excel for insight tracking and Microsoft PowerPoint for synthesis.

[Insights]

Due to NDA, specific pain points and solutions are not publicly shareable. However, the research led to actionable recommendations delivered to senior leadership.

Categorized pain points across user groups and workflows.

Identified system inefficiencies in billing, account setup, and license management.

[Steps Taken]

01 Competitive Analysis & Strategic Insights

02 Efficiency Analysis Through Role Play

03 Mapping Insights to Drive Impact

[Feedback]

No formal usability testing was conducted, given the scope and phase of the project.

Feedback was synthesized through transcript review and cross-team discussion.

Field observations at partner sites gave deeper visibility into live usage and friction points.

[Impact]

Organized 25+ interview transcripts into a structured Excel system, enabling cross-team prioritization and strategic design decisions.

Presented categorized, actionable solutions (Top 5 pain points across WAVE, Platform, and Support) to senior stakeholders, directly influencing future design direction.

Revealed that core licensing tasks required up to 18 steps, driving the need for major simplification and rework across the partner hub.

[Key Learning]

Enterprise research demands structure.

Documenting insights with clear labels, dates, and severity helped manage complexity and empowered decision-making.

Small process flaws cause big operational friction.

Tracking click paths and job-shadowing exposed pain points that weren’t visible in interviews—like task duplication or vague account flows.

You don’t need pixels to make impact.

Even without redesigns, the right research can unlock massive improvements—if insights are packaged and shared effectively.
