Senior UX Researcher, Tech Mahindra
Client 1 – Microsoft
Project – User Testing & Optimization of the Volume Licensing Tool
Challenge:
The project involved a legacy tool used by employees to research and analyze data, providing insights into the volume of products and services Microsoft had sold, leased, and other related metrics. It also tracked customer agreements, renewals, and anniversaries. However, the tool was built on outdated code, which necessitated not only a significant update to the underlying codebase but also a complete design overhaul.
My primary contribution was in evaluative research, as the new version of the tool had already launched. I conducted user testing through interviews and screen-sharing sessions, which helped me identify pain points and understand why users preferred the older version of the tool over the new one. Additionally, I helped roll out surveys to identify discrepancies (if any) and validate data across different methodologies – user feedback, previous survey results, my interview sessions, and actual telemetry data.
Methods:
- User Interviews: Conducted via Microsoft Teams to understand user pain points and gather qualitative feedback on both the old and new tool versions.
- Data Synthesis: Utilized Microsoft Excel for organizing and synthesizing data collected from interviews and surveys.
- Analysis & Reporting: Used Microsoft PowerPoint to present System Usability Scale (SUS) results and thematic analysis of verbatim feedback (see the scoring sketch after this list).
- Telemetry Analysis: Leveraged Microsoft Clarity for in-depth UX performance assessments to analyze actual user behavior and tool usage patterns.
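The SUS results referenced under Analysis & Reporting follow the standard scoring formula: odd-numbered (positively worded) items contribute (rating − 1), even-numbered (negatively worded) items contribute (5 − rating), and the sum is multiplied by 2.5 to give a 0–100 score. The sketch below illustrates that calculation; the participant responses shown are hypothetical placeholders, not data from this study.

```python
# Standard SUS scoring sketch; the responses below are illustrative only,
# not actual study data. Each participant provides 10 Likert ratings (1-5).

def sus_score(responses):
    """Convert one participant's 10 SUS ratings into a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item ratings")
    total = 0
    for item, rating in enumerate(responses, start=1):
        # Odd items are positively worded: rating - 1.
        # Even items are negatively worded: 5 - rating.
        total += (rating - 1) if item % 2 == 1 else (5 - rating)
    return total * 2.5

# Averaging across hypothetical participants
participants = [
    [4, 2, 4, 1, 5, 2, 4, 2, 5, 1],
    [3, 3, 4, 2, 4, 2, 3, 3, 4, 2],
]
scores = [sus_score(p) for p in participants]
print(f"Mean SUS: {sum(scores) / len(scores):.1f}")  # ~68 is the commonly cited benchmark average
```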
Limitations:
- Limited Time Frame: I was recruited for only a six-month timeline, after which I transitioned to a new project. As a result, I was unable to see the long-term outcomes or ongoing results of the product post-launch.
Outcome:
Despite the time constraints, the research led to a measurable increase in user adoption. Although I only had three months of post-launch data, my contributions helped improve the usability and acceptance of the tool, with adoption rates rising noticeably within that period.
Takeaways:
- Adhering to Company Protocols: I learned that large organizations like Microsoft have strict protocols when it comes to research and design, and it’s crucial to align with these standards.
- Importance of Guidelines & Templates: The significance of standardized templates and structured reports, even for minor tasks like meeting minutes or shareouts, became clear.
- UX Evaluation Methods: Gained in-depth knowledge of System Usability Scale (SUS) and HAX Guidelines, as well as the Tenets and Traps method for evaluating usability.
- Research Impact: Executed over 20 usability testing sessions, benchmark studies, and surveys as part of both evaluative and generative research efforts for the VL Central project.
- Cross-Functional Collaboration: Advocated for the user experience throughout the product development lifecycle, ensuring user feedback was integrated into the design, leading to improved decision-making and higher adoption rates.
Client 2 – Principal
Project – Generative Research for Enhancing an Internal Software Developer's Portal
Challenge:
The organization had developed an internal Software Developer's Portal intended to streamline project management, product management, and software development tasks. This portal was designed to serve two primary user groups:
- Product Managers: They used the portal to track tasks, monitor the progress of bugs, view the severity of issues, and identify which tasks were assigned to developers for the products that they own.
- Software Developers: They used the portal to pick tasks they could solve, track the progress of their work, and report any issues that came up during the development process.
Despite having been live for over six months, there were significant concerns about whether the portal was effectively benefiting its users, especially since it combined the needs of both developers and product managers. The main challenge was to determine whether users were able to fully benefit from the portal's features, since tools like Jira were typically better suited for project managers (who manage overall developer projects and portfolios), and there was uncertainty about how well the portal catered to product managers.
My task as a UX researcher was to evaluate the effectiveness of this newly launched portal by identifying whether users were actively utilizing its features, whether it was improving their workflow, and whether it could be enhanced to better meet their needs.
Methods:
To understand how the users were interacting with the portal and where improvements could be made, I followed a structured approach:
- User Interviews: I conducted one-on-one interviews with both developers and product managers to gather qualitative insights into their experiences. Key areas of focus included:
- Ease of use
- Task assignment and tracking
- Accessibility of information (whether content was written at an easy, medium, or difficult reading level, and whether users noticed the instant messages occasionally delivered as nudges)
- Focus Group Studies: I organized focus group discussions with a small cross-section of developers and product managers. This allowed for a more collaborative environment to discuss shared pain points and ideas for improvement. It also helped in identifying the gaps between how users expected the portal to function and how it was currently performing.
- Synthesis and Analysis: The findings from the interviews and focus groups were synthesized to identify recurring themes and pain points. I used methods like Affinity Diagrams to group insights and highlight patterns that could guide design improvements (a small theme-tally sketch follows this list).
- Presentation of Findings: After gathering actionable insights, I prepared a comprehensive report and presentation for the stakeholders. This report included:
- Key issues users faced with the portal
- Areas of improvement
- Strategic recommendations for enhancing user experience
- Suggested changes to the UI/UX design to streamline the workflow for both developers and product managers
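To complement the Synthesis and Analysis step above, the sketch below shows one lightweight way to tally manually tagged interview notes into recurring themes once affinity mapping is done. The tags and quotes are hypothetical examples, not verbatim data from the study.

```python
# Tally coded interview notes by theme after manual affinity mapping.
# Tags and quotes here are hypothetical, not actual participant data.
from collections import defaultdict

notes = [
    ("navigation", "Couldn't find where my assigned tasks live"),
    ("task-priority", "No way to tell which bug is most severe"),
    ("navigation", "Too many clicks to reach the task board"),
    ("reporting", "Progress reports lack the detail I need"),
]

themes = defaultdict(list)
for tag, quote in notes:
    themes[tag].append(quote)

# Rank themes by how often they surfaced, to prioritize recommendations
for tag, quotes in sorted(themes.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(f"{tag}: {len(quotes)} mention(s)")
```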
Limitations:
- Many employees were relatively new to the portal, and they had not used it actively despite it being live for over six months. As a result, the feedback I received was not comprehensive, and the responses were sometimes based on limited usage or first impressions rather than in-depth, experienced feedback.
- Due to urgent business requirements, the portal was launched without thorough user testing (either pre- or post-launch). The initial design phase did not incorporate enough user feedback, leaving gaps in understanding what users really wanted and making it difficult to explain the tool's limited positive impact on day-to-day tasks.
- The project was on a tight timeline, and while I was able to gather valuable insights, I did not have the opportunity to conduct a full-scale usability test before making recommendations. This meant that some of the proposed solutions might need further validation through iterative testing.
Outcome:
Despite the limitations, I completed the user interviews and focus group studies with the target number of participants set by my manager, uncovering valuable insights into the portal's effectiveness and usability. I presented the findings to key stakeholders, including product managers, project managers, and developers.
Some key outcomes from the research include:
- It became clear that developers struggled with the portal’s usability, particularly in navigating through task categories and understanding task priorities. On the other hand, product managers had issues with the lack of granularity in task tracking and reporting.
- Based on the insights, I recommended redesigning certain features to enhance clarity, such as:
- Improving task categorization to make it easier for developers to find relevant tasks rather than repetitive ones.
- Adding real-time tracking features to show task progress more effectively.
- Incorporating more intuitive filtering options for developers and product managers to quickly sort tasks based on severity, status, and deadlines (see the sketch following this list).
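As a concrete illustration of the filtering recommendation above, the sketch below sorts a hypothetical task list by severity and deadline after an optional status filter. The field names and severity ordering are assumptions for illustration, not the portal's actual data model.

```python
# Illustrative task filtering/sorting; the task schema and severity ranking
# are assumptions, not the portal's real data model.
from datetime import date

SEVERITY_RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}

tasks = [
    {"id": 101, "severity": "high", "status": "open", "deadline": date(2025, 5, 10)},
    {"id": 102, "severity": "critical", "status": "in_progress", "deadline": date(2025, 5, 3)},
    {"id": 103, "severity": "low", "status": "open", "deadline": date(2025, 6, 1)},
]

def filter_and_sort(tasks, status=None):
    """Keep tasks matching the status filter, then order by severity, then deadline."""
    selected = [t for t in tasks if status is None or t["status"] == status]
    return sorted(selected, key=lambda t: (SEVERITY_RANK[t["severity"]], t["deadline"]))

for task in filter_and_sort(tasks, status="open"):
    print(task["id"], task["severity"], task["deadline"].isoformat())
```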
The findings also led to the development of a roadmap for future iterations, with a focus on usability enhancements and improving task visibility across teams.
My Takeaways:
- It’s crucial to involve users early on in the process, not just during the launch phase. While the portal had been live for several months, many users had not actively engaged with it, highlighting the need for continuous user feedback throughout the lifecycle of a product.
- Given the tight timeline, I learned how important it is to prioritize research despite time constraints. While the lack of user testing was a challenge, I understood that research and testing should ideally be part of every stage of development, especially for tools with multiple user personas.
- Both UI design and functional workflows need to be aligned with user expectations to ensure seamless adoption. The portal could have been more effective had usability testing been prioritized from the start.
Overall, this project reinforced the value of generative research in uncovering user pain points and crafting user-centric solutions that drive both engagement and satisfaction. The results of my research have laid the foundation for further improvements and a better user experience for all stakeholders involved.