Microsoft AI Science Engine

The AI-powered research platform suffered from poor usability and unclear communication in its UI.

UX Scorecard, April 2022: a table of experience metrics with color-coded percentage scores.

Usability Audit 2022 - Failed

Key Problems

  • Overly technical language made interactions unclear.

  • Feedback was sparse and results were not presented intuitively, making it difficult for scientists to interpret the data.

    Impact:

    • Scientists struggled to effectively use AI-driven insights.

    • The platform failed usability audits, which negatively impacted adoption.

Process & Solution

Redesigning AI Interactions

  • Improved UI for key interactions:

    • Search experience: Simplified querying and input prompts.

    • AI feedback visibility: Made system responses more apparent and actionable.

    • Query time transparency: Provided users with better progress updates.

Laptop displaying a "no access" error screen; a close-up annotation on the right details the layout, including icon size and text specifications.

Design System & Usability Enhancements

  • Refined the Design System for consistency across features.

  • Improved accessibility, following Azure Design Systems guidance.

  • Worked closely with developers to ensure pixel-perfect execution.

Error Messaging & Empty States

  • Conducted an audit of error messages to improve clarity.

  • Collaborated with copywriters & illustrators to create empty states.

  • Ensured error handling aligned with user expectations & best practices.

Comparison of Figma page organization: the left shows a flat list of page names; the right shows reorganized pages with icons and relevant details.

Collaboration & Workflow Optimization

  • Worked cross-functionally with Scientists, PMs, Developers, and Researchers.

  • Led handoff improvements between designers & engineers to streamline implementation.

  • Mentored designers & supported onboarding efforts.

UX Scorecard, April 2023: a table of experience metrics with percentage scores.

Impact & Results

Audit Success

  • Before: 80% of user journeys failed usability tests.

  • After: 99% of user journeys passed in the final audit.

Improved Adoption & Efficiency

  • Modernized the UI to align with Microsoft’s standards.

  • Enhanced feedback mechanisms improved user trust in AI outputs.

  • Strengthened internal communication between designers, scientists, and engineers.

Lessons Learned

  • Designing for AI is complex; balancing transparency and usability is key.

  • Accessibility-first design ensures better adoption across all users.

  • Collaboration across disciplines (AI, science, UX) is crucial for success.