
How I created a more inclusive and user-friendly experience for a company that wants to implement voice capabilities in Jarvis

The project aims to save the company time and money while enhancing customer satisfaction.

Role

Interaction design, UX Design, Visual Design, Usability Testing

Team

UX researchers, product manager

Tools

Figma, FigJam, Adobe Illustrator, Adobe Premier Rush

Duration

1 month

Research

Who would use a voice-enabled chatbot?

Primary Users: 

  • People with visual impairments (for example, blurred vision, color blindness, or low vision)

  • People who are blind

  • People with dyslexia

  • People with mobility issues (for example, amputation, paralysis, etc.)

  • People in the 50+/60+ age group

Secondary Users: 

  • Anyone for whom asking a question is faster than typing it and reading through the results

  • People on the go

Opportunity

Cater to the left-out population and improve accessibility

Competitive analysis

A competitive analysis was done on similar companies in the finance and insurance industry.

Due to privacy concerns, the names of the companies have been hidden, but the analysis of each is listed below.

Analysis:

  • Voice chatbot is available on the mobile app only, not on the website.

  • Offers only voice recognition, with no text-to-speech for accessibility.

  • Can be accessed through the website.

  • Buttons are not accessible with screen readers.

  • Users must click the voice icon again and again to speak.

User interview testing

Tested a mock chatbot with voice capabilities on two users who use screen magnifiers and screen readers in their day-to-day lives.


Pre-test questionnaire:

  • Do you use any assistive devices, like screen readers or magnifiers, in your daily life?

  • Have you ever used a voice assistant?

  • What kinds of tasks do you use it for?

  • Do you feel it fulfills what you're aiming to do? Why or why not?

 

Task: Reset password using voice chatbot
 

Post-test questionnaire:

  • What are your thoughts after using the voice chatbot?

  • Would you use it if you had the option? Why or why not?

  • What did you like about the voice bot? What did you not like, and why?

Empathy mapping

Based on the usability testing, an empathy map was created to capture what the user says, does, thinks, and feels, helping us understand the problem from the user's point of view and identify pain points and opportunities.


Usability testing analysis

Pain points identified after usability testing:

  • Chats can't be reached using the 'Tab' key

  • Unlabelled buttons

  • The browser's microphone-permission pop-up interrupts the interaction

  • Users must scroll through the whole chat to reach the most recent message

  • The chatbot and voice input are not easily accessible

  • No feedback on when to start or stop speaking, or on when the bot has finished speaking
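The last two pain points are essentially turn-taking problems. As a hypothetical sketch (not the actual project's implementation; state and cue names are illustrative), the chatbot's listening state can be modeled as a tiny state machine that emits a distinct audio cue on every transition, so a user who cannot see the screen always knows whose turn it is:

```typescript
// Hypothetical sketch: a minimal turn-taking state machine for a voice
// chatbot. Each transition returns the earcon/announcement to play, so
// the user gets audio feedback on when to start and stop speaking.

type BotState = "idle" | "listening" | "processing" | "speaking";

const transitions: Record<BotState, { next: BotState; cue: string }> = {
  idle: { next: "listening", cue: "chime-start" },      // mic opened: play start chime
  listening: { next: "processing", cue: "chime-stop" }, // user finished: play stop chime
  processing: { next: "speaking", cue: "bot-voice" },   // bot replies via text-to-speech
  speaking: { next: "idle", cue: "chime-ready" },       // bot done: user may speak again
};

// Advance one step and report which cue to play for this transition.
function step(state: BotState): { next: BotState; cue: string } {
  return transitions[state];
}
```

Stepping from "idle" yields "listening" along with the start chime, which is exactly the feedback users said was missing.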

Opportunities

Opportunities identified for improvement:

  • Make the voice chatbot speak back (text-to-speech)

  • Include button labels everywhere

  • Allow users to easily navigate through chat history

  • Request microphone permission up front, rather than letting the browser pop-up interrupt the conversation

  • Play sounds when the chatbot starts or stops listening

  • Distinguish between what the bot has said and what the user has said
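Several of these opportunities come down to labelling. As a minimal sketch of how they could translate into front-end markup (the function names and attribute values here are illustrative, not the shipped design), each chat message can carry an aria-label naming the speaker, and the mic button gets an explicit label so screen readers no longer announce it as an unnamed control:

```typescript
// Sketch: build screen-reader-friendly markup for a chat transcript.
// Names and structure are illustrative assumptions, not the real product.

type Sender = "bot" | "user";

interface ChatMessage {
  sender: Sender;
  text: string;
}

// Render one message as an accessible list item. The aria-label prefixes
// the text with the speaker, so "Jarvis said ..." vs "You said ..." is
// announced even when visual styling is the only other cue.
function renderMessage(msg: ChatMessage): string {
  const speaker = msg.sender === "bot" ? "Jarvis said" : "You said";
  return `<li aria-label="${speaker}: ${msg.text}">${msg.text}</li>`;
}

// Render the transcript inside a log region. aria-live="polite" makes
// screen readers announce new bot replies without stealing focus, and
// the labelled mic button addresses the "unlabelled buttons" pain point.
function renderTranscript(messages: ChatMessage[]): string {
  const items = messages.map(renderMessage).join("");
  return (
    `<ul role="log" aria-live="polite" aria-label="Chat history">${items}</ul>` +
    `<button type="button" aria-label="Start or stop voice input">🎤</button>`
  );
}
```

With this shape, a screen reader announces "Jarvis said" before each bot turn and "You said" before each user turn, which directly addresses the last opportunity above.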

Solution

I compiled the hi-fi wireframes into a video to make the solution easier to understand.
Due to NDA constraints, the final solution has been hidden; please contact me to view it.

Outcomes

Reflection and learning

My project on designing a voice-enabled chatbot pushed me to think from an accessibility perspective and design for people with disabilities. This internship also allowed me to create accessible experiences, which is something I had been looking forward to learning. Working at Nationwide has been a positive experience, and the skills I gained have definitely helped me grow professionally.

Want to know more?
