UX Case study


March 2021 to August 2021


UX Research

Organize and conduct the user interviews. Analyze the data.

UX Design

Create and organize all deliverables.


  • Figma: Mid and high fidelity prototyping
  • Miro: Create mental model & affinity map
  • Balsamiq: Low fidelity prototyping
  • Draw.io: Flow charts and task analysis
  • Optimal Workshop: Card sorting
  • UsabilityHub: Preference testing
  • Affinity Designer: Give life to my user personas
  • Zeplin: Hand over to the development team


HALP was created during my UX course at CareerFoundry.

The brief asks for a web application that enables anyone, anywhere to instantly chat with an expert.

I know enough about IT to be the perfect go-to guy whenever you have a problem. But what happens when I am not around?

The problem

We all face difficulties with the devices we use every day, and it can be hard to find a solution in a timely and efficient manner.

While it can be easier for more tech-savvy people, it is a real struggle for those who are not.

How do we make their lives easier?

The solution

HALP would solve your daily IT problems by connecting you to experts. It adjusts to the user’s technical knowledge and offers them a way to take notes and to consult the content of their sessions with experts whenever they want.

My initial thoughts​

How do you create something adapted to all levels?

Does a service like this even exist already?

Can we make IT a bit more fun?

Illustration of someone drawing a maze shaped like a question mark

Design Thinking Process

Sounds serious, right? It can be, but for now, we will say that it is a cycle that brings clarity to complex problems. If you are curious about the topic, you can check this video from the Nielsen Norman Group.


Understand the market and the users through user interviews​



Establish what our users need and want through user personas & user journey



Bring possible solutions via the user flows



Give life to our ideas through prototyping



Check if we are aligned with our initial statement through testing.


The adventure begins by getting to know the existing market, or, for that matter, finding out whether that market even exists!

I identified Quora and JustAnswer as the closest answers to our current problem.

Quora brought an extensive database of questions and answers while JustAnswer was connecting customers to experts in their fields. 

Here is a snapshot of my findings from the SWOT analyses.

SWOT for JustAnswer


You are sure you will get a quality answer

The always-available chatbot can quickly answer questions related to specific subjects: health/law/animals


They advertise broad subject coverage but seem to focus on specific subjects like law, health, or animals.

Unclear price model

No video calls, just phone calls at best (they favor chat).

No app (website is fully responsive though)


A real application

Implementing video

More available languages

A better focus on technological subjects


Most of the other platforms are free, and JustAnswer’s price point is fairly high.

SWOT for Quora


Worldwide famous 

A strong community base


The economic model sometimes blurs the line between editorial and genuine content.

The ads can also be invasive.

There is so much to navigate through that it can be overwhelming.

There are a lot of questions, and tons of them have no answers.


A validation system for specific users to be tagged as experts

A clearer information architecture


Reddit is their main competitor in the field, and it offers a clearer view with its thread format and hierarchical structure.

I love competitive analysis!

What I learned

  • There is a real market gap for a solution that brings swift and complete answers to IT problems.
  • The questions are indeed available online but the answers are either not available or far away from the actual solution.
  • Providing 24/7 answers is a plus (but is it necessary?)
  • HALP can only achieve its goal by providing quality answers in a timely manner.

Well, that is nice, but when do we empathize?

While my initial hypothesis was based on market research and my own thinking, it was time to really understand my users’ needs and leverage two great tools: a user survey and user interviews.

The online survey had 10 questions and received 25 responses. I believed that gathering quantitative data would help me identify patterns, while the user interviews would bring me more qualitative answers.

I interviewed three people: two with little IT knowledge and one able to solve problems on their own. Of course, I made sure to include one older participant to get a picture of that population’s needs.

Illustration of a smiling man checking his watch

User survey

The goals
  • Discover whether age and technical proficiency impact how users approach their IT problems.
  • Find out if they would feel safe sharing access to their device to ensure proper troubleshooting.
  • Learn on which device they face most of their issues.
  • Understand how they are solving their current problems.
  • See if they value having 24/7 support.
Illustration of a woman checking boxes on a giant paper clip
Main takeaways

A young population

96% of my 25 respondents were below 50 years old, proving to me that I needed to dig deeper through interviews to obtain input from older people.

Fast & accurate

Almost all respondents asked for fast (84%) and accurate (68%) answers.

Young = tech savvy

A vast majority (88%) said that they knew their way around when it came to finding answers, confirming the idea that younger users were more at ease with IT difficulties.

Trusting the service

Respondents were split 50/50 on their confidence in sharing access to their device, and the same went for the availability of the service. Those points required further explanation.

Request for pedagogy

People asked repeatedly for pedagogy: language that can be easily understood.

Various tech platforms

They mostly owned a laptop (22 answers) and an iPhone (17), versus five people owning a desktop and eight an Android-based smartphone.

User interviews

The goals
  • Learn how my users are solving their current problems with their devices.
  • Know what they feel is lacking/is working well with the methods they are currently using.
  • Find if and how often there is a feeling of urgency when facing the problem.
  • Verify how they feel about sharing access to their device and why.
  • Establish what they hope to find within our app.
Illustration of a man and a woman discussing together
Making sense of the data

The first step was to identify patterns within my participants’ answers by organizing and combining them in an affinity map. A mental model then helped me confront my planned features with the real desires of our user base.

You can check below how I used both methods:

The affinity map and the mental model
Main takeaways

An adapted experience

Users want to deal with people who adapt to their technical level, and they evaluate their own level of knowledge accurately.

Understanding vs. taking control

Users want to understand how to achieve something. They do not want someone to take over and do it for them.

Rewards & Gamification

Users feel rewarded when they find the solution themselves or when they remember how to do something. We can invite them in with gamification and progression through tech levels.

Avoid repetition

Knowledgeable users can easily spend days on a problem, and they don’t want to face the frustration of repeating the same steps.

24/7 support

Older people are hoping to receive support 24/7 and would accept paying more for that level of service.

No technical jargon

Some users can feel overwhelmed by tech lingo. Beyond the experts’ effort to use adapted language, I will add a dictionary of technical terms.

Illustration of three martial arts belts (white, orange and black) in a circle

The belt system

One of our interviewees takes martial arts lessons, and while discussing with him, he made a comparison with the belt rank system. I took the idea to differentiate user levels, as I thought it was a well-known model.

OK, that was a lot of work, but how valuable it was!

Getting in the minds of your users can really shed the necessary light on your project and possibly identify any bias.

For now, we will bring this data to life because, let's face it, reading a spreadsheet is no fun.


I created user personas, which represent snapshots of our users. As mentioned, HALP aims at delivering solutions for people with various levels of IT knowledge, and the personas reflect exactly that. Again, they are not based on guesses but on the data gathered in the previous steps.

Please meet Alan, Philippe, and Meryl.


He is young and has no appetite for computers and the like. He knows he needs them and recognizes that there are a lot of things he needs to learn on the matter.

Icon of a White Belt


He can spend days searching for an answer, and that’s OK with him. Still, he knows when to rely on people he trusts to get a faster solution.

Icon of an Orange Belt


She is older and curious about IT, though she can also be afraid of it. She needs help with her daily tasks, which can be as simple as sending a photo to someone through email.

Icon of a White Belt

And how would our personas feel using HALP?

This is where journey maps come in handy. Through them, I could visualize opportunities for improvement.


The ideate phase helps us make sure that we don’t lose track of important aspects like navigation flow or content organization.

User flow

My goal here was to establish how different the experience should be for my white belt users versus my orange belt.

Meryl wants to scan her photos and store them on her computer.

Entry point

Home page with an existing account with credits

Success Criteria

My photos are on my computer

Philippe wants to obtain a quick solution, via chat, following an issue with his iPhone not syncing his documents.

Entry point

Home page with an account but no credits

Success Criteria

 I found my solution after receiving guidance

You see, easy! Well, it didn’t go that way; you will see when we reach the test phase!

Site map

We use the site map to get an overview of our application and how users are navigating through it. Again, visualization makes it clearer than a spreadsheet.

Card sorting

To validate the labels and organization of my content, I ran an unmoderated card sorting session with eight participants.
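Tools like Optimal Workshop compute the similarity matrix from a card sort automatically, but as a rough idea of what happens under the hood, here is a minimal sketch (with hypothetical card labels and made-up sorts, not the study’s actual data) of how such a matrix can be tallied:

```python
from itertools import combinations

# Hypothetical card labels and two participants' sorts (group name -> cards).
cards = ["Agenda", "Messages", "Dictionary"]
sorts = [
    {"Dashboard": ["Agenda", "Messages"], "Search": ["Dictionary"]},
    {"Dashboard": ["Agenda"], "Search": ["Dictionary", "Messages"]},
]

# similarity[(a, b)] counts how many participants grouped cards a and b together.
similarity = {tuple(sorted(p)): 0 for p in combinations(cards, 2)}

for sort in sorts:
    for group in sort.values():
        for pair in combinations(group, 2):
            similarity[tuple(sorted(pair))] += 1

# One of the two participants placed Agenda and Messages in the same group:
print(similarity[("Agenda", "Messages")])  # -> 1
```

High counts reveal card pairs that users mentally associate, which is what guided the site map refinements below.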

Picture of a few Post-its on a table with labels on them
Main takeaways

Positioning the Dictionary

My initial thought was to place the Dictionary in the Dashboard, but the standardization grid showed that people expected it next to the search area.

Content of the Dashboard

Users expect to see their agenda and messaging in their dashboard. They also want to have a view of their tokens and messages while being able to manage them from their profile page. 

Similarity matrix

Standardization grid

Refined map

Following the insights of the card sorting, I created the following site map version.

  • Move a view of this week’s agenda into the Dashboard, while the full agenda stays in My Profile.
  • Add the Dictionary within Search.
  • Add Notifications labels within the Dashboard.
  • Add “Evaluate My Expert” in My Profile (with notifications in the Dashboard).
  • Move Search to level 1.0; the Dashboard moves to 2.0 and My Profile to 3.0.
  • Move Conditions to About, since My Profile is very crowded.

We now have a better understanding of what the experience of browsing our application would be like.

But what will it look like?


Through our findings we established the following features:

  • An onboarding flow to introduce our audience to HALP
  • A way to establish their technical level
  • A search engine that returns a list of experts able to answer the user’s questions
  • A video session feature for users to easily exchange with experts
  • A way for users to give feedback following a session
  • Tools for creating, managing, and consulting their notes
Illustration showing the face of the Mascot : HALP

The mascot: HALP

Very early in the process, I left space to include a mascot, something that could bring a touch of joy to the cold-hearted world that is IT (I might be exaggerating here).

The prototyping phase started with low-fidelity wireframes based on the original task flows and the competitive offering. Mid-fidelity brought more detail to the content hierarchy, while high fidelity refined the UI.

The onboarding

The dashboard

The add-notes feature

Want to see the complete prototype at this stage?


The test phase ensures that what we created is indeed aligned with our users’ needs. It is also a way to uncover invisible pain points.

Usability Testing

I tested my prototype with six participants, a balanced mix of tech-savvy users and beginners. This brought me a total of 199 pieces of feedback, and it was my role to make sense of them by identifying patterns. The main comments are below.

The tech level assessment generates frustration and questioning.

Users got confused between My Notes and the expert’s notes.

Unclear economic model.

Users didn’t use the bottom navigation bar.

The original file where I organised the content

Following my users’ feedback, I updated my prototype; below are a few major changes.

My users didn’t want to choose their tech level; they were afraid of choosing the wrong level, leading to an unsuitable experience.

Updating the homepage allowed me to tackle multiple issues: 

  • being transparent about the economic model,
  • offering a way for the user to quickly learn more about the app or create an account,
  • telling the user that they don’t have to become an expert if they don’t want to (something that was expressed during the usability test).

Make it pop!

During preference testing, I applied a splash of purple to the background. The reaction was positive, and I decided to change the scheme to a purple/yellow mix.

First with small touches; then I realized that keeping the blue created an inconsistent interface, so I removed it.


And we are done, right? Not quite; actually, way later.


Because UX design requires multiple iterations, tests, and implementations to achieve its ultimate goal: delivering the best experience.

Here are some comments I received after asking four of my fellow designers:

Account creation

“Where am I? Am I still at account creation?”

On the token page

“Maybe I don’t want to buy credits right now”

“What can I do with one credit?”

Session’s page

“Is the chat my notes? Because it would be nice to be able to take my notes while chatting/video chatting”

This brought the following updates:

A button and a new page to get more information about the tokens.

An animation showing progress during the profile creation phase.

A “My notes” tab, separated from the chat window.

The resulting prototype

I proudly present you with the latest iteration of HALP. Enjoy browsing the prototype and feel free to add any comment through Figma: it helps me make better products!

What's next

Turning this into a real product would require a development team. To facilitate the project’s integration, I built a design system showing the main guidelines associated with HALP.

You can see the entire design system here

I also thought of possible improvements:

  • Reduce the text in the onboarding to make it a seamless experience
  • Include filters in the results view
  • Increase the presence of HALP across the app to develop the approachable aspect of our service

My takeaways

It is so rewarding and eye-opening to see how much the original design evolved. It proves, if proof were needed, the importance of collecting user feedback during the creation process.

There is a saying in sales: You don’t know what you don’t know.

I was tempted to make assumptions (like most people). The design thinking process and UX Design methodologies are the perfect tools to deliver aligned experiences. The cycle-based approach ensures that when your user evolves, you are evolving with them.

And looking back, I did evolve during this project!


I could write another thousand words, since I learned a lot. Here are the most important lessons:

  • Practice makes perfect. The more I iterated, the better this project became.
  • Use the tools correctly (and again, practising helps). Not doing so brings frustration and wastes time.
  • There is no quick fix: you have to sit down, reflect on your work, and ask for advice; only then can you progress.
  • In reality, when working on a UX project from scratch, very few elements are 100% certain.
Thank you for reading this far!

More work from me

Do you want to discuss your project?
