
User Research & Testing | March – April, 2023

A design evaluation of Purposely's volunteer web platform aimed to identify pain points and opportunities within the user journey, and areas for UI improvement. As a UX researcher and UI designer, I conducted usability tests, created prototypes, and analyzed research. The project was completed in 4 weeks.

By Audrey, Catelyn, Eliza, Kaitlyn & Sina

1. Purposely's Mission

In our initial stakeholder meeting with Purposely, the founder and design lead shared concerns around the usability of two features: the newly released Request an Opportunity form and the Impact Dashboard.

By improving these features, Purposely seeks to increase engagement on the platform, thus encouraging more volunteering.

2. Scope

For this study, we evaluated two features of Purposely’s platform:

Request an Opportunity

The Request an Opportunity feature helps teams find the perfect opportunity through a simple form submission.

Research Goals

Learnability

Understand how culture leads move through the current Request an Opportunity experience from discovery to submitting a request.

Utility

Better understand a culture lead's goals and needs when submitting a volunteer opportunity request.

Research Questions

  • How likely are culture leads to use the Request an Opportunity feature?

  • Does the current Request an Opportunity experience satisfy the needs and goals of a culture lead?

  • What part of the Request an Opportunity experience is challenging for a culture lead?

Impact Dashboard

The Impact Dashboard shows users the cumulative number of sign-ups, activities, and total hours volunteered.

Research Goals

Satisfaction

Better understand what culture leads value about the current Impact Dashboard.

Utility

Understand how culture leads interact with the data on the Impact Dashboard.

Research Questions

  • How do culture leads feel when seeing the Impact Dashboard?

  • What is the value of the Impact Dashboard for culture leads?

3. Methodology

Participants

Participant Overview

  • 4 Participants

  • Recruited from Purposely

  • Culture Leads

  • Available within a short time frame (only 2 weeks to recruit)

Evaluation Methods

Round 1

  1. Think-aloud usability test of Purposely's new Request an Opportunity feature

     Tasks: assess the request banner; fill out the request form

  2. Conceptual model extraction of the Impact Dashboard

     Tasks: observe; questionnaire

  3. Pre/post-study questionnaire

Round 2

  1. Think-aloud usability test of the created prototype

     Tasks: scenario; complete task; questionnaire

  2. Pre/post-study questionnaire

Prototypes

Prototype Overview

For the second round of the study, my team and I prototyped 3 variations of the Request an Opportunity feature as a way to test participant responses to our design explorations and to reveal possible solutions for improving Purposely's feature.

1. Input Fields: Input field format with questions on a single page

2. Multi-page Form: Multi-page form with questions separated across pages

3. Selection Tags: Input fields with Selection Tags to list volunteer interests

Data Analysis

Affinity Mapping

To analyze the qualitative data gathered from the questionnaires, tasks, observations, and think-aloud activities of round one, our team created an affinity map to group similar themes, such as pain points and impressions, for easy analysis. Affinity mapping was our chosen method of analysis as it combined qualitative data from multiple participants and helped identify important ideas and priorities to focus on.

Round 2

By observing the common pain points and user experiences, my team was able to create prototypes for the second round of the study.

 

Participants' reactions and the qualitative data gathered from the prototype testing in the second round of the study were then analyzed through further affinity mapping. This helped our team make substantial recommendations that accurately reflected the needs of Purposely's user groups.

Prototype: Selection Tags

Feedback on the Request an Opportunity format

  • Participants valued that the key information Purposely needs is clearly indicated

Real-time changes

  • Participants valued seeing their numbers change

  • Company leads are excited to see their volunteer stats go up over time, and become concerned when they are low


4. Results

Request an Opportunity

Findings

1. Intended audience is unclear.

When viewing the banner, participants assumed it was mainly for remote employees outside of British Columbia, leaving the intended audience unclear.

“This seems like something good for our remote team members”

“So I thought that it was eye-catching in the sense that . . . I think [it would be] for our remote team members”

2. Format is too open-ended.

Generally, participants found the form straightforward, but they suggested changing it from a single text box to a series of guiding questions to help reduce the time and effort required.

“If the details section was broken apart into key questions, it might not be as overwhelming for the person typing it.”

“I think information could definitely be missed in this [current] format.”

3. Unclear next steps.

After submitting a request, the user is presented with a confirmation banner that some participants missed. Those who did see the message wanted to know when they could expect to hear back and what their next steps would be.

"After I fill in the form and click submit, I would expect some sort of confirmation saying, ‘Hey, we got your request and these are the next steps.’"

"What I would like to see is when I would hear back. I’m fine with it taking a while, but I would like to know what the timeline is going to look like."

Recommendations

1. Alter questionnaire format.

For a more guided experience, change the single text box to a series of structured questions (but let answers be flexible).

“I don't have to worry about how to write it out and make sure I hit those key points.”

“I like how it asked me for key information without me having to think about it.”

2. Add suggestions.

Support users in finding volunteer opportunities they would not have otherwise considered.

"It’s a lot to take in, but at the same time, I like that you can select multiple [categories]. This opens your mind to more things."

"Categories to choose from are really great."

3. Include confirmation page.

After the user submits a request, confirm the submission and clearly outline the next steps.

“The message at the end that tells you that it went through and what to expect is very useful.”

"I like that it sets the expectation of when we will hear back. If we’re trying to set up something next week, it’s short notice but knowing that we’ll hear back in a day or so is helpful."

Impact Dashboard

Findings

1. More detail, more categories.

Participants were quite satisfied with the Impact Dashboard in its current state. However, they noted that a more detailed breakdown and additional data points would be valuable to track.

"We’d like to have the holistic view of like, what's our total impact? It would be great to look back every quarter and be able to see that we did 30 collective volunteer hours. We like to celebrate those successes."

2. Accuracy and recency.

Participants worried that the dashboard does not reflect up-to-date volunteer hours, events, and sign-ups.

“What's the timeline? I’m not sure when things get updated. I'm wondering because we have all these people that signed up. But it only says that two did. So I like the idea of having those data points. I just don't know when it updates.”

3. Sharing and celebrating progress.

Data can be shared internally to encourage more volunteering, and externally to keep key stakeholders in the loop.

"We can say we've done 20 volunteer hours in this quarter, and share that with the staff to show that that's the impact we're having. We also have loyal members and shareholders that we send newsletters out to, to share what our employees are up to aside from work.”

Recommendations

1. Allow users to interact with the dashboard.

Add additional, filterable categories of data and an export button so users can view more detailed summaries and share their successes.


2. Timestamp recent updates.

Help users build trust and gain context by timestamping recent activities and presenting time ranges.

5. Challenges

Small participant pool.

In total, the perspectives collected in the study are based on a small sample of participants from three companies.

Construct validity of tasks.

Task prompts had to be revised after the first participant to better reflect real-world use cases of the interface.

Changing evaluation method.

The evaluation method for the Impact Dashboard had to be changed after we realized the limitations of an affective evaluation.

6. Conclusion

Request an Opportunity

Research Goal

Assess the learnability and utility of the newly released Request an Opportunity form for a culture lead.

Outcome

The Request an Opportunity form helps culture leads achieve their event-organization goals and could be improved with step-by-step guidance and feedback.

Impact Dashboard

Research Goal

Assess the satisfaction levels and utility of the Impact Dashboard for a culture lead.

Outcome

The dashboard helps participants track their progress, and there is an opportunity for Purposely to explore how culture leads can interact with and share data.

7. Personal Reflection

Throughout the study, I gained valuable insight into the UX research process, developing my skills in each stage of our design evaluation.


Study Development

During the initial stage of our study, I helped create study plans by detailing research procedures, selecting appropriate evaluation methods, and writing study questionnaires that best reflected our research goals. This gave me insight into how to build design evaluations that are well considered, prepared, and organized.

User Testing

Throughout the research and testing stage, I led usability tests, analyzed data, and created prototypes based on our findings. Leading a user interview was a difficult aspect of this project due to my lack of experience; however, it challenged me to step out of my comfort zone.

 

Though the interviews were challenging in terms of keeping track of participant responses and staying within the allotted time, I overcame these challenges through preparation: writing notes to guide myself through the interviews and practicing the script to familiarize myself with the study. This experience also strengthened my communication skills in a professional setting.

Data Analysis

In the final stages of the study, I improved my skills in evaluating and compiling data by applying the analysis method of affinity mapping. This method is a valuable tool that facilitated our approach to data analysis, allowing my team and me to reflect upon and present relevant study results in a digestible way.

8. Our Impact 

2025: Post-Research Interface Update

Based on the findings of our user research, Purposely made significant updates to its interface in 2025. The redesign of the Request an Opportunity form and the Impact Dashboard, informed by our team's recommendations, has greatly reduced user confusion.

Request an Opportunity

For the Request an Opportunity form, there were previously frequent invalid requests due to users not fully understanding the feature's purpose. Since the update, the number of invalid requests has dropped to 0%, with users now consistently submitting valid forms. The new layout has been highly praised, with Purposely's CEO calling the changes 'huge' for improving user clarity and ensuring the feature aligns with its intended goals.

Impact Dashboard

As recommended, the Impact Dashboard was also updated with enhanced tracking, providing a more detailed breakdown with additional data points to better visualize users' volunteer impact.
