ConcernCenter

View Presentation
UX Research: Heuristic Evaluation, Usability Testing

Project Overview

ConcernCenter is an online platform that provides curated resources to organizations ranging from corporations and universities to veteran associations. Our team reached out to ConcernCenter to conduct usability testing on the front end as part of a class project. We conducted a heuristic evaluation, surveys, and usability testing. I was the main point of contact with ConcernCenter's project manager and led one of the usability tests.

Research

We first met with the client to discuss potential areas for investigation. The client mainly wanted to focus on the user interface and learn what is and is not working well. Below are the specific research questions the client wanted us to focus on.

Research Questions

  • What are users’ expectations for finding resources for financial, mental-health, or academic concerns?
  • Would users turn to the platform with real concerns?
  • How does the scrolling pattern of the result page impact the user’s experience?
  • What roadblocks do users encounter when they try to find a resource? When they try to access said resource?
  • How do users expect to submit a concern referral? Do they understand what a concern referral means?

Heuristic Evaluation

Each team member independently conducted a heuristic evaluation of the ConcernCenter University of Rochester website. This not only familiarized us with the platform but also helped us identify areas of focus for usability testing. We reconvened after the evaluation and discussed our findings as a team. We used 15 heuristics provided by our professor, focusing primarily on accessibility, responsiveness, goal-seeking, and search results.

Usability Testing

After the heuristic evaluation, we shared our findings with the project manager. Alongside the original research questions, we raised additional areas of concern that we wanted to focus on during usability testing. This type of testing would show us where and how potential users might have difficulty using the platform. We distributed prescreening surveys to recruit participants for the usability tests.

Test Process

Screenshot of the concern center platform in developer mode.
Simulated mobile version of ConcernCenter on desktop browser.

Set Up

All testing was conducted over Zoom due to COVID-19. So that participants would have the same experience across different platforms, we used the developer tools in the Chrome browser to simulate the mobile version of the college demo of ConcernCenter.

Tasks

Participants were given an introduction to our study and an informed consent form before starting. This was followed by a background questionnaire and a practice session so participants could get used to the controls. We created three tasks for participants to complete:

  1. Submit a referral
  2. Search for a resource
  3. Identify a resource

We used a within-subjects design because we did not have a large number of participants. To reduce order bias, each participant received the tasks in a different order.
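The task rotation described above can be sketched as a simple permutation scheme. This is a hypothetical illustration of counterbalancing, not the exact assignment procedure we used:

```python
from itertools import permutations

# The three study tasks from our test plan.
TASKS = ["Submit a referral", "Search for a resource", "Identify a resource"]

def task_order(participant_index: int) -> list[str]:
    """Return a task order for the given participant (0-based).

    Cycles through all 3! = 6 possible orderings so that consecutive
    participants receive the tasks in different orders.
    """
    orders = list(permutations(TASKS))
    return list(orders[participant_index % len(orders)])

# Example: the first few participants each get a distinct ordering.
for i in range(3):
    print(i, task_order(i))
```

With only a handful of participants, cycling through all six orderings spreads any order effects roughly evenly across tasks.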

Test Videos

Below are some highlight videos of participants going through the testing process. For more videos, check out the PowerPoint!

This highlight video showcases the difficulties participants have with misspellings in the search bar.
This video highlights how participants leave ConcernCenter for an external site of the respective resource.
This video shows the difficulty participants have with finding the "Submit Referral" button.

Task Difficulty

After each task, participants were asked to rate its difficulty from 1 (easy) to 7 (hard). The average rating across all tasks was 3, a medium difficulty. The easiest task was identifying a resource.

All participants were able to search for and identify a resource. However, none of our participants were able to find the referral button. These ratings reflect how our participants felt; with more participants, the scores and completion rates could change.

Box plot showing the difficulty rating for search, identify, and referral tasks
Boxplot of the results from each task

System Usability Score

85.83

We administered the System Usability Scale (SUS) to evaluate the usability of the platform. ConcernCenter received a high score of 85.83, which falls in the 90th percentile of system usability and would earn a grade of A.

Conclusion

01

Improve search function

Offer suggestions when users type a misspelled resource name. This makes it easier to surface potentially relevant resources.
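One lightweight way to implement this recommendation is fuzzy string matching. A sketch using Python's standard-library difflib, with a hypothetical resource list (the real catalog would come from ConcernCenter's backend):

```python
from difflib import get_close_matches

# Hypothetical resource names for illustration only.
RESOURCES = [
    "Financial Aid Office",
    "Counseling Center",
    "Academic Advising",
    "Career Services",
]

def suggest(query: str, limit: int = 3) -> list[str]:
    """Return resources whose names closely match a possibly
    misspelled search query (case-insensitive)."""
    lowered = {name.lower(): name for name in RESOURCES}
    matches = get_close_matches(query.lower(), lowered.keys(),
                                n=limit, cutoff=0.5)
    return [lowered[m] for m in matches]

# A misspelled query still surfaces the intended resource.
print(suggest("finansial aid ofice"))
```

The `cutoff` threshold trades recall for precision; a production search would likely combine this with prefix matching and synonyms.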

02

Provide salient info on cards

Displaying more relevant information on each resource card would reduce the time users spend searching external websites for details about a resource.

03

Make important functions more visible

The Submit Referral button is an important feature, but all participants missed it. Shortening the welcome screen would also allow more categories to be visible at once.

Reflection

We met with the project manager and presented our findings after the semester had ended. We received positive feedback on our work and were told that our research would be taken into consideration for their redesign. If I were to improve on this project, I would conduct more usability tests, since we had a small sample size; a larger sample could change the difficulty ratings and SUS scores and would provide more insight into how ConcernCenter could be improved. It would also be helpful to test how the desktop version performs. Although we were a small team of three, I thoroughly enjoyed working on this project!