All testing was conducted over Zoom due to COVID-19. To give participants the same experience across platforms, we used the developer tools in the Chrome browser to simulate the mobile view of the ConcernCenter college demo.
Participants were given an introduction to our study and an informed consent form before starting. This was followed by a background questionnaire and a practice session so participants could get used to the controls. Three tasks were created for participants to complete:
We conducted a within-subjects test because we did not have a large number of participants. To reduce ordering bias, the tasks were presented to each participant in a different order.
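Counterbalancing task order like this can be done by cycling participants through the possible task permutations. The report does not specify the exact scheme used, so the following is a hypothetical sketch in Python; the task names are placeholders standing in for the three study tasks.

```python
from itertools import permutations

# Hypothetical task labels; the actual study tasks may have differed.
TASKS = ["identify a resource", "search for a resource", "submit a referral"]

# All 3! = 6 possible orders, generated once up front.
ORDERS = list(permutations(TASKS))

def order_for(participant_index):
    """Return the task order for a participant (0-based), cycling
    through the permutations so consecutive participants differ."""
    return list(ORDERS[participant_index % len(ORDERS)])
```

With three tasks there are only six orders, so full counterbalancing is feasible even with a small participant pool.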
Below you will find some highlight videos of participants going through the testing process. For more videos, check out the PowerPoint!
After each task, participants were asked to rate its difficulty from 1 (easy) to 7 (hard). The average rating across all tasks was 3, a medium difficulty. The easiest task was identifying a resource.
All participants were able to search for and identify a resource. However, none of our participants were able to find the referral button. These ratings reflect how our participants felt; with more participants, the scores and task completion rates could change.
We administered the System Usability Scale (SUS) questionnaire to evaluate the usability of the platform. ConcernCenter received a high score of 85, which falls in the 90th percentile of system usability and corresponds to a grade of A.
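For reference, a SUS score is derived from ten 1–5 Likert responses: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0–100 scale. A minimal sketch of that calculation (the example responses are illustrative, not our participants' data):

```python
def sus_score(responses):
    """Compute a SUS score (0-100) from ten Likert responses (1-5).

    Odd-numbered items (index 0, 2, ...) contribute response - 1;
    even-numbered items contribute 5 - response; the sum is x2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Illustrative responses only (not actual study data).
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # 90.0
```

With multiple participants, each person's questionnaire is scored this way and the individual scores are averaged.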
Allow users to type resource names and show suggestions when there are misspellings, making it easier to surface potential resources.
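One lightweight way to implement this recommendation is fuzzy string matching, sketched here with Python's standard-library difflib. The resource names below are hypothetical; ConcernCenter's actual catalog would differ.

```python
from difflib import get_close_matches

# Hypothetical resource names for illustration only.
RESOURCES = ["Counseling Center", "Career Services", "Financial Aid",
             "Tutoring", "Health Services"]

def suggest(query, limit=3):
    """Return resources whose names roughly match a possibly
    misspelled query, comparing case-insensitively."""
    lowered = {name.lower(): name for name in RESOURCES}
    matches = get_close_matches(query.lower(), lowered, n=limit, cutoff=0.6)
    return [lowered[m] for m in matches]

print(suggest("counceling"))  # → ['Counseling Center']
```

A production search would likely rank by relevance and tolerate partial words, but even this cutoff-based matching catches common typos.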
Providing more information about each resource would reduce the time users spend on external websites looking for relevant details.
Shortening the welcome screen would allow more categories to be visible. The submit referral button is an important feature, yet all participants missed it.