IADT MSc UX Design Assignment — Testing Assessment
Note: This section presents the results of the paper prototype testing. It is included with the assignment for completeness and for reference.
Comments from Participants
“I was looking for ‘Submit’, but it says ‘Upload’?” (1st iteration)
“Does ‘Done’ mean it’s finished? I didn’t see it sending” (1st iteration)
“It’s pretty simple” (1st and 2nd iteration)
“I can see what’s due and when” (2nd iteration)
“The red number on the calendar and the checked box on English tells me it’s done” (2nd iteration)
“The box is ticked so it’s done” (1st iteration)
“Easy to manoeuvre” (1st iteration)
“Easier than Teams” (1st and 2nd iteration)
“Better than Teams ‘cos it has a calendar like that” (2nd iteration)
“I can see English is done, but Art and Biology aren’t done” (2nd iteration)
“I prefer being asked if I am sure or want to wait. You’re more in control then. You have freedom to go back” (2nd iteration)
Task Completion
This test established whether the hand-in task could be accomplished.
Time on Task
The time taken to complete the hand-in task was measured.
Error Rates
The number of errors encountered during the task was counted.
Satisfaction Scale
Tester satisfaction with each user experience version was rated.
Assessment
The following conclusions are offered:
- All three paper prototype versions recorded a higher satisfaction rating than the Microsoft Teams equivalent.
- Time on task for all three prototype versions was shorter than for Microsoft Teams.
- The initial paper prototype recorded 3 completion errors, all related to the successive use of an ‘Upload’ button label in the UI along the task completion journey.
- The first paper iteration recorded 0 errors after the ‘Upload’ button was renamed ‘Submit’.
- The second paper iteration improved time on task and also recorded 0 errors, achieved by reducing the number of steps to upload and combining the processing and confirmation messages.
- However, the first iteration scored higher on satisfaction than both the second iteration and the other versions tested.
The results of the testing process are documented as is. No recommendation is offered on which user experience is ‘best’; only relative positive and negative findings are presented for consideration, further exploration, and iteration.
Deciding which changes to make is beyond the scope of this paper; “developing any product is a series of trade-offs in which you balance schedule, budget, people’s availability, and the changes that are needed.” (Usability.gov, 2020).