IADT MSc UX Design Assignment — Reflections from a Design Team
Project Approach
The team (Ultan is part of the Howth team, along with Niamh Kearns and Áine O’Neill) followed design thinking, a non-linear, collaborative process of empathising, defining, ideating, prototyping, and testing (Dam & Siang, 2020).
The adopted way of working also leveraged principles from the Agile Manifesto (Agile Alliance, 2001), Lean UX (Gothelf, 2013), and the best practices of IDEO (2020).
- The design team worked remotely using Microsoft Teams to collaborate.
- Combined online and offline tools were used: pen, paper, scissors, wooden sticks, smartphones, cameras, Google Search, Mural, Marvel, Microsoft Teams, SurveyMonkey, and Zoom.
- Design artifacts were stored in shared cloud storage, making them easy to change for different hypotheses.
- The team leveraged friends, family, co-workers, and wider networks; each member brought their own diverse talents, from visual design to fashion creation to product ownership, to the shared assignment.
- Each team member contributed a deliverable; the team then used deferred judgment before reaching a decision (IDEO, 2020).
- Iteration of content focused on the problem and remained user-centric.
- Team members worked flexibly, online, and leveraged class time for designing.
However, the ability to “read” a room or judge the mood of a situation in a large group was something we missed online. Going “digital” also comes at an environmental cost (McGovern, 2020).
A hybrid of online and in-person work might have offered greater productivity and a more nuanced sense of empathy.
Teamwork Learning Lessons
- Agree on tools for collaborating and sharing.
- Offline tools are as important as digital solutions, even during remote working.
- Establish team norms and expectations about ways of working.
- Strive to value the diversity of team members and their knowledge. User experience work is “ideally carried out by multidisciplinary teams” (Sharpe et al., 2019).
- Defer judgment until all alternatives are explained.
- Listen actively to contributions.
Decision Making
Decisions were agreed by group consensus, evaluating alternatives against research and data to justify a direction, which was then validated with users. Discovery and framing artifacts are easy to create and visual in nature, giving them high communication value when explaining options. Decisions about tactical design choices were taken in the context of the value to the user and the business return on investment.
Decision examples:
Tiffany
The Tiffany persona was chosen over others because of:
- Easy access to representative users in real life to validate personas, scenarios, and prototypes.
- The rapid adoption of collaborative platforms in schools due to the pandemic, and their growth potential.
- Segmentation revealed other personas also wanted mobile, simple, and usable solutions.
The Task
The hand-in task was chosen over others because it offered:
A higher return on investment (ROI), through:
- Satisfaction for pupils, teachers, and adults.
- Effectiveness for school management.
- Efficiency for pupils, teachers, parents, and school management.
More opportunities for exploration, through:
- Voice user interface.
- The surfacing of common tasks.
- New information architecture.
- Improving device notifications.
The Prototype
Although prototypes were explored in different formats, the team agreed on a common paper-based version of the same task to:
- Eliminate the distraction of visual styling for testers.
- Iterate versions quickly without losing momentum.
- Enable replication across more users.
Testing Criteria
Efficiency, Effectiveness, and Satisfaction were employed as the test criteria because they:
- Provide for simple formative assessment.
- Enable rapid iterative testing and data collection.
- Are industry-accepted criteria for determining usability with a small sample of users.
‘Final’ Observations
The following observations are made based on the test data and user comments:
- All three paper prototypes recorded a higher satisfaction rating than the Microsoft Teams task equivalent.
- Time on task for all three prototype versions was faster than for Microsoft Teams.
- The initial paper prototype recorded 3 errors, related to the repeated use of an Upload button label in the UI.
- The resulting first iteration recorded 0 errors after the second Upload button was renamed to Submit.
- The second iteration reduced the time on task and recorded 0 errors by eliminating two steps to upload and by using a combined processing/confirmation dialog.
- However, the first iteration rated higher for satisfaction than the second iteration and the other versions tested, and users felt strongly about this experience.
User experience expectations must be considered holistically, as a balance of factors rather than simply a lower time on task or a smaller number of clicks. For example, users of the second iteration expressed concern that the faster completion steps offered less control over the hand-in process and provided less error prevention, both important heuristics to consider in further iterations.
The team continues to iterate and test prototypes with user-centric mindsets and methodologies, gathering data and feedback through formative and summative methods to set direction, and to keep trying…
Don’t try to build a perfect app right on the first attempt. It’s almost impossible. Instead, treat your app as a continually evolving project, and use data from testing sessions and user feedback to constantly improve the experience. — (Babich, 2018)