A7: User testing and redesign – DUE Tuesday, Nov. 27 @ 12:30am (note the unusual date and time!)

The goal of this assignment is to test your app prototype with two people to further streamline your app and to inform the comparative evaluation that you will do for A8.

Step 1: In-person user testing

Observe at least two different people testing your app prototype in-person. Try your best to find representative testers whom you would expect to use your app in real life; if you cannot do so, then get as close as possible to approximating your target population.

One person will facilitate the test and interact with the user, and the rest of the team will be in charge of taking notes/photos/video/etc. The facilitator should use the protocol that you developed in A6. Don't forget to have your user sign a printed consent form before beginning the test. Unlike in A3's heuristic evaluations, this time around your user will not be writing down the problems they find for you. It's your job to learn what the people testing your prototype are thinking, and what mental models they are forming. Your main goal here is to find ways to improve your interface.

Immediately after each test, do a quick debrief with your team and write down any reactions or thoughts that came up during the testing session.

Take a photo or draw a sketch of each tester using your prototype. As with the A1 needfinding assignment, these photos with captions should show breakdowns and design opportunities. Contextualize them by capturing the action, e.g. by using over-the-shoulder photos of the user working with your app, and by showing the setting. Look for other breakdowns and pain points in your interface and try to understand what the problems are and how you might fix them. When possible, modify your app prototype before testing with the next participant so that each session yields fresh data.

Watch the In-Person Experiments lecture video for some practical tips about running these sorts of experiments.

(You'll have time in class on Monday, Nov 26 for user testing, but we highly suggest that you get it done earlier since it's due the next day, and there are many other parts of this assignment.)

Step 2: Compile your findings

After testing, take some time with your team to reflect on your findings. Discuss as a team and identify some general patterns in people's behavior. When you find interesting points, discuss them in depth: ask each other questions, re-enact parts of the tests, and analyze the decisions people made and the other paths they could have taken.

Write up a detailed and understandable list of changes that you will implement as a result of your testing and discussion, with justifications. Then fix the bugs that are either small and easy to fix, or too severe to ignore. Make sure that you do this before moving on to the next step of this assignment.

Step 3: Create a Meaningful Redesign

Choose ONE non-trivial component of your prototype to redesign in order to either resolve a breakdown or provide a potentially better solution than what was created before, as informed by user testing. To do this, make a duplicate of the specific webpage where the change will take place. For example, if you are redesigning a component of the homepage, keep the current homepage ("homepage.htm") and create a second page with a route "homepage2.htm". You will be submitting both the original URL and the redesign URL.
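If your prototype is a plain static site, the duplication step can be as simple as the sketch below (the file contents here are stand-ins; use your real homepage file):

```shell
# Stand-in original page for illustration; in practice this file already exists.
printf '<html><body>Original homepage</body></html>\n' > homepage.htm

# Duplicate the original page as the redesign route...
cp homepage.htm homepage2.htm

# ...then edit homepage2.htm so that ONLY the redesigned component differs.
```

If your prototype is served by a framework rather than static files, the idea is the same: register a second route (e.g. /homepage2) that renders the redesigned variant while leaving the original route untouched.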

The redesign will be used for comparative evaluation in the next assignment (A8). The redesigned component must also differ from the original in a substantial, noticeable way; simply changing the background color or a few visual details is not enough. We highly suggest that you read A8 in detail to understand what kind of redesign is appropriate.

Step 4: Description of Comparative Evaluation

Submit a description of the comparative evaluation that you plan to run for next week's assignment (A8); you do NOT need to run the actual evaluation yet. The description should explain what the differences between your original and redesigned versions of the component are (from Step 3).

Next, write down what you plan to measure in your comparative evaluation for A8. You can choose either quantitative or qualitative measures. For instance, you could count how frequently users perform certain actions in your app (quantitative), or record the types of comments they make while thinking aloud (qualitative) when using both versions. You don't need to be too formal here, and you can change your measures once you start A8 and see what works in practice; at this point, we just want you to propose a realistic-sounding plan.

Finally, include your best guess of all possible outcomes and interpretations for next week's comparative evaluation, and what that may mean for the design of your final project. In other words, put down what you think will happen next week.

Step 5: Update Development Plan

Just as we've been doing in the previous weeks, update your development plan. Add new tasks for this week and the following weeks while marking when existing tasks have been completed. Add stretch goals that you find feasible and adjust other tasks that may be out of reach.

Student Examples

Here are three student examples. Note that the A7 prompt from prior years differed: those groups could create paper-prototype redesigns, whereas we want you to implement your redesign in code. You will also run an in-person comparative evaluation instead of an online A/B test.

  • Example 1 - This is an example of an A+ level assignment. This group obviously put a lot of thought into their in-person test, and was able to motivate their redesign from the conclusions they drew from the in-person test.
  • Example 2 - This is an example of a B-level assignment. This group lost points for not including their consent form for the in-person test. We also wished the feedback was more substantive beyond obvious usability bugs (one of which had been mentioned by the TA in a previous assignment). For the online test description, we were not convinced that measuring click rates was the right metric to measure success.
  • Example 3 - This is an example of an A-level assignment. We liked the clean, well-captioned photos of each participant testing their app. They also tested more than the required two users.

Assignment Submission

Submit a single well-formatted PDF file for your entire team with the following items concatenated within it:

  • Names and PIDs of all of your team members, along with your team name. (If you forget someone's name, they will not get credit for this assignment.)
  • Your testing protocol and signed consent forms, as well as any materials you gave to the user as part of your tests (either as text, PDF, or a scanned image). (Testing Protocol & Documentation)
  • Notes taken from user studies with at least TWO users (User Study Notes)
  • Captioned photos for each participant testing your prototype. (Photo Documentation)
  • A detailed list of changes you will implement in your next iteration, with justifications. (Planned Changes Based on Test)
  • The URL of the original prototype you tested (with possible bug fixes as a result of testing) and the URL of the implemented alternative redesign of one non-trivial interface element. Very important: the contents of both URLs must not change during the upcoming week while your TA is grading this assignment; changing them is a violation of the academic honesty policy. Test your new changes at a different URL. (Meaningful Design)
  • Description of your planned comparative evaluation for A8, as well as your guess of all possible outcomes, interpreted with implications for the design of your prototype. (Description of Comparative Evaluation)
  • A copy of last week's and this week's development plan embedded in a single PDF in a readable and easy-to-compare format. (Update Development Plan)

Submit your single formatted PDF in Gradescope to the bin for your studio section. Only one team member needs to submit on behalf of the entire team.

Evaluation criteria & Grading rubric

The rubric below contains criteria that will be graded independently and in a binary fashion; each is worth one point unless noted otherwise.

  1. Signed consent forms & photos/sketches of each user testing participant are submitted.
  2. Notes taken from performing the user testing protocol on at least TWO users are submitted.
  3. Notes from the first user include breakdowns, errors, or design improvements inspired by that user's testing session.
  4. Notes from the second user include breakdowns, errors, or design improvements inspired by that user's testing session.
  5. List of changes shows what your team plans to implement, based on the user testing observations and subsequent discussions.
  6. Redesign stems from user testing, and properly reflects an underlying design breakdown or design opportunity that arose from such testing.
  7. The original and alternative redesign webpage URLs are submitted and publicly viewable by the staff.
  8. The redesigned webpage is fully interactive, functional, and ready for the comparative evaluation next week (2 points).
  9. The redesign only changed a single feature of one page of the app, thus making it possible to test with comparative evaluation.
  10. The planned comparative evaluation contains realistic measures, which can be either quantitative or qualitative.
  11. All possible outcomes of the planned comparative evaluation are identified and interpreted with implications for the design of your final project.
  12. Development plan is included as a PDF, is easy for your TA to read and understand, and has been updated to show progress and additional tasks in the new version compared to the previous week.

Due In Studio: Teammate Assessment

During studio, click here to assess how each of your teammates contributed to this assignment.