Shadowing and the Art of Task & Usability Analysis - Elicitation Technique #2

       Observe - Humans are so Interesting!

Shadowing - it sounds mysterious and a bit film noir, but shadowing is, basically, observing users in action as they perform their tasks. When used in conjunction with:

  • Task Analysis - breaking a task down into smaller actions and/or steps for observation/analysis
  • Human Factors Analysis - the study of human behavior as it relates to products, services, environments, etc.
  • Usability Analysis - evaluating usability of an application, product, or service via user interaction

Shadowing becomes a powerful elicitation technique that can lead to user-centric designs that are clean, efficient, intuitive, and scalable.

Observation gives you firsthand knowledge of how a user completes a process/task or interacts with a product/application, given all the environmental factors around them. Basically, it shows you what they do, how they've adapted, and what they don't do! This simple process of watching, interviewing, recording, and analyzing the outcomes can:

  • Identify issues, problems, and opportunities for improvement
  • Measure speed, task completion, responsiveness, performance, and user satisfaction
  • Assess environmental factors, hardware, and additional software/interface needs
  • Build a cohesive team by directly involving users in the design process
  • Lead to a deeper understanding of the users, tasks/processes, and product/application

Although shadowing can also be used to improve and/or update an existing process, today's blog will focus on its use to improve or replace an existing software application.

Get Ready

A little prep work prior to the observation sessions will ensure you have everything you need for the actual sessions, final analysis, and report/recommendations. Start with this quick list.

What to do:

  • Read the user-training manual, if available, so that you have an idea of how the system works.
  • Identify and ask for permission to observe different levels of users conducting the same task (e.g. the experienced user, the inexperienced user, the tech-savvy, the not-so-tech-savvy, etc.).
  • Note the identified users' titles and job responsibilities.
  • Send an email to the identified users letting them know that you will be conducting the sessions and what to expect. Emphasize that the focus is on the system.
  • Determine what type of hardware the user has (e.g. how old is it, are they on different versions of software, does it need to be portable, touch-screen, type of internet connection, etc.). This information can also be obtained by conducting a survey (see the "Survey Says" article), and should be included in your final analysis.
  • Put the user at ease; be friendly and empathetic. You are there to evaluate the system, not them.
  • Be aware that observing someone may change the way they behave (e.g. people freak out a little when they are being watched and may not behave naturally).
  • Look for patterns across the pool of users you observe.
  • Take notes...lots of notes...and diagram things like workflows, tasks, sub-tasks, etc.
  • Note any performance issues that occur and where in the workflow they occur (also note the date and time of the occurrence); a simple note structure is sketched after this list.
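
If you want those notes to be comparable across users and sessions, a small, structured record helps. Here is a minimal sketch in Python; the class name, fields, and example values are illustrative assumptions, not a prescribed BA template.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ObservationNote:
    """One observed issue, tied to a user role, task, and point in the workflow."""
    user_role: str          # e.g. "Claims Processor" (illustrative)
    task: str               # the task being observed
    workflow_step: str      # where in the workflow the issue occurred
    issue: str              # what was observed
    occurred_at: datetime = field(default_factory=datetime.now)  # date/time of occurrence

# Example usage during a session (values are made up):
note = ObservationNote(
    user_role="Claims Processor",
    task="Submit new claim",
    workflow_step="Attach supporting documents",
    issue="Upload took ~45 seconds; user switched to email as a workaround",
)
print(note)
```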

What not to do:

  • Don't suggest possible solutions. If a user suggests a possible solution, note their suggestion, but redirect them to focus on the details of the problem.
  • Try not to interrupt the user too often; be patient, make a note, and ask later.
  • Try to remain unbiased.
  • Don't interpret behavior without confirming (e.g. if a user rolls their eyes, ask why).
  • Don't get in the way!

The Observation Process - Show and Tell, Then Repeat

Elicitation techniques do not provide everything a business analyst needs in a single sitting; they need to be repeated multiple times (i.e. iterated) in order to capture both the high-level concepts and the detailed information. Since we are using shadowing/observation combined with several other techniques, we recommend the following process to get the most out of your sessions:

  • Who - Begin with the most experienced/knowledgeable user who can provide you with the most accurate information. If there is more than one user group/role involved, start with those involved at the beginning of the process and work your way to those involved at the end.
  • What - Decide what it is that you want to learn from your observation (e.g. the overall user experience, usability of the application, task flow, etc.).
  • When - Determine if a task/process is affected by peak, normal, or slow hours during the day or if the task/process only happens during certain times of the day, week, etc. and then observe during those times. Limit each observation to 1-hour sessions.
  • Where - Environment plays a key role in observation and, ultimately, designs (e.g. banks operate differently than hospitals - even within hospitals patient information, for example, will be acquired, accessed and used differently by different roles/positions/departments). The design of the user interface and the delivery products used should reflect the user environment. Note the environment/working conditions and take photos if necessary.
  • How - For the first session, have the user walk you through their tasks and interactions (focus on the positive path first). Don't interrupt them or their train of thought unless they go completely off topic. Outline the process (graphic boxes of task flow, with supporting text such as Step 1, Step 2, etc.). In the second session (and, if you're lucky, subsequent sessions) have the user repeat the tasks, including any alternate paths to completing the main tasks, and this time ask questions to clarify the process. You can create a checklist (from the next section - What to Look For...; a minimal sketch also follows this list) to note usability issues, human behavior variations, etc.
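
If it helps to carry that checklist in a structured form, here is a minimal sketch in Python. The area names mirror the seven areas in the next section; everything else (function names, severity labels) is an illustrative assumption.

```python
# Illustrative observation checklist keyed by the seven areas in the next section.
AREAS = [
    "Security", "Performance", "Process", "Navigation",
    "Interface", "Reporting", "User Behavior",
]

def new_checklist():
    """One empty checklist per observed user/session."""
    return {area: [] for area in AREAS}

def record(checklist, area, observation, severity="info"):
    """Append an observation (with a rough severity) under its area."""
    checklist[area].append({"observation": observation, "severity": severity})

# Example usage during a session (entries are made up):
session = new_checklist()
record(session, "Navigation", "Took 9 clicks to reach the approval screen", "high")
record(session, "Interface", "Required fields are not marked", "medium")
for area, items in session.items():
    if items:
        print(area, items)
```

Comparing filled-in checklists across users also makes the patterns mentioned in the prep list much easier to spot.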

What to Look For - 7 Areas of Observation

Here is a starter list of seven areas of observation that incorporates some high-level principles of process, information, and technology design around the user experience (UX). A web designer/engineer would have a far more comprehensive approach, but these observations will provide you with the level of detail needed for your analysis and will help support the recommendations in your final report.

1.  Security

  • Logins/Emulations - Are views specific to the logged-in user, and can a logged-in user emulate another user in order to handle that user's workload?
  • Roles - Are views limited to roles within the system?
  • Access - Is access to data restricted based on role or business rule (and is the restriction limited to read only, update, etc.)? A minimal sketch of such a check follows this list.
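
To make the role and access questions concrete, here is a minimal sketch in Python of access restricted by role. The roles, resources, and permissions are hypothetical; a real system would use its own security model.

```python
# Illustrative only: a toy role-to-permission map showing what "access
# restricted by role" can look like. Roles and rules are assumptions.
PERMISSIONS = {
    "clerk":      {"customer_record": {"read"}},
    "supervisor": {"customer_record": {"read", "update"}},
}

def can(role: str, action: str, resource: str) -> bool:
    """Return True if the role may perform the action on the resource."""
    return action in PERMISSIONS.get(role, {}).get(resource, set())

print(can("clerk", "update", "customer_record"))       # False -> read only
print(can("supervisor", "update", "customer_record"))  # True
```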

2.  Performance

  • Speed/response issues (e.g. upload, download, page/list rendering, search, saving, updating, etc.); a simple timing sketch follows this list.
  • Restrictions on file size or data processing.
  • System downtime and recovery issues.
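
Rough timings (even from a stopwatch) are often enough here, but if the team can instrument or script against the application, a simple timing wrapper shows the idea. This Python sketch uses a hypothetical render_page stand-in for whatever operation you are measuring.

```python
import time

def timed(operation, *args, **kwargs):
    """Run an operation and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = operation(*args, **kwargs)
    return result, time.perf_counter() - start

# Hypothetical stand-in for the operation being observed (page render, search, save...).
def render_page():
    time.sleep(0.2)  # simulate work
    return "page rendered"

result, elapsed = timed(render_page)
print(f"{result} in {elapsed:.2f}s")
```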

3.  Process

  • Workflow/Tasks

    • Is the application designed in such a way that it guides users through their workflow/interactions, or does the user need to bounce around to complete a task?
    • Is the application designed to reduce the number of steps in the process?
    • How long does it take to complete each task?
    • Note any difficulty in completing each task.
    • Can a user easily correct, back out, or undo mistakes (e.g. clicking on the wrong button or entering incorrect data)?
  • Business Rules

    • Are business rules embedded in the workflow (e.g. completion of the current task is dependent on completion of the previous task)? See the sketch after this list.
    • Are business rules highlighted for the user (e.g. text - This transaction must be completed within one business day)?
  • Status

    • Is workflow or task status communicated to the user (e.g. Pending, Open, Closed or Items in Cart, Items Purchased, etc.) so that the user knows where they are in the process?
    • During processing, does the system display an hourglass or spinning wheel to indicate that something is happening?
    • If the user is completing multiple forms, does the system indicate where the user is in the process, and the time or steps remaining to complete the process?
  • Inputs/Outputs

    • Note any inputs coming from outside the system (e.g. file feeds, scanned items, etc.).
    • Note any outputs generated from the system (e.g. form letters, fulfillment, etc.).
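
To illustrate a business rule embedded in the workflow, along with the kind of status a user should be able to see, here is a minimal Python sketch in which a step cannot be worked on until the previous step is complete. The step names and statuses are assumptions for the example.

```python
# Illustrative sketch: a step cannot start until the previous step is complete.
STEPS = ["Enter details", "Attach documents", "Review", "Submit"]

def next_allowed_step(statuses):
    """Given per-step statuses, return the first step that may be worked on."""
    for step in STEPS:
        if statuses.get(step) != "Complete":
            return step
    return None  # everything is done

# Example: the first step is done, so the second becomes available.
statuses = {"Enter details": "Complete", "Attach documents": "Open"}
current = next_allowed_step(statuses)
print(current)                                           # "Attach documents"
print(f"Step {STEPS.index(current) + 1} of {len(STEPS)}")  # status shown to the user
```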

4.  Navigation

  • Does the navigation seem intuitive and easy to use?
  • Is it obvious to the user which actions are available?
  • Is the current location of the user indicated (e.g. menu item is highlighted, etc.)?
  • Count the number of clicks it takes to navigate to or complete a task. Can the clickstream be reduced? (A simple click tally is sketched after this list.)
  • Can users easily find what they are looking for?
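
Click counting can be as simple as a tally sheet; if you prefer something you can total automatically, here is a minimal Python sketch. The task names and counts are made up for illustration.

```python
from collections import Counter

# Illustrative tally of clicks per task during one observed session.
clicks = Counter()

def log_click(task: str):
    """Call once per click while the user works on the named task."""
    clicks[task] += 1

# Example usage (simulated clicks):
for _ in range(9):
    log_click("Approve request")
for _ in range(3):
    log_click("Search customer")

for task, count in clicks.most_common():
    print(f"{task}: {count} clicks")
```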

5.  Interface

  • Page Layout

    • Are the elements on the screen, form, or template tailored to user groups or roles, or is it a generic screen used by all, where one group must ignore the fields used by another group of users?
    • Is the look and feel consistent throughout the application?
    • Do the information/tasks appear in a logical order based on their priority/importance to the user?
    • Does the layout guide the user to the first or next step/action?
    • Is the page cluttered with too much information? Not enough information? Not the right information?
    • Does information previously collected from other pages/forms/templates display with an option to edit/update it or must it be re-entered?
    • Are fields clearly labeled, and do they provide information or instructions where necessary (e.g. Select only one option, *Required field, expected input values, etc.)?
    • Are there unnecessary or legacy fields that can be eliminated?
  • Information (Forms/Templates)

    • Does the system use Forms and/or Templates?
    • Do the forms pre-populate with data entered elsewhere in the application?
    • Are the same questions used in multiple forms? Can these questions be consolidated into a standard form with separate customizable sections?
    • Are there any redundant fields or questions?
    • Is the content up-to-date, and free of spelling and grammatical errors?
    • Is the content/language written for the intended audience?
  • Functionality/Features

    • Are plugins and/or widgets used wherever possible (e.g. Search, Calendar, Clock, etc.)?
    • Are all controls consistently labeled, and do they function as intended (radio buttons, text fields, checkboxes, save, delete, cancel, etc.)?
    • Are there Tool Tips and/or Help available on-demand?
    • Can users report system issues or suggestions via the system?
  • Data

    • What data is required to perform the task?
    • Are there calculations? What data is used for the calculations? What data is a result of a calculation, algorithm, etc.?
    • What data is prepopulated from other areas?
  • Messaging, Feedback

    • Does the system prompt the user via messages to confirm actions such as Delete, Update, exiting an incomplete process, etc.?
    • Do error messages describe the error and provide concise instructions to correct it? (A brief messaging sketch follows this list.)
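
To show what confirmation prompts and corrective error messages look like in practice, here is a minimal, console-only Python sketch. The function names and message wording are illustrative assumptions, not taken from any particular application.

```python
from typing import Optional

def confirm(prompt: str, answer: str) -> bool:
    """Return True only on an explicit 'yes' (the answer is passed in for the demo)."""
    print(f"{prompt} (yes/no): {answer}")
    return answer.strip().lower() == "yes"

def validate_quantity(value: str) -> Optional[str]:
    """Return an error message that names the problem and the fix, or None if valid."""
    if not value.isdigit():
        return "Quantity must be a whole number. Please enter digits only, e.g. 3."
    return None

# Confirm a destructive action before doing it.
if confirm("Delete this record? This cannot be undone.", answer="no"):
    print("Record deleted.")
else:
    print("Delete cancelled.")

# An error message that states the problem and how to correct it.
print(validate_quantity("three"))
```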

6.  Reporting

  • How is reporting handled? Are there standard reports? On-Demand reports? Dashboard Reporting?
  • Are there not enough reports? Too many reports? Reports with conflicting data?
  • Is the data on the reports in real-time, near-time, point-in-time, daily, weekly, etc.?

7.  User Behavior

  • Process Variation

    • Note variations/modifications in approach, workflow, navigation, etc. between users.
    • Pay attention to what the user is saying in comparison to what they are doing.
    • Are users deviating from expected paths/actions due to issues with the system, or as a shortcut, etc.?
  • Workarounds

    • Note any workarounds the user has created because the system does not support the action.
    • Note users referencing information in binders, on sticky notes or tacked to their walls.
    • Are users copying or re-entering system information into spreadsheets or Word docs because they don't trust the system, or because they need metrics or reporting that is not supported by the system?

Once your observations are done, you may need to follow up with individual or group interviews for further clarification before you begin your analysis.

For more information on evaluating existing software applications, stay tuned for my cohort Kathleen's next blog.

You know my method. It is founded upon the observation of trifles.
— Arthur Conan Doyle, The Boscombe Valley Mystery