Interviewing Users

Jakob Nielsen’s advice on interviews.

  • What users say and what they do are different
  • User interviews are appropriate only in cases where they generate valid data

What interviews can’t provide

  • Where user interviews fail:

    • when a user is asked to recall the past
    • when a user is asked to speculate about the future
  • Our memories are fallible - we construct stories to rationalize what we remember, or think we remember, so that it sounds more plausible or logical

  • The present is the only valid data a user can offer; everything else is recollection or speculation

  • Users are pragmatic and concrete - users (non-designers) can’t naturally visualize something that doesn’t yet exist, and, similarly, designers don’t see the world from a user’s perspective. This explains why specification documents and waterfall product development fail: they speculate that the product will succeed.

  • In contrast, an Agile team focused on learning will validate design decisions at each iteration.

  • Decisions on colours, HTML form element types, number of items, and tone of voice are not things to ask users about. Instead, determine these decisions by observing users using the product.

  • Avoid asking users:

    • Would you use (unbuilt feature)? - again, this is speculation
    • How useful is (existing feature)? - these questions can lead to confused responses and unreliable data. Caveat: if you do ask “how useful is (existing feature)”, also ask the same question about a non-existent feature
  • To gain this feedback more accurately:

    • pay attention to users’ comments while they use these features
    • ask questions immediately after use

What interviews can provide

  • Overall feelings about the site, gathered after use
  • General attitudes, or “how users think about a problem” - use this feedback to design solutions
  • Use the critical incident method: ask users to recall stand-out examples:
    • when they faced particular difficulty
    • when there was little friction
  • Avoid idealised examples:
    • don’t ask users for their “usual” workflow - this question invites them to omit details and drift away from what they actually do

The Query Effect

  • People make up an opinion when asked for one
  • Leading questions act as a catalyst for this
  • Be cautious not to use these opinions to make design or business decisions
  • To gain this feedback more accurately:
    • resist asking about particular attributes that might result in forced comments
    • take note of unprompted comments during usability testing

Combining methods

  • User testing will always give you the most valuable data
  • Triangulate findings from multiple methods to gain a better understanding