Authentic vs. Synthetic: A Comparison of Different Methods for Studying Users’ Online Behaviors and Experiences

In recent years, researchers have devoted considerable effort to investigating the relationships between users’ online behaviors and their tasks, and to personalizing information or support for the task at hand. Various methods have been used to collect user behavior data, such as laboratory experiments and field studies, and participants may work on their own tasks or on tasks assigned by the researcher. However, the impact of these methodological choices on user behavior remains unclear, and the reliability of the collected data depends directly on the methods used. This research aims to understand how study setting and task authenticity affect users’ searching and browsing behaviors and experiences, such as their engagement and the barriers they encounter.

Data Collection Methods:

Laboratory experiments, remote experiments, questionnaires, semi-structured interviews, web logging

Procedure:

I recruited 36 university students from 21 different disciplines to participate in a 2×2 repeated-measures study. Each participant completed two sessions: one lab session and one remote session. In each session, they carried out two online information search tasks (i.e., finding information online to answer questions): one task of their own (an authentic task) and one task assigned by me (a simulated task). Their online activities, such as pages viewed and queries issued, were recorded by the Coagmento browser extension. They also reported their experiences in a pre-task and a post-task questionnaire for each task and in a semi-structured individual interview.
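The logged activity is essentially a sequence of timestamped page-visit records per participant and task. Purely as an illustration (the field names below are my assumptions, not Coagmento’s actual log schema), per-task measures such as the number of pages visited and page dwell time could be derived in Python roughly as follows:

```python
from datetime import datetime

# Hypothetical log records; field names are illustrative, not Coagmento's schema.
log = [
    {"participant": "P01", "task": "authentic", "url": "https://example.org/a",
     "timestamp": "2019-03-01T10:00:00"},
    {"participant": "P01", "task": "authentic", "url": "https://example.org/b",
     "timestamp": "2019-03-01T10:02:30"},
    {"participant": "P01", "task": "authentic", "url": "https://example.org/c",
     "timestamp": "2019-03-01T10:03:10"},
]

def summarize_task(records):
    """Compute pages visited and mean dwell time (seconds) for one task,
    using gaps between consecutive page-visit timestamps as dwell times."""
    times = [datetime.fromisoformat(r["timestamp"]) for r in records]
    dwell = [(t2 - t1).total_seconds() for t1, t2 in zip(times, times[1:])]
    return {
        "pages_visited": len(records),
        "mean_dwell_seconds": sum(dwell) / len(dwell) if dwell else 0.0,
    }

print(summarize_task(log))
# {'pages_visited': 3, 'mean_dwell_seconds': 95.0}
```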

Data Analysis:

  • Data preprocessing in Python
  • Quantitative analyses in R: Wilcoxon signed-rank test, multilevel modeling (see the sketch after this list)
  • Qualitative coding in Dedoose
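
The quantitative analyses were run in R; purely as an illustrative analogue (not the actual analysis code, and with fabricated variable names and data), a paired Wilcoxon signed-rank test and a random-intercept multilevel model could be sketched in Python with SciPy and statsmodels:

```python
import numpy as np
import pandas as pd
from scipy.stats import wilcoxon
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical long-format data: one row per participant x setting x task type.
rows = []
for pid in [f"P{i:02d}" for i in range(1, 13)]:
    for setting in ("lab", "remote"):
        for task_type in ("authentic", "simulated"):
            rows.append({
                "participant": pid,
                "setting": setting,
                "task_type": task_type,
                # Fake dwell times (seconds); purely illustrative.
                "dwell_time": rng.normal(90 if setting == "lab" else 75, 10),
            })
df = pd.DataFrame(rows)

# Paired Wilcoxon signed-rank test: mean dwell time in the lab vs. remote,
# paired within participant.
lab = df[df.setting == "lab"].groupby("participant")["dwell_time"].mean()
remote = df[df.setting == "remote"].groupby("participant")["dwell_time"].mean()
stat, p = wilcoxon(lab, remote)
print(f"Wilcoxon signed-rank: W={stat:.2f}, p={p:.3f}")

# Multilevel (mixed-effects) model: fixed effects for setting, task
# authenticity, and their interaction; random intercept per participant.
model = smf.mixedlm("dwell_time ~ setting * task_type", df, groups=df["participant"])
print(model.fit().summary())
```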

Primary Findings:

  1. Some aspects of user behavior, such as page dwelling time and the number of pages visited, were significantly affected by the study setting and task authenticity.
  2. Users’ behaviors at the beginning of a session were less affected by the study setting and task authenticity than their behaviors over the whole session.
  3. Users were more motivated to work on their own (authentic) tasks in the lab than in the field; this difference was not observed for the simulated tasks.
  4. Distraction, multi-tasking, and dividing tasks into parts were prevalent in the field but were rarely observed in the lab.

Publication and Manuscript:

Wang, Y. (2019). Study setting and task configuration for task-based information seeking research. Presented at the Jean Tague-Sutcliffe Doctoral Student Research Poster Competition at the ALISE 2019 Annual Conference, Knoxville, TN. [Award Honorable Mention][HCI][HIB][IIR]

Wang, Y., & Shah, C. (under review). Authentic vs. synthetic: An investigation of the influences of study settings and task configurations on search behaviors. Information Processing & Management. [HCI][HIB][IIR]
