105. Re-imagining Embodied Multimodal Meaning Making through Computational Ethnography

Department: Computer Science & Engineering
Faculty Advisor(s): Nadir Weibel

Primary Student
Name: Steven Robert Rick
Email: srick@ucsd.edu
Phone: 858-534-8637
Grad Year: 2020

Abstract
Observation is at the heart of much of human-centered research. From studying human communication, to analyzing complex socio-technical systems, to evaluating novel interaction technology, directly seeing natural human activity succeeds when indirect observation or reporting falls short. Increasingly, ubiquitous sensing technologies are giving researchers new ways to observe, enabling the capture of richer and more diverse data than ever before. What was previously just video is now a multifaceted stream of sight, sound, and real-time interpretation. Still, deploying sensors in the wild, capturing synchronized data, and ultimately exploring and analyzing natural behavior is both time-consuming and technically demanding. To reduce this barrier and enhance researchers' abilities to conduct rich human-centered research, we developed ChronoSense, an extensible multimodal data capture system. We highlight how its use during in-situ observation generated insights that iteratively shaped the system across five different studies. We demonstrate the key advantages that emerged while working with naturally captured, temporally-linked data, and show how our approach accelerated analysis while maintaining ecological validity. Ultimately, this work shows how combining the in-depth study of activity and behavior from ethnography and anthropology with modern sensors allows researchers to effectively embrace the concept of Computational Ethnography, which provides the means to accelerate sense-making of complex and large behavioral data.
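
To make the idea of "temporally-linked" multimodal data concrete, the sketch below illustrates one simple way such alignment can work: each modality is treated as a timestamped stream on a shared capture clock, and samples from one stream are paired with the nearest-in-time samples from another. This is a hypothetical illustration only, not the ChronoSense API; the names (Sample, align_streams) and the tolerance-based pairing strategy are assumptions for the example.

    from dataclasses import dataclass
    from typing import Any, List, Tuple

    @dataclass
    class Sample:
        timestamp: float  # seconds on a shared capture clock
        payload: Any      # e.g., a video frame, audio chunk, or gaze point

    def align_streams(a: List[Sample], b: List[Sample],
                      tolerance: float = 0.05) -> List[Tuple[Sample, Sample]]:
        """Pair each sample in stream `a` with the nearest sample in stream `b`
        whose timestamp lies within `tolerance` seconds (streams assumed sorted)."""
        pairs, j = [], 0
        for s in a:
            # advance j while the next b-sample is closer in time to s
            while j + 1 < len(b) and \
                    abs(b[j + 1].timestamp - s.timestamp) < abs(b[j].timestamp - s.timestamp):
                j += 1
            if b and abs(b[j].timestamp - s.timestamp) <= tolerance:
                pairs.append((s, b[j]))
        return pairs

    # Example: align sparse gaze samples (10 Hz) to 30 fps video frames
    video = [Sample(t / 30.0, f"frame {t}") for t in range(90)]
    gaze = [Sample(0.10 * k + 0.004, (0.5, 0.5)) for k in range(30)]
    print(len(align_streams(video, gaze)))

Pairing on a shared clock like this is what lets an analyst scrub through video while seeing the sensor readings captured at the same moment, which is the kind of exploration the abstract describes.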

Industry Application Area(s)
Life Sciences/Medical Devices & Instruments | Human Factors, UX

Related Links:

  1. chronosense.io
