Ulla Sternemann

Bachelor's Thesis

Smart annotation support for daily activity labeling

Arne Küderle (M.Sc.), Prof. Dr. Björn Eskofier

03/2018 – 08/2019

Over the past decade, human activity analysis has become a commonly used modality in modern health care approaches. Being able to track basic activities on a day-to-day basis can provide key insights into a user’s lifestyle and associated medical implications. However, the success of these approaches depends on the quality of the available data. The use of (wearable) sensor systems in combination with machine learning algorithms has become a common approach to gain insights into data recorded in a user’s daily life. However, all these algorithms require large amounts of labeled data for training [1]. Furthermore, it is often impossible to detect very specific or very complex activities with high accuracy [2]. In these cases, it is still necessary to ask the user for manual input, for example in the form of diary-like activity labeling where users are asked to either document their current behavior every couple of minutes or retrospectively recall their activities at the end of the day [3].
While manual labeling by the user is often the only way to gather information about their day-to-day life, several drawbacks have been reported in the literature. The labels provided by users are not always accurate due to unintentional or intentional misreporting [3]. Furthermore, the process of manual labeling is time-consuming, disruptive, and hence tedious for users, which can lead to low compliance [1, 3, 4].

To overcome these issues, several approaches have been proposed. Van Laerhoven et al. used visualizations of recorded data and additional context information to help people recall their daily activities during the labeling process at the end of the day [5, 6]. As another example, Carter et al. recorded several types of media, e.g. photographs or audio recordings, throughout the day and presented them to the user as a memory aid when completing the annotations at a later point in time [7]. All these approaches attempt to increase the quality of the labels and to facilitate the labeling process, which should result in more accurate and more compliant logging of users' daily activities. However, even with these approaches, labeling remains an effortful and time-consuming process for the user (or researcher).

The goal of this work is to simplify diary-like labeling approaches by supporting the labeling effort with recommender systems trained on the user's past activities and additional context information. This work builds upon similar concepts in related research fields. One example is an approach for automatic diary generation, where daily diary entries of high-level activities are generated by data mining techniques on a smartphone without requiring the user to enter any information [8]. Another example is activity recognition based on body-worn sensors, which often uses semi-supervised learning techniques, as these do not require large amounts of labeled data [1].
In the proposed work, such a recommender system will be implemented as an Android app, which provides an interface for manual activity labeling and additionally records available smartphone data (such as Wi-Fi connectivity, location, or screen-on time) continuously in the background. Using a combination of the recorded sensor data and the manual labels provided by the user, the system should identify patterns in the user's daily life. This information is then used to support the labeling of future activities by providing smart recommendations, ideally reducing the amount of required manual input over time.
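The core recommendation idea can be illustrated with a minimal sketch. This is not the thesis implementation (which is an Android app); it is a hypothetical Python example, with an invented context representation (hour of day and connected Wi-Fi network), showing how past labeled activities could be ranked by context similarity to suggest likely labels for the current moment.

```python
from collections import Counter

# Hypothetical history: each past label is stored together with the
# smartphone context observed at labeling time.
history = [
    {"hour": 8,  "wifi": "home",   "label": "breakfast"},
    {"hour": 9,  "wifi": "office", "label": "working"},
    {"hour": 13, "wifi": "office", "label": "lunch"},
    {"hour": 9,  "wifi": "office", "label": "working"},
    {"hour": 20, "wifi": "home",   "label": "dinner"},
]

def recommend_labels(hour, wifi, top_n=3):
    """Rank past labels by a simple context-similarity score.

    Entries recorded on the same Wi-Fi network score higher, and
    entries whose hour of day is close to the current hour score
    higher; the per-entry scores are summed per label.
    """
    scores = Counter()
    for entry in history:
        score = 0.0
        if entry["wifi"] == wifi:
            score += 1.0
        # hours wrap around midnight: distance on a 24-hour circle
        diff = min(abs(entry["hour"] - hour), 24 - abs(entry["hour"] - hour))
        score += max(0.0, 1.0 - diff / 3.0)
        scores[entry["label"]] += score
    return [label for label, _ in scores.most_common(top_n)]

print(recommend_labels(9, "office"))  # → ['working', 'lunch', 'breakfast']
```

A real system would replace this hand-tuned scoring with a learned model and richer context features, but the interface stays the same: given the current context, return a short ranked list of candidate labels for the user to confirm with a single tap.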
This app could then be used as a supportive modality for sensor-based studies, either to provide a convenient way to gather ground-truth labels for training activity recognition algorithms, or to add context information that requires manual user input, aiding the analysis of the recorded sensor data. Reducing the labeling effort for study participants should ideally result in higher-quality data and greater motivation to take part in longer studies.


  1.  M. Stikic, K. Van Laerhoven, and B. Schiele, “Exploring Semi-Supervised and Active Learning for Activity Recognition,” in 2008 12th IEEE International Symposium on Wearable Computers, 2008, pp. 81–88.
  2.  S. Dernbach, B. Das, N. C. Krishnan, B. L. Thomas, and D. J. Cook, “Simple and complex activity recognition through smart phones,” in 2012 Eighth International Conference on Intelligent Environments, 2012, pp. 214–221.
  3. N. Bolger, A. Davis, and E. Rafaeli, “Diary Methods: Capturing Life as it is Lived,” Annual Review of Psychology, vol. 54, no. 1, pp. 579–616, 2002.
  4.  A. A. Stone, S. Shiffman, J. E. Schwartz, J. E. Broderick, and M. R. Hufford, “Patient compliance with paper and electronic diaries,” Controlled Clinical Trials, vol. 24, no. 2, pp. 182–199, 2003.
  5. M. Dietrich and K. Van Laerhoven, “Recall your Actions! Using Wearable Activity Recognition to Augment the Human Mind,” in Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (UbiComp ’14 Adjunct), 2014, pp. 1347–1353.
  6. K. Van Laerhoven, D. Kilian, and B. Schiele, “Using rhythm awareness in long-term activity recognition,” in 2008 12th IEEE International Symposium on Wearable Computers, 2008, pp. 63–66.
  7. S. Carter and J. Mankoff, “When participants do the capturing: the role of media in diary studies,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’05), 2005, pp. 899–908.
  8.  J. Liao, Z. Wang, L. Wan, Q. C. Cao, and H. Qi, “Smart Diary: A Smartphone-Based Framework for Sensing, Inferring, and Logging Users’ Daily Life,” IEEE Sensors Journal, vol. 15, no. 5, pp. 2761–2773, 2015.