05/2018 – 10/2018
Wearable sensors have the potential to assess human movement outside the laboratory, e.g. for the ambulant assessment of patients or field studies of athletes. Machine learning methods have been proposed to learn the mapping between inertial sensor data and biomechanical parameters from laboratory measurements. Large amounts of training data are necessary to represent all relevant conditions, such as movement types, population characteristics, and measurement setups; deep neural networks in particular require large annotated datasets. Complementarily, biomechanical parameters can be obtained by tracking sensor data with a musculoskeletal model, i.e. by solving an optimal control problem. This method does not require training data, but suffers from the limitations of the physical model. In this work, the physics-based and the data-based approach should be combined to take advantage of both methods. The idea is to employ the physical simulation to synthesize training data. The synthesized data should then be transformed to better match real data, e.g. by modeling sensor noise or movement artefacts. A learning framework should be trained on both real and synthetic data.
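The transformation step could be sketched as follows: a minimal, illustrative augmentation that adds white sensor noise and a random-walk bias drift to a simulated inertial signal. The function name and the noise magnitudes are assumptions for illustration only, not calibrated sensor parameters.

```python
import numpy as np

def augment_synthetic_imu(signal, noise_std=0.05, bias_std=0.02, seed=None):
    """Make a simulated IMU signal look more like real sensor data.

    signal: (n_samples, n_channels) array from the musculoskeletal simulation.
    noise_std: standard deviation of per-sample white noise (illustrative value).
    bias_std: magnitude of the slowly drifting bias (illustrative value).
    """
    rng = np.random.default_rng(seed)
    n_samples, n_channels = signal.shape
    # Per-sample white measurement noise.
    white = rng.normal(0.0, noise_std, size=(n_samples, n_channels))
    # Slowly drifting bias, modeled as a random walk.
    steps = rng.normal(0.0, bias_std / np.sqrt(n_samples),
                       size=(n_samples, n_channels))
    drift = np.cumsum(steps, axis=0)
    return signal + white + drift

# Example: corrupt a simulated 3-axis accelerometer trace.
sim = np.zeros((1000, 3))  # stand-in for simulated accelerations
real_like = augment_synthetic_imu(sim, seed=0)
```

More sophisticated transformations, such as the adversarial refinement of simulated data proposed by Shrivastava et al. (see references below), learn this mapping from unlabeled real recordings instead of specifying it by hand.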
- A. J. van den Bogert, D. Blana, and D. Heinrich, Implicit methods for efficient musculoskeletal simulation and optimal control, Procedia IUTAM, vol. 2, pp. 297-316, 2011.
- A. Shrivastava, T. Pfister, O. Tuzel, J. Susskind, W. Wang, and R. Webb, Learning from Simulated and Unsupervised Images through Adversarial Training, 2016.
- J. Hannink, T. Kautz, C. F. Pasluosta, K.-G. Gassmann, J. Klucken, and B. M. Eskofier, Sensor-based Gait Parameter Extraction with Deep Convolutional Neural Networks, IEEE Journal of Biomedical and Health Informatics, pp. 1-8, 2016.
- S. Yao, S. Hu, Y. Zhao, A. Zhang, and T. Abdelzaher, DeepSense: a Unified Deep Learning Framework for Time-Series Mobile Sensing Data Processing, in Proc. WWW, 2017.