Abstract
Ambient Assisted Living using mobile device sensors is an active area of research in pervasive computing. Multiple approaches have shown that wearable sensors perform very well and reliably distinguish falls from Activities of Daily Living. However, these systems are tested in controlled environments and are optimized for a given set of sensor types, sensor positions, and subjects. In this work, we propose a self-adaptive pervasive fall detection approach that is robust to the heterogeneity of real-life situations. To this end, we combine sensor data from four publicly available datasets, covering about 100 subjects, 5 devices, and 3 sensor placements. In a comprehensive evaluation, we show that our system is not only robust across the different dimensions of heterogeneity, but also adapts autonomously to spontaneous changes in sensor position at runtime.