Phone sensor → cloud

Human Activity Recognition (HAR) based on wearable sensor data aims to identify a person's actions, such as standing, sitting, jumping, and going up stairs, by using data from body-worn sensors. Conventional pattern recognition based on heuristic, hand-crafted feature extraction does not generalize well. In contrast, automatic high-level feature extraction using deep learning shows promise in many domains.
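To make the hand-crafted-feature approach concrete, here is a minimal sketch of the classic pipeline: slide a fixed-size window over a raw sensor signal and compute simple statistics per window. The function name and the particular features (mean, standard deviation, min, max) are illustrative choices, not a prescribed method.

```python
import numpy as np

def extract_window_features(signal, window_size, step):
    """Slide a fixed-size window over a 1-D sensor signal and
    compute simple hand-crafted features for each window.

    Returns an array of shape (num_windows, 4) with
    [mean, std, min, max] per window.
    """
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        w = signal[start:start + window_size]
        features.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(features)

# Example: 50 Hz accelerometer data in 128-sample windows with
# 50% overlap (the windowing used by the UCI HAR dataset below).
accel = np.sin(np.linspace(0, 20, 256))
feats = extract_window_features(accel, window_size=128, step=64)
```

A deep-learning model would replace the fixed statistics above with features learned directly from the raw windows.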

Public datasets are available for analysis. One example is the Human Activity Recognition Using Smartphones Data Set, built from recordings of 30 subjects performing activities of daily living (ADL) while carrying a waist-mounted smartphone with embedded inertial sensors.

By using your smartphone browser and saving the data to Google Sheets, you can generate your own data to verify the solutions. Sensor readings carry high-precision timestamps, enabling synchronization of data between different sensors such as the accelerometer and gyroscope.
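Since the accelerometer and gyroscope typically sample at slightly different times, one common way to use those timestamps is to resample one stream onto the other's clock. The sketch below, with an assumed function name and array layout, aligns three-axis gyroscope readings to accelerometer timestamps by per-axis linear interpolation.

```python
import numpy as np

def align_streams(accel_t, gyro_t, gyro_xyz):
    """Resample gyroscope samples onto the accelerometer's timestamps
    using per-axis linear interpolation.

    accel_t, gyro_t: 1-D arrays of timestamps, sorted ascending.
    gyro_xyz: array of shape (len(gyro_t), 3) with x/y/z readings.
    Returns an array of shape (len(accel_t), 3).
    """
    return np.column_stack(
        [np.interp(accel_t, gyro_t, gyro_xyz[:, axis]) for axis in range(3)]
    )

# Gyroscope sampled at t=0 and t=2; accelerometer sampled at t=1.
gyro_t = np.array([0.0, 2.0])
gyro_xyz = np.array([[0.0, 0.0, 0.0],
                     [2.0, 4.0, 6.0]])
aligned = align_streams(np.array([1.0]), gyro_t, gyro_xyz)
```

Linear interpolation is a reasonable default at typical phone sampling rates (50–100 Hz); for large gaps or dropped samples you would want a more careful resampling scheme.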

In 2011 we developed an Android app called Scratch Sensor that allows Scratch to read sensor data from Android phones, and this site is a continuation of that effort. We provide the raw data and leave it to the imagination of users to make use of it.
