Human Activity Recognition (HAR)

By: Kristina P. Sinaga

Human activity recognition (HAR), a challenging time series classification task, has been an important area of computer vision research since the 1980s. Anguita et al. (2013) [1] define HAR as a framework for identifying the actions carried out by a person given a set of observations of that person and the surrounding environment. These observations can be recorded remotely using video cameras, wearable devices, and binary sensors.

Generally, activity recognition (AR) using binary sensors provides information for monitoring elderly people and people who live independently. It involves predicting a person's movement from sensor data, and traditionally it requires deep domain expertise and signal-processing methods to engineer features from the raw data that fit a machine learning model. Passive infrared sensors and motion detectors enable lightweight devices to recognize more complex activities, such as toileting, grooming, cooking, or undertaking light housework.
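The feature-engineering step mentioned above is often implemented as a sliding window over the raw signal, with summary statistics computed per window. A minimal sketch, using a synthetic trace (the window size, step, and feature set are illustrative assumptions, not a standard):

```python
import numpy as np

def window_features(signal, window_size=50, step=25):
    """Slide a fixed-size window over a 1-D sensor signal and
    compute simple statistical features for each window."""
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        window = signal[start:start + window_size]
        features.append([window.mean(), window.std(),
                         window.min(), window.max()])
    return np.array(features)

# Synthetic accelerometer trace: a quiet period followed by movement.
rng = np.random.default_rng(0)
trace = np.concatenate([rng.normal(0.0, 0.1, 200),   # resting
                        rng.normal(0.0, 1.0, 200)])  # active
X = window_features(trace)
print(X.shape)  # one feature row per window, four features each
```

Each row of `X` could then be fed to any standard classifier; the standard deviation alone already separates the resting and active segments in this toy trace.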

Activity recognition based on sensor data has been the focus of extensive research. Unlike video cameras and wearable devices, sensors do not directly identify people but can still recognize their activities. For instance, an accelerometer in a smartphone is often used to measure a person's movement. It measures the force of acceleration, whether caused by movement, gravity, or vibration, and can track each motion: moving uphill, falling over, tilting, moving horizontally, or pointing downward. Its measurements remain stable across changes in velocity or speed. Moreover, it can protect individual privacy without disrupting the observations. Sensor-based recognition is therefore a reasonable and practical way to reduce the cost and complexity of techniques such as video cameras and wearable devices.
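Because a resting phone reads roughly 1 g of gravitational acceleration, one simple way to use the accelerometer is to compare the magnitude of a 3-axis sample against that baseline. A minimal sketch (the 1 g baseline and tolerance are illustrative assumptions):

```python
import math

def magnitude(ax, ay, az):
    """Euclidean norm of a 3-axis accelerometer sample (in g)."""
    return math.sqrt(ax**2 + ay**2 + az**2)

def is_moving(sample, rest=1.0, tol=0.15):
    """Flag a sample as movement when its magnitude deviates from
    the 1 g gravity baseline by more than a tolerance."""
    return abs(magnitude(*sample) - rest) > tol

print(is_moving((0.0, 0.0, 1.0)))  # phone lying flat: False
print(is_moving((0.6, 0.3, 1.2)))  # shaken or tilted: True
```

Real HAR systems go well beyond this thresholding, but the magnitude signal is a common starting point because it is independent of the phone's orientation.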

Here is a scenario to help the reader better understand human activity recognition [2]. A man has an elderly mother living alone one hour away. Last week she knocked the phone off the hook and was unreachable for an entire day. The man walks into a hardware store and emerges with a large brown box. It contains several dozen nondescript, quarter-sized sensors that stick to any surface. Following the directions, the man attaches sensors to doors, drawers, and chairs. He pulls out a CD-ROM, installs software on a personal computer, and plugs a device into a USB port. The software instructs him to perform a quick walk-through of the house, touching every sensor. Later that week the man logs onto the internet, types a password, and checks to see that his mother has eaten lunch. One week later he checks that she has been cooking and eating meals. One month later he checks whether her activity levels are steady. The system reports that activity levels are abnormally low today. He calls and finds that his mother seems to be coming down with the flu.
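The "abnormally low activity" alert in this scenario can be approximated by counting binary-sensor firings per day and flagging days that fall well below the average. A minimal sketch with a hypothetical event log (the sensor names and the 60% threshold are illustrative assumptions):

```python
from collections import Counter

# Hypothetical log of (day, sensor_id) firings from door/drawer/chair sensors.
events = [(1, "kitchen_drawer"), (1, "front_door"), (1, "chair"),
          (2, "kitchen_drawer"), (2, "chair"),
          (3, "front_door")]

daily_counts = Counter(day for day, _ in events)
baseline = sum(daily_counts.values()) / len(daily_counts)

# Flag days whose activity falls well below the average.
alerts = [day for day, n in daily_counts.items() if n < 0.6 * baseline]
print(alerts)  # day 3 saw only one sensor firing
```

A deployed system would use a longer baseline and per-activity models, but the idea of comparing today's event volume against a personal norm is the same.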

References

  • Anguita, D., et al. (2013). A public domain dataset for human activity recognition using smartphones. In: Proceedings of the 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning. Bruges, pp. 437-442.
  • Wilson, D. H., Atkeson, C. (2005). Simultaneous Tracking and Activity Recognition (STAR) Using Many Anonymous, Binary Sensors. In: Gellersen, H. W., Want, R., Schmidt, A. (eds) Pervasive Computing (Pervasive 2005), Lecture Notes in Computer Science, vol. 3468. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11428572_5