Toward on-device weight monitoring from selfie face images using smartphones
Abstract
Obesity is a serious health problem that is on the rise globally. Recent studies suggest that body mass index (BMI) can be inferred from facial images using deep learning-based convolutional neural networks (CNNs), achieving obesity classification accuracy of about 85–90%. However, training and testing these deep learning models involve high computation and storage costs because the models contain millions of parameters. A recent trend is the use of lightweight CNN models to facilitate on-device computation on resource-constrained mobile and wearable devices. In this study, we evaluate several lightweight CNNs, namely MobileNet-V2, ShuffleNet-V2, and lightCNN-29, for BMI prediction and obesity classification from facial images captured using smartphones. A comparative analysis is performed against heavyweight VGG-16- and ResNet-50-based CNN models. When deployed on smartphones, these lightweight models can act as self-diagnostic tools for monitoring weight changes and obesity. Such tools can facilitate remote patient monitoring, the collection of patients' vital signs, and improvements in the quality of care provided. Self-diagnostic tools would also help keep users' health data private, safe, and secure.
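To make the described setup concrete, the sketch below shows one plausible way to adapt a lightweight backbone such as MobileNet-V2 for BMI regression from face crops. This is an illustrative assumption, not the authors' released code: the function name, input size, hyperparameters, and use of PyTorch/torchvision are all hypothetical choices for demonstration.

```python
# Minimal sketch (assumption, not the study's actual implementation):
# fine-tuning an ImageNet-pretrained MobileNet-V2 for BMI regression.
import torch
import torch.nn as nn
from torchvision import models

def build_bmi_regressor():
    # Lightweight backbone; the "IMAGENET1K_V1" weights string requires torchvision >= 0.13.
    backbone = models.mobilenet_v2(weights="IMAGENET1K_V1")
    # Replace the 1000-class ImageNet head with a single continuous output (BMI).
    backbone.classifier[1] = nn.Linear(backbone.last_channel, 1)
    return backbone

model = build_bmi_regressor()
criterion = nn.MSELoss()  # regress continuous BMI values
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy training step on a batch of 224x224 face crops (placeholder data).
faces = torch.randn(4, 3, 224, 224)
bmi_targets = torch.tensor([[22.5], [31.2], [27.8], [19.6]])
pred = model(faces)
loss = criterion(pred, bmi_targets)
loss.backward()
optimizer.step()

# Obesity classification can then follow the standard WHO cutoff of BMI >= 30.
obese = pred.detach() >= 30.0
```

In this sketch, obesity labels are derived from predicted BMI via a threshold; alternatively, the final layer could output class logits directly and be trained with a cross-entropy loss, depending on the study's formulation.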