We now have a new way that your smartphone could save your life, using a simple selfie. Conventional camera sales tumbled after the 2007 launch of the first-generation Apple iPhone with its 2 MP rear camera. Apple's 2010 iPhone 4 gave the world the first widely distributed smartphone with a front-facing image and video camera, a 0.3 MP VGA device that, in turn, launched the "selfie." Smartphone apps and wireless connectivity have been critical enabling components of wearable and portable digital health and medical technologies. Now, new research from multiple Chinese universities and hospitals suggests selfies could join that toolkit.

In a study published on August 21, 2020, in the European Heart Journal, lead researcher Zhe Zheng and colleagues demonstrated that AI could detect heart disease by applying deep learning to patient selfies. Among other positions, Zheng is vice director of the National Center for Cardiovascular Diseases and vice president of Fuwai Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, in Beijing. The team enrolled nearly 6,000 cardiac patients from eight Chinese hospitals, all of whom were already undergoing imaging to investigate their blood vessels, and divided them into training and validation groups. Trained research nurses took four digital facial photos of each patient and interviewed them to record medical history, lifestyle, and socioeconomic data. Radiologists then reviewed the patients' angiograms and graded the degree of heart disease based on blood-vessel narrowing and its location. The researchers used the facial images and other data to build, train, and test a deep-learning algorithm.
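To make the approach concrete, here is a minimal sketch of what such a pipeline might look like: a small convolutional network that maps a patient's four facial photos to a binary heart-disease label derived from the angiogram. The architecture, names, and hyperparameters below are illustrative assumptions, not the authors' published model.

```python
# Hypothetical sketch: a small CNN mapping four facial photos per patient
# to a binary coronary-artery-disease label. Illustrative only; not the
# study's actual architecture or hyperparameters.
import torch
import torch.nn as nn

class FaceCADNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared convolutional encoder applied to each of the four photos.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # per photo: (batch, 64, 1, 1)
        )
        # Classifier over the concatenated per-photo features.
        self.classifier = nn.Linear(64 * 4, 1)

    def forward(self, photos):
        # photos: (batch, 4, 3, H, W), four RGB views per patient.
        feats = [self.encoder(photos[:, i]).flatten(1) for i in range(4)]
        logits = self.classifier(torch.cat(feats, dim=1))
        return logits.squeeze(1)  # raw score; sigmoid gives P(disease)

model = FaceCADNet()
loss_fn = nn.BCEWithLogitsLoss()  # binary label: significant narrowing or not
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on random stand-in data.
photos = torch.randn(8, 4, 3, 128, 128)     # batch of 8 patients
labels = torch.randint(0, 2, (8,)).float()  # angiogram-derived labels
optimizer.zero_grad()
loss = loss_fn(model(photos), labels)
loss.backward()
optimizer.step()
```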

Once the algorithm was established, the team tested it on another 1,013 patients from nine hospitals. It outperformed two conventional clinical risk models (the Diamond-Forrester model and the CAD consortium clinical score). In the original validation group, the deep-learning model achieved 80% sensitivity (the rate of correctly flagging diseased patients) and 61% specificity (the rate of correctly clearing healthy patients). In the test group, sensitivity held at 80%, but specificity slipped to 54%.
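For readers unfamiliar with these metrics, the short sketch below shows how sensitivity and specificity fall out of raw prediction counts. The counts are made up to match the reported test-group rates; they are not the paper's actual data.

```python
# Sensitivity and specificity from confusion-matrix counts.
def sensitivity(tp, fn):
    return tp / (tp + fn)  # true-positive rate: diseased patients caught

def specificity(tn, fp):
    return tn / (tn + fp)  # true-negative rate: healthy patients cleared

# Illustrative counts chosen to reproduce the test-group rates.
tp, fn = 80, 20  # 100 patients with disease: 80 flagged, 20 missed
tn, fp = 54, 46  # 100 patients without disease: 54 cleared, 46 false alarms
print(f"sensitivity = {sensitivity(tp, fn):.0%}")  # 80%
print(f"specificity = {specificity(tn, fp):.0%}")  # 54%
```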

The study's discussion section rated the algorithm only moderately successful because of its relatively low specificity. However, because adding clinical information and test results did not improve diagnostic accuracy, the team concluded that the algorithm could be used on its own to predict potential heart disease from facial photos. Analysis showed that the nose, forehead, and cheeks provided more useful signal than other facial areas, supporting the researchers' belief that patient selfies have potential for cardiac-disease self-screening.
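How does one determine which facial regions a model relies on? One common technique is occlusion testing: mask a region of the input and measure how much the prediction drops. The sketch below illustrates that general idea, reusing the hypothetical FaceCADNet from the earlier sketch; it is not necessarily the exact attribution method the study used.

```python
# Occlusion testing: score each named facial region by how much the
# model's predicted probability falls when that region is masked out.
# General-purpose illustration, not the study's published method.
import torch

def region_importance(model, photos, regions):
    """photos:  (1, 4, 3, H, W) tensor, one patient's four views.
    regions: dict of name -> (top, left, height, width) in pixels."""
    model.eval()
    with torch.no_grad():
        baseline = torch.sigmoid(model(photos)).item()
        scores = {}
        for name, (top, left, h, w) in regions.items():
            masked = photos.clone()
            masked[..., top:top + h, left:left + w] = 0.0  # black out region
            scores[name] = baseline - torch.sigmoid(model(masked)).item()
    return scores  # larger drop means the region mattered more

# Illustrative region boxes for a 128x128 aligned face crop.
regions = {"forehead": (0, 32, 32, 64),
           "nose": (48, 48, 40, 32),
           "left cheek": (64, 8, 40, 40)}
```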

Zheng's team continues to refine the algorithm, aiming to improve its specificity and thereby reduce false-positive results. The study's limitations include narrow population sampling: the patients in all three groups were of Han Chinese ethnicity. The paper also raises concerns about potential misuse of facial data for discriminatory purposes or breaches of personal data security.

We recognize the need for broader testing and for safeguards protecting ethnic and personal identity. All the same, the potential for a simple selfie to outperform established clinical risk models in screening for heart disease shines a bright light on selfies, AI, and machine learning as tools against one of the most frequent causes of death worldwide.