News | Artificial Intelligence | August 24, 2020

Selfies Might Be Used to Detect Heart Disease

New research uses artificial intelligence to analyze facial photos

Selfies might be used with AI to identify patients with heart disease. Getty Images

August 24, 2020 — Sending a photo selfie to the doctor could be a cheap and simple way of detecting heart disease using artificial intelligence (AI), according to the authors of a new study published in the European Heart Journal.[1]

The study is the first to show that it is possible to use a deep learning computer algorithm to detect coronary artery disease (CAD) by analyzing four photographs of a person’s face.

Although the algorithm needs to be developed further and tested in larger groups of people from different ethnic backgrounds, the researchers say it has the potential to be used as a screening tool that could identify possible heart disease in people in the general population or in high-risk groups, who could be referred for further clinical investigations.

“To our knowledge, this is the first work demonstrating that artificial intelligence can be used to analyze faces to detect heart disease. It is a step towards the development of a deep learning-based tool that could be used to assess the risk of heart disease, either in outpatient clinics or by means of patients taking ‘selfies’ to perform their own screening. This could guide further diagnostic testing or a clinical visit,” said Professor Zhe Zheng, who led the research and is vice director of the National Center for Cardiovascular Diseases and vice president of Fuwai Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, People’s Republic of China.

“Our ultimate goal is to develop a self-reported application for high-risk communities to assess heart disease risk in advance of visiting a clinic," he explained. "This could be a cheap, simple and effective way of identifying patients who need further investigation. However, the algorithm requires further refinement and external validation in other populations and ethnicities.”

It is known already that certain facial features are associated with an increased risk of heart disease. These include thinning or grey hair, wrinkles, ear lobe crease, xanthelasmata (small, yellow deposits of cholesterol underneath the skin, usually around the eyelids) and arcus corneae (fat and cholesterol deposits that appear as a hazy white, grey or blue opaque ring in the outer edges of the cornea). However, they are difficult for humans to use successfully to predict and quantify heart disease risk.

Zheng, Professor Xiang-Yang Ji, who is director of the Brain and Cognition Institute in the Department of Automation at Tsinghua University, Beijing, and other colleagues enrolled 5,796 patients from eight hospitals in China to the study between July 2017 and March 2019. The patients were undergoing imaging procedures to investigate their blood vessels, such as coronary angiography or coronary computed tomography angiography (CCTA). They were divided randomly into training (5,216 patients, 90%) or validation (580, 10%) groups.

Trained research nurses took four facial photos with digital cameras: one frontal, two profiles and one view of the top of the head. They also interviewed the patients to collect data on socioeconomic status, lifestyle and medical history. Radiologists reviewed the patients’ angiograms and assessed the degree of heart disease depending on how many blood vessels were narrowed by 50% or more (≥ 50% stenosis), and their location. This information was used to create, train and validate the deep learning algorithm.
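The labelling and split described above can be sketched in code. This is a minimal illustration with invented names (`label_cad`, `split_train_validation`, and the toy patient records are all hypothetical); the study's actual pipeline is not described in that level of detail here.

```python
import random

def label_cad(stenoses):
    """Binary CAD label: positive if any vessel is narrowed by 50% or more."""
    return any(s >= 0.50 for s in stenoses)

def split_train_validation(patients, train_frac=0.90, seed=42):
    """Randomly split patients into training (90%) and validation (10%) sets,
    mirroring the 5,216 / 580 split reported in the study."""
    shuffled = patients[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

# Toy example: each "patient" is (id, per-vessel stenosis fractions)
patients = [(f"patient_{i}", [random.random() for _ in range(3)])
            for i in range(100)]
labeled = [(pid, label_cad(sten)) for pid, sten in patients]
train, val = split_train_validation(labeled)
print(len(train), len(val))  # prints: 90 10
```

The key design point is that the ≥ 50% stenosis threshold collapses a graded angiographic reading into a binary target for the deep learning model, which is also the simplification the editorialists later criticize.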

The researchers then tested the algorithm on a further 1,013 patients from nine hospitals in China, enrolled between April 2019 and July 2019. The majority of patients in all the groups were of Han Chinese ethnicity.

They found that the algorithm out-performed existing methods of predicting heart disease risk (the Diamond-Forrester model and the CAD consortium clinical score). In the validation group of patients, the algorithm correctly detected heart disease in 80% of cases (the true positive rate, or ‘sensitivity’) and correctly detected that heart disease was not present in 61% of cases (the true negative rate, or ‘specificity’). In the test group, the sensitivity was 80% and the specificity was 54%.
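The relationship between these rates and the confusion-matrix counts can be shown with a short sketch. The counts below are hypothetical, chosen only to reproduce the reported test-group rates; they are not figures from the paper.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts.

    sensitivity = TP / (TP + FN)  -- true positive rate
    specificity = TN / (TN + FP)  -- true negative rate
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts mirroring the reported test-group performance
# (80% sensitivity, 54% specificity); not the study's actual data.
sens, spec = sensitivity_specificity(tp=80, fn=20, tn=54, fp=46)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
print(f"false positive rate = {1 - spec:.0%}")  # prints: 46%
```

This also makes explicit why Prof. Ji cites a 46% false positive rate: it is simply 100% minus the 54% specificity observed in the test group.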

“The algorithm had a moderate performance, and additional clinical information did not improve its performance, which means it could be used easily to predict potential heart disease based on facial photos alone," Prof. Ji said. "The cheek, forehead and nose contributed more information to the algorithm than other facial areas. However, we need to improve the specificity as a false positive rate of as much as 46% may cause anxiety and inconvenience to patients, as well as potentially overloading clinics with patients requiring unnecessary tests.”

As well as requiring testing in other ethnic groups, limitations of the study include the fact that only one center in the test group differed from the centers that provided patients for developing the algorithm, which may further limit the algorithm's generalizability to other populations.

In an accompanying editorial,[2] Charalambos Antoniades, professor of Cardiovascular Medicine at the University of Oxford, U.K., and Dr Christos Kotanidis, a DPhil student working under Prof. Antoniades at Oxford, write, “Overall, the study by Lin et al. highlights a new potential in medical diagnostics... The robustness of the approach of Lin et al. lies in the fact that their deep learning algorithm requires simply a facial image as the sole data input, rendering it highly and easily applicable at large scale.”

They continued, “Using selfies as a screening method can enable a simple yet efficient way to filter the general population towards more comprehensive clinical evaluation. Such an approach can also be highly relevant to regions of the globe that are underfunded and have weak screening programs for cardiovascular disease. A selection process that can be done as easily as taking a selfie will allow for a stratified flow of people that are fed into healthcare systems for first-line diagnostic testing with CCTA. Indeed, the ‘high risk’ individuals could have a CCTA, which would allow reliable risk stratification with the use of the new, AI-powered methodologies for CCTA image analysis.”

They highlight some of the limitations that Prof. Zheng and Prof. Ji also include in their paper. These include the low specificity of the test, that the test needs to be improved and validated in larger populations, and that it raises ethical questions about “misuse of information for discriminatory purposes. Unwanted dissemination of sensitive health record data, that can easily be extracted from a facial photo, renders technologies such as that discussed here a significant threat to personal data protection, potentially affecting insurance options. Such fears have already been expressed over misuse of genetic data, and should be extensively revisited regarding the use of AI in medicine”.

The authors of the research paper agree on this point. Prof. Zheng said: “Ethical issues in developing and applying these novel technologies is of key importance. We believe that future research on clinical tools should pay attention to the privacy, insurance and other social implications to ensure that the tool is used only for medical purposes.”

Prof. Antoniades and Dr. Kotanidis also write in their editorial that defining CAD as ≥ 50% stenosis in one major coronary artery “may be a simplistic and rather crude classification as it pools in the non-CAD group individuals that are truly healthy, but also people who have already developed the disease but are still at early stages (which might explain the low specificity observed)."


References:

1. Shen Lin et al. Feasibility of using deep learning to detect coronary artery disease based on facial photo. European Heart Journal. doi:10.1093/eurheartj/ehaa640.

2. Christos P. Kotanidis and Charalambos Antoniades. Selfies in cardiovascular medicine: welcome to a new era of medical diagnostics. European Heart Journal. doi:10.1093/eurheartj/ehaa608.
