www.aging-us.com               AGING 2018, Vol. 10, No. 11

Research Paper

PhotoAgeClock: deep learning algorithms for development of non‐
invasive visual biomarkers of aging 
 
Eugene Bobrov1,2,*, Anastasia Georgievskaya1,3,*, Konstantin Kiselev1,*, Artem Sevastopolsky1,4, 
Alex Zhavoronkov5,6,7, Sergey Gurov2, Konstantin Rudakov3, Maria del Pilar Bonilla Tobar8, Sören 
Jaspers8, Sven Clemann8 
 
1 HautAI OU, Tallinn, Estonia
2 Lomonosov Moscow State University, Moscow, Russia
3 Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Moscow, Russia
4 Skolkovo Institute of Science and Technology, Moscow, Russia
5 Insilico Medicine, Rockville, MD 20850, USA
6 The Buck Institute for Research on Aging, Novato, CA 94945, USA
7 The Biogerontology Research Foundation, London, UK
8 Beiersdorf AG, Hamburg, Germany
* Equal contribution
 
Correspondence to: Anastasia Georgievskaya; email:  [email protected] 
Keywords: photographic aging clock, photographic aging biomarker, age prediction, biomedical imaging, computer vision, 
deep learning 
Received:  August 30, 2018  Accepted:  October 27, 2018  Published:  November 9, 2018 
 
Copyright:  Bobrov  et  al.  This  is  an  open‐access  article  distributed  under  the  terms  of  the  Creative  Commons  Attribution
License  (CC  BY  3.0),  which  permits  unrestricted  use,  distribution,  and  reproduction  in  any  medium,  provided  the  original
author and source are credited.
 
ABSTRACT

Aging biomarkers are the qualitative and quantitative indicators of the aging processes of the human body. Estimation of biological age is important for assessing the physiological state of an organism. The advent of machine learning has led to the development of many age predictors, commonly referred to as “aging clocks”, which vary in biological relevance, ease of use, cost, actionability, interpretability, and applications. Here we present and investigate a novel non-invasive class of visual photographic biomarkers of aging. We developed a simple and accurate predictor of chronological age, called the PhotoAgeClock, that uses only anonymized images of eye corners. Deep neural networks were trained on 8,414 anonymized high-resolution images of eye corners labeled with the correct chronological age. For people within the age range of 20 to 80 in a specific population, the model was able to achieve a mean absolute error of 2.3 years and Pearson and Spearman correlations of 0.95.
INTRODUCTION

One of the critical challenges in aging and longevity research, and in healthcare in general, is the development of widely-available and reliable biomarkers of aging. Individuals can be of the same chronological age and have different biological ages. Biological age reflects functional capability and physiological status, whereas perceived age reflects the age other people guess a person to be [1]. Chronological age predictors can be used to identify the divergence between estimated biological or perceived age and true chronological age among people with accelerated or delayed aging [1]. It is important to develop non-invasive photographic biomarkers, since they are capable of providing valuable insights about the condition of the human body. Highly-accurate predictors of chronological age can also be used to evaluate various lifestyle, medical, and cosmetic interventions.

Humans can predict the age of other humans with reasonable accuracy. However, people’s error rates vary across ethnic groups and from person to person. Many diseases, as well as the general health status of a person, are often apparent to trained professionals, family members, and untrained individuals. Humans without prior medical training can detect a variety of acute diseases using just facial images [2].

In order to investigate the biological relevance of photographic biomarkers of biological age, accurate chronological age predictors must be developed and studied. In this study, photographic images of human skin were used to predict age. Wrinkles and changes in skin pigmentation indicate aging, making skin condition a reasonably good predictor of chronological age (here and below, age refers to chronological age). Our main finding was that photographic images of the skin around the eye can serve as a very accurate, non-invasive biomarker of age. We also found that this photographic aging clock was able to estimate age with higher precision than the methylation aging clock commonly referred to as Horvath’s clock [3]. Horvath’s clock is a DNA-based epigenetic measure which is considered to be a state-of-the-art aging biomarker. While age predictors such as the methylation aging clock are accurate in predicting chronological age, and multiple studies suggest the biological relevance of these clocks, there are some questions regarding the biological utility of, and the relationship between, these predictors [4].

There are numerous papers on age prediction with biomarkers; these papers cover a broad range of disciplines, including biology, bioinformatics, machine learning, and computer vision. The use of facial images for age estimation is widespread. This approach is supported by the large number of face images and face datasets available on the internet, such as FG-NET [5], MORPH (the largest public face aging dataset) [6], Adience [7], and IMDB-Wiki [8]. Predictors trained on imaging data are commonly used for related tasks such as gender estimation, landmark estimation, and 3D model reconstruction. Most computational methods invented in the last decade rely on statistical models and manually-designed features [9-12], which are generally useful only for specific tasks [13]. However, with advances in computer vision, age estimation can even be done on unconstrained (in-the-wild, real-life) images. Unconstrained images may contain artifacts such as blur, occlusion, or various degrees of deformation; deformation can occur from either strong facial expressions or head movement. Qawaqneh et al. [14] employed the VGG-Face network for age prediction. Their solution achieves high accuracy of age prediction on the Adience in-the-wild dataset (59.9 exact accuracy and 90.57 1-off accuracy for 8 age groups). Zhang et al. [15] proposed a solution for simultaneous age and gender prediction with Residual Networks of Residual Networks (RoR) pre-trained on the ImageNet [16] and IMDB-WIKI [8] datasets. Their solution achieves very high quality on the Adience dataset (66.7 exact accuracy and 97.38 1-off accuracy) with a 34-layer network. Rothe et al. [8] studied the tasks of chronological (real) and apparent (perceived by other people) age prediction. Their pipeline started with a face detector which determined the position of a face in an input image; the position was then normalized (without using facial landmarks). Next, classification occurred via a convolutional neural network (CNN) pre-trained on the ImageNet dataset and fine-tuned on the IMDB-Wiki dataset. The authors reported a MAE of 3.318 years for apparent age prediction on the IMDB-Wiki dataset, and MAEs of 2.68 and 3.09 years for chronological age prediction on the MORPH2 and FG-NET datasets, respectively. Another application of face images as a biomarker of aging involves imposing age changes on the image using Deep Feature Interpolation [17] and Generative Adversarial Networks (GANs) [18]; it is possible to produce an older image of a person from a recent photograph with high-quality results [17, 18]. One of the popular approaches to age estimation is based on the analysis of various biological data types. For instance, Zhavoronkov et al. [19] predicted age by training ensembles of deep neural networks (DNN) on basic blood biochemistry data. These models were trained on over 60,000 blood biochemistry test samples. The authors reported R2 = 0.82 and MAE = 5.55 years for chronological age prediction. The ensemble of DNNs also identified the five most important markers for predicting human chronological age: albumin, glucose and urea concentrations, alkaline phosphatase activity, and erythrocyte count. Further studies of this approach helped establish its biological relevance by testing a modified predictor on large population datasets, including data from the National Health and Nutrition Examination Survey (NHANES), and demonstrated that DNN-based age predictors can be population-specific by comparing the accuracy of the predictor in Canadian, Korean, and Eastern European populations [20]. Age can also be predicted from the DNA methylation rate, as was discussed in the seminal paper [21]: there, the use of 8,000 samples from 82 Illumina DNA methylation array datasets, encompassing 51 healthy tissues and cell types, allowed age to be predicted with R2 = 0.96 (between methylation-based predicted age and true chronological age) and a MAE of 2.7 years. Despite this high precision, the main disadvantage of DNA methylation-based methods is their invasive nature [5, 21].

Our work is devoted to the use of a deep learning approach for accurate chronological age prediction and the investigation of the features contributing to age prediction. This method only requires a single high-resolution photo of the eye corner area. The eye corner area of the human face is believed to be the most prone to aging [22]. Therefore, we believe it holds important clues for the creation of photographic aging biomarkers.

RESULTS

Neural networks training

The best Xception-based model achieved a MAE of 2.38 and 2.30 years before and after adding skip-connections, respectively (see Table 1). Table 2 contains the accuracy evaluations of the predictions for the various age groups of the Adience dataset.

The neural network model for age prediction was designed to accept images of arbitrary resolution; the convolutional layers then applied kernels of fixed size to the image regions. Lower-resolution images had relatively sharper color transitions, which for the convolutional kernels corresponded to a rougher-looking skin. Conversely, higher image resolution corresponded to smoother colors and smoother-looking skin. Age was greatly overestimated for the lower-resolution images (224 x 224 pixels) and greatly underestimated for the higher-resolution images (424 x 424 pixels) when passed through the developed neural network, whose kernels were trained for 299 x 299 pixel resolution (see Fig. 1). Therefore, it seems that the model heavily depends on skin conditions such as wrinkles and pigmentation.

Table 1. Comparison of the best described approaches to age estimation and their accuracy assessed by MAE, years.

Approach name                                 Dataset                   MAE, years
Xception (this work)                          Eye corner photos         2.38
Xception with skip-connections (this work)    Eye corner photos         2.30
VGG [8]                                       FG-NET                    3.09
VGG [8]                                       MORPH                     2.68
SVR on Gabor filters [5]                      FG-NET                    3.17
Penalized regression model [21]               DNA methylation data      2.70
Ensemble of 21 DNNs [19]                      Blood sample test         5.55

Figure 1. Prediction error (predicted age minus true age) for the same 25 images with various resolutions.
Images were passed through the developed neural network, with kernels trained for 299 x 299 pixels resolution. 

For downsampled images, the convolutional kernels coincided with larger areas of the image, which corresponded to a relatively smoother skin appearance. For upsampled images this pattern was reversed.
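The resolution experiment above can be reproduced in outline with a short script; here `model` is assumed to be the trained, fully convolutional regressor that accepts variable input sizes, and it is not defined in this sketch.

```python
import numpy as np
import cv2

# Hedged sketch of the resolution experiment: the same eye-corner crop is
# resized to three resolutions and passed through the trained network.
# `model` (a Keras-style regressor returning age in years) is an assumption.
def predict_at(model, image_bgr, size):
    resized = cv2.resize(image_bgr, (size, size))
    x = resized.astype("float32") / 127.5 - 1.0   # same [-1, 1] scaling as training
    return float(model.predict(x[np.newaxis])[0, 0])

# for size in (224, 299, 424):
#     print(size, predict_at(model, cv2.imread("eye_corner.jpg"), size))
```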
Effect of area occlusion on the prediction quality

The progression of the prediction error as the image area is occluded is shown in Fig. 2 for two people of different ages. When only the eye area is covered, the accuracy of age prediction is almost the same as for the original, non-occluded image. For the younger subject (50 years), the accuracy decreases dramatically for the image with the eyelid and eye corner areas occluded; the decrease in accuracy is less significant for the older adult (62 years). When half of the eye corner image is occluded, the accuracy falls dramatically for the older subject as well. These results are presented in Fig. 3 as a qualitative evaluation. The same trend was observed in a number of subjects (see Fig. 4).

Table 2. Exact and 1‐off accuracy of age estimation (this work) for 
Adience dataset age groups. 
Age group (years range) Exact accuracy 1-off accuracy

25-32 0.68 0.98

33-38 0.50 1.00

38-44 0.63 0.95

44-48 0.55 0.92

48-54 0.60 0.97

54-60 0.54 0.99

60-69 0.78 0.98


1‐off accuracy  represents the accuracy when the result is off by 1 
adjacent age label left or right [7].

Figure 2. Predicted age vs. the extent of occlusion for two persons. Picture order (top to bottom): original, eye area covered, eyelid and corner covered, and half of the image area covered. See text for clarification. The real chronological age of the left subject is 50 years and of the right subject is 62 years.

Figure 3. Estimated age vs. the occlusion step for two persons. The first plot represents the results for the younger person (50 years). The second plot represents the results for the older person (62 years). Blue points correspond to the age produced by a zeros tensor; this age reflects the output of the neural network model when it was fed an all-black image, and results from the learned bias parameters.

Figure 4. Estimation error for several significant steps of occlusion. The mean and standard deviation of the error over 165 pairs of validation images (left and right eye) are reported.

Validation of the algorithm

Fig. 5 contains the distribution of the prediction error (MAE) with respect to (w.r.t.) the age group. The distribution was calculated empirically with a moving-average window over four-point age bins. The plot shows that the prediction error is lowest for the age range of 40-65; for the age ranges of 20-40 and 65+ years, the prediction error was relatively higher.

DISCUSSION

Comparison with other age prediction approaches

The accuracy of age prediction by the best known methods is compared in Table 1. The results clearly indicate that high-resolution information about eye corner wrinkles can be utilized for accurate chronological age estimation. We believe our approach is beneficial for age prediction due to its non-invasive nature, its ability to work with anonymized data, and its high accuracy.

Figure 5. PhotoAgeClock predicted age error for the test set within different ages.

Investigation of features contributing to age prediction

The following procedures were performed in order to detect the areas most sensitive to age changes. In the experiment, the eyes were occluded with black markings of various sizes. The eye corner and eyelid areas influenced the predicted age the most: when occluded, the absence of these areas produced the highest relative error. The experiment suggests that the wrinkles in the eye region contained the most important features for age prediction, and that these areas may be used for the development of candidate photographic aging biomarkers. It is very important to emphasize that omitting the eye area itself affected the age prediction only to a small extent. Fig. 2 and Fig. 3 demonstrate that for the middle-aged person (50 years old), right after the third occlusion step, which corresponds to the eyelid and eye corner being covered, the predicted age falls to the value produced when the neural network model was fed an all-black image (the “zeros age”). For the older person (62 years old) the predicted age falls more slowly w.r.t. the occluded image area. The same effect was observed for many different images (see Fig. 4). We believe this happens because age-related wrinkles were more evenly distributed across the facial skin of the elderly person. Another candidate photographic biomarker is the skin pigmentation of the malar area.

Prediction accuracy for various age groups

The age prediction error was quite uniform for persons within the 40-65 years age group. It was observed that the age prediction error was higher for the age ranges of 20-40 years and 65+. The most probable reason is that these age groups were represented to a lesser extent in the dataset. Another possible reason for the larger error in the senior age group is that the divergence in human phenotypes becomes larger with age: people of the same senior age can look much younger or older than they actually are, because they age at different rates. The aging rate and the presence of wrinkles and pigmentation, which we believe are photographic biomarkers of aging, depend considerably on lifestyle.

Figure 6. Distribution of actual age in the dataset and predicted age (PhotoAgeClock)
labels in the validation set. 

The distribution of predicted age labels in the validation dataset resembles the age distribution in the dataset (see Fig. 6).

Algorithm validation

We considered several measures of predictive accuracy: mean absolute error (MAE), the Pearson correlation coefficient, and the Spearman correlation coefficient. MAE measures the difference between PhotoAgeClock predictions and chronological age. We obtained a MAE of 2.3 years on the validation set, which indicates that the average absolute error between PhotoAgeClock predictions and chronological age over all instances of the validation dataset is 2.3 years.

Figure 7. Correlation between predicted age and actual age on the validation dataset.

Correlation coefficients describe the strength and direction of the relationship between PhotoAgeClock predictions and chronological age: the Pearson correlation coefficient evaluates linear relations, whereas the Spearman correlation coefficient evaluates monotonic relations. The Pearson correlation coefficient between the actual values and the values predicted by PhotoAgeClock on the validation dataset was 0.96, and the Spearman correlation coefficient was 0.95 (see Fig. 7). According to these accuracy measures, PhotoAgeClock performs well on a high-resolution dataset of eye pictures.

Other applications

A special web demo was created to conduct a quality assessment of the constructed PhotoAgeClock. PhotoAgeClock showed a significant extent of domain invariance and can be applied to arbitrary high-resolution photos that contain the full face of a person (see Fig. 8, left) and even to photos obtained with the frontal cameras of mobile devices (see Fig. 8, right).

In Fig. 8 (left) the right eye produced a relatively higher age of 56.8 years, while the left eye produced a relatively lower age of 51.6 years. The reason for this discrepancy may be the difference in the number of wrinkles around the eyes in this photo.

Fig. 8 (right) demonstrates that PhotoAgeClock can accurately recognize age even despite a strong facial expression.

Limitations

We present a novel non-invasive biomarker of aging and demonstrate that PhotoAgeClock can predict chronological age with quite high accuracy on a specific dataset of high-resolution eye corner images.

Figure 8. Algorithm performance on images obtained with professional cameras and mobile devices. (left) Algorithm performance on a high-resolution photo of a celebrity (George Clooney). The chronological age of the person at the time the picture was taken was 53 years; the age predicted from the two eye corner areas is 54.2 years. Editorial credit: Denis Makarenko / Shutterstock.com. (right) Algorithm performance on a photo obtained with the frontal camera of a mobile device (selfie). The chronological age of the person is 22; the age predicted from the two eye corner areas is 23.5. The skin of the eye area is smooth, and the young age is recognized despite the strong facial expression.

However, a true biomarker of aging needs to show an association with morbidity or mortality, and this analysis requires experiments on retrospective data.

PhotoAgeClock demonstrates the potential to be utilized for the assessment of high-quality pictures obtained with devices such as professional cameras and smartphones (see Fig. 8); nevertheless, PhotoAgeClock works best on standardized high-resolution images of eye corners.

MATERIALS AND METHODS

The dataset

The dataset consisted of 8,414 anonymized high-resolution left and right eye corner photos of Caucasian females labeled with true chronological age. The dataset was split into a training set and a test set at a proportion of 7:1. The training set and the test set contained images of different people to avoid overfitting: images of the left and right eye corners of the same person were placed together, either in the training set or in the test set. The age of the individuals was in the range of 20-80 years. Initially, the image size was 2258 x 1506 pixels. The age distribution in the dataset was uneven: the age range of 40-65 years was represented to a higher extent (see Fig. 6). During training, we sampled images with probabilities equal to the inverse frequencies of their ages. Thus, images of persons of each age were presented to the neural network with the same frequency, which made the model suited to work with all age groups in the dataset.
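The person-level split and the inverse-frequency age sampling described above can be sketched as follows; this is a minimal illustration under assumed table columns (person_id, age, path), not the original pipeline.

```python
import numpy as np
import pandas as pd

# Toy table standing in for the image index; the column names are assumptions.
df = pd.DataFrame({
    "person_id": [1, 1, 2, 2, 3, 3],
    "age":       [34, 34, 61, 61, 27, 27],
    "path":      ["p1_L.jpg", "p1_R.jpg", "p2_L.jpg", "p2_R.jpg", "p3_L.jpg", "p3_R.jpg"],
})

# Split by person (roughly 7:1), so both eye corners of a subject stay in one set.
rng = np.random.default_rng(0)
persons = df["person_id"].unique()
rng.shuffle(persons)
n_test = max(1, len(persons) // 8)
test_ids = set(persons[:n_test])
train_df = df[~df["person_id"].isin(test_ids)]
test_df = df[df["person_id"].isin(test_ids)]

# Sample each training image with probability proportional to the inverse
# frequency of its age, so every age is shown to the network equally often.
age_counts = train_df["age"].value_counts()
weights = 1.0 / train_df["age"].map(age_counts)
probs = (weights / weights.sum()).to_numpy()
batch_paths = rng.choice(train_df["path"].to_numpy(), size=4, p=probs, replace=True)
```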
Image pre-processing

The input images were resized to 299 x 299 pixels to fit the DNN input dimension, and color intensities were linearly translated to the range [-1, 1]. The images contained facial skin details such as wrinkles and pigmentation, which are important features of the aging process.

In addition, the images from the test dataset were resized from their original resolution: to 424 x 424 pixels, to the standard resolution for our neural network (299 x 299 pixels), and to a lower resolution (224 x 224 pixels, commonly used as a standard for ResNet [23]).

The following data augmentation methods were applied to increase the dataset size: horizontal and vertical mirroring, rotation up to ±10°, horizontal and vertical shifts up to ±15% of width and height, respectively, zoom from 70% to 130% of image size, and affine shear with an angle up to ±0.5 radians. The quantitative degree of each elementary image transformation was picked uniformly from the aforementioned parameter ranges.
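These pre-processing and augmentation settings can be expressed, as one possible sketch, with the Keras ImageDataGenerator; the exact implementation and parameter mapping used in the study are assumptions here (note that recent Keras versions interpret shear_range in degrees, so ±0.5 rad is roughly ±29°).

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

def to_unit_range(x):
    """Linearly rescale 8-bit pixel intensities to [-1, 1]."""
    return x / 127.5 - 1.0

# Augmentation settings mirroring the ranges listed in the text (approximate mapping).
augmenter = ImageDataGenerator(
    preprocessing_function=to_unit_range,
    horizontal_flip=True,        # horizontal mirroring
    vertical_flip=True,          # vertical mirroring
    rotation_range=10,           # up to +/-10 degrees
    width_shift_range=0.15,      # up to +/-15% of width
    height_shift_range=0.15,     # up to +/-15% of height
    zoom_range=(0.7, 1.3),       # 70%-130% of image size
    shear_range=29,              # about +/-0.5 radians of affine shear
)

# Example: stream augmented 299 x 299 crops with age labels from an in-memory array.
images = np.random.randint(0, 256, size=(8, 299, 299, 3)).astype("float32")
ages = np.array([23, 41, 57, 62, 35, 48, 70, 29], dtype="float32")
x_batch, y_batch = next(augmenter.flow(images, ages, batch_size=4))
```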
Algorithm development

Our approach aimed to solve the task of age prediction. We used Xception [24], a DNN-based model. In this neural network model all layers, except the last fully-connected (dense) layer, were initialized with pre-trained weights from the ImageNet [16] dataset. We modified the model to increase its quality in the following manner: we added skip-connections from each residual block to the dense output layer and changed the last layer to fit the regression model. We used the ADAM optimization algorithm [25] and the MSE loss function for training. The best quality was reached after 150 epochs.
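One plausible reading of this architecture is sketched below with tf.keras: an ImageNet-initialized Xception backbone, the output of every residual block pooled and concatenated as skip-connections, and a single linear unit for age regression trained with Adam and an MSE loss. The exact wiring used by the authors is not published here, so the pooling and concatenation choices are assumptions rather than the original implementation.

```python
from tensorflow.keras import Model, layers, optimizers
from tensorflow.keras.applications import Xception

# Xception backbone with ImageNet weights and the classification head removed.
base = Xception(weights="imagenet", include_top=False, input_shape=(299, 299, 3))

# Each residual block in Xception ends in an Add layer; pool every such output
# plus the final feature map and concatenate them before the regression head.
block_outputs = [l.output for l in base.layers if isinstance(l, layers.Add)]
pooled = [layers.GlobalAveragePooling2D()(t) for t in block_outputs + [base.output]]
features = layers.Concatenate()(pooled)
age = layers.Dense(1, activation="linear", name="age")(features)

model = Model(inputs=base.input, outputs=age)
model.compile(optimizer=optimizers.Adam(learning_rate=1e-4), loss="mse", metrics=["mae"])
# model.fit(train_generator, epochs=150, validation_data=val_generator)
```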
Figs. 9A-C visually represent cases of the algorithm application in which PhotoAgeClock: 1. predicts age correctly (Fig. 9A), 2. overestimates age the most (Fig. 9B), and 3. underestimates age the most (Fig. 9C).

Algorithm validation

We used MAE to evaluate algorithm accuracy. The Pearson correlation coefficient and the Spearman correlation coefficient were used to measure the correlation between the actual age and the age predicted by PhotoAgeClock.
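For concreteness, the three accuracy measures can be computed as follows (the arrays are illustrative values, not study data):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Illustrative chronological and predicted ages.
y_true = np.array([23.0, 41.0, 57.0, 62.0, 35.0, 48.0])
y_pred = np.array([25.1, 39.4, 55.9, 64.2, 36.8, 47.0])

mae = np.mean(np.abs(y_true - y_pred))      # mean absolute error in years
pearson_r, _ = pearsonr(y_true, y_pred)     # linear association
spearman_r, _ = spearmanr(y_true, y_pred)   # monotonic association
print(f"MAE={mae:.2f} years, Pearson r={pearson_r:.2f}, Spearman rho={spearman_r:.2f}")
```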
Screening of the images for the patterns most sensitive to age changes

In order to evaluate how the predicted age changed when a certain fraction of the image area was occluded, we detected eye landmarks with the Dlib library [26]. Then the color of the pixels around the eye border was changed to black. The difference between the age predicted for the images without the black pixels and for the images with the occluded area was calculated. Photos of the left and right eyes of two people of different ages at different occlusion steps are presented in Fig. 2. For this paper the eye areas were covered with red markings for anonymity purposes. However, in the actual dataset the eye areas were not covered.
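A sketch of this occlusion procedure is shown below, assuming the standard Dlib 68-point landmark model, a fixed occlusion margin, and an already trained `model` object; none of these specifics come from the original scripts.

```python
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# The landmark model path is an assumption (the standard 68-point predictor).
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def occlude_eye(image_bgr, margin=15):
    """Return a copy of the image with the detected eye region painted black."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    face = detector(gray, 1)[0]
    shape = predictor(gray, face)
    # Points 36-47 of the 68-point model outline both eyes.
    eye_pts = np.array([[shape.part(i).x, shape.part(i).y] for i in range(36, 48)],
                       dtype=np.int32)
    mask = np.zeros(gray.shape, dtype=np.uint8)
    cv2.fillConvexPoly(mask, cv2.convexHull(eye_pts), 255)
    mask = cv2.dilate(mask, np.ones((margin, margin), np.uint8))
    occluded = image_bgr.copy()
    occluded[mask > 0] = 0
    return occluded

# The occlusion effect is then the difference between the model's predictions
# on the original image and on occlude_eye(original).
```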
CONCLUSION

In this study we developed a deep-learning network, referred to as the PhotoAgeClock, that uses photographs of eye corners from facial images to predict human chronological age. We demonstrated that, by making use of current advances in deep learning and computer vision, it is possible to achieve very high quality age estimation based only on the information from a small facial region (the eye and the skin area around the eye). These findings have several important implications.

Figure 9. Examples of PhotoAgeClock performance. (A) Cases when the trained model produced the
lowest errors on the test set. (B) Cases when the trained model overestimated age the most on the test set.
(C) Cases when the trained model underestimated the age the most on the test set. True vs. predicted age is
labeled. Eye areas were erased for anonymity purposes but were present in the actual dataset pictures. 

Firstly, when estimating age from facial images, high-resolution images are very beneficial. Secondly, wrinkles and skin pigmentation serve as reliable non-invasive visual biomarkers of aging; thus, they can be used as a source of valuable insight into the condition of the human body and health. Thirdly, since the accuracy of age prediction does not decrease significantly when the eye area is occluded, PhotoAgeClock can be utilized to predict age on anonymized datasets of images.

We also found that, compared to other regions in the investigated images, the trained model considers the skin around the eye to be the most age-relevant area. Based on these findings, we believe that it is prudent for future studies to explore what information pattern recognition based on the condition of human skin can provide.

We hypothesized that deep learning systems trained on large numbers of annotated human facial images could outperform humans in predicting various diseases and aging. While aging by itself is not likely to be classified as a disease [27], many human diseases are closely correlated with age. It may be possible to use these biomarkers of aging to provide early detection and prevent the onset of a variety of diseases. The emergence of many credible geroprotectors [28], including senolytics [29], NAD-pathway modulators [30-32], metformin [33], rapamycin and other TORC inhibitors [34, 35], their natural mimetics [36], and the many techniques for identifying novel interventions [37] calls for the rapid assessment of efficacy and safety, and any panel of aging biomarkers can easily be augmented with the PhotoAgeClock and other non-invasive photographic predictors of age. We believe the main value of PhotoAgeClock and other imaging biomarkers trained on skin imaging data lies in estimating the differential changes induced by various interventions, including cosmetic, lifestyle, and medical ones, and in establishing the correlations between the many other aging clocks that are rapidly emerging. The photographic aging clocks may also help with the development of biomedical interventions and skin care treatments tailored to individual health status, skin type, climate, geography and other parameters, and personalize the treatments for each individual.

Abbreviations

MAE: Mean absolute error; DNN: Deep neural network; CNN: Convolutional neural network; RoR: Residual networks of residual networks; GANs: Generative adversarial networks.

ACKNOWLEDGEMENTS

This work was performed in collaboration with the Beiersdorf AG R&D department and the R&D Biophysics AI/IoT lab.

CONFLICTS OF INTEREST

Anastasia Georgievskaya, Konstantin Kiselev, Eugene Bobrov and Artem Sevastopolsky are employed by Haut.AI OU, a company specializing in photographic biomarkers of aging, skin health, and direct-to-consumer personalized skin care. Alex Zhavoronkov is the director of Insilico Medicine, a company specializing in artificial intelligence for drug discovery. Maria del Pilar Bonilla Tobar, Sven Clemann and Sören Jaspers are employed by Beiersdorf AG, a global personal care company with a broad spectrum of products.

FUNDING

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

REFERENCES

1. Christensen K, Thinggaard M, McGue M, Rexbye H, Hjelmborg JV, Aviv A, Gunn D, van der Ouderaa F, Vaupel JW. Perceived age as clinically useful biomarker of ageing: cohort study. BMJ. 2009; 339:b5262. https://doi.org/10.1136/bmj.b5262
2. Axelsson J, Sundelin T, Olsson MJ, Sorjonen K, Axelsson C, Lasselin J, Lekander M. Identification of acutely sick people and facial cues of sickness. Proc Biol Sci. 2018; 285:20172430. https://doi.org/10.1098/rspb.2017.2430
3. Horvath S, Raj K. DNA methylation-based biomarkers and the epigenetic clock theory of ageing. Nat Rev Genet. 2018; 19:371–84. https://doi.org/10.1038/s41576-018-0004-3
4. Belsky DW, Moffitt TE, Cohen AA, Corcoran DL, Levine ME, Prinz JA, Schaefer J, Sugden K, Williams B, Poulton R, Caspi A. Eleven Telomere, Epigenetic Clock, and Biomarker-Composite Quantifications of Biological Aging: Do They Measure the Same Thing? Am J Epidemiol. 2018; 187:1220–30. https://doi.org/10.1093/aje/kwx346
5. El Dib MY, El-Saban M. Human age estimation using enhanced bio-inspired features. 17th IEEE International Conference on Image Processing (ICIP), IEEE, 1589–1592. 2010.
6. Ricanek K, Tesafaye T. Morph: a longitudinal image database of normal adult age-progression. 7th International Conference on Automatic Face and Gesture Recognition, IEEE, 341–345. 2006.
7. Eidinger E, Enbar R, Hassner T. Age and gender estimation of unfiltered faces. IEEE Trans Inf Forensics Security. 2014; 9:2170–79. https://doi.org/10.1109/TIFS.2014.2359646
8. Rothe R, Timofte R, Van Gool L. Deep expectation of real and apparent age from a single image without facial landmarks. Int J Comput Vis. 2018; 126:144–57. https://doi.org/10.1007/s11263-016-0940-3
9. Guo G, Fu Y, Dyer CR, Huang TS. Image-based human age estimation by manifold learning and locally adjusted robust regression. IEEE Trans Image Process. 2008; 17:1178–88. https://doi.org/10.1109/TIP.2008.924280
10. Guo G, Mu G, Fu Y, Huang TS. Human age estimation using bio-inspired features. IEEE Conference on Computer Vision and Pattern Recognition, IEEE, 112–119. 2009.
11. Guo G, Mu G. Joint estimation of age, gender and ethnicity: CCA vs. PLS. 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), IEEE, 1–6. 2013.
12. Yan S, Liu M, Huang TS. Extracting age information from local spatially flexible patches. IEEE International Conference on Acoustics, Speech and Signal Processing, IEEE, 737–740. 2008.
13. Dollár P, Tu Z, Tao H, Belongie S. Feature mining for image classification. IEEE Conference on Computer Vision and Pattern Recognition, 1–8. 2007.
14. Qawaqneh Z, Mallouh AA, Barkana BD. Deep convolutional neural network for age estimation based on VGG-Face model. arXiv 1709.01664, 2017. https://arxiv.org/abs/1709.01664
15. Zhang K, Gao C, Guo L, Sun M, Yuan X, Han TX, Zhao Z, Li B. Age group and gender estimation in the wild with deep RoR architecture. IEEE Access. 2017; 5:22492–503. https://doi.org/10.1109/ACCESS.2017.2761849
16. Russakovsky O, Deng J, Su H, Krause J, Satheesh S, Ma S, Huang Z, Karpathy A, Khosla A, Bernstein M, Berg AC, Fei-Fei L. ImageNet large scale visual recognition challenge. Int J Comput Vis. 2015; 115:211–52. https://doi.org/10.1007/s11263-015-0816-y
17. Upchurch P, Gardner J, Bala K, Pless R, Snavely N, Weinberger KQ. Deep feature interpolation for image content changes. arXiv 1611.05507, 2016. https://arxiv.org/abs/1611.05507
18. Antipov G, Baccouche M, Dugelay JL. Face aging with conditional generative adversarial networks. arXiv 1702.01983, 2017. https://arxiv.org/abs/1702.01983
19. Putin E, Mamoshina P, Aliper A, Korzinkin M, Moskalev A, Kolosov A, Ostrovskiy A, Cantor C, Vijg J, Zhavoronkov A. Deep biomarkers of human aging: application of deep neural networks to biomarker development. Aging (Albany NY). 2016; 8:1021–33. https://doi.org/10.18632/aging.100968
20. Mamoshina P, Kochetov K, Putin E, Cortese F, Aliper A, Lee WS, Ahn SM, Uhn L, Skjodt N, Kovalchuk O, Scheibye-Knudsen M, Zhavoronkov A. Population Specific Biomarkers of Human Aging: A Big Data Study Using South Korean, Canadian, and Eastern European Patient Populations. J Gerontol A Biol Sci Med Sci. 2018; 73:1482–90. https://doi.org/10.1093/gerona/gly005
21. Horvath S. DNA methylation age of human tissues and cell types. Genome Biol. 2013; 14:R115. https://doi.org/10.1186/gb-2013-14-10-r115
22. Flament F, Bazin R, Laquieze S, Rubert V, Simonpietri E, Piot B. Effect of the sun on visible clinical signs of aging in Caucasian skin. Clin Cosmet Investig Dermatol. 2013; 6:221–32. https://doi.org/10.2147/CCID.S44686
23. He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2016; 770–78.
24. Chollet F. Xception: deep learning with depthwise separable convolutions. arXiv 1610.02357, 2016. https://arxiv.org/abs/1610.02357
25. Kingma DP, Ba J. ADAM: a method for stochastic optimization. International Conference on Learning Representations. 2015.
26. King DE. Dlib-ml: a machine learning toolkit. J Mach Learn Res. 2009; 10:1755–58.
27. Zhavoronkov A, Bhullar B. Classifying aging as a disease in the context of ICD-11. Front Genet. 2015; 6:326. https://doi.org/10.3389/fgene.2015.00326
28. Moskalev A, Chernyagina E, de Magalhães JP, Barardo D, Thoppil H, Shaposhnikov M, Budovsky A, Fraifeld VE, Garazha A, Tsvetkov V, Bronovitsky E, Bogomolov V, Scerbacov A, et al. Geroprotectors.org: a new, structured and curated database of current therapeutic interventions in aging and age-related disease. Aging (Albany NY). 2015; 7:616–28. https://doi.org/10.18632/aging.100799
29. Kirkland JL, Tchkonia T, Zhu Y, Niedernhofer LJ, Robbins PD. The clinical potential of senolytic drugs. J Am Geriatr Soc. 2017; 65:2297–301. https://doi.org/10.1111/jgs.14969
30. Verdin E. The many faces of sirtuins: coupling of NAD metabolism, sirtuins and lifespan. Nat Med. 2014; 20:25–27. https://doi.org/10.1038/nm.3447
31. Verdin E. NAD+ in aging, metabolism, and neurodegeneration. Science. 2015; 350:1208–13. https://doi.org/10.1126/science.aac4854
32. Rajman L, Chwalek K, Sinclair DA. Therapeutic potential of NAD-boosting molecules: the in vivo evidence. Cell Metab. 2018; 27:529–47. https://doi.org/10.1016/j.cmet.2018.02.011
33. Novelle MG, Ali A, Diéguez C, Bernier M, de Cabo R. Metformin: a hopeful promise in aging research. Cold Spring Harb Perspect Med. 2016; 6:a025932. https://doi.org/10.1101/cshperspect.a025932
34. Mannick JB, Morris M, Hockey HU, Roma G, Beibel M, Kulmatycki K, Watkins M, Shavlakadze T, Zhou W, Quinn D, Glass DJ, Klickstein LB. TORC1 inhibition enhances immune function and reduces infections in the elderly. Sci Transl Med. 2018; 10:eaaq1564. https://doi.org/10.1126/scitranslmed.aaq1564
35. Blagosklonny MV. Does rapamycin slow down time? Oncotarget. 2018; 9:30210–12. https://doi.org/10.18632/oncotarget.25788
36. Aliper A, Jellen L, Cortese F, Artemov A, Karpinsky-Semper D, Moskalev A, Swick AG, Zhavoronkov A. Towards natural mimetics of metformin and rapamycin. Aging (Albany NY). 2017; 9:2245–68. https://doi.org/10.18632/aging.101319
37. Aliper A, Belikov AV, Garazha A, Jellen L, Artemov A, Suntsova M, Ivanova A, Venkova L, Borisov N, Buzdin A, Mamoshina P, Putin E, Swick AG, et al. In search for geroprotectors: in silico screening and in vitro validation of signalome-level mimetics of young healthy state. Aging (Albany NY). 2016; 8:2127–52. https://doi.org/10.18632/aging.101047