Assessment of female facial beauty based on anthropometric, non-permanent and acquisition characteristics


Ontology type: schema:ScholarlyArticle     


Article Info

DATE

2014-09-09

AUTHORS

Antitza Dantcheva, Jean-Luc Dugelay

ABSTRACT

In this work we study the interrelation between, on the one hand, subjective perception of female facial aesthetics, and on the other hand, selected objective parameters that include facial features, photo-quality, as well as non-permanent facial characteristics. This study seeks to provide insight on the role of this specific set of features in affecting the way humans perceive facial images. The approach is novel in that it jointly considers both previous results on photo quality and beauty assessment, as well as non-permanent facial characteristics and expressions. Based on 37 such objective parameters, we construct a metric that aims to quantify modifiable parameters for aesthetics enhancement, as well as tunes systems that would seek to predict the way humans perceive facial aesthetics. The proposed metric is evaluated on a face dataset that includes images with variations in illumination, image quality, as well as age, ethnicity and expression. We show that our approach outperforms two state-of-the-art beauty estimation metrics. In addition, we apply the designed metric to three interesting datasets, where we assess beauty in images of females before and after plastic surgery, of females across time, as well as of females famous for their beauty. We conclude by giving insight towards beauty prediction.

PAGES

11331-11355

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/s11042-014-2234-5

DOI

http://dx.doi.org/10.1007/s11042-014-2234-5

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1004127290


Indexing Status: Check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
Incoming Citations: Browse incoming citations for this publication using opencitations.net.

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service: the JSON-LD Playground or Google SDTT.

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0803", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Computer Software", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0806", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information Systems", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "INRIA, Sophia Antipolis, France", 
          "id": "http://www.grid.ac/institutes/grid.5328.c", 
          "name": [
            "INRIA, Sophia Antipolis, France"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Dantcheva", 
        "givenName": "Antitza", 
        "id": "sg:person.011744727715.61", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011744727715.61"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "EURECOM, Route des Chappes, France", 
          "id": "http://www.grid.ac/institutes/grid.28848.3e", 
          "name": [
            "EURECOM, Route des Chappes, France"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Dugelay", 
        "givenName": "Jean-Luc", 
        "id": "sg:person.015053427343.37", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.015053427343.37"
        ], 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "sg:pub.10.1007/978-3-642-15555-0_1", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1050326522", 
          "https://doi.org/10.1007/978-3-642-15555-0_1"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-642-13923-9_3", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1001760733", 
          "https://doi.org/10.1007/978-3-642-13923-9_3"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-319-05491-9_7", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1039027758", 
          "https://doi.org/10.1007/978-3-319-05491-9_7"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/0-387-34239-7", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1042498983", 
          "https://doi.org/10.1007/0-387-34239-7"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-642-15567-3_32", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1044999709", 
          "https://doi.org/10.1007/978-3-642-15567-3_32"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1038/368239a0", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1043737700", 
          "https://doi.org/10.1038/368239a0"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/s12144-999-1020-4", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1028252214", 
          "https://doi.org/10.1007/s12144-999-1020-4"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-642-13772-3_43", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1011567861", 
          "https://doi.org/10.1007/978-3-642-13772-3_43"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2014-09-09", 
    "datePublishedReg": "2014-09-09", 
    "description": "In this work we study the interrelation between, on the one hand, subjective perception of female facial aesthetics, and on the other hand, selected objective parameters that include facial features, photo-quality, as well as non-permanent facial characteristics. This study seeks to provide insight on the role of this specific set of features in affecting the way humans perceive facial images. The approach is novel in that it jointly considers both previous results on photo quality and beauty assessment, as well as non-permanent facial characteristics and expressions. Based on 37 such objective parameters, we construct a metric that aims to quantify modifiable parameters for aesthetics enhancement, as well as tunes systems that would seek to predict the way humans perceive facial aesthetics. The proposed metric is evaluated on a face dataset, that includes images with variations in illumination, image quality, as well as age, ethnicity and expression. We show that our approach outperforms two state of the art beauty estimation metrics. In addition we apply the designed metric in three interesting datasets, where we assess beauty in images of females before and after plastic surgery, of females across time, as well as of females famous for their beauty. We conclude by giving insight towards beauty prediction.", 
    "genre": "article", 
    "id": "sg:pub.10.1007/s11042-014-2234-5", 
    "inLanguage": "en", 
    "isAccessibleForFree": false, 
    "isPartOf": [
      {
        "id": "sg:journal.1044869", 
        "issn": [
          "1380-7501", 
          "1573-7721"
        ], 
        "name": "Multimedia Tools and Applications", 
        "publisher": "Springer Nature", 
        "type": "Periodical"
      }, 
      {
        "issueNumber": "24", 
        "type": "PublicationIssue"
      }, 
      {
        "type": "PublicationVolume", 
        "volumeNumber": "74"
      }
    ], 
    "keywords": [
      "facial images", 
      "tune system", 
      "face datasets", 
      "estimation metrics", 
      "interesting datasets", 
      "photo quality", 
      "beauty assessment", 
      "image quality", 
      "facial features", 
      "facial characteristics", 
      "images", 
      "metrics", 
      "datasets", 
      "female facial beauty", 
      "acquisition characteristics", 
      "features", 
      "specific set", 
      "facial beauty", 
      "set", 
      "quality", 
      "modifiable parameters", 
      "images of females", 
      "facial aesthetics", 
      "objective parameters", 
      "system", 
      "work", 
      "hand", 
      "parameters", 
      "aesthetic enhancement", 
      "illumination", 
      "plastic surgery", 
      "subjective perception", 
      "characteristics", 
      "previous results", 
      "females", 
      "time", 
      "prediction", 
      "insights", 
      "results", 
      "assessment", 
      "expression", 
      "age", 
      "state", 
      "surgery", 
      "interrelations", 
      "aesthetics", 
      "enhancement", 
      "ethnicity", 
      "perception", 
      "study", 
      "role", 
      "addition", 
      "beauty", 
      "approach", 
      "variation", 
      "female facial aesthetics", 
      "non-permanent facial characteristics", 
      "way humans perceive facial images", 
      "humans perceive facial images", 
      "perceive facial images", 
      "such objective parameters", 
      "way humans perceive facial aesthetics", 
      "humans perceive facial aesthetics", 
      "perceive facial aesthetics", 
      "art beauty estimation metrics", 
      "beauty estimation metrics", 
      "beauty prediction"
    ], 
    "name": "Assessment of female facial beauty based on anthropometric, non-permanent and acquisition characteristics", 
    "pagination": "11331-11355", 
    "productId": [
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1004127290"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/s11042-014-2234-5"
        ]
      }
    ], 
    "sameAs": [
      "https://doi.org/10.1007/s11042-014-2234-5", 
      "https://app.dimensions.ai/details/publication/pub.1004127290"
    ], 
    "sdDataset": "articles", 
    "sdDatePublished": "2022-01-01T18:32", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-springernature-scigraph/baseset/20220101/entities/gbq_results/article/article_627.jsonl", 
    "type": "ScholarlyArticle", 
    "url": "https://doi.org/10.1007/s11042-014-2234-5"
  }
]
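
If you prefer to work with the record in Python rather than in a browser tool, the JSON-LD above can be consumed with the standard library alone. A minimal sketch, assuming the record has been saved as record.jsonld (file name is an assumption; the payload is a one-element list of publication objects, as shown above):

import json

# Load the JSON-LD record shown above (the file name record.jsonld is an
# assumption); the payload is a one-element list of publication objects.
with open("record.jsonld") as f:
    pub = json.load(f)[0]

# Basic bibliographic fields.
print(pub["name"])           # article title
print(pub["datePublished"])  # 2014-09-09
print(pub["url"])            # https://doi.org/10.1007/s11042-014-2234-5

# Authors are Person objects with givenName/familyName.
print(", ".join(f'{a["givenName"]} {a["familyName"]}' for a in pub["author"]))

# Journal details live in isPartOf alongside the issue and volume entries.
journal = next(p for p in pub["isPartOf"] if p["type"] == "Periodical")
print(journal["name"], journal["issn"])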
 

Download the RDF metadata as: json-ld, nt, turtle, or xml (see License info).

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data that is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s11042-014-2234-5'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s11042-014-2234-5'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s11042-014-2234-5'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s11042-014-2234-5'
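
The same content negotiation works from any HTTP client. A minimal Python sketch that mirrors the curl calls above (it assumes the third-party requests package; the output file names are illustrative):

import requests

RECORD = "https://scigraph.springernature.com/pub.10.1007/s11042-014-2234-5"

# One URL, several serializations: the Accept header selects the format,
# exactly as in the curl calls above. File names are illustrative.
formats = {
    "application/ld+json": "record.jsonld",
    "application/n-triples": "record.nt",
    "text/turtle": "record.ttl",
    "application/rdf+xml": "record.rdf",
}

for accept, filename in formats.items():
    resp = requests.get(RECORD, headers={"Accept": accept}, timeout=30)
    resp.raise_for_status()
    with open(filename, "wb") as f:
        f.write(resp.content)
    print(f"saved {filename} ({len(resp.content)} bytes)")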


 

This table displays all metadata directly associated with this object as RDF triples.

175 TRIPLES      22 PREDICATES      102 URIs      84 LITERALS      6 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/s11042-014-2234-5 schema:about anzsrc-for:08
2 anzsrc-for:0801
3 anzsrc-for:0803
4 anzsrc-for:0806
5 schema:author Ne9d7ec2f87c8453d8bb78c4b5acd338f
6 schema:citation sg:pub.10.1007/0-387-34239-7
7 sg:pub.10.1007/978-3-319-05491-9_7
8 sg:pub.10.1007/978-3-642-13772-3_43
9 sg:pub.10.1007/978-3-642-13923-9_3
10 sg:pub.10.1007/978-3-642-15555-0_1
11 sg:pub.10.1007/978-3-642-15567-3_32
12 sg:pub.10.1007/s12144-999-1020-4
13 sg:pub.10.1038/368239a0
14 schema:datePublished 2014-09-09
15 schema:datePublishedReg 2014-09-09
16 schema:description In this work we study the interrelation between, on the one hand, subjective perception of female facial aesthetics, and on the other hand, selected objective parameters that include facial features, photo-quality, as well as non-permanent facial characteristics. This study seeks to provide insight on the role of this specific set of features in affecting the way humans perceive facial images. The approach is novel in that it jointly considers both previous results on photo quality and beauty assessment, as well as non-permanent facial characteristics and expressions. Based on 37 such objective parameters, we construct a metric that aims to quantify modifiable parameters for aesthetics enhancement, as well as tunes systems that would seek to predict the way humans perceive facial aesthetics. The proposed metric is evaluated on a face dataset, that includes images with variations in illumination, image quality, as well as age, ethnicity and expression. We show that our approach outperforms two state of the art beauty estimation metrics. In addition we apply the designed metric in three interesting datasets, where we assess beauty in images of females before and after plastic surgery, of females across time, as well as of females famous for their beauty. We conclude by giving insight towards beauty prediction.
17 schema:genre article
18 schema:inLanguage en
19 schema:isAccessibleForFree false
20 schema:isPartOf N096be8a44f1541c9ae8f40add2aa3ebf
21 N82da3bda1d584068ad666a806f02da4e
22 sg:journal.1044869
23 schema:keywords acquisition characteristics
24 addition
25 aesthetic enhancement
26 aesthetics
27 age
28 approach
29 art beauty estimation metrics
30 assessment
31 beauty
32 beauty assessment
33 beauty estimation metrics
34 beauty prediction
35 characteristics
36 datasets
37 enhancement
38 estimation metrics
39 ethnicity
40 expression
41 face datasets
42 facial aesthetics
43 facial beauty
44 facial characteristics
45 facial features
46 facial images
47 features
48 female facial aesthetics
49 female facial beauty
50 females
51 hand
52 humans perceive facial aesthetics
53 humans perceive facial images
54 illumination
55 image quality
56 images
57 images of females
58 insights
59 interesting datasets
60 interrelations
61 metrics
62 modifiable parameters
63 non-permanent facial characteristics
64 objective parameters
65 parameters
66 perceive facial aesthetics
67 perceive facial images
68 perception
69 photo quality
70 plastic surgery
71 prediction
72 previous results
73 quality
74 results
75 role
76 set
77 specific set
78 state
79 study
80 subjective perception
81 such objective parameters
82 surgery
83 system
84 time
85 tune system
86 variation
87 way humans perceive facial aesthetics
88 way humans perceive facial images
89 work
90 schema:name Assessment of female facial beauty based on anthropometric, non-permanent and acquisition characteristics
91 schema:pagination 11331-11355
92 schema:productId N2bb924db1cfb4dfeaa8d2900e4a73df8
93 N8acbb33a28bb41579efa61670983e309
94 schema:sameAs https://app.dimensions.ai/details/publication/pub.1004127290
95 https://doi.org/10.1007/s11042-014-2234-5
96 schema:sdDatePublished 2022-01-01T18:32
97 schema:sdLicense https://scigraph.springernature.com/explorer/license/
98 schema:sdPublisher N72fabdd732284c9581118b3a03ef07d9
99 schema:url https://doi.org/10.1007/s11042-014-2234-5
100 sgo:license sg:explorer/license/
101 sgo:sdDataset articles
102 rdf:type schema:ScholarlyArticle
103 N096be8a44f1541c9ae8f40add2aa3ebf schema:issueNumber 24
104 rdf:type schema:PublicationIssue
105 N25471ec082814a069294e59f91c78b33 rdf:first sg:person.015053427343.37
106 rdf:rest rdf:nil
107 N2bb924db1cfb4dfeaa8d2900e4a73df8 schema:name doi
108 schema:value 10.1007/s11042-014-2234-5
109 rdf:type schema:PropertyValue
110 N72fabdd732284c9581118b3a03ef07d9 schema:name Springer Nature - SN SciGraph project
111 rdf:type schema:Organization
112 N82da3bda1d584068ad666a806f02da4e schema:volumeNumber 74
113 rdf:type schema:PublicationVolume
114 N8acbb33a28bb41579efa61670983e309 schema:name dimensions_id
115 schema:value pub.1004127290
116 rdf:type schema:PropertyValue
117 Ne9d7ec2f87c8453d8bb78c4b5acd338f rdf:first sg:person.011744727715.61
118 rdf:rest N25471ec082814a069294e59f91c78b33
119 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
120 schema:name Information and Computing Sciences
121 rdf:type schema:DefinedTerm
122 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
123 schema:name Artificial Intelligence and Image Processing
124 rdf:type schema:DefinedTerm
125 anzsrc-for:0803 schema:inDefinedTermSet anzsrc-for:
126 schema:name Computer Software
127 rdf:type schema:DefinedTerm
128 anzsrc-for:0806 schema:inDefinedTermSet anzsrc-for:
129 schema:name Information Systems
130 rdf:type schema:DefinedTerm
131 sg:journal.1044869 schema:issn 1380-7501
132 1573-7721
133 schema:name Multimedia Tools and Applications
134 schema:publisher Springer Nature
135 rdf:type schema:Periodical
136 sg:person.011744727715.61 schema:affiliation grid-institutes:grid.5328.c
137 schema:familyName Dantcheva
138 schema:givenName Antitza
139 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011744727715.61
140 rdf:type schema:Person
141 sg:person.015053427343.37 schema:affiliation grid-institutes:grid.28848.3e
142 schema:familyName Dugelay
143 schema:givenName Jean-Luc
144 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.015053427343.37
145 rdf:type schema:Person
146 sg:pub.10.1007/0-387-34239-7 schema:sameAs https://app.dimensions.ai/details/publication/pub.1042498983
147 https://doi.org/10.1007/0-387-34239-7
148 rdf:type schema:CreativeWork
149 sg:pub.10.1007/978-3-319-05491-9_7 schema:sameAs https://app.dimensions.ai/details/publication/pub.1039027758
150 https://doi.org/10.1007/978-3-319-05491-9_7
151 rdf:type schema:CreativeWork
152 sg:pub.10.1007/978-3-642-13772-3_43 schema:sameAs https://app.dimensions.ai/details/publication/pub.1011567861
153 https://doi.org/10.1007/978-3-642-13772-3_43
154 rdf:type schema:CreativeWork
155 sg:pub.10.1007/978-3-642-13923-9_3 schema:sameAs https://app.dimensions.ai/details/publication/pub.1001760733
156 https://doi.org/10.1007/978-3-642-13923-9_3
157 rdf:type schema:CreativeWork
158 sg:pub.10.1007/978-3-642-15555-0_1 schema:sameAs https://app.dimensions.ai/details/publication/pub.1050326522
159 https://doi.org/10.1007/978-3-642-15555-0_1
160 rdf:type schema:CreativeWork
161 sg:pub.10.1007/978-3-642-15567-3_32 schema:sameAs https://app.dimensions.ai/details/publication/pub.1044999709
162 https://doi.org/10.1007/978-3-642-15567-3_32
163 rdf:type schema:CreativeWork
164 sg:pub.10.1007/s12144-999-1020-4 schema:sameAs https://app.dimensions.ai/details/publication/pub.1028252214
165 https://doi.org/10.1007/s12144-999-1020-4
166 rdf:type schema:CreativeWork
167 sg:pub.10.1038/368239a0 schema:sameAs https://app.dimensions.ai/details/publication/pub.1043737700
168 https://doi.org/10.1038/368239a0
169 rdf:type schema:CreativeWork
170 grid-institutes:grid.28848.3e schema:alternateName EURECOM, Route des Chappes, France
171 schema:name EURECOM, Route des Chappes, France
172 rdf:type schema:Organization
173 grid-institutes:grid.5328.c schema:alternateName INRIA, Sophia Antipolis, France
174 schema:name INRIA, Sophia Antipolis, France
175 rdf:type schema:Organization
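
The summary counts above (triples, predicates, URIs, literals, blank nodes) can be cross-checked against the downloaded N-Triples file. A minimal sketch using rdflib (an assumption, as is the record.nt file name; SciGraph's exact counting convention is not documented here, so the URI and literal counts may differ slightly):

from rdflib import Graph, URIRef, Literal, BNode

# Parse the N-Triples export (file name as used in the download sketch above).
g = Graph()
g.parse("record.nt", format="nt")

# Recompute the figures from the summary line above the table.
nodes = set(g.subjects()) | set(g.objects())
print("triples:    ", len(g))
print("predicates: ", len(set(g.predicates())))
print("URIs:       ", sum(isinstance(n, URIRef) for n in nodes))
print("literals:   ", sum(isinstance(n, Literal) for n in nodes))
print("blank nodes:", sum(isinstance(n, BNode) for n in nodes))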
 



