A Comparison of Human, Automatic and Collaborative Music Genre Classification and User Centric Evaluation of Genre Classification Systems


Ontology type: schema:Chapter      Open Access: True


Chapter Info

DATE

2011

AUTHORS

Klaus Seyerlehner , Gerhard Widmer , Peter Knees

ABSTRACT

In this paper, two sets of evaluation experiments are conducted. First, we compare state-of-the-art automatic music genre classification algorithms to human performance on the same dataset via a listening experiment. This shows that improvements in content-based systems over recent years have reduced the gap between automatic and human classification performance, but have not yet closed it. As an important extension to previous work in this context, we also compare the automatic and human classification performance to a collaborative approach. Second, we propose two evaluation metrics, called user scores, which are based on the votes of the participants in the listening experiment. This user-centric evaluation approach dispenses with predefined ground-truth annotations and accounts for the ambiguous human perception of musical genre. Taking genre ambiguities into account is an important advantage in the evaluation of content-based systems, especially since the dataset compiled in this work (both the audio files and the collected votes) is publicly available.

PAGES

118-131

Book

TITLE

Adaptive Multimedia Retrieval. Context, Exploration, and Fusion

ISBN

978-3-642-27168-7
978-3-642-27169-4

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/978-3-642-27169-4_9

DOI

http://dx.doi.org/10.1007/978-3-642-27169-4_9

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1005693801



JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record with an external JSON-LD service, such as the JSON-LD Playground or Google's Structured Data Testing Tool.

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "Johannes Kepler University of Linz", 
          "id": "https://www.grid.ac/institutes/grid.9970.7", 
          "name": [
            "Dept. of Computational Perception, Johannes Kepler University, Linz, Austria"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Seyerlehner", 
        "givenName": "Klaus", 
        "id": "sg:person.016131276755.19", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.016131276755.19"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Johannes Kepler University of Linz", 
          "id": "https://www.grid.ac/institutes/grid.9970.7", 
          "name": [
            "Dept. of Computational Perception, Johannes Kepler University, Linz, Austria"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Widmer", 
        "givenName": "Gerhard", 
        "id": "sg:person.013641401431.40", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013641401431.40"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Johannes Kepler University of Linz", 
          "id": "https://www.grid.ac/institutes/grid.9970.7", 
          "name": [
            "Dept. of Computational Perception, Johannes Kepler University, Linz, Austria"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Knees", 
        "givenName": "Peter", 
        "id": "sg:person.012374103011.27", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012374103011.27"
        ], 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "https://doi.org/10.1080/09298210802479268", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1010993586"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/j.cognition.2004.12.005", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1026598201"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1145/1178723.1178728", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1046520800"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tsa.2002.800560", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061786085"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1121/1.2750160", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1062314346"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/icassp.2004.1326806", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1094147699"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/icassp.1998.675470", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1094462885"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2011", 
    "datePublishedReg": "2011-01-01", 
    "description": "In this paper two sets of evaluation experiments are conducted. First, we compare state-of-the-art automatic music genre classification algorithms to human performance on the same dataset, via a listening experiment. This will show that the improvements of content-based systems over the last years have reduced the gap between automatic and human classification performance, but could not yet close this gap. As an important extension to previous work in this context, we will also compare the automatic and human classification performance to a collaborative approach. Second, we propose two evaluation metrics, called user scores, that are based on the votes of the participants of the listening experiment. This user centric evaluation approach allows to get rid of predefined ground truth annotations and allows to account for the ambiguous human perception of musical genre. To take genre ambiguities into account is an important advantage with respect to the evaluation of content-based systems, especially since the dataset compiled in this work (both the audio files and collected votes) are publicly available.", 
    "editor": [
      {
        "familyName": "Detyniecki", 
        "givenName": "Marcin", 
        "type": "Person"
      }, 
      {
        "familyName": "Knees", 
        "givenName": "Peter", 
        "type": "Person"
      }, 
      {
        "familyName": "N\u00fcrnberger", 
        "givenName": "Andreas", 
        "type": "Person"
      }, 
      {
        "familyName": "Schedl", 
        "givenName": "Markus", 
        "type": "Person"
      }, 
      {
        "familyName": "Stober", 
        "givenName": "Sebastian", 
        "type": "Person"
      }
    ], 
    "genre": "chapter", 
    "id": "sg:pub.10.1007/978-3-642-27169-4_9", 
    "inLanguage": [
      "en"
    ], 
    "isAccessibleForFree": true, 
    "isPartOf": {
      "isbn": [
        "978-3-642-27168-7", 
        "978-3-642-27169-4"
      ], 
      "name": "Adaptive Multimedia Retrieval. Context, Exploration, and Fusion", 
      "type": "Book"
    }, 
    "name": "A Comparison of Human, Automatic and Collaborative Music Genre Classification and User Centric Evaluation of Genre Classification Systems", 
    "pagination": "118-131", 
    "productId": [
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/978-3-642-27169-4_9"
        ]
      }, 
      {
        "name": "readcube_id", 
        "type": "PropertyValue", 
        "value": [
          "1b20968fe0a814a5835b34cdc17965bb080578df6aa54b694d0fb147d698e7f1"
        ]
      }, 
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1005693801"
        ]
      }
    ], 
    "publisher": {
      "location": "Berlin, Heidelberg", 
      "name": "Springer Berlin Heidelberg", 
      "type": "Organisation"
    }, 
    "sameAs": [
      "https://doi.org/10.1007/978-3-642-27169-4_9", 
      "https://app.dimensions.ai/details/publication/pub.1005693801"
    ], 
    "sdDataset": "chapters", 
    "sdDatePublished": "2019-04-15T17:55", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-uberresearch-data-dimensions-target-20181106-alternative/cleanup/v134/2549eaecd7973599484d7c17b260dba0a4ecb94b/merge/v9/a6c9fde33151104705d4d7ff012ea9563521a3ce/jats-lookup/v90/0000000001_0000000264/records_8681_00000009.jsonl", 
    "type": "Chapter", 
    "url": "http://link.springer.com/10.1007/978-3-642-27169-4_9"
  }
]
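Because the record above is plain JSON-LD, it can be processed with any ordinary JSON tooling. The sketch below parses a trimmed copy of the record (only a few of the fields shown above are reproduced; a full record would normally be fetched from the service or read from disk) and extracts the chapter title, publication year, and author names:

```python
import json

# Trimmed copy of the SciGraph JSON-LD record shown above.
record_jsonld = """
[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json",
    "name": "A Comparison of Human, Automatic and Collaborative Music Genre Classification and User Centric Evaluation of Genre Classification Systems",
    "datePublished": "2011",
    "author": [
      {"familyName": "Seyerlehner", "givenName": "Klaus", "type": "Person"},
      {"familyName": "Widmer", "givenName": "Gerhard", "type": "Person"},
      {"familyName": "Knees", "givenName": "Peter", "type": "Person"}
    ]
  }
]
"""

# The top level of a SciGraph record is a one-element list.
chapter = json.loads(record_jsonld)[0]
authors = [f"{a['givenName']} {a['familyName']}" for a in chapter["author"]]

print(chapter["name"])
print(chapter["datePublished"], authors)
```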
 


HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-27169-4_9'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-27169-4_9'
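Because each N-Triples line is an independent subject–predicate–object statement, batch processing can be as simple as splitting lines. A minimal sketch (it assumes simple lines with no escaped quotes inside literals; real workloads should use an RDF library such as rdflib):

```python
# Minimal sketch: split one N-Triples line into subject, predicate, object.
def parse_ntriple(line: str):
    """Return (subject, predicate, object) for a simple N-Triples line."""
    line = line.rstrip().rstrip(".").rstrip()   # drop the trailing " ."
    subject, predicate, obj = line.split(" ", 2)
    return subject, predicate, obj.strip()

print(parse_ntriple('<http://a> <http://b> "c" .'))
```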

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-27169-4_9'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-27169-4_9'
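The four curl commands above differ only in the Accept header they send; the same content negotiation can be sketched in Python with the standard library. The endpoint URL is the one shown in this record; whether the live service still answers these requests is not guaranteed here.

```python
import urllib.request

# MIME types used by the curl examples above.
FORMATS = {
    "json-ld": "application/ld+json",
    "nt": "application/n-triples",
    "turtle": "text/turtle",
    "rdfxml": "application/rdf+xml",
}

def build_request(url: str, fmt: str) -> urllib.request.Request:
    """Build a GET request asking for the given RDF serialization."""
    return urllib.request.Request(url, headers={"Accept": FORMATS[fmt]})

req = build_request(
    "https://scigraph.springernature.com/pub.10.1007/978-3-642-27169-4_9",
    "turtle",
)
# urllib.request.urlopen(req).read() would then fetch the Turtle serialization.
```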


 

This table displays all metadata directly associated with this object as RDF triples.

120 TRIPLES      23 PREDICATES      34 URIs      20 LITERALS      8 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/978-3-642-27169-4_9 schema:about anzsrc-for:08
2 anzsrc-for:0801
3 schema:author Nb248f801ba4741229569d832e81f980d
4 schema:citation https://doi.org/10.1016/j.cognition.2004.12.005
5 https://doi.org/10.1080/09298210802479268
6 https://doi.org/10.1109/icassp.1998.675470
7 https://doi.org/10.1109/icassp.2004.1326806
8 https://doi.org/10.1109/tsa.2002.800560
9 https://doi.org/10.1121/1.2750160
10 https://doi.org/10.1145/1178723.1178728
11 schema:datePublished 2011
12 schema:datePublishedReg 2011-01-01
13 schema:description In this paper two sets of evaluation experiments are conducted. First, we compare state-of-the-art automatic music genre classification algorithms to human performance on the same dataset, via a listening experiment. This will show that the improvements of content-based systems over the last years have reduced the gap between automatic and human classification performance, but could not yet close this gap. As an important extension to previous work in this context, we will also compare the automatic and human classification performance to a collaborative approach. Second, we propose two evaluation metrics, called user scores, that are based on the votes of the participants of the listening experiment. This user centric evaluation approach allows to get rid of predefined ground truth annotations and allows to account for the ambiguous human perception of musical genre. To take genre ambiguities into account is an important advantage with respect to the evaluation of content-based systems, especially since the dataset compiled in this work (both the audio files and collected votes) are publicly available.
14 schema:editor Na612c767249a4ae28609ab63c0a172b0
15 schema:genre chapter
16 schema:inLanguage en
17 schema:isAccessibleForFree true
18 schema:isPartOf N134761d200274c99a13e88cbdcda0ace
19 schema:name A Comparison of Human, Automatic and Collaborative Music Genre Classification and User Centric Evaluation of Genre Classification Systems
20 schema:pagination 118-131
21 schema:productId N299c2571f389425abf6c673f62f568ff
22 N85f10a7638dd4075b74a24b67e142aff
23 N92b6efeb34b547f4a612dd45551ddbfc
24 schema:publisher N23eb5b6e19a1497f9c3906204002a3ce
25 schema:sameAs https://app.dimensions.ai/details/publication/pub.1005693801
26 https://doi.org/10.1007/978-3-642-27169-4_9
27 schema:sdDatePublished 2019-04-15T17:55
28 schema:sdLicense https://scigraph.springernature.com/explorer/license/
29 schema:sdPublisher Nabafffb8be7e4ac2868f0b9d607f0a28
30 schema:url http://link.springer.com/10.1007/978-3-642-27169-4_9
31 sgo:license sg:explorer/license/
32 sgo:sdDataset chapters
33 rdf:type schema:Chapter
34 N134761d200274c99a13e88cbdcda0ace schema:isbn 978-3-642-27168-7
35 978-3-642-27169-4
36 schema:name Adaptive Multimedia Retrieval. Context, Exploration, and Fusion
37 rdf:type schema:Book
38 N23eb5b6e19a1497f9c3906204002a3ce schema:location Berlin, Heidelberg
39 schema:name Springer Berlin Heidelberg
40 rdf:type schema:Organisation
41 N299c2571f389425abf6c673f62f568ff schema:name doi
42 schema:value 10.1007/978-3-642-27169-4_9
43 rdf:type schema:PropertyValue
44 N3ae9b3fe9cc24de3bcb4d1a3f41c1b92 schema:familyName Nürnberger
45 schema:givenName Andreas
46 rdf:type schema:Person
47 N3ce01b63e6e047b0b68ddc674e3f8824 schema:familyName Schedl
48 schema:givenName Markus
49 rdf:type schema:Person
50 N497d8091dc4c4001a28619b13a6c5fe9 schema:familyName Stober
51 schema:givenName Sebastian
52 rdf:type schema:Person
53 N4db299a5daed4cef802c3e06a8240e50 rdf:first N3ae9b3fe9cc24de3bcb4d1a3f41c1b92
54 rdf:rest Nc30432eb5c544e2aa422a4ecacf39be1
55 N85f10a7638dd4075b74a24b67e142aff schema:name dimensions_id
56 schema:value pub.1005693801
57 rdf:type schema:PropertyValue
58 N884b9a2807914e3ea2143efd39df4b69 schema:familyName Knees
59 schema:givenName Peter
60 rdf:type schema:Person
61 N92b6efeb34b547f4a612dd45551ddbfc schema:name readcube_id
62 schema:value 1b20968fe0a814a5835b34cdc17965bb080578df6aa54b694d0fb147d698e7f1
63 rdf:type schema:PropertyValue
64 N9375e65550464f70bcb7bb6b5d11004d rdf:first sg:person.012374103011.27
65 rdf:rest rdf:nil
66 Na612c767249a4ae28609ab63c0a172b0 rdf:first Nbf2f82a7d3c746c493b856ba5f8f1c2e
67 rdf:rest Naf5a4c483ba643e99764420e8a8f2197
68 Nabafffb8be7e4ac2868f0b9d607f0a28 schema:name Springer Nature - SN SciGraph project
69 rdf:type schema:Organization
70 Naf5a4c483ba643e99764420e8a8f2197 rdf:first N884b9a2807914e3ea2143efd39df4b69
71 rdf:rest N4db299a5daed4cef802c3e06a8240e50
72 Nb248f801ba4741229569d832e81f980d rdf:first sg:person.016131276755.19
73 rdf:rest Nc5fcfaba4e8b43e6a501d6206772a460
74 Nb6ed57538cbf4b4ebe8490544e834870 rdf:first N497d8091dc4c4001a28619b13a6c5fe9
75 rdf:rest rdf:nil
76 Nbf2f82a7d3c746c493b856ba5f8f1c2e schema:familyName Detyniecki
77 schema:givenName Marcin
78 rdf:type schema:Person
79 Nc30432eb5c544e2aa422a4ecacf39be1 rdf:first N3ce01b63e6e047b0b68ddc674e3f8824
80 rdf:rest Nb6ed57538cbf4b4ebe8490544e834870
81 Nc5fcfaba4e8b43e6a501d6206772a460 rdf:first sg:person.013641401431.40
82 rdf:rest N9375e65550464f70bcb7bb6b5d11004d
83 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
84 schema:name Information and Computing Sciences
85 rdf:type schema:DefinedTerm
86 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
87 schema:name Artificial Intelligence and Image Processing
88 rdf:type schema:DefinedTerm
89 sg:person.012374103011.27 schema:affiliation https://www.grid.ac/institutes/grid.9970.7
90 schema:familyName Knees
91 schema:givenName Peter
92 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012374103011.27
93 rdf:type schema:Person
94 sg:person.013641401431.40 schema:affiliation https://www.grid.ac/institutes/grid.9970.7
95 schema:familyName Widmer
96 schema:givenName Gerhard
97 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013641401431.40
98 rdf:type schema:Person
99 sg:person.016131276755.19 schema:affiliation https://www.grid.ac/institutes/grid.9970.7
100 schema:familyName Seyerlehner
101 schema:givenName Klaus
102 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.016131276755.19
103 rdf:type schema:Person
104 https://doi.org/10.1016/j.cognition.2004.12.005 schema:sameAs https://app.dimensions.ai/details/publication/pub.1026598201
105 rdf:type schema:CreativeWork
106 https://doi.org/10.1080/09298210802479268 schema:sameAs https://app.dimensions.ai/details/publication/pub.1010993586
107 rdf:type schema:CreativeWork
108 https://doi.org/10.1109/icassp.1998.675470 schema:sameAs https://app.dimensions.ai/details/publication/pub.1094462885
109 rdf:type schema:CreativeWork
110 https://doi.org/10.1109/icassp.2004.1326806 schema:sameAs https://app.dimensions.ai/details/publication/pub.1094147699
111 rdf:type schema:CreativeWork
112 https://doi.org/10.1109/tsa.2002.800560 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061786085
113 rdf:type schema:CreativeWork
114 https://doi.org/10.1121/1.2750160 schema:sameAs https://app.dimensions.ai/details/publication/pub.1062314346
115 rdf:type schema:CreativeWork
116 https://doi.org/10.1145/1178723.1178728 schema:sameAs https://app.dimensions.ai/details/publication/pub.1046520800
117 rdf:type schema:CreativeWork
118 https://www.grid.ac/institutes/grid.9970.7 schema:alternateName Johannes Kepler University of Linz
119 schema:name Dept. of Computational Perception, Johannes Kepler University, Linz, Austria
120 rdf:type schema:Organization
 



