Relational recurrent neural networks for polyphonic sound event detection


Ontology type: schema:ScholarlyArticle     


Article Info

DATE

2019-01-09

AUTHORS

Junbo Ma, Ruili Wang, Wanting Ji, Hao Zheng, En Zhu, Jianping Yin

ABSTRACT

A smart environment is one of the application scenarios of the Internet of Things (IoT). A variety of technologies have been developed to provide a ubiquitous smart environment for humans. In a smart environment system, sound event detection is one of the fundamental technologies: it automatically senses sound changes in the environment and detects the sound events that cause them. In this paper, we propose a Relational Recurrent Neural Network (RRNN) method for polyphonic sound event detection, called RRNN-SED, which exploits the strength of RRNNs in long-term temporal context extraction and relational reasoning across a polyphonic sound signal. Unlike previous sound event detection methods, which rely heavily on convolutional neural networks or recurrent neural networks, the proposed RRNN-SED method can handle long-lasting and overlapping sound events in polyphonic sound event detection. Specifically, since the pieces of historical information memorized inside RRNNs can interact with each other across a polyphonic sound signal, the proposed RRNN-SED method is effective and efficient at extracting temporal context information and reasoning about the unique relational characteristics of the target sound events. Experimental results on two public datasets show that the proposed method achieves better sound event detection results in terms of segment-based F-score and segment-based error rate.

PAGES

1-19

References to SciGraph publications

  • 2017-09. Multimedia cloud transmission and storage system based on internet of things in MULTIMEDIA TOOLS AND APPLICATIONS
  • 2006-12. A Discriminative Model for Polyphonic Piano Transcription in APPLIED SIGNAL PROCESSING
  • 2019-02. Dictionary-based active learning for sound event classification in MULTIMEDIA TOOLS AND APPLICATIONS
  • 2013-12. Context-dependent sound event detection in EURASIP JOURNAL ON AUDIO, SPEECH, AND MUSIC PROCESSING
IDENTIFIERS

    URI

    http://scigraph.springernature.com/pub.10.1007/s11042-018-7142-7

    DOI

    http://dx.doi.org/10.1007/s11042-018-7142-7

    DIMENSIONS

    https://app.dimensions.ai/details/publication/pub.1111312535


    Indexing Status: Check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
    Incoming Citations: Browse incoming citations for this publication using opencitations.net.

    JSON-LD is the canonical representation for SciGraph data.

    TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).

    [
      {
        "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
        "about": [
          {
            "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
            "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
            "name": "Artificial Intelligence and Image Processing", 
            "type": "DefinedTerm"
          }, 
          {
            "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
            "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
            "name": "Information and Computing Sciences", 
            "type": "DefinedTerm"
          }
        ], 
        "author": [
          {
            "affiliation": {
              "alternateName": "National University of Defense Technology", 
              "id": "https://www.grid.ac/institutes/grid.412110.7", 
              "name": [
                "Massey University, Auckland, New Zealand", 
                "School of Computer, National University of Defense Technology, Changsha, China"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Ma", 
            "givenName": "Junbo", 
            "id": "sg:person.010071173434.24", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010071173434.24"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Zhejiang Gongshang University", 
              "id": "https://www.grid.ac/institutes/grid.413072.3", 
              "name": [
                "Massey University, Auckland, New Zealand", 
                "Zhejiang Gongshang University, Hangzhou, China"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Wang", 
            "givenName": "Ruili", 
            "id": "sg:person.01112556557.70", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01112556557.70"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Zhejiang Gongshang University", 
              "id": "https://www.grid.ac/institutes/grid.413072.3", 
              "name": [
                "Massey University, Auckland, New Zealand", 
                "Zhejiang Gongshang University, Hangzhou, China"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Ji", 
            "givenName": "Wanting", 
            "id": "sg:person.010550736405.59", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010550736405.59"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Nanjing Xiaozhuang University", 
              "id": "https://www.grid.ac/institutes/grid.440845.9", 
              "name": [
                "College of information engineering, Nanjing Xiaozhuang University, Nanjing, China"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Zheng", 
            "givenName": "Hao", 
            "id": "sg:person.07575706553.15", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.07575706553.15"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "National University of Defense Technology", 
              "id": "https://www.grid.ac/institutes/grid.412110.7", 
              "name": [
                "School of Computer, National University of Defense Technology, Changsha, China"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Zhu", 
            "givenName": "En", 
            "id": "sg:person.012334352267.67", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012334352267.67"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Dongguan University of Technology", 
              "id": "https://www.grid.ac/institutes/grid.459466.c", 
              "name": [
                "Dongguan University of Technology, Dongguan, China"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Yin", 
            "givenName": "Jianping", 
            "id": "sg:person.012631125327.29", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012631125327.29"
            ], 
            "type": "Person"
          }
        ], 
        "citation": [
          {
            "id": "https://doi.org/10.1016/j.neunet.2014.09.003", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1013219854"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1016/j.jclepro.2016.10.006", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1017673424"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.3390/app6060162", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1025035487"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1186/1687-4722-2013-1", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1049472325", 
              "https://doi.org/10.1186/1687-4722-2013-1"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s11042-015-2967-9", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1054523468", 
              "https://doi.org/10.1007/s11042-015-2967-9"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1155/2007/48317", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1063202294", 
              "https://doi.org/10.1155/2007/48317"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/taslp.2017.2690575", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1085641971"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1016/j.neucom.2017.07.021", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1090741558"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/tii.2017.2739340", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1091268684"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/icita.2005.231", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1093873558"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/ijcnn.2015.7280624", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1094027910"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/icassp.2015.7177950", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1095144935"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/icassp.2016.7472917", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1095196539"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/grc.2005.1547359", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1095596629"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/eusipco.2016.7760424", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1095637962"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/icassp.2017.7952260", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1095991193"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.21437/interspeech.2016-392", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1099086765"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1016/j.patrec.2018.01.013", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1100726552"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/icot.2017.8336092", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1103265657"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/comst.2018.2844341", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1104439462"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s11042-018-6380-z", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1105830556", 
              "https://doi.org/10.1007/s11042-018-6380-z"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "https://doi.org/10.1109/ijcnn.2018.8489470", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1107705379"
            ], 
            "type": "CreativeWork"
          }
        ], 
        "datePublished": "2019-01-09", 
        "datePublishedReg": "2019-01-09", 
        "description": "A smart environment is one of the application scenarios of the Internet of Things (IoT). In order to provide a ubiquitous smart environment for humans, a variety of technologies are developed. In a smart environment system, sound event detection is one of the fundamental technologies, which can automatically sense sound changes in the environment and detect sound events that cause changes. In this paper, we propose the use of Relational Recurrent Neural Network (RRNN) for polyphonic sound event detection, called RRNN-SED, which utilized the strength of RRNN in long-term temporal context extraction and relational reasoning across a polyphonic sound signal. Different from previous sound event detection methods, which rely heavily on convolutional neural networks or recurrent neural networks, the proposed RRNN-SED method can solve long-lasting and overlapping problems in polyphonic sound event detection. Specifically, since the historical information memorized inside RRNNs is capable of interacting with each other across a polyphonic sound signal, the proposed RRNN-SED method is effective and efficient in extracting temporal context information and reasoning the unique relational characteristic of the target sound events. Experimental results on two public datasets show that the proposed method achieved better sound event detection results in terms of segment-based F-score and segment-based error rate.", 
        "genre": "research_article", 
        "id": "sg:pub.10.1007/s11042-018-7142-7", 
        "inLanguage": [
          "en"
        ], 
        "isAccessibleForFree": false, 
        "isPartOf": [
          {
            "id": "sg:journal.1044869", 
            "issn": [
              "1380-7501", 
              "1573-7721"
            ], 
            "name": "Multimedia Tools and Applications", 
            "type": "Periodical"
          }
        ], 
        "name": "Relational recurrent neural networks for polyphonic sound event detection", 
        "pagination": "1-19", 
        "productId": [
          {
            "name": "readcube_id", 
            "type": "PropertyValue", 
            "value": [
              "badf6270d32bc0dd2e3e501301c03d2b76a37e57f5279173b04e36b8a08bc8ee"
            ]
          }, 
          {
            "name": "doi", 
            "type": "PropertyValue", 
            "value": [
              "10.1007/s11042-018-7142-7"
            ]
          }, 
          {
            "name": "dimensions_id", 
            "type": "PropertyValue", 
            "value": [
              "pub.1111312535"
            ]
          }
        ], 
        "sameAs": [
          "https://doi.org/10.1007/s11042-018-7142-7", 
          "https://app.dimensions.ai/details/publication/pub.1111312535"
        ], 
        "sdDataset": "articles", 
        "sdDatePublished": "2019-04-11T08:37", 
        "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
        "sdPublisher": {
          "name": "Springer Nature - SN SciGraph project", 
          "type": "Organization"
        }, 
        "sdSource": "s3://com-uberresearch-data-dimensions-target-20181106-alternative/cleanup/v134/2549eaecd7973599484d7c17b260dba0a4ecb94b/merge/v9/a6c9fde33151104705d4d7ff012ea9563521a3ce/jats-lookup/v90/0000000315_0000000315/records_6310_00000000.jsonl", 
        "type": "ScholarlyArticle", 
        "url": "https://link.springer.com/10.1007%2Fs11042-018-7142-7"
      }
    ]
     

    Download the RDF metadata as JSON-LD, N-Triples, Turtle, or RDF/XML (see License info).

    HOW TO GET THIS DATA PROGRAMMATICALLY:

    JSON-LD is a popular format for linked data that is fully compatible with JSON.

    curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s11042-018-7142-7'

    N-Triples is a line-based linked data format ideal for batch operations.

    curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s11042-018-7142-7'

    Turtle is a human-readable linked data format.

    curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s11042-018-7142-7'

    RDF/XML is a standard XML format for linked data.

    curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s11042-018-7142-7'
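    The curl commands above all perform HTTP content negotiation against the same record URL, varying only the Accept header. The same request can be made from Python using just the standard library. The sketch below is illustrative, not part of any SciGraph client API: the function and variable names are invented here, and the DOI-extraction step runs offline against a small fragment of the JSON-LD record shown above, so the network call itself is only defined, not executed.

    ```python
    import json
    import urllib.request

    RECORD_URL = "https://scigraph.springernature.com/pub.10.1007/s11042-018-7142-7"

    def fetch_jsonld(url):
        # Content negotiation: explicitly request JSON-LD, mirroring
        # `curl -H 'Accept: application/ld+json' <url>`.
        req = urllib.request.Request(url, headers={"Accept": "application/ld+json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # Offline demonstration on a fragment of the record shown above.
    record = {
        "id": "sg:pub.10.1007/s11042-018-7142-7",
        "productId": [
            {"name": "doi", "type": "PropertyValue",
             "value": ["10.1007/s11042-018-7142-7"]},
            {"name": "dimensions_id", "type": "PropertyValue",
             "value": ["pub.1111312535"]},
        ],
    }

    def extract_doi(rec):
        # Walk the productId list of a record and return the DOI value, if any.
        for pid in rec.get("productId", []):
            if pid.get("name") == "doi":
                return pid["value"][0]
        return None

    print(extract_doi(record))  # 10.1007/s11042-018-7142-7
    ```

    To fetch the other serializations, swap the Accept header for `application/n-triples`, `text/turtle`, or `application/rdf+xml`, exactly as in the curl examples.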


     

    This table displays all metadata directly associated with this object as RDF triples.

    171 TRIPLES      21 PREDICATES      46 URIs      16 LITERALS      5 BLANK NODES

    Subject Predicate Object
    1 sg:pub.10.1007/s11042-018-7142-7 schema:about anzsrc-for:08
    2 anzsrc-for:0801
    3 schema:author N99c8a62665d34fcfbe3a2678e1a6462b
    4 schema:citation sg:pub.10.1007/s11042-015-2967-9
    5 sg:pub.10.1007/s11042-018-6380-z
    6 sg:pub.10.1155/2007/48317
    7 sg:pub.10.1186/1687-4722-2013-1
    8 https://doi.org/10.1016/j.jclepro.2016.10.006
    9 https://doi.org/10.1016/j.neucom.2017.07.021
    10 https://doi.org/10.1016/j.neunet.2014.09.003
    11 https://doi.org/10.1016/j.patrec.2018.01.013
    12 https://doi.org/10.1109/comst.2018.2844341
    13 https://doi.org/10.1109/eusipco.2016.7760424
    14 https://doi.org/10.1109/grc.2005.1547359
    15 https://doi.org/10.1109/icassp.2015.7177950
    16 https://doi.org/10.1109/icassp.2016.7472917
    17 https://doi.org/10.1109/icassp.2017.7952260
    18 https://doi.org/10.1109/icita.2005.231
    19 https://doi.org/10.1109/icot.2017.8336092
    20 https://doi.org/10.1109/ijcnn.2015.7280624
    21 https://doi.org/10.1109/ijcnn.2018.8489470
    22 https://doi.org/10.1109/taslp.2017.2690575
    23 https://doi.org/10.1109/tii.2017.2739340
    24 https://doi.org/10.21437/interspeech.2016-392
    25 https://doi.org/10.3390/app6060162
    26 schema:datePublished 2019-01-09
    27 schema:datePublishedReg 2019-01-09
    28 schema:description A smart environment is one of the application scenarios of the Internet of Things (IoT). In order to provide a ubiquitous smart environment for humans, a variety of technologies are developed. In a smart environment system, sound event detection is one of the fundamental technologies, which can automatically sense sound changes in the environment and detect sound events that cause changes. In this paper, we propose the use of Relational Recurrent Neural Network (RRNN) for polyphonic sound event detection, called RRNN-SED, which utilized the strength of RRNN in long-term temporal context extraction and relational reasoning across a polyphonic sound signal. Different from previous sound event detection methods, which rely heavily on convolutional neural networks or recurrent neural networks, the proposed RRNN-SED method can solve long-lasting and overlapping problems in polyphonic sound event detection. Specifically, since the historical information memorized inside RRNNs is capable of interacting with each other across a polyphonic sound signal, the proposed RRNN-SED method is effective and efficient in extracting temporal context information and reasoning the unique relational characteristic of the target sound events. Experimental results on two public datasets show that the proposed method achieved better sound event detection results in terms of segment-based F-score and segment-based error rate.
    29 schema:genre research_article
    30 schema:inLanguage en
    31 schema:isAccessibleForFree false
    32 schema:isPartOf sg:journal.1044869
    33 schema:name Relational recurrent neural networks for polyphonic sound event detection
    34 schema:pagination 1-19
    35 schema:productId N2c2e516e925e472191409812dd80f495
    36 N3fc2bfcb34f64403a5fff56e8ca8a845
    37 Neb54b3cc6c0b4eaeb0f4118349497d99
    38 schema:sameAs https://app.dimensions.ai/details/publication/pub.1111312535
    39 https://doi.org/10.1007/s11042-018-7142-7
    40 schema:sdDatePublished 2019-04-11T08:37
    41 schema:sdLicense https://scigraph.springernature.com/explorer/license/
    42 schema:sdPublisher N1b3aff69ad464299b29e8136e99630b0
    43 schema:url https://link.springer.com/10.1007%2Fs11042-018-7142-7
    44 sgo:license sg:explorer/license/
    45 sgo:sdDataset articles
    46 rdf:type schema:ScholarlyArticle
    47 N0662d4a821b148e3aac50b64742d9b83 rdf:first sg:person.01112556557.70
    48 rdf:rest N0b6ecb17cd624a92bf7ed6b397b6c140
    49 N0b6ecb17cd624a92bf7ed6b397b6c140 rdf:first sg:person.010550736405.59
    50 rdf:rest Nd60fad43197e48e7b3153c21ca6fe1ac
    51 N1b3aff69ad464299b29e8136e99630b0 schema:name Springer Nature - SN SciGraph project
    52 rdf:type schema:Organization
    53 N2c2e516e925e472191409812dd80f495 schema:name doi
    54 schema:value 10.1007/s11042-018-7142-7
    55 rdf:type schema:PropertyValue
    56 N3fc2bfcb34f64403a5fff56e8ca8a845 schema:name dimensions_id
    57 schema:value pub.1111312535
    58 rdf:type schema:PropertyValue
    59 N5d6e331fe1e24a40ab1559fe32f00877 rdf:first sg:person.012334352267.67
    60 rdf:rest Nc55727a6521f4df6824225457f3b1a8e
    61 N99c8a62665d34fcfbe3a2678e1a6462b rdf:first sg:person.010071173434.24
    62 rdf:rest N0662d4a821b148e3aac50b64742d9b83
    63 Nc55727a6521f4df6824225457f3b1a8e rdf:first sg:person.012631125327.29
    64 rdf:rest rdf:nil
    65 Nd60fad43197e48e7b3153c21ca6fe1ac rdf:first sg:person.07575706553.15
    66 rdf:rest N5d6e331fe1e24a40ab1559fe32f00877
    67 Neb54b3cc6c0b4eaeb0f4118349497d99 schema:name readcube_id
    68 schema:value badf6270d32bc0dd2e3e501301c03d2b76a37e57f5279173b04e36b8a08bc8ee
    69 rdf:type schema:PropertyValue
    70 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
    71 schema:name Information and Computing Sciences
    72 rdf:type schema:DefinedTerm
    73 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
    74 schema:name Artificial Intelligence and Image Processing
    75 rdf:type schema:DefinedTerm
    76 sg:journal.1044869 schema:issn 1380-7501
    77 1573-7721
    78 schema:name Multimedia Tools and Applications
    79 rdf:type schema:Periodical
    80 sg:person.010071173434.24 schema:affiliation https://www.grid.ac/institutes/grid.412110.7
    81 schema:familyName Ma
    82 schema:givenName Junbo
    83 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010071173434.24
    84 rdf:type schema:Person
    85 sg:person.010550736405.59 schema:affiliation https://www.grid.ac/institutes/grid.413072.3
    86 schema:familyName Ji
    87 schema:givenName Wanting
    88 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010550736405.59
    89 rdf:type schema:Person
    90 sg:person.01112556557.70 schema:affiliation https://www.grid.ac/institutes/grid.413072.3
    91 schema:familyName Wang
    92 schema:givenName Ruili
    93 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01112556557.70
    94 rdf:type schema:Person
    95 sg:person.012334352267.67 schema:affiliation https://www.grid.ac/institutes/grid.412110.7
    96 schema:familyName Zhu
    97 schema:givenName En
    98 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012334352267.67
    99 rdf:type schema:Person
    100 sg:person.012631125327.29 schema:affiliation https://www.grid.ac/institutes/grid.459466.c
    101 schema:familyName Yin
    102 schema:givenName Jianping
    103 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012631125327.29
    104 rdf:type schema:Person
    105 sg:person.07575706553.15 schema:affiliation https://www.grid.ac/institutes/grid.440845.9
    106 schema:familyName Zheng
    107 schema:givenName Hao
    108 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.07575706553.15
    109 rdf:type schema:Person
    110 sg:pub.10.1007/s11042-015-2967-9 schema:sameAs https://app.dimensions.ai/details/publication/pub.1054523468
    111 https://doi.org/10.1007/s11042-015-2967-9
    112 rdf:type schema:CreativeWork
    113 sg:pub.10.1007/s11042-018-6380-z schema:sameAs https://app.dimensions.ai/details/publication/pub.1105830556
    114 https://doi.org/10.1007/s11042-018-6380-z
    115 rdf:type schema:CreativeWork
    116 sg:pub.10.1155/2007/48317 schema:sameAs https://app.dimensions.ai/details/publication/pub.1063202294
    117 https://doi.org/10.1155/2007/48317
    118 rdf:type schema:CreativeWork
    119 sg:pub.10.1186/1687-4722-2013-1 schema:sameAs https://app.dimensions.ai/details/publication/pub.1049472325
    120 https://doi.org/10.1186/1687-4722-2013-1
    121 rdf:type schema:CreativeWork
    122 https://doi.org/10.1016/j.jclepro.2016.10.006 schema:sameAs https://app.dimensions.ai/details/publication/pub.1017673424
    123 rdf:type schema:CreativeWork
    124 https://doi.org/10.1016/j.neucom.2017.07.021 schema:sameAs https://app.dimensions.ai/details/publication/pub.1090741558
    125 rdf:type schema:CreativeWork
    126 https://doi.org/10.1016/j.neunet.2014.09.003 schema:sameAs https://app.dimensions.ai/details/publication/pub.1013219854
    127 rdf:type schema:CreativeWork
    128 https://doi.org/10.1016/j.patrec.2018.01.013 schema:sameAs https://app.dimensions.ai/details/publication/pub.1100726552
    129 rdf:type schema:CreativeWork
    130 https://doi.org/10.1109/comst.2018.2844341 schema:sameAs https://app.dimensions.ai/details/publication/pub.1104439462
    131 rdf:type schema:CreativeWork
    132 https://doi.org/10.1109/eusipco.2016.7760424 schema:sameAs https://app.dimensions.ai/details/publication/pub.1095637962
    133 rdf:type schema:CreativeWork
    134 https://doi.org/10.1109/grc.2005.1547359 schema:sameAs https://app.dimensions.ai/details/publication/pub.1095596629
    135 rdf:type schema:CreativeWork
    136 https://doi.org/10.1109/icassp.2015.7177950 schema:sameAs https://app.dimensions.ai/details/publication/pub.1095144935
    137 rdf:type schema:CreativeWork
    138 https://doi.org/10.1109/icassp.2016.7472917 schema:sameAs https://app.dimensions.ai/details/publication/pub.1095196539
    139 rdf:type schema:CreativeWork
    140 https://doi.org/10.1109/icassp.2017.7952260 schema:sameAs https://app.dimensions.ai/details/publication/pub.1095991193
    141 rdf:type schema:CreativeWork
    142 https://doi.org/10.1109/icita.2005.231 schema:sameAs https://app.dimensions.ai/details/publication/pub.1093873558
    143 rdf:type schema:CreativeWork
    144 https://doi.org/10.1109/icot.2017.8336092 schema:sameAs https://app.dimensions.ai/details/publication/pub.1103265657
    145 rdf:type schema:CreativeWork
    146 https://doi.org/10.1109/ijcnn.2015.7280624 schema:sameAs https://app.dimensions.ai/details/publication/pub.1094027910
    147 rdf:type schema:CreativeWork
    148 https://doi.org/10.1109/ijcnn.2018.8489470 schema:sameAs https://app.dimensions.ai/details/publication/pub.1107705379
    149 rdf:type schema:CreativeWork
    150 https://doi.org/10.1109/taslp.2017.2690575 schema:sameAs https://app.dimensions.ai/details/publication/pub.1085641971
    151 rdf:type schema:CreativeWork
    152 https://doi.org/10.1109/tii.2017.2739340 schema:sameAs https://app.dimensions.ai/details/publication/pub.1091268684
    153 rdf:type schema:CreativeWork
    154 https://doi.org/10.21437/interspeech.2016-392 schema:sameAs https://app.dimensions.ai/details/publication/pub.1099086765
    155 rdf:type schema:CreativeWork
    156 https://doi.org/10.3390/app6060162 schema:sameAs https://app.dimensions.ai/details/publication/pub.1025035487
    157 rdf:type schema:CreativeWork
    158 https://www.grid.ac/institutes/grid.412110.7 schema:alternateName National University of Defense Technology
    159 schema:name Massey University, Auckland, New Zealand
    160 School of Computer, National University of Defense Technology, Changsha, China
    161 rdf:type schema:Organization
    162 https://www.grid.ac/institutes/grid.413072.3 schema:alternateName Zhejiang Gongshang University
    163 schema:name Massey University, Auckland, New Zealand
    164 Zhejiang Gongshang University, Hangzhou, China
    165 rdf:type schema:Organization
    166 https://www.grid.ac/institutes/grid.440845.9 schema:alternateName Nanjing Xiaozhuang University
    167 schema:name College of information engineering, Nanjing Xiaozhuang University, Nanjing, China
    168 rdf:type schema:Organization
    169 https://www.grid.ac/institutes/grid.459466.c schema:alternateName Dongguan University of Technology
    170 schema:name Dongguan University of Technology, Dongguan, China
    171 rdf:type schema:Organization
     



