BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning


Ontology type: schema:ScholarlyArticle      Open Access: True


Article Info

DATE

2021-11-10

AUTHORS

Saeed Reza Kheradpisheh, Maryam Mirsadeghi, Timothée Masquelier

ABSTRACT

We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding. With this coding scheme, neurons fire at most once per stimulus, but the firing order carries information. Here, we introduce BS4NN, a modification of S4NN in which the synaptic weights are constrained to be binary (+1 or −1), in order to decrease memory (ideally, one bit per synapse) and computation footprints. This was done using two sets of weights: firstly, real-valued weights, updated by gradient descent, and used in the backward pass of backpropagation, and secondly, their signs, used in the forward pass. Similar strategies have been used to train (non-spiking) binarized neural networks. The main difference is that BS4NN operates in the time domain: spikes are propagated sequentially, and different neurons may reach their threshold at different times, which increases computational power. We validated BS4NN on two popular benchmarks, MNIST and Fashion-MNIST, and obtained reasonable accuracies for this sort of network (97.0% and 87.3%, respectively), with a negligible accuracy drop with respect to real-valued weights (0.4% and 0.7%, respectively). We also demonstrated that BS4NN outperforms a simple BNN with the same architectures on those two datasets (by 0.2% and 0.9%, respectively), presumably because it leverages the temporal dimension.
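The two-weight-set strategy described in the abstract can be sketched in a few lines. The NumPy sketch below is a minimal, hypothetical illustration of the general binarized-weight idea (signs of real-valued weights in the forward pass, gradient updates applied to the real-valued copy) on a toy dense-layer regression; it is not the authors' spiking, temporal-coding implementation, and all names and the toy task are invented for illustration.

```python
import numpy as np

# Minimal sketch of the two-weight-set strategy: real-valued weights
# are updated by gradient descent, while only their signs (+1 / -1)
# are used in the forward pass. Toy dense-layer regression, NOT the
# authors' spiking implementation.

rng = np.random.default_rng(0)
W_real = rng.normal(0.0, 0.1, size=(4, 3))   # real-valued weights (backward pass)

def binarize(W):
    """Map real-valued weights to +1/-1 (zeros mapped to +1)."""
    return np.where(W >= 0.0, 1.0, -1.0)

x = rng.normal(size=(8, 4))                  # toy inputs
y = rng.normal(size=(8, 3))                  # toy targets
lr = 0.01

for _ in range(100):
    W_bin = binarize(W_real)                 # forward pass uses the binary weights
    err = x @ W_bin - y
    grad = x.T @ err / len(x)                # MSE gradient w.r.t. the weights
    W_real -= lr * grad                      # only the real-valued copy is updated

W_deployed = binarize(W_real)                # at inference: one bit per synapse
```

After training, only `W_deployed` needs to be stored, which is where the memory saving (ideally one bit per synapse) comes from.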

PAGES

1-19

References to SciGraph publications

  • 2019-11-27. Towards spike-based machine intelligence with neuromorphic computing in NATURE
  • 2016-09-17. XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks in COMPUTER VISION – ECCV 2016
  • 2011. Error-Backpropagation in Networks of Fractionally Predictive Spiking Neurons in ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING – ICANN 2011
  • 2019-05-13. A novel and efficient classifier using spiking neural network in THE JOURNAL OF SUPERCOMPUTING
    Identifiers

    URI

    http://scigraph.springernature.com/pub.10.1007/s11063-021-10680-x

    DOI

    http://dx.doi.org/10.1007/s11063-021-10680-x

    DIMENSIONS

    https://app.dimensions.ai/details/publication/pub.1142516805



    JSON-LD is the canonical representation for SciGraph data.

    TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or Google SDTT.

    [
      {
        "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
        "about": [
          {
            "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
            "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
            "name": "Information and Computing Sciences", 
            "type": "DefinedTerm"
          }, 
          {
            "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
            "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
            "name": "Artificial Intelligence and Image Processing", 
            "type": "DefinedTerm"
          }
        ], 
        "author": [
          {
            "affiliation": {
              "alternateName": "Department of Computer and Data Sciences, Faculty of Mathematical Sciences, Shahid Beheshti University, Tehran, Iran", 
              "id": "http://www.grid.ac/institutes/grid.412502.0", 
              "name": [
                "Department of Computer and Data Sciences, Faculty of Mathematical Sciences, Shahid Beheshti University, Tehran, Iran"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Kheradpisheh", 
            "givenName": "Saeed Reza", 
            "id": "sg:person.01030064000.31", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01030064000.31"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Department of Electrical Engineering, Amirkabir University of Technology, Tehran, Iran", 
              "id": "http://www.grid.ac/institutes/grid.411368.9", 
              "name": [
                "Department of Electrical Engineering, Amirkabir University of Technology, Tehran, Iran"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Mirsadeghi", 
            "givenName": "Maryam", 
            "id": "sg:person.011467610503.31", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011467610503.31"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "CerCo UMR 5549, CNRS, Universit\u00e9 Toulouse 3, Toulouse, France", 
              "id": "http://www.grid.ac/institutes/grid.4444.0", 
              "name": [
                "CerCo UMR 5549, CNRS, Universit\u00e9 Toulouse 3, Toulouse, France"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Masquelier", 
            "givenName": "Timoth\u00e9e", 
            "id": "sg:person.01271016666.11", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01271016666.11"
            ], 
            "type": "Person"
          }
        ], 
        "citation": [
          {
            "id": "sg:pub.10.1007/978-3-642-21735-7_8", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1033736046", 
              "https://doi.org/10.1007/978-3-642-21735-7_8"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1038/s41586-019-1677-2", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1122923895", 
              "https://doi.org/10.1038/s41586-019-1677-2"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/978-3-319-46493-0_32", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1005632531", 
              "https://doi.org/10.1007/978-3-319-46493-0_32"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s11227-019-02881-y", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1114199837", 
              "https://doi.org/10.1007/s11227-019-02881-y"
            ], 
            "type": "CreativeWork"
          }
        ], 
        "datePublished": "2021-11-10", 
        "datePublishedReg": "2021-11-10", 
        "description": "We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding. With this coding scheme, neurons fire at most once per stimulus, but the firing order carries information. Here, we introduce BS4NN, a modification of S4NN in which the synaptic weights are constrained to be binary (+1 or \u2212 1), in order to decrease memory (ideally, one bit per synapse) and computation footprints. This was done using two sets of weights: firstly, real-valued weights, updated by gradient descent, and used in the backward pass of backpropagation, and secondly, their signs, used in the forward pass. Similar strategies have been used to train (non-spiking) binarized neural networks. The main difference is that BS4NN operates in the time domain: spikes are propagated sequentially, and different neurons may reach their threshold at different times, which increases computational power. We validated BS4NN on two popular benchmarks, MNIST and Fashion-MNIST, and obtained reasonable accuracies for this sort of network (97.0% and 87.3% respectively) with a negligible accuracy drop with respect to real-valued weights (0.4% and 0.7%, respectively). We also demonstrated that BS4NN outperforms a simple BNN with the same architectures on those two datasets (by 0.2% and 0.9% respectively), presumably because it leverages the temporal dimension.", 
        "genre": "article", 
        "id": "sg:pub.10.1007/s11063-021-10680-x", 
        "inLanguage": "en", 
        "isAccessibleForFree": true, 
        "isPartOf": [
          {
            "id": "sg:journal.1132792", 
            "issn": [
              "1370-4621", 
              "1573-773X"
            ], 
            "name": "Neural Processing Letters", 
            "publisher": "Springer Nature", 
            "type": "Periodical"
          }
        ], 
        "keywords": [
          "neural network", 
          "sort of network", 
          "Fashion-MNIST", 
          "computational power", 
          "accuracy drop", 
          "popular benchmarks", 
          "temporal coding", 
          "forward pass", 
          "same architecture", 
          "spike coding", 
          "gradient descent", 
          "backward pass", 
          "set of weights", 
          "synaptic weights", 
          "coding", 
          "network", 
          "backpropagation", 
          "temporal dimension", 
          "fire neurons", 
          "MNIST", 
          "architecture", 
          "algorithm", 
          "dataset", 
          "benchmarks", 
          "reasonable accuracy", 
          "scheme", 
          "firing order", 
          "different neurons", 
          "time domain", 
          "accuracy", 
          "BNN", 
          "information", 
          "set", 
          "order", 
          "memory", 
          "domain", 
          "footprint", 
          "main difference", 
          "pass", 
          "time", 
          "operates", 
          "sort", 
          "descent", 
          "adaptation", 
          "different times", 
          "power", 
          "strategies", 
          "similar strategies", 
          "dimensions", 
          "threshold", 
          "respect", 
          "weight", 
          "form", 
          "spikes", 
          "modification", 
          "neurons", 
          "drop", 
          "signs", 
          "differences", 
          "stimuli", 
          "multilayers", 
          "S4NN algorithm", 
          "adaptation of backpropagation", 
          "BS4NN", 
          "modification of S4NN", 
          "S4NN", 
          "computation footprints", 
          "BS4NN operates", 
          "negligible accuracy drop", 
          "simple BNN", 
          "Binarized"
        ], 
        "name": "BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning", 
        "pagination": "1-19", 
        "productId": [
          {
            "name": "dimensions_id", 
            "type": "PropertyValue", 
            "value": [
              "pub.1142516805"
            ]
          }, 
          {
            "name": "doi", 
            "type": "PropertyValue", 
            "value": [
              "10.1007/s11063-021-10680-x"
            ]
          }
        ], 
        "sameAs": [
          "https://doi.org/10.1007/s11063-021-10680-x", 
          "https://app.dimensions.ai/details/publication/pub.1142516805"
        ], 
        "sdDataset": "articles", 
        "sdDatePublished": "2022-01-01T19:04", 
        "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
        "sdPublisher": {
          "name": "Springer Nature - SN SciGraph project", 
          "type": "Organization"
        }, 
        "sdSource": "s3://com-springernature-scigraph/baseset/20220101/entities/gbq_results/article/article_913.jsonl", 
        "type": "ScholarlyArticle", 
        "url": "https://doi.org/10.1007/s11063-021-10680-x"
      }
    ]
     

    Download the RDF metadata as: JSON-LD, N-Triples, Turtle, or RDF/XML (see license info).

    HOW TO GET THIS DATA PROGRAMMATICALLY:

    JSON-LD is a popular format for linked data which is fully compatible with JSON.

    curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s11063-021-10680-x'

    N-Triples is a line-based linked data format ideal for batch operations.

    curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s11063-021-10680-x'

    Turtle is a human-readable linked data format.

    curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s11063-021-10680-x'

    RDF/XML is a standard XML format for linked data.

    curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s11063-021-10680-x'
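The JSON-LD returned by the calls above is ordinary JSON, so it can be processed with standard tooling. As a sketch, the snippet below parses a trimmed copy of this record (most fields omitted for brevity; the field names match the full record shown earlier) and extracts the title, authors, and DOI URL:

```python
import json

# A trimmed copy of the SciGraph JSON-LD record shown above.
# Note the \u00e9 escape: JSON decoders turn it back into "é",
# just as they would for the raw API response.
raw = """
[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json",
    "id": "sg:pub.10.1007/s11063-021-10680-x",
    "name": "BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning",
    "datePublished": "2021-11-10",
    "author": [
      {"givenName": "Saeed Reza", "familyName": "Kheradpisheh", "type": "Person"},
      {"givenName": "Maryam", "familyName": "Mirsadeghi", "type": "Person"},
      {"givenName": "Timoth\\u00e9e", "familyName": "Masquelier", "type": "Person"}
    ],
    "url": "https://doi.org/10.1007/s11063-021-10680-x"
  }
]
"""

pub = json.loads(raw)[0]  # the payload is a one-element list
authors = [f"{a['givenName']} {a['familyName']}" for a in pub["author"]]

print(pub["name"])
print("; ".join(authors))
print(pub["url"])
```

For full JSON-LD semantics (expansion, context resolution) a dedicated library would be needed, but for simple field extraction plain JSON handling suffices.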


     

    This table displays all metadata directly associated to this object as RDF triples.

    159 TRIPLES      22 PREDICATES      98 URIs      86 LITERALS      4 BLANK NODES

    Subject Predicate Object
    1 sg:pub.10.1007/s11063-021-10680-x schema:about anzsrc-for:08
    2 anzsrc-for:0801
    3 schema:author Nea32edf761a14828a6593af3f6de05e0
    4 schema:citation sg:pub.10.1007/978-3-319-46493-0_32
    5 sg:pub.10.1007/978-3-642-21735-7_8
    6 sg:pub.10.1007/s11227-019-02881-y
    7 sg:pub.10.1038/s41586-019-1677-2
    8 schema:datePublished 2021-11-10
    9 schema:datePublishedReg 2021-11-10
    10 schema:description We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding. With this coding scheme, neurons fire at most once per stimulus, but the firing order carries information. Here, we introduce BS4NN, a modification of S4NN in which the synaptic weights are constrained to be binary (+1 or − 1), in order to decrease memory (ideally, one bit per synapse) and computation footprints. This was done using two sets of weights: firstly, real-valued weights, updated by gradient descent, and used in the backward pass of backpropagation, and secondly, their signs, used in the forward pass. Similar strategies have been used to train (non-spiking) binarized neural networks. The main difference is that BS4NN operates in the time domain: spikes are propagated sequentially, and different neurons may reach their threshold at different times, which increases computational power. We validated BS4NN on two popular benchmarks, MNIST and Fashion-MNIST, and obtained reasonable accuracies for this sort of network (97.0% and 87.3% respectively) with a negligible accuracy drop with respect to real-valued weights (0.4% and 0.7%, respectively). We also demonstrated that BS4NN outperforms a simple BNN with the same architectures on those two datasets (by 0.2% and 0.9% respectively), presumably because it leverages the temporal dimension.
    11 schema:genre article
    12 schema:inLanguage en
    13 schema:isAccessibleForFree true
    14 schema:isPartOf sg:journal.1132792
    15 schema:keywords BNN
    16 BS4NN
    17 BS4NN operates
    18 Binarized
    19 Fashion-MNIST
    20 MNIST
    21 S4NN
    22 S4NN algorithm
    23 accuracy
    24 accuracy drop
    25 adaptation
    26 adaptation of backpropagation
    27 algorithm
    28 architecture
    29 backpropagation
    30 backward pass
    31 benchmarks
    32 coding
    33 computation footprints
    34 computational power
    35 dataset
    36 descent
    37 differences
    38 different neurons
    39 different times
    40 dimensions
    41 domain
    42 drop
    43 fire neurons
    44 firing order
    45 footprint
    46 form
    47 forward pass
    48 gradient descent
    49 information
    50 main difference
    51 memory
    52 modification
    53 modification of S4NN
    54 multilayers
    55 negligible accuracy drop
    56 network
    57 neural network
    58 neurons
    59 operates
    60 order
    61 pass
    62 popular benchmarks
    63 power
    64 reasonable accuracy
    65 respect
    66 same architecture
    67 scheme
    68 set
    69 set of weights
    70 signs
    71 similar strategies
    72 simple BNN
    73 sort
    74 sort of network
    75 spike coding
    76 spikes
    77 stimuli
    78 strategies
    79 synaptic weights
    80 temporal coding
    81 temporal dimension
    82 threshold
    83 time
    84 time domain
    85 weight
    86 schema:name BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning
    87 schema:pagination 1-19
    88 schema:productId N50f09288f93d47b3b49de667578cb258
    89 N8d64c5ff6e6948b88f4ebd0443bc6396
    90 schema:sameAs https://app.dimensions.ai/details/publication/pub.1142516805
    91 https://doi.org/10.1007/s11063-021-10680-x
    92 schema:sdDatePublished 2022-01-01T19:04
    93 schema:sdLicense https://scigraph.springernature.com/explorer/license/
    94 schema:sdPublisher N04afe369b4f748a3ba5a474e3124da1b
    95 schema:url https://doi.org/10.1007/s11063-021-10680-x
    96 sgo:license sg:explorer/license/
    97 sgo:sdDataset articles
    98 rdf:type schema:ScholarlyArticle
    99 N043e84d5d2b94ddba292aead5152de0d rdf:first sg:person.011467610503.31
    100 rdf:rest Na257780f14604f94913894755bc85455
    101 N04afe369b4f748a3ba5a474e3124da1b schema:name Springer Nature - SN SciGraph project
    102 rdf:type schema:Organization
    103 N50f09288f93d47b3b49de667578cb258 schema:name doi
    104 schema:value 10.1007/s11063-021-10680-x
    105 rdf:type schema:PropertyValue
    106 N8d64c5ff6e6948b88f4ebd0443bc6396 schema:name dimensions_id
    107 schema:value pub.1142516805
    108 rdf:type schema:PropertyValue
    109 Na257780f14604f94913894755bc85455 rdf:first sg:person.01271016666.11
    110 rdf:rest rdf:nil
    111 Nea32edf761a14828a6593af3f6de05e0 rdf:first sg:person.01030064000.31
    112 rdf:rest N043e84d5d2b94ddba292aead5152de0d
    113 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
    114 schema:name Information and Computing Sciences
    115 rdf:type schema:DefinedTerm
    116 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
    117 schema:name Artificial Intelligence and Image Processing
    118 rdf:type schema:DefinedTerm
    119 sg:journal.1132792 schema:issn 1370-4621
    120 1573-773X
    121 schema:name Neural Processing Letters
    122 schema:publisher Springer Nature
    123 rdf:type schema:Periodical
    124 sg:person.01030064000.31 schema:affiliation grid-institutes:grid.412502.0
    125 schema:familyName Kheradpisheh
    126 schema:givenName Saeed Reza
    127 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01030064000.31
    128 rdf:type schema:Person
    129 sg:person.011467610503.31 schema:affiliation grid-institutes:grid.411368.9
    130 schema:familyName Mirsadeghi
    131 schema:givenName Maryam
    132 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011467610503.31
    133 rdf:type schema:Person
    134 sg:person.01271016666.11 schema:affiliation grid-institutes:grid.4444.0
    135 schema:familyName Masquelier
    136 schema:givenName Timothée
    137 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01271016666.11
    138 rdf:type schema:Person
    139 sg:pub.10.1007/978-3-319-46493-0_32 schema:sameAs https://app.dimensions.ai/details/publication/pub.1005632531
    140 https://doi.org/10.1007/978-3-319-46493-0_32
    141 rdf:type schema:CreativeWork
    142 sg:pub.10.1007/978-3-642-21735-7_8 schema:sameAs https://app.dimensions.ai/details/publication/pub.1033736046
    143 https://doi.org/10.1007/978-3-642-21735-7_8
    144 rdf:type schema:CreativeWork
    145 sg:pub.10.1007/s11227-019-02881-y schema:sameAs https://app.dimensions.ai/details/publication/pub.1114199837
    146 https://doi.org/10.1007/s11227-019-02881-y
    147 rdf:type schema:CreativeWork
    148 sg:pub.10.1038/s41586-019-1677-2 schema:sameAs https://app.dimensions.ai/details/publication/pub.1122923895
    149 https://doi.org/10.1038/s41586-019-1677-2
    150 rdf:type schema:CreativeWork
    151 grid-institutes:grid.411368.9 schema:alternateName Department of Electrical Engineering, Amirkabir University of Technology, Tehran, Iran
    152 schema:name Department of Electrical Engineering, Amirkabir University of Technology, Tehran, Iran
    153 rdf:type schema:Organization
    154 grid-institutes:grid.412502.0 schema:alternateName Department of Computer and Data Sciences, Faculty of Mathematical Sciences, Shahid Beheshti University, Tehran, Iran
    155 schema:name Department of Computer and Data Sciences, Faculty of Mathematical Sciences, Shahid Beheshti University, Tehran, Iran
    156 rdf:type schema:Organization
    157 grid-institutes:grid.4444.0 schema:alternateName CerCo UMR 5549, CNRS, Université Toulouse 3, Toulouse, France
    158 schema:name CerCo UMR 5549, CNRS, Université Toulouse 3, Toulouse, France
    159 rdf:type schema:Organization
     



