Group Pruning Using a Bounded-ℓp Norm for Group Gating and Regularization


Ontology type: schema:Chapter     


Chapter Info

DATE

2019-10-25

AUTHORS

Chaithanya Kumar Mummadi , Tim Genewein , Dan Zhang , Thomas Brox , Volker Fischer

ABSTRACT

Deep neural networks achieve state-of-the-art results on several tasks while increasing in complexity. It has been shown that neural networks can be pruned during training by imposing sparsity-inducing regularizers. In this paper, we investigate two techniques for group-wise pruning during training in order to improve network efficiency. We propose a gating factor after every convolutional layer to induce channel-level sparsity, encouraging insignificant channels to become exactly zero. Further, we introduce and analyse a bounded variant of the ℓ1 regularizer, which interpolates between the ℓ1 and ℓ0 norms to retain the performance of the network at higher pruning rates.
To underline the effectiveness of the proposed methods, we show that the number of parameters of ResNet-164, DenseNet-40 and MobileNetV2 can be reduced by 30%, 69%, and 75% respectively on CIFAR100 without a significant drop in accuracy. We achieve state-of-the-art pruning results for ResNet-50 with higher accuracy on ImageNet. Furthermore, we show that the lightweight MobileNetV2 can be compressed further on ImageNet without a significant drop in performance.

PAGES

139-155

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/978-3-030-33676-9_10

DOI

http://dx.doi.org/10.1007/978-3-030-33676-9_10

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1122088281



JSON-LD is the canonical representation for SciGraph data.


[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "University of Freiburg, Freiburg im Breisgau, Germany", 
          "id": "http://www.grid.ac/institutes/grid.5963.9", 
          "name": [
            "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany", 
            "University of Freiburg, Freiburg im Breisgau, Germany"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Mummadi", 
        "givenName": "Chaithanya Kumar", 
        "id": "sg:person.011327254760.54", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011327254760.54"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany", 
          "id": "http://www.grid.ac/institutes/grid.6584.f", 
          "name": [
            "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Genewein", 
        "givenName": "Tim", 
        "id": "sg:person.01256112710.60", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01256112710.60"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany", 
          "id": "http://www.grid.ac/institutes/grid.6584.f", 
          "name": [
            "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Zhang", 
        "givenName": "Dan", 
        "id": "sg:person.016177345541.36", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.016177345541.36"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "University of Freiburg, Freiburg im Breisgau, Germany", 
          "id": "http://www.grid.ac/institutes/grid.5963.9", 
          "name": [
            "University of Freiburg, Freiburg im Breisgau, Germany"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Brox", 
        "givenName": "Thomas", 
        "id": "sg:person.012443225372.65", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012443225372.65"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany", 
          "id": "http://www.grid.ac/institutes/grid.6584.f", 
          "name": [
            "Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Fischer", 
        "givenName": "Volker", 
        "id": "sg:person.012152667524.49", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012152667524.49"
        ], 
        "type": "Person"
      }
    ], 
    "datePublished": "2019-10-25", 
    "datePublishedReg": "2019-10-25", 
    "description": "Deep neural networks achieve state-of-the-art results on several tasks while increasing in complexity. It has been shown that neural networks can be pruned during training by imposing sparsity inducing regularizers. In this paper, we investigate two techniques for group-wise pruning during training in order to improve network efficiency. We propose a gating factor after every convolutional layer to induce channel level sparsity, encouraging insignificant channels to become exactly zero. Further, we introduce and analyse a bounded variant of the $\\ell_1$ regularizer, which interpolates between $\\ell_1$ and $\\ell_0$-norms to retain performance of the network at higher pruning rates. To underline effectiveness of the proposed methods, we show that the number of parameters of ResNet-164, DenseNet-40 and MobileNetV2 can be reduced down by 30%, 69%, and 75% on CIFAR100 respectively without a significant drop in accuracy. We achieve state-of-the-art pruning results for ResNet-50 with higher accuracy on ImageNet. Furthermore, we show that the light weight MobileNetV2 can further be compressed on ImageNet without a significant drop in performance.", 
    "editor": [
      {
        "familyName": "Fink", 
        "givenName": "Gernot A.", 
        "type": "Person"
      }, 
      {
        "familyName": "Frintrop", 
        "givenName": "Simone", 
        "type": "Person"
      }, 
      {
        "familyName": "Jiang", 
        "givenName": "Xiaoyi", 
        "type": "Person"
      }
    ], 
    "genre": "chapter", 
    "id": "sg:pub.10.1007/978-3-030-33676-9_10", 
    "isAccessibleForFree": false, 
    "isPartOf": {
      "isbn": [
        "978-3-030-33675-2", 
        "978-3-030-33676-9"
      ], 
      "name": "Pattern Recognition", 
      "type": "Book"
    }, 
    "keywords": [
      "neural network", 
      "deep neural networks", 
      "channel-level sparsity", 
      "higher pruning rate", 
      "convolutional layers", 
      "ResNet-164", 
      "ResNet-50", 
      "art results", 
      "insignificant channels", 
      "DenseNet-40", 
      "pruning rate", 
      "network efficiency", 
      "number of parameters", 
      "MobileNetV2", 
      "ImageNet", 
      "network", 
      "high accuracy", 
      "regularizer", 
      "pruning", 
      "accuracy", 
      "gating factor", 
      "sparsity", 
      "task", 
      "performance", 
      "complexity", 
      "training", 
      "regularization", 
      "effectiveness", 
      "technique", 
      "efficiency", 
      "state", 
      "results", 
      "order", 
      "method", 
      "channels", 
      "number", 
      "analyse", 
      "variants", 
      "parameters", 
      "layer", 
      "norms", 
      "significant drop", 
      "rate", 
      "drop", 
      "gating", 
      "factors", 
      "paper"
    ], 
    "name": "Group Pruning Using a Bounded-\u2113p Norm for Group Gating and Regularization", 
    "pagination": "139-155", 
    "productId": [
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1122088281"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/978-3-030-33676-9_10"
        ]
      }
    ], 
    "publisher": {
      "name": "Springer Nature", 
      "type": "Organisation"
    }, 
    "sameAs": [
      "https://doi.org/10.1007/978-3-030-33676-9_10", 
      "https://app.dimensions.ai/details/publication/pub.1122088281"
    ], 
    "sdDataset": "chapters", 
    "sdDatePublished": "2022-10-01T06:58", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-springernature-scigraph/baseset/20221001/entities/gbq_results/chapter/chapter_380.jsonl", 
    "type": "Chapter", 
    "url": "https://doi.org/10.1007/978-3-030-33676-9_10"
  }
]
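Since the record above is plain JSON-LD, it can be consumed with any JSON parser. The following sketch (not an official SciGraph client; it embeds a trimmed copy of the record for illustration) shows how to pull a few fields out of it:

```python
import json

# Trimmed copy of the JSON-LD record above, embedded for illustration.
record_json = """
[
  {
    "author": [
      {"familyName": "Mummadi", "givenName": "Chaithanya Kumar", "type": "Person"},
      {"familyName": "Genewein", "givenName": "Tim", "type": "Person"}
    ],
    "datePublished": "2019-10-25",
    "name": "Group Pruning Using a Bounded-\\u2113p Norm for Group Gating and Regularization"
  }
]
"""

def author_names(records):
    """Return 'Given Family' strings for every author in the first record."""
    chapter = records[0]
    return [f"{a['givenName']} {a['familyName']}" for a in chapter.get("author", [])]

records = json.loads(record_json)
print(author_names(records))        # ['Chaithanya Kumar Mummadi', 'Tim Genewein']
print(records[0]["datePublished"])  # 2019-10-25
```

The same access pattern applies to the full record, e.g. `chapter["isPartOf"]["isbn"]` for the book ISBNs.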
 


HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/978-3-030-33676-9_10'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/978-3-030-33676-9_10'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/978-3-030-33676-9_10'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/978-3-030-33676-9_10'
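The same content negotiation can be done from Python. This is a minimal sketch assuming only the four MIME types listed above; it builds the request without sending it, so the final fetch line is left as a comment:

```python
import urllib.request

# Map of serialization name -> MIME type, as listed in the curl examples above.
ACCEPT_TYPES = {
    "json-ld": "application/ld+json",
    "nt":      "application/n-triples",
    "turtle":  "text/turtle",
    "rdf-xml": "application/rdf+xml",
}

RECORD_URL = "https://scigraph.springernature.com/pub.10.1007/978-3-030-33676-9_10"

def build_request(fmt: str, url: str = RECORD_URL) -> urllib.request.Request:
    """Build a GET request whose Accept header selects the RDF serialization."""
    if fmt not in ACCEPT_TYPES:
        raise ValueError(f"unknown format: {fmt!r}")
    return urllib.request.Request(url, headers={"Accept": ACCEPT_TYPES[fmt]})

req = build_request("turtle")
print(req.get_header("Accept"))  # text/turtle
# To actually fetch the data: urllib.request.urlopen(req).read()
```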


 

This table displays all metadata directly associated to this object as RDF triples.

148 TRIPLES      22 PREDICATES      71 URIs      64 LITERALS      7 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/978-3-030-33676-9_10 schema:about anzsrc-for:08
2 anzsrc-for:0801
3 schema:author Nd984300c1d5a4c1993946a1e9bd90b3e
4 schema:datePublished 2019-10-25
5 schema:datePublishedReg 2019-10-25
6 schema:description Deep neural networks achieve state-of-the-art results on several tasks while increasing in complexity. It has been shown that neural networks can be pruned during training by imposing sparsity inducing regularizers. In this paper, we investigate two techniques for group-wise pruning during training in order to improve network efficiency. We propose a gating factor after every convolutional layer to induce channel level sparsity, encouraging insignificant channels to become exactly zero. Further, we introduce and analyse a bounded variant of the ℓ1 regularizer, which interpolates between ℓ1 and ℓ0-norms to retain performance of the network at higher pruning rates. To underline effectiveness of the proposed methods, we show that the number of parameters of ResNet-164, DenseNet-40 and MobileNetV2 can be reduced down by 30%, 69%, and 75% on CIFAR100 respectively without a significant drop in accuracy. We achieve state-of-the-art pruning results for ResNet-50 with higher accuracy on ImageNet. Furthermore, we show that the light weight MobileNetV2 can further be compressed on ImageNet without a significant drop in performance.
7 schema:editor Ne8acfc0df62a4495a64330f764c1b1a7
8 schema:genre chapter
9 schema:isAccessibleForFree false
10 schema:isPartOf N60cfcdeb0123492487db70e7791b7cb3
11 schema:keywords DenseNet-40
12 ImageNet
13 MobileNetV2
14 ResNet-164
15 ResNet-50
16 accuracy
17 analyse
18 art results
19 channel-level sparsity
20 channels
21 complexity
22 convolutional layers
23 deep neural networks
24 drop
25 effectiveness
26 efficiency
27 factors
28 gating
29 gating factor
30 high accuracy
31 higher pruning rate
32 insignificant channels
33 layer
34 method
35 network
36 network efficiency
37 neural network
38 norms
39 number
40 number of parameters
41 order
42 paper
43 parameters
44 performance
45 pruning
46 pruning rate
47 rate
48 regularization
49 regularizer
50 results
51 significant drop
52 sparsity
53 state
54 task
55 technique
56 training
57 variants
58 schema:name Group Pruning Using a Bounded-ℓp Norm for Group Gating and Regularization
59 schema:pagination 139-155
60 schema:productId N41b87deeac9d4326b406097252d2b1cf
61 Nfa1551cd860842749f06a7c9fa8c9163
62 schema:publisher Nf5bfd0e87dc74a78a48c25498dd1fa7e
63 schema:sameAs https://app.dimensions.ai/details/publication/pub.1122088281
64 https://doi.org/10.1007/978-3-030-33676-9_10
65 schema:sdDatePublished 2022-10-01T06:58
66 schema:sdLicense https://scigraph.springernature.com/explorer/license/
67 schema:sdPublisher N46dbd908f38548ad868f8940cdf69d29
68 schema:url https://doi.org/10.1007/978-3-030-33676-9_10
69 sgo:license sg:explorer/license/
70 sgo:sdDataset chapters
71 rdf:type schema:Chapter
72 N245050dbae1f43a19f08bb047cf6e337 schema:familyName Frintrop
73 schema:givenName Simone
74 rdf:type schema:Person
75 N29e55ac9adbf4c52b936bcf3a5dd5eaf rdf:first N245050dbae1f43a19f08bb047cf6e337
76 rdf:rest N4fe4180e0dfd4f69a1f1ffc72cbc52a3
77 N3a0c678cf09341f7a4f2dd31547e7641 rdf:first sg:person.012443225372.65
78 rdf:rest Nfd80ad2a082544649a8de134acd3c6fc
79 N3fd0fdb8dc8c4ebfa3b7b330f49cce6f rdf:first sg:person.01256112710.60
80 rdf:rest Ne000ea7abfd144c8af6af7bbd7f0330b
81 N41b87deeac9d4326b406097252d2b1cf schema:name doi
82 schema:value 10.1007/978-3-030-33676-9_10
83 rdf:type schema:PropertyValue
84 N46dbd908f38548ad868f8940cdf69d29 schema:name Springer Nature - SN SciGraph project
85 rdf:type schema:Organization
86 N4fe4180e0dfd4f69a1f1ffc72cbc52a3 rdf:first N66a92c34daac4e70a33b73c0b57f1625
87 rdf:rest rdf:nil
88 N60cfcdeb0123492487db70e7791b7cb3 schema:isbn 978-3-030-33675-2
89 978-3-030-33676-9
90 schema:name Pattern Recognition
91 rdf:type schema:Book
92 N66a92c34daac4e70a33b73c0b57f1625 schema:familyName Jiang
93 schema:givenName Xiaoyi
94 rdf:type schema:Person
95 Nbd0ea5545bf14e3f88e532f800c0bc9e schema:familyName Fink
96 schema:givenName Gernot A.
97 rdf:type schema:Person
98 Nd984300c1d5a4c1993946a1e9bd90b3e rdf:first sg:person.011327254760.54
99 rdf:rest N3fd0fdb8dc8c4ebfa3b7b330f49cce6f
100 Ne000ea7abfd144c8af6af7bbd7f0330b rdf:first sg:person.016177345541.36
101 rdf:rest N3a0c678cf09341f7a4f2dd31547e7641
102 Ne8acfc0df62a4495a64330f764c1b1a7 rdf:first Nbd0ea5545bf14e3f88e532f800c0bc9e
103 rdf:rest N29e55ac9adbf4c52b936bcf3a5dd5eaf
104 Nf5bfd0e87dc74a78a48c25498dd1fa7e schema:name Springer Nature
105 rdf:type schema:Organisation
106 Nfa1551cd860842749f06a7c9fa8c9163 schema:name dimensions_id
107 schema:value pub.1122088281
108 rdf:type schema:PropertyValue
109 Nfd80ad2a082544649a8de134acd3c6fc rdf:first sg:person.012152667524.49
110 rdf:rest rdf:nil
111 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
112 schema:name Information and Computing Sciences
113 rdf:type schema:DefinedTerm
114 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
115 schema:name Artificial Intelligence and Image Processing
116 rdf:type schema:DefinedTerm
117 sg:person.011327254760.54 schema:affiliation grid-institutes:grid.5963.9
118 schema:familyName Mummadi
119 schema:givenName Chaithanya Kumar
120 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011327254760.54
121 rdf:type schema:Person
122 sg:person.012152667524.49 schema:affiliation grid-institutes:grid.6584.f
123 schema:familyName Fischer
124 schema:givenName Volker
125 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012152667524.49
126 rdf:type schema:Person
127 sg:person.012443225372.65 schema:affiliation grid-institutes:grid.5963.9
128 schema:familyName Brox
129 schema:givenName Thomas
130 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012443225372.65
131 rdf:type schema:Person
132 sg:person.01256112710.60 schema:affiliation grid-institutes:grid.6584.f
133 schema:familyName Genewein
134 schema:givenName Tim
135 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01256112710.60
136 rdf:type schema:Person
137 sg:person.016177345541.36 schema:affiliation grid-institutes:grid.6584.f
138 schema:familyName Zhang
139 schema:givenName Dan
140 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.016177345541.36
141 rdf:type schema:Person
142 grid-institutes:grid.5963.9 schema:alternateName University of Freiburg, Freiburg im Breisgau, Germany
143 schema:name Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany
144 University of Freiburg, Freiburg im Breisgau, Germany
145 rdf:type schema:Organization
146 grid-institutes:grid.6584.f schema:alternateName Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany
147 schema:name Bosch Center for Artificial Intelligence, Robert Bosch GmbH, Renningen, Germany
148 rdf:type schema:Organization
 



