Observing Dynamic Urban Environment through Stereo-Vision Based Dynamic Occupancy Grid Mapping


Ontology type: schema:Chapter      Open Access: True


Chapter Info

DATE

2013

AUTHORS

You Li , Yassine Ruichek

ABSTRACT

Occupancy grid maps are popular tools for representing the surrounding environment of mobile robots and intelligent vehicles. When moving in a dynamic real-world environment, traditional occupancy grid mapping is required not only to detect occupied areas, but also to understand the dynamic circumstances. This paper addresses the issue by presenting a stereo-vision based framework for creating dynamic occupancy grid maps, intended for intelligent vehicles. In the proposed framework, sparse feature-point matching and dense stereo matching are performed in parallel for each stereo image pair. The former process is used to analyze the motion of the vehicle itself as well as that of surrounding moving objects. The latter process computes a dense disparity image, together with U-V disparity maps that are applied for pixel-wise moving-object segmentation and dynamic occupancy grid mapping. The principal advantage of the proposed framework is its ability to map occupied areas and moving objects at the same time. Moreover, compared with some existing methods, the stereo-vision based occupancy grid mapping algorithm is improved. The proposed method is verified on real datasets acquired by our platform SeT-Car.
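The U-V disparity representation mentioned in the abstract can be sketched generically: each map is a per-column (U) or per-row (V) histogram of disparity values, in which obstacles and the ground plane appear as characteristic lines. The snippet below is an illustrative sketch only, not the authors' implementation; the function name and toy data are invented for the example.

```python
# Generic sketch of U- and V-disparity maps: per-column and per-row
# histograms of an integer disparity image. Not the authors' code.

def uv_disparity(disparity, max_d):
    """Build U- and V-disparity histograms from an integer disparity image.

    disparity: 2-D list of ints in [0, max_d); negative values mark invalid pixels.
    Returns (u_disp, v_disp), where u_disp[d][u] counts pixels in column u
    with disparity d, and v_disp[v][d] counts pixels in row v with disparity d.
    """
    rows, cols = len(disparity), len(disparity[0])
    u_disp = [[0] * cols for _ in range(max_d)]
    v_disp = [[0] * max_d for _ in range(rows)]
    for v in range(rows):
        for u in range(cols):
            d = disparity[v][u]
            if 0 <= d < max_d:  # skip invalid pixels
                u_disp[d][u] += 1
                v_disp[v][d] += 1
    return u_disp, v_disp

# Toy 3x4 disparity image: a fronto-parallel obstacle at disparity 2
# occupies the right two columns; the ground sweeps over 0..1 elsewhere.
disp = [
    [0, 0, 2, 2],
    [1, 1, 2, 2],
    [1, 1, 2, 2],
]
u, v = uv_disparity(disp, max_d=3)
print(u[2])  # [0, 0, 3, 3] -> the obstacle shows up as a line in U-disparity
```

In a real pipeline the obstacle and ground lines would then be extracted from these histograms (e.g. by thresholding or Hough-style line fitting) before segmentation.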

PAGES

379-388

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/978-3-642-41184-7_39

DOI

http://dx.doi.org/10.1007/978-3-642-41184-7_39

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1014475735



JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool.

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "Institut de Recherche sur les Transports, l\u2019Energie et la Soci\u00e9t\u00e9, le laboratoire Syst\u00e8mes et Transport (IRTES-SET), Universit\u00e9 de Technology of Belfort-Montb\u00e9liard, 90010, Belfort, France", 
          "id": "http://www.grid.ac/institutes/grid.509737.f", 
          "name": [
            "Institut de Recherche sur les Transports, l\u2019Energie et la Soci\u00e9t\u00e9, le laboratoire Syst\u00e8mes et Transport (IRTES-SET), Universit\u00e9 de Technology of Belfort-Montb\u00e9liard, 90010, Belfort, France"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Li", 
        "givenName": "You", 
        "id": "sg:person.010555603471.48", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010555603471.48"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Institut de Recherche sur les Transports, l\u2019Energie et la Soci\u00e9t\u00e9, le laboratoire Syst\u00e8mes et Transport (IRTES-SET), Universit\u00e9 de Technology of Belfort-Montb\u00e9liard, 90010, Belfort, France", 
          "id": "http://www.grid.ac/institutes/grid.509737.f", 
          "name": [
            "Institut de Recherche sur les Transports, l\u2019Energie et la Soci\u00e9t\u00e9, le laboratoire Syst\u00e8mes et Transport (IRTES-SET), Universit\u00e9 de Technology of Belfort-Montb\u00e9liard, 90010, Belfort, France"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Ruichek", 
        "givenName": "Yassine", 
        "id": "sg:person.012444646701.78", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012444646701.78"
        ], 
        "type": "Person"
      }
    ], 
    "datePublished": "2013", 
    "datePublishedReg": "2013-01-01", 
    "description": "Occupancy grid maps are popular tools of representing surrounding environments for mobile robots/ intelligent vehicles. When moving in dynamic real world, traditional occupancy grid mapping is required not only to be able to detect occupied areas, but also to be able to understand the dynamic circumstance. The paper addresses this issue by presenting a stereo-vision based framework to create dynamic occupancy grid map, for the purpose of intelligent vehicle. In the proposed framework, a sparse feature points matching and a dense stereo matching are performed in parallel for each stereo image pair. The former process is used to analyze motions of the vehicle itself and also surrounding moving objects. The latter process calculates dense disparity image, as well as U-V disparity maps applied for pixel-wise moving objects segmentation and dynamic occupancy grid mapping. Principal advantage of the proposed framework is the ability of mapping occupied areas and moving objects at the same time. Meanwhile, compared with some existing methods, the stereo-vision based occupancy grid mapping algorithm is improved. The proposed method is verified in real datasets acquired by our platform SeT-Car.", 
    "editor": [
      {
        "familyName": "Petrosino", 
        "givenName": "Alfredo", 
        "type": "Person"
      }
    ], 
    "genre": "chapter", 
    "id": "sg:pub.10.1007/978-3-642-41184-7_39", 
    "inLanguage": "en", 
    "isAccessibleForFree": true, 
    "isPartOf": {
      "isbn": [
        "978-3-642-41183-0", 
        "978-3-642-41184-7"
      ], 
      "name": "Image Analysis and Processing \u2013 ICIAP 2013", 
      "type": "Book"
    }, 
    "keywords": [
      "occupancy grid mapping", 
      "occupancy grid map", 
      "grid mapping", 
      "intelligent vehicles", 
      "grid map", 
      "occupancy grid mapping algorithm", 
      "dynamic occupancy grid map", 
      "dense disparity image", 
      "Moving Object Segmentation", 
      "ability of mapping", 
      "dynamic real world", 
      "grid mapping algorithm", 
      "stereo image pairs", 
      "dense stereo matching", 
      "sparse feature points", 
      "traditional occupancy grid mapping", 
      "dynamic urban environment", 
      "disparity image", 
      "object segmentation", 
      "stereo matching", 
      "disparity map", 
      "feature points", 
      "real datasets", 
      "image pairs", 
      "mapping algorithm", 
      "real world", 
      "dynamic circumstances", 
      "popular tool", 
      "framework", 
      "objects", 
      "vehicle", 
      "urban environment", 
      "segmentation", 
      "mapping", 
      "algorithm", 
      "environment", 
      "same time", 
      "maps", 
      "datasets", 
      "platform", 
      "matching", 
      "images", 
      "principal advantage", 
      "tool", 
      "method", 
      "advantages", 
      "issues", 
      "occupied area", 
      "process", 
      "parallel", 
      "motion", 
      "world", 
      "point", 
      "area", 
      "former process", 
      "time", 
      "purpose", 
      "ability", 
      "pairs", 
      "latter process", 
      "circumstances", 
      "paper", 
      "mobile robots/ intelligent vehicles", 
      "robots/ intelligent vehicles", 
      "pixel-wise moving objects segmentation", 
      "dynamic occupancy grid mapping", 
      "Stereo-Vision Based Dynamic Occupancy Grid Mapping", 
      "Based Dynamic Occupancy Grid Mapping"
    ], 
    "name": "Observing Dynamic Urban Environment through Stereo-Vision Based Dynamic Occupancy Grid Mapping", 
    "pagination": "379-388", 
    "productId": [
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1014475735"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/978-3-642-41184-7_39"
        ]
      }
    ], 
    "publisher": {
      "name": "Springer Nature", 
      "type": "Organisation"
    }, 
    "sameAs": [
      "https://doi.org/10.1007/978-3-642-41184-7_39", 
      "https://app.dimensions.ai/details/publication/pub.1014475735"
    ], 
    "sdDataset": "chapters", 
    "sdDatePublished": "2021-11-01T18:49", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-springernature-scigraph/baseset/20211101/entities/gbq_results/chapter/chapter_179.jsonl", 
    "type": "Chapter", 
    "url": "https://doi.org/10.1007/978-3-642-41184-7_39"
  }
]
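Because the record above is plain JSON-LD, the Python standard library's `json` module is enough to pull out the fields of interest. The sketch below parses a trimmed copy of the record (values copied from above) to show where the title, authors, and DOI live in the structure:

```python
import json

# A trimmed copy of the SciGraph JSON-LD record shown above.
record_jsonld = """
[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json",
    "name": "Observing Dynamic Urban Environment through Stereo-Vision Based Dynamic Occupancy Grid Mapping",
    "datePublished": "2013",
    "pagination": "379-388",
    "author": [
      {"familyName": "Li", "givenName": "You", "type": "Person"},
      {"familyName": "Ruichek", "givenName": "Yassine", "type": "Person"}
    ],
    "productId": [
      {"name": "doi", "type": "PropertyValue", "value": ["10.1007/978-3-642-41184-7_39"]}
    ]
  }
]
"""

# The document is a JSON array with one chapter object.
chapter = json.loads(record_jsonld)[0]

title = chapter["name"]
authors = [f'{a["givenName"]} {a["familyName"]}' for a in chapter["author"]]
# productId is a list of PropertyValue objects; pick out the DOI entry.
doi = next(p["value"][0] for p in chapter["productId"] if p["name"] == "doi")

print(title)
print(", ".join(authors))  # You Li, Yassine Ruichek
print(doi)                 # 10.1007/978-3-642-41184-7_39
```

For full JSON-LD processing (context expansion, framing), a dedicated library such as `pyld` would be needed; plain `json` suffices for field extraction as shown.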
 

The RDF metadata is available for download as JSON-LD, N-Triples, Turtle, or RDF/XML.

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data that is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-41184-7_39'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-41184-7_39'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-41184-7_39'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-41184-7_39'
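The same content negotiation the curl commands perform can be done from Python's standard library by setting the Accept header on the request. A minimal sketch follows; the actual fetch is commented out so the snippet stands alone without network access:

```python
import urllib.request

RECORD_URL = "https://scigraph.springernature.com/pub.10.1007/978-3-642-41184-7_39"

# The four serializations listed above, mapped to their MIME types.
FORMATS = {
    "json-ld": "application/ld+json",
    "n-triples": "application/n-triples",
    "turtle": "text/turtle",
    "rdf/xml": "application/rdf+xml",
}

def build_request(fmt: str) -> urllib.request.Request:
    """Build a GET request asking the server for the given RDF serialization."""
    return urllib.request.Request(RECORD_URL, headers={"Accept": FORMATS[fmt]})

req = build_request("turtle")
print(req.get_header("Accept"))  # text/turtle

# To actually fetch the record (requires network access):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode("utf-8"))
```

The server inspects the Accept header and returns the matching serialization, exactly as in the curl examples above.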


 

 



