Motion-estimation-based stabilization of infrared video


Ontology type: schema:ScholarlyArticle     


Article Info

DATE

2017-03-30

AUTHORS

Seokhoon Kang, Chanhyuk Park

ABSTRACT

During the filming of infrared (IR) video, intrinsic equipment instability introduces movement that in turn causes image blurring. For image clarity and viewing comfort, such movement must be countered. Current video stabilization systems perform motion estimation between successive frames to calculate a motion vector, counter the movement, and thereby produce a more stable image. However, frame-by-frame comparison is often difficult for long-distance filming because of a lack of information. The present study selects the blocks carrying the most information for motion estimation. We were also able to differentiate equipment movement from movement within the video itself, and by these means stabilized the videos. The experiments used 5 sets of 640 × 480 long-distance videos and 5 sets of 480 × 320 long-distance videos. Compared with current motion estimation methods, the proposed method afforded a 10% increase in accuracy.
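The abstract above outlines a block-based motion estimation pipeline. As an illustration only (not the authors' published algorithm), the following Python sketch shows generic block-matching motion estimation between two grayscale frames, using block variance as a stand-in for selecting "the blocks with the most information"; all function names and parameters are hypothetical.

import numpy as np

def block_variance(frame, y, x, size):
    # Intensity variance of one block; a simple proxy for "information content".
    return frame[y:y + size, x:x + size].var()

def best_motion_vector(prev, curr, y, x, size=16, search=8):
    # Exhaustive SAD search for the displacement of one reference block.
    ref = prev[y:y + size, x:x + size].astype(np.int32)
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + size > curr.shape[0] or xx + size > curr.shape[1]:
                continue
            sad = np.abs(ref - curr[yy:yy + size, xx:xx + size].astype(np.int32)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

def global_motion(prev, curr, size=16, top_k=8):
    # Median of per-block vectors from the top_k highest-variance blocks;
    # the median damps outliers caused by objects moving within the scene.
    h, w = prev.shape
    blocks = sorted(
        ((block_variance(prev, y, x, size), y, x)
         for y in range(0, h - size + 1, size)
         for x in range(0, w - size + 1, size)),
        reverse=True)
    vectors = [best_motion_vector(prev, curr, y, x, size) for _, y, x in blocks[:top_k]]
    return tuple(np.median(np.array(vectors), axis=0))

# Synthetic check: curr is prev shifted by (2, 3) pixels.
prev = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
curr = np.roll(prev, shift=(2, 3), axis=(0, 1))
print(global_motion(prev, curr))   # expected to be close to (2.0, 3.0)

Stabilization would then shift the current frame by the negated global vector; the paper's contribution lies in how the informative blocks are chosen and how equipment movement is separated from movement within the scene.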

PAGES

24635-24647

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4

DOI

http://dx.doi.org/10.1007/s11042-017-4647-4

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1084029012



JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "Department of Embedded Systems Engineering, University of Incheon, 406-772, Incheon, Republic of Korea", 
          "id": "http://www.grid.ac/institutes/grid.412977.e", 
          "name": [
            "Department of Embedded Systems Engineering, University of Incheon, 406-772, Incheon, Republic of Korea"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Kang", 
        "givenName": "Seokhoon", 
        "id": "sg:person.012324271205.52", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012324271205.52"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Department of Embedded Systems Engineering, University of Incheon, 406-772, Incheon, Republic of Korea", 
          "id": "http://www.grid.ac/institutes/grid.412977.e", 
          "name": [
            "Department of Embedded Systems Engineering, University of Incheon, 406-772, Incheon, Republic of Korea"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Park", 
        "givenName": "Chanhyuk", 
        "type": "Person"
      }
    ], 
    "datePublished": "2017-03-30", 
    "datePublishedReg": "2017-03-30", 
    "description": "In the course of the filming of infrared (IR) video, intrinsic equipment instability incurs movement that in turn causes image blurring. For image clarity and viewing comfortability, it is required that such movement be countered. Presently, video stabilization systems perform Motion Estimation of frames that is then applied frame-by-frame to subsequent frames in order to calculate a motion vector, counter movement, and produce, thereby, a more stable image. However, frame-by-frame comparison for long-distance filming often is difficult due to lack of information. The present study determined the appropriate blocks with the most information for Motion Estimation. We also were able to differentiate between equipment movement and movement in the video itself. By these means, we were able to stabilize videos. The methods employed in the experimentation were 5 sets of 640\u00a0\u00d7\u00a0480 long-distance videos and 5 sets of 480\u00a0\u00d7\u00a0320 long-distance videos. When compared with the current motion estimation methods, our proposed method afforded a 10% increase in accuracy.", 
    "genre": "article", 
    "id": "sg:pub.10.1007/s11042-017-4647-4", 
    "isAccessibleForFree": false, 
    "isPartOf": [
      {
        "id": "sg:journal.1044869", 
        "issn": [
          "1380-7501", 
          "1573-7721"
        ], 
        "name": "Multimedia Tools and Applications", 
        "publisher": "Springer Nature", 
        "type": "Periodical"
      }, 
      {
        "issueNumber": "23", 
        "type": "PublicationIssue"
      }, 
      {
        "type": "PublicationVolume", 
        "volumeNumber": "76"
      }
    ], 
    "keywords": [
      "motion estimation", 
      "motion estimation method", 
      "video stabilization system", 
      "estimation method", 
      "motion vectors", 
      "estimation", 
      "stabilization system", 
      "equipment instability", 
      "equipment movement", 
      "set", 
      "appropriate blocks", 
      "frame comparison", 
      "image blurring", 
      "subsequent frames", 
      "frame", 
      "infrared video", 
      "vector", 
      "accuracy", 
      "instability", 
      "image clarity", 
      "most information", 
      "system", 
      "stable image", 
      "order", 
      "method", 
      "stabilization", 
      "comfortability", 
      "means", 
      "information", 
      "filming", 
      "experimentation", 
      "comparison", 
      "blurring", 
      "movement", 
      "images", 
      "video", 
      "block", 
      "turn", 
      "increase", 
      "lack of information", 
      "such movements", 
      "counter movement", 
      "present study", 
      "study", 
      "clarity", 
      "lack", 
      "course"
    ], 
    "name": "Motion-estimation-based stabilization of infrared video", 
    "pagination": "24635-24647", 
    "productId": [
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1084029012"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/s11042-017-4647-4"
        ]
      }
    ], 
    "sameAs": [
      "https://doi.org/10.1007/s11042-017-4647-4", 
      "https://app.dimensions.ai/details/publication/pub.1084029012"
    ], 
    "sdDataset": "articles", 
    "sdDatePublished": "2022-12-01T06:35", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-springernature-scigraph/baseset/20221201/entities/gbq_results/article/article_729.jsonl", 
    "type": "ScholarlyArticle", 
    "url": "https://doi.org/10.1007/s11042-017-4647-4"
  }
]
 

Download the RDF metadata as JSON-LD, N-Triples, Turtle, or RDF/XML.

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4'
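For a scripted alternative to the curl calls above, here is a minimal Python sketch (assuming the requests library is installed) that retrieves the same record via HTTP content negotiation; the format table and helper function are illustrative only.

import requests

SCIGRAPH_URL = "https://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4"

ACCEPT_HEADERS = {
    "json-ld": "application/ld+json",
    "n-triples": "application/n-triples",
    "turtle": "text/turtle",
    "rdf-xml": "application/rdf+xml",
}

def fetch_record(fmt="json-ld"):
    # Content negotiation: the Accept header selects the serialization.
    resp = requests.get(SCIGRAPH_URL, headers={"Accept": ACCEPT_HEADERS[fmt]}, timeout=30)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    print(fetch_record("turtle")[:300])  # first few lines of the Turtle output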


 

This table displays all metadata directly associated with this object as RDF triples.

110 TRIPLES      20 PREDICATES      71 URIs      63 LITERALS      6 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/s11042-017-4647-4 schema:about anzsrc-for:08
2 anzsrc-for:0801
3 schema:author N0e6cd85fa61f42ea8010edd5af63f574
4 schema:datePublished 2017-03-30
5 schema:datePublishedReg 2017-03-30
6 schema:description In the course of the filming of infrared (IR) video, intrinsic equipment instability incurs movement that in turn causes image blurring. For image clarity and viewing comfortability, it is required that such movement be countered. Presently, video stabilization systems perform Motion Estimation of frames that is then applied frame-by-frame to subsequent frames in order to calculate a motion vector, counter movement, and produce, thereby, a more stable image. However, frame-by-frame comparison for long-distance filming often is difficult due to lack of information. The present study determined the appropriate blocks with the most information for Motion Estimation. We also were able to differentiate between equipment movement and movement in the video itself. By these means, we were able to stabilize videos. The methods employed in the experimentation were 5 sets of 640 × 480 long-distance videos and 5 sets of 480 × 320 long-distance videos. When compared with the current motion estimation methods, our proposed method afforded a 10% increase in accuracy.
7 schema:genre article
8 schema:isAccessibleForFree false
9 schema:isPartOf N54cea146bb674b588b2dcd9a3a245d27
10 Nf12ae90cea2b42578dde5d6ceba8084b
11 sg:journal.1044869
12 schema:keywords accuracy
13 appropriate blocks
14 block
15 blurring
16 clarity
17 comfortability
18 comparison
19 counter movement
20 course
21 equipment instability
22 equipment movement
23 estimation
24 estimation method
25 experimentation
26 filming
27 frame
28 frame comparison
29 image blurring
30 image clarity
31 images
32 increase
33 information
34 infrared video
35 instability
36 lack
37 lack of information
38 means
39 method
40 most information
41 motion estimation
42 motion estimation method
43 motion vectors
44 movement
45 order
46 present study
47 set
48 stabilization
49 stabilization system
50 stable image
51 study
52 subsequent frames
53 such movements
54 system
55 turn
56 vector
57 video
58 video stabilization system
59 schema:name Motion-estimation-based stabilization of infrared video
60 schema:pagination 24635-24647
61 schema:productId N44ec75c8bf5c472db753c7e283f3ff3f
62 N59be6cf17c21486aaffc1d129a7d48b5
63 schema:sameAs https://app.dimensions.ai/details/publication/pub.1084029012
64 https://doi.org/10.1007/s11042-017-4647-4
65 schema:sdDatePublished 2022-12-01T06:35
66 schema:sdLicense https://scigraph.springernature.com/explorer/license/
67 schema:sdPublisher Nc28a8ab25eb34512a90c3f330ddec161
68 schema:url https://doi.org/10.1007/s11042-017-4647-4
69 sgo:license sg:explorer/license/
70 sgo:sdDataset articles
71 rdf:type schema:ScholarlyArticle
72 N0e6cd85fa61f42ea8010edd5af63f574 rdf:first sg:person.012324271205.52
73 rdf:rest Nbd8a8cc7d1264f3c884ee73ae516bac0
74 N3e93ba75bb0b4ce9b16711f04f946855 schema:affiliation grid-institutes:grid.412977.e
75 schema:familyName Park
76 schema:givenName Chanhyuk
77 rdf:type schema:Person
78 N44ec75c8bf5c472db753c7e283f3ff3f schema:name doi
79 schema:value 10.1007/s11042-017-4647-4
80 rdf:type schema:PropertyValue
81 N54cea146bb674b588b2dcd9a3a245d27 schema:issueNumber 23
82 rdf:type schema:PublicationIssue
83 N59be6cf17c21486aaffc1d129a7d48b5 schema:name dimensions_id
84 schema:value pub.1084029012
85 rdf:type schema:PropertyValue
86 Nbd8a8cc7d1264f3c884ee73ae516bac0 rdf:first N3e93ba75bb0b4ce9b16711f04f946855
87 rdf:rest rdf:nil
88 Nc28a8ab25eb34512a90c3f330ddec161 schema:name Springer Nature - SN SciGraph project
89 rdf:type schema:Organization
90 Nf12ae90cea2b42578dde5d6ceba8084b schema:volumeNumber 76
91 rdf:type schema:PublicationVolume
92 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
93 schema:name Information and Computing Sciences
94 rdf:type schema:DefinedTerm
95 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
96 schema:name Artificial Intelligence and Image Processing
97 rdf:type schema:DefinedTerm
98 sg:journal.1044869 schema:issn 1380-7501
99 1573-7721
100 schema:name Multimedia Tools and Applications
101 schema:publisher Springer Nature
102 rdf:type schema:Periodical
103 sg:person.012324271205.52 schema:affiliation grid-institutes:grid.412977.e
104 schema:familyName Kang
105 schema:givenName Seokhoon
106 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012324271205.52
107 rdf:type schema:Person
108 grid-institutes:grid.412977.e schema:alternateName Department of Embedded Systems Engineering, University of Incheon, 406-772, Incheon, Republic of Korea
109 schema:name Department of Embedded Systems Engineering, University of Incheon, 406-772, Incheon, Republic of Korea
110 rdf:type schema:Organization
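As a rough way to reproduce the triple and predicate counts above, the following sketch (assuming the rdflib library is installed, and that the live record still matches this snapshot) parses the Turtle serialization and counts triples and distinct predicates.

from urllib.request import Request, urlopen
from rdflib import Graph

URL = "https://scigraph.springernature.com/pub.10.1007/s11042-017-4647-4"

req = Request(URL, headers={"Accept": "text/turtle"})
turtle = urlopen(req, timeout=30).read().decode("utf-8")

g = Graph()
g.parse(data=turtle, format="turtle")

print(len(g), "triples")                        # total RDF triples in the record
print(len(set(g.predicates())), "distinct predicates")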
 



