High accuracy block-matching sub-pixel motion estimation through detection of error surface minima


Ontology type: schema:ScholarlyArticle     


Article Info

DATE

2017-02-18

AUTHORS

Konstantinos Konstantoudakis, Lazaros Vrysis, George Papanikolaou, Charalampos Dimoulas

ABSTRACT

The present paper focuses on high-accuracy block-based sub-pixel motion estimation utilizing a straightforward error minimization approach. In particular, the mathematics of bilinear interpolation are utilized for the selection of the candidate motion vectors that minimize the error criterion, by estimating local minima in the error surface with arbitrary accuracy. The implemented approach favors optimum accuracy over computational load demands, making it ideal as a benchmark for faster methods to compare against; however, it is not best suited to real-time critical applications (i.e. video compression). Other video processing needs relying on motion vectors and requiring high-resolution/accuracy can also take advantage of the proposed solution (and its simplified nature in terms of underlying theoretical complexity), such as motion-compensation filtering for super resolution image enhancement, motion analysis in sensitive areas (e.g. high-speed video monitoring, medical imaging, motion analysis in sport science, big-data visual surveillance, etc.). The proposed method is thoroughly evaluated using both real video and synthetic motion sequences from still images, adopting well-tested block-based motion estimation evaluation procedures. Assessment includes comparisons to a number of existing block-based methods with respect to PSNR and SSIM metrics over ground-truth samples. The conducted evaluation takes into consideration both the original (arbitrary-accuracy) and the truncated motion vectors (after rounding them to the nearest half, quarter, or eighth of a pixel), where superior performance with more accurate motion vector estimation is revealed. In this context, the degree to which sub-pixel motion estimation methods actually produce sub-pixel motion vectors is investigated, and the implications thereof are discussed.
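The abstract describes refining a block-matching motion vector by locating the local minimum of the error surface between integer candidates. The paper derives this minimum from the bilinear-interpolation equations; the sketch below instead uses a standard separable parabolic fit around the integer minimum, a common simpler illustration of the same idea (the function name and synthetic surface are illustrative, not from the paper):

```python
import numpy as np

def subpixel_minimum(E, i, j):
    """Refine the integer minimum (i, j) of error surface E to sub-pixel
    accuracy with a separable parabolic fit along each axis."""
    def fit_1d(em1, e0, ep1):
        # Vertex of the parabola through three equally spaced samples.
        denom = em1 - 2.0 * e0 + ep1
        return 0.0 if denom == 0 else 0.5 * (em1 - ep1) / denom
    dy = fit_1d(E[i - 1, j], E[i, j], E[i + 1, j])
    dx = fit_1d(E[i, j - 1], E[i, j], E[i, j + 1])
    return i + dy, j + dx

# Synthetic quadratic error surface with its true minimum at (5.3, 7.8):
ys, xs = np.mgrid[0:11, 0:15].astype(float)
E = (ys - 5.3) ** 2 + (xs - 7.8) ** 2

i, j = np.unravel_index(np.argmin(E), E.shape)  # integer minimum: (5, 8)
y_sub, x_sub = subpixel_minimum(E, i, j)        # recovers (5.3, 7.8)
```

For a surface that is exactly quadratic near the minimum the fit is exact; on real SAD/SSD surfaces it is only an approximation, which is precisely the gap that interpolation-based minimum detection, as in the paper, aims to close.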

PAGES

5837-5856

References to SciGraph publications

  • 2014-10-10. Combining High-Speed Cameras and Stop-Motion Animation Software to Support Students’ Modeling of Human Body Movement in JOURNAL OF SCIENCE EDUCATION AND TECHNOLOGY
  • 2004. High Accuracy Optical Flow Estimation Based on a Theory for Warping in COMPUTER VISION - ECCV 2004
  • 2012-04-05. Iterative random search: a new local minima resistant algorithm for motion estimation in high-definition videos in MULTIMEDIA TOOLS AND APPLICATIONS
  • 2014-05-15. The application of biological motion research: biometrics, sport, and the military in PSYCHONOMIC BULLETIN & REVIEW
  • 2011-07-21. An adaptive motion-compensated approach for video deinterlacing in MULTIMEDIA TOOLS AND APPLICATIONS
  • 1994-02. Performance of optical flow techniques in INTERNATIONAL JOURNAL OF COMPUTER VISION
  • 2014-05-22. Noise-robust video super-resolution using an adaptive spatial-temporal filter in MULTIMEDIA TOOLS AND APPLICATIONS
  • 2007-10-31. Joint Wavelet Video Denoising and Motion Activity Detection in Multimodal Human Activity Analysis: Application to Video-Assisted Bioacoustic/Psychophysiological Monitoring in EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING
  • 2012. Highly Accurate Estimation of Sub-pixel Motion Using Phase Correlation in PATTERN RECOGNITION
  • 2011-04-29. Stereo Video Surveillance Multi-agent System: New Solutions for Human Motion Analysis in JOURNAL OF MATHEMATICAL IMAGING AND VISION
  • 2012-11-10. 3D Deformable Super-Resolution for Multi-Camera 3D Face Scanning in JOURNAL OF MATHEMATICAL IMAGING AND VISION
  • 2016-02-13. Digital image stabilization based on adaptive motion filtering with feedback correction in MULTIMEDIA TOOLS AND APPLICATIONS
Identifiers

    URI

    http://scigraph.springernature.com/pub.10.1007/s11042-017-4497-0

    DOI

    http://dx.doi.org/10.1007/s11042-017-4497-0

    DIMENSIONS

    https://app.dimensions.ai/details/publication/pub.1083885943



    JSON-LD is the canonical representation for SciGraph data.


    [
      {
        "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
        "about": [
          {
            "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
            "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
            "name": "Information and Computing Sciences", 
            "type": "DefinedTerm"
          }, 
          {
            "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
            "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
            "name": "Artificial Intelligence and Image Processing", 
            "type": "DefinedTerm"
          }
        ], 
        "author": [
          {
            "affiliation": {
              "alternateName": "Aristotle University of Thessaloniki, Thessaloniki, Greece", 
              "id": "http://www.grid.ac/institutes/grid.4793.9", 
              "name": [
                "Aristotle University of Thessaloniki, Thessaloniki, Greece"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Konstantoudakis", 
            "givenName": "Konstantinos", 
            "id": "sg:person.011304610241.87", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011304610241.87"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Aristotle University of Thessaloniki, Thessaloniki, Greece", 
              "id": "http://www.grid.ac/institutes/grid.4793.9", 
              "name": [
                "Aristotle University of Thessaloniki, Thessaloniki, Greece"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Vrysis", 
            "givenName": "Lazaros", 
            "id": "sg:person.011162175541.54", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011162175541.54"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Aristotle University of Thessaloniki, Thessaloniki, Greece", 
              "id": "http://www.grid.ac/institutes/grid.4793.9", 
              "name": [
                "Aristotle University of Thessaloniki, Thessaloniki, Greece"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Papanikolaou", 
            "givenName": "George", 
            "id": "sg:person.011246201177.68", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011246201177.68"
            ], 
            "type": "Person"
          }, 
          {
            "affiliation": {
              "alternateName": "Aristotle University of Thessaloniki, Thessaloniki, Greece", 
              "id": "http://www.grid.ac/institutes/grid.4793.9", 
              "name": [
                "Aristotle University of Thessaloniki, Thessaloniki, Greece"
              ], 
              "type": "Organization"
            }, 
            "familyName": "Dimoulas", 
            "givenName": "Charalampos", 
            "id": "sg:person.013475131641.09", 
            "sameAs": [
              "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013475131641.09"
            ], 
            "type": "Person"
          }
        ], 
        "citation": [
          {
            "id": "sg:pub.10.1007/s11042-015-3183-3", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1005404435", 
              "https://doi.org/10.1007/s11042-015-3183-3"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/978-3-540-24673-2_3", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1045812409", 
              "https://doi.org/10.1007/978-3-540-24673-2_3"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/bf01420984", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1021499342", 
              "https://doi.org/10.1007/bf01420984"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s11042-011-0845-7", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1045225319", 
              "https://doi.org/10.1007/s11042-011-0845-7"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s11042-014-2079-y", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1001085900", 
              "https://doi.org/10.1007/s11042-014-2079-y"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s10851-011-0290-2", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1005073312", 
              "https://doi.org/10.1007/s10851-011-0290-2"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1155/2008/792028", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1063202836", 
              "https://doi.org/10.1155/2008/792028"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/978-3-642-33506-8_24", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1046258731", 
              "https://doi.org/10.1007/978-3-642-33506-8_24"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s11042-012-1033-0", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1000243731", 
              "https://doi.org/10.1007/s11042-012-1033-0"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.3758/s13423-014-0659-5", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1030121467", 
              "https://doi.org/10.3758/s13423-014-0659-5"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s10956-014-9521-9", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1009903822", 
              "https://doi.org/10.1007/s10956-014-9521-9"
            ], 
            "type": "CreativeWork"
          }, 
          {
            "id": "sg:pub.10.1007/s10851-012-0399-y", 
            "sameAs": [
              "https://app.dimensions.ai/details/publication/pub.1030662861", 
              "https://doi.org/10.1007/s10851-012-0399-y"
            ], 
            "type": "CreativeWork"
          }
        ], 
        "datePublished": "2017-02-18", 
        "datePublishedReg": "2017-02-18", 
        "description": "The present paper focuses on high-accuracy block-based sub-pixel motion estimation utilizing a straightforward error minimization approach. In particular, the mathematics of bilinear interpolation are utilized for the selection of the candidate motion vectors that minimize the error criterion, by estimating local minima in the error surface with arbitrary accuracy. The implemented approach favors optimum accuracy over computational load demands, making it ideal as a benchmark for faster methods to compare against; however, it is not best suited to real-time critical applications (i.e. video compression). Other video processing needs relying on motion vectors and requiring high-resolution/accuracy can also take advantage of the proposed solution (and its simplified nature in terms of underlying theoretical complexity), such as motion-compensation filtering for super resolution image enhancement, motion analysis in sensitive areas (e.g. high-speed video monitoring, medical imaging, motion analysis in sport science, big-data visual surveillance, etc.). The proposed method is thoroughly evaluated using both real video and synthetic motion sequences from still images, adopting well-tested block-based motion estimation evaluation procedures. Assessment includes comparisons to a number of existing block-based methods with respect to PSNR and SSIM metrics over ground-truth samples. The conducted evaluation takes into consideration both the original (arbitrary-accuracy) and the truncated motion vectors (after rounding them to the nearest half, quarter, or eighth of a pixel), where superior performance with more accurate motion vector estimation is revealed. In this context, the degree to which sub-pixel motion estimation methods actually produce sub-pixel motion vectors is investigated, and the implications thereof are discussed.", 
        "genre": "article", 
        "id": "sg:pub.10.1007/s11042-017-4497-0", 
        "isAccessibleForFree": false, 
        "isPartOf": [
          {
            "id": "sg:journal.1044869", 
            "issn": [
              "1380-7501", 
              "1573-7721"
            ], 
            "name": "Multimedia Tools and Applications", 
            "publisher": "Springer Nature", 
            "type": "Periodical"
          }, 
          {
            "issueNumber": "5", 
            "type": "PublicationIssue"
          }, 
          {
            "type": "PublicationVolume", 
            "volumeNumber": "77"
          }
        ], 
        "keywords": [
          "sub-pixel motion estimation", 
          "motion vectors", 
          "motion estimation", 
          "real-time critical applications", 
          "super-resolution image enhancement", 
          "candidate motion vectors", 
          "motion vector estimation", 
          "block-based method", 
          "motion estimation method", 
          "real videos", 
          "critical applications", 
          "SSIM metric", 
          "processing needs", 
          "image enhancement", 
          "motion sequences", 
          "ground truth samples", 
          "bilinear interpolation", 
          "error surface", 
          "vector estimation", 
          "superior performance", 
          "optimum accuracy", 
          "high resolution/accuracy", 
          "local minima", 
          "motion analysis", 
          "arbitrary accuracy", 
          "error criterion", 
          "minimization approach", 
          "error minimization approach", 
          "accuracy", 
          "fast method", 
          "PSNR", 
          "estimation method", 
          "video", 
          "load demand", 
          "benchmarks", 
          "estimation", 
          "vector", 
          "evaluation procedure", 
          "filtering", 
          "images", 
          "metrics", 
          "method", 
          "interpolation", 
          "applications", 
          "performance", 
          "detection", 
          "advantages", 
          "demand", 
          "selection", 
          "solution", 
          "sensitive areas", 
          "context", 
          "need", 
          "present paper", 
          "evaluation", 
          "mathematics", 
          "number", 
          "sequence", 
          "surface", 
          "minimum", 
          "consideration", 
          "area", 
          "respect", 
          "enhancement", 
          "criteria", 
          "comparison", 
          "approach", 
          "analysis", 
          "procedure", 
          "surface minima", 
          "assessment", 
          "degree", 
          "samples", 
          "implications", 
          "paper"
        ], 
        "name": "High accuracy block-matching sub-pixel motion estimation through detection of error surface minima", 
        "pagination": "5837-5856", 
        "productId": [
          {
            "name": "dimensions_id", 
            "type": "PropertyValue", 
            "value": [
              "pub.1083885943"
            ]
          }, 
          {
            "name": "doi", 
            "type": "PropertyValue", 
            "value": [
              "10.1007/s11042-017-4497-0"
            ]
          }
        ], 
        "sameAs": [
          "https://doi.org/10.1007/s11042-017-4497-0", 
          "https://app.dimensions.ai/details/publication/pub.1083885943"
        ], 
        "sdDataset": "articles", 
        "sdDatePublished": "2022-12-01T06:37", 
        "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
        "sdPublisher": {
          "name": "Springer Nature - SN SciGraph project", 
          "type": "Organization"
        }, 
        "sdSource": "s3://com-springernature-scigraph/baseset/20221201/entities/gbq_results/article/article_751.jsonl", 
        "type": "ScholarlyArticle", 
        "url": "https://doi.org/10.1007/s11042-017-4497-0"
      }
    ]
     


    HOW TO GET THIS DATA PROGRAMMATICALLY:

    JSON-LD is a popular format for linked data which is fully compatible with JSON.

    curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4497-0'

    N-Triples is a line-based linked data format ideal for batch operations.

    curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4497-0'

    Turtle is a human-readable linked data format.

    curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4497-0'

    RDF/XML is a standard XML format for linked data.

    curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s11042-017-4497-0'
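The same content negotiation can be done from code. The sketch below builds the request the JSON-LD curl example issues (the live call is left commented out so the snippet stays offline; the stand-in record reuses values shown in the JSON-LD above):

```python
import json
import urllib.request

# The Accept header selects the serialization, exactly as in the curl examples.
url = "https://scigraph.springernature.com/pub.10.1007/s11042-017-4497-0"
req = urllib.request.Request(url, headers={"Accept": "application/ld+json"})

# To actually fetch the record:
# with urllib.request.urlopen(req) as resp:
#     records = json.loads(resp.read())

# The response is a JSON array of records like the one shown above.
# Stand-in payload with fields copied from this page, for illustration:
records = [{
    "name": "High accuracy block-matching sub-pixel motion estimation "
            "through detection of error surface minima",
    "pagination": "5837-5856",
}]
article = records[0]
print(article["name"], "| pages:", article["pagination"])
```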

