SpotCard: an optical mark recognition tool to improve field data collection speed and accuracy


Ontology type: schema:ScholarlyArticle      Open Access: True


Article Info

DATE

2019-12

AUTHORS

Hamish A. Symington, Beverley J. Glover

ABSTRACT

Background: When taking photographs of plants in the field, it is often necessary to record additional information such as sample number, biological replicate number and subspecies. Manual methods of recording such information are slow, often involve laborious transcription from hand-written notes or the need to have a laptop or tablet on site, and present a risk by separating written data capture from image capture. Existing tools for field data capture focus on recording information rather than capturing pictures of plants.

Results: We present SpotCard, a tool comprising two macros. The first can be used to create a template for small, reusable cards for use when photographing plants. Information can be encoded on these cards in a human- and machine-readable form, allowing the user to swiftly make annotations before taking the photograph. The second part of the tool automatically reads the annotations from the image and tabulates them in a CSV file, along with picture date, time and GPS coordinates. The SpotCard also provides a convenient scale bar and coordinate location within the image for the flower itself, enabling automated measurement of floral traits such as area and perimeter.

Conclusions: This tool is shown to read annotations with a high degree of accuracy and at a speed greatly faster than manual transcription. It includes the ability to read the date and time of the photograph, as well as GPS location. It is an open-source ImageJ/Fiji macro and is available online. Its use requires no knowledge of the ImageJ macro coding language, and it is therefore well suited to all researchers taking pictures in the field.

PAGES

19

Identifiers

URI

http://scigraph.springernature.com/pub.10.1186/s13007-019-0403-2

DOI

http://dx.doi.org/10.1186/s13007-019-0403-2

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1112315639

PUBMED

https://www.ncbi.nlm.nih.gov/pubmed/30833981


Indexing Status: check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
Incoming Citations: browse incoming citations for this publication using opencitations.net.

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "University of Cambridge", 
          "id": "https://www.grid.ac/institutes/grid.5335.0", 
          "name": [
            "Department of Plant Sciences, University of Cambridge, Downing Street, CB2 3EA, Cambridge, UK"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Symington", 
        "givenName": "Hamish A.", 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "University of Cambridge", 
          "id": "https://www.grid.ac/institutes/grid.5335.0", 
          "name": [
            "Department of Plant Sciences, University of Cambridge, Downing Street, CB2 3EA, Cambridge, UK"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Glover", 
        "givenName": "Beverley J.", 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "https://doi.org/10.1626/pps.16.9", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1018693513"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1038/nmeth.2019", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1026706357", 
          "https://doi.org/10.1038/nmeth.2019"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1186/s13007-015-0069-3", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1042579040", 
          "https://doi.org/10.1186/s13007-015-0069-3"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1002/mrd.22489", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1053308274"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.2135/cropsci2013.08.0579", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1069032216"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.3791/50028", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1071423534"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1093/gigascience/giw019", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1083866707"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1186/s13007-017-0248-5", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1092575478", 
          "https://doi.org/10.1186/s13007-017-0248-5"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2019-12", 
    "datePublishedReg": "2019-12-01", 
    "description": "Background: When taking photographs of plants in the field, it is often necessary to record additional information such as sample number, biological replicate number and subspecies. Manual methods of recording such information are slow, often involve laborious transcription from hand-written notes or the need to have a laptop or tablet on site, and present a risk by separating written data capture from image capture. Existing tools for field data capture focus on recording information rather than capturing pictures of plants.\nResults: We present SpotCard, a tool comprising two macros. The first can be used to create a template for small, reusable cards for use when photographing plants. Information can be encoded on these cards in a human- and machine-readable form, allowing the user to swiftly make annotations before taking the photograph. The second part of the tool automatically reads the annotations from the image and tabulates them in a CSV file, along with picture date, time and GPS coordinates. The SpotCard also provides a convenient scale bar and coordinate location within the image for the flower itself, enabling automated measurement of floral traits such as area and perimeter.\nConclusions: This tool is shown to read annotations with a high degree of accuracy and at a speed greatly faster than manual transcription. It includes the ability to read the date and time of the photograph, as well as GPS location. It is an open-source ImageJ/Fiji macro and is available online. Its use requires no knowledge of the ImageJ macro coding language, and it is therefore well suited to all researchers taking pictures in the field.", 
    "genre": "research_article", 
    "id": "sg:pub.10.1186/s13007-019-0403-2", 
    "inLanguage": [
      "en"
    ], 
    "isAccessibleForFree": true, 
    "isFundedItemOf": [
      {
        "id": "sg:grant.3956483", 
        "type": "MonetaryGrant"
      }
    ], 
    "isPartOf": [
      {
        "id": "sg:journal.1035309", 
        "issn": [
          "1746-4811"
        ], 
        "name": "Plant Methods", 
        "type": "Periodical"
      }, 
      {
        "issueNumber": "1", 
        "type": "PublicationIssue"
      }, 
      {
        "type": "PublicationVolume", 
        "volumeNumber": "15"
      }
    ], 
    "name": "SpotCard: an optical mark recognition tool to improve field data collection speed and accuracy", 
    "pagination": "19", 
    "productId": [
      {
        "name": "readcube_id", 
        "type": "PropertyValue", 
        "value": [
          "1bccc06bcd6726d0587984a488da2a17abd2d50fb4f37d22d7c5710c902c6dea"
        ]
      }, 
      {
        "name": "pubmed_id", 
        "type": "PropertyValue", 
        "value": [
          "30833981"
        ]
      }, 
      {
        "name": "nlm_unique_id", 
        "type": "PropertyValue", 
        "value": [
          "101245798"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1186/s13007-019-0403-2"
        ]
      }, 
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1112315639"
        ]
      }
    ], 
    "sameAs": [
      "https://doi.org/10.1186/s13007-019-0403-2", 
      "https://app.dimensions.ai/details/publication/pub.1112315639"
    ], 
    "sdDataset": "articles", 
    "sdDatePublished": "2019-04-11T11:20", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-uberresearch-data-dimensions-target-20181106-alternative/cleanup/v134/2549eaecd7973599484d7c17b260dba0a4ecb94b/merge/v9/a6c9fde33151104705d4d7ff012ea9563521a3ce/jats-lookup/v90/0000000354_0000000354/records_11716_00000002.jsonl", 
    "type": "ScholarlyArticle", 
    "url": "https://link.springer.com/10.1186%2Fs13007-019-0403-2"
  }
]
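
The JSON-LD above can be consumed directly with ordinary JSON tooling. The following is a minimal Python sketch (not part of the SciGraph tooling) that pulls a few headline fields out of the record; the local filename spotcard.jsonld is an assumption, standing in for wherever you saved the download.

```python
# Minimal sketch: extract a few headline fields from the SciGraph JSON-LD record above.
# Assumes the record has been saved locally as "spotcard.jsonld" (hypothetical filename).
import json

with open("spotcard.jsonld") as fh:
    records = json.load(fh)  # the file is a JSON array containing one record object

record = records[0]

title   = record["name"]
doi     = next(p["value"][0] for p in record["productId"] if p["name"] == "doi")
authors = [f'{a["givenName"]} {a["familyName"]}' for a in record["author"]]
journal = next(part["name"] for part in record["isPartOf"] if part.get("name"))

print(title)
print("DOI:", doi)
print("Authors:", ", ".join(authors))
print("Journal:", journal)
```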
 

Download the RDF metadata as JSON-LD, N-Triples, Turtle or RDF/XML (see the license info).

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1186/s13007-019-0403-2'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1186/s13007-019-0403-2'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1186/s13007-019-0403-2'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1186/s13007-019-0403-2'
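
The same content negotiation can be scripted. Below is a short Python sketch using the requests library that mirrors the four curl commands above and saves each representation to a local file; the output filenames are arbitrary, and which formats the endpoint serves is taken on trust from the documentation above.

```python
# Fetch the record in each advertised RDF serialization via HTTP content negotiation,
# mirroring the curl examples above.
import requests

URL = "https://scigraph.springernature.com/pub.10.1186/s13007-019-0403-2"

FORMATS = {
    "application/ld+json":   "spotcard.jsonld",
    "application/n-triples": "spotcard.nt",
    "text/turtle":           "spotcard.ttl",
    "application/rdf+xml":   "spotcard.rdf",
}

for accept, filename in FORMATS.items():
    resp = requests.get(URL, headers={"Accept": accept}, timeout=30)
    resp.raise_for_status()
    with open(filename, "w", encoding="utf-8") as fh:
        fh.write(resp.text)
    print(f"Saved {accept} -> {filename} ({len(resp.content)} bytes)")
```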


 

This table displays all metadata directly associated with this object as RDF triples.

102 TRIPLES      21 PREDICATES      37 URIs      21 LITERALS      9 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1186/s13007-019-0403-2 schema:about anzsrc-for:08
2 anzsrc-for:0801
3 schema:author N68cf5c98c2bf4c34b11a21c2b85cc6a9
4 schema:citation sg:pub.10.1038/nmeth.2019
5 sg:pub.10.1186/s13007-015-0069-3
6 sg:pub.10.1186/s13007-017-0248-5
7 https://doi.org/10.1002/mrd.22489
8 https://doi.org/10.1093/gigascience/giw019
9 https://doi.org/10.1626/pps.16.9
10 https://doi.org/10.2135/cropsci2013.08.0579
11 https://doi.org/10.3791/50028
12 schema:datePublished 2019-12
13 schema:datePublishedReg 2019-12-01
14 schema:description Background: When taking photographs of plants in the field, it is often necessary to record additional information such as sample number, biological replicate number and subspecies. Manual methods of recording such information are slow, often involve laborious transcription from hand-written notes or the need to have a laptop or tablet on site, and present a risk by separating written data capture from image capture. Existing tools for field data capture focus on recording information rather than capturing pictures of plants. Results: We present SpotCard, a tool comprising two macros. The first can be used to create a template for small, reusable cards for use when photographing plants. Information can be encoded on these cards in a human- and machine-readable form, allowing the user to swiftly make annotations before taking the photograph. The second part of the tool automatically reads the annotations from the image and tabulates them in a CSV file, along with picture date, time and GPS coordinates. The SpotCard also provides a convenient scale bar and coordinate location within the image for the flower itself, enabling automated measurement of floral traits such as area and perimeter. Conclusions: This tool is shown to read annotations with a high degree of accuracy and at a speed greatly faster than manual transcription. It includes the ability to read the date and time of the photograph, as well as GPS location. It is an open-source ImageJ/Fiji macro and is available online. Its use requires no knowledge of the ImageJ macro coding language, and it is therefore well suited to all researchers taking pictures in the field.
15 schema:genre research_article
16 schema:inLanguage en
17 schema:isAccessibleForFree true
18 schema:isPartOf N8d92f14f241f4b558b436a4dfb21ccf1
19 Nf24b6f34ea254c31855d70d89d5f0037
20 sg:journal.1035309
21 schema:name SpotCard: an optical mark recognition tool to improve field data collection speed and accuracy
22 schema:pagination 19
23 schema:productId N1a423299de7c451e9b88a644fdf10f4d
24 N325c930aa8744fa7bdb3931eeab0ebc9
25 N5593b83f33f64b91ab2eabaefa838f11
26 N63702d374b2b425791e44eec829df4f0
27 Nccd8e41b700344b4a569ab5c51a043f3
28 schema:sameAs https://app.dimensions.ai/details/publication/pub.1112315639
29 https://doi.org/10.1186/s13007-019-0403-2
30 schema:sdDatePublished 2019-04-11T11:20
31 schema:sdLicense https://scigraph.springernature.com/explorer/license/
32 schema:sdPublisher N5ae78951512f4165a5faf19a08e0bf5f
33 schema:url https://link.springer.com/10.1186%2Fs13007-019-0403-2
34 sgo:license sg:explorer/license/
35 sgo:sdDataset articles
36 rdf:type schema:ScholarlyArticle
37 N1a423299de7c451e9b88a644fdf10f4d schema:name dimensions_id
38 schema:value pub.1112315639
39 rdf:type schema:PropertyValue
40 N325c930aa8744fa7bdb3931eeab0ebc9 schema:name doi
41 schema:value 10.1186/s13007-019-0403-2
42 rdf:type schema:PropertyValue
43 N4bbf2e17a1854379bd7c7354ca46334c rdf:first N9e4ee89de8704b52a20ccbe12d924926
44 rdf:rest rdf:nil
45 N5593b83f33f64b91ab2eabaefa838f11 schema:name pubmed_id
46 schema:value 30833981
47 rdf:type schema:PropertyValue
48 N5ae78951512f4165a5faf19a08e0bf5f schema:name Springer Nature - SN SciGraph project
49 rdf:type schema:Organization
50 N63702d374b2b425791e44eec829df4f0 schema:name readcube_id
51 schema:value 1bccc06bcd6726d0587984a488da2a17abd2d50fb4f37d22d7c5710c902c6dea
52 rdf:type schema:PropertyValue
53 N68cf5c98c2bf4c34b11a21c2b85cc6a9 rdf:first Na6c4cf88058f4c798bc2a82b54ec3200
54 rdf:rest N4bbf2e17a1854379bd7c7354ca46334c
55 N8d92f14f241f4b558b436a4dfb21ccf1 schema:volumeNumber 15
56 rdf:type schema:PublicationVolume
57 N9e4ee89de8704b52a20ccbe12d924926 schema:affiliation https://www.grid.ac/institutes/grid.5335.0
58 schema:familyName Glover
59 schema:givenName Beverley J.
60 rdf:type schema:Person
61 Na6c4cf88058f4c798bc2a82b54ec3200 schema:affiliation https://www.grid.ac/institutes/grid.5335.0
62 schema:familyName Symington
63 schema:givenName Hamish A.
64 rdf:type schema:Person
65 Nccd8e41b700344b4a569ab5c51a043f3 schema:name nlm_unique_id
66 schema:value 101245798
67 rdf:type schema:PropertyValue
68 Nf24b6f34ea254c31855d70d89d5f0037 schema:issueNumber 1
69 rdf:type schema:PublicationIssue
70 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
71 schema:name Information and Computing Sciences
72 rdf:type schema:DefinedTerm
73 anzsrc-for:0801 schema:inDefinedTermSet anzsrc-for:
74 schema:name Artificial Intelligence and Image Processing
75 rdf:type schema:DefinedTerm
76 sg:grant.3956483 http://pending.schema.org/fundedItem sg:pub.10.1186/s13007-019-0403-2
77 rdf:type schema:MonetaryGrant
78 sg:journal.1035309 schema:issn 1746-4811
79 schema:name Plant Methods
80 rdf:type schema:Periodical
81 sg:pub.10.1038/nmeth.2019 schema:sameAs https://app.dimensions.ai/details/publication/pub.1026706357
82 https://doi.org/10.1038/nmeth.2019
83 rdf:type schema:CreativeWork
84 sg:pub.10.1186/s13007-015-0069-3 schema:sameAs https://app.dimensions.ai/details/publication/pub.1042579040
85 https://doi.org/10.1186/s13007-015-0069-3
86 rdf:type schema:CreativeWork
87 sg:pub.10.1186/s13007-017-0248-5 schema:sameAs https://app.dimensions.ai/details/publication/pub.1092575478
88 https://doi.org/10.1186/s13007-017-0248-5
89 rdf:type schema:CreativeWork
90 https://doi.org/10.1002/mrd.22489 schema:sameAs https://app.dimensions.ai/details/publication/pub.1053308274
91 rdf:type schema:CreativeWork
92 https://doi.org/10.1093/gigascience/giw019 schema:sameAs https://app.dimensions.ai/details/publication/pub.1083866707
93 rdf:type schema:CreativeWork
94 https://doi.org/10.1626/pps.16.9 schema:sameAs https://app.dimensions.ai/details/publication/pub.1018693513
95 rdf:type schema:CreativeWork
96 https://doi.org/10.2135/cropsci2013.08.0579 schema:sameAs https://app.dimensions.ai/details/publication/pub.1069032216
97 rdf:type schema:CreativeWork
98 https://doi.org/10.3791/50028 schema:sameAs https://app.dimensions.ai/details/publication/pub.1071423534
99 rdf:type schema:CreativeWork
100 https://www.grid.ac/institutes/grid.5335.0 schema:alternateName University of Cambridge
101 schema:name Department of Plant Sciences, University of Cambridge, Downing Street, CB2 3EA, Cambridge, UK
102 rdf:type schema:Organization
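
To work with these triples programmatically rather than reading the table, the sketch below loads the N-Triples representation into rdflib and reproduces the triple count and a few of the values shown above. It assumes the file spotcard.nt produced by the download sketch earlier, and that the schema: prefix in the table expands to http://schema.org/ (both assumptions, not guarantees from the source).

```python
# Sketch: load the N-Triples file saved earlier (spotcard.nt, assumed to exist) into
# rdflib and query a few of the values shown in the triples table above.
from rdflib import Graph, Namespace

SCHEMA = Namespace("http://schema.org/")  # assumed expansion of the schema: prefix

g = Graph()
g.parse("spotcard.nt", format="nt")

print(len(g), "triples")  # should match the count reported in the summary line above

# The article is the only subject with a schema:pagination value in this record.
article = next(g.subjects(SCHEMA.pagination, None))
print("Title:", g.value(article, SCHEMA.name))
print("sameAs:")
for obj in g.objects(article, SCHEMA.sameAs):
    print("  ", obj)
```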
 



