Real-time detection of rice phenology through convolutional neural network using handheld camera images


Ontology type: schema:ScholarlyArticle     


Article Info

DATE

2020-06-28

AUTHORS

Jingye Han, Liangsheng Shi, Qi Yang, Kai Huang, Yuanyuan Zha, Jin Yu

ABSTRACT

Smallholder farmers play an important role in the global food supply. As smartphones become increasingly pervasive, they enable smallholder farmers to collect images at very low cost. In this study, an efficient deep convolutional neural network (DCNN) architecture was proposed to detect development stages (DVS) of paddy rice using photographs taken by a handheld camera. The DCNN model was trained with different strategies and compared against the traditional time series Green chromatic coordinate (time-series Gcc) method and the manually extracted feature-combining support vector machine (MF-SVM) method. Furthermore, images taken at different view angles, model training strategies, and interpretations of predictions of the DCNN models were investigated. Optimal results were obtained by the DCNN model trained with the proposed two-step fine-tuning strategy, with a high overall accuracy of 0.913 and low mean absolute error of 0.090. The results indicated that images taken at large view angles contained more valuable information and the performance of the model can be further improved by using images taken at multiple angles. The two-step fine-tuning strategy greatly improved the model robustness against the randomness of view angle. The interpretation results demonstrated that it is possible to extract phenology-related features from images. This study provides a phenology detection approach to utilize handheld camera images in real time and some important insights into the use of deep learning in real world scenarios.

PAGES

154-178

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/s11119-020-09734-2

DOI

http://dx.doi.org/10.1007/s11119-020-09734-2

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1128825284


Indexing Status: check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
Incoming Citations: browse incoming citations for this publication using opencitations.net.

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or Google SDTT.

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/07", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Agricultural and Veterinary Sciences", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0703", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Crop and Pasture Production", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China", 
          "id": "http://www.grid.ac/institutes/grid.49470.3e", 
          "name": [
            "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Han", 
        "givenName": "Jingye", 
        "id": "sg:person.012656574611.32", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012656574611.32"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China", 
          "id": "http://www.grid.ac/institutes/grid.49470.3e", 
          "name": [
            "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Shi", 
        "givenName": "Liangsheng", 
        "id": "sg:person.013753244525.37", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013753244525.37"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China", 
          "id": "http://www.grid.ac/institutes/grid.49470.3e", 
          "name": [
            "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Yang", 
        "givenName": "Qi", 
        "id": "sg:person.013514733747.03", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013514733747.03"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Guangxi Hydraulic Research Institute, 530023, Nanning, Guangxi, China", 
          "id": "http://www.grid.ac/institutes/None", 
          "name": [
            "Guangxi Hydraulic Research Institute, 530023, Nanning, Guangxi, China"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Huang", 
        "givenName": "Kai", 
        "id": "sg:person.011000754424.91", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011000754424.91"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China", 
          "id": "http://www.grid.ac/institutes/grid.49470.3e", 
          "name": [
            "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Zha", 
        "givenName": "Yuanyuan", 
        "id": "sg:person.01041615256.23", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01041615256.23"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China", 
          "id": "http://www.grid.ac/institutes/grid.49470.3e", 
          "name": [
            "State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Yu", 
        "givenName": "Jin", 
        "id": "sg:person.015047116211.77", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.015047116211.77"
        ], 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "sg:pub.10.1007/s11119-011-9246-1", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1017493409", 
          "https://doi.org/10.1007/s11119-011-9246-1"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-642-25047-7_1", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1043586081", 
          "https://doi.org/10.1007/978-3-642-25047-7_1"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/s11119-019-09642-0", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1112460844", 
          "https://doi.org/10.1007/s11119-019-09642-0"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1186/s13007-015-0047-9", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1026301719", 
          "https://doi.org/10.1186/s13007-015-0047-9"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1038/nature25785", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1101376601", 
          "https://doi.org/10.1038/nature25785"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/s11119-019-09656-8", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1113603632", 
          "https://doi.org/10.1007/s11119-019-09656-8"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-642-36657-4_1", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1046312800", 
          "https://doi.org/10.1007/978-3-642-36657-4_1"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2020-06-28", 
    "datePublishedReg": "2020-06-28", 
    "description": "Smallholder farmers play an important role in the global food supply. As smartphones become increasingly pervasive, they enable smallholder farmers to collect images at very low cost. In this study, an efficient deep convolutional neural network (DCNN) architecture was proposed to detect development stages (DVS) of paddy rice using photographs taken by a handheld camera. The DCNN model was trained with different strategies and compared against the traditional time series Green chromatic coordinate (time-series Gcc) method and the manually extracted feature-combining support vector machine (MF-SVM) method. Furthermore, images taken at different view angles, model training strategies, and interpretations of predictions of the DCNN models were investigated. Optimal results were obtained by the DCNN model trained with the proposed two-step fine-tuning strategy, with a high overall accuracy of 0.913 and low mean absolute error of 0.090. The results indicated that images taken at large view angles contained more valuable information and the performance of the model can be further improved by using images taken at multiple angles. The two-step fine-tuning strategy greatly improved the model robustness against the randomness of view angle. The interpretation results demonstrated that it is possible to extract phenology-related features from images. This study provides a phenology detection approach to utilize handheld camera images in real time and some important insights into the use of deep learning in real world scenarios.", 
    "genre": "article", 
    "id": "sg:pub.10.1007/s11119-020-09734-2", 
    "inLanguage": "en", 
    "isAccessibleForFree": false, 
    "isFundedItemOf": [
      {
        "id": "sg:grant.9415548", 
        "type": "MonetaryGrant"
      }, 
      {
        "id": "sg:grant.8271116", 
        "type": "MonetaryGrant"
      }
    ], 
    "isPartOf": [
      {
        "id": "sg:journal.1135929", 
        "issn": [
          "1385-2256", 
          "1573-1618"
        ], 
        "name": "Precision Agriculture", 
        "publisher": "Springer Nature", 
        "type": "Periodical"
      }, 
      {
        "issueNumber": "1", 
        "type": "PublicationIssue"
      }, 
      {
        "type": "PublicationVolume", 
        "volumeNumber": "22"
      }
    ], 
    "keywords": [
      "DCNN model", 
      "deep convolutional neural network architecture", 
      "camera images", 
      "convolutional neural network architecture", 
      "convolutional neural network", 
      "neural network architecture", 
      "view angle", 
      "model training strategies", 
      "real-world scenarios", 
      "support vector machine method", 
      "vector machine method", 
      "different view angles", 
      "deep learning", 
      "network architecture", 
      "real-time detection", 
      "neural network", 
      "handheld camera", 
      "detection approach", 
      "world scenarios", 
      "highest overall accuracy", 
      "machine method", 
      "real time", 
      "training strategy", 
      "interpretation of predictions", 
      "images", 
      "model robustness", 
      "overall accuracy", 
      "multiple angles", 
      "absolute error", 
      "optimal results", 
      "development stages", 
      "smartphones", 
      "low cost", 
      "architecture", 
      "camera", 
      "interpretation results", 
      "network", 
      "valuable information", 
      "learning", 
      "robustness", 
      "different strategies", 
      "scenarios", 
      "model", 
      "accuracy", 
      "information", 
      "performance", 
      "randomness", 
      "method", 
      "cost", 
      "strategies", 
      "detection", 
      "error", 
      "features", 
      "large view angles", 
      "results", 
      "two-step", 
      "prediction", 
      "rice phenology", 
      "coordinate method", 
      "time", 
      "use", 
      "important role", 
      "photographs", 
      "insights", 
      "interpretation", 
      "stage", 
      "angle", 
      "important insights", 
      "farmers", 
      "study", 
      "supply", 
      "paddy rice", 
      "role", 
      "smallholder farmers", 
      "global food supply", 
      "approach", 
      "food supply", 
      "phenology", 
      "rice"
    ], 
    "name": "Real-time detection of rice phenology through convolutional neural network using handheld camera images", 
    "pagination": "154-178", 
    "productId": [
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1128825284"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/s11119-020-09734-2"
        ]
      }
    ], 
    "sameAs": [
      "https://doi.org/10.1007/s11119-020-09734-2", 
      "https://app.dimensions.ai/details/publication/pub.1128825284"
    ], 
    "sdDataset": "articles", 
    "sdDatePublished": "2022-05-10T10:25", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-springernature-scigraph/baseset/20220509/entities/gbq_results/article/article_844.jsonl", 
    "type": "ScholarlyArticle", 
    "url": "https://doi.org/10.1007/s11119-020-09734-2"
  }
]
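Because the record above is plain JSON-LD, it can be consumed with nothing more than the standard `json` module. The sketch below works on a trimmed excerpt of the record (only a few fields reproduced, two of the six authors shown) to illustrate pulling out the title, date, and author names.

```python
import json

# Trimmed excerpt of the SciGraph JSON-LD record shown above
# (only a subset of fields and authors, for illustration).
record_text = """
[
  {
    "name": "Real-time detection of rice phenology through convolutional neural network using handheld camera images",
    "datePublished": "2020-06-28",
    "author": [
      {"familyName": "Han", "givenName": "Jingye", "type": "Person"},
      {"familyName": "Shi", "givenName": "Liangsheng", "type": "Person"}
    ]
  }
]
"""

# The top-level structure is a one-element JSON array.
record = json.loads(record_text)[0]
authors = [f'{a["givenName"]} {a["familyName"]}' for a in record["author"]]

print(record["name"])
print(", ".join(authors))
```

The same access pattern applies to the full record; nested objects such as `isPartOf` and `citation` are ordinary lists of dicts.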
 

Download the RDF metadata as: JSON-LD, N-Triples, Turtle, or RDF/XML.

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s11119-020-09734-2'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s11119-020-09734-2'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s11119-020-09734-2'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s11119-020-09734-2'
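The four curl commands above all hit the same URL and rely on HTTP content negotiation: the `Accept` header selects the RDF serialization. As a minimal stdlib-only Python sketch of the JSON-LD variant (the URL is the one from the commands above):

```python
import json
import urllib.request

SCIGRAPH_URL = "https://scigraph.springernature.com/pub.10.1007/s11119-020-09734-2"

def build_request(url: str, mime: str) -> urllib.request.Request:
    # Content negotiation: the Accept header picks the serialization,
    # exactly as the -H flag does in the curl examples.
    return urllib.request.Request(url, headers={"Accept": mime})

def fetch_jsonld(url: str = SCIGRAPH_URL):
    # Performs the same request as the first curl example and
    # parses the JSON-LD response body.
    req = build_request(url, "application/ld+json")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Swapping the MIME type for `application/n-triples`, `text/turtle`, or `application/rdf+xml` (and reading the body as text instead of JSON) yields the other three serializations.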


 

This table displays all metadata directly associated to this object as RDF triples.

207 TRIPLES      22 PREDICATES      111 URIs      96 LITERALS      6 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/s11119-020-09734-2 schema:about anzsrc-for:07
2 anzsrc-for:0703
3 schema:author N0d1329b05cb74f59afae0d5b86a9a1ed
4 schema:citation sg:pub.10.1007/978-3-642-25047-7_1
5 sg:pub.10.1007/978-3-642-36657-4_1
6 sg:pub.10.1007/s11119-011-9246-1
7 sg:pub.10.1007/s11119-019-09642-0
8 sg:pub.10.1007/s11119-019-09656-8
9 sg:pub.10.1038/nature25785
10 sg:pub.10.1186/s13007-015-0047-9
11 schema:datePublished 2020-06-28
12 schema:datePublishedReg 2020-06-28
13 schema:description Smallholder farmers play an important role in the global food supply. As smartphones become increasingly pervasive, they enable smallholder farmers to collect images at very low cost. In this study, an efficient deep convolutional neural network (DCNN) architecture was proposed to detect development stages (DVS) of paddy rice using photographs taken by a handheld camera. The DCNN model was trained with different strategies and compared against the traditional time series Green chromatic coordinate (time-series Gcc) method and the manually extracted feature-combining support vector machine (MF-SVM) method. Furthermore, images taken at different view angles, model training strategies, and interpretations of predictions of the DCNN models were investigated. Optimal results were obtained by the DCNN model trained with the proposed two-step fine-tuning strategy, with a high overall accuracy of 0.913 and low mean absolute error of 0.090. The results indicated that images taken at large view angles contained more valuable information and the performance of the model can be further improved by using images taken at multiple angles. The two-step fine-tuning strategy greatly improved the model robustness against the randomness of view angle. The interpretation results demonstrated that it is possible to extract phenology-related features from images. This study provides a phenology detection approach to utilize handheld camera images in real time and some important insights into the use of deep learning in real world scenarios.
14 schema:genre article
15 schema:inLanguage en
16 schema:isAccessibleForFree false
17 schema:isPartOf Nc5dfe62e23af40299999f9b6e0569c88
18 Ne84e052ca71d405f87719cef6ebc09ad
19 sg:journal.1135929
20 schema:keywords DCNN model
21 absolute error
22 accuracy
23 angle
24 approach
25 architecture
26 camera
27 camera images
28 convolutional neural network
29 convolutional neural network architecture
30 coordinate method
31 cost
32 deep convolutional neural network architecture
33 deep learning
34 detection
35 detection approach
36 development stages
37 different strategies
38 different view angles
39 error
40 farmers
41 features
42 food supply
43 global food supply
44 handheld camera
45 highest overall accuracy
46 images
47 important insights
48 important role
49 information
50 insights
51 interpretation
52 interpretation of predictions
53 interpretation results
54 large view angles
55 learning
56 low cost
57 machine method
58 method
59 model
60 model robustness
61 model training strategies
62 multiple angles
63 network
64 network architecture
65 neural network
66 neural network architecture
67 optimal results
68 overall accuracy
69 paddy rice
70 performance
71 phenology
72 photographs
73 prediction
74 randomness
75 real time
76 real-time detection
77 real-world scenarios
78 results
79 rice
80 rice phenology
81 robustness
82 role
83 scenarios
84 smallholder farmers
85 smartphones
86 stage
87 strategies
88 study
89 supply
90 support vector machine method
91 time
92 training strategy
93 two-step
94 use
95 valuable information
96 vector machine method
97 view angle
98 world scenarios
99 schema:name Real-time detection of rice phenology through convolutional neural network using handheld camera images
100 schema:pagination 154-178
101 schema:productId N3a0ffa001807494aa375d1046b10e8e6
102 Nd8ec6d8183d94a0bab9430cb8d0f7281
103 schema:sameAs https://app.dimensions.ai/details/publication/pub.1128825284
104 https://doi.org/10.1007/s11119-020-09734-2
105 schema:sdDatePublished 2022-05-10T10:25
106 schema:sdLicense https://scigraph.springernature.com/explorer/license/
107 schema:sdPublisher Na887e5e2722d4880899f7d70eacbc0f2
108 schema:url https://doi.org/10.1007/s11119-020-09734-2
109 sgo:license sg:explorer/license/
110 sgo:sdDataset articles
111 rdf:type schema:ScholarlyArticle
112 N0d1329b05cb74f59afae0d5b86a9a1ed rdf:first sg:person.012656574611.32
113 rdf:rest Nc7d8ec4deb9a4174be67928cb3279bd9
114 N16abf23cda7e43198f669f0e9eae3b44 rdf:first sg:person.01041615256.23
115 rdf:rest N33b6dd0686b842c8a2c9fe27ca434511
116 N33b6dd0686b842c8a2c9fe27ca434511 rdf:first sg:person.015047116211.77
117 rdf:rest rdf:nil
118 N3a0ffa001807494aa375d1046b10e8e6 schema:name doi
119 schema:value 10.1007/s11119-020-09734-2
120 rdf:type schema:PropertyValue
121 N5fed4d95ef22449bbf17d35f87e44d62 rdf:first sg:person.011000754424.91
122 rdf:rest N16abf23cda7e43198f669f0e9eae3b44
123 Na887e5e2722d4880899f7d70eacbc0f2 schema:name Springer Nature - SN SciGraph project
124 rdf:type schema:Organization
125 Nc5dfe62e23af40299999f9b6e0569c88 schema:volumeNumber 22
126 rdf:type schema:PublicationVolume
127 Nc7d8ec4deb9a4174be67928cb3279bd9 rdf:first sg:person.013753244525.37
128 rdf:rest Ndc9a9f71bf4746f7abd36ca6218d9170
129 Nd8ec6d8183d94a0bab9430cb8d0f7281 schema:name dimensions_id
130 schema:value pub.1128825284
131 rdf:type schema:PropertyValue
132 Ndc9a9f71bf4746f7abd36ca6218d9170 rdf:first sg:person.013514733747.03
133 rdf:rest N5fed4d95ef22449bbf17d35f87e44d62
134 Ne84e052ca71d405f87719cef6ebc09ad schema:issueNumber 1
135 rdf:type schema:PublicationIssue
136 anzsrc-for:07 schema:inDefinedTermSet anzsrc-for:
137 schema:name Agricultural and Veterinary Sciences
138 rdf:type schema:DefinedTerm
139 anzsrc-for:0703 schema:inDefinedTermSet anzsrc-for:
140 schema:name Crop and Pasture Production
141 rdf:type schema:DefinedTerm
142 sg:grant.8271116 http://pending.schema.org/fundedItem sg:pub.10.1007/s11119-020-09734-2
143 rdf:type schema:MonetaryGrant
144 sg:grant.9415548 http://pending.schema.org/fundedItem sg:pub.10.1007/s11119-020-09734-2
145 rdf:type schema:MonetaryGrant
146 sg:journal.1135929 schema:issn 1385-2256
147 1573-1618
148 schema:name Precision Agriculture
149 schema:publisher Springer Nature
150 rdf:type schema:Periodical
151 sg:person.01041615256.23 schema:affiliation grid-institutes:grid.49470.3e
152 schema:familyName Zha
153 schema:givenName Yuanyuan
154 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01041615256.23
155 rdf:type schema:Person
156 sg:person.011000754424.91 schema:affiliation grid-institutes:None
157 schema:familyName Huang
158 schema:givenName Kai
159 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.011000754424.91
160 rdf:type schema:Person
161 sg:person.012656574611.32 schema:affiliation grid-institutes:grid.49470.3e
162 schema:familyName Han
163 schema:givenName Jingye
164 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012656574611.32
165 rdf:type schema:Person
166 sg:person.013514733747.03 schema:affiliation grid-institutes:grid.49470.3e
167 schema:familyName Yang
168 schema:givenName Qi
169 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013514733747.03
170 rdf:type schema:Person
171 sg:person.013753244525.37 schema:affiliation grid-institutes:grid.49470.3e
172 schema:familyName Shi
173 schema:givenName Liangsheng
174 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013753244525.37
175 rdf:type schema:Person
176 sg:person.015047116211.77 schema:affiliation grid-institutes:grid.49470.3e
177 schema:familyName Yu
178 schema:givenName Jin
179 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.015047116211.77
180 rdf:type schema:Person
181 sg:pub.10.1007/978-3-642-25047-7_1 schema:sameAs https://app.dimensions.ai/details/publication/pub.1043586081
182 https://doi.org/10.1007/978-3-642-25047-7_1
183 rdf:type schema:CreativeWork
184 sg:pub.10.1007/978-3-642-36657-4_1 schema:sameAs https://app.dimensions.ai/details/publication/pub.1046312800
185 https://doi.org/10.1007/978-3-642-36657-4_1
186 rdf:type schema:CreativeWork
187 sg:pub.10.1007/s11119-011-9246-1 schema:sameAs https://app.dimensions.ai/details/publication/pub.1017493409
188 https://doi.org/10.1007/s11119-011-9246-1
189 rdf:type schema:CreativeWork
190 sg:pub.10.1007/s11119-019-09642-0 schema:sameAs https://app.dimensions.ai/details/publication/pub.1112460844
191 https://doi.org/10.1007/s11119-019-09642-0
192 rdf:type schema:CreativeWork
193 sg:pub.10.1007/s11119-019-09656-8 schema:sameAs https://app.dimensions.ai/details/publication/pub.1113603632
194 https://doi.org/10.1007/s11119-019-09656-8
195 rdf:type schema:CreativeWork
196 sg:pub.10.1038/nature25785 schema:sameAs https://app.dimensions.ai/details/publication/pub.1101376601
197 https://doi.org/10.1038/nature25785
198 rdf:type schema:CreativeWork
199 sg:pub.10.1186/s13007-015-0047-9 schema:sameAs https://app.dimensions.ai/details/publication/pub.1026301719
200 https://doi.org/10.1186/s13007-015-0047-9
201 rdf:type schema:CreativeWork
202 grid-institutes:None schema:alternateName Guangxi Hydraulic Research Institute, 530023, Nanning, Guangxi, China
203 schema:name Guangxi Hydraulic Research Institute, 530023, Nanning, Guangxi, China
204 rdf:type schema:Organization
205 grid-institutes:grid.49470.3e schema:alternateName State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China
206 schema:name State Key Laboratory of Water Resources and Hydropower Engineering Sciences, Wuhan University, 430072, Wuhan, Hubei, China
207 rdf:type schema:Organization
 



