A mechanical approach to realize reflexive omnidirectional bending motion for pneumatic continuum robots


Ontology type: schema:ScholarlyArticle      Open Access: True


Article Info

DATE

2016-12

AUTHORS

Eri Takane, Kenjiro Tadakuma, Tomonari Yamamoto, Masashi Konyo, Satoshi Tadokoro

ABSTRACT

A mechanism that allows a robotic arm to quickly grip various forms of objects at disaster sites will enhance the mobility of rescue robots by keeping their bodies stable and maintaining manipulability for target objects, such as debris. Such a mechanism requires the ability to quickly and omnidirectionally change arm postures toward the target and hold it in a stable manner. Continuum robots are expected to provide this functionality. Conventional continuum robots realize the function of changing arm postures and grasping objects by controlling pneumatic actuators with multiple air chambers arranged in parallel. However, conventional robots cannot be applied to potential disaster sites filled with flammable gases, gasoline, or high radiation levels because they require electronic components (e.g., solenoid valves and sensors) to control air pressures. This study proposes a unique approach to realize reflexive omnidirectional bending motion using only mechanical components, without any electrical devices. The proposed system realizes a reflexive motion that bends the arm in the target’s direction by detecting the contact location with a mechanical reactive system. The proposed simple mechanism has the advantages of high durability and easy implementation. This paper aims to confirm the proposed concept by prototyping a drive mechanism that couples contact detection and bending motion using mechanical port valves. We report the design concept and development of this prototype. The fundamental characteristics and feasibility of the proposed mechanism are experimentally confirmed. First, a prototype is developed using a mathematical model, and its performance in bending and omnidirectional motion is evaluated. The results show that the model deviates from the experimental values by −4.9% in the bending angle and −7.4% in the central curvature. We also confirm that a higher pressure can realize a smaller radius of curvature and reduce unnecessary twisting motion. We then tested a second prototype to confirm the grasping motion and force under different applied pressures, and evaluated the influence of the bending direction. We confirm that a higher pressure generates a larger grasping force. The prototype produces approximately the same forces in all directions, although the generated forces depend on the number of air chambers excited by the contact pads. Subsequently, we experimentally confirm the influence of gravity: the arm’s own weight greatly influences the posture after the object is in contact, and this effect should not be ignored. Furthermore, the bending curvature became sufficiently large when the contact pad was pressed, which experimentally proves that self-holding is possible. The experimental results show the potential of the proposed mechanism.
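The abstract compares a mathematical bending model against experiments in terms of bending angle and central curvature. As background only, the sketch below shows the standard piecewise-constant-curvature relations commonly used for segments driven by three parallel pneumatic chambers (see the Webster and Jones review cited in this record); the chamber lengths and radial offset are hypothetical values, and this is not presented as the authors' own model.

import math

# Standard piecewise-constant-curvature relations for a segment driven by
# three parallel chambers spaced 120 degrees apart (illustration only; the
# paper's own model may differ).
def arc_parameters(l1, l2, l3, d):
    # l1, l2, l3: chamber lengths after inflation [m]
    # d: radial offset of each chamber from the segment axis [m]
    s = (l1 + l2 + l3) / 3.0  # centre-line arc length
    kappa = 2.0 * math.sqrt(l1**2 + l2**2 + l3**2
                            - l1*l2 - l2*l3 - l1*l3) / (d * (l1 + l2 + l3))
    phi = math.atan2(math.sqrt(3.0) * (l2 + l3 - 2.0*l1), 3.0 * (l2 - l3))
    theta = kappa * s  # bending angle of the segment [rad]
    return kappa, phi, theta

# Hypothetical values: one chamber elongated 5 mm more than the other two.
kappa, phi, theta = arc_parameters(0.105, 0.100, 0.100, 0.015)
print("curvature %.2f 1/m, direction %.1f deg, bend %.1f deg"
      % (kappa, math.degrees(phi), math.degrees(theta)))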

PAGES

28

Identifiers

URI

http://scigraph.springernature.com/pub.10.1186/s40648-016-0067-x

DOI

http://dx.doi.org/10.1186/s40648-016-0067-x

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1030031917


Indexing Status: Check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
Incoming Citations: Browse incoming citations for this publication using opencitations.net.

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Artificial Intelligence and Image Processing", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Information and Computing Sciences", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "Tohoku University", 
          "id": "https://www.grid.ac/institutes/grid.69566.3a", 
          "name": [
            "Graduate School of Information Sciences Applied Information Sciences Information and Applied Technology Human-Robot Informatics, Tohoku University, Sendai, Miyagi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Takane", 
        "givenName": "Eri", 
        "id": "sg:person.07553201505.62", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.07553201505.62"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Tohoku University", 
          "id": "https://www.grid.ac/institutes/grid.69566.3a", 
          "name": [
            "Graduate School of Information Sciences Applied Information Sciences Information and Applied Technology Human-Robot Informatics, Tohoku University, Sendai, Miyagi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Tadakuma", 
        "givenName": "Kenjiro", 
        "id": "sg:person.012225714221.87", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012225714221.87"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Tohoku University", 
          "id": "https://www.grid.ac/institutes/grid.69566.3a", 
          "name": [
            "Graduate School of Information Sciences Applied Information Sciences Information and Applied Technology Human-Robot Informatics, Tohoku University, Sendai, Miyagi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Yamamoto", 
        "givenName": "Tomonari", 
        "id": "sg:person.016474676503.20", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.016474676503.20"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Tohoku University", 
          "id": "https://www.grid.ac/institutes/grid.69566.3a", 
          "name": [
            "Graduate School of Information Sciences Applied Information Sciences Information and Applied Technology Human-Robot Informatics, Tohoku University, Sendai, Miyagi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Konyo", 
        "givenName": "Masashi", 
        "id": "sg:person.01101070023.41", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01101070023.41"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Tohoku University", 
          "id": "https://www.grid.ac/institutes/grid.69566.3a", 
          "name": [
            "Graduate School of Information Sciences Applied Information Sciences Information and Applied Technology Human-Robot Informatics, Tohoku University, Sendai, Miyagi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Tadokoro", 
        "givenName": "Satoshi", 
        "id": "sg:person.013454033251.77", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013454033251.77"
        ], 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "https://doi.org/10.1177/0278364909360852", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1004923115"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1299/kikaic.55.2547", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1032165057"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.5402/2013/726506", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1047568323"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1073/pnas.1003250107", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1050759936"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/mra.2002.1035210", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061419211"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tnnls.2013.2287890", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061718449"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tro.2013.2256313", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061785507"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/iros.2005.1545487", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1093955516"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/robot.2001.932991", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1094628001"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/icra.2012.6225373", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1095765227"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/iros.2001.977200", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1095795398"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2016-12", 
    "datePublishedReg": "2016-12-01", 
    "description": "A mechanism that allows a robotic arm to quickly grip various forms of objects at disaster sites will enhance the mobility of rescue robots by keeping their bodies stable and maintaining manipulability for target objects, such as debris. Such a mechanism requires the ability to quickly and omnidirectionally change arm postures toward the target and hold it in a stable manner. Continuum robots are expected to provide this functionality. Conventional continuum robots realize the function of changing arm postures and grasping objects by controlling pneumatic actuators with multiple air chambers arranged in parallel. However, conventional robots cannot be applied to potential disaster sites filled with flammable gases, gasoline, or high radiation levels because they require electronic components (e.g., solenoid valves, and sensors) to control air pressures. This study proposes a unique approach to realize reflexive omnidirectional bending motion using only mechanical components without any electrical devices. The proposed system realizes a reflexive motion to bend the arm in the target\u2019s direction by detecting a contact location using a mechanical reactive system. The proposed simple mechanism has the advantages of high durability and easy implementation. This paper aims to confirm the proposed concept by prototyping a drive mechanism coupled with contact detection and bending motion using mechanical port valves. We report the design concept and development of this prototype. The fundamental characteristics and feasibility of the proposed mechanism are experimentally confirmed. First, a prototype is developed using a mathematical model. Its performance in the bending and omnidirectional motions is evaluated. The results show that the model has a margin of \u22124.9% error in the bending angle and \u22127.4% error in the central curvature compared with the experimental values. We also confirm that using a higher pressure could realize a smaller radius of curvature and reduce an unnecessary twisting motion. We also tested a second prototype to confirm the grasping motion and force by changing the applied pressures. The influence of the bending direction was then evaluated. We confirm that a higher pressure generated a larger grasping force. The prototype can omnidirectionally produce approximately the same forces although the generated forces depend on the number of air chambers excited by the contact pads. Subsequently, we experimentally confirm the influence of gravity. The test shows that the effect of own weight greatly influences the posture after the object is in contact. This effect should not be ignored. Furthermore, the curve became sufficiently large when its contact pad is pressed. This result experimentally proved that self-holding is possible. The experimental results show the potential of the proposed mechanism.", 
    "genre": "research_article", 
    "id": "sg:pub.10.1186/s40648-016-0067-x", 
    "inLanguage": [
      "en"
    ], 
    "isAccessibleForFree": true, 
    "isPartOf": [
      {
        "id": "sg:journal.1135873", 
        "issn": [
          "2197-4225"
        ], 
        "name": "ROBOMECH Journal", 
        "type": "Periodical"
      }, 
      {
        "issueNumber": "1", 
        "type": "PublicationIssue"
      }, 
      {
        "type": "PublicationVolume", 
        "volumeNumber": "3"
      }
    ], 
    "name": "A mechanical approach to realize reflexive omnidirectional bending motion for pneumatic continuum robots", 
    "pagination": "28", 
    "productId": [
      {
        "name": "readcube_id", 
        "type": "PropertyValue", 
        "value": [
          "8f12002c04cc3cce037575727dd1c63636d44160b66c7d22ada2a397a832060d"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1186/s40648-016-0067-x"
        ]
      }, 
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1030031917"
        ]
      }
    ], 
    "sameAs": [
      "https://doi.org/10.1186/s40648-016-0067-x", 
      "https://app.dimensions.ai/details/publication/pub.1030031917"
    ], 
    "sdDataset": "articles", 
    "sdDatePublished": "2019-04-11T12:37", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-uberresearch-data-dimensions-target-20181106-alternative/cleanup/v134/2549eaecd7973599484d7c17b260dba0a4ecb94b/merge/v9/a6c9fde33151104705d4d7ff012ea9563521a3ce/jats-lookup/v90/0000000363_0000000363/records_70032_00000001.jsonl", 
    "type": "ScholarlyArticle", 
    "url": "https://link.springer.com/10.1186%2Fs40648-016-0067-x"
  }
]
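For readers who want to work with this record programmatically, here is a minimal Python sketch (assuming the JSON-LD above has been saved locally as record.json, a hypothetical filename) that extracts the title, DOI, and author names using the schema.org field names shown in the block above.

import json

# Minimal sketch: read the JSON-LD above (assumed saved locally as
# "record.json") and print a few fields of the article node.
with open("record.json", encoding="utf-8") as f:
    graph = json.load(f)  # the record is a JSON array holding one node

article = graph[0]
print("Title:", article["name"])
doi = next(p["value"][0] for p in article["productId"] if p["name"] == "doi")
print("DOI:  ", doi)
for person in article["author"]:
    print("Author:", person["givenName"], person["familyName"])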
 


HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data, which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1186/s40648-016-0067-x'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1186/s40648-016-0067-x'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1186/s40648-016-0067-x'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1186/s40648-016-0067-x'
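The same content negotiation works from any HTTP client. The following is a minimal Python sketch using the requests library, issuing one request per serialization with the Accept headers listed above.

import requests

URL = "https://scigraph.springernature.com/pub.10.1186/s40648-016-0067-x"

# One request per serialization, using the same Accept headers as the
# curl commands above.
ACCEPT = {
    "JSON-LD":   "application/ld+json",
    "N-Triples": "application/n-triples",
    "Turtle":    "text/turtle",
    "RDF/XML":   "application/rdf+xml",
}

for label, mime in ACCEPT.items():
    resp = requests.get(URL, headers={"Accept": mime}, timeout=30)
    resp.raise_for_status()
    print(label, "->", len(resp.text), "bytes")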


 

 



