Specific components of face perception in the human fusiform gyrus studied by tomographic estimates of magnetoencephalographic signals: a tool for the evaluation of non-verbal communication in psychosomatic paradigms


Ontology type: schema:ScholarlyArticle      Open Access: True


Article Info

DATE

2007-12

AUTHORS

Yuka Okazaki, Andreas A Ioannides

ABSTRACT

AIMS: The aim of this study was to determine the specific spatiotemporal activation patterns of face perception in the fusiform gyrus (FG). The FG is a key area in the specialized brain system that makes possible the recognition of faces with ease and speed in our daily life. Characterization of the FG response provides a quantitative method for evaluating the fundamental functions that contribute to non-verbal communication in various psychosomatic paradigms.

METHODS: The MEG signal was recorded during passive visual stimulus presentation with three stimulus types: faces, hands and shoes. The stimuli were presented separately to the central and peripheral visual fields. We performed statistical parametric mapping (SPM) analysis of tomographic estimates of activity to compare activity between a pre- and post-stimulus period for the same object (baseline test), and activity between objects (active test). The time course of regional activation curves was analyzed for each stimulus condition.

RESULTS: The SPM baseline test revealed a response to each stimulus type, which was very compact at the initial segment of the main M(FG)170. For hands and shoes the area of significant change remained compact. For faces the area expanded widely within a few milliseconds and its boundaries engulfed the areas of the other objects. The active test demonstrated that activity for faces was significantly larger than the activity for hands. The same face-specific compact area as in the baseline test was identified, which then again expanded widely. For each stimulus type and each visual field location of presentation, the analysis of the time course of FG activity identified three components in the FG: M(FG)100, M(FG)170 and M(FG)200, all of which showed a preference for faces.

CONCLUSION: Early compact face-specific activity in the FG expands widely along the occipito-ventral brain within a few milliseconds. The significant difference between faces and the other object stimuli in M(FG)100 shows that the processing of faces is already differentiated from the processing of other objects within 100 ms. Standardization of the three face-specific MEG components could have diagnostic value for the integrity of the initial process of non-verbal communication in various psychosomatic paradigms.

PAGES

23

Identifiers

URI

http://scigraph.springernature.com/pub.10.1186/1751-0759-1-23

DOI

http://dx.doi.org/10.1186/1751-0759-1-23

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1033886433

PUBMED

https://www.ncbi.nlm.nih.gov/pubmed/18053195


Indexing Status: Check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
Incoming Citations: Browse incoming citations for this publication using opencitations.net.

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).
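The record reproduced below can also be fetched and inspected in a few lines of Python. This is a minimal sketch, assuming network access and only the standard library (urllib.request, json); the field names are taken from the JSON-LD shown below, and the endpoint and Accept header are the same as in the curl example further down this page.

import json
import urllib.request

# Fetch the record as JSON-LD via content negotiation.
URL = "https://scigraph.springernature.com/pub.10.1186/1751-0759-1-23"
req = urllib.request.Request(URL, headers={"Accept": "application/ld+json"})
with urllib.request.urlopen(req) as resp:
    record = json.loads(resp.read().decode("utf-8"))[0]  # top level is a one-element list

# Print a few key fields of the canonical JSON-LD representation.
print(record["name"])            # article title
print(record["datePublished"])   # 2007-12
for author in record["author"]:
    print(author["givenName"], author["familyName"])
for pid in record["productId"]:
    print(pid["name"], pid["value"][0])  # doi, pubmed_id, dimensions_id, ...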

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/1701", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Psychology", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/17", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Psychology and Cognitive Sciences", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "RIKEN Center for Brain Science", 
          "id": "https://www.grid.ac/institutes/grid.474690.8", 
          "name": [
            "Department of Brain Science and Engineering, Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology, Kitakyushu-shi, Japan", 
            "Laboratory for Human Brain Dynamics, Brain Science Institute (BSI), RIKEN, Wako-shi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Okazaki", 
        "givenName": "Yuka", 
        "id": "sg:person.0744073634.04", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.0744073634.04"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "RIKEN Center for Brain Science", 
          "id": "https://www.grid.ac/institutes/grid.474690.8", 
          "name": [
            "Department of Brain Science and Engineering, Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology, Kitakyushu-shi, Japan", 
            "Laboratory for Human Brain Dynamics, Brain Science Institute (BSI), RIKEN, Wako-shi, Japan"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Ioannides", 
        "givenName": "Andreas A", 
        "id": "sg:person.0600434417.84", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.0600434417.84"
        ], 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "https://doi.org/10.1016/s0926-6410(99)00013-0", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1005409857"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/j.jesp.2006.10.023", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1011280568"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1523/jneurosci.1091-05.2005", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1015109357"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/j.ijdevneu.2004.12.012", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1021923212"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/j.neuroimage.2006.02.009", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1025529451"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1002/hbm.10043", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1025674676"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1023/a:1022258620435", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1034352216", 
          "https://doi.org/10.1023/a:1022258620435"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1073/pnas.96.16.9379", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1034957667"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1093/cercor/10.1.69", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1039978488"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1088/0266-5611/6/4/005", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1041998952"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/s0926-6410(98)00048-2", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1043728763"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1111/j.1460-9568.2005.04181.x", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1043812789"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/j.brainres.2006.07.072", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1045688556"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1093/cercor/9.5.415", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1046286523"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1080/13554790701494964", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1047369750"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1523/jneurosci.2621-05.2005", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1049120256"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1126/science.1063736", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1050161855"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/s0896-6273(00)00168-9", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1050668166"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/42.759120", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061170758"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1212/01.wnl.0000267842.85646.f2", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1064349444"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1523/jneurosci.17-11-04302.1997", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1083089633"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2007-12", 
    "datePublishedReg": "2007-12-01", 
    "description": "AIMS: The aim of this study was to determine the specific spatiotemporal activation patterns of face perception in the fusiform gyrus (FG). The FG is a key area in the specialized brain system that makes possible the recognition of face with ease and speed in our daily life. Characterization of FG response provides a quantitative method for evaluating the fundamental functions that contribute to non-verbal communication in various psychosomatic paradigms.\nMETHODS: The MEG signal was recorded during passive visual stimulus presentation with three stimulus types - Faces, Hands and Shoes. The stimuli were presented separately to the central and peripheral visual fields. We performed statistical parametric mapping (SPM) analysis of tomographic estimates of activity to compare activity between a pre- and post-stimulus period in the same object (baseline test), and activity between objects (active test). The time course of regional activation curves was analyzed for each stimulus condition.\nRESULTS: The SPM baseline test revealed a response to each stimulus type, which was very compact at the initial segment of main M(FG)170. For hands and shoes the area of significant change remains compact. For faces the area expanded widely within a few milliseconds and its boundaries engulfed the other object areas. The active test demonstrated that activity for faces was significantly larger than the activity for hands. The same face specific compact area as in the baseline test was identified, and then again expanded widely. For each stimulus type and presentation in each one of the visual fields locations, the analysis of the time course of FG activity identified three components in the FG: M(FG)100, M(FG)170, and M(FG)200 - all showed preference for faces.\nCONCLUSION: Early compact face-specific activity in the FG expands widely along the occipito-ventral brain within a few milliseconds. The significant difference between faces and the other object stimuli in M(FG)100 shows that processing of faces is already differentiated from processing of other objects within 100 ms. Standardization of the three face-specific MEG components could have diagnostic value for the integrity of the initial process of non-verbal communication in various psychosomatic paradigms.", 
    "genre": "research_article", 
    "id": "sg:pub.10.1186/1751-0759-1-23", 
    "inLanguage": [
      "en"
    ], 
    "isAccessibleForFree": true, 
    "isPartOf": [
      {
        "id": "sg:journal.1037697", 
        "issn": [
          "1751-0759"
        ], 
        "name": "BioPsychoSocial Medicine", 
        "type": "Periodical"
      }, 
      {
        "issueNumber": "1", 
        "type": "PublicationIssue"
      }, 
      {
        "type": "PublicationVolume", 
        "volumeNumber": "1"
      }
    ], 
    "name": "Specific components of face perception in the human fusiform gyrus studied by tomographic estimates of magnetoencephalographic signals: a tool for the evaluation of non-verbal communication in psychosomatic paradigms", 
    "pagination": "23", 
    "productId": [
      {
        "name": "readcube_id", 
        "type": "PropertyValue", 
        "value": [
          "a16d0c909688a6f1aa84646c3b8f30b626f3590bf1635701a602248bd83a1724"
        ]
      }, 
      {
        "name": "pubmed_id", 
        "type": "PropertyValue", 
        "value": [
          "18053195"
        ]
      }, 
      {
        "name": "nlm_unique_id", 
        "type": "PropertyValue", 
        "value": [
          "101286572"
        ]
      }, 
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1186/1751-0759-1-23"
        ]
      }, 
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1033886433"
        ]
      }
    ], 
    "sameAs": [
      "https://doi.org/10.1186/1751-0759-1-23", 
      "https://app.dimensions.ai/details/publication/pub.1033886433"
    ], 
    "sdDataset": "articles", 
    "sdDatePublished": "2019-04-11T01:07", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-uberresearch-data-dimensions-target-20181106-alternative/cleanup/v134/2549eaecd7973599484d7c17b260dba0a4ecb94b/merge/v9/a6c9fde33151104705d4d7ff012ea9563521a3ce/jats-lookup/v90/0000000001_0000000264/records_8697_00000514.jsonl", 
    "type": "ScholarlyArticle", 
    "url": "http://link.springer.com/10.1186%2F1751-0759-1-23"
  }
]
 

Download the RDF metadata as: JSON-LD, N-Triples, Turtle, or RDF/XML (license info).

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1186/1751-0759-1-23'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1186/1751-0759-1-23'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1186/1751-0759-1-23'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1186/1751-0759-1-23'
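The same content negotiation can be scripted. The following is a minimal Python sketch (standard library only) that saves each of the four serializations listed above to a local file; the filenames are arbitrary.

import urllib.request

URL = "https://scigraph.springernature.com/pub.10.1186/1751-0759-1-23"

# One Accept header per serialization offered by SciGraph, mirroring the curl commands above.
FORMATS = {
    "record.jsonld": "application/ld+json",
    "record.nt": "application/n-triples",
    "record.ttl": "text/turtle",
    "record.rdf": "application/rdf+xml",
}

for filename, accept in FORMATS.items():
    req = urllib.request.Request(URL, headers={"Accept": accept})
    with urllib.request.urlopen(req) as resp:
        with open(filename, "wb") as fh:
            fh.write(resp.read())
    print("saved", filename, "as", accept)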


 

This table displays all metadata directly associated with this object as RDF triples.

140 TRIPLES      21 PREDICATES      50 URIs      21 LITERALS      9 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1186/1751-0759-1-23 schema:about anzsrc-for:17
2 anzsrc-for:1701
3 schema:author N25a6259cc5584582ac5c5b86c968c2f9
4 schema:citation sg:pub.10.1023/a:1022258620435
5 https://doi.org/10.1002/hbm.10043
6 https://doi.org/10.1016/j.brainres.2006.07.072
7 https://doi.org/10.1016/j.ijdevneu.2004.12.012
8 https://doi.org/10.1016/j.jesp.2006.10.023
9 https://doi.org/10.1016/j.neuroimage.2006.02.009
10 https://doi.org/10.1016/s0896-6273(00)00168-9
11 https://doi.org/10.1016/s0926-6410(98)00048-2
12 https://doi.org/10.1016/s0926-6410(99)00013-0
13 https://doi.org/10.1073/pnas.96.16.9379
14 https://doi.org/10.1080/13554790701494964
15 https://doi.org/10.1088/0266-5611/6/4/005
16 https://doi.org/10.1093/cercor/10.1.69
17 https://doi.org/10.1093/cercor/9.5.415
18 https://doi.org/10.1109/42.759120
19 https://doi.org/10.1111/j.1460-9568.2005.04181.x
20 https://doi.org/10.1126/science.1063736
21 https://doi.org/10.1212/01.wnl.0000267842.85646.f2
22 https://doi.org/10.1523/jneurosci.1091-05.2005
23 https://doi.org/10.1523/jneurosci.17-11-04302.1997
24 https://doi.org/10.1523/jneurosci.2621-05.2005
25 schema:datePublished 2007-12
26 schema:datePublishedReg 2007-12-01
27 schema:description AIMS: The aim of this study was to determine the specific spatiotemporal activation patterns of face perception in the fusiform gyrus (FG). The FG is a key area in the specialized brain system that makes possible the recognition of face with ease and speed in our daily life. Characterization of FG response provides a quantitative method for evaluating the fundamental functions that contribute to non-verbal communication in various psychosomatic paradigms. METHODS: The MEG signal was recorded during passive visual stimulus presentation with three stimulus types - Faces, Hands and Shoes. The stimuli were presented separately to the central and peripheral visual fields. We performed statistical parametric mapping (SPM) analysis of tomographic estimates of activity to compare activity between a pre- and post-stimulus period in the same object (baseline test), and activity between objects (active test). The time course of regional activation curves was analyzed for each stimulus condition. RESULTS: The SPM baseline test revealed a response to each stimulus type, which was very compact at the initial segment of main M(FG)170. For hands and shoes the area of significant change remains compact. For faces the area expanded widely within a few milliseconds and its boundaries engulfed the other object areas. The active test demonstrated that activity for faces was significantly larger than the activity for hands. The same face specific compact area as in the baseline test was identified, and then again expanded widely. For each stimulus type and presentation in each one of the visual fields locations, the analysis of the time course of FG activity identified three components in the FG: M(FG)100, M(FG)170, and M(FG)200 - all showed preference for faces. CONCLUSION: Early compact face-specific activity in the FG expands widely along the occipito-ventral brain within a few milliseconds. The significant difference between faces and the other object stimuli in M(FG)100 shows that processing of faces is already differentiated from processing of other objects within 100 ms. Standardization of the three face-specific MEG components could have diagnostic value for the integrity of the initial process of non-verbal communication in various psychosomatic paradigms.
28 schema:genre research_article
29 schema:inLanguage en
30 schema:isAccessibleForFree true
31 schema:isPartOf N2bd06c46e8ae4410ae1e48e760be61c2
32 N3d3bdfc67d264b9ca295a7c62b81af97
33 sg:journal.1037697
34 schema:name Specific components of face perception in the human fusiform gyrus studied by tomographic estimates of magnetoencephalographic signals: a tool for the evaluation of non-verbal communication in psychosomatic paradigms
35 schema:pagination 23
36 schema:productId N2c42a5f9577b47358b621e3d14f2fc01
37 N56ede8df359a41b291794bdfdf23e9e4
38 N66eaaa4bb62248738c63a7def1e9f745
39 N9cd62f4fc934430f9bc294f609af4c29
40 Nf1800fb4b05b4f4f8e26c025baf2d4b0
41 schema:sameAs https://app.dimensions.ai/details/publication/pub.1033886433
42 https://doi.org/10.1186/1751-0759-1-23
43 schema:sdDatePublished 2019-04-11T01:07
44 schema:sdLicense https://scigraph.springernature.com/explorer/license/
45 schema:sdPublisher N089b28aaf1eb421d906d63806fac15e3
46 schema:url http://link.springer.com/10.1186%2F1751-0759-1-23
47 sgo:license sg:explorer/license/
48 sgo:sdDataset articles
49 rdf:type schema:ScholarlyArticle
50 N089b28aaf1eb421d906d63806fac15e3 schema:name Springer Nature - SN SciGraph project
51 rdf:type schema:Organization
52 N134b722b81fb4d1791e591d1b164d6a8 rdf:first sg:person.0600434417.84
53 rdf:rest rdf:nil
54 N25a6259cc5584582ac5c5b86c968c2f9 rdf:first sg:person.0744073634.04
55 rdf:rest N134b722b81fb4d1791e591d1b164d6a8
56 N2bd06c46e8ae4410ae1e48e760be61c2 schema:volumeNumber 1
57 rdf:type schema:PublicationVolume
58 N2c42a5f9577b47358b621e3d14f2fc01 schema:name pubmed_id
59 schema:value 18053195
60 rdf:type schema:PropertyValue
61 N3d3bdfc67d264b9ca295a7c62b81af97 schema:issueNumber 1
62 rdf:type schema:PublicationIssue
63 N56ede8df359a41b291794bdfdf23e9e4 schema:name dimensions_id
64 schema:value pub.1033886433
65 rdf:type schema:PropertyValue
66 N66eaaa4bb62248738c63a7def1e9f745 schema:name readcube_id
67 schema:value a16d0c909688a6f1aa84646c3b8f30b626f3590bf1635701a602248bd83a1724
68 rdf:type schema:PropertyValue
69 N9cd62f4fc934430f9bc294f609af4c29 schema:name doi
70 schema:value 10.1186/1751-0759-1-23
71 rdf:type schema:PropertyValue
72 Nf1800fb4b05b4f4f8e26c025baf2d4b0 schema:name nlm_unique_id
73 schema:value 101286572
74 rdf:type schema:PropertyValue
75 anzsrc-for:17 schema:inDefinedTermSet anzsrc-for:
76 schema:name Psychology and Cognitive Sciences
77 rdf:type schema:DefinedTerm
78 anzsrc-for:1701 schema:inDefinedTermSet anzsrc-for:
79 schema:name Psychology
80 rdf:type schema:DefinedTerm
81 sg:journal.1037697 schema:issn 1751-0759
82 schema:name BioPsychoSocial Medicine
83 rdf:type schema:Periodical
84 sg:person.0600434417.84 schema:affiliation https://www.grid.ac/institutes/grid.474690.8
85 schema:familyName Ioannides
86 schema:givenName Andreas A
87 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.0600434417.84
88 rdf:type schema:Person
89 sg:person.0744073634.04 schema:affiliation https://www.grid.ac/institutes/grid.474690.8
90 schema:familyName Okazaki
91 schema:givenName Yuka
92 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.0744073634.04
93 rdf:type schema:Person
94 sg:pub.10.1023/a:1022258620435 schema:sameAs https://app.dimensions.ai/details/publication/pub.1034352216
95 https://doi.org/10.1023/a:1022258620435
96 rdf:type schema:CreativeWork
97 https://doi.org/10.1002/hbm.10043 schema:sameAs https://app.dimensions.ai/details/publication/pub.1025674676
98 rdf:type schema:CreativeWork
99 https://doi.org/10.1016/j.brainres.2006.07.072 schema:sameAs https://app.dimensions.ai/details/publication/pub.1045688556
100 rdf:type schema:CreativeWork
101 https://doi.org/10.1016/j.ijdevneu.2004.12.012 schema:sameAs https://app.dimensions.ai/details/publication/pub.1021923212
102 rdf:type schema:CreativeWork
103 https://doi.org/10.1016/j.jesp.2006.10.023 schema:sameAs https://app.dimensions.ai/details/publication/pub.1011280568
104 rdf:type schema:CreativeWork
105 https://doi.org/10.1016/j.neuroimage.2006.02.009 schema:sameAs https://app.dimensions.ai/details/publication/pub.1025529451
106 rdf:type schema:CreativeWork
107 https://doi.org/10.1016/s0896-6273(00)00168-9 schema:sameAs https://app.dimensions.ai/details/publication/pub.1050668166
108 rdf:type schema:CreativeWork
109 https://doi.org/10.1016/s0926-6410(98)00048-2 schema:sameAs https://app.dimensions.ai/details/publication/pub.1043728763
110 rdf:type schema:CreativeWork
111 https://doi.org/10.1016/s0926-6410(99)00013-0 schema:sameAs https://app.dimensions.ai/details/publication/pub.1005409857
112 rdf:type schema:CreativeWork
113 https://doi.org/10.1073/pnas.96.16.9379 schema:sameAs https://app.dimensions.ai/details/publication/pub.1034957667
114 rdf:type schema:CreativeWork
115 https://doi.org/10.1080/13554790701494964 schema:sameAs https://app.dimensions.ai/details/publication/pub.1047369750
116 rdf:type schema:CreativeWork
117 https://doi.org/10.1088/0266-5611/6/4/005 schema:sameAs https://app.dimensions.ai/details/publication/pub.1041998952
118 rdf:type schema:CreativeWork
119 https://doi.org/10.1093/cercor/10.1.69 schema:sameAs https://app.dimensions.ai/details/publication/pub.1039978488
120 rdf:type schema:CreativeWork
121 https://doi.org/10.1093/cercor/9.5.415 schema:sameAs https://app.dimensions.ai/details/publication/pub.1046286523
122 rdf:type schema:CreativeWork
123 https://doi.org/10.1109/42.759120 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061170758
124 rdf:type schema:CreativeWork
125 https://doi.org/10.1111/j.1460-9568.2005.04181.x schema:sameAs https://app.dimensions.ai/details/publication/pub.1043812789
126 rdf:type schema:CreativeWork
127 https://doi.org/10.1126/science.1063736 schema:sameAs https://app.dimensions.ai/details/publication/pub.1050161855
128 rdf:type schema:CreativeWork
129 https://doi.org/10.1212/01.wnl.0000267842.85646.f2 schema:sameAs https://app.dimensions.ai/details/publication/pub.1064349444
130 rdf:type schema:CreativeWork
131 https://doi.org/10.1523/jneurosci.1091-05.2005 schema:sameAs https://app.dimensions.ai/details/publication/pub.1015109357
132 rdf:type schema:CreativeWork
133 https://doi.org/10.1523/jneurosci.17-11-04302.1997 schema:sameAs https://app.dimensions.ai/details/publication/pub.1083089633
134 rdf:type schema:CreativeWork
135 https://doi.org/10.1523/jneurosci.2621-05.2005 schema:sameAs https://app.dimensions.ai/details/publication/pub.1049120256
136 rdf:type schema:CreativeWork
137 https://www.grid.ac/institutes/grid.474690.8 schema:alternateName RIKEN Center for Brain Science
138 schema:name Department of Brain Science and Engineering, Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology, Kitakyushu-shi, Japan
139 Laboratory for Human Brain Dynamics, Brain Science Institute (BSI), RIKEN, Wako-shi, Japan
140 rdf:type schema:Organization
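The triple and predicate counts in the summary line above the table can be checked against the downloaded data. The sketch below assumes the third-party rdflib package and an N-Triples file saved as record.nt (a hypothetical filename, e.g. produced by the download example earlier on this page).

from rdflib import Graph

# Parse the N-Triples serialization of this record.
g = Graph()
g.parse("record.nt", format="nt")

predicates = {p for _, p, _ in g}

print(len(g), "triples")              # the page reports 140
print(len(predicates), "predicates")  # the page reports 21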
 



