Attention, Perception, & Psychophysics


Ontology type: schema:Periodical     


Journal Info

START YEAR

2009

PUBLISHER

Springer US

LANGUAGE

en

HOMEPAGE

http://link.springer.com/journal/13414

Recent publications (latest 20 shown)

  • 2019-04-12 Feature-based guidance of attention during post-saccadic selection.
  • 2019-04-11 Interactions between speech perception and production during learning of novel phonemic categories.
  • 2019-04-04 Move on up: Fingertip forces and felt heaviness are modulated by the goal of the lift.
  • 2019-04-03 Lexical processing depends on sublexical processing: Evidence from the visual world paradigm and aphasia.
  • 2019-04-01 Long-standing problems in speech perception dissolve within an information-theoretic perspective
  • 2019-04 Modality differences in timing and the filled-duration illusion: Testing the pacemaker rate explanation
  • 2019-04 Searching with and against each other: Spatiotemporal coordination of visual search behavior in collaborative and competitive settings
  • 2019-04 How state anxiety and attentional bias interact with each other: The moderating effect of cognitive appraisal
  • 2019-04 Cognitive control in the cocktail party: Preparing selective attention to dichotically presented voices supports distractor suppression
  • 2019-04 The relations between temporal and social perceptual biases: Evidence from perceptual matching
  • 2019-04 Taking a closer look at visual search: Just how feature-agnostic is singleton detection mode?
  • 2019-04 Categorizing digits and the mental number line
  • 2019-04 Guidance and selection history in hybrid foraging visual search
  • 2019-04 Reading without spaces: The role of precise letter order
  • 2019-04 Separating after-effects of target and distractor processing in the tactile sensory modality
  • 2019-04 Multiple paths to holistic processing: Holistic processing of Gestalt stimuli do not overlap with holistic face processing in the same manner as do objects of expertise
  • 2019-04 Monocular channels have a functional role in phasic alertness and temporal expectancy
  • 2019-04 Dynamic distractor environments reveal classic visual field anisotropies for judgments of temporal order
  • 2019-04 Depth benefits now loading: Visual working memory capacity and benefits in 3-D
  • 2019-04 Implied tactile motion: Localizing dynamic stimulations on the skin

    JSON-LD is the canonical representation for SciGraph data.

    TIP: You can open this SciGraph record in an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).

    [
      {
        "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
        "about": [
          {
            "id": "http://scigraph.springernature.com/ontologies/product-market-codes/Y20060", 
            "inDefinedTermSet": "http://scigraph.springernature.com/ontologies/product-market-codes/", 
            "name": "Cognitive Psychology", 
            "type": "DefinedTerm"
          }
        ], 
        "contentRating": [
          {
            "author": "snip", 
            "ratingValue": "0.79", 
            "type": "Rating"
          }, 
          {
            "author": "sjr", 
            "ratingValue": "0.992", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2017", 
            "ratingValue": "1.678", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2016", 
            "ratingValue": "1.863", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2016", 
            "ratingValue": "1.863", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2015", 
            "ratingValue": "1.782", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2015", 
            "ratingValue": "1.782", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2014", 
            "ratingValue": "2.168", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2014", 
            "ratingValue": "2.168", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2013", 
            "ratingValue": "2.152", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2013", 
            "ratingValue": "2.152", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2012", 
            "ratingValue": "1.969", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2012", 
            "ratingValue": "1.969", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2011", 
            "ratingValue": "2.039", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2011", 
            "ratingValue": "2.039", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2010", 
            "ratingValue": "1.333", 
            "type": "Rating"
          }, 
          {
            "author": "impact_factor_wos", 
            "dateCreated": "2010", 
            "ratingValue": "1.333", 
            "type": "Rating"
          }
        ], 
        "description": "

    SPECIAL ISSUE CALL FOR PAPERS: Time for Action: Reaching for a better understanding of the dynamics of cognition

    A Special Issue of Attention, Perception, & Psychophysics
    Submission deadline:\u00a0 Nov 15, 2018

    Guest Editors:\u00a0
    Joo-Hyun Song, Brown University, and Tim Welsh,\u00a0University of Toronto

    The overarching goal of this special issue is to provide a dedicated space for empirical and review papers that advance the understanding of how cognition and action systems are integrated and operate synergistically. This knowledge of how humans efficiently interact and navigate in complex environments is vital for generating a comprehensive understanding of human behavior and will help shape the design of everyday objects and training and working environments.\u00a0\u00a0

    It is evident that the products of our cognitive processes are expressed through our actions. Historically, the transformation of sensory inputs into action has been treated as a set of relatively unidirectional processing events with the results of low-level sensory and earlier perceptual processes informing higher-order cognitive processes until a decision is made to respond, at which point the action system receives its instructions. Given this compartmentalized approach, it may not be too surprising that there has been relatively little interaction between researchers in cognitive and motor domains. Thus, until recently, a deeper understanding of human behavior has been hindered because little attention has been paid to the broader context of action and how action processes are embedded in the larger canvas of visual attention, memory, learning, decision-making and interpersonal interaction.

    \u00a0

    We seek contributions from researchers across multiple areas, including, but not limited to, psychology, neuroscience, kinesiology, and human-computer interaction, to share and critically evaluate their cutting-edge theoretical, empirical, and translational developments. Submissions could be in the format of empirical pieces reporting new results from original research, targeted reviews, or interesting viewpoints. The special issue will have a broad scope encompassing experimental, theoretical, computational, and clinical studies, as well as methodological approaches. Authors who are interested in submitting a review or viewpoint paper are highly recommended to submit a pre-submission inquiry of approximately one page to the guest editors by August 1, 2018. If you have any questions about a possible submission, please contact one of the guest editors.

    All submissions will undergo normal, full peer review, maintaining the same high editorial standards as regular submissions to Attention, Perception, & Psychophysics. Manuscripts should include a cover letter indicating that the submission is for the special issue, \u201cTime for Action: Reaching for a better understanding of the dynamics of cognition\u201d.

    Because this is a journal special issue, not an edited book, the deadline is firm; our intention is to publish the special issue 6-8 months after the submission deadline. Revisions invited by the guest editors will be expected within two months of receipt of the editorial decision letter and reviews.

    The journal Attention, Perception, & Psychophysics is an official journal of the Psychonomic Society.\u00a0\u00a0 It spans all areas of research in sensory processes, perception, attention, and psychophysics. Most articles published are reports of experimental work; the journal also presents theoretical, integrative, and evaluative reviews.\u00a0 Founded in 1966 as Perception & Psychophysics, the journal assumed its present name in 2009.

    The journal also encourages studies with a neuroscientific perspective that enhance our understanding of attention, perception, and psychophysics.

    • For editorial-related inquiries, please contact Editor-in-Chief\u00a0Mike\u00a0Dodd, Ph.D., at mdodd2@unl.edu
    • For questions concerning ScholarOne's manuscript submission system, please contact journals@psychonomic.org

    Attention, Perception, & Psychophysics, a journal of The Psychonomic Society, is committed to upholding principles of integrity in scientific publishing and practice. As a member of the Committee on Publication Ethics (COPE), the journal will follow COPE guidelines concerning procedures for handling potential acts of professional misconduct.

    ",
        "editor": [
          {
            "familyName": "Dodd",
            "givenName": "Michael D.",
            "type": "Person"
          }
        ],
        "id": "sg:journal.1041037",
        "inLanguage": [
          "en"
        ],
        "isAccessibleForFree": false,
        "issn": [
          "1943-3921",
          "1943-393X"
        ],
        "license": "Hybrid (Open Choice)",
        "name": "Attention, Perception, & Psychophysics",
        "productId": [
          {
            "name": "scopus_id",
            "type": "PropertyValue",
            "value": [
              "17500155126"
            ]
          },
          {
            "name": "wos_id",
            "type": "PropertyValue",
            "value": [
              "1943-3921/ATTENTION PERCEPTION & PSYCHOPHYSICS"
            ]
          },
          {
            "name": "nlm_unique_id",
            "type": "PropertyValue",
            "value": [
              "101495384",
              "0200445"
            ]
          },
          {
            "name": "nsd_ids_id",
            "type": "PropertyValue",
            "value": [
              "475798",
              "341729"
            ]
          },
          {
            "name": "springer_id",
            "type": "PropertyValue",
            "value": [
              "13414"
            ]
          },
          {
            "name": "lccn_id",
            "type": "PropertyValue",
            "value": [
              "2008212295"
            ]
          },
          {
            "name": "dimensions_id",
            "type": "PropertyValue",
            "value": [
              "41037"
            ]
          },
          {
            "name": "era_ids_id",
            "type": "PropertyValue",
            "value": [
              "6549"
            ]
          }
        ],
        "publisher": {
          "name": "Springer US",
          "type": "Organization"
        },
        "publisherImprint": "Springer",
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_source_title=jour.1041037"
        ],
        "sdDataset": "journals",
        "sdDatePublished": "2019-03-18T11:05",
        "sdLicense": "https://scigraph.springernature.com/explorer/license/",
        "sdPublisher": {
          "name": "Springer Nature - SN SciGraph project",
          "type": "Organization"
        },
        "sdSource": "file:///home/ubuntu/piotr/scigraph_export/journals_20190313_sn_only.jsonl",
        "startYear": "2009",
        "type": "Periodical",
        "url": "http://link.springer.com/journal/13414"
      }
    ]

     

    Download the RDF metadata as JSON-LD, N-Triples, Turtle, or RDF/XML.
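
    Once the JSON-LD record above has been downloaded, its fields can be read with any JSON library. A minimal Python sketch (the filename journal.json is an arbitrary assumption for this example):

    import json

    # Read the downloaded SciGraph record (assumed saved as journal.json).
    with open("journal.json") as f:
        records = json.load(f)

    journal = records[0]  # the record is wrapped in a one-element JSON array

    print(journal["name"])       # Attention, Perception, & Psychophysics
    print(journal["issn"])       # ['1943-3921', '1943-393X']
    print(journal["startYear"])  # 2009

    # Most recent Web of Science impact factor among the contentRating entries.
    wos = [r for r in journal["contentRating"]
           if r["author"] == "impact_factor_wos" and "dateCreated" in r]
    latest = max(wos, key=lambda r: int(r["dateCreated"]))
    print(latest["dateCreated"], latest["ratingValue"])  # 2017 1.678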

    HOW TO GET THIS DATA PROGRAMMATICALLY:

    JSON-LD is a popular format for linked data which is fully compatible with JSON.

    curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/journal.1041037'
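
    The same request can be issued from a script. A short Python sketch, assuming the requests package is available, mirroring the Accept header of the curl call above:

    import requests

    # Same content negotiation as the curl call above: ask for JSON-LD.
    url = "https://scigraph.springernature.com/journal.1041037"
    resp = requests.get(url, headers={"Accept": "application/ld+json"})
    resp.raise_for_status()

    record = resp.json()[0]  # the payload is a one-element JSON array, as shown above
    print(record["name"], record["issn"])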

    N-Triples is a line-based linked data format ideal for batch operations.

    curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/journal.1041037'
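
    Because each triple sits on its own line, N-Triples output can be filtered with plain string tools, which is what makes it convenient for batch work. A sketch, assuming the schema: prefix expands to http://schema.org/ in this record:

    import requests

    # Fetch the line-based N-Triples serialization of the record.
    url = "https://scigraph.springernature.com/journal.1041037"
    nt = requests.get(url, headers={"Accept": "application/n-triples"}).text

    # One triple per line, so plain string filtering is enough for batch jobs;
    # the http://schema.org/ predicate IRI is an assumed expansion of schema:name.
    for line in nt.splitlines():
        if "<http://schema.org/name>" in line:
            print(line)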

    Turtle is a human-readable linked data format.

    curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/journal.1041037'

    RDF/XML is a standard XML format for linked data.

    curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/journal.1041037'
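
    Any of these serializations can also be loaded into an RDF library and queried as a graph. A sketch using rdflib (assumed installed; the http://schema.org/ expansion of the schema: prefix is likewise an assumption):

    import requests
    from rdflib import Graph, Namespace

    # Fetch the RDF/XML serialization (as in the curl call above) and parse it
    # into an rdflib graph so the triples can be queried directly.
    url = "https://scigraph.springernature.com/journal.1041037"
    xml = requests.get(url, headers={"Accept": "application/rdf+xml"}).text

    g = Graph()
    g.parse(data=xml, format="xml")

    # Look up the journal's ISSNs; http://schema.org/ is assumed to be the
    # namespace behind the schema: prefix used in this record.
    SCHEMA = Namespace("http://schema.org/")
    for issn in g.objects(predicate=SCHEMA.issn):
        print(issn)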


     

    This table displays all metadata directly associated with this object as RDF triples.

    183 TRIPLES      21 PREDICATES      46 URIs      41 LITERALS      28 BLANK NODES

    Subject Predicate Object
    1 sg:journal.1041037 schema:about sg:ontologies/product-market-codes/Y20060
    2 schema:contentRating N0ad57a4fd90c4db6947ec0f5f1af0aa8
    3 N21df026538674bb19c57ad1fc6a8c972
    4 N2f9e11b604da4a53ab2e3340efe7631b
    5 N30e6e5244bcc4ebd851d5745f63a3461
    6 N314299ae5c8f40b9aa98f3b4043905f2
    7 N4138a63ab778408a9917ac510f43dc7f
    8 N4777f5317bee49d4b2beae7c452ba871
    9 N51f6ed7c9c9a4b86a473f5e9e4c8c907
    10 N56db2497da2842bb968d63604fd55b81
    11 N68bb42008dc54de7ad1a459b49803df8
    12 Na1a7eebda8ca479cab074b92a8a92e4c
    13 Naac7f4e2d9194f2e99045ee3c7da75a1
    14 Nd95b89f5b1b24c32b013cc8665ea41fa
    15 Ne257c7f21c3d42c3850d5f524dd4cff1
    16 Ne592377ddbad4658af59d5aebab4eccd
    17 Nec206ccdff0c4717b1567afda38ef859
    18 Neffc68012ec14b4ea0b24b97834b55d1
    19 schema:description <p><b></b></p><p><b>SPECIAL ISSUE CALL FOR PAPERS:</b> <b>Time for Action: Reaching for a better understanding of the dynamics of cognition</b></p><p>A Special Issue of<b><i> Attention, Perception, &amp; Psychophysics<br/>Submission deadline:  Nov 15, 2018<br/><br/>Guest Editors:  </i></b>Joo-Hyun Song, Brown University, and Tim Welsh, University of Toronto<br/><br/>The overarching goal of this special issue is to provide a dedicated space for empirical and review papers that advance the understanding of how cognition and action systems are integrated and operate synergistically. This knowledge of how humans efficiently interact and navigate in complex environments is vital for generating a comprehensive understanding of human behavior and will help shape the design of everyday objects and training and working environments.  </p><p>It is evident that the products of our cognitive processes are expressed through our actions. Historically, the transformation of sensory inputs into action has been treated as a set of relatively unidirectional processing events with the results of low-level sensory and earlier perceptual processes informing higher-order cognitive processes until a decision is made to respond, at which point the action system receives its instructions. Given this compartmentalized approach, it may not be too surprising that there has been relatively little interaction between researchers in cognitive and motor domains. Thus, until recently, a deeper understanding of human behavior has been hindered because little attention has been paid to the broader context of action and how action processes are embedded in the larger canvas of visual attention, memory, learning, decision-making and interpersonal interaction. </p><p> </p><p>We seek contributions from researchers across multiple areas, including but not limited to, psychology, neuroscience, kinesiology, and human-computer interactions, to share and critically evaluate their cutting-edge theoretical, empirical, and translational developments. Submissions could be in the format of empirical pieces reporting new results from original research, targeted reviews, or interesting viewpoints. The special issue will have a broad scope encompassing experimental, theoretical, computational, and clinical studies, as well as methodological approaches. Authors who are interested in submitting a review or viewpoint paper are highly recommend to submit a pre-submission inquiry of approximately 1 page to the guest editors by August 1, 2018. If you have any question about a possible submission, please contact one of the guest editors. </p><p>All submissions will undergo normal, full peer review, maintaining the same high editorial standards for regular submissions to <i>Attention, Perception, &amp; Psychophysics</i>. <i>Manuscripts should include a cover letter indicating that the submission is for the special issue, “</i><i>Time for Action: </i><i>Reaching for a better understanding of the dynamics of cognition”. <br/><br/></i></p><p>Because this is a journal special issue, not an edited book, the deadline is firm; our intention is to publish the special issue 6-8 months after the submission deadline. Revisions invited by the guest editors will be expected within two months of receipt of the editorial decision letter and reviews.</p><p/><p/><p>The journal <i>Attention, Perception, &amp; Psychophysics</i> is an official journal of the Psychonomic Society.   
It spans all areas of research in sensory processes, perception, attention, and psychophysics. Most articles published are reports of experimental work; the journal also presents theoretical, integrative, and evaluative reviews.  Founded in 1966 as <i>Perception &amp; Psychophysics</i>, the journal assumed its present name in 2009.</p><p>The journal also encourages studies with a neuroscientific perspective that enhance our understanding of attention, perception, and psychophysics. </p><p/><ul><li>For editorial-related inquiries, please contact Editor-in-Chief Mike Dodd, Ph.D, at mdodd2@unl.edu</li><li>For questions concerning ScholarOne's manuscript submission system, please contact journals@psychonomic.org </li></ul><p><i>Attention Perception &amp; Psychophysics</i> , a journal of The Psychonomic Society, is committed to upholding principles of integrity in scientific publishing and practice. As a member of the Committee on Publication Ethics (COPE), the journal will follow COPE guidelines concerning procedures for handling potential acts of professional misconduct. </p><p/><p/>
    20 schema:editor N3a284089eac54e94906266359336074d
    21 schema:inLanguage en
    22 schema:isAccessibleForFree false
    23 schema:issn 1943-3921
    24 1943-393X
    25 schema:license Hybrid (Open Choice)
    26 schema:name Attention, Perception, & Psychophysics
    27 schema:productId N2f5752c3ba3f4ccea38694d199744a5c
    28 N6ad2d320eff34b6788529183833b09d7
    29 N859b6b76f72b466087ce06c93c46275a
    30 N8f6b6f50dffc490b9f93efb01053d4bd
    31 Naf223002594a472a9dcdc9983a339b52
    32 Nc075278834724d059d032dd81a62fb44
    33 Nc0c08f9968ff4fdbabee6e97b1645e3e
    34 Nd5ffadbd8df84ce0b9f0bfc37c8df55f
    35 schema:publisher N0c8dfa81435f4230977bc08251cf2468
    36 schema:publisherImprint Springer
    37 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_source_title=jour.1041037
    38 schema:sdDatePublished 2019-03-18T11:05
    39 schema:sdLicense https://scigraph.springernature.com/explorer/license/
    40 schema:sdPublisher Nb650edeff1a440aaa76e35b77a2b083c
    41 schema:startYear 2009
    42 schema:url http://link.springer.com/journal/13414
    43 sgo:license sg:explorer/license/
    44 sgo:sdDataset journals
    45 rdf:type schema:Periodical
    46 N06e45e489b3f40b78ff020a3aabb8030 rdf:first snip
    47 rdf:rest rdf:nil
    48 N0ad57a4fd90c4db6947ec0f5f1af0aa8 schema:author N2ac608177f04416eb11395c20ba9fe9a
    49 schema:dateCreated 2015
    50 schema:ratingValue 1.782
    51 rdf:type schema:Rating
    52 N0c8dfa81435f4230977bc08251cf2468 schema:name Springer US
    53 rdf:type schema:Organization
    54 N21df026538674bb19c57ad1fc6a8c972 schema:author N5072de0521854e8a8091d0ddf87f2a6a
    55 schema:dateCreated 2010
    56 schema:ratingValue 1.333
    57 rdf:type schema:Rating
    58 N2ac608177f04416eb11395c20ba9fe9a rdf:first impact_factor_wos
    59 rdf:rest rdf:nil
    60 N2efcdbbacf6c4d408d3bbf223b39b947 rdf:first impact_factor_wos
    61 rdf:rest rdf:nil
    62 N2f5752c3ba3f4ccea38694d199744a5c schema:name springer_id
    63 schema:value 13414
    64 rdf:type schema:PropertyValue
    65 N2f9e11b604da4a53ab2e3340efe7631b schema:author Nca152802f03a4770ac252f9882b30634
    66 schema:dateCreated 2014
    67 schema:ratingValue 2.168
    68 rdf:type schema:Rating
    69 N30e6e5244bcc4ebd851d5745f63a3461 schema:author N2efcdbbacf6c4d408d3bbf223b39b947
    70 schema:dateCreated 2012
    71 schema:ratingValue 1.969
    72 rdf:type schema:Rating
    73 N314299ae5c8f40b9aa98f3b4043905f2 schema:author N3b020dbd96294466a38f62b79f41b191
    74 schema:dateCreated 2014
    75 schema:ratingValue 2.168
    76 rdf:type schema:Rating
    77 N38f92e45ad5b47758c98fd8ec0406566 rdf:first impact_factor_wos
    78 rdf:rest rdf:nil
    79 N3a284089eac54e94906266359336074d rdf:first Nfdc6545a405a4f8c84cf652eeb077c57
    80 rdf:rest rdf:nil
    81 N3b020dbd96294466a38f62b79f41b191 rdf:first impact_factor_wos
    82 rdf:rest rdf:nil
    83 N4138a63ab778408a9917ac510f43dc7f schema:author N8c5e885d29f84ee59daec51822aa4be7
    84 schema:dateCreated 2015
    85 schema:ratingValue 1.782
    86 rdf:type schema:Rating
    87 N4777f5317bee49d4b2beae7c452ba871 schema:author N96fa640bfefb4552826b3d07484f6591
    88 schema:dateCreated 2013
    89 schema:ratingValue 2.152
    90 rdf:type schema:Rating
    91 N5072de0521854e8a8091d0ddf87f2a6a rdf:first impact_factor_wos
    92 rdf:rest rdf:nil
    93 N51f6ed7c9c9a4b86a473f5e9e4c8c907 schema:author N800591f897684ffab02cb3ee9d2d9a4b
    94 schema:dateCreated 2016
    95 schema:ratingValue 1.863
    96 rdf:type schema:Rating
    97 N56db2497da2842bb968d63604fd55b81 schema:author N7087b893dbbe4422be0098e47fbcdc35
    98 schema:dateCreated 2013
    99 schema:ratingValue 2.152
    100 rdf:type schema:Rating
    101 N68bb42008dc54de7ad1a459b49803df8 schema:author N8b27f944079c4f5f93ee38e40e51cc2b
    102 schema:dateCreated 2010
    103 schema:ratingValue 1.333
    104 rdf:type schema:Rating
    105 N6a927b20cfb24e78b3b60f336bbda33b rdf:first impact_factor_wos
    106 rdf:rest rdf:nil
    107 N6ad2d320eff34b6788529183833b09d7 schema:name lccn_id
    108 schema:value 2008212295
    109 rdf:type schema:PropertyValue
    110 N7087b893dbbe4422be0098e47fbcdc35 rdf:first impact_factor_wos
    111 rdf:rest rdf:nil
    112 N71faa4729e704580823b230f5b342de7 rdf:first impact_factor_wos
    113 rdf:rest rdf:nil
    114 N800591f897684ffab02cb3ee9d2d9a4b rdf:first impact_factor_wos
    115 rdf:rest rdf:nil
    116 N859b6b76f72b466087ce06c93c46275a schema:name scopus_id
    117 schema:value 17500155126
    118 rdf:type schema:PropertyValue
    119 N8b27f944079c4f5f93ee38e40e51cc2b rdf:first impact_factor_wos
    120 rdf:rest rdf:nil
    121 N8c5e885d29f84ee59daec51822aa4be7 rdf:first impact_factor_wos
    122 rdf:rest rdf:nil
    123 N8f6b6f50dffc490b9f93efb01053d4bd schema:name nlm_unique_id
    124 schema:value 0200445
    125 101495384
    126 rdf:type schema:PropertyValue
    127 N96fa640bfefb4552826b3d07484f6591 rdf:first impact_factor_wos
    128 rdf:rest rdf:nil
    129 Na1a7eebda8ca479cab074b92a8a92e4c schema:author Nedf3f8e54e4643028a821f5014b0a87f
    130 schema:dateCreated 2017
    131 schema:ratingValue 1.678
    132 rdf:type schema:Rating
    133 Naac7f4e2d9194f2e99045ee3c7da75a1 schema:author N06e45e489b3f40b78ff020a3aabb8030
    134 schema:ratingValue 0.79
    135 rdf:type schema:Rating
    136 Naf223002594a472a9dcdc9983a339b52 schema:name nsd_ids_id
    137 schema:value 341729
    138 475798
    139 rdf:type schema:PropertyValue
    140 Nb650edeff1a440aaa76e35b77a2b083c schema:name Springer Nature - SN SciGraph project
    141 rdf:type schema:Organization
    142 Nc075278834724d059d032dd81a62fb44 schema:name wos_id
    143 schema:value 1943-3921/ATTENTION PERCEPTION & PSYCHOPHYSICS
    144 rdf:type schema:PropertyValue
    145 Nc0c08f9968ff4fdbabee6e97b1645e3e schema:name era_ids_id
    146 schema:value 6549
    147 rdf:type schema:PropertyValue
    148 Nca152802f03a4770ac252f9882b30634 rdf:first impact_factor_wos
    149 rdf:rest rdf:nil
    150 Nd5ffadbd8df84ce0b9f0bfc37c8df55f schema:name dimensions_id
    151 schema:value 41037
    152 rdf:type schema:PropertyValue
    153 Nd95b89f5b1b24c32b013cc8665ea41fa schema:author N71faa4729e704580823b230f5b342de7
    154 schema:dateCreated 2011
    155 schema:ratingValue 2.039
    156 rdf:type schema:Rating
    157 Ne257c7f21c3d42c3850d5f524dd4cff1 schema:author N6a927b20cfb24e78b3b60f336bbda33b
    158 schema:dateCreated 2012
    159 schema:ratingValue 1.969
    160 rdf:type schema:Rating
    161 Ne592377ddbad4658af59d5aebab4eccd schema:author Nfc1827af7eab4eb793307959cc5f7d31
    162 schema:dateCreated 2016
    163 schema:ratingValue 1.863
    164 rdf:type schema:Rating
    165 Ne9c21c64d61941c0851f478a94380c0f rdf:first sjr
    166 rdf:rest rdf:nil
    167 Nec206ccdff0c4717b1567afda38ef859 schema:author Ne9c21c64d61941c0851f478a94380c0f
    168 schema:ratingValue 0.992
    169 rdf:type schema:Rating
    170 Nedf3f8e54e4643028a821f5014b0a87f rdf:first impact_factor_wos
    171 rdf:rest rdf:nil
    172 Neffc68012ec14b4ea0b24b97834b55d1 schema:author N38f92e45ad5b47758c98fd8ec0406566
    173 schema:dateCreated 2011
    174 schema:ratingValue 2.039
    175 rdf:type schema:Rating
    176 Nfc1827af7eab4eb793307959cc5f7d31 rdf:first impact_factor_wos
    177 rdf:rest rdf:nil
    178 Nfdc6545a405a4f8c84cf652eeb077c57 schema:familyName Dodd
    179 schema:givenName Michael D.
    180 rdf:type schema:Person
    181 sg:ontologies/product-market-codes/Y20060 schema:inDefinedTermSet sg:ontologies/product-market-codes/
    182 schema:name Cognitive Psychology
    183 rdf:type schema:DefinedTerm
     



