Terapia de Lenguaje, Sensorial y Motora, Basada en Realidad Virtual (Language, Sensory and Motor Therapy Based on Virtual Reality)


Ontology type: schema:MedicalStudy     


Clinical Trial Info

YEARS

2016-2018

ABSTRACT

The purpose of this study is to determine whether a VR-based language rehabilitation scenario built on the core premises of ILAT has a beneficial effect on the linguistic performance (faster retrieval of the target lexicon and greater overall fluency) of patients with Broca's aphasia. It further aims to test the effects of cueing (visual and auditory) on word retrieval.

Detailed Description

Acquired brain lesions such as stroke are among the most common causes of disabling neurological damage (Carter et al., 2012). 35-40% of stroke patients suffer serious language deficits and are frequently left with chronic disabilities that adversely impact their quality of life, so the need for efficient rehabilitation methods keeps growing. Recent studies show that Broca's area and the premotor cortex are anatomically coupled (Pulvermuller 2005), suggesting that for a therapy to be effective there must be an interaction in the brain between the linguistic neural system, motor and sensory circuits, memory, planning, and monitoring (Kurland et al., 2012). These hypotheses led to the establishment of the so-called Intensive Language-Action Therapy (ILAT) (Pulvermuller 2012), which promotes motor movement during language practice. ILAT is thus an action-embedded language therapy grounded in three main principles: intense practice, overcoming learned non-use, and promoting motor actions (no compensations). Recently, a number of studies have examined the functionality of virtual-reality-based rehabilitation systems aimed at post-stroke motor recovery of the upper extremities (Boian et al., 2002; Cameirão, Badia, Oller, & Verschure, 2010; Jack et al., 2001; Saposnik et al., 2010). The goal of the present study is to further validate a VR-based language rehabilitation system built on the core principles of ILAT and implemented within the environment of the Rehabilitation Gaming System (RGS). An additional goal is to investigate the effects of cueing on word retrieval. Conduction and Broca's aphasics have been shown to exhibit the highest responsiveness to cueing (Li & Williams 1989). To overcome the resulting disturbances in word-retrieval mechanisms, a number of cueing methods have been established that improve both immediate and long-term lexical access (Howard 2000). Both semantic and phonemic cues act as primes and are usually administered by the therapist in written or oral form, containing phonological, semantic, or syntactic information about the target word (Howard et al. 1985; Howard 2000). Here, the investigators will supplement the system with videos showing the lip motion representative of the correct pronunciation of the target words, as well as a representative sound (e.g., a barking sound in the case of a dog). The investigators expect that the proposed system will be effective in treating patients with chronic post-stroke Broca's aphasia, as assessed by standard scales such as the Boston Naming Test and the Communicative Activity Log.

URL

https://clinicaltrials.gov/show/NCT02928822

Related SciGraph Publications

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record in an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool (SDTT).

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/3120", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "type": "DefinedTerm"
      }
    ], 
    "description": "The purpose of this study is to determine whether VR based language rehabilitation scenario based on the core premises of ILAT has a beneficial effect on the linguistic performance (faster retrieval of the target lexicon and general fluency) of Broca's aphasia patients. Furthermore, it aims at testing the effects of cueing (visual and auditory) on word retrieval.\n\nDetailed Description\nAcquired brain lesions such as stroke often result the most common disabling neurological damages (Carter et al, 2012). 35-40% of stroke patients suffer serious language deficits and patients are frequently left with chronic disabilities which adversely impact their quality of life. Thus, the need for efficient rehabilitation methods increases. Recent studies show that Broca's area and the premotor cortex are anatomically coupled (Pulvermuller 2005) suggesting that for a therapy to be effective, in the brain there must be an interaction between linguistic neural system, motor and sensory circuits, memory, planning and monitoring (Kurland et al, 2012). These hypotheses led to the establishment of the so-called Intensive Language-Action Therapy (ILAT) (Pulvermuller 2012) which promotes motor movement during language practice. Thus, ILAT is an action-embedded language therapy grounded in three main principles: intense practice, overcoming learned non-use, and promoting motor actions (no compensations). Recently, a number of studies examined the functionality of virtual reality based rehabilitation systems that aim at post stroke motor recovery of upper extremities (Boian et al., 2002; Cameir\u00e3o, Badia, Oller, & Verschure, 2010; Jack et al., 2001; Saposnik et al., 2010). In the present study, the goal is to further validate VR based language rehabilitation system based on the core principles of ILAT implemented within the environment of the rehabilitation Gaming System (RGS). Additionally, the goal is to investigate the effects of cueing on word retrieval. It was shown that conduction and Broca's aphasics exhibit the highest responsiveness to cueing (Li & Williams 1989). In order to overcome subsequent disturbances in word retrieval mechanisms, a number of cueing methods have been established to improve both the immediate and long term lexical access (Howard 2000). Both semantic and phonemic cues act as primes and are usually administered by the therapist in a written or oral manner containing phonological, semantic or syntactic information about the target word (Howard et al. 1985, Howard2000). Here, the investigators will implement the system with videos representing the lip motion representative for a correct pronunciation of the target words, as well as a representative sound (i.e. barking sound in case of dog). The investigators expect that the proposed system will be efficient in treating post stroke chronic Broca's aphasia patients according to the standard scales such as Boston Naming Test and Communicative Activity Log.", 
    "endDate": "2018-04-01T00:00:00Z", 
    "id": "sg:clinicaltrial.NCT02928822", 
    "keywords": [
      "de", 
      "en", 
      "rehabilitation", 
      "premise", 
      "beneficial effect", 
      "linguistics", 
      "retrieval", 
      "lexicon", 
      "fluency", 
      "Broca's", 
      "aphasia", 
      "auditory", 
      "word retrieval", 
      "brain lesion", 
      "stroke", 
      "neurological damage", 
      "Carter", 
      "stroke patient", 
      "language deficit", 
      "patient", 
      "chronic disability", 
      "life", 
      "Recent study", 
      "premotor cortex", 
      "therapy", 
      "brain", 
      "neural system", 
      "motor", 
      "sensory circuit", 
      "memory", 
      "hypothesise", 
      "establishment", 
      "language", 
      "motor movement", 
      "Language Therapy", 
      "main principle", 
      "practice", 
      "non-use", 
      "motor action", 
      "Compensation and Redress", 
      "functionality", 
      "virtual reality", 
      "motor recovery", 
      "Upper Extremity", 
      "Jack", 
      "present study", 
      "core principle", 
      "environment", 
      "conduction", 
      "aphasic", 
      "responsiveness", 
      "Li", 
      "disturbance", 
      "method", 
      "lexical access", 
      "semantics", 
      "therapist", 
      "syntactic information", 
      "target word", 
      "Howard", 
      "video", 
      "motion", 
      "pronunciation", 
      "dog", 
      "scale", 
      "Boston", 
      "log"
    ], 
    "name": "Terapia de Lenguaje, Sensorial y Motora, Basada en Realidad Virtual", 
    "sameAs": [
      "https://app.dimensions.ai/details/clinical_trial/NCT02928822"
    ], 
    "sdDataset": "clinical_trials", 
    "sdDatePublished": "2019-03-07T15:26", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "file:///pack/app/us_ct_data_00022.json", 
    "sponsor": [
      {
        "id": "https://www.grid.ac/institutes/grid.411435.6", 
        "type": "Organization"
      }, 
      {
        "id": "https://www.grid.ac/institutes/grid.5612.0", 
        "type": "Organization"
      }
    ], 
    "startDate": "2016-07-01T00:00:00Z", 
    "subjectOf": [
      {
        "id": "https://doi.org/10.1016/j.neuroimage.2012.02.070", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1014687349"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/s10548-014-0398-y", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1018107685", 
          "https://doi.org/10.1007/s10548-014-0398-y"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1038/nrn1706", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1020661201", 
          "https://doi.org/10.1038/nrn1706"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1186/1743-0003-7-48", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1022258938", 
          "https://doi.org/10.1186/1743-0003-7-48"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1016/j.jcomdis.2015.01.003", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1023400920"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1093/brain/awv022", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1030242522"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1161/strokeaha.110.584979", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1030389598"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1044/1058-0360(2012/11-0113)", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1044039677"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/7333.948460", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061220000"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://app.dimensions.ai/details/publication/pub.1076944954", 
        "type": "CreativeWork"
      }
    ], 
    "type": "MedicalStudy", 
    "url": "https://clinicaltrials.gov/show/NCT02928822"
  }
]
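
The JSON-LD above is plain JSON, so it can be inspected with ordinary JSON tooling. Below is a minimal Python sketch, assuming the record has been saved locally to a file named NCT02928822.json (a hypothetical filename, not part of the SciGraph record); it prints the trial name, dates, a few keywords, and the identifiers of the linked publications.

import json

# Hypothetical local copy of the JSON-LD record shown above.
with open("NCT02928822.json", encoding="utf-8") as f:
    records = json.load(f)   # the document is a JSON array with one record

trial = records[0]
print("Name:     ", trial["name"])
print("Trial ID: ", trial["id"])
print("Dates:    ", trial["startDate"], "to", trial["endDate"])
print("Keywords: ", ", ".join(trial.get("keywords", [])[:10]), "...")

# Publications the trial is the subject of (sg: IDs and DOIs).
for work in trial.get("subjectOf", []):
    print("Related work:", work["id"])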
 

Download the RDF metadata as: JSON-LD, N-Triples (nt), Turtle, or RDF/XML. License info: https://scigraph.springernature.com/explorer/license/

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/clinicaltrial.NCT02928822'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/clinicaltrial.NCT02928822'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/clinicaltrial.NCT02928822'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/clinicaltrial.NCT02928822'
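
The same content negotiation can be scripted. Below is a minimal Python sketch using the third-party requests library (an illustration, not part of the SciGraph documentation); it mirrors the curl commands above by requesting each serialization via the Accept header.

import requests

URL = "https://scigraph.springernature.com/clinicaltrial.NCT02928822"

# MIME types correspond to the curl examples above.
FORMATS = {
    "json-ld":   "application/ld+json",
    "n-triples": "application/n-triples",
    "turtle":    "text/turtle",
    "rdf/xml":   "application/rdf+xml",
}

for label, mime in FORMATS.items():
    resp = requests.get(URL, headers={"Accept": mime}, timeout=30)
    resp.raise_for_status()
    print(f"--- {label} ({mime}), {len(resp.text)} characters ---")
    print(resp.text[:200])  # preview the start of each serialization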


 

This table displays all metadata directly associated to this object as RDF triples.

120 TRIPLES      16 PREDICATES      93 URIs      76 LITERALS      1 BLANK NODE

Subject Predicate Object
1 sg:clinicaltrial.NCT02928822 schema:about anzsrc-for:3120
2 schema:description The purpose of this study is to determine whether VR based language rehabilitation scenario based on the core premises of ILAT has a beneficial effect on the linguistic performance (faster retrieval of the target lexicon and general fluency) of Broca's aphasia patients. Furthermore, it aims at testing the effects of cueing (visual and auditory) on word retrieval. Detailed Description Acquired brain lesions such as stroke often result the most common disabling neurological damages (Carter et al, 2012). 35-40% of stroke patients suffer serious language deficits and patients are frequently left with chronic disabilities which adversely impact their quality of life. Thus, the need for efficient rehabilitation methods increases. Recent studies show that Broca's area and the premotor cortex are anatomically coupled (Pulvermuller 2005) suggesting that for a therapy to be effective, in the brain there must be an interaction between linguistic neural system, motor and sensory circuits, memory, planning and monitoring (Kurland et al, 2012). These hypotheses led to the establishment of the so-called Intensive Language-Action Therapy (ILAT) (Pulvermuller 2012) which promotes motor movement during language practice. Thus, ILAT is an action-embedded language therapy grounded in three main principles: intense practice, overcoming learned non-use, and promoting motor actions (no compensations). Recently, a number of studies examined the functionality of virtual reality based rehabilitation systems that aim at post stroke motor recovery of upper extremities (Boian et al., 2002; Cameirão, Badia, Oller, & Verschure, 2010; Jack et al., 2001; Saposnik et al., 2010). In the present study, the goal is to further validate VR based language rehabilitation system based on the core principles of ILAT implemented within the environment of the rehabilitation Gaming System (RGS). Additionally, the goal is to investigate the effects of cueing on word retrieval. It was shown that conduction and Broca's aphasics exhibit the highest responsiveness to cueing (Li & Williams 1989). In order to overcome subsequent disturbances in word retrieval mechanisms, a number of cueing methods have been established to improve both the immediate and long term lexical access (Howard 2000). Both semantic and phonemic cues act as primes and are usually administered by the therapist in a written or oral manner containing phonological, semantic or syntactic information about the target word (Howard et al. 1985, Howard2000). Here, the investigators will implement the system with videos representing the lip motion representative for a correct pronunciation of the target words, as well as a representative sound (i.e. barking sound in case of dog). The investigators expect that the proposed system will be efficient in treating post stroke chronic Broca's aphasia patients according to the standard scales such as Boston Naming Test and Communicative Activity Log.
3 schema:endDate 2018-04-01T00:00:00Z
4 schema:keywords Boston
5 Broca's
6 Carter
7 Compensation and Redress
8 Howard
9 Jack
10 Language Therapy
11 Li
12 Recent study
13 Upper Extremity
14 aphasia
15 aphasic
16 auditory
17 beneficial effect
18 brain
19 brain lesion
20 chronic disability
21 conduction
22 core principle
23 de
24 disturbance
25 dog
26 en
27 environment
28 establishment
29 fluency
30 functionality
31 hypothesise
32 language
33 language deficit
34 lexical access
35 lexicon
36 life
37 linguistics
38 log
39 main principle
40 memory
41 method
42 motion
43 motor
44 motor action
45 motor movement
46 motor recovery
47 neural system
48 neurological damage
49 non-use
50 patient
51 practice
52 premise
53 premotor cortex
54 present study
55 pronunciation
56 rehabilitation
57 responsiveness
58 retrieval
59 scale
60 semantics
61 sensory circuit
62 stroke
63 stroke patient
64 syntactic information
65 target word
66 therapist
67 therapy
68 video
69 virtual reality
70 word retrieval
71 schema:name Terapia de Lenguaje, Sensorial y Motora, Basada en Realidad Virtual
72 schema:sameAs https://app.dimensions.ai/details/clinical_trial/NCT02928822
73 schema:sdDatePublished 2019-03-07T15:26
74 schema:sdLicense https://scigraph.springernature.com/explorer/license/
75 schema:sdPublisher N8aaba7124777445a9a663b9432b4d435
76 schema:sponsor https://www.grid.ac/institutes/grid.411435.6
77 https://www.grid.ac/institutes/grid.5612.0
78 schema:startDate 2016-07-01T00:00:00Z
79 schema:subjectOf sg:pub.10.1007/s10548-014-0398-y
80 sg:pub.10.1038/nrn1706
81 sg:pub.10.1186/1743-0003-7-48
82 https://app.dimensions.ai/details/publication/pub.1076944954
83 https://doi.org/10.1016/j.jcomdis.2015.01.003
84 https://doi.org/10.1016/j.neuroimage.2012.02.070
85 https://doi.org/10.1044/1058-0360(2012/11-0113)
86 https://doi.org/10.1093/brain/awv022
87 https://doi.org/10.1109/7333.948460
88 https://doi.org/10.1161/strokeaha.110.584979
89 schema:url https://clinicaltrials.gov/show/NCT02928822
90 sgo:license sg:explorer/license/
91 sgo:sdDataset clinical_trials
92 rdf:type schema:MedicalStudy
93 N8aaba7124777445a9a663b9432b4d435 schema:name Springer Nature - SN SciGraph project
94 rdf:type schema:Organization
95 anzsrc-for:3120 schema:inDefinedTermSet anzsrc-for:
96 rdf:type schema:DefinedTerm
97 sg:pub.10.1007/s10548-014-0398-y schema:sameAs https://app.dimensions.ai/details/publication/pub.1018107685
98 https://doi.org/10.1007/s10548-014-0398-y
99 rdf:type schema:CreativeWork
100 sg:pub.10.1038/nrn1706 schema:sameAs https://app.dimensions.ai/details/publication/pub.1020661201
101 https://doi.org/10.1038/nrn1706
102 rdf:type schema:CreativeWork
103 sg:pub.10.1186/1743-0003-7-48 schema:sameAs https://app.dimensions.ai/details/publication/pub.1022258938
104 https://doi.org/10.1186/1743-0003-7-48
105 rdf:type schema:CreativeWork
106 https://app.dimensions.ai/details/publication/pub.1076944954 schema:CreativeWork
107 https://doi.org/10.1016/j.jcomdis.2015.01.003 schema:sameAs https://app.dimensions.ai/details/publication/pub.1023400920
108 rdf:type schema:CreativeWork
109 https://doi.org/10.1016/j.neuroimage.2012.02.070 schema:sameAs https://app.dimensions.ai/details/publication/pub.1014687349
110 rdf:type schema:CreativeWork
111 https://doi.org/10.1044/1058-0360(2012/11-0113) schema:sameAs https://app.dimensions.ai/details/publication/pub.1044039677
112 rdf:type schema:CreativeWork
113 https://doi.org/10.1093/brain/awv022 schema:sameAs https://app.dimensions.ai/details/publication/pub.1030242522
114 rdf:type schema:CreativeWork
115 https://doi.org/10.1109/7333.948460 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061220000
116 rdf:type schema:CreativeWork
117 https://doi.org/10.1161/strokeaha.110.584979 schema:sameAs https://app.dimensions.ai/details/publication/pub.1030389598
118 rdf:type schema:CreativeWork
119 https://www.grid.ac/institutes/grid.411435.6 schema:Organization
120 https://www.grid.ac/institutes/grid.5612.0 schema:Organization
 



