Emotion Recognition System by Gesture Analysis Using Fuzzy Sets


Ontology type: schema:Chapter     


Chapter Info

DATE

2013

AUTHORS

Reshma Kar , Aruna Chakraborty , Amit Konar , Ramadoss Janarthanan

ABSTRACT

Gestures have been called a leaky source of emotional information. They are also easy to capture from a distance with ordinary cameras, and thus, as many would agree, they are an important clue to a person's emotional state. In this paper we recognize a person's emotions by analyzing gestural information alone. Subjects are first trained by a professional actor to perform emotionally expressive gestures. The same actor trained the system to recognize the emotional context of gestures. Finally, the gestural performances of the subjects are evaluated by the system to identify the class of emotion indicated. Our system yields an accuracy of 94.4% with a training set of only one gesture per emotion, and it is also computationally efficient. Our work analyzes emotions from gestures alone, which is a significant step towards reducing the cost of emotion recognition. It may be noted that the system can also be used for general gesture recognition. We propose new features and a new classification approach using fuzzy sets. We achieve state-of-the-art accuracy with minimal complexity, as each motion trajectory along each axis generates only 4 displacement features. Each axis generates a trajectory, and only 6 joint trajectories among all joint trajectories are compared. The 6 motion trajectories are selected on the basis of maximum motion, since the regions that move the most carry the most information about a gesture. The experiments were performed on data obtained from Microsoft Kinect sensors. Training and testing were independent of subject gender.
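
The abstract sketches the core of the method: pick the joint trajectories that move the most, then summarize each axis of each selected trajectory with 4 displacement features before fuzzy classification. The following is only a minimal illustrative Python/NumPy sketch of that idea, not the authors' implementation; the particular displacement features used here (net displacement, total path length, largest and mean per-frame displacement) are assumptions, since the paper's exact definitions are not reproduced on this page.

import numpy as np

def select_top_joints(joints, k=6):
    """joints: array of shape (frames, n_joints, 3) of joint positions over time.
    Returns the indices of the k joints with the largest total frame-to-frame motion."""
    motion = np.abs(np.diff(joints, axis=0)).sum(axis=(0, 2))   # total motion per joint
    return np.argsort(motion)[-k:]                              # k most-moving joints

def displacement_features(trajectory):
    """Four assumed displacement features for one 1-D (single-axis) trajectory."""
    steps = np.diff(trajectory)
    return np.array([
        trajectory[-1] - trajectory[0],   # net displacement
        np.abs(steps).sum(),              # total path length
        np.abs(steps).max(),              # largest single-frame displacement
        np.abs(steps).mean(),             # mean single-frame displacement
    ])

# Example: 120 frames of 20 Kinect joints -> 6 joints x 3 axes x 4 features per gesture.
joints = np.random.rand(120, 20, 3)
top = select_top_joints(joints)
features = np.array([[displacement_features(joints[:, j, axis]) for axis in range(3)]
                     for j in top])
print(features.shape)   # (6, 3, 4)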

PAGES

354-363

Book

TITLE

Swarm, Evolutionary, and Memetic Computing

ISBN

978-3-319-03755-4
978-3-319-03756-1

Author Affiliations

Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/978-3-319-03756-1_32

DOI

http://dx.doi.org/10.1007/978-3-319-03756-1_32

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1038525440


Indexing Status: Check whether this publication has been indexed by Scopus and Web of Science using the SN Indexing Status Tool.
Incoming Citations: Browse incoming citations for this publication using opencitations.net.

JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google SDTT.

[
  {
    "@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json", 
    "about": [
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/1701", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Psychology", 
        "type": "DefinedTerm"
      }, 
      {
        "id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/17", 
        "inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/", 
        "name": "Psychology and Cognitive Sciences", 
        "type": "DefinedTerm"
      }
    ], 
    "author": [
      {
        "affiliation": {
          "alternateName": "Jadavpur University", 
          "id": "https://www.grid.ac/institutes/grid.216499.1", 
          "name": [
            "Department of Electronics and Tele-Communication Engineering, Jadavpur University, Kolkata, India"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Kar", 
        "givenName": "Reshma", 
        "id": "sg:person.014454247455.14", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.014454247455.14"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "name": [
            "Department of Computer Science & Engineering, St. Thomas\u2019 College of Engineering & Technology, Kolkata, India"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Chakraborty", 
        "givenName": "Aruna", 
        "id": "sg:person.015735623111.66", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.015735623111.66"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "alternateName": "Jadavpur University", 
          "id": "https://www.grid.ac/institutes/grid.216499.1", 
          "name": [
            "Department of Electronics and Tele-Communication Engineering, Jadavpur University, Kolkata, India"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Konar", 
        "givenName": "Amit", 
        "id": "sg:person.01337053064.29", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01337053064.29"
        ], 
        "type": "Person"
      }, 
      {
        "affiliation": {
          "name": [
            "Department of Computer Science & Engineering, TJS Engineering College, Chennai, India"
          ], 
          "type": "Organization"
        }, 
        "familyName": "Janarthanan", 
        "givenName": "Ramadoss", 
        "id": "sg:person.013316247664.76", 
        "sameAs": [
          "https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013316247664.76"
        ], 
        "type": "Person"
      }
    ], 
    "citation": [
      {
        "id": "https://doi.org/10.1525/mp.2008.26.2.103", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1001078273"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "sg:pub.10.1007/978-3-540-74889-2_7", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1041441854", 
          "https://doi.org/10.1007/978-3-540-74889-2_7"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/91.493904", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061247769"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/lgrs.2010.2046312", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061359021"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tbme.2011.2176940", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061528644"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tfuzz.2005.861604", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061605879"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tfuzz.2011.2150756", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061606473"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tim.2011.2108075", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061638681"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tim.2011.2161025", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061638893"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/titb.2010.2091684", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061656974"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tsmca.2004.824852", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061794978"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tsmca.2010.2046734", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061795628"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tsmca.2011.2116004", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061795741"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tsmcb.2009.2020436", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061797081"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/tsmcb.2010.2076325", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1061797276"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.2528/pierb08121302", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1070907326"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/cisp.2009.5304150", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1093572802"
        ], 
        "type": "CreativeWork"
      }, 
      {
        "id": "https://doi.org/10.1109/fuzzy.2007.4295635", 
        "sameAs": [
          "https://app.dimensions.ai/details/publication/pub.1094441486"
        ], 
        "type": "CreativeWork"
      }
    ], 
    "datePublished": "2013", 
    "datePublishedReg": "2013-01-01", 
    "description": "Gestures have been called the leaky source of emotional information. Also gestures are easy to retrieve from a distance by ordinary cameras. Thus as many would agree gestures become an important clue to the emotional state of a person. In this paper we have worked on recognizing emotions of a person by analyzing only gestural information. Subjects are initially trained to perform emotionally expressive gestures by a professional actor. The same actor trained the system to recognize the emotional context of gestures. Finally the gestural performances of the subjects are evaluated by the system to identify the class of emotion indicated. Our system yields an accuracy of 94.4% with a training set of only one gesture per emotion. Apart from this our system is also computationally efficient. Our work analyses emotions from only gestures, which is a significant step towards reducing the cost efficiency of emotion recognition. It may be noted here that this system may also be used for the purpose of general gesture recognition. We have proposed new features and a new classifying approach using fuzzy sets. We have achieved state of art accuracy with minimal complexity as each motion trajectory along each axis generates only 4 displacement features. Each axis generates a trajectory and only 6 joint trajectories among all joint trajectories are compared. The 6 motion trajectories are selected based on maximum motion, as maximum moving regions give more information on gestures. The experiments have been performed on data obtained from Microsoft Kinect sensors. Training and Testing were subject gender independent.", 
    "editor": [
      {
        "familyName": "Panigrahi", 
        "givenName": "Bijaya Ketan", 
        "type": "Person"
      }, 
      {
        "familyName": "Suganthan", 
        "givenName": "Ponnuthurai Nagaratnam", 
        "type": "Person"
      }, 
      {
        "familyName": "Das", 
        "givenName": "Swagatam", 
        "type": "Person"
      }, 
      {
        "familyName": "Dash", 
        "givenName": "Shubhransu Sekhar", 
        "type": "Person"
      }
    ], 
    "genre": "chapter", 
    "id": "sg:pub.10.1007/978-3-319-03756-1_32", 
    "inLanguage": [
      "en"
    ], 
    "isAccessibleForFree": false, 
    "isPartOf": {
      "isbn": [
        "978-3-319-03755-4", 
        "978-3-319-03756-1"
      ], 
      "name": "Swarm, Evolutionary, and Memetic Computing", 
      "type": "Book"
    }, 
    "name": "Emotion Recognition System by Gesture Analysis Using Fuzzy Sets", 
    "pagination": "354-363", 
    "productId": [
      {
        "name": "doi", 
        "type": "PropertyValue", 
        "value": [
          "10.1007/978-3-319-03756-1_32"
        ]
      }, 
      {
        "name": "readcube_id", 
        "type": "PropertyValue", 
        "value": [
          "b784eeb96b7f446c3c2029f3fd0bccb5072a0c0194134f6df1553428b01b9ff3"
        ]
      }, 
      {
        "name": "dimensions_id", 
        "type": "PropertyValue", 
        "value": [
          "pub.1038525440"
        ]
      }
    ], 
    "publisher": {
      "location": "Cham", 
      "name": "Springer International Publishing", 
      "type": "Organisation"
    }, 
    "sameAs": [
      "https://doi.org/10.1007/978-3-319-03756-1_32", 
      "https://app.dimensions.ai/details/publication/pub.1038525440"
    ], 
    "sdDataset": "chapters", 
    "sdDatePublished": "2019-04-15T21:03", 
    "sdLicense": "https://scigraph.springernature.com/explorer/license/", 
    "sdPublisher": {
      "name": "Springer Nature - SN SciGraph project", 
      "type": "Organization"
    }, 
    "sdSource": "s3://com-uberresearch-data-dimensions-target-20181106-alternative/cleanup/v134/2549eaecd7973599484d7c17b260dba0a4ecb94b/merge/v9/a6c9fde33151104705d4d7ff012ea9563521a3ce/jats-lookup/v90/0000000001_0000000264/records_8690_00000267.jsonl", 
    "type": "Chapter", 
    "url": "http://link.springer.com/10.1007/978-3-319-03756-1_32"
  }
]
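
As a quick illustration of working with the record above in code, the following minimal Python sketch reads the JSON-LD (assumed here to be saved locally as record.json, a hypothetical filename) and extracts the chapter title, DOI, and author names from the fields shown in the record.

import json

# Load the SciGraph JSON-LD record shown above (the filename is an assumption).
with open("record.json") as f:
    record = json.load(f)[0]          # the record is a one-element JSON array

title = record["name"]
doi = next(p["value"][0] for p in record["productId"] if p["name"] == "doi")
authors = [f'{a["givenName"]} {a["familyName"]}' for a in record["author"]]

print(title)    # Emotion Recognition System by Gesture Analysis Using Fuzzy Sets
print(doi)      # 10.1007/978-3-319-03756-1_32
print(authors)  # ['Reshma Kar', 'Aruna Chakraborty', 'Amit Konar', 'Ramadoss Janarthanan']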
 

Download the RDF metadata as: JSON-LD, N-Triples, Turtle, or RDF/XML (see license info).

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/978-3-319-03756-1_32'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/978-3-319-03756-1_32'

Turtle is a human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/978-3-319-03756-1_32'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/978-3-319-03756-1_32'
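
The same content negotiation can also be scripted. Below is a minimal sketch using Python's requests library (an assumption for illustration, not part of the SciGraph documentation), mirroring the JSON-LD curl call above.

import requests

URL = "https://scigraph.springernature.com/pub.10.1007/978-3-319-03756-1_32"

# Request the JSON-LD representation via the Accept header, as in the curl example above.
response = requests.get(URL, headers={"Accept": "application/ld+json"})
response.raise_for_status()
record = response.json()[0]   # the payload is a one-element JSON-LD array (see above)
print(record["name"], "-", record["datePublished"])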


 

This table displays all metadata directly associated with this object as RDF triples.

160 TRIPLES      23 PREDICATES      45 URIs      20 LITERALS      8 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/978-3-319-03756-1_32 schema:about anzsrc-for:17
2 anzsrc-for:1701
3 schema:author Nd2aade55f9ac452496bb293ced908b6b
4 schema:citation sg:pub.10.1007/978-3-540-74889-2_7
5 https://doi.org/10.1109/91.493904
6 https://doi.org/10.1109/cisp.2009.5304150
7 https://doi.org/10.1109/fuzzy.2007.4295635
8 https://doi.org/10.1109/lgrs.2010.2046312
9 https://doi.org/10.1109/tbme.2011.2176940
10 https://doi.org/10.1109/tfuzz.2005.861604
11 https://doi.org/10.1109/tfuzz.2011.2150756
12 https://doi.org/10.1109/tim.2011.2108075
13 https://doi.org/10.1109/tim.2011.2161025
14 https://doi.org/10.1109/titb.2010.2091684
15 https://doi.org/10.1109/tsmca.2004.824852
16 https://doi.org/10.1109/tsmca.2010.2046734
17 https://doi.org/10.1109/tsmca.2011.2116004
18 https://doi.org/10.1109/tsmcb.2009.2020436
19 https://doi.org/10.1109/tsmcb.2010.2076325
20 https://doi.org/10.1525/mp.2008.26.2.103
21 https://doi.org/10.2528/pierb08121302
22 schema:datePublished 2013
23 schema:datePublishedReg 2013-01-01
24 schema:description Gestures have been called a leaky source of emotional information. They are also easy to capture from a distance with ordinary cameras, and thus, as many would agree, they are an important clue to a person's emotional state. In this paper we recognize a person's emotions by analyzing gestural information alone. Subjects are first trained by a professional actor to perform emotionally expressive gestures. The same actor trained the system to recognize the emotional context of gestures. Finally, the gestural performances of the subjects are evaluated by the system to identify the class of emotion indicated. Our system yields an accuracy of 94.4% with a training set of only one gesture per emotion, and it is also computationally efficient. Our work analyzes emotions from gestures alone, which is a significant step towards reducing the cost of emotion recognition. It may be noted that the system can also be used for general gesture recognition. We propose new features and a new classification approach using fuzzy sets. We achieve state-of-the-art accuracy with minimal complexity, as each motion trajectory along each axis generates only 4 displacement features. Each axis generates a trajectory, and only 6 joint trajectories among all joint trajectories are compared. The 6 motion trajectories are selected on the basis of maximum motion, since the regions that move the most carry the most information about a gesture. The experiments were performed on data obtained from Microsoft Kinect sensors. Training and testing were independent of subject gender.
25 schema:editor N7bc7483da00947e3b25f48787be54d46
26 schema:genre chapter
27 schema:inLanguage en
28 schema:isAccessibleForFree false
29 schema:isPartOf N123981bfa7944bc688623a222e1e74b5
30 schema:name Emotion Recognition System by Gesture Analysis Using Fuzzy Sets
31 schema:pagination 354-363
32 schema:productId N0ea78a25149d4553a048cfb22642537e
33 N75102c2d24bd47b996a2d5078a1a69ad
34 Ncaf6c7128ad648cc8ef7b27ecf646484
35 schema:publisher Na2828695724a4952bea2a1a6a6a27d29
36 schema:sameAs https://app.dimensions.ai/details/publication/pub.1038525440
37 https://doi.org/10.1007/978-3-319-03756-1_32
38 schema:sdDatePublished 2019-04-15T21:03
39 schema:sdLicense https://scigraph.springernature.com/explorer/license/
40 schema:sdPublisher N5a0b974fc83b48bcbe8d6ce94a39feec
41 schema:url http://link.springer.com/10.1007/978-3-319-03756-1_32
42 sgo:license sg:explorer/license/
43 sgo:sdDataset chapters
44 rdf:type schema:Chapter
45 N0ea78a25149d4553a048cfb22642537e schema:name dimensions_id
46 schema:value pub.1038525440
47 rdf:type schema:PropertyValue
48 N0f4dff16c0054a1f90dbf5cad13d6e22 schema:familyName Panigrahi
49 schema:givenName Bijaya Ketan
50 rdf:type schema:Person
51 N0f85e0b99cf74754a70a3c56062cb0c8 rdf:first Nf50ae86d624b4fa482ee42d1ef659154
52 rdf:rest N6f75797ac71e48f08f663b621a6fc1e4
53 N123981bfa7944bc688623a222e1e74b5 schema:isbn 978-3-319-03755-4
54 978-3-319-03756-1
55 schema:name Swarm, Evolutionary, and Memetic Computing
56 rdf:type schema:Book
57 N48478ed5244f4570aa5b5ad5845c680a rdf:first sg:person.01337053064.29
58 rdf:rest N8c1bdfd76dc9407e8218ef5d89b82db2
59 N4e5256c952a845b69f81f411c99c0106 rdf:first Na3e10323b7f2473e8fa1aca0b93c6323
60 rdf:rest N0f85e0b99cf74754a70a3c56062cb0c8
61 N5a0b974fc83b48bcbe8d6ce94a39feec schema:name Springer Nature - SN SciGraph project
62 rdf:type schema:Organization
63 N6e4824b800484a14bb05fa39192ab9b6 rdf:first sg:person.015735623111.66
64 rdf:rest N48478ed5244f4570aa5b5ad5845c680a
65 N6f75797ac71e48f08f663b621a6fc1e4 rdf:first N866543a95e9b45739bfa635a093cacfc
66 rdf:rest rdf:nil
67 N75102c2d24bd47b996a2d5078a1a69ad schema:name readcube_id
68 schema:value b784eeb96b7f446c3c2029f3fd0bccb5072a0c0194134f6df1553428b01b9ff3
69 rdf:type schema:PropertyValue
70 N7bc7483da00947e3b25f48787be54d46 rdf:first N0f4dff16c0054a1f90dbf5cad13d6e22
71 rdf:rest N4e5256c952a845b69f81f411c99c0106
72 N866543a95e9b45739bfa635a093cacfc schema:familyName Dash
73 schema:givenName Shubhransu Sekhar
74 rdf:type schema:Person
75 N8c1bdfd76dc9407e8218ef5d89b82db2 rdf:first sg:person.013316247664.76
76 rdf:rest rdf:nil
77 N9003e60f913a494291b1bbb1a0d43675 schema:name Department of Computer Science & Engineering, St. Thomas’ College of Engineering & Technology, Kolkata, India
78 rdf:type schema:Organization
79 Na2828695724a4952bea2a1a6a6a27d29 schema:location Cham
80 schema:name Springer International Publishing
81 rdf:type schema:Organization
82 Na3e10323b7f2473e8fa1aca0b93c6323 schema:familyName Suganthan
83 schema:givenName Ponnuthurai Nagaratnam
84 rdf:type schema:Person
85 Ncaf6c7128ad648cc8ef7b27ecf646484 schema:name doi
86 schema:value 10.1007/978-3-319-03756-1_32
87 rdf:type schema:PropertyValue
88 Nd2aade55f9ac452496bb293ced908b6b rdf:first sg:person.014454247455.14
89 rdf:rest N6e4824b800484a14bb05fa39192ab9b6
90 Nefc6a9d6a2f34cfd9705249ce8ce9a09 schema:name Department of Computer Science & Engineering, TJS Engineering College, Chennai, India
91 rdf:type schema:Organization
92 Nf50ae86d624b4fa482ee42d1ef659154 schema:familyName Das
93 schema:givenName Swagatam
94 rdf:type schema:Person
95 anzsrc-for:17 schema:inDefinedTermSet anzsrc-for:
96 schema:name Psychology and Cognitive Sciences
97 rdf:type schema:DefinedTerm
98 anzsrc-for:1701 schema:inDefinedTermSet anzsrc-for:
99 schema:name Psychology
100 rdf:type schema:DefinedTerm
101 sg:person.013316247664.76 schema:affiliation Nefc6a9d6a2f34cfd9705249ce8ce9a09
102 schema:familyName Janarthanan
103 schema:givenName Ramadoss
104 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.013316247664.76
105 rdf:type schema:Person
106 sg:person.01337053064.29 schema:affiliation https://www.grid.ac/institutes/grid.216499.1
107 schema:familyName Konar
108 schema:givenName Amit
109 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01337053064.29
110 rdf:type schema:Person
111 sg:person.014454247455.14 schema:affiliation https://www.grid.ac/institutes/grid.216499.1
112 schema:familyName Kar
113 schema:givenName Reshma
114 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.014454247455.14
115 rdf:type schema:Person
116 sg:person.015735623111.66 schema:affiliation N9003e60f913a494291b1bbb1a0d43675
117 schema:familyName Chakraborty
118 schema:givenName Aruna
119 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.015735623111.66
120 rdf:type schema:Person
121 sg:pub.10.1007/978-3-540-74889-2_7 schema:sameAs https://app.dimensions.ai/details/publication/pub.1041441854
122 https://doi.org/10.1007/978-3-540-74889-2_7
123 rdf:type schema:CreativeWork
124 https://doi.org/10.1109/91.493904 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061247769
125 rdf:type schema:CreativeWork
126 https://doi.org/10.1109/cisp.2009.5304150 schema:sameAs https://app.dimensions.ai/details/publication/pub.1093572802
127 rdf:type schema:CreativeWork
128 https://doi.org/10.1109/fuzzy.2007.4295635 schema:sameAs https://app.dimensions.ai/details/publication/pub.1094441486
129 rdf:type schema:CreativeWork
130 https://doi.org/10.1109/lgrs.2010.2046312 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061359021
131 rdf:type schema:CreativeWork
132 https://doi.org/10.1109/tbme.2011.2176940 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061528644
133 rdf:type schema:CreativeWork
134 https://doi.org/10.1109/tfuzz.2005.861604 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061605879
135 rdf:type schema:CreativeWork
136 https://doi.org/10.1109/tfuzz.2011.2150756 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061606473
137 rdf:type schema:CreativeWork
138 https://doi.org/10.1109/tim.2011.2108075 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061638681
139 rdf:type schema:CreativeWork
140 https://doi.org/10.1109/tim.2011.2161025 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061638893
141 rdf:type schema:CreativeWork
142 https://doi.org/10.1109/titb.2010.2091684 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061656974
143 rdf:type schema:CreativeWork
144 https://doi.org/10.1109/tsmca.2004.824852 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061794978
145 rdf:type schema:CreativeWork
146 https://doi.org/10.1109/tsmca.2010.2046734 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061795628
147 rdf:type schema:CreativeWork
148 https://doi.org/10.1109/tsmca.2011.2116004 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061795741
149 rdf:type schema:CreativeWork
150 https://doi.org/10.1109/tsmcb.2009.2020436 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061797081
151 rdf:type schema:CreativeWork
152 https://doi.org/10.1109/tsmcb.2010.2076325 schema:sameAs https://app.dimensions.ai/details/publication/pub.1061797276
153 rdf:type schema:CreativeWork
154 https://doi.org/10.1525/mp.2008.26.2.103 schema:sameAs https://app.dimensions.ai/details/publication/pub.1001078273
155 rdf:type schema:CreativeWork
156 https://doi.org/10.2528/pierb08121302 schema:sameAs https://app.dimensions.ai/details/publication/pub.1070907326
157 rdf:type schema:CreativeWork
158 https://www.grid.ac/institutes/grid.216499.1 schema:alternateName Jadavpur University
159 schema:name Department of Electronics and Tele-Communication Engineering, Jadavpur University, Kolkata, India
160 rdf:type schema:Organization
 



