# Communication with Contextual Uncertainty

Ontology type: schema:ScholarlyArticle      Open Access: True

### Article Info

DATE

2017-08-29

AUTHORS

Badih Ghazi, Ilan Komargodski, Pravesh K. Kothari, Madhu Sudan

ABSTRACT

We introduce a simple model illustrating the utility of context in compressing communication and the challenge posed by uncertainty of knowledge of context. We consider a variant of distributional communication complexity where Alice gets some information X ∈ {0,1}^n and Bob gets Y ∈ {0,1}^n, where (X, Y) is drawn from a known distribution, and Bob wishes to compute some function g(X, Y) or some close approximation to it (i.e., the output is g(X, Y) with high probability over (X, Y)). In our variant, Alice does not know g, but only knows some function f which is a very close approximation to g. Thus, the function being computed forms the context for the communication. It is an enormous implicit input, potentially described by a truth table of size 2^n. Imprecise knowledge of this function models the (mild) uncertainty in this context.

We show that uncertainty can lead to a huge cost in communication. Specifically, we construct a distribution μ over (X, Y) ∈ {0,1}^n × {0,1}^n and a class of function pairs (f, g) which are very close (i.e., disagree with o(1) probability when (X, Y) are sampled according to μ), for which the communication complexity of f or g in the standard setting is one bit, whereas the (two-way) communication complexity in the uncertain setting is at least Ω(√n) bits even when allowing a constant probability of error.

It turns out that this blow-up in communication complexity can be attributed in part to the mutual information between X and Y. In particular, we give an efficient protocol for communication under contextual uncertainty that incurs only a small blow-up in communication if this mutual information is small. Namely, we show that if g has a communication protocol with complexity k in the standard setting and the mutual information between X and Y is I, then g has a one-way communication protocol with complexity O((1+I)·2^k) in the uncertain setting. This result is an immediate corollary of an even stronger result which shows that if g has one-way communication complexity k, then it has one-way uncertain-communication complexity at most O((1+I)·k). In the particular case where the input distribution is a product distribution (and so I = 0), the protocol in the uncertain setting only incurs a constant factor blow-up in one-way communication and error.
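The abstract's quantitative claims, gathered in one place. The operator names $\mathrm{CC}$ (two-way) and $\mathrm{owCC}$ (one-way) below are shorthand introduced here for readability, not necessarily the paper's own notation:

```latex
% Lower bound: there is a distribution \mu and pairs (f,g) with
% \Pr_{(X,Y)\sim\mu}[f(X,Y)\neq g(X,Y)] = o(1), such that
\mathrm{CC}_{\mu}(f) = \mathrm{CC}_{\mu}(g) = 1
\qquad\text{yet}\qquad
\mathrm{CC}^{\mathrm{uncertain}}_{\mu}(f,g) \;\ge\; \Omega(\sqrt{n}).

% Upper bounds, writing I = I(X;Y) for the mutual information:
\mathrm{owCC}_{\mu}(g) \le k
  \;\Longrightarrow\;
  \mathrm{owCC}^{\mathrm{uncertain}}_{\mu}(g) \le O\!\big((1+I)\cdot k\big),
\qquad
\mathrm{CC}_{\mu}(g) \le k
  \;\Longrightarrow\;
  \mathrm{owCC}^{\mathrm{uncertain}}_{\mu}(g) \le O\!\big((1+I)\cdot 2^{k}\big).
```

For a product distribution, I = 0, so the one-way bound degrades only by a constant factor.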

PAGES

463-509

### References to SciGraph publications

• 2014. On the Role of Shared Randomness in Simultaneous Communication in AUTOMATA, LANGUAGES, AND PROGRAMMING
• 1999-06. On Randomized One-round Communication Complexity in COMPUTATIONAL COMPLEXITY
### Journal

TITLE

computational complexity

ISSUE

3

VOLUME

27

### Identifiers

URI

http://scigraph.springernature.com/pub.10.1007/s00037-017-0161-3

DOI

http://dx.doi.org/10.1007/s00037-017-0161-3

DIMENSIONS

https://app.dimensions.ai/details/publication/pub.1091379182


JSON-LD is the canonical representation for SciGraph data.

TIP: You can open this SciGraph record using an external JSON-LD service:

[
{
"@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json",
"about": [
{
"id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08",
"inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/",
"name": "Information and Computing Sciences",
"type": "DefinedTerm"
},
{
"id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0804",
"inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/",
"name": "Data Format",
"type": "DefinedTerm"
}
],
"author": [
{
"affiliation": {
"alternateName": "Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 02139, Cambridge, MA, USA",
"id": "http://www.grid.ac/institutes/grid.116068.8",
"name": [
"Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 02139, Cambridge, MA, USA"
],
"type": "Organization"
},
"familyName": "Ghazi",
"givenName": "Badih",
"id": "sg:person.012547725413.79",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012547725413.79"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Department of Computer Science and Applied Mathematics, Weizmann Institute of Science, 76100, Rehovot, Israel",
"id": "http://www.grid.ac/institutes/grid.13992.30",
"name": [
"Department of Computer Science and Applied Mathematics, Weizmann Institute of Science, 76100, Rehovot, Israel"
],
"type": "Organization"
},
"familyName": "Komargodski",
"givenName": "Ilan",
"id": "sg:person.012204235441.12",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012204235441.12"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Princeton University and IAS, Princeton, NJ, USA",
"id": "http://www.grid.ac/institutes/grid.16750.35",
"name": [
"Princeton University and IAS, Princeton, NJ, USA"
],
"type": "Organization"
},
"familyName": "Kothari",
"givenName": "Pravesh K.",
"id": "sg:person.010472330167.05",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010472330167.05"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Harvard John A. Paulson School of Engineering and Applied Sciences, 33 Oxford Street, 02138, Cambridge, MA, USA",
"id": "http://www.grid.ac/institutes/grid.38142.3c",
"name": [
"Harvard John A. Paulson School of Engineering and Applied Sciences, 33 Oxford Street, 02138, Cambridge, MA, USA"
],
"type": "Organization"
},
"familyName": "Sudan",
"givenName": "Madhu",
"id": "sg:person.014663420265.17",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.014663420265.17"
],
"type": "Person"
}
],
"citation": [
{
"id": "sg:pub.10.1007/978-3-662-43948-7_13",
"sameAs": [
"https://app.dimensions.ai/details/publication/pub.1039629712",
"https://doi.org/10.1007/978-3-662-43948-7_13"
],
"type": "CreativeWork"
},
{
"id": "sg:pub.10.1007/s000370050018",
"sameAs": [
"https://app.dimensions.ai/details/publication/pub.1002114721",
"https://doi.org/10.1007/s000370050018"
],
"type": "CreativeWork"
}
],
"datePublished": "2017-08-29",
"datePublishedReg": "2017-08-29",
"description": "We introduce a simple model illustrating the utility of context in compressing communication and the challenge posed by uncertainty of knowledge of context. We consider a variant of distributional communication complexity where Alice gets some information X ∈ {0,1}^n and Bob gets Y ∈ {0,1}^n, where (X, Y) is drawn from a known distribution, and Bob wishes to compute some function g(X, Y) or some close approximation to it (i.e., the output is g(X, Y) with high probability over (X, Y)). In our variant, Alice does not know g, but only knows some function f which is a very close approximation to g. Thus, the function being computed forms the context for the communication. It is an enormous implicit input, potentially described by a truth table of size 2^n. Imprecise knowledge of this function models the (mild) uncertainty in this context. We show that uncertainty can lead to a huge cost in communication. Specifically, we construct a distribution μ over (X, Y) ∈ {0,1}^n × {0,1}^n and a class of function pairs (f, g) which are very close (i.e., disagree with o(1) probability when (X, Y) are sampled according to μ), for which the communication complexity of f or g in the standard setting is one bit, whereas the (two-way) communication complexity in the uncertain setting is at least Ω(√n) bits even when allowing a constant probability of error. It turns out that this blow-up in communication complexity can be attributed in part to the mutual information between X and Y. In particular, we give an efficient protocol for communication under contextual uncertainty that incurs only a small blow-up in communication if this mutual information is small. Namely, we show that if g has a communication protocol with complexity k in the standard setting and the mutual information between X and Y is I, then g has a one-way communication protocol with complexity O((1+I)·2^k) in the uncertain setting. This result is an immediate corollary of an even stronger result which shows that if g has one-way communication complexity k, then it has one-way uncertain-communication complexity at most O((1+I)·k). In the particular case where the input distribution is a product distribution (and so I = 0), the protocol in the uncertain setting only incurs a constant factor blow-up in one-way communication and error.",
"genre": "article",
"id": "sg:pub.10.1007/s00037-017-0161-3",
"inLanguage": "en",
"isAccessibleForFree": true,
"isPartOf": [
{
"id": "sg:journal.1136224",
"issn": [
"1016-3328",
"1420-8954"
],
"name": "computational complexity",
"publisher": "Springer Nature",
"type": "Periodical"
},
{
"issueNumber": "3",
"type": "PublicationIssue"
},
{
"type": "PublicationVolume",
"volumeNumber": "27"
}
],
"keywords": [
"setting",
"protocol",
"variants",
"function",
"standard setting",
"factors",
"cases",
"knowledge",
"information",
"utility",
"results",
"close approximation",
"challenges",
"communication",
"distribution",
"huge cost",
"part",
"context",
"form",
"probability",
"model",
"cost",
"uncertainty of knowledge",
"class",
"complexity",
"input",
"one-way communication",
"table",
"pairs",
"error",
"stronger result",
"uncertainty",
"mutual information",
"efficient protocol",
"constant probability",
"corollary",
"contextual uncertainty",
"particular case",
"Bob",
"function f",
"constant factor",
"simple model",
"Alice",
"input distribution",
"uncertain settings",
"function pairs",
"bits",
"complexity k",
"communication protocols",
"communication complexity",
"approximation",
"immediate corollary",
"distributional communication complexity",
"truth table",
"one-way communication protocol",
"implicit input",
"product distribution",
"size 2n",
"utility of context",
"enormous implicit input",
"one-way communication complexity k",
"communication complexity k",
"one-way uncertain-communication complexity",
"uncertain-communication complexity"
],
"name": "Communication with Contextual Uncertainty",
"pagination": "463-509",
"productId": [
{
"name": "dimensions_id",
"type": "PropertyValue",
"value": [
"pub.1091379182"
]
},
{
"name": "doi",
"type": "PropertyValue",
"value": [
"10.1007/s00037-017-0161-3"
]
}
],
"sameAs": [
"https://doi.org/10.1007/s00037-017-0161-3",
"https://app.dimensions.ai/details/publication/pub.1091379182"
],
"sdDataset": "articles",
"sdDatePublished": "2022-01-01T18:46",
"sdPublisher": {
"name": "Springer Nature - SN SciGraph project",
"type": "Organization"
},
"sdSource": "s3://com-springernature-scigraph/baseset/20220101/entities/gbq_results/article/article_752.jsonl",
"type": "ScholarlyArticle",
"url": "https://doi.org/10.1007/s00037-017-0161-3"
}
]

HOW TO GET THIS DATA PROGRAMMATICALLY:

JSON-LD is a popular format for linked data which is fully compatible with JSON.

curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/s00037-017-0161-3'

N-Triples is a line-based linked data format ideal for batch operations.

curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/s00037-017-0161-3'

Turtle is a terse, human-readable linked data format.

curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/s00037-017-0161-3'

RDF/XML is a standard XML format for linked data.

curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/s00037-017-0161-3'
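Because JSON-LD is plain JSON, the record above can also be consumed directly from any programming language. A minimal sketch in Python, using a trimmed local copy of the record shown on this page (fields abbreviated for illustration; in practice the string would come from the curl request above):

```python
import json

# A trimmed copy of the SciGraph record shown above.
# Only a few fields are reproduced here for illustration.
record = json.loads("""
[
  {
    "id": "sg:pub.10.1007/s00037-017-0161-3",
    "name": "Communication with Contextual Uncertainty",
    "datePublished": "2017-08-29",
    "sameAs": ["https://doi.org/10.1007/s00037-017-0161-3"],
    "author": [
      {"familyName": "Ghazi", "type": "Person"},
      {"familyName": "Komargodski", "givenName": "Ilan", "type": "Person"},
      {"familyName": "Kothari", "givenName": "Pravesh K.", "type": "Person"},
      {"familyName": "Sudan", "type": "Person"}
    ]
  }
]
""")

# The top level is a JSON array; the publication is its first element.
pub = record[0]
title = pub["name"]
doi_url = pub["sameAs"][0]
authors = [a["familyName"] for a in pub["author"]]

print(title)
print(doi_url)
print(authors)
```

The same keys (`name`, `datePublished`, `author`, `sameAs`) appear in the full record, so the extraction logic carries over unchanged to the complete download.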

This table displays all metadata directly associated to this object as RDF triples.

160 TRIPLES      22 PREDICATES      91 URIs      81 LITERALS      6 BLANK NODES

Subject Predicate Object
1 sg:pub.10.1007/s00037-017-0161-3 schema:about anzsrc-for:08
2 anzsrc-for:0804
3 schema:author N130f697fd2dc46108afd574244420364
4 schema:citation sg:pub.10.1007/978-3-662-43948-7_13
5 sg:pub.10.1007/s000370050018
6 schema:datePublished 2017-08-29
7 schema:datePublishedReg 2017-08-29
8 schema:description We introduce a simple model illustrating the utility of context in compressing communication and the challenge posed by uncertainty of knowledge of context. We consider a variant of distributional communication complexity where Alice gets some information X ∈ {0,1}^n and Bob gets Y ∈ {0,1}^n, where (X, Y) is drawn from a known distribution, and Bob wishes to compute some function g(X, Y) or some close approximation to it (i.e., the output is g(X, Y) with high probability over (X, Y)). In our variant, Alice does not know g, but only knows some function f which is a very close approximation to g. Thus, the function being computed forms the context for the communication. It is an enormous implicit input, potentially described by a truth table of size 2^n. Imprecise knowledge of this function models the (mild) uncertainty in this context. We show that uncertainty can lead to a huge cost in communication. Specifically, we construct a distribution μ over (X, Y) ∈ {0,1}^n × {0,1}^n and a class of function pairs (f, g) which are very close (i.e., disagree with o(1) probability when (X, Y) are sampled according to μ), for which the communication complexity of f or g in the standard setting is one bit, whereas the (two-way) communication complexity in the uncertain setting is at least Ω(√n) bits even when allowing a constant probability of error. It turns out that this blow-up in communication complexity can be attributed in part to the mutual information between X and Y. In particular, we give an efficient protocol for communication under contextual uncertainty that incurs only a small blow-up in communication if this mutual information is small. Namely, we show that if g has a communication protocol with complexity k in the standard setting and the mutual information between X and Y is I, then g has a one-way communication protocol with complexity O((1+I)·2^k) in the uncertain setting. This result is an immediate corollary of an even stronger result which shows that if g has one-way communication complexity k, then it has one-way uncertain-communication complexity at most O((1+I)·k). In the particular case where the input distribution is a product distribution (and so I = 0), the protocol in the uncertain setting only incurs a constant factor blow-up in one-way communication and error.
9 schema:genre article
10 schema:inLanguage en
11 schema:isAccessibleForFree true
12 schema:isPartOf N704fa2f942704a5bab60f6064f428f9c
13 Nb3ef3405c8db4dd390dd815596ee0fa4
14 sg:journal.1136224
15 schema:keywords Alice
16 Bob
17 approximation
18 bits
19 cases
20 challenges
21 class
22 close approximation
23 communication
24 communication complexity
25 communication complexity k
26 communication protocols
27 complexity
28 complexity k
29 constant factor
30 constant probability
31 context
32 contextual uncertainty
33 corollary
34 cost
35 distribution
36 distributional communication complexity
37 efficient protocol
38 enormous implicit input
39 error
40 factors
41 form
42 function
43 function f
44 function pairs
45 huge cost
46 immediate corollary
47 implicit input
48 information
49 input
50 input distribution
51 knowledge
52 model
53 mutual information
54 one-way communication
55 one-way communication complexity k
56 one-way communication protocol
57 one-way uncertain-communication complexity
58 pairs
59 part
60 particular case
61 probability
62 product distribution
63 protocol
64 results
65 setting
66 simple model
67 size 2n
68 standard setting
69 stronger result
70 table
71 truth table
72 uncertain settings
73 uncertain-communication complexity
74 uncertainty
75 uncertainty of knowledge
76 utility
77 utility of context
78 variants
79 schema:name Communication with Contextual Uncertainty
80 schema:pagination 463-509
81 schema:productId N98965496d84e497591b4731d04e9e4d3
82 Ncda2906ed1814873aa2c390c80bf3726
83 schema:sameAs https://app.dimensions.ai/details/publication/pub.1091379182
84 https://doi.org/10.1007/s00037-017-0161-3
85 schema:sdDatePublished 2022-01-01T18:46
87 schema:sdPublisher Naaa99fb1f9bd4466a38085cbf81bd03a
88 schema:url https://doi.org/10.1007/s00037-017-0161-3
90 sgo:sdDataset articles
91 rdf:type schema:ScholarlyArticle
92 N130f697fd2dc46108afd574244420364 rdf:first sg:person.012547725413.79
93 rdf:rest N272289bc2ee74067911b7de6133bb325
94 N272289bc2ee74067911b7de6133bb325 rdf:first sg:person.012204235441.12
95 rdf:rest N2f029edcdcc149ccbd0d6fdf84dc9fbd
96 N2f029edcdcc149ccbd0d6fdf84dc9fbd rdf:first sg:person.010472330167.05
97 rdf:rest N8967d79bcbbd4dcc8fdb5b2149edfb30
98 N704fa2f942704a5bab60f6064f428f9c schema:issueNumber 3
99 rdf:type schema:PublicationIssue
100 N8967d79bcbbd4dcc8fdb5b2149edfb30 rdf:first sg:person.014663420265.17
101 rdf:rest rdf:nil
102 N98965496d84e497591b4731d04e9e4d3 schema:name doi
103 schema:value 10.1007/s00037-017-0161-3
104 rdf:type schema:PropertyValue
105 Naaa99fb1f9bd4466a38085cbf81bd03a schema:name Springer Nature - SN SciGraph project
106 rdf:type schema:Organization
108 rdf:type schema:PublicationVolume
109 Ncda2906ed1814873aa2c390c80bf3726 schema:name dimensions_id
110 schema:value pub.1091379182
111 rdf:type schema:PropertyValue
112 anzsrc-for:08 schema:inDefinedTermSet anzsrc-for:
113 schema:name Information and Computing Sciences
114 rdf:type schema:DefinedTerm
115 anzsrc-for:0804 schema:inDefinedTermSet anzsrc-for:
116 schema:name Data Format
117 rdf:type schema:DefinedTerm
118 sg:journal.1136224 schema:issn 1016-3328
119 1420-8954
120 schema:name computational complexity
121 schema:publisher Springer Nature
122 rdf:type schema:Periodical
123 sg:person.010472330167.05 schema:affiliation grid-institutes:grid.16750.35
124 schema:familyName Kothari
125 schema:givenName Pravesh K.
126 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010472330167.05
127 rdf:type schema:Person
128 sg:person.012204235441.12 schema:affiliation grid-institutes:grid.13992.30
129 schema:familyName Komargodski
130 schema:givenName Ilan
131 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012204235441.12
132 rdf:type schema:Person
133 sg:person.012547725413.79 schema:affiliation grid-institutes:grid.116068.8
134 schema:familyName Ghazi
135 schema:givenName Badih
136 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.012547725413.79
137 rdf:type schema:Person
138 sg:person.014663420265.17 schema:affiliation grid-institutes:grid.38142.3c
139 schema:familyName Sudan
140 schema:givenName Madhu
141 schema:sameAs https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.014663420265.17
142 rdf:type schema:Person
143 sg:pub.10.1007/978-3-662-43948-7_13 schema:sameAs https://app.dimensions.ai/details/publication/pub.1039629712
144 https://doi.org/10.1007/978-3-662-43948-7_13
145 rdf:type schema:CreativeWork
146 sg:pub.10.1007/s000370050018 schema:sameAs https://app.dimensions.ai/details/publication/pub.1002114721
147 https://doi.org/10.1007/s000370050018
148 rdf:type schema:CreativeWork
149 grid-institutes:grid.116068.8 schema:alternateName Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 02139, Cambridge, MA, USA
150 schema:name Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 02139, Cambridge, MA, USA
151 rdf:type schema:Organization
152 grid-institutes:grid.13992.30 schema:alternateName Department of Computer Science and Applied Mathematics, Weizmann Institute of Science, 76100, Rehovot, Israel
153 schema:name Department of Computer Science and Applied Mathematics, Weizmann Institute of Science, 76100, Rehovot, Israel
154 rdf:type schema:Organization
155 grid-institutes:grid.16750.35 schema:alternateName Princeton University and IAS, Princeton, NJ, USA
156 schema:name Princeton University and IAS, Princeton, NJ, USA
157 rdf:type schema:Organization
158 grid-institutes:grid.38142.3c schema:alternateName Harvard John A. Paulson School of Engineering and Applied Sciences, 33 Oxford Street, 02138, Cambridge, MA, USA
159 schema:name Harvard John A. Paulson School of Engineering and Applied Sciences, 33 Oxford Street, 02138, Cambridge, MA, USA
160 rdf:type schema:Organization