Ontology type: schema:Chapter
2013
AUTHORS: Noha El-Zehiry, Michelle Yan, Sara Good, Tong Fang, S. Kevin Zhou, Leo Grady
ABSTRACT: Ultrasound acquisition is a challenging task that requires simultaneous adjustment of several acquisition parameters (the depth, the focus, the frequency and its operation mode). If the acquisition parameters are not properly chosen, the resulting image will have a poor quality and will degrade the patient diagnosis and treatment workflow. Several hardware-based systems for autotuning the acquisition parameters have been previously proposed, but these solutions were largely abandoned because they failed to properly account for tissue inhomogeneity and other patient-specific characteristics. Consequently, in routine practice the clinician either uses population-based parameter presets or manually adjusts the acquisition parameters for each patient during the scan. In this paper, we revisit the problem of autotuning the acquisition parameters by taking a completely novel approach and producing a solution based on image analytics. Our solution is inspired by the autofocus capability of conventional digital cameras, but is significantly more challenging because the number of acquisition parameters is large and the determination of “good quality” images is more difficult to assess. Surprisingly, we show that the set of acquisition parameters which produce images that are favored by clinicians comprise a 1D manifold, allowing for a real-time optimization to maximize image quality. We demonstrate our method for acquisition parameter autotuning on several live patients, showing that our system can start with a poor initial set of parameters and automatically optimize the parameters to produce high quality images.
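The abstract's key claim is that the acquisition settings favored by clinicians lie on a 1D manifold, which turns multi-parameter tuning into a one-dimensional search. The Python sketch below only illustrates that general idea and is not the chapter's actual method: fit_linear_manifold, autotune and image_quality are hypothetical names, and a linear (PCA-style) manifold with a coarse grid search stands in for whatever the authors actually use.

# Illustrative sketch only, not the algorithm from the chapter. It assumes the
# clinician-favored acquisition settings lie near a 1D (here: linear) manifold
# and reduces autotuning to a 1D search over the manifold coordinate t.
# `image_quality` is a hypothetical callable that acquires an image with the
# given parameters and returns a scalar quality score.
import numpy as np

def fit_linear_manifold(good_params):
    """First principal component of clinician-favored parameter vectors."""
    X = np.asarray(good_params, dtype=float)
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[0]                         # mean and principal direction

def autotune(mu, direction, image_quality, t_range=(-3.0, 3.0), steps=25):
    """Coarse grid search along the manifold; keep the best-scoring setting."""
    best_params, best_q = None, float("-inf")
    for t in np.linspace(t_range[0], t_range[1], steps):
        params = mu + t * direction          # manifold coordinate -> parameters
        q = image_quality(params)            # hypothetical acquisition + scoring
        if q > best_q:
            best_params, best_q = params, q
    return best_params, best_q

Any 1D optimizer (e.g. golden-section search) could replace the grid search once a quality score is available; the point is only that a single coordinate is being tuned.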
PAGES: 122-130
Advanced Information Systems Engineering
ISBN: 978-3-642-40810-6, 978-3-642-40811-3
http://scigraph.springernature.com/pub.10.1007/978-3-642-40811-3_16
DOI: http://dx.doi.org/10.1007/978-3-642-40811-3_16
DIMENSIONS: https://app.dimensions.ai/details/publication/pub.1049687045
PUBMED: https://www.ncbi.nlm.nih.gov/pubmed/24505657
JSON-LD is the canonical representation for SciGraph data.
TIP: You can open this SciGraph record using an external JSON-LD service such as the JSON-LD Playground or the Google Structured Data Testing Tool; a programmatic alternative is sketched after the record below.
[
{
"@context": "https://springernature.github.io/scigraph/jsonld/sgcontext.json",
"about": [
{
"id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/08",
"inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/",
"name": "Information and Computing Sciences",
"type": "DefinedTerm"
},
{
"id": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/0801",
"inDefinedTermSet": "http://purl.org/au-research/vocabulary/anzsrc-for/2008/",
"name": "Artificial Intelligence and Image Processing",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Algorithms",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Artificial Intelligence",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Humans",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Image Enhancement",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Image Interpretation, Computer-Assisted",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Imaging, Three-Dimensional",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Pattern Recognition, Automated",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Reproducibility of Results",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Sensitivity and Specificity",
"type": "DefinedTerm"
},
{
"inDefinedTermSet": "https://www.nlm.nih.gov/mesh/",
"name": "Ultrasonography",
"type": "DefinedTerm"
}
],
"author": [
{
"affiliation": {
"alternateName": "Corporate Technology, Siemens Corporation, USA",
"id": "http://www.grid.ac/institutes/grid.419233.e",
"name": [
"Corporate Technology, Siemens Corporation, USA"
],
"type": "Organization"
},
"familyName": "El-Zehiry",
"givenName": "Noha",
"id": "sg:person.07657676251.51",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.07657676251.51"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Corporate Technology, Siemens Corporation, USA",
"id": "http://www.grid.ac/institutes/grid.419233.e",
"name": [
"Corporate Technology, Siemens Corporation, USA"
],
"type": "Organization"
},
"familyName": "Yan",
"givenName": "Michelle",
"id": "sg:person.010661631022.08",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.010661631022.08"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Siemens Healthcare, USA",
"id": "http://www.grid.ac/institutes/grid.415886.6",
"name": [
"Siemens Healthcare, USA"
],
"type": "Organization"
},
"familyName": "Good",
"givenName": "Sara",
"id": "sg:person.0614045271.61",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.0614045271.61"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Corporate Technology, Siemens Corporation, USA",
"id": "http://www.grid.ac/institutes/grid.419233.e",
"name": [
"Corporate Technology, Siemens Corporation, USA"
],
"type": "Organization"
},
"familyName": "Fang",
"givenName": "Tong",
"id": "sg:person.01065677302.14",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01065677302.14"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Corporate Technology, Siemens Corporation, USA",
"id": "http://www.grid.ac/institutes/grid.419233.e",
"name": [
"Corporate Technology, Siemens Corporation, USA"
],
"type": "Organization"
},
"familyName": "Zhou",
"givenName": "S. Kevin",
"id": "sg:person.01372425362.30",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.01372425362.30"
],
"type": "Person"
},
{
"affiliation": {
"alternateName": "Corporate Technology, Siemens Corporation, USA",
"id": "http://www.grid.ac/institutes/grid.419233.e",
"name": [
"Corporate Technology, Siemens Corporation, USA"
],
"type": "Organization"
},
"familyName": "Grady",
"givenName": "Leo",
"id": "sg:person.0617232252.77",
"sameAs": [
"https://app.dimensions.ai/discover/publication?and_facet_researcher=ur.0617232252.77"
],
"type": "Person"
}
],
"datePublished": "2013",
"datePublishedReg": "2013-01-01",
"description": "Ultrasound acquisition is a challenging task that requires simultaneous adjustment of several acquisition parameters (the depth, the focus, the frequency and its operation mode). If the acquisition parameters are not properly chosen, the resulting image will have a poor quality and will degrade the patient diagnosis and treatment workflow. Several hardware-based systems for autotuning the acquisition parameters have been previously proposed, but these solutions were largely abandoned because they failed to properly account for tissue inhomogeneity and other patient-specific characteristics. Consequently, in routine practice the clinician either uses population-based parameter presets or manually adjusts the acquisition parameters for each patient during the scan. In this paper, we revisit the problem of autotuning the acquisition parameters by taking a completely novel approach and producing a solution based on image analytics. Our solution is inspired by the autofocus capability of conventional digital cameras, but is significantly more challenging because the number of acquisition parameters is large and the determination of \u201cgood quality\u201d images is more difficult to assess. Surprisingly, we show that the set of acquisition parameters which produce images that are favored by clinicians comprise a 1D manifold, allowing for a real-time optimization to maximize image quality. We demonstrate our method for acquisition parameter autotuning on several live patients, showing that our system can start with a poor initial set of parameters and automatically optimize the parameters to produce high quality images.",
"editor": [
{
"familyName": "Salinesi",
"givenName": "Camille",
"type": "Person"
},
{
"familyName": "Norrie",
"givenName": "Moira C.",
"type": "Person"
},
{
"familyName": "Pastor",
"givenName": "\u00d3scar",
"type": "Person"
}
],
"genre": "chapter",
"id": "sg:pub.10.1007/978-3-642-40811-3_16",
"inLanguage": "en",
"isAccessibleForFree": false,
"isPartOf": {
"isbn": [
"978-3-642-40810-6",
"978-3-642-40811-3"
],
"name": "Advanced Information Systems Engineering",
"type": "Book"
},
"keywords": [
"hardware-based systems",
"ultrasound acquisition",
"conventional digital camera",
"image analytics",
"high-quality images",
"real-time optimization",
"challenging task",
"acquisition parameters",
"digital camera",
"quality images",
"initial set",
"image quality",
"novel approach",
"images",
"treatment workflow",
"analytics",
"set",
"good quality",
"workflow",
"camera",
"task",
"patient-specific characteristics",
"system",
"solution",
"acquisition",
"quality",
"capability",
"tissue inhomogeneities",
"patient diagnosis",
"optimization",
"parameters",
"preset",
"live patients",
"poor quality",
"inhomogeneity",
"method",
"number",
"simultaneous adjustment",
"manifold",
"characteristics",
"practice",
"problem",
"determination",
"approach",
"scans",
"clinicians",
"adjustment",
"diagnosis",
"routine practice",
"patients",
"paper"
],
"name": "Learning the Manifold of Quality Ultrasound Acquisition",
"pagination": "122-130",
"productId": [
{
"name": "dimensions_id",
"type": "PropertyValue",
"value": [
"pub.1049687045"
]
},
{
"name": "doi",
"type": "PropertyValue",
"value": [
"10.1007/978-3-642-40811-3_16"
]
},
{
"name": "pubmed_id",
"type": "PropertyValue",
"value": [
"24505657"
]
}
],
"publisher": {
"name": "Springer Nature",
"type": "Organisation"
},
"sameAs": [
"https://doi.org/10.1007/978-3-642-40811-3_16",
"https://app.dimensions.ai/details/publication/pub.1049687045"
],
"sdDataset": "chapters",
"sdDatePublished": "2022-05-20T07:46",
"sdLicense": "https://scigraph.springernature.com/explorer/license/",
"sdPublisher": {
"name": "Springer Nature - SN SciGraph project",
"type": "Organization"
},
"sdSource": "s3://com-springernature-scigraph/baseset/20220519/entities/gbq_results/chapter/chapter_329.jsonl",
"type": "Chapter",
"url": "https://doi.org/10.1007/978-3-642-40811-3_16"
}
]
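As a programmatic alternative to the external JSON-LD services mentioned above, the sketch below parses the record into an RDF graph. It assumes rdflib 6+ (which bundles a JSON-LD parser) and a hypothetical local copy of the record saved as chapter_record.jsonld; the resulting triple count should match the summary at the bottom of this page.

# A minimal sketch, assuming rdflib 6+ and a local copy of the JSON-LD record
# above saved as "chapter_record.jsonld" (hypothetical filename).
import rdflib

g = rdflib.Graph()
g.parse("chapter_record.jsonld", format="json-ld")
print(len(g), "triples")
for s, p, o in list(g)[:5]:                  # peek at a few of the parsed triples
    print(s, p, o)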
Download the RDF metadata as: JSON-LD, N-Triples, Turtle, or RDF/XML.
JSON-LD is a popular format for linked data which is fully compatible with JSON.
curl -H 'Accept: application/ld+json' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-40811-3_16'
N-Triples is a line-based linked data format ideal for batch operations.
curl -H 'Accept: application/n-triples' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-40811-3_16'
Turtle is a human-readable linked data format.
curl -H 'Accept: text/turtle' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-40811-3_16'
RDF/XML is a standard XML format for linked data.
curl -H 'Accept: application/rdf+xml' 'https://scigraph.springernature.com/pub.10.1007/978-3-642-40811-3_16'
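The same content negotiation can be used from code. The sketch below assumes the Python requests library and reads a few fields out of the JSON-LD payload shown above, which is returned as a one-element list.

# A minimal sketch of fetching this record with the same Accept-header content
# negotiation the curl commands above use (requests is assumed to be installed).
import requests

url = "https://scigraph.springernature.com/pub.10.1007/978-3-642-40811-3_16"
resp = requests.get(url, headers={"Accept": "application/ld+json"}, timeout=30)
resp.raise_for_status()

record = resp.json()[0]                                  # JSON-LD payload is a one-element list
print(record["name"])                                    # chapter title
print(record["datePublished"], record["pagination"])     # 2013, 122-130
print([a["familyName"] for a in record["author"]])       # author surnames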
Summary of all metadata directly associated with this object as RDF triples:
203 triples, 23 predicates, 88 URIs, 81 literals, 18 blank nodes.