

Big data integration

By: Dong, Xin Luna [author]
Contributor(s): Srivastava, Divesh [author]
Material type: Book
Series: Synthesis digital library of engineering and computer science; Synthesis lectures on data management, # 40
Publisher: San Rafael, California (1537 Fourth Street, San Rafael, CA 94901 USA): Morgan & Claypool, 2015
Description: 1 PDF (xx, 178 pages): illustrations
Content type: text | Media type: electronic | Carrier type: online resource
ISBN: 9781627052245
Subject(s): Big data | Data integration (Computer science) | big data integration | data fusion | record linkage | schema alignment | variety | velocity | veracity | volume
DDC classification: 006.312
Online resources: Abstract with links to resource
Also available in print.
Contents:
1. Motivation: challenges and opportunities for BDI -- 1.1 Traditional data integration -- 1.1.1 The flights example: data sources -- 1.1.2 The flights example: data integration -- 1.1.3 Data integration: architecture & three major steps -- 1.2 BDI: challenges -- 1.2.1 The "V" dimensions -- 1.2.2 Case study: quantity of deep web data -- 1.2.3 Case study: extracted domain-specific data -- 1.2.4 Case study: quality of deep web data -- 1.2.5 Case study: surface web structured data -- 1.2.6 Case study: extracted knowledge triples -- 1.3 BDI: opportunities -- 1.3.1 Data redundancy -- 1.3.2 Long data -- 1.3.3 Big data platforms -- 1.4 Outline of book --
2. Schema alignment -- 2.1 Traditional schema alignment: a quick tour -- 2.1.1 Mediated schema -- 2.1.2 Attribute matching -- 2.1.3 Schema mapping -- 2.1.4 Query answering -- 2.2 Addressing the variety and velocity challenges -- 2.2.1 Probabilistic schema alignment -- 2.2.2 Pay-as-you-go user feedback -- 2.3 Addressing the variety and volume challenges -- 2.3.1 Integrating deep web data -- 2.3.2 Integrating web tables --
3. Record linkage -- 3.1 Traditional record linkage: a quick tour -- 3.1.1 Pairwise matching -- 3.1.2 Clustering -- 3.1.3 Blocking -- 3.2 Addressing the volume challenge -- 3.2.1 Using MapReduce to parallelize blocking -- 3.2.2 Meta-blocking: pruning pairwise matchings -- 3.3 Addressing the velocity challenge -- 3.3.1 Incremental record linkage -- 3.4 Addressing the variety challenge -- 3.4.1 Linking text snippets to structured data -- 3.5 Addressing the veracity challenge -- 3.5.1 Temporal record linkage -- 3.5.2 Record linkage with uniqueness constraints --
4. BDI: data fusion -- 4.1 Traditional data fusion: a quick tour -- 4.2 Addressing the veracity challenge -- 4.2.1 Accuracy of a source -- 4.2.2 Probability of a value being true -- 4.2.3 Copying between sources -- 4.2.4 The end-to-end solution -- 4.2.5 Extensions and alternatives -- 4.3 Addressing the volume challenge -- 4.3.1 A MapReduce-based framework for offline fusion -- 4.3.2 Online data fusion -- 4.4 Addressing the velocity challenge -- 4.5 Addressing the variety challenge --
5. BDI: emerging topics -- 5.1 Role of crowdsourcing -- 5.1.1 Leveraging transitive relations -- 5.1.2 Crowdsourcing the end-to-end workflow -- 5.1.3 Future work -- 5.2 Source selection -- 5.2.1 Static sources -- 5.2.2 Dynamic sources -- 5.2.3 Future work -- 5.3 Source profiling -- 5.3.1 The Bellman system -- 5.3.2 Summarizing sources -- 5.3.3 Future work --
6. Conclusions -- Bibliography -- Authors' biographies -- Index.
Abstract: The big data era is upon us: data are being generated, analyzed, and used at an unprecedented scale, and data-driven decision making is sweeping through all aspects of society. Since the value of data explodes when it can be linked and fused with other data, addressing the big data integration (BDI) challenge is critical to realizing the promise of big data. BDI differs from traditional data integration along the dimensions of volume, velocity, variety, and veracity. First, not only can data sources contain a huge volume of data, but also the number of data sources is now in the millions. Second, because of the rate at which newly collected data are made available, many of the data sources are very dynamic, and the number of data sources is also rapidly exploding. Third, data sources are extremely heterogeneous in their structure and content, exhibiting considerable variety even for substantially similar entities. Fourth, the data sources are of widely differing qualities, with significant differences in the coverage, accuracy and timeliness of data provided. This book explores the progress that has been made by the data integration community on the topics of schema alignment, record linkage and data fusion in addressing these novel challenges faced by big data integration. Each of these topics is covered in a systematic way: first starting with a quick tour of the topic in the context of traditional data integration, followed by a detailed, example-driven exposition of recent innovative techniques that have been proposed to address the BDI challenges of volume, velocity, variety, and veracity. Finally, it presents emerging topics and opportunities that are specific to BDI, identifying promising directions for the data integration community.
Item type: E books
Current location: PK Kelkar Library, IIT Kanpur
Status: Available
Barcode: EBKE623
Total holds: 0

Mode of access: World Wide Web.

System requirements: Adobe Acrobat Reader.

Part of: Synthesis digital library of engineering and computer science.

Includes bibliographical references (pages 165-173) and index.


Abstract freely available; full-text restricted to subscribers or individual document purchasers.




Title from PDF title page (viewed on March 20, 2015).

