
Cloud Computing

Professor Mangal Sain


Lecture 7 – Part 1

A Brief History of Big Data


c. 18,000 BCE
 Humans use tally sticks to record data
for the first time. These are used to track
trading activity and record inventory.
c. 2400 BCE
 The abacus is developed, and the first
libraries are built in Babylonia
300 BCE – 48 AD
 The Library of Alexandria is the world’s
largest data storage center – until it is
destroyed by the Romans.
c. 200 – 100 BCE
 The Antikythera Mechanism – the first
known mechanical computer – is developed in
Greece.
1663
 John Graunt conducts the first recorded
statistical-analysis experiments in an
attempt to curb the spread of the bubonic
plague in Europe
1865
 The term “business intelligence” is used
by Richard Millar Devens in his
Encyclopaedia of Commercial and
Business Anecdotes
1881
 Herman Hollerith creates the Hollerith
Tabulating Machine which uses punch
cards to vastly reduce the workload of the
US Census.
1926
 Nikola Tesla predicts that in the future,
a man will be able to access and analyze
vast amounts of data using a device small
enough to fit in his pocket.
1928
 Fritz Pfleumer creates a method of
storing data magnetically, which forms the
basis of modern digital data storage
technology.
1944
 Fremont Rider speculates that Yale
Library will contain 200 million books
stored on 6,000 miles of shelves by 2040.
1958
 Hans Peter Luhn defines Business
Intelligence as “the ability to apprehend
the interrelationships of presented facts
in such a way as to guide action towards
a desired goal.”
1962
 The first steps are taken towards speech
recognition when IBM engineer William C. Dersch
presents the Shoebox Machine at the 1962 World’s
Fair. It can interpret spoken numbers and sixteen
English words into digital information.
1964
 An article in the New Statesman refers to the
difficulty in managing the increasing amount of
information becoming available.
1965
 The US Government plans the world’s
first data center to store 742 million tax
returns and 175 million sets of
fingerprints on magnetic tape.
THE INTERNET EFFECT AND PERSONAL COMPUTERS
1970
 The relational database model is developed
by IBM mathematician Edgar F. Codd. It
allows records to be accessed using a
simple index system. This means anyone can
use databases, not just computer scientists.
1976
 Material Requirements Planning (MRP)
systems are commonly used in business.
Computers and data storage are used for
everyday routine tasks.
PERSONAL COMPUTERS
1989
 Early use of the term Big Data in a magazine
article by fiction author Erik Larson,
commenting on advertisers’ use of data to
target customers.
1991
 The birth of the World Wide Web. Anyone can
now go online and upload their own data,
or analyze data uploaded by other people.

 HTML (HyperText Markup Language)
 URL (Uniform Resource Locator)
 HTTP (HyperText Transfer Protocol)
1996
 The price of digital storage falls to the
point where it is more cost-effective than
paper.
1997
 Google launches its search engine, which
will quickly become the most popular in
the world.
 Michael Lesk estimates the digital
universe is increasing tenfold in size
every year.
Lecture 7 – Part 2

A Brief History of Big Data


1999
 First use of the term Big Data in an
academic paper – Visually Exploring
Gigabyte Datasets in Realtime (ACM)

 First use of the term Internet of Things, in a
business presentation by Kevin Ashton to
Procter and Gamble.
2000
 In How Much Information?, Peter Lyman and Hal
Varian (now chief economist at Google) attempted
to quantify the amount of digital information in
the world, and its rate of growth, for the first
time.
2001
 Three “Vs” of Big Data, defined by Doug Laney:
 Volume
 Velocity
 Variety


2005
 Hadoop – an open-source Big Data
framework, now developed by Apache – is
created.
 The birth of “Web 2.0 – the user-generated web”.
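Hadoop popularized the MapReduce programming model. The idea can be sketched in miniature with a word-count example – a minimal single-process illustration only (real Hadoop distributes the map, shuffle, and reduce phases across a cluster):

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data is big", "data is everywhere"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

Because each map call looks at one record and each reduce call looks at one key, both phases parallelize naturally – which is what lets Hadoop scale the same logic to cluster-sized data.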
2008
 Globally, 9.57 zettabytes (9.57 trillion
gigabytes) of information is processed by
the world’s CPUs.
 An estimated 14.7 exabytes of new
information is produced this year.
2009
 The average US company with over 1,000
employees is storing more than 200
terabytes of data, according to the report
Big Data: The Next Frontier for Innovation,
Competition and Productivity by the
McKinsey Global Institute.
2010
 Eric Schmidt, executive chairman of
Google, tells a conference that as much
data is now being created every two days
as was created from the beginning of
human civilization to the year 2003.
2011
 The McKinsey report states that by 2018
the US will face a shortfall of between
140,000 and 190,000 professional data
scientists, and warns that issues including
privacy, security and intellectual property
will have to be resolved before the full value
of Big Data will be realised.
2014
 Mobile internet use overtakes desktop for
the first time

 88% of executives responding to an


international survey by GE say that big
data analysis is a top priority
2015
 Data volumes are exploding: more data
has been created in the past two years
than in the entire previous history of
the human race.
SPARK
IMPROVING EXPERIENCE (MID 2010S)
REDUCING RESPONSIBILITY WITH THE SAME FLEXIBILITY (2020)
BIG DATA USE
 Big Data is revolutionizing entire
industries and changing human culture
and behavior. It is a result of the
information age and is changing how
people exercise, create music, and work.
BIG DATA ANALYTICS
 Analytics has, in a sense, been around
since 1663, when John Graunt dealt with
“overwhelming amounts of information,”
using statistics to study the bubonic
plague.
BIG DATA FUTURE
 It may be fair to assume that in the future, the
success of businesses will lie not only with those
who analyze and implement big data best, but also
with those who use big data to their greatest
advantage and make strategic decisions for the
future.
Lecture 7 – Part 3

Introduction to Big Data


TOPICS
• Scope: Big Data & Analytics
• Topics:
– Foundation of Data Analytics and Data Mining
– Hadoop/Map-Reduce Programming and Data Processing &
BigTable/Hbase/Cassandra
– Graph Database and Graph Analytics

WHAT’S BIG DATA?

No single definition; here is one from Wikipedia:

 Big data is the term for a collection of data sets so
large and complex that it becomes difficult to
process using on-hand database management tools
or traditional data processing applications.
 The challenges include capture, curation, storage,
search, sharing, transfer, analysis, and
visualization.

BIG DATA: 3V’S
VOLUME (SCALE)
 Data Volume
 44x increase from 2009 to 2020 –
from 0.8 zettabytes to 35 zettabytes
 Data volume is increasing exponentially

Exponential increase in collected/generated data:
 12+ TBs of tweet data every day
 25+ TBs of log data every day
 ? TBs of data every day
 4.6 billion camera phones worldwide
 30 billion RFID tags today (1.3 billion in 2005)
 100s of millions of GPS-enabled devices sold annually
 2+ billion people on the Web by end 2011
 76 million smart meters in 2009 … 200 million by 2014
Maximilien Brice, © CERN
CERN’s Large Hadron Collider (LHC) generates 15 PB a year
THE EARTHSCOPE
• The EarthScope is the world's largest science
project. Designed to track North America's
geological evolution, this observatory records
data over 3.8 million square miles, amassing 67
terabytes of data. It analyzes seismic slips in
the San Andreas fault, but also the plume
of magma underneath Yellowstone and much,
much more.

• (http://www.msnbc.msn.com/id/44363598/ns/technology_and_science-future_of_technology/#.TmetOdQ--uI)
VARIETY (COMPLEXITY)
 Relational Data (Tables/Transaction/Legacy Data)
 Text Data (Web)
 Semi-structured Data (XML)
 Graph Data
 Social Network, Semantic Web (RDF), …

 Streaming Data
 You can only scan the data once

 A single application can be generating/collecting many
types of data

 Big Public Data (online, weather, finance, etc)

To extract knowledge ➔ all these types of data need
to be linked together
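The streaming constraint above – you can only scan the data once – means aggregates must be maintained incrementally as items arrive, rather than by re-reading stored data. A minimal one-pass sketch (function name and toy readings invented for illustration):

```python
def running_stats(stream):
    # One pass over the stream: past items are never stored or revisited
    count, total = 0, 0.0
    for value in stream:
        count += 1
        total += value
        yield count, total / count  # running count and mean so far

# Readings arrive one at a time and cannot be replayed
readings = iter([4.0, 6.0, 8.0])
for count, mean in running_stats(readings):
    print(count, mean)  # final line: 3 6.0
```

The same pattern – constant memory, one update per arriving item – underlies practical streaming aggregations and sketch data structures.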
A Single View of the Customer
Social media, banking/finance, gaming, known history,
entertainment, and purchase data all linked to one
customer profile.
VELOCITY (SPEED)

 Data is being generated fast and needs to be processed fast


 Online Data Analytics

 Late decisions ➔ missing opportunities

 Examples
 E-Promotions: Based on your current location, your purchase
history, and what you like ➔ send promotions right now for the store
next to you

 Healthcare monitoring: sensors monitoring your activities and
body ➔ any abnormal measurements require immediate reaction
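The healthcare example amounts to checking each measurement the instant it arrives instead of in a later batch job. A toy sketch of that per-reading check (function name, readings, and thresholds are invented for illustration):

```python
def monitor(readings, low=50, high=120):
    # React to each heart-rate reading as it arrives,
    # rather than waiting for a batch job to analyze it later
    alerts = []
    for reading in readings:
        if reading < low or reading > high:
            alerts.append(reading)  # in practice: alert a clinician immediately
    return alerts

print(monitor([72, 80, 130, 65, 45]))  # [130, 45]
```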
REAL-TIME/FAST DATA

Mobile devices (tracking all objects all the time)
Scientific instruments (collecting all sorts of data)
Social media and networks (all of us are generating data)
Sensor technology and networks (measuring all kinds of data)

 Progress and innovation are no longer hindered by the ability to
collect data
 But by the ability to manage, analyze, summarize, visualize, and
discover knowledge from the collected data in a timely manner and in a
scalable fashion
Real-Time Analytics/Decision Requirement

Product recommendations that are relevant & compelling
Learning why customers switch to competitors and their offers, in time to counter
Influencing behavior
Friend invitations to join a game or activity that expands business
Improving the marketing effectiveness of a promotion while it is still in play
Preventing fraud as it is occurring & preventing more proactively
SOME MAKE IT 4V’S
HARNESSING BIG DATA

 OLTP: Online Transaction Processing (DBMSs)
 OLAP: Online Analytical Processing (Data Warehousing)
 RTAP: Real-Time Analytics Processing (Big Data Architecture & Technology)
THE MODEL HAS CHANGED…

 The Model of Generating/Consuming Data has Changed

Old Model: Few companies are generating data, all others are consuming data

New Model: All of us are generating data, and all of us are consuming data
WHAT’S DRIVING BIG DATA
From traditional BI:
- Ad-hoc querying and reporting
- Data mining techniques
- Structured data, typical sources
- Small to mid-size datasets

To Big Data analytics:
- Optimizations and predictive analytics
- Complex statistical analysis
- All types of data, and many sources
- Very large datasets
- More of a real-time nature
THE EVOLUTION OF BUSINESS INTELLIGENCE
1990’s – BI Reporting: OLAP & Data warehouse
(Business Objects, SAS, Informatica, Cognos, other SQL reporting tools)

2000’s – Big Data (scale): Batch processing & distributed data store
(Hadoop/Spark; HBase/Cassandra)

2010’s – Interactive Business Intelligence & In-memory RDBMS (speed):
QlikView, Tableau, HANA; Big Data (speed & scale): Real time & single
view, Graph databases

BIG DATA ANALYTICS

 Big data is more real-time in nature than
traditional DW applications
 Traditional DW architectures (e.g. Exadata,
Teradata) are not well-suited for big data
apps
 Shared-nothing, massively parallel
processing, scale-out architectures are
well-suited for big data apps
BIG DATA TECHNOLOGY
