Intro To Data Science Prelims Reviewer

LESSON 1

Data All Around


●​ Lots of data is being collected and warehoused
○​ Web data, e-commerce
○​ Financial transactions, bank/credit transactions
○​ Online trading and purchasing
○​ Social Network

How much data do we have?


●​ Google processes 20 PB a day (2008)
●​ Facebook has 60 TB of daily logs
●​ eBay has 6.5 PB of user data + 50 TB/day (5/2009)
●​ 1000 Genomes Project: 200 TB
○​ Cost of 1 TB of HDD: ~$35
○​ Time to read a 1 TB disk: ~3 hours (at 100 MB/s)
Big Data
Big Data is any data that is expensive to manage and hard to extract value from
●​ Volume
○​ The size of the data
●​ Velocity
○​ The latency of data processing relative to the growing demand for interactivity
●​ Variety and Complexity
○​ The diversity of sources, formats, quality, and structures
Types of Data we have
●​ Relational Data (Tables/Transaction/Legacy Data)
●​ Text Data (Web)
●​ Semi-structured Data (XML)
●​ Graph Data
●​ Social Network, Semantic Web (RDF), …
●​ Streaming Data

What to do with this data?


●​ Aggregation and Statistics
○​ Data warehousing and OLAP
●​ Indexing, Searching, and Querying
○​ Keyword based search
○​ Pattern matching (XML/RDF)
●​ Knowledge discovery
○​ Data Mining
○​ Statistical Modeling

Big Data and Data Science


●​ “… the sexy job in the next 10 years will be statisticians,” Hal Varian, Google Chief
Economist
●​ The U.S. will need 140,000–190,000 predictive analysts and 1.5 million
managers/analysts by 2018 (McKinsey Global Institute, June 2011)
●​ New Data Science institutes being created or repurposed – NYU, Columbia,
Washington, UCB,...
●​ New degree programs, courses, boot-camps:
○​ e.g., at Berkeley: Stats, I-School, CS, Astronomy…
○​ One proposal (elsewhere) for an MS in “Big Data Science”

What is Data Science?


●​ An area that manages, manipulates, extracts, and interprets knowledge from
tremendous amounts of data
●​ Data science (DS) is a multidisciplinary field of study whose goal is to address the
challenges of big data
●​ Data science principles apply to all data – big and small
●​ Theories and techniques from many fields and disciplines are used to investigate and
analyze a large amount of data to help decision makers in many industries such as
science, engineering, economics, politics, finance, and education
○​ Computer Science
■​ Pattern recognition, visualization, data warehousing, High performance
computing, Databases, AI
○​ Mathematics
■​ Mathematical Modeling
○​ Statistics
■​ Statistical and Stochastic modeling, Probability.

Why is it sexy?
●​ Gartner’s 2014 Hype Cycle

Data Science is Multidisciplinary:

Real Life Examples:


●​ Companies learn your secrets, shopping patterns, and preferences
○​ For example, can we know if a woman is pregnant, even if she doesn’t want us to
know? Target case study
●​ Data Science and elections (2008, 2012)
○​ 1 million people installed the Obama Facebook app that gave access to info on
“friends”

Data Scientists
●​ Data Scientist
○​ The Sexiest Job of the 21st Century
●​ They find stories, extract knowledge. They are not reporters
●​ Data scientists are the key to realizing the opportunities presented by big data. They
bring structure to it, find compelling patterns in it, and advise executives on the
implications for products, processes, and decisions

What do data scientists do?


●​ National Security
●​ Cyber Security
●​ Business Analytics
●​ Engineering
●​ Healthcare
●​ And more ….

Concentrations in Data Science


●​ Mathematics and Applied Mathematics
●​ Applied Statistics/Data Analysis
●​ Solid Programming Skills (R, Python, Julia, SQL)
●​ Data Mining
●​ Database Storage and Management
●​ Machine Learning and discovery

LESSON 2
Database Management System (DBMS)
●​ A DBMS contains information about a particular enterprise:
○​ Collection of interrelated data
○​ Set of programs to access the data
○​ An environment that is both convenient and efficient to use
●​ Database Applications:
○​ Banking: transactions
○​ Airlines: reservations, schedules
○​ Universities: registration, grades
○​ Sales: customers, products, purchases
○​ Online retailers: order tracking, customized recommendations
○​ Manufacturing: production, inventory, orders, supply chain
○​ Human resources: employee records, salaries, tax deductions
●​ Databases can be very large.
●​ Databases touch all aspects of our lives

University Database Example


●​ Application program examples
○​ Add new students, instructors, and courses
○​ Register students for courses, and generate class rosters
○​ Assign grades to students, compute grade point averages (GPA) and generate
transcripts
●​ In the early days, database applications were built directly on top of file systems

Drawbacks of using file systems to store data


●​ Data redundancy and inconsistency
○​ Multiple file formats, duplication of information in different files
●​ Difficulty in accessing data
○​ Need to write a new program to carry out each new task
●​ Data isolation
○​ Multiple files and formats
●​ Integrity problems
○​ Integrity constraints (e.g., account balance > 0) become “buried” in program code
rather than being stated explicitly (contrast the declarative SQL sketch after this list)
○​ Hard to add new constraints or change existing ones
●​ Atomicity of updates
○​ Failures may leave database in an inconsistent state with partial updates carried
out
○​ Example: Transfer of funds from one account to another should either complete
or not happen at all
●​ Concurrent access by multiple users
○​ Concurrent access needed for performance
○​ Uncontrolled concurrent accesses can lead to inconsistencies
○​ Example: Two people reading a balance (say 100) and updating it by withdrawing
money (say 50 each) at the same time
●​ Security problems
○​ Hard to provide user access to some, but not all, data
Database systems offer solutions to all the above problems
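
For contrast, a minimal SQL sketch of stating such a constraint declaratively (the account
table here is illustrative, not from the original notes):

create table account (
    account_number char(10),
    balance        numeric(12,2),
    check (balance > 0)   -- the rule lives in the schema, not buried in program code
);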

Levels of Abstraction
●​ Physical level: describes how a record (e.g., instructor) is stored.
●​ Logical level: describes data stored in a database, and the relationships among the
data.
type instructor = record
    ID        : string;
    name      : string;
    dept_name : string;
    salary    : integer;
end;
●​ View level: application programs hide details of data types. Views can also hide
information (such as an employee’s salary) for security purposes.
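
A minimal sketch of such a view in SQL, assuming the instructor table defined in the DDL
example below (queries against it can see names and departments, but never salaries):

create view instructor_public as
select ID, name, dept_name
from instructor;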

View of Data

Instance and Schemas


●​ Similar to types and variables in programming languages
●​ Logical Schema – the overall logical structure of the database
○​ Example: The database consists of information about a set of customers and
accounts in a bank and the relationship between them
■​ Analogous to type information of a variable in a program
●​ Physical schema – the overall physical structure of the database
●​ Instance – the actual content of the database at a particular point in time
○​ Analogous to the value of a variable
●​ Physical Data Independence – the ability to modify the physical schema without
changing the logical schema
○​ Applications depend on the logical schema
○​ In general, the interfaces between the various levels and components should be
well defined so that changes in some parts do not seriously influence others.
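
A small sketch of the distinction, reusing the illustrative account table from above. Its
create table statement is part of the logical schema; the rows present at any one moment
form an instance:

-- One possible instance of account
insert into account values ('A-101', 500.00);
insert into account values ('A-102', 350.00);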

Data Models
●​ A collection of tools for describing
○​ Data
○​ Data relationships
○​ Data semantics
○​ Data constraints
●​ Relational model
●​ Entity-Relationship data model (mainly for database design)
●​ Object-based data models (Object-oriented and Object-relational)
●​ Semistructured data model (XML)
●​ Other older models:
○​ Network model
○​ Hierarchical model

Relational Model
●​ All the data is stored in various tables.
●​ Example of tabular data in the relational model
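
A sketch of such a table (the column layout matches the instructor relation defined in the
DDL example below; the rows are illustrative):

ID    | name       | dept_name  | salary
------+------------+------------+---------
10101 | Srinivasan | Comp. Sci. | 65000.00
12121 | Wu         | Finance    | 90000.00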

Data Definition Language (DDL)


●​ Specification notation for defining the database schema

Example:
create table instructor (
    ID        char(5),
    name      varchar(20),
    dept_name varchar(20),
    salary    numeric(8,2)
);

●​ DDL compiler generates a set of table templates stored in a data dictionary


●​ Data dictionary contains metadata (i.e., data about data)
○​ Database schema
○​ Integrity constraints
■​ Primary key (ID uniquely identifies instructors)
○​ Authorization
■​ Who can access what
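
A sketch of how the primary key and an authorization would be stated in SQL (syntax as
accepted by most systems; the registrar_staff role is hypothetical):

alter table instructor add primary key (ID);     -- integrity: ID uniquely identifies a row
grant select on instructor to registrar_staff;   -- authorization: who can read the table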
Data Manipulation Language (DML)
●​ Language for accessing and manipulating the data organized by the appropriate data
model
○​ DML also known as query language
●​ Two classes of languages
○​ Pure – used for proving properties about computational power and for
optimization
■​ Relational Algebra
■​ Tuple relational calculus
■​ Domain relational calculus
○​ Commercial – used in commercial systems
■​ SQL is the most widely used commercial language
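
A sketch of a DML query in SQL against the instructor table from the DDL example above:

select name, salary
from instructor
where dept_name = 'Comp. Sci.';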

SQL
●​ The most widely used commercial language
●​ SQL is NOT a Turing machine equivalent language
●​ To be able to compute complex functions, SQL is usually embedded in some
higher-level language
●​ Application programs generally access databases through one of
○​ Language extensions to allow embedded SQL
○​ Application program interface (e.g., ODBC/JDBC) which allow SQL queries to be
sent to a database

Database Design
The process of designing the general structure of the database:
●​ Logical Design – Deciding on the database schema. Database design requires that we
find a “good” collection of relation schemas.
○​ Business decision – What attributes should we record in the database?
○​ Computer Science decision – What relation schemas should we have and how
should the attributes be distributed among the various relation schemas?
●​ Physical Design – Deciding on the physical layout of the database
Design Approaches
●​ Need to come up with a methodology to ensure that each of the relations in the
database is “good”
●​ Two ways of doing so:
○​ Entity Relationship Model (Chapter 7)
■​ Models an enterprise as a collection of entities and relationships
■​ Represented diagrammatically by an entity-relationship diagram:
○​ Normalization Theory (Chapter 8)
■​ Formalize what designs are bad, and test for them

Object-Relational Data Models


●​ Relational model: flat, “atomic” values
●​ Object Relational Data Models
○​ Extend the relational data model by including object orientation and constructs to
deal with added data types.
○​ Allow attributes of tuples to have complex types, including nonatomic values such
as nested relations.
○​ Preserve relational foundations, in particular the declarative access to data, while
extending modeling power.
○​ Provide upward compatibility with existing relational languages.

XML: Extensible Markup Language


●​ Defined by the WWW Consortium (W3C)
●​ Originally intended as a document markup language, not a database language
●​ The ability to specify new tags, and to create nested tag structures made XML a great
way to exchange data, not just documents
●​ XML has become the basis for a new generation of data interchange formats.
●​ A wide variety of tools is available for parsing, browsing and querying XML
documents/data

Database Engine
●​ Storage manager
●​ Query processing
●​ Transaction manager

Storage Management
●​ Storage manager is a program module that provides the interface between the low-level
data stored in the database and the application programs and queries submitted to the
system.
●​ The storage manager is responsible for the following tasks:
○​ Interaction with the OS file manager
○​ Efficient storing, retrieving and updating of data
●​ Issues:
○​ Storage access
○​ File organization
○​ Indexing and hashing
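
A sketch of how indexing is typically requested in SQL (create index is supported by most
systems, though it is not part of the SQL standard):

create index instructor_dept_idx on instructor (dept_name);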

Query Processing
1.​ Parsing and translation
2.​ Optimization
3.​ Evaluation

●​ Alternative ways of evaluating a given query


○​ Equivalent expressions
○​ Different algorithms for each operation
●​ Cost difference between a good and a bad way of evaluating a query can be enormous
●​ Need to estimate the cost of operations
○​ Depends critically on statistical information about relations which the database
must maintain
○​ Need to estimate statistics for intermediate results to compute cost of complex
expressions
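
Many systems expose the optimizer’s chosen plan for inspection; for example, in PostgreSQL
(syntax varies across systems):

explain select name from instructor where dept_name = 'Finance';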

Transaction Management
●​ What if the system fails?
●​ What if more than one user is concurrently updating the same data?
●​ A transaction is a collection of operations that performs a single logical function in a
database application
●​ Transaction-management component ensures that the database remains in a
consistent (correct) state despite system failures (e.g., power failures and operating
system crashes) and transaction failures.
●​ Concurrency-control manager controls the interaction among the concurrent
transactions, to ensure the consistency of the database.
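
A sketch of the funds-transfer example as a SQL transaction, reusing the illustrative
account table from earlier:

start transaction;
update account set balance = balance - 50 where account_number = 'A-101';
update account set balance = balance + 50 where account_number = 'A-102';
commit;   -- either both updates take effect, or (after a failure or rollback) neither does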
Database Users and Administrators
Database Architecture
The architecture of a database system is greatly influenced by the underlying computer system
on which the database is running:
●​ Centralized
●​ Client-server
●​ Parallel (multi-processor)
●​ Distributed

History of Database Systems


●​ 1950s and early 1960s:
○​ Data processing using magnetic tapes for storage
■​ Tapes provided only sequential access
○​ Punched cards for input
●​ Late 1960s and 1970s:
○​ Hard disks allowed direct access to data
○​ Network and hierarchical data models in widespread use
○​ Ted Codd defines the relational data model
■​ Would win the ACM Turing Award for this work
■​ IBM Research begins System R prototype
■​ UC Berkeley begins Ingres prototype
○​ High-performance (for the era) transaction processing
●​ 1980s:
○​ Research relational prototypes evolve into commercial systems
■​ SQL becomes industrial standard
○​ Parallel and distributed database systems
○​ Object-oriented database systems
●​ 1990s:
○​ Large decision support and data-mining applications
○​ Large multi-terabyte data warehouses
○​ Emergence of Web commerce
●​ Early 2000s:
○​ XML and XQuery standards
○​ Automated database administration
●​ Later 2000s:
○​ Giant data storage systems
■​ Google BigTable, Yahoo! PNUTS, Amazon, …
