ITP102 Reviewer

Week 7 & 8 - Introduction to Cascading Style Sheets (CSS) - Lab

▪ CSS stands for Cascading Style Sheets

▪ CSS describes how HTML elements are to be displayed on screen, paper, or in other media.

▪ CSS saves a lot of work. It can control the layout of multiple web pages all at once.

▪ External stylesheets are stored in CSS files.

- CSS is used to define styles for your web pages, including the design, layout and variations in display for
different devices and screen sizes

- HTML was NEVER intended to contain tags for formatting a web page

- CSS removed the style formatting from the HTML page

Benefits of CSS

▪ Solves a big problem

▪ Saves a lot of time

▪ Provides more attributes

CSS Syntax

- A CSS rule consists of a selector and a declaration block.

Selector: The selector indicates the HTML element you want to style. It could be any element, such as <h1> or <p>.

Declaration Block: The declaration block can contain one or more declarations separated by a semicolon.
Each declaration contains a property name and value, separated by a colon.

Property: A property is the aspect of an HTML element you want to style, such as color or border.

Value: Values are assigned to CSS properties
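For example, a complete rule putting these pieces together (this is the rule the notes below describe):

p {
  color: red;
  text-align: center;
}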


▪ p is a selector in CSS (it points to the HTML element you want to style: <p>).

▪ color is a property, and red is the property value

▪ text-align is a property, and center is the property value

CSS Selector

- used to select the content you want to style

- Selectors are the part of CSS rule set

- selects HTML elements according to their id, class, type, attribute, etc.

Different types of CSS Selectors:

CSS Element Selector

- selects HTML elements based on the element name

CSS Id Selector

- selects the id attribute of an HTML element to select a specific element

- an id is always unique within the page, so it is used to select a single, unique element

- written with the hash character (#), followed by the id of the element
CSS Class Selector

- selects HTML elements with a specific class attribute

- To select elements with a specific class, write a period (.) character, followed by the class name

CSS Universal Selector

- used as a wildcard character (*)

- It selects all the elements on the page

CSS Group Selector

- used to select all the elements with the same style definitions

- used to minimize the code

- Commas are used to separate each selector in grouping
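Short illustrations of the selector types above (the element, id, and class names are arbitrary):

h1 { color: navy; }                 /* element selector */
#header { background-color: gray; } /* id selector */
.note { font-style: italic; }       /* class selector */
* { margin: 0; }                    /* universal selector */
h1, h2, p { text-align: center; }   /* group selector */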



How to add CSS

- CSS is added to HTML pages to format the document according to information in the style sheet

Three ways to insert CSS in HTML documents:

1. Inline CSS

- used to apply CSS to a single element via the style attribute

2. Internal CSS

- used to apply CSS on a single document or page

- can affect all the elements of the page

- written inside a <style> tag within the <head> section of the HTML


3. External CSS

- used to apply CSS on multiple pages or all pages

- Its extension must be .css, for example style.css, and it is linked via a <link> tag in the <head>
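Minimal sketches of the three methods (the file name and color values are arbitrary):

<!-- 1. Inline CSS: a style attribute on a single element -->
<p style="color: blue;">Hello</p>

<!-- 2. Internal CSS: a <style> tag inside <head> -->
<style>
p { color: blue; }
</style>

<!-- 3. External CSS: link a separate .css file such as style.css -->
<link rel="stylesheet" href="style.css">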

Week 7 & 8 - INTEGRATING DISPARATE DATA & APPLICABLE TECHNOLOGIES - Lecture

- Information systems exist to manage data and databases.

- disparities in the data and databases happen at different levels of the software production cycle.

- disparities happen because of the varying design and development environments.

- differing environments bring with them issues of data and database propriety, collaboration, access and constraints, technology compatibility, and security features

What are Disparities in Data?

- Disparate data is heterogeneous data with variety in data formats, diverse data dimensionality, and low
quality.

- Missing values, inconsistencies, ambiguous records, noise, and high data redundancy contribute to the
'low quality' of disparate data.

- Disparities in data happen because different people are coming from different environments

- Data platforms and databases, as well as database servers worldwide, are designed, manufactured, and
developed by organizations in the cutthroat competitive environment

- part of organizational struggles to emerge on top of the others to ensure viability.

- Each needs to outperform the others, often at the others' expense.

- In data and database development fields, the same stiff competition means each vendor will try to outdo the others.

- In striving to emerge above the competition, what each vendor offers as "the best" can work against data and database unification.

- Disparities in data can also be viewed through the lens of a turf war.
- Data is represented in a way that showcases the territories or domains of organizational allies or
affiliates.

- The most prominent front in this turf war is the tussle between the open-source community and the proprietary community.

Key Points of Disparities in Data:

Data Leveraging

- Data use must be provided for to the utmost.

- Organizations try to make the most of their data resources.

- Other organizations view the leveraging of rival organizations' data as a way of spearheading their
interests.

Disparate File Systems

- Platform technologies (a.k.a. operating system platforms) come from different origins. For instance, the Microsoft Windows environment uses the FAT (File Allocation Table) and NTFS (New Technology File System) file systems, while Linux distributions commonly use file systems such as ext4.

Data Stores

- Hard drives and other data repositories were built by manufacturers usually for generic data storage.
However, some proprietary technology platforms are already built into the computing package.
- This is notable among laptops and mobile computers whose components are bundled

Data Presentation
- There is no doubt that the screen display for data is constrained by disparate presentation formats.
- data is either minified or magnified in congruence with the computing capabilities of users' devices.

Data Type Constructs


- Database servers and programming languages are at the heart of the data type definitions provided in
the codes.
- The data type support for certain constructs defines the size (or the field lengths), scope, and format of
data.
- Universally recognized types are integers, decimals, strings, characters, and dates.
- programming languages have their differences or variations for each of these major data types.
Advanced Data Types
- binary large object (BLOB), multimedia data, grid data, objects, and geographic and spatial data.
- more of a concern of database server systems.

Integrating Disparate Data


- One of the key frameworks for integrating disparate data is subsumed under the term data integration.
- The process of combining data from various sources into a single dataset is known as data integration.
- Its main objective is to give users consistent access to and delivery of data across a wide range of
topics and structure types, as well as to satisfy the information requirements of all applications and
business processes.

Activities for Data Integration

Data Conversion

- In some programming languages (e.g., C#, Java), input data is often initially handled as strings.
- data are coded to specify data conversion (e.g., from string to integer, etc.).
- Methods like ToString() or ToDecimal() are common occurrences in these programming languages.
- Other programming languages are typeless – coders need not specify a data type for a variable.
- When the variable is first referenced, the value that it was assumed to hold is the consequent data type
for the variable.
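A minimal Python sketch of data conversion, assuming values arrive as strings (the variable names and values are illustrative; the conversions parallel C#'s ToString()/ToDecimal()):

from decimal import Decimal

# Values read from files or user input often arrive as strings
raw_quantity = "42"
raw_price = "19.99"

quantity = int(raw_quantity)   # string -> integer
price = Decimal(raw_price)     # string -> exact decimal
total = quantity * price       # arithmetic on the converted values
label = str(total)             # number -> string for display
print(label)                   # 839.58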

Data processing

- Some data may need to be subject to further processing to fit into certain data constructs.
- Data trimming removes or extracts unqualified parts of data.
- Commonly trimmed are spaces, leading or trailing zeroes, salt values, and other incongruent add-ons to the data.
- Processing spans data collection, data cleaning, data transformation, data analysis, and data visualization, through to data storage and management.
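A minimal Python sketch of data trimming (the raw value is arbitrary; removesuffix requires Python 3.9+):

raw = "  00012345-XYZ  "
no_spaces = raw.strip()                 # drop surrounding spaces -> "00012345-XYZ"
no_zeroes = no_spaces.lstrip("0")       # drop leading zeroes -> "12345-XYZ"
core = no_zeroes.removesuffix("-XYZ")   # drop a trailing salt value -> "12345"
print(core)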

Data Codification
- categorizing data with some hashing functions.
- Codification is more of a convention independently undertaken by entities.
- Since it is a convention, there must be a way to know how the codification is formulated.
- A series of numbers is often actually concatenated strings, with each segment (explicitly or implicitly displayed) representing subcategories. Samples of coded data are product codes, digital signatures, URLs, IP addresses, and so on.
- Salt values like leading and trailing zeroes, hyphens, asterisks, and other special characters provide
uniformity in the coded data.
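A small Python sketch of codification along these lines, hashing a coded value built from concatenated segments (the segments and choice of hash are assumptions for illustration):

import hashlib

# A product code concatenated from segments: country + series + year
product_code = "PH" + "-" + "00123" + "-" + "2023"

# A hashing function can fingerprint or categorize the coded value
digest = hashlib.sha256(product_code.encode("utf-8")).hexdigest()
print(product_code, "->", digest[:12])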
Data Drive Formatting
- Drives are empty contiguous storage locations where data can be written.
- Most drives are readily available for read-write operations upon purchase.
- Being platform technology-oriented, they need to be formatted to allow for the free rein of the intended platform.
- Part of the formatting is the creation of partitions and the installation of the file system to match the
intended operating system

Data Unification

- A unified view of data helps promote clarity and interoperability among information systems.
- The foremost step toward a unified view of data is the unification of the data itself.
- Data unification may be viewed as the process of consuming data from many operational systems and merging such disparate data sources into a single source, through data transformations, schema integrations, removal of duplications, and general record cleaning.
- Consider all the various systems and programs utilized at your firm to gain an understanding of the main difficulties in data unification.

Unifying Data Terminology


- Definitions of data elements and methods for computing them may differ between business units.
- e.g., accounting people have terminologies which could be quite different from engineering people's; data coming from external sources would exhibit even greater disparity in terminology.
- Prior planning and the creation of a uniform data language are essential.
- The key to successfully adopting a supply chain data unification system is to establish a standard definition of what a data set refers to and to consistently adhere to a common data language.

Standardize Data Sets


- process of putting data into a format that is widely understood and used by computers.
- Data standardization is primarily done to increase data quality, lower transactional costs, and facilitate
improved decision-making.
- Data standardization can also improve teamwork and coordination amongst various departments and
teams.
- Standardizing data, for instance, can involve changing all measurements to the metric system or all dates to a single format (such as YYYY-MM-DD).
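A minimal Python sketch of date standardization to YYYY-MM-DD (the input formats are assumptions for illustration):

from datetime import datetime

# Dates arriving in mixed formats from different sources
raw_dates = ["03/15/2023", "15-03-2023", "2023.03.15"]
formats = ["%m/%d/%Y", "%d-%m-%Y", "%Y.%m.%d"]

standardized = [
    datetime.strptime(d, f).strftime("%Y-%m-%d")
    for d, f in zip(raw_dates, formats)
]
print(standardized)  # ['2023-03-15', '2023-03-15', '2023-03-15']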

Data Character Encoding


- Owing to their geographic origins, data character representations vastly vary.
- Our standard A to Z alphabet may be gaining universal acceptance.
- large portion of character representations is still predominant in certain regions.
- Japanese, Chinese, Arabic, Russian, and Germanic characters (just a few among so many character
representations worldwide) have their own peculiar character representations.
- To handle this character disparity, encoding is applied to bridge the character gaps.
- though no character encoding is perfect.
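A minimal Python sketch of bridging character gaps with encoding (UTF-8 chosen here as the common bridge):

text = "こんにちは"                 # Japanese characters
encoded = text.encode("utf-8")      # str -> bytes for storage or transmission
decoded = encoded.decode("utf-8")   # bytes -> str on the receiving side
assert decoded == text

# Using a narrower encoding shows why no single legacy encoding is perfect
try:
    encoded.decode("ascii")
except UnicodeDecodeError as err:
    print("encoding mismatch:", err)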

Data Generation Technologies

- Environmental sensing and data generation technologies are at the heart of technology integration
among software systems.
- Sensors are devices embedded onto particular machines, equipment, environmental focus or objects to
enable the determination or measure of their physical properties.
- Environmental sensing refers to different modes and means by which environmental elements are
measured.

- data generation technologies are those which provide specific data on the focus of ambient measurement as prompted by system users.
- data logger technologies are those devices which measure, at mostly regular intervals, several parameters in the immediate environment in which they are placed.

Sensors
- are the prime components for environmental sensing. These sensors may be divided into:

a. location sensors,
b. movement (motion) sensors and,
c. environment sensors.

Motion sensors

- These are the sensors which measure acceleration and rotational forces along three axes (x, y, and z).
- Belonging in this category are gyroscopes, accelerometers, gravity sensors, and sensors for rotational vector measurements.

Environmental sensors

- These are the sensors which measure various elements in the environment.
- Examples of these elements are air pressure, ambient air or room temperature, degree of darkness or brightness (illumination), and relative humidity.
- Included in this category are photometers, barometers, and thermal scanners (or thermometers).

Position sensors
- These are the sensors which could measure the current physical location, direction or position of an
IoT-enabled device.
- Included in this category are sensors for measuring orientation and sensors for measuring magnetic fields (magnetometers).

- Other data generation sensors may be used for explicit purposes such as measuring water quality parameters like acidity, turbidity, dissolved solids, BOD, pressure, etc.
- Human wearable sensors may be used for measuring body characteristics, as in step counters, biometrics (fingerprints, eye, voice, clap, etc.), heart rate, biomedical gadgets, and so forth.
- Geological sensing may also be measured through devices that measure wind qualities, tremors,
volcanic activities, water levels, humidity, hydrology, illumination, precipitation, oxidation, etc.
- Computer based sensors are also embedded onto machines (mostly transport machines) to determine
range, direction, odometry, distances, counts, etc.

- Even institutions, offices and schools embed technologies to determine burglary (CCTV and
identification sensors), attendance monitoring (QR codes, bar codes, RFID, etc.) to facilitate record
access, authentication (ingress-egress systems), property management loggers, etc. The technologies
just cannot be listed all at once here.

Data Processing Technologies

- come in several forms like in data analytics, data cleansing, data classification, data presentation, data
warehousing, data mining

- Artificial intelligence even further enhances the processing to provide for predictions, forecasts,
associations, differentiations, distinction, relationships, clustering, mapping, distributions, etc.
- conversion of somehow raw data into more usable and desired forms.
- The conversion or “processing” is carried out using a certain predefined sequence of operations which are either manually or automatically undertaken.
- Most of the processing is done by using computers and other data processing devices, and thus done
automatically
- output or “processed” data can be obtained in various forms and formats.
- ex. images, graphs, tables or matrices, vector files, audio files, charts or any other user-desired
formats.
- form generated depends on the software or method used
- When done without human intervention, it is referred to as automatic data processing
- Data centers are the key components of these automated data generation as they enable processing,
storage, access, sharing and analysis of tremendous amounts of data.
- Data processing can lead to better productivity and more profits for various business fields.
- Related concerns include safeguarding data privacy, data security, machine learning, data science, network security, etc.
- It requires a focused approach for reliable, accurate & cost-effective processing

Data Logging Technologies


- work by automatically providing monitoring or measurement of specific environmental parameters or
equipment
- undertaken at certain configured intervals via their sensor components
- Most data loggers have built-in recording and storage capacities
- for continuous recording of data, however, logged data needs to be transferred onto bigger repositories
- Some data loggers automatically erase previous recordings to make room for incoming data flow
- Data loggers are often configured as relatively simple ‘standalone’ devices
- they may be configured as more complex multi-channel versions
- can be assigned with several sensors and accessories
- Data loggers provide monitoring and/or measurements at configured time intervals
- Passive devices are activated only when there are triggers, for instance when highly irregular readings
are monitored, recording starts.
- some settings are configured to enable it to adjust frequency of data reading
- configured to measure at specific times, length or other environmental parameters
- they can be set to take a single measurement per unit of time, such as per hour or per day, or even several readings per second
- results can be presented graphically

Modality Platform Technologies


- In IT lingo, the term platform refers to any hardware or software used to host an application or service
- application platform, for example, consists of hardware, an operating system and coordinating
programs that use the instruction set for a particular processor or microprocessor
- Modality platform refers to the combination of hardware and software to which IT services are
rendered via provided modes: a. desktop, b. web, c. mobile and d. cloud facilities

Week 9 & 10 - INTEGRATING DISPARATE DATABASES

- Information systems exist to manage data and databases

- disparities in the data and databases happen at different levels of the software production cycle

- disparities happen because of the varying design and development environments

Disparities in Databases and their Servers

- The IT industry is quite lucrative

- a survey would perhaps yield, at the top of the list of the ten wealthiest individuals worldwide, the owners and/or founders of Amazon, Google, Microsoft, Apple Corporation, Intel, Samsung, Oracle, etc.

- Disparities in data and databases happen because different people are coming from different
environments

- above-cited ultra-rich people know the data and information are the keys to their immense wealth

- Data platforms and databases, as well as database servers worldwide, are developed by organizations
- Data analytics, data warehousing, data science, big data and a whole lot more IT terms have emerged
as foundational to organizational successes

- Client information, market analysis, future trends, business intelligence, innovative product
development, targeted advertisements, and other data-driven endeavors (spearheading growths and
developments that spur tremendous wealth creation)

- database server systems are being continually redesigned to meet such ever-growing expectations

- Database servers sprawl across the continents, crisscrossing boundaries at great expense

- the returns on investments dwarf such huge investments

- Database administrators have their heyday finding the means to create the optimum leverages for data

- database servers must be kept in constant check to ward off deep intrusions and sabotage

- some players may employ means that make database management processes disparate, eventually leading to data propriety issues and challenges

Key Points of Disparities in Databases and their Servers

Database Configuration

- a database server configuration acts as the data repository for several applications in this system design, integrating data from various applications

- integration database must consider all of its client applications

- the resulting schema has to unify what should be distinct bounded contexts of many systems, making it either more generic, more complex, or both

- Database changes are more complicated because they must be negotiated between the database
group and the various applications

- they are typically managed by a separate organization from those responsible for application
development

- the benefit is that data sharing between applications does not necessitate the addition of an extra layer of integration services or disparate connectors

- When changes are committed to the database, any data changes made in a single application are made available to all applications, keeping the applications' data use more synchronized

- Database integration causes serious issues because the database becomes a common area of attachment among the applications that access it
- this is typically a deep coupling that greatly increases the risk of changing those applications and makes them more difficult to evolve

- As a result, the majority of software architects believe that integration databases should be avoided as a configuration as much as possible

Centralized Databases

- all data is stored and maintained in one place

- central location is a computer or database

- All the data is stored in a single location, for example, a mainframe computer

- typically maintained and modified over a network connection such as a LAN or WAN

- mostly used in colleges, banks, hospitals, and in small companies

- one computer acts as a server for storing whole data

- the simplest type of database setup, client/server, is utilized in centralized database systems, where a client sends a request to the server

- when a request is made, the server will receive it and respond

- Many small organizations use centralized databases for this reason

Database Integration

- process of integrating information from numerous sources

- sources include those from Internet of Things (IoT), social media data, data generated by sensors, data
warehouse systems, client transactions, etc.

- applications share a current, cleansed version of the database across the departments of the organization

- integration of databases serves as the hub via which all shared data is transmitted

- process is important in different situations, including commercial (for example, when two distinct
companies must collaborate and merge their databases)

- scientific (for example, combining these results from diverse bioinformatics data stores) domains

- As data volume (i.e., big data) and the requirement to share existing data increase, data integration becomes more widespread

- it remains a subject of in-depth theoretical research and still has a great deal of open issues
- collaboration among internal and external users is encouraged by data integration

- data that is being integrated must come from a diverse database system

- it must be turned into one intelligible data store

- data coherence provides synchronous data transmission across the network

- Another prominent usage of data integration is in various data mining systems

- undertaken when evaluating and extracting information from the databases

- data integration can be critically valuable for business information

- Business mergers may have separate databases which both contain critical data

- combined data for operations may be deduplicated and stored accordingly

- Cleansing and security can also be shared with stakeholders

Benefits of Database Integration

- Properly and fully managed database processes provided by database integration are converted to
measurable outcomes like:

Universal reliability of business data

- being able to maintain just one source of truth for critical data across the global scope

Holistic operations oversight

- Managing business intelligence from a centralized, visualized operations screen is a useful tool for
identifying bottlenecks

- a tool for enhancing user experience

- significantly shorten delivery cycles and other tasks

Simplified security

- As high-profile hacks dominate the news, businesses are aware that they face more avenues of being accessed

- isolated and on-site environment networks pose even greater security loopholes and threats than ever before
- with an integrated deployment, the definitive versions of data pass through (enter and exit) distinct sources, which greatly simplifies the security of critical information

Easier compliance

- Compliance with international and national operating standards, such as HIPAA, PCI, and GDPR, is
important in modern, digital business

- Database integration provides centralized management for ensuring enterprise compliance

Database Integration Frameworks

- these are the primary data integration platforms used by organizations

- they provide greater means of transforming raw data into data analytics and business intelligence

On-Premises Database Integration

- Traditional on-premises network infrastructures are supported by on-site database integration solutions, which are frequently sold as standalone products

- they are deployed locally and communicate with the hardware and databases that are already in place (to clean, monitor, and transform data)

Cloud Database Integration

- these solutions are termed cloud-native

- they run as part of a database infrastructure through which they interact in the background with all enterprise data transactions

Hybrid Database Integration

- framework that highlights the use of software as a service (SaaS) cloud services to synchronize and
manage data both locally and remotely

Open-Source Software for Integration

- three Apache software tools that significantly help in database integration:


1. Apache Hadoop

- Based upon the Java programming language, Hadoop is a software framework used for distributed storage and processing

- petabytes of information can be bulk processed on both physical and remote servers

- the servers return the cleaned and processed data with much reliability

2. Apache Spark

- a companion tool for Hadoop that improves distributed processing, running more than 100 times faster than Hadoop's MapReduce framework

- does not have its own file management system, unlike Hadoop

- it achieves greater processing by putting most data processing in memory

- other frameworks rely on transferring data to an immediate physical or remote location for their processing
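A minimal PySpark sketch of this in-memory distributed processing (assumes a local Spark installation; the file name and column names are hypothetical):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("integration-demo").getOrCreate()

# Read a CSV source; the transformations below run in Spark's in-memory engine
orders = spark.read.csv("orders.csv", header=True, inferSchema=True)
recent = orders.filter(orders["order_date"] >= "2023-01-01")
recent.groupBy("country").count().show()

spark.stop()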

3. Apache Cassandra

- Cassandra is a NoSQL database, a development in information processing that made database integration possible

- integration here happens among dissimilar file formats, texts, images, multimedia, etc.

- primary source of big data

- NoSQL databases removed the limitations of relational and columnar databases

- owing to Cassandra allowing for heterogeneous data store types

Means for Integrating Disparate Databases

- The first step in improving the performance of integrated databases is to evaluate the organization's present database in order to choose the optimal integration platform

Manual data integration

- doing the integration without the aid of automation

- Simple custom codes may be employed by data managers as a means to connect the different data
sources

Advantages:
Reduced cost

- integration is done without costly tools

- however, this technique is typically able to integrate only a small number of data sources

Greater freedom

- Total control over the integration is achieved without being constrained by the requirements of particular tools

Disadvantages:

Less access

- Manual integration provides less access since everything is done manually

Difficulty scaling

- manually changing code takes so much time, which is very harmful for large projects requiring fast integration

Greater room for error

- Manual integration is much more prone to errors

Middleware Data Integration

- works as a go-between among applications

- it transfers data among applications and among their databases

Advantages:

Better data streaming

- integration is performed consistently and automatically by the software


Easier access between systems
- the middleware is designed to make it easier for systems in a network to communicate with one another

Disadvantages:

Less access

- Technical knowledge is needed in deploying and maintaining the middleware

Limited functionality

- Only a handful of systems can be used with middleware

- useful for businesses merging ancient systems with more contemporary systems

- it mostly serves as a communications tool and has many limitations on data analytics

Application-based integration

- software handles all the integration tasks

- it finds, retrieves, and purifies data from several sources

- it combines data coming from many sources to simplify the data transfer

Advantages:

Simplified processes

- an application does all the integration automatically

Easier information exchange

- With the careful provision of application-based processing, information exchange becomes easier

Fewer resources are used

- Data managers may focus on more pressing concerns rather than being bogged down by too many integration concerns
Disadvantages

Limited access

- requires particular technical knowledge

- requires a data manager and/or analyst to monitor application implementation and maintenance

Inconsistent results

- results may be inconsistent with what is desired by the organization

Complicated setup

- requires technical knowledge for configuration and setup which makes it complicated overall

Difficult data management

- Data integrity may be compromised since the application is easily able to access a lot of applications and data altogether

Uniform access integration

- data is retrieved from even more dissimilar collections and consistently presented

- letting the data remain in its original spot

Advantages:

Lower storage requirements

- Another separate area to store data is no longer needed

Easier data access

- Multiple systems as well as data sources are able to access the integrated databases simultaneously

A simplified view of data

- With the uniform appearance of data, the data presented is quite simplified

Disadvantages:
Data integrity challenges

- data integrity may possibly be compromised by using so many sources

Strained systems

- data host systems are not always built to handle the volume and frequency of data requests

Common storage integration (data warehousing)

- comparable to uniform access, with the exception that copies of the same data are made and kept in a data warehouse

- one of the most often used types of data integration

- provides businesses far more innovative means of handling data

Advantages:

Reduced burden

- Data queries are no longer the main concern of the integration, thus improving the overall process

Improved control for data version management

- Data integrity is improved when it is accessed from a single source as opposed to several dissimilar
ones

Cleaner data appearance

- Managers and/or analysts can conduct several queries on the stored copy of the data while still keeping
consistency in the way the data looks

Enhancement in data analytics capabilities

- With better quality of data and well-maintained copies of datasets, data analytics is much enhanced

Disadvantages:

Increased storage costs

- Finding a storage location and paying for it are required when making a copy of the data
Higher maintenance costs

- The integration must be set up, managed, and maintained by technical professionals in order to
orchestrate this method

Database Integration Tools

- While the future cannot be glimpsed, it can be predicted in order to understand how in-house servers, cloud computing, and mobile technologies will interact

- Managers, analysts, and executives will be less reliant on their workplaces as cloud computing grows
while on-premises computing resources shrink

- From a handheld phone device, users will be able to receive and transmit data, carry out transactions, access databases, carry out complex queries across several systems, and get the answers instantly, wherever they are

- this capability calls on data integration technologies to function flawlessly across networks and devices

- businesses will begin sharing their data

- this necessitates data integration strategies that function both within and across businesses

- Data integration architects will be compelled to create even more comprehensive capabilities as a result of the demand for this broader access

- cloud-based systems will make it possible to share information across enterprises, at ever-increasing
speeds, and on even greater scales

A good integration tool has the following characteristics:

Portability

- it is crucial to be able to move data between the cloud and on-premises systems

- Organizations are able to create data connections once and are able to run them anywhere because of
portability

Ease of use

- ease of understanding and use of the tools must be a prime consideration in choosing integration tools

Cloud compatibility

- whether in a single-cloud, hybrid-cloud, or multi-cloud environment, tools should function without any issues
- the greatest tools combine the aforementioned abilities and are comprehensive

- there is an urgent need for an app suite that gathers, controls, transforms, and shares data by providing a variety of capabilities

- including self-service apps, persistent data quality, and intelligent governance

- cloud services should cover every data source from beginning to end, enabling businesses to carry out
their data integration swiftly and thoroughly

Efficiency in Data and Database Integration

- Organizations need a great deal of efficiency and reliability in recording, updating, and tracking their
data

- one of the most popular systems for storing customer, inventory, or any organizational information is a database

- Employees or decision-makers can use the stored data for a variety of functions, including reporting or
data analytics, by integrating it further with other file organizations, file systems, or applications

- efficient database integration software is necessary for finer and simpler integration, and for greater accuracy in analytics

- Database integration is the act of merging data from several sources

- such as databases, the cloud, data warehouses, virtual databases, remote databases, files, and a lot
more

- to disseminate a clear and unified version throughout the whole company

- Data is made available to numerous stakeholders

- made available to client applications owing to database integration systems, which eliminates the need
for duplicating or moving data

- a business might keep its customer information in Salesforce and its accounting information in Oracle

- Stakeholders may access the collective dataset from both systems in one location, such as a data
warehouse or database, by using the database integration system's process.

- stakeholders can use the data to generate actionable insights more quickly

- quite similar to how some companies access website databases to manage and combine data from
different web pages

- it sees the web as a collection of scattered databases

- Database integration is crucial to ensuring the efficient use of organizational data

- big data is the driving force for business intelligence and analytics
- employing appropriate database integration software is crucial for proper administration of database
activities that may convert the obstacles connected with electronic transactions into effectiveness in
operations

Following are some advantages of database integration:

Gaining Better Control of Information

- with database integration, businesses can control all of their data or information from a single location

- making it simpler to spot bottlenecks, enhance user experience, and speed up delivery

Ensuring Satisfactory Compliance with Regulations

- For firms that deal with digital information, compliance with regional, national, and global operational standards, such as PCI, HIPAA, and GDPR, is quickly becoming essential

- Centralized management made possible by database integration makes it simpler to guarantee organization-wide compliance

Creating a Single Source of Truth

- Companies must integrate their data from disparate information management systems and databases
during mergers and acquisitions to produce a uniform view of trustworthy business data

- Data from many sources is combined with the use of database integration tools

- then purified, converted, and fed into the necessary target systems

Integrating Data from Disparate Sources

- Database integration is not always possible without combining data from various sources, including on-premise systems, cloud-based databases, and legacy systems

- Every business makes use of several types of software

- the data they collect is stored in separate systems

- crucial to connect many data sources and compile all data in a data warehouse for an enterprise's
Business Intelligence (BI), road mapping, and forecasting needs
Speeding Up Data Integration with Built-in Database Connectors

- Business users and developers may connect data from various sources, such as data warehouses, cloud apps, and more, using integrated system solutions that are comprehensive and have a drag-and-drop graphical user interface

- A database connection, for instance, is a feature in computer science that enables a client software to
communicate with the database server, whether or not they are running on the same machine

- Sending commands and receiving responses, which are typically in the form of result sets, both require
connections

- fundamental idea in data-centric programming is connections

- Connection pooling was developed to enhance performance because some DBMS engines take a long
time to connect

- Without an open and enabled connection to a database, no command may be executed against it

- possible to address a specified database or database server

- user authentication credentials may also be specified

- this is done by providing a connection string to an underlying driver or provider (for instance, Server=sqlbox;Database=Common;User ID=uid;Pwd=password)

- once it has been established, a connection can be opened or closed, and its properties can be set

- the data provider and data access interface being utilized determine the key/value pairs that make up the connection string
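A minimal Python sketch of opening a connection with such a string, using the pyodbc driver (an assumption; any DB-API driver works similarly, and the server, table, and credentials are hypothetical):

import pyodbc  # third-party DB-API driver, assumed installed

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlbox;DATABASE=Common;UID=uid;PWD=password"
)
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()

# Send a command and receive a result set over the open connection
cursor.execute("SELECT OrderID, OrderDate FROM Orders WHERE OrderDate >= ?",
               "2023-01-01")
for row in cursor.fetchall():
    print(row.OrderID, row.OrderDate)

conn.close()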

- per connection, many databases (including PostgreSQL) only permit one action to be carried out at a time

- when the database obtains a request for data (an SQL SELECT statement) and returns a result set, the connection is open but not yet ready for subsequent actions, since the client is still processing the result set

- some databases, such as SQL Server 2005 (or later versions), do not place this restriction

- compared to databases that allow only one operation job at a time, those that allow several operations
per connection typically have much higher overhead

- a database backend platform may have extensive features, such as in-built transformations, pre-built connectors for modern and traditional data sources, job scheduling, and workflow automation to support bi-directional integration between various databases, each with its own advantages and drawbacks

- e.g., advantages of SQL Server over Access are better performance, enhanced scalability, and increased reliability
- the list of popular database servers pooled into database integration software includes:

- Redshift, SQL Server, IBM DB2, PostgreSQL, MySQL, MS Access, Sybase, Teradata, Netezza, Oracle, etc.

Code-Free Database Integration

- an example is a straightforward dataflow in which Order data from PostgreSQL is integrated into SQL Server after applying a condition based on the Order Date, utilizing transformation features such as the Filter transformation

Pushdown Optimization

- Data transformation jobs must be pushed down into the database in order to achieve high performance
and make the best use of database resources

- Better outcomes are produced as a result, including time savings, improved processing resource usage,
and amplified developer productivity

Types of pushdown optimization modes:

Partial pushdown

- Depending on the database provider or the transformation logic

- server pushes the logic for transformation to the source or target database in this mode

- In a partial pushdown example, the Sales data from SQL Server is combined with the Company Name according to the applied conditions and is then routed into three different destination files

Full pushdown

- jobs are executed from start to end in pushdown mode

- In a full pushdown example, data from SQL Server is filtered according to Country before being transmitted to the target database, which is also SQL Server
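A sketch of the difference, reusing a DB-API connection as above (the table, column, and DSN names are hypothetical):

import pyodbc  # assumed installed, as in the earlier connection sketch

conn = pyodbc.connect("DSN=SalesDB;UID=uid;PWD=password")  # hypothetical DSN
cursor = conn.cursor()

# Pushed down: the WHERE clause executes inside the database engine,
# so only matching rows ever cross the network
cursor.execute("SELECT * FROM Sales WHERE Country = ?", "Philippines")
pushed = cursor.fetchall()

# Not pushed down: every row is pulled into application memory and filtered there
cursor.execute("SELECT * FROM Sales")
in_app = [row for row in cursor.fetchall() if row.Country == "Philippines"]

conn.close()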
Ensuring Database Integrity during Migration

- first phase of database integration is data migration

- development and implementation of a data migration project are made simpler by database integration software that includes built-in data cleansing, quality, and profiling tools in a single environment

- such tools ensure the accuracy and consistency of the data sources at each of the steps of integration by implementing data quality criteria

- data profiling allows for the breakdown of the source data in terms of its error count, structure, percent
of duplication, etc.

- These characteristics guarantee the database migration's accuracy

- a typical database integration example is the migration of Order data coming from an SQL Server to a PostgreSQL Server

- Prior to being transferred to the target database, the data is profiled, sorted by Product ID, and has its
Quantity field error-checked using data quality criteria

- Database integration software is a crucial tool in the corporate sector due to the exponential growth in
data volume and variety

- when they lack the proper integration tools or are simply not provided with the proper data integration solutions, the majority of businesses struggle to complete database integration initiatives

- Businesses have discovered that integrating data at the database level may save them a ton of time by speeding up the process of generating useful, data-driven insights; therefore, they are looking for database integration technologies that can do the job

XML-Based Data and Database Integration

- a common format for data transmission via the Internet is XML


- a markup language, similar to HTML, but it offers a wider range of capabilities, including user-defined
tags, which enable the representation of both data and metadata in a single document

- at the same time, display remains separate from data representation

- because of this versatility, XML may be used to define new markup languages that are tailored for particular applications

- tags that documents may employ are described in a document type definition (DTD)

- tailored to the particular requirements of the application context in terms of semantics

- outlines the rules that link tags to their contents

- XML is a widely used data format for the transfer of data between computing systems and applications

- the widespread use of XML brings up the challenge of how data sent in XML documents may be read, stored, and searched
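A minimal Python sketch of reading such a document with the standard library (the document content is arbitrary):

import xml.etree.ElementTree as ET

# User-defined tags carry both data and metadata in one document
doc = """
<order id="1001">
  <customer>Acme Corp</customer>
  <item sku="A-001" qty="3">Widget</item>
</order>
"""

root = ET.fromstring(doc)
print(root.tag, root.attrib["id"])   # order 1001
print(root.find("customer").text)    # Acme Corp
for item in root.iter("item"):
    print(item.attrib["sku"], item.attrib["qty"], item.text)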

Week 11 - INTEGRATING CLOUD-BASED DATABASES

- cloud computing has gained prominence with the advent of the provision of remote data centers

- mostly in cost-effective manners

- Cloud computing has greatly progressed with the integration of software services on a pay-per-use basis to counter costs associated with software purchases or licensing

- Cloud services offer remote databases shared virtually via data centers

- cloud databases take on the traditional roles of data and database administrators

- along with a new paradigm of database management

- cloud services have their limitations

- there is risk in having cloud services hold critical data, such as having less control over the dynamics of remote database management systems

- with this in mind, full cloud-based database integration for organizations may be derailed

The Cloud and the Future of Data Integration

- cloud computing capabilities will keep transforming enterprises in fascinating new ways

- initial expectations of simpler database management activities might be reversed as advancements in cloud computing happen

- Database and data managers find themselves even getting into more complexities in their data
integration despite the promises of ease being proposed by cloud services
- data integration is a clever technique to enable successful data analysis for firms

- knowledge of the complex array of challenges brought about by the systems that must be integrated is necessary to choose the best strategy for every firm

- a manual approach might be adequate if all that is required is to manually integrate a small number of systems

- Businesses and organizations faced with integrating too many disparate systems require a multi-pronged approach to integration, especially for their data and databases

- the greatest choice for your company is found by taking into account client needs, goals, and the type of strategy that best meets both

Trends in Cloud Data Integration

- cloud is having a profound impact on the future of data integration

- these solutions offer a number of advantages over traditional on-premises solutions, including scalability, cost-effectiveness, ease of use, and security

- cloud-based data integration solutions are becoming increasingly popular

- solutions are also becoming more sophisticated, offering features such as real-time data integration, data lineage, and data governance

- help businesses to make better decisions by providing them with a more complete view of their data

- data integration is likely to be increasingly cloud-based

- businesses continue to generate more data, they will need to find ways to integrate this data from a
variety of sources

- Cloud-based data integration solutions offer a number of benefits that make them well suited for this task.

Technological trends to expect in the future of cloud-based data integration:

Increased use of real-time data integration

- become increasingly important as businesses need to make decisions based on the latest data

- Cloud-based solutions make it easier to implement real-time data integration, as they offer the
scalability and flexibility needed to handle large volumes of data

Increased use of open-source data integration tools


- becoming increasingly popular, as they offer a number of advantages over proprietary solutions

- advantages include lower cost, greater flexibility, and a larger community of users

Increased focus on data governance

- become increasingly important as businesses need to ensure that their data is accurate, secure, and
compliant

- Cloud-based data integration solutions can help businesses to improve their data governance by
providing features such as data lineage and auditing

Trends in Cloud Data Integration (continuation)

- we can also expect to see the development of new cloud-based data integration solutions that offer even more features and capabilities

As the cloud continues to evolve, so too will the future of data integration in the following areas:

- The cloud can make it easier for businesses to scale their data integration operations
- The cloud can help businesses to improve the security of their data integration operations
- The cloud can help businesses to save money on data integration costs
- The cloud can provide businesses with access to a wider range of data integration tools and
services

- cloud is a powerful tool that can help businesses to improve their data integration capabilities

- as cloud technology evolves, developers and organizations alike can expect more innovative and powerful cloud-based data integration solutions to be developed

Impact of Cloud Services on Database Integration

- cloud and data integration play significant roles in shaping the future of technology and how
organizations manage and leverage their data

cloud's impact on data integration and its future implications:

Scalability and Flexibility

- cloud provides virtually unlimited scalability, enabling businesses to handle large volumes of data

- Data integration platforms hosted in the cloud can effortlessly scale up or down based on demand,
allowing organizations to adapt to changing data integration requirements without investing in
additional hardware or infrastructure
Connectivity and Interoperability

- cloud acts as a centralized platform for integrating data from various sources and systems

- it enables seamless connectivity between on-premises and cloud-based applications, databases, and services

- with cloud-based integration solutions, organizations can establish data pipelines and connect disparate systems, facilitating data sharing, synchronization, and real-time access

Real-Time and Event-Driven Integration

- organizations can embrace real-time and event-driven data integration

- Cloud-based integration platforms provide the infrastructure and tools to capture, process, and deliver
data in real-time

- this lets organizations make data-driven and timely decisions

- leads to more dynamic business processes

- this is especially true in finance, e-commerce, and IoT, where immediate data integration and analysis are crucial

Data Security and Compliance

- cloud providers offer robust security measures and compliance standards, often surpassing what individual organizations can achieve on their own

- as data integration increasingly occurs in the cloud, organizations can leverage the security and compliance features offered by cloud providers to ensure the integrity, confidentiality, and regulatory compliance of their integrated data

Data Governance and Master Data Management

- the cloud can enhance data governance practices and support master data management (MDM) initiatives

- by centralizing data integration processes and establishing data governance frameworks in the cloud, organizations can improve data quality, enforce data standards, and maintain a single source of truth across their enterprise systems

Artificial Intelligence (AI) and Machine Learning (ML)

- cloud provides the necessary infrastructure and computational power to leverage AI and ML techniques
for data integration
- AI-powered integration platforms can automate data mapping, transformation, and cleansing tasks,
accelerating integration processes and improving efficiency

- ML algorithms can also analyze patterns in data integration workflows to optimize performance and
predict potential issues

Hybrid and Multi-Cloud Integration

- Many organizations adopt hybrid cloud or multi-cloud strategies, leveraging a combination of public
cloud services and on-premises infrastructure or multiple cloud providers

- integration platforms increasingly support hybrid and multi-cloud environments, enabling seamless integration between various cloud platforms and on-premises systems

Database as a Service (DBaaS)

- the cloud foremost offers data and database services

- DBaaS is a cloud computing service model that provides users with access to a managed database system over the internet

- service provider takes care of all the database management tasks, such as provisioning, setup,
configuration, scaling, backups, and maintenance, allowing users to focus on using the database rather
than managing its underlying infrastructure

Key features and aspects of Database as a Service include:

Managed Infrastructure

- service provider hosts and maintains the necessary hardware, networking, and storage infrastructure
required for the database, reducing the burden on users to manage physical servers

Automated Provisioning

- Users can quickly and easily deploy a new database instance through a user-friendly interface or API,
without the need for manual setup

Scalability

- DBaaS platforms can scale up or down based on demand, allowing users to increase or decrease
resources as their database requirements change
Backup and Recovery

- service provider typically handles regular backups of the database and ensures that data can be
recovered in case of failures or disasters

Security

- DBaaS providers implement security measures to protect the database from unauthorized access and
data breaches

- essential for users to understand the security measures in place and their own responsibility for
securing their data

Monitoring and Performance Optimization

- Providers often offer monitoring tools and performance optimization features to ensure that the
database operates efficiently

Multi-Tenancy

- DBaaS usually follows a multi-tenant model, where multiple customers share the same physical
resources while their data remains isolated and secure

Some of the popular Database as a Service offerings include:

Amazon RDS (Relational Database Service)

- fully managed relational database service provided by Amazon Web Services (AWS), supporting
databases like MySQL, PostgreSQL, Oracle, SQL Server, etc.

Microsoft Azure SQL Database

- managed relational database service on Microsoft Azure, supporting SQL Server-based databases

Google Cloud SQL

- Google's managed database service for MySQL, PostgreSQL, and SQL Server

Firebase Realtime Database


- real-time NoSQL database offered by Google Firebase for mobile and web applications

- DBaaS is especially beneficial for businesses and developers who want to focus on their applications'
functionality and data usage without worrying about database infrastructure management

- simplifies database deployment, reduces administrative overhead, and provides scalability and
flexibility in handling data-intensive workloads

- essential to choose the appropriate DBaaS provider and database type based on specific application
requirements and performance needs

Week 11 - RESPONSIVE WEB DESIGN

Responsive Web Design

- about using HTML and CSS to automatically resize, hide, shrink, or enlarge a website to make it look good on all devices (desktops, tablets, and phones).

- a web design approach to make web pages render well on all devices with different screen sizes and
viewports.

Create Responsive Design

Responsive Viewport - Meta Tag

- viewport is the user's visible area of a web page

- the viewport varies with the device, and will be smaller on a mobile phone than on a computer screen

Setting The Viewport:

- the viewport <meta> tag was introduced in HTML5 as a method to let web designers take control over the viewport

- width=device-width part sets the width of the page to follow the screen-width of the device (which will
vary depending on the device)

- initial-scale=1.0 part sets the initial zoom level when the page is first loaded by the browser
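Putting the two parts together, the standard viewport tag looks like this:

Syntax:

<meta name="viewport" content="width=device-width, initial-scale=1.0">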

Responsive Grid Layout


- grid layout is a CSS layout that allows you to create responsive and flexible layouts

- it serves as a modern solution for creating flexible layouts

- responsive grid view often has 12 columns and a total width of 100%

- grid will automatically resize to take up a different percentage of the screen width on different devices

Syntax:

main {

display: grid;

grid-template-columns: 30% 30% 30%;

}

Media Queries

- a CSS technique that defines completely different styles for different browser sizes

- a crucial part of responsive design

- containers for other rulesets that are then applied based on the media query results

Syntax:

@media only screen and (max-width: 600px) {

body {

background-color: lightblue;

}

}

Responsive Text

- can be set with a "vw" unit, which means the "viewport width"

- text size will follow the size of the browser window

Syntax:

<h1 style="font-size:10vw">Hi everyone!</h1>


Responsive Images

- images that scale nicely to fit any browser size

- Using the width property, if the CSS width property is set to 100%, the image will be responsive and
scale up and down

Syntax:

<img src="img_girl.jpg" style="width:100%;">

- Using the max-width property, if the max-width property is set to 100%, the image will scale down if it
has to, but never scale up to be larger than its original size

Syntax:

<img src="img_girl.jpg" style="max-width:100%;height:auto;">
