Cloud Computing
ASWINKUMAR K - 2101021
BALAMURALAIKRISHNA P - 2101024
DHANUSH S - 2101039
GOPINATH S - 2101057
SARATH KRISHNAN K - 2101224
STAFF IN-CHARGE
The rapid adoption of cloud computing has transformed the landscape of data storage and
management. Organizations and individuals increasingly rely on cloud services to store and
process sensitive information, ranging from personal records to critical business data. However,
this shift to cloud-based solutions introduces new challenges related to data confidentiality and
reliability.
Cloud computing is one of the emerging technologies in computer science, offering a wide range of services. Database outsourcing is a recent data management paradigm in
which the data owner stores confidential data at a third-party service provider's site. The
service provider is responsible for managing and administering the database and allows the data
owner and clients to create, update, delete, and access it. Because the service provider is not
fully trusted, the security of the data may be compromised; securing data outsourced to a third
party is therefore a major challenge. The main requirements for achieving security in outsourced
databases are confidentiality, privacy, integrity, and availability. Various data confidentiality
mechanisms, such as the fragmentation approach and the High-Performance Anonymization
Engine approach, are available to meet these requirements. In this paper, various mechanisms for
implementing data confidentiality in cloud computing are analyzed in detail, along with their
usefulness.
INTRODUCTION
Data confidentiality is one of the pressing challenges in ongoing cloud computing research.
As soon as confidentiality becomes a concern, data are encrypted before being outsourced
to a service provider. Hosting confidential business data with a Cloud Service Provider (CSP)
requires transferring control over the data to a semi-trusted external party. Existing
solutions for protecting the data rely mainly on cryptographic techniques. However, these
techniques add computational overhead, particularly when the data is distributed
among multiple CSP servers. Storage as a Service is generally seen as a good alternative for a
small or mid-sized business that lacks the capital budget and/or technical personnel to implement
and maintain its own storage. The main issue, however, is maintaining CIA (Confidentiality,
Integrity, and Availability) (Fig. 1) for the data stored in the cloud.
Cloud computing offers several advantages, including scalability, cost-effectiveness, and
ubiquitous access. However, the semi-trusted nature of cloud service providers raises concerns
about the security and privacy of outsourced data. Data breaches, unauthorized access, and data
manipulation are real threats that organizations must address. In this context, ensuring the
confidentiality and reliability of outsourced data becomes paramount. A trusted cloud provides
the ability to enforce a unified data protection policy across all clouds. The impact of
privacy requirements on the development of modern applications is growing rapidly:
many commercial and legal regulations are driving the need for reliable solutions that
protect sensitive information whenever it is stored, processed, or communicated to external
parties.
Overview
In the Outsourced Database Model (ODB), organizations outsource their data management needs
to an external service provider. The service provider hosts clients' databases and offers seamless
mechanisms to create, store, update, and access (query) them. This model introduces
several research issues related to data security, which we explore.
In the context of data management and security, data confidentiality refers to the protection
of sensitive information from unauthorized access, disclosure, or alteration. It ensures that only
authorized individuals or systems can access specific data, while keeping it hidden from others.
Confidentiality Mechanisms:
• Symmetric Encryption (e.g., AES): Uses a shared secret key for both encryption and
decryption. It is efficient but requires secure key distribution.
• Asymmetric Encryption (e.g., RSA): Uses a public/private key pair and ensures
confidentiality during data transmission.
• Role-Based Access Control (RBAC): Assigns permissions based on user roles (e.g.,
admin, user).
• Data Masking and Tokenization: Replace sensitive data with pseudonyms or tokens.
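The last mechanism above can be illustrated with a minimal tokenization sketch using only the Python standard library. The `TokenVault` class, its `tok_` prefix, and the in-memory dictionaries are illustrative assumptions, not part of any real tokenization product; in practice the vault mapping tokens back to real values would live in a separately protected store.

```python
import secrets

class TokenVault:
    """Illustrative token vault: replaces sensitive values with random tokens."""

    def __init__(self):
        self._vault = {}    # token -> original value (the protected mapping)
        self._reverse = {}  # original value -> token (so repeats reuse a token)

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was already tokenized.
        if value in self._reverse:
            return self._reverse[value]
        token = "tok_" + secrets.token_hex(8)  # unpredictable, non-reversible
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the vault can recover the original.
        return self._vault[token]

vault = TokenVault()
masked = vault.tokenize("4111-1111-1111-1111")  # token reveals nothing
assert vault.detokenize(masked) == "4111-1111-1111-1111"
```

The outsourced database would store only `masked`; the vault stays with the data owner, so the service provider never sees the sensitive value.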
Data integrity has always been a critical aspect of data management, ensuring the reliability and
accuracy of information. With the advent of cloud architectures and modern data stacks, however,
the definition of data integrity has evolved. In today's context, data integrity focuses on:
• Fit for Purpose: Ensuring that the data is suitable for the intended use or analysis.
• Data Privacy: Safeguarding sensitive information and complying with privacy regulations.
By upholding data integrity, we can trust that the data we work with is complete, accurate,
consistent, and secure. It ensures that the information we base our decisions on is reliable and can
be used to derive meaningful insights.
Phases of the data integrity technique:
Data integrity keeps the promise of consistency and accuracy of data in cloud storage.
Its probabilistic verification and its resistance to unauthorized access help
cloud users gain the trust needed to outsource their data to remote clouds. The scheme involves
three main actors: the Data Owner (DO), the Cloud Storage/Service Provider (CSP), and the
Third-Party Auditor (TPA), as depicted in Fig. 3. The data owner produces the data and uploads
it to cloud storage. The CSP is a third-party organization offering Infrastructure as a Service
(IaaS) to cloud users. The TPA relieves the DO of the burden of data management by checking
the correctness and intactness of the outsourced data; it also reduces the DO's communication
overhead and computational cost. Sometimes the DO itself takes responsibility for
data integrity verification without the TPA's involvement.
Data processing phase: In the data processing phase, the data file is processed in several ways:
the file is divided into blocks, encryption is applied to the blocks, message digests are
generated, random masking numbers and keys are generated, and signatures are applied to the
encrypted blocks. Finally, the encrypted or obfuscated data is outsourced to cloud storage.
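The core of this phase, splitting a file into blocks and attaching per-block authentication metadata, can be sketched as follows. This is a simplified illustration, not a complete scheme: the block size, the hypothetical `SECRET_KEY`, and the use of HMAC-SHA256 as the per-block tag are assumptions, and the encryption and random-masking steps are omitted for brevity.

```python
import hashlib
import hmac

BLOCK_SIZE = 16                      # bytes per block (tiny, for illustration)
SECRET_KEY = b"owner-signing-key"    # hypothetical key kept by the data owner

def process_file(data: bytes):
    """Split data into fixed-size blocks and compute one HMAC tag per block.

    A real scheme would also encrypt each block before outsourcing; here we
    only show the block division and digest/signature generation steps.
    """
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    tags = [hmac.new(SECRET_KEY, b, hashlib.sha256).hexdigest() for b in blocks]
    return blocks, tags

blocks, tags = process_file(b"confidential records to be outsourced")
# The blocks are uploaded to the CSP; the owner keeps SECRET_KEY and the tags
# as verification metadata for the later integrity verification phase.
```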
Acknowledgement phase: This phase is optional but valuable, because a CSP might conceal a
report of data loss, or might discard data accidentally, in order to protect its reputation.
Most research works nevertheless skip this step to minimize the computational overhead of
acknowledgement verification.
Integrity verification phase: In this phase, the DO or TPA sends a challenge message to the CSP,
and the CSP responds with metadata or proof information for data integrity verification. If
verification is performed by the TPA, the audit result is sent to the DO.
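The challenge-response exchange can be sketched as a simple random spot-check. This is an assumption-laden toy, not a real provable-data-possession protocol: the verifier (DO/TPA) is assumed to hold per-block HMAC tags computed under a hypothetical `KEY` before outsourcing, and the CSP's "proof" is simply the challenged block itself.

```python
import hashlib
import hmac
import secrets

KEY = b"owner-verification-key"  # hypothetical key held by the DO/TPA

# Setup (before outsourcing): the owner keeps one HMAC tag per block.
blocks = [b"block-0 data", b"block-1 data", b"block-2 data"]
tags = [hmac.new(KEY, b, hashlib.sha256).hexdigest() for b in blocks]
cloud_storage = list(blocks)  # the blocks as held by the CSP

# Challenge: the verifier picks a random block index to spot-check.
idx = secrets.randbelow(len(blocks))

# Response: the CSP returns the challenged block as its proof.
proof = cloud_storage[idx]

# Verification: recompute the tag and compare it with the stored metadata.
ok = hmac.compare_digest(
    hmac.new(KEY, proof, hashlib.sha256).hexdigest(), tags[idx])
assert ok  # holds as long as the CSP still stores the intact block
```

If the CSP had silently corrupted or dropped the challenged block, the recomputed tag would not match and verification would fail, which is exactly the evidence the DO/TPA audit reports on.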
Fig. 3 Entire Cycle of Data Integrity Technique
Confidentiality and reliability verification are crucial for maintaining trust in cloud computing.
Organizations and researchers continue to explore innovative methods to safeguard outsourced
data while preserving its integrity and confidentiality. Database outsourcing is a popular data
management technology in the current era, widely accepted in industry due to its inherently
profitable features. In this paper, we have discussed the concept of DBaaS, its architecture, and
its benefits. Cloud computing offers attractive and cost-effective solutions to customers, but it
also poses many challenges regarding confidentiality, privacy, control, and laws and legislation.
Most security measures are based on trust, with the customer relying strongly on the provider's
trustworthiness. This paper focuses on secure and confidential data outsourcing to cloud
environments using fragmentation and High-Performance Anonymization Engine techniques,
applying only minimal encryption to prevent data exposure. We have mainly focused on how
data confidentiality is applied to outsourced databases and have analyzed the relevant techniques
and their usefulness. Future work can focus on providing security for outsourced databases while
reducing communication and computation costs. There is also much scope for optimizing query
processing time. A generic system could be developed that efficiently provides DBaaS, works on
any database, and offers all the security mechanisms.
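As a closing illustration, the fragmentation approach mentioned above can be sketched in a few lines. The attribute names, the random join key, and the two-fragment layout are illustrative assumptions, not the exact scheme analyzed in the paper: the idea is only that a sensitive association (here, Name with Disease) is broken by storing the attributes in separate fragments, so minimal or no encryption is needed on either fragment alone.

```python
import secrets

# Records whose (Name, Disease) association is considered sensitive.
records = [
    {"Name": "Alice", "ZIP": "60601", "Disease": "Flu"},
    {"Name": "Bob",   "ZIP": "60605", "Disease": "Diabetes"},
]

fragment1, fragment2 = [], []  # could be hosted at two different CSPs
for rec in records:
    link = secrets.token_hex(8)  # random join key, known only to the owner
    fragment1.append({"link": link, "Name": rec["Name"], "ZIP": rec["ZIP"]})
    fragment2.append({"link": link, "Disease": rec["Disease"]})

# Neither fragment alone reveals who has which disease; the owner can
# re-join the fragments on the "link" key when answering queries.
names_by_link = {f["link"]: f["Name"] for f in fragment1}
rebuilt = [(names_by_link[f["link"]], f["Disease"]) for f in fragment2]
assert rebuilt == [("Alice", "Flu"), ("Bob", "Diabetes")]
```

The design choice is that confidentiality comes from separation rather than from encrypting every attribute, which keeps query processing on each fragment cheap.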