
SRI RAMAKRISHNA ENGINEERING COLLEGE

[Educational Service: SNR Sons Charitable Trust]


[Autonomous Institution, Reaccredited by NAAC with ‘A+’ Grade]
[Approved by AICTE and Permanently Affiliated to Anna University, Chennai]
[ISO 9001:2015 Certified and all eligible programmes Accredited by NBA]
Vattamalaipalayam, N.G.G.O. Colony Post, Coimbatore – 641 022.

DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING

ACTIVE LEARNING METHOD

CONFIDENTIALITY AND RELIABILITY VERIFICATION IN OUTSOURCED DATA

ASWINKUMAR K - 2101021
BALAMURALAIKRISHNA P - 2101024
DHANUSH S - 2101039
GOPINATH S - 2101057
SARATH KRISHNAN K - 2101224

THIRD YEAR B.E. CSE & MTECH CSE – VI SEM


Academic Year 2023-2024

20CS215 CLOUD COMPUTING

STAFF IN-CHARGE

Dr. P. MATHIYALAGAN ASP/CSE


ABSTRACT

The rapid adoption of cloud computing has transformed the landscape of data storage and
management. Organizations and individuals increasingly rely on cloud services to store and
process sensitive information, ranging from personal records to critical business data. However,
this shift to cloud-based solutions introduces new challenges related to data confidentiality and
reliability.
Cloud computing is one of the emerging technologies in computer science, and it provides a
wide range of services. Database outsourcing is a recent data-management paradigm in which
the data owner stores confidential data at a third-party service provider's site. The service
provider is responsible for managing and administering the database and allows the data owner
and clients to create, update, delete, and access it. Because the service provider cannot be
fully trusted, the security of the data may be compromised, so securing data outsourced to a
third party is a significant challenge. The major requirements for achieving security in
outsourced databases are confidentiality, privacy, integrity, and availability. Various
data-confidentiality mechanisms, such as the fragmentation approach and the High-Performance
Anonymization Engine approach, are available to meet these requirements. In this paper, various
mechanisms for implementing data confidentiality in cloud computing are analyzed in detail,
along with their usefulness.
INTRODUCTION

Data confidentiality is one of the pressing challenges in ongoing cloud-computing research. As
soon as confidentiality becomes a concern, data are encrypted before being outsourced to a
service provider. Hosting confidential business data at a Cloud Service Provider (CSP)
requires transferring control over the data to a semi-trusted external party. Existing
solutions for protecting the data rely mainly on cryptographic techniques. However, these
techniques add computational overhead, particularly when the data are distributed among
multiple CSP servers. Storage as a Service is generally seen as a good alternative for a
small or mid-sized business that lacks the capital budget and/or technical personnel to
implement and maintain its own storage. The main issue, however, is maintaining the CIA
properties (Confidentiality, Integrity, and Availability) (Fig. 1) for data stored in the cloud.
Cloud computing offers several advantages, including scalability, cost-effectiveness, and
ubiquitous access. However, the semi-trusted nature of cloud service providers raises concerns
about the security and privacy of outsourced data. Data breaches, unauthorized access, and data
manipulation are real threats that organizations must address. In this context, ensuring the
confidentiality and reliability of outsourced data becomes paramount. A trusted cloud gives
organizations the ability to create a unified data-protection policy across all clouds. The
impact of privacy requirements on the development of modern applications is growing rapidly.
Many commercial and legal regulations are driving the need for reliable solutions that protect
sensitive information whenever it is stored, processed, or communicated to external parties.

Fig. 1 Cloud computing


Architecture of Outsourced Database Model:

Overview
In the Outsourced Database Model (ODB), organizations outsource their data management needs
to an external service provider. The service provider hosts clients' databases and offers
seamless mechanisms to create, store, update, and access (query) them. This model introduces
several research issues related to data security, which we explore here.

Database as a Service Provides Many Benefits


As data volumes continue to expand and technology advances at an ever more rapid pace,
Database as a Service (DBaaS) can provide enterprises with a database solution that is simple
to use and easy to update. DBaaS gives developers a cloud-based database through which scaling,
load balancing, failover, and backup can all be managed. DBaaS is well suited to applications
that require quick provisioning and dropping of databases, such as prototype testing, sales
promotions, and other short-term projects; in the past, a database created for one of these
purposes could already be obsolete soon after it was built.

Fig. 2 Architecture of the outsourced database model


DATA CONFIDENTIALITY:

In the context of data management and security, data confidentiality refers to the protection
of sensitive information from unauthorized access, disclosure, or alteration. It ensures that only
authorized individuals or systems can access specific data, while keeping it hidden from others.

Encryption Techniques:

• Symmetric Encryption (AES): Uses a shared secret key for both encryption and
decryption. It’s efficient but requires secure key distribution.

• Asymmetric Encryption (RSA): Utilizes public and private keys. RSA ensures
confidentiality during data transmission.

• Homomorphic Encryption: Enables computations on encrypted data without revealing the
plaintext.
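As a minimal sketch of the symmetric case above, the snippet below uses Fernet from the third-party `cryptography` package (Fernet wraps AES in CBC mode with an HMAC, so a single shared key both encrypts and decrypts). The sample record is an illustrative assumption, not data from any real system.

```python
# Symmetric encryption sketch: one shared key for encrypt and decrypt.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the shared secret; must be distributed securely
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"patient record #42")
plaintext = cipher.decrypt(ciphertext)
assert plaintext == b"patient record #42"
```

Note that the key-distribution problem mentioned above remains: anyone holding `key` can decrypt, which is why asymmetric schemes such as RSA are often used to exchange the symmetric key itself.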

Access Control Mechanisms:

• Role-Based Access Control (RBAC): Assigns permissions based on user roles (e.g.,
admin, user).

• Attribute-Based Access Control (ABAC): Grants access based on attributes (e.g.,
department, clearance level).

• Data Masking and Tokenization: Replaces sensitive data with pseudonyms or tokens.

Robust data-confidentiality practices combine encryption, access controls, and
privacy-preserving techniques.
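The RBAC and masking ideas above can be sketched in a few lines. The roles, permissions, and masking rule here are illustrative assumptions, not part of any particular product.

```python
# RBAC sketch: permissions are attached to roles, not to individual users.
ROLE_PERMISSIONS = {
    "admin": {"create", "read", "update", "delete"},
    "user":  {"read"},
}

def is_allowed(role: str, action: str) -> bool:
    """An action is permitted only if the user's role grants it."""
    return action in ROLE_PERMISSIONS.get(role, set())

def mask(value: str, visible: int = 4) -> str:
    """Data masking sketch: hide all but the last `visible` characters."""
    return "*" * max(len(value) - visible, 0) + value[-visible:]

print(is_allowed("user", "delete"))   # False: users may only read
print(mask("4111111111111111"))       # ************1111
```

An ABAC variant would replace the role lookup with a predicate over user attributes (department, clearance level) instead of a fixed role name.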
DATA RELIABILITY AND INTEGRITY:

Traditional Definition and Modern Interpretation

Data integrity has always been a critical aspect of data management, ensuring the reliability and
accuracy of information. Traditionally, it encompassed various factors:

• Completeness: Ensuring that all required data is present.

• Accuracy: Verifying that the information is correct and error-free.

• Consistency: Maintaining uniformity and coherence across different datasets.

• Safety: Protecting data against loss, corruption, or unauthorized access.

However, with the advent of cloud architectures and modern data stacks, the definition of data
integrity has evolved. In today’s context, data integrity focuses on:

• Fit for Purpose: Ensuring that the data is suitable for the intended use or analysis.

• Authorized Access: Restricting data access to authorized individuals or systems.

• Data Privacy: Safeguarding sensitive information and complying with privacy regulations.

• Data Security: Implementing measures to protect data from unauthorized access or
breaches.

By upholding data integrity, we can trust that the data we work with is complete, accurate,
consistent, and secure. It ensures that the information we base our decisions on is reliable and can
be used to derive meaningful insights.
Phases of data integrity technique:

Data integrity techniques promise consistency and accuracy of data held in cloud storage.
Their probabilistic nature and their resistance to unauthorized access help cloud users gain
the trust needed to outsource data to remote clouds. The scheme involves three main actors:
the Data Owner (DO), the Cloud Storage/Service Provider (CSP), and the Third-Party Auditor
(TPA), as depicted in Fig. 3. The data owner produces data and outsources it to cloud storage.
The CSP is a third-party organization offering Infrastructure as a Service (IaaS) to cloud
users. The TPA relieves the DO of the burden of data management by checking the correctness
and intactness of outsourced data; it also reduces the DO's communication and computational
costs. Sometimes the DO itself takes responsibility for data-integrity verification without
TPA involvement.

Data processing phase: In this phase, data files are processed in several ways: the file is
divided into blocks, the blocks are encrypted, message digests are generated, random masking
numbers are applied, keys are generated, and signatures are applied to the encrypted blocks.
Finally, the encrypted or obfuscated data is outsourced to cloud storage.
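The block-splitting and digest steps of this phase can be sketched as follows, using Python's standard hashlib. The block size and sample data are illustrative assumptions; real schemes add encryption, masking, and signatures on top of these digests.

```python
# Data processing sketch: split a file into fixed-size blocks and
# compute one SHA-256 digest per block, ready for later verification.
import hashlib

def block_digests(data: bytes, block_size: int = 8) -> list[str]:
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return [hashlib.sha256(b).hexdigest() for b in blocks]

digests = block_digests(b"confidential outsourced file contents")
print(len(digests))  # one digest per 8-byte block
```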

Acknowledgement phase: This phase is optional but valuable, because the CSP might conceal a
data-loss event, or discard data accidentally, to protect its reputation. Most research works
skip this step to minimize computational overhead during acknowledgement verification.

Integrity verification phase: In this phase, the DO or TPA sends a challenge message to the
CSP, and the CSP responds with metadata or proof information that the TPA or DO uses for
data-integrity verification. If the verification is performed by the TPA, the audit result is
sent to the DO.
Fig. 3 Entire Cycle of Data Integrity Technique
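The challenge-response exchange above can be sketched with standard-library HMACs: the verifier sends a fresh nonce, the CSP returns an HMAC over the stored block keyed by that nonce, and the verifier recomputes it from a reference copy. This is only illustrative; practical schemes avoid keeping a full local copy (for example, by using precomputed homomorphic tags).

```python
# Integrity verification sketch: nonce challenge, HMAC proof, comparison.
import hashlib
import hmac
import secrets

def csp_prove(stored_block: bytes, nonce: bytes) -> str:
    """CSP side: proof that it still holds the block, bound to the nonce."""
    return hmac.new(nonce, stored_block, hashlib.sha256).hexdigest()

def verify(reference_block: bytes, nonce: bytes, proof: str) -> bool:
    """DO/TPA side: recompute the proof and compare in constant time."""
    expected = hmac.new(nonce, reference_block, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, proof)

nonce = secrets.token_bytes(16)              # fresh challenge per audit
proof = csp_prove(b"outsourced block", nonce)
print(verify(b"outsourced block", nonce, proof))   # True
print(verify(b"tampered block!!", nonce, proof))   # False
```

Binding the proof to a fresh nonce is what stops a dishonest CSP from replaying an old proof after the data has been lost or altered.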

Future trends in data integrity approaches


As future work, we discuss here directions for data-integrity schemes that enlarge the scope
of cloud data security and support research continuity. In [39], the authors traced the
evolution of data-integrity schemes through a timeline from 2007 to 2015 and identified
possible directions for data-integrity strategies. We therefore present a visual
representation of the probable trends in integrity schemes from 2016 to 2022 in a timeline
infographic. New emerging trends in data-integrity schemes are listed below.
• Blockchain-based data integrity
• Data integrity in fog computing
• Distributed machine-learning-oriented data integrity
• Data integrity in edge computing
CONCLUSION:

Confidentiality and reliability verification are crucial for maintaining trust in cloud
computing. Organizations and researchers continue to explore innovative methods to safeguard
outsourced data while preserving its integrity and confidentiality. Database outsourcing is a
popular data-management technology that industry has embraced for its inherently profitable
features. In this paper, we discussed the concept of DBaaS, its architecture, and its
benefits. Cloud computing offers attractive and cost-effective solutions to customers, but it
also poses many challenges regarding confidentiality, privacy, control, and laws and
legislation. Most security measures are based on trust, with the customer relying strongly on
the provider's trustworthiness. This paper focused on secure and confidential data outsourcing
to cloud environments using fragmentation and High-Performance Anonymization Engine
techniques, applying only minimal encryption to prevent data exposure. We mainly examined how
data confidentiality is applied in outsourced databases and analyzed the relevant techniques
and their usefulness. Future work can focus on providing security for outsourced databases
while reducing communication and computation costs; there is also much scope for optimizing
query-processing time. A generic system could be developed that efficiently provides DBaaS on
any database while supplying all the security mechanisms.
