UNIT-3 (CC)
• Data Encryption:
• Enterprises store huge amounts of data in cloud systems, and this data is crucial to the survival of the enterprise itself. If the data is stolen, it can be sold to competitors, who can use it to develop products and worsen the enterprise's position in the market.
• Data that is no longer used in daily activities is called data at rest. It is good practice to encrypt data at rest, since it typically holds the charts and studies about market trends and the company's upcoming products. Encrypting data at rest is an important cloud security service: even if attackers reach the stored data they cannot read it without the keys, and the service can alert users when access to the data at rest is attempted.
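• As an illustration only, here is a minimal sketch of encrypting a report at rest with a symmetric key using Python's cryptography library. The file names are placeholders, and the inline key generation stands in for a real key management service (such as a cloud KMS).

```python
# Minimal sketch: encrypting a report at rest with a symmetric key.
# File names are placeholders; in practice the key would come from a managed key service.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # stand-in for a key fetched from a KMS
cipher = Fernet(key)

# Read the plaintext report, encrypt it, and store only the ciphertext at rest.
with open("market_trends_report.csv", "rb") as f:
    plaintext = f.read()

ciphertext = cipher.encrypt(plaintext)

with open("market_trends_report.csv.enc", "wb") as f:
    f.write(ciphertext)

# Later, an authorized service holding the key can recover the original data.
assert cipher.decrypt(ciphertext) == plaintext
```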
• Firewall Protection:
• When a user first tries to access a cloud system from a new device, the firewall blocks the attempt. The device must be registered in the firewall security settings, after which the user can access the data in the cloud system.
• Cloud systems configure both internal and external firewall protection so that unauthorized sign-ins are blocked by the firewall.
• When data is sent between IP addresses, the firewall verifies the source and destination of each packet. The packet's state and integrity are also checked to confirm that the data packet is authentic, and some firewalls inspect the packet's content to make sure no viruses or malware are attached. External and internal firewalls together help ensure that the data is not exposed to outsiders in any form.
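• The checks above can be pictured with a small, purely illustrative packet-filter sketch in Python; the rule values and packet fields are assumptions, and real cloud firewalls are configured through the provider's security groups or network ACLs rather than application code.

```python
# Illustrative packet-filter sketch: verify source, destination, and content
# before accepting a packet. All rule values are hypothetical.
from dataclasses import dataclass

ALLOWED_SOURCES = {"10.0.1.25", "10.0.1.26"}   # registered devices
ALLOWED_DESTINATIONS = {"10.0.2.10"}           # cloud storage endpoint
BLOCKED_SIGNATURES = (b"X5O!P%@AP",)           # start of the EICAR test signature

@dataclass
class Packet:
    src_ip: str
    dst_ip: str
    payload: bytes

def firewall_allows(packet: Packet) -> bool:
    """Return True only if source, destination, and content checks all pass."""
    if packet.src_ip not in ALLOWED_SOURCES:
        return False                            # unregistered device
    if packet.dst_ip not in ALLOWED_DESTINATIONS:
        return False                            # unexpected destination
    if any(sig in packet.payload for sig in BLOCKED_SIGNATURES):
        return False                            # known-bad content
    return True

print(firewall_allows(Packet("10.0.1.25", "10.0.2.10", b"quarterly figures")))  # True
print(firewall_allows(Packet("192.0.2.7", "10.0.2.10", b"quarterly figures")))  # False
```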
• Monitoring:
• Every ID that logs into the system is monitored and recorded in the cloud logging system, so that if a security threat arises from inside, this tracking helps identify the individual who was logged in at a particular time. Firewall rules are also updated to block suspicious login attempts, keeping the data in cloud storage secure. Monitoring typically checks authentication rules and IP addresses, so that suspicious logins are detected and prevented from reaching the data in storage. This is done at a granular level: permissions are granted not to an individual directly but to a group of people who share the responsibility. This makes it easier to monitor one another's activities and notify the security team of any unauthorized data modification.
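• A hedged sketch of what such log monitoring could look like is shown below: it flags IDs and IP addresses with repeated failed sign-ins. The log format and threshold are assumptions, not any vendor's actual logging schema.

```python
# Sketch: flag IDs/IPs with repeated failed sign-ins in a cloud access log.
# The log format and threshold are illustrative assumptions.
from collections import Counter

log_entries = [
    # (user_id, source_ip, outcome)
    ("alice", "10.0.1.25", "success"),
    ("mallory", "198.51.100.7", "failure"),
    ("mallory", "198.51.100.7", "failure"),
    ("mallory", "198.51.100.7", "failure"),
]

FAILURE_THRESHOLD = 3

failures = Counter((user, ip) for user, ip, outcome in log_entries if outcome == "failure")

for (user, ip), count in failures.items():
    if count >= FAILURE_THRESHOLD:
        # In a real system this would raise an alert to the security team
        # and could feed an updated firewall rule blocking the source IP.
        print(f"ALERT: {count} failed sign-ins for '{user}' from {ip}")
```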
• Security at Data Centers:
• Even if every software route to the data fails for an attacker, there is still a way to reach the data directly at the server, where firewall protection and authentication rules do not apply. This is why all physical servers are closely monitored by physical security staff and watched by CCTV cameras 24 hours a day. Server rooms also use biometric controls, so that only authorized security personnel and maintenance officials can enter and inspect the servers. Logs record who enters and leaves the room and how long they spend inside. When someone stays longer than permitted, alerts are sent to security so that the server rooms can be checked for unauthorized personnel.
• Isolated Networks:
• When an important deployment is made in the cloud and its data must be kept hidden even from other members of the resource group, it is good practice to deploy into virtually isolated networks. Security policies should be applied across all networking systems, and the network itself should be protected from malicious threats and virus attacks. Access and authentication should be customized, and dedicated network links must be used to transfer the data to higher environments.
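• As one hedged example on AWS, an isolated network can be sketched with boto3 by creating a VPC and subnet and simply not attaching an internet gateway. The region, CIDR blocks, and group name below are assumptions; other providers offer equivalent constructs.

```python
# Hedged sketch: create a virtually isolated network on AWS with boto3.
# No internet gateway is attached, so the subnet has no route to the internet.
# Region, CIDR blocks, and names are illustrative assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc = ec2.create_vpc(CidrBlock="10.10.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.10.1.0/24")

# A security group with no ingress rules keeps the deployment closed by default;
# dedicated links (VPN or private peering) would be configured separately.
sg = ec2.create_security_group(
    GroupName="isolated-deployment-sg",
    Description="No inbound access; used for isolated deployments",
    VpcId=vpc_id,
)

print(vpc_id, subnet["Subnet"]["SubnetId"], sg["GroupId"])
```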
• Anomaly Detection:
• When log volumes are huge, it is difficult to review the logs manually, so cloud vendors use AI-based algorithms to detect anomalies in the logging patterns. This helps manage the logging details and surface discrepancies in the logs.
• Vulnerability scanning also reveals which computing services have weaker security controls, so the system's security can be improved and the data protected thoroughly. The location of databases can be kept under surveillance to make sure data is not stored in unauthenticated databases. Checkpoints are placed at every deployment of data into the cloud and into higher environments to ensure that the data lands in the proper cloud storage and in the proper folder structure.
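• As a hedged illustration of AI-assisted log analysis, the sketch below applies scikit-learn's IsolationForest to two simple per-login features. The features, sample values, and contamination rate are assumptions, not any vendor's actual algorithm.

```python
# Sketch: flag anomalous logins with an Isolation Forest.
# Features (login hour, MB downloaded) and the data itself are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Mostly normal office-hours logins with modest downloads...
normal = np.column_stack([
    rng.integers(8, 18, size=200),    # login hour
    rng.normal(50, 10, size=200),     # MB downloaded
])
# ...plus a few suspicious entries: 3 a.m. logins pulling large volumes of data.
suspicious = np.array([[3, 900.0], [2, 750.0]])
logins = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(logins)    # -1 marks anomalies

for row, label in zip(logins, labels):
    if label == -1:
        print(f"Anomalous login: hour={int(row[0])}, MB={row[1]:.0f}")
```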
• Protection through APIs:
• To keep data out of the hands of unauthorized personnel, cloud users can employ APIs and web apps to secure their data. This helps protect containers and virtual machines from unsecured logins. Incidents can be raised automatically for unofficial logins, which helps protect the systems and hence the cloud-stored data. If a threat poses a heavy risk, real-time alerts can be set in the cloud storage to prevent the attacker from accessing the data; a minimal API-gate sketch follows this section.
• The data on our systems, mobile devices, and storage disks is increasingly becoming cloud storage data, so it is crucial to have good cloud security services in place for these devices. Cloud providers offer cloud security, and users who are not satisfied with it can seek the help of third-party software to reach the intended security level.
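• Below is a minimal sketch of an API-level gate in front of a cloud resource, assuming a Flask service and a hypothetical shared API key; real deployments would use the provider's identity service and signed tokens rather than a static key.

```python
# Hedged sketch: reject unauthenticated calls before they reach cloud-stored data.
# The API key and endpoint are hypothetical; production systems would use OAuth/IAM tokens.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
EXPECTED_API_KEY = "replace-with-secret-from-a-vault"   # illustrative only

@app.route("/reports/<report_id>")
def get_report(report_id):
    if request.headers.get("X-API-Key") != EXPECTED_API_KEY:
        # An unofficial login attempt: raise an incident/alert here, then refuse access.
        abort(401)
    return jsonify({"report_id": report_id, "status": "authorized access"})

if __name__ == "__main__":
    app.run(port=8080)
```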
Relevant Cloud Security Design Principles
• Design principles for security in the cloud include:
• Enable traceability: Monitor, alert on, and audit actions and changes to your environment in real time. Integrate log and metric collection with systems so that events can be investigated and acted upon.
• Apply security at all layers: Rather than focusing on protecting a single outer layer, apply a defense-in-depth approach with multiple security controls. Apply them to all layers, for example the edge network, virtual private cloud (VPC), subnet, load balancer, every instance, operating system, and application.
Forms of Cloud Testing
1. Testing of the whole cloud: The cloud is treated as a whole entity, and testing is carried out based on its overall features.
2. Testing within a cloud: This is the testing that is carried out internally
inside the cloud by testing each of its internal features.
3. Testing across the clouds: The testing is carried out based on the specifications of the different types of clouds, such as public, private, and hybrid clouds.
4. SaaS testing in the cloud: In this, functional and non-functional testing
takes place based on requirements.
Types of Cloud Testing
• There are three types of cloud testing:
1. Cloud-Based Application Tests over Cloud: These types of tests help determine
the quality of cloud-based applications concerning different types of clouds.
2. Online-Based Application Tests on a Cloud: Online application
supervisors/vendors perform these tests to check the functions and performance of
their cloud-based services. This testing takes place with the help of Functional
Testing. Online applications are connected with a legacy system and the connection
quality between the application and the legacy system is tested.
3. SaaS or Cloud Oriented Testing: These tests are performed by SaaS or Cloud
vendors. The objective of these tests is to evaluate the quality of individual service
functions that are offered in SaaS or cloud programs.
Cloud Testing Environment
The main forms of testing carried out in a cloud environment include:
• Latency Testing: This testing measures the latency between a user action and the application's response to that request (a small measurement sketch follows this list).
• Availability Testing: This testing verifies that the cloud offering is available round the clock. Since mission-critical activities may depend on it, the administrator, i.e., the cloud vendor, must ensure there is no adverse impact on the customers.
• Multi-Tenancy Testing: In this form of cloud testing, multiple users share a single cloud offering. Testing is performed to confirm that there is adequate security and access control over the data when multiple users work on a single instance.
• Scalability Testing: This testing is performed to make sure that the
offerings provided can scale up or scale down as per the customer’s need.
• Browser Performance Testing: In this testing, the performance of a cloud-based application, i.e., an application deployed on the cloud, is tested across different web browsers.
• Security Testing: As the cloud provides everything at any time, it is very important that all user-sensitive data is secured against unauthorized access to maintain users' privacy.
• Disaster Recovery Testing: Availability testing expects the cloud to be available at all times, yet failures such as network outages, breakdowns under high load, or system failures can still occur. This testing verifies how quickly such a failure is detected and whether any data is lost during that period.
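As an illustration of the latency testing mentioned above, here is a minimal timing sketch; the URL is a placeholder and the requests library is assumed to be available.

```python
# Sketch: measure request/response latency for a cloud-hosted endpoint.
# The URL is a placeholder; acceptable thresholds would come from the application's SLA.
import statistics
import time

import requests

URL = "https://example.com/api/health"   # placeholder endpoint
SAMPLES = 10

latencies_ms = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    requests.get(URL, timeout=5)
    latencies_ms.append((time.perf_counter() - start) * 1000)

print(f"median latency: {statistics.median(latencies_ms):.1f} ms")
print(f"worst latency:  {max(latencies_ms):.1f} ms")
```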
Tools for Functional Testing in Cloud
• AppPerfect: a suite of load-, web-, and functional-testing tools that can be run against cloud-hosted applications.
• JMeter: Apache JMeter, an open-source tool for load, performance, and functional testing of web applications and services.
• SOASTA CloudTest: a cloud-based platform for functional and performance testing of web and mobile applications at scale.
• LoadStorm: a cloud-based load- and performance-testing tool for web applications.
• Tools for Security Testing in Cloud
The following are the tools for security testing in the cloud:
1. Nessus: Nessus is a remote security scanning tool that scans the
system and raises an alert if any vulnerability is discovered that
hackers could use to get unauthorized access to sensitive data.
2. Wireshark: Wireshark is an open-source packet analyzer used
for network troubleshooting and monitoring, software, and
communications protocols development.
3. Nmap: Nmap is a network scanner used to discover hosts and services on a network by sending packets and analyzing the responses; a scripted example follows this list.
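For instance, a service scan can be driven from a Python test script via subprocess, as sketched below; the target is Nmap's own public test host, and nmap must already be installed. Scans should only be run against systems you are authorized to test.

```python
# Sketch: run an nmap service/version scan from a test script.
# Only scan hosts you are authorized to test; scanme.nmap.org is provided for practice.
import subprocess

TARGET = "scanme.nmap.org"

result = subprocess.run(
    ["nmap", "-sV", "-p", "22,80,443", TARGET],   # -sV probes service versions
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```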
• Benefits Of Cloud Testing
• The following are some of the benefits of cloud testing:
1. Availability of Required testing environment: In cloud testing, testing teams
can easily replicate the customer’s environment for effective testing of the
cloud without investing in the additional hardware and software resources for
testing. These resources can be accessed from any device with a network
connection.
2. Less expensive: Cloud testing is more cost-efficient than traditional methods of testing, as there is no need to invest in additional hardware and software resources. Customers, as well as the testing team, pay only for what they use.
3. Faster testing: Cloud testing is faster than traditional testing because most management tasks, such as managing the physical infrastructure used for testing, are removed.
4. Scalability: The cloud computing resources can be increased and decreased
whenever required, based on testing demands.
5. Customization: Cloud testing can be customized in terms of usage, cost, and time to suit the variety of users and user environments.
6. Disaster recovery: Disaster recovery is straightforward because data backups are kept both at the cloud vendor's end and at the user's end.
• Challenges in Cloud Testing
The following are some challenges faced in cloud testing:
1. Privacy and Data Security: As cloud computing provides services on demand to all users, data privacy and security become primary concerns. Cloud applications are multi-tenant, so the risk of unauthorized data access remains.
2. Environment Configuration: Different applications require specific infrastructure, such as servers and storage, for deployment and testing; managing these environments for cloud testing is difficult and can lead to issues.
3. Use of Multiple Cloud Models: Applications are deployed on multiple cloud models, such as public, private, and hybrid, depending on customer requirements. Managing them is challenging and can lead to complications, security problems, and synchronization issues.
4. Data Migration: Migrating data from one cloud service provider to another is challenging, as the providers may use different database schemas that take a lot of time to understand.
5. Upgradation in Cloud: The biggest challenge in cloud testing is performing upgrades in the cloud while ensuring they do not impact existing users and data, especially since cloud providers give existing customers very short notice of upgrades.
6. Testing of All Components: Cloud testing requires testing all components related to an application, such as the server, storage, and network, and validating them at all layers.
Cloud penetration Testing
• Cloud penetration testing is a simulated attack to assess the security of an organization's cloud-based applications and infrastructure.
• It is an effective way to proactively identify potential
vulnerabilities, risks, and flaws and provide an actionable
remediation plan to plug loopholes before hackers exploit
them.
To address cloud access control issues, organizations should implement best practices such as:
• Regularly review and audit access permissions.
• Implement the principle of least privilege, where users and applications are granted only the minimum access required to perform their tasks.
• Use strong authentication methods, including multi-factor authentication (MFA); a small audit sketch follows this section.
• Educate employees and users about security best practices in the cloud.
• Implement automated tools and solutions for monitoring and alerting on access control changes and suspicious activities.
• Leverage identity federation and single sign-on (SSO) solutions to centralize and simplify access management.
• Consider using cloud-native security services and solutions provided by the cloud service provider.
• Regularly update and patch systems and software to mitigate vulnerabilities.
Overall, addressing cloud access control issues is crucial for maintaining the security and
compliance of cloud-based systems and data. It requires a combination of proactive
measures, continuous monitoring, and user education.
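As one concrete example of checking for strong authentication on AWS, the hedged boto3 sketch below lists IAM users that have no MFA device attached; it assumes credentials with the iam:ListUsers and iam:ListMFADevices permissions.

```python
# Hedged sketch: report IAM users without an MFA device (AWS / boto3).
# Assumes the caller has iam:ListUsers and iam:ListMFADevices permissions.
import boto3

iam = boto3.client("iam")

paginator = iam.get_paginator("list_users")
for page in paginator.paginate():
    for user in page["Users"]:
        name = user["UserName"]
        mfa = iam.list_mfa_devices(UserName=name)
        if not mfa["MFADevices"]:
            # Candidate for follow-up: enforce MFA enrollment or restrict console access.
            print(f"User without MFA: {name}")
```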
Cloud service provider risks
• Using cloud service providers (CSPs) offers numerous benefits, such
as scalability, cost-efficiency, and flexibility.
• However, it also comes with several risks that organizations need to
be aware of and manage effectively.
• Here are some common cloud service provider risks:
1. Data Security and Privacy: Storing sensitive data in the cloud may
raise concerns about data security and privacy breaches. CSPs have
robust security measures, but the responsibility for securing data
ultimately lies with the customer. Organizations must configure and
manage access controls properly and encrypt sensitive data.
2. Data Loss: While CSPs have data redundancy and backup systems,
data loss can still occur due to hardware failures, software bugs, or
human errors. It's essential to have a robust backup and disaster
recovery strategy in place.
3.Compliance and Legal Issues: Different industries and regions
have specific regulatory requirements for data handling and storage.
Using a CSP might make it challenging to ensure compliance with
these regulations. Companies must assess whether the CSP meets
their compliance needs and, if not, implement additional controls.
4. Vendor Lock-In: Depending too heavily on a specific CSP's
services and APIs can lead to vendor lock-in. This means that it can
be challenging and costly to migrate away from that provider if
necessary.
To mitigate this risk, organizations should design their applications to
be as cloud-agnostic as possible.
5.Service Reliability: CSPs can experience outages or service
disruptions, impacting your organization's operations. While major
providers like AWS, Azure, and Google Cloud have high availability,
no cloud service is immune to downtime. Companies should plan for
service interruptions by designing resilient architectures.
6. Cost Overruns: Without proper monitoring and control, cloud costs can spiral out of control. Organizations must establish cost management practices, including budgeting, tagging resources, and implementing auto-scaling policies (a small tagging sketch follows this list).
7.Limited Customization: Some CSPs offer a wide range of
services, but they may not cater to highly specialized requirements.
If your organization relies on unique configurations or software, you
may find limitations when using off-the-shelf cloud services.
8.Data Transfer Costs: Transferring data in and out of the cloud
can incur additional costs, especially for large volumes of data.
Organizations should factor these costs into their overall cloud
strategy.
9.Dependency on Third-Party Security: While CSPs provide security
measures, organizations might rely on third-party vendors for additional
security tools and services. Managing these relationships and ensuring
compatibility can be complex.
10.Shared Responsibility Model Misunderstanding: Many CSPs
operate on a shared responsibility model, where they secure the
infrastructure, but customers are responsible for securing their data and
applications. Organizations must understand and fulfill their
responsibilities to avoid security gaps.
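As an illustration of the resource-tagging practice mentioned under cost overruns, here is a hedged boto3 sketch that attaches cost-allocation tags to an EC2 instance; the instance ID and tag values are placeholders.

```python
# Hedged sketch: tag an EC2 instance for cost allocation (AWS / boto3).
# The instance ID and tag values are placeholders.
import boto3

ec2 = boto3.client("ec2")

ec2.create_tags(
    Resources=["i-0123456789abcdef0"],           # placeholder instance ID
    Tags=[
        {"Key": "CostCenter", "Value": "cloud-testing"},
        {"Key": "Owner", "Value": "qa-team"},
    ],
)
```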
• To mitigate these risks, organizations should conduct a thorough risk
assessment, develop a robust cloud security strategy, implement proper
access controls, regularly monitor their cloud environment, and stay
informed about changes in the cloud service provider's offerings and
security practices. Additionally, having contingency plans and
redundancy in place can help maintain business continuity in the face of
cloud-related challenges.