module1-cloudcomputing-final
Cloud computing has transformed how businesses operate by providing a flexible, scalable, and cost-efficient way to manage IT resources. Whether you're a startup or a large enterprise, cloud solutions can help improve agility and reduce costs.
CLOUD COMPUTING AT A GLANCE:
Cloud computing is the delivery of computing services—including servers, storage, databases, networking,
software, analytics, and intelligence—over the internet (“the cloud”) to offer faster innovation, flexible
resources, and economies of scale. Here's a quick overview:
Key Characteristics:
On-Demand: Users can access resources anytime without manual intervention from the service
provider.
Scalable: Resources can be scaled up or down instantly depending on the demand.
Pay-As-You-Go: Users only pay for the resources they consume.
Global Access: Accessible from anywhere with an internet connection.
Managed by Providers: Providers manage infrastructure, software, and security.
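The pay-as-you-go characteristic reduces to simple metered arithmetic: usage multiplied by a unit rate, summed across resource types. A toy sketch (the rates below are made up for illustration, not any provider's real pricing):

```python
def monthly_cost(vm_hours, hourly_rate, gb_stored, gb_month_rate):
    """Pay-as-you-go: the bill reflects only what was actually consumed."""
    return vm_hours * hourly_rate + gb_stored * gb_month_rate

# Hypothetical rates: $0.05 per VM-hour, $0.02 per GB-month
print(monthly_cost(200, 0.05, 100, 0.02))  # 200*0.05 + 100*0.02 = 12.0
```

Note how scaling to zero usage also scales the bill to zero, which is the key contrast with buying hardware up front.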
Main Service Models:
1. IaaS (Infrastructure as a Service): Provides virtualized computing resources, such as virtual
machines, storage, and networks. (e.g., AWS EC2, Azure Virtual Machines)
2. PaaS (Platform as a Service): Offers platforms to develop, run, and manage applications without
worrying about underlying infrastructure. (e.g., Google App Engine, Heroku)
3. SaaS (Software as a Service): Delivers software applications via the cloud, managed by the service
provider. (e.g., Google Workspace, Salesforce)
Types of Cloud:
Public Cloud: Resources are shared among multiple users over the internet.
Private Cloud: Cloud infrastructure is exclusively used by a single organization.
Hybrid Cloud: Combines public and private clouds, allowing data and apps to be shared.
Multi-Cloud: Use of multiple cloud services from different providers for different tasks.
Benefits:
Cost-Efficient: Eliminates capital expenses for buying hardware and software.
Flexibility: Adapt quickly to changes with easy scaling of resources.
Reliability: Data backup and disaster recovery options ensure continuity.
Security: Cloud providers invest heavily in security to protect data.
Collaboration: Facilitates team collaboration and remote work.
Popular Providers:
Amazon Web Services (AWS)
Microsoft Azure
Google Cloud Platform (GCP)
IBM Cloud
Oracle Cloud
Cloud computing simplifies IT management, offers flexibility, and enables innovation while minimizing costs. It has become a key tool for modern businesses looking to stay agile in a competitive landscape.
ACHARYA INSTITUTE OF TECHNOLOGY
Affiliated to Visvesvaraya Technological University, Belagavi, Approved by AICTE, New Delhi, Recognised by Govt. of Karnataka and Accredited by NBA (AE, BT, CSE, ECE, ME, MT)
Acharya Dr. Sarvepalli Radhakrishnan Road, Soladevanahalli, Acharya P. O., Bangalore-560 107
www.ait.ac.in | Ph.: 080 2372 2222
HISTORICAL DEVELOPMENT OF CLOUD:
The historical development of cloud computing has evolved over several decades, with contributions from
various technological advancements. Here's a timeline that outlines key milestones:
1. 1960s: The Foundations of Cloud Computing
Concept of Utility Computing:
o Visionary computer scientist John McCarthy proposed the idea that computing could be
provided as a utility, just like water or electricity. This idea laid the groundwork for the
concept of cloud computing.
o Time-sharing systems emerged, allowing multiple users to access a single mainframe
computer, sharing its processing power.
2. 1970s: Virtualization
Virtual Machines (VMs):
o IBM introduced the VM operating system in the 1970s, which allowed a single physical
computer to run multiple virtual machines, each with its own operating system. This concept
of virtualization became essential for the later development of cloud infrastructure.
ARPANET (Precursor to the Internet):
o The Advanced Research Projects Agency Network (ARPANET), a project funded by the
U.S. Department of Defense, became the precursor to the modern internet. It allowed remote
sharing of resources over a network.
3. 1980s: Evolution of Networking and Distributed Computing
Client-Server Model:
o The client-server architecture became popular, where servers provided services to multiple
clients over a network. This model facilitated the development of decentralized computing,
which is a key characteristic of cloud computing.
Emergence of the Internet:
o By the mid-1980s, the internet began to take shape, enabling global networking and
communication, laying the foundation for cloud services to emerge later.
4. 1990s: Early Forms of Cloud and the Web
World Wide Web (1990):
o Tim Berners-Lee developed the World Wide Web, which revolutionized communication
and data exchange, making it easier for people to access and share information online.
Application Service Providers (ASPs):
o The late 1990s saw the rise of ASPs, where companies provided software applications over
the internet, a precursor to modern SaaS (Software as a Service). ASPs provided limited
cloud-like functionalities for businesses.
Salesforce (1999):
o Salesforce launched as one of the first successful SaaS companies, offering customer
relationship management (CRM) software through a web-based platform. It demonstrated
the viability of delivering software over the internet.
5. 2000s: The Emergence of Modern Cloud Computing
Amazon Web Services (AWS) (2006):
o Amazon launched AWS and its first service, Elastic Compute Cloud (EC2). AWS provided
scalable, on-demand computing power to users, marking the birth of modern cloud
infrastructure.
o S3 (Simple Storage Service) was also introduced, offering scalable cloud storage. These
services demonstrated the power of cloud computing for businesses of all sizes.
Google Cloud (2008):
o Google entered the cloud market with Google App Engine, which allowed developers to
build and host web applications in Google's infrastructure.
Microsoft Azure (2010):
o Microsoft launched Azure, initially as a PaaS (Platform as a Service), which later expanded
to provide IaaS and SaaS services. Azure quickly became a major competitor in the cloud
market.
6. 2010s: Cloud Becomes Mainstream
Adoption by Enterprises:
o During this period, cloud computing gained widespread acceptance across industries,
offering cost efficiency, scalability, and flexibility.
Hybrid and Multi-Cloud:
o The adoption of hybrid cloud models, which combine public and private cloud resources,
became popular among enterprises. Multi-cloud strategies, where companies use multiple
cloud providers for different services, also became common.
Google Drive (2012):
o Google launched Google Drive, popularizing cloud storage for individuals and businesses.
Serverless Computing (2014):
o The concept of serverless computing was introduced by AWS with AWS Lambda, allowing
developers to execute code without managing servers, further abstracting the infrastructure
layer.
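The serverless model described above can be made concrete with a handler written in the style AWS Lambda expects. The `(event, context)` signature is Lambda's real Python convention; the invocation below simply simulates what the Lambda runtime does on each request, so no AWS account or deployment is involved:

```python
# Minimal AWS Lambda-style handler. In a real deployment, the Lambda
# runtime calls this function for each event; the developer never
# provisions or manages the server it runs on.
def lambda_handler(event, context):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Local simulation of one invocation:
print(lambda_handler({"name": "cloud"}, None))
```

In production the event payload would come from a trigger such as an HTTP request or a queue message, and billing is per invocation and execution time.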
7. 2020s: Cloud Expansion and New Paradigms
Edge Computing:
o As the demand for low-latency processing increased, edge computing became a key trend,
moving computation closer to the data source or user. This complements cloud computing
by distributing workloads.
AI and Machine Learning in the Cloud:
o Cloud providers expanded their offerings to include artificial intelligence (AI) and machine
learning (ML) services, allowing businesses to leverage advanced analytics without needing
specialized hardware.
5G and IoT Integration:
o With the rollout of 5G and the growth of the Internet of Things (IoT), cloud computing now
supports a broader range of real-time, interconnected devices and applications.
Sustainability Focus:
o Cloud providers like AWS, Microsoft, and Google increasingly focus on making their data
centers more energy-efficient, responding to the growing demand for green computing and
sustainability in cloud operations.
Key Trends in Cloud Computing History:
Virtualization and networking advancements paved the way for cloud infrastructure.
The rise of SaaS demonstrated how businesses could deliver software via the web.
Cloud services evolved from simple hosting and storage to complex computing platforms offering
AI, big data, and real-time applications.
Future of Cloud Computing
The cloud is expected to further evolve with innovations in quantum computing, decentralized cloud
systems, and greater integration with AI and automation.
BUILDING CLOUD COMPUTING ENVIRONMENT:
Building a cloud computing environment involves creating an infrastructure that supports scalable, flexible,
and reliable services, typically managed through virtualization, automation, and orchestration. This can be
done using public, private, or hybrid cloud models. Below is a step-by-step guide to building and setting up a
cloud computing environment:
1. Choose a Cloud Model:
- Public Cloud: Hosted by a third-party provider (e.g., AWS, Azure, Google Cloud), where multiple users share resources. You focus on application development rather than infrastructure management.
- Private Cloud: Created and managed within the organization, offering greater control, security, and
customization but requiring significant resources to set up.
- Hybrid Cloud: Combines public and private clouds, enabling flexibility and workload distribution across
both environments.
- Multi-Cloud: Utilizing services from multiple cloud providers to diversify risks and optimize
resources.
Example Architecture:
- Compute Layer: Virtual machines, containers, or serverless functions (e.g., AWS Lambda).
- Storage Layer: Distributed storage for files, databases, or backups.
- Network Layer: Virtual networking for internal and external communication.
- Security Layer: Identity and access management, encryption, and compliance measures.
- Monitoring Layer: Continuous tracking of performance, security, and cost.
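The five layers above can be sketched as a simple declarative description of an environment. Every name and value here is illustrative, not a real provider setting:

```python
# Illustrative layered description of a cloud environment; all
# parameters (instance counts, CIDR range, metrics) are hypothetical.
architecture = {
    "compute":    {"type": "containers", "min_instances": 2, "max_instances": 10},
    "storage":    {"type": "object_store", "replication": 3},
    "network":    {"type": "virtual_network", "cidr": "10.0.0.0/16"},
    "security":   {"iam": True, "encryption_at_rest": True},
    "monitoring": {"metrics": ["cpu", "latency", "cost"]},
}

for layer, config in architecture.items():
    print(f"{layer}: {config}")
```

In practice such a description would live in an infrastructure-as-code tool (e.g., Terraform or CloudFormation) rather than a Python dictionary, but the layered decomposition is the same.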
Conclusion
Building a cloud computing environment involves careful planning, selection of technologies, and
automation for scalability, security, and performance. By leveraging cloud platforms and virtualization, you
can create a flexible, scalable, and cost-effective environment to meet the dynamic needs of modern
applications and businesses.
AMAZON WEB SERVICES
AWS (Amazon Web Services) is a comprehensive cloud computing platform provided by Amazon. It offers
a wide range of cloud-based services including computing power, storage options, networking, and
databases, along with advanced services like artificial intelligence (AI), machine learning (ML), and Internet
of Things (IoT). AWS enables businesses to scale their infrastructure and services globally with a pay-as-you-go model, providing flexibility and cost-efficiency.
1. Compute Services :
- EC2 (Elastic Compute Cloud) : Virtual servers to run applications.
- Lambda : Serverless computing service to run code without provisioning servers.
- Elastic Beanstalk : Platform as a Service (PaaS) for deploying and managing applications.
2. Storage Services :
- S3 (Simple Storage Service) : Object storage for storing and retrieving any amount of data.
- EBS (Elastic Block Store) : Block storage volumes for use with EC2 instances.
- Glacier : Low-cost archive storage.
3. Database Services :
- RDS (Relational Database Service) : Managed relational databases.
- DynamoDB : NoSQL database service.
- Aurora : High-performance relational database compatible with MySQL and PostgreSQL.
4. Networking :
- VPC (Virtual Private Cloud) : Isolated network resources.
- Route 53 : Scalable Domain Name System (DNS).
- CloudFront : Content Delivery Network (CDN) to deliver data, videos, and APIs globally.
7. Developer Tools :
- CodeBuild, CodeDeploy, CodePipeline : Continuous integration and continuous deployment (CI/CD)
services.
AWS provides scalable, reliable, and flexible cloud solutions for businesses of all sizes, from startups to
large enterprises.
GOOGLE APP ENGINE:
Google App Engine (GAE) is a fully managed platform-as-a-service (PaaS) offering by Google Cloud that
allows developers to build and deploy scalable web applications and services. It abstracts much of the
underlying infrastructure, allowing you to focus on writing code rather than managing servers or scaling
resources.
1. Managed Infrastructure:
- Auto-scaling : Your app scales automatically based on traffic and demand.
- No server management : You don't have to worry about managing servers, patching OS updates, or
configuring scaling policies.
3. Flexible Environments:
- Standard Environment :
- Faster deployment times, built-in scaling, and a limited set of Google Cloud services.
- Apps run in a sandboxed environment, with specific limitations on supported libraries, language
versions, etc.
- Flexible Environment :
- More freedom, supports custom libraries and long-running processes.
- Can handle more intensive computational workloads and provides support for third-party services via
Docker containers.
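In the standard environment, an application is described by an `app.yaml` descriptor that names the runtime and scaling behavior. A minimal sketch is shown below; the runtime name and scaling limits are illustrative and should be checked against current App Engine documentation:

```yaml
# Minimal App Engine standard-environment descriptor (illustrative).
runtime: python312        # language runtime the sandbox provides
automatic_scaling:
  min_instances: 0        # scale to zero when there is no traffic
  max_instances: 5        # cap instances to bound cost
```

Deploying is then a single command (`gcloud app deploy`); App Engine handles provisioning, load balancing, and scaling between the configured bounds.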
6. Pricing:
- Pay-as-you-go model based on the resources consumed by your app (e.g., compute, storage, traffic).
- There's also a free tier that allows developers to experiment with the platform without incurring costs for
small-scale apps.
Use Cases:
- Web applications : Hosting websites and web apps with dynamic content.
- APIs and microservices : Building backend services for mobile or web applications.
- Data processing : Handling batch processing or real-time data pipelines.
MICROSOFT AZURE
Microsoft Azure is a cloud computing platform and services suite provided by Microsoft, designed to help
businesses and developers build, manage, and deploy applications through a global network of data centers.
Azure offers a wide range of cloud services, including computing, analytics, storage, networking, and
artificial intelligence (AI), and is a key player alongside AWS and Google Cloud in the cloud market.
Key Features of Microsoft Azure:
1. Compute Services:
Virtual Machines (VMs): Deploy Linux or Windows virtual machines to run applications and
workloads.
Azure App Service: A fully managed platform for building and hosting web apps, mobile app
backends, and RESTful APIs.
Azure Kubernetes Service (AKS): Manage containerized applications using Kubernetes.
Azure Functions: A serverless compute service that allows you to execute code on demand without
managing infrastructure.
2. Storage:
Azure Blob Storage: Object storage for unstructured data, such as images, videos, and backups.
Azure Disk Storage: Managed disks for VMs, offering SSD and HDD options.
Azure Files: Managed file shares that can be accessed via SMB protocol.
Azure Backup: A service that provides secure cloud backups.
3. Database and Data Services:
Azure SQL Database: Fully managed relational database service with SQL Server compatibility.
Cosmos DB: Globally distributed NoSQL database service for modern applications.
Azure Synapse Analytics: An analytics service that brings together big data and data warehousing.
Azure Database for MySQL/PostgreSQL: Managed database services for MySQL and PostgreSQL
databases.
4. Networking:
Azure Virtual Network (VNet): Set up and manage your own network infrastructure in the cloud.
Azure Load Balancer: Distribute network traffic across multiple servers to ensure availability.
Azure VPN Gateway: Securely connect your on-premises network to Azure.
Azure CDN: Deliver content to users around the globe with a content delivery network.
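A load balancer's core job (distributing incoming requests across a pool of servers) can be illustrated with the simplest policy, round-robin. Real services such as Azure Load Balancer use richer rules (e.g., hash-based distribution and health probes); the server names below are hypothetical:

```python
from itertools import cycle

# Round-robin load balancing: requests are assigned to backend
# servers in rotation, spreading load evenly across the pool.
servers = ["vm-1", "vm-2", "vm-3"]
rotation = cycle(servers)

assignments = [next(rotation) for _ in range(6)]
print(assignments)  # ['vm-1', 'vm-2', 'vm-3', 'vm-1', 'vm-2', 'vm-3']
```

Round-robin assumes roughly uniform request cost; production balancers also remove unhealthy backends from the rotation.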
5. AI and Machine Learning:
Azure AI: A suite of AI services like Azure Cognitive Services, which includes natural language
processing, computer vision, and speech APIs.
Azure Machine Learning: Build, train, and deploy machine learning models quickly and efficiently.
Azure Bot Service: Develop intelligent bots using integrated AI.
6. Developer Tools and DevOps:
Azure DevOps: A complete DevOps solution with services for CI/CD, source control, package
management, and more.
Azure Pipelines: Automate the build, test, and deploy process of your code to any platform.
GitHub Actions for Azure: Automate Azure workflows directly from GitHub repositories.
Visual Studio Code Integration: Seamless integration with popular development environments for
rapid application development.
7. Security and Identity:
Azure Active Directory (Azure AD): Cloud-based identity and access management service for secure
access to applications and resources.
Azure Security Center: A unified security management system that provides advanced threat
protection.
Azure Sentinel: A cloud-native security information and event management (SIEM) tool that
provides intelligent security analytics.
Key Vault: Securely store and manage sensitive information like keys, passwords, certificates, and
more.
8. Hybrid and Multi-Cloud:
Azure Arc: Manage and govern on-premises, multi-cloud, and edge environments through a unified
platform.
Azure Stack: A hybrid cloud platform that allows users to run Azure services in their own data
centers.
Azure Site Recovery: Keep your business running with an integrated disaster recovery service.
9. Analytics and Big Data:
Azure Data Lake: A scalable data storage and analytics service for big data workloads.
HDInsight: A fully managed Hadoop and Spark service for big data processing.
Azure Databricks: An Apache Spark-based analytics platform optimized for Azure, built for AI and
big data workloads.
10. Compliance and Certifications:
Azure is compliant with a wide range of industry standards and regulations, such as GDPR, HIPAA,
and ISO/IEC 27001.
Provides tools like Azure Policy and Azure Blueprints to ensure compliance across your cloud
environment.
Pricing:
Pay-as-you-go model with pricing based on the resources you use.
Azure Cost Management tools help you track and optimize cloud spending.
Use Cases:
Enterprise Applications: Host large-scale enterprise apps such as ERP, CRM, or custom-built
software.
Data Analytics: Build big data pipelines, data lakes, and perform real-time analytics.
AI and Machine Learning: Build intelligent applications using Azure’s AI services.
Hybrid Cloud Solutions: Extend on-premises infrastructure to the cloud with Azure’s hybrid cloud
offerings.
HADOOP
Apache Hadoop is an open-source framework designed for distributed storage and processing of large
datasets across clusters of computers using simple programming models. It is one of the foundational
technologies used in big data environments and enables efficient large-scale data analysis.
Key Components of Hadoop:
1. Hadoop Distributed File System (HDFS):
HDFS is a distributed file system that stores data across many machines, designed to handle large
files.
Data is split into large blocks (128 MB by default in modern Hadoop, and configurable) and replicated across different nodes in the cluster for fault tolerance and redundancy.
It is optimized for high throughput rather than low-latency access to data, which makes it suitable
for batch processing.
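The block-and-replica bookkeeping described above can be sketched with a little arithmetic; the defaults below mirror HDFS's usual 128 MB block size and replication factor of 3:

```python
import math

def hdfs_blocks(file_size_mb, block_size_mb=128, replication=3):
    """Return (number of blocks, total block replicas stored) for a file.

    HDFS splits the file into fixed-size blocks and stores `replication`
    copies of each block on different nodes for fault tolerance.
    """
    blocks = math.ceil(file_size_mb / block_size_mb)
    return blocks, blocks * replication

print(hdfs_blocks(1000))  # a 1000 MB file -> (8, 24): 8 blocks, 24 replicas
```

This is why HDFS favors large files: a tiny file still occupies a whole block entry in the NameNode's metadata, even though it uses little disk space.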
2. MapReduce:
MapReduce is a programming model used for processing large data sets with a distributed algorithm
on a Hadoop cluster.
The model has two key phases:
o Map: Processes the input data and produces intermediate key-value pairs.
o Reduce: Aggregates and processes the key-value pairs generated by the map phase to
produce the final output.
It's particularly useful for batch processing tasks that can be parallelized across large datasets.
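The two phases can be demonstrated with the classic word-count example, run locally rather than on a cluster. This is a sketch of the programming model only, not of Hadoop's actual distributed execution engine:

```python
from collections import defaultdict

# Map phase: emit an intermediate (key, value) pair per word.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word, 1)

# Reduce phase: aggregate the intermediate pairs per key.
def reduce_phase(pairs):
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

data = ["big data on hadoop", "big data pipelines"]
print(reduce_phase(map_phase(data)))
# {'big': 2, 'data': 2, 'on': 1, 'hadoop': 1, 'pipelines': 1}
```

On a real cluster, the map tasks run in parallel on the nodes holding each data block, and a shuffle step routes all pairs with the same key to the same reducer.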
3. YARN (Yet Another Resource Negotiator):
YARN is Hadoop’s cluster resource management system that manages resources for different
distributed applications.
It schedules and allocates resources (e.g., CPU and memory) to various applications running on a
Hadoop cluster.
YARN enables Hadoop to support a broader range of processing frameworks beyond MapReduce,
such as Apache Spark, Flink, and more.
4. Hadoop Common:
Hadoop Common is a set of utilities and libraries that support other Hadoop modules.
It includes the necessary Java libraries and scripts to start Hadoop services and interact with the file
system.
Hadoop Ecosystem:
In addition to its core components, Hadoop has a rich ecosystem of tools and frameworks that extend its
capabilities for data storage, processing, and analysis.
1. Apache Hive:
A data warehouse software built on top of Hadoop, which allows users to write SQL-like queries
(HiveQL) to manage and query large datasets stored in HDFS.
Ideal for data analysis tasks that don't require real-time processing but can run in batch mode.
2. Apache HBase:
A NoSQL database that provides real-time read and write access to large datasets.
HBase runs on top of HDFS and allows for random, real-time access to big data.
3. Apache Pig:
A platform for analyzing large data sets that consists of a high-level scripting language called Pig
Latin.
It is used for writing data analysis programs for complex data transformations, especially in batch
mode.
4. Apache Spark:
A fast, in-memory data processing framework that can work with Hadoop clusters.
Spark can be used as an alternative to MapReduce due to its faster processing capabilities, thanks to
its ability to keep data in memory between operations.
5. Apache Sqoop:
A tool designed for efficiently transferring bulk data between Hadoop and relational databases like
MySQL, Oracle, or SQL Server.
6. Apache Flume:
A distributed service for efficiently collecting, aggregating, and moving large amounts of log data
from various sources into HDFS.
7. Apache Oozie:
A workflow scheduler that allows you to manage complex Hadoop jobs, coordinating workflows
and dependencies between jobs like MapReduce, Hive, Pig, and others.
8. Cloudera and Hortonworks (HDP):
Hadoop distributions by vendors like Cloudera and Hortonworks (now part of Cloudera) offer
enhanced management tools, support, and additional features for running Hadoop clusters at
enterprise scale.
5. Data Locality: Hadoop moves the computation to the data rather than the other way around, which
reduces network congestion and improves processing speeds.
6. Open-Source and Extensible: Being an open-source framework, Hadoop has a large community of
developers and a wide range of plugins, extensions, and tools available.
FORCE.COM (SALESFORCE PLATFORM):
Force.com is Salesforce's platform-as-a-service for building and running custom applications on Salesforce's infrastructure. Common uses include:
2. Custom Applications:
o Companies can build entire custom applications on the platform, such as inventory
management systems, HR systems, or project management tools.
3. Automation:
o Automate complex business processes and reduce manual effort, such as automating
approval processes, data validation, and notifications.
4. Third-party Integrations:
o Build integrations between Salesforce and other enterprise systems, such as ERPs, payment
gateways, or data warehouses, to synchronize data across the business.
5. AppExchange Applications:
o Independent software vendors (ISVs) can build applications on Force.com and offer them on
Salesforce’s AppExchange marketplace.
Benefits of Force.com:
1. Faster Time-to-Market:
o Pre-built tools, templates, and workflows allow for rapid development and deployment of
applications.
2. Customization Without Complexity:
o A wide range of customization options is available, from simple drag-and-drop app building
to writing custom code in Apex and Visualforce.
3. Enterprise-Grade Security:
o Built-in enterprise-level security, compliance, and identity management, ensuring that
sensitive data is handled securely.
4. Scalability:
o As a cloud-based platform, Force.com scales automatically to handle large amounts of data
and traffic.
5. Strong Ecosystem:
o Salesforce’s ecosystem provides access to a wide range of third-party tools, applications,
and support through AppExchange and the Salesforce developer community.
6. Global Availability:
o Force.com runs on Salesforce’s global infrastructure, providing high availability, disaster
recovery, and low-latency access worldwide.
Salesforce Editions:
Salesforce offers multiple editions tailored to different types of organizations and needs:
1. Essentials: Ideal for small businesses, includes core CRM functionality.
2. Professional: Designed for businesses that need a full-featured CRM with advanced sales and
customer service tools.
3. Enterprise: Suitable for larger businesses with complex needs, offering advanced customization,
integration, and automation features.
4. Unlimited: Provides the most extensive features, including unlimited customization, support, and
access to premium services.
Salesforce Ecosystem:
1. Lightning Experience:
Lightning is Salesforce’s modern user interface, offering a more dynamic and interactive experience
than the previous Classic UI.
It includes a drag-and-drop app builder (Lightning App Builder), customizable dashboards, and
responsive design across devices.
2. Trailhead:
Trailhead is Salesforce’s free, gamified learning platform that helps users learn how to use and
develop on Salesforce.
It offers courses (called “Trails”) in topics such as CRM fundamentals, app development, AI, and
business process automation.
3. Salesforce Partners and ISVs:
Salesforce has a robust partner ecosystem, consisting of System Integrators (SIs) who help
companies implement and customize Salesforce, and Independent Software Vendors (ISVs) who
build apps and solutions that run on the platform.
ANEKA:
Aneka is a cloud application development platform from Manjrasoft whose core components include:
1. Aneka Container:
o The Aneka Container is a lightweight runtime environment that executes tasks across
distributed resources. It acts as a node within the Aneka network.
2. Aneka Manager:
o The Aneka Manager is responsible for orchestrating resources, scheduling tasks, and
managing application execution. It ensures that jobs are distributed across the cloud
infrastructure efficiently.
3. Aneka SDK:
o The Aneka Software Development Kit (SDK) provides libraries and APIs that developers
use to build applications for the Aneka platform. It supports multiple programming
models and environments, including .NET and Java.