CCD Viva

The document provides an overview of Amazon AWS cloud architecture, detailing front-end and back-end services, integration, and benefits such as scalability and security. It also covers AWS storage, database, management tools, and application services, along with steps for creating an AWS account, launching EC2 instances, and using SageMaker for machine learning experiments. Additionally, it discusses containerization for model deployment and the use of AWS Data Wrangler for dataset preparation.


EXP 1: Study of any cloud architecture and its service provider with respect to front-end and back-end, such as Amazon AWS, Microsoft Azure, Google Cloud Platform (GCP)

AWS cloud architecture overview:


AWS provides a wide range of services that can be utilized to build reliable, scalable and secure cloud applications. The architecture usually
involves:

1) Front-end Services:
These are services provided by AWS to help build and manage the parts of an application that users interact with, such as mobile apps or websites.
• Amazon S3: Stores files such as images and videos for websites.
• Amazon CloudFront: Speeds up delivery of website content.
• AWS Amplify: Helps build and host web and mobile apps.

2) Back-end Services:
Back-end services typically handle business logic, data processing, and database management.
• Amazon EC2: Scalable virtual servers for running applications.
• AWS Lambda: Serverless computing for running code without managing servers.
• Amazon RDS: Managed relational database service supporting multiple engines.
• Amazon DynamoDB: Fully managed NoSQL database with fast, predictable performance.
• Amazon SQS and Amazon SNS: Messaging and notification services for decoupling components and enabling scalable, fault-tolerant
systems.

Front-End and Back-End Integration in AWS:


• API Gateway: Creates and manages APIs to securely connect front-end and back-end services.
• Amazon Cognito: Manages user authentication and authorization for web and mobile apps.
• AWS SDKs: Tools for integrating AWS services into applications.
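
To make the integration concrete, below is a minimal, hypothetical Python (boto3) Lambda handler of the kind that could sit behind API Gateway and read data from DynamoDB; the table name 'Orders' and key 'order_id' are made-up placeholders, not part of the original notes.

import json
import boto3

# Hypothetical back-end: a Lambda function exposed through API Gateway that
# fetches an item from a DynamoDB table. Table name and key are placeholders.
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('Orders')

def lambda_handler(event, context):
    # API Gateway passes query string parameters in the event
    order_id = event.get('queryStringParameters', {}).get('order_id')
    response = table.get_item(Key={'order_id': order_id})
    return {
        'statusCode': 200,
        'body': json.dumps(response.get('Item', {})),
    }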

Benefits of AWS Cloud Architecture:


1. Scalability: Auto-scaling for handling varying workloads.
2. Reliability: High availability and fault tolerance with multiple data centres.
3. Security: Comprehensive security features and compliance certifications.
4. Cost-Effectiveness: Pay-as-you-go pricing reduces costs.

EXP 2: Study of Amazon AWS environment with respect to storage, database, management tools, application services

AWS offers a comprehensive suite of services for building, deploying, and managing applications in the cloud.

Storage:
• Amazon S3: Scalable object storage for backup, archiving, and analytics.
• Amazon EBS: Block storage for EC2 instances, suitable for databases and file systems.
• Amazon Glacier: Low-cost archival storage for infrequently accessed data.

Database:
• Amazon RDS: Managed relational databases with automated tasks.
• Amazon DynamoDB: Fast, scalable NoSQL database.
• Amazon Redshift: Data warehousing with fast query performance.

Management Tools:
• AWS Management Console: Web interface for managing AWS services.
• AWS CLI: Command-line tool for scripting and automation.
• AWS CloudFormation: Infrastructure as Code service for automating deployments.
• AWS IAM: Manages user access and permissions.

Application Services:
• Amazon API Gateway: Creates and manages APIs.
• Amazon SNS: Messaging service for notifications.
• AWS Step Functions: Orchestrates serverless workflows.
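
A small boto3 sketch of driving a couple of these services programmatically (the SNS topic ARN is a placeholder):

import boto3

# List S3 buckets in the account
s3 = boto3.client('s3')
for bucket in s3.list_buckets()['Buckets']:
    print(bucket['Name'])

# Publish a notification to an SNS topic (placeholder ARN)
sns = boto3.client('sns')
sns.publish(
    TopicArn='arn:aws:sns:us-east-1:123456789012:my-topic',
    Message='Deployment finished',
    Subject='Status update',
)
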
EXP 3: Creating an AWS account with an administrative role and study of Resource Explorer in AWS

Creating an AWS Account with Administrative Role:


1. Sign Up: Go to AWS Management Console and create an account with your email, password, and payment info.
2. Support Plan: Choose a support plan (Basic is free).
3. Verify Identity: Verify your identity by phone.
4. Payment Info: Provide a valid credit card.
5. Sign In: Access the AWS Management Console with your credentials.
6. Create Admin User:
o Go to IAM in the console.
o Add a user with “Programmatic access” and “AWS Management Console access.”
o Set a password.
o Attach the “AdministratorAccess” policy.
o Create the user.
7. Access Resources: Sign in with the new admin user credentials.
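
The same admin user can also be created programmatically. A minimal boto3 sketch, assuming a hypothetical user name 'admin-user' and a placeholder password:

import boto3

iam = boto3.client('iam')

# Create the user, give it console access, and attach the managed
# AdministratorAccess policy (user name and password are placeholders)
iam.create_user(UserName='admin-user')
iam.create_login_profile(
    UserName='admin-user',
    Password='ChangeMe123!',
    PasswordResetRequired=True,
)
iam.attach_user_policy(
    UserName='admin-user',
    PolicyArn='arn:aws:iam::aws:policy/AdministratorAccess',
)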

Study of Resource Explorer in AWS


The AWS Resource Explorer is a valuable tool for visualizing and understanding your AWS resources and their relationships. It provides insights
into your infrastructure, facilitates troubleshooting and optimization, and enhances resource management capabilities.

Features: Offers an interactive graph view, filtering and grouping options, detailed resource information, and dependency mapping.
Use Cases: Useful for infrastructure analysis, troubleshooting, cost optimization, and ensuring security and compliance.
Limitations:
• Complexity: In large and complex AWS environments, the Resource Explorer may become cluttered and challenging to navigate effectively.
• Resource Support: Not all AWS resources may be fully supported or visualized in the Resource Explorer, especially newer or less commonly used services.
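
Resource Explorer can also be queried from code. A rough boto3 sketch, assuming Resource Explorer has already been enabled in the account with a default view:

import boto3

# Search indexed resources (requires Resource Explorer to be set up
# with a default view in the account)
explorer = boto3.client('resource-explorer-2')
results = explorer.search(QueryString='service:ec2', MaxResults=10)
for resource in results['Resources']:
    print(resource['Arn'], resource['ResourceType'])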

EXP 8: Create Autopilot experiment using Amazon SageMaker Studio UI

Creating an Autopilot Experiment in Amazon SageMaker Studio:

1. Open SageMaker Studio:


o Access from the AWS Management Console.
o Set up a Studio domain and user profile if needed.
2. Launch SageMaker Studio:
o Go to the Launcher tab.
3. Start an Autopilot Experiment:
o Select “Autopilot” under “Training” and click “Start Experiment.”
4. Specify Experiment Details:
o Name your experiment.
o Choose a dataset.
o Configure settings like target attribute and problem type.
5. Run the Experiment:
o Click “Run Experiment.”
o Autopilot will analyze data, perform feature engineering, select algorithms, and tune hyperparameters.
6. Monitor Progress:
o Track progress in real-time with visualizations and metrics.
7. Evaluate Model Candidates:
o Review model performance and select the best model.
8. Deploy Model:
o Deploy as an endpoint for real-time inference or create a batch transform job.
9. Repeat if Needed:
o Create additional experiments with different settings or datasets.
10. Cleanup Resources:
o Clean up resources to avoid unnecessary costs.
This process allows you to leverage SageMaker Autopilot's automated machine learning capabilities to quickly and efficiently build and deploy
machine learning models without the need for manual tuning or extensive data preprocessing.
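
The same experiment can be launched outside the Studio UI with the SageMaker Python SDK's AutoML class. A rough sketch, assuming a CSV dataset in S3 and a hypothetical target column 'label':

import sagemaker
from sagemaker.automl.automl import AutoML

sagemaker_session = sagemaker.Session()
role = sagemaker.get_execution_role()

# Configure the Autopilot (AutoML) job; the dataset path and target column
# are placeholders
automl = AutoML(
    role=role,
    target_attribute_name='label',
    sagemaker_session=sagemaker_session,
    max_candidates=10,
)

# Launch the experiment on a CSV dataset stored in S3
automl.fit(inputs='s3://your-bucket/training-data.csv', wait=False)
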
EXP 9:

Steps After Selecting a Dataset in SageMaker Autopilot:


1. Data Analysis:
o Analyzes dataset structure and characteristics.
o Identifies missing values and anomalies.
2. Feature Engineering:
o Generates and evaluates feature transformations.
o Handles categorical variables, missing data, and scaling.
3. Model Selection:
o Chooses algorithms and hyperparameters.
o Trains and evaluates multiple models.
4. Hyperparameter Tuning:
o Optimizes hyperparameters using techniques like Bayesian optimization.
5. Model Evaluation:
o Assesses models using metrics like accuracy and F1 score.
o Provides insights into performance and feature importance.
6. Model Candidate Generation:
o Creates and ranks model candidates based on performance.
7. User Review and Selection:
o Reviews and selects the best-performing models.
8. Model Deployment:
o Deploys models for real-time or batch inference.
SageMaker Autopilot automates building, training, and deploying machine learning models, saving time and effort while ensuring high-quality
results.
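
Once an Autopilot job is running, its status and candidate models can be inspected through boto3. A brief sketch, assuming a hypothetical job name 'my-autopilot-job':

import boto3

sm = boto3.client('sagemaker')

# Check overall job status (job name is a placeholder)
job = sm.describe_auto_ml_job(AutoMLJobName='my-autopilot-job')
print(job['AutoMLJobStatus'])

# List candidate models ranked by their objective metric
candidates = sm.list_candidates_for_auto_ml_job(
    AutoMLJobName='my-autopilot-job',
    SortBy='FinalObjectiveMetricValue',
    SortOrder='Descending',
)
for c in candidates['Candidates']:
    print(c['CandidateName'], c['FinalAutoMLJobObjectiveMetric']['Value'])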

EXP 10:

Model Deployment Using Containerization in the Cloud:


1. Containerization Basics:
o Docker: Platform for creating, deploying, and managing containers.
2. Model Packaging:
o Packaging Models: Models and dependencies are packaged into Docker containers.
o Dockerfile: Defines environment and dependencies.
3. Benefits of Containerization:
o Portability: Consistent deployment across environments.
o Isolation: Ensures models run without interference.
o Scalability: Easily scale containers to handle workloads.
o Reproducibility: Version-controlled Docker images.
4. Cloud Environment Integration:
o Cloud Support: Major providers support Docker containers.
o Orchestration: Platforms like Kubernetes manage and scale containers.
5. Deployment Strategies:
o Single Container: Deploy a single container with the model.
o Microservices: Deploy smaller containers for specific functions.
o Serverless: Use platforms like AWS Lambda for serverless deployment.
6. Monitoring and Logging:
o Monitoring: Tools like Prometheus and Grafana.
o Logging: Solutions like ELK stack and Fluentd.
7. Security Considerations:
o Container Security: Implement best practices and manage vulnerabilities.
o Secrets Management: Securely manage sensitive information.
8. CI/CD:
o Pipelines: Automate building, testing, and deploying containers.
o IaC: Use tools like Terraform for managing environments.
Containerization in the cloud offers a flexible, scalable, and efficient approach to deploying machine learning models.
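
As a concrete illustration of the model packaging step above, here is a minimal, hypothetical Flask inference server that could be baked into a Docker image. It follows the /ping and /invocations convention used by SageMaker-style serving containers and assumes a pickled scikit-learn model at /opt/ml/model/model.pkl; both are illustrative choices, not part of the original notes.

import pickle
from flask import Flask, request, jsonify

app = Flask(__name__)

# Load the model once at container start-up (path is a placeholder that
# matches the SageMaker serving-container convention)
with open('/opt/ml/model/model.pkl', 'rb') as f:
    model = pickle.load(f)

@app.route('/ping', methods=['GET'])
def ping():
    # Health check used by the hosting platform
    return '', 200

@app.route('/invocations', methods=['POST'])
def invocations():
    # Expect a JSON body like {"instances": [[...feature values...], ...]}
    payload = request.get_json()
    predictions = model.predict(payload['instances']).tolist()
    return jsonify({'predictions': predictions})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)
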
EXP 4:

Launching an EC2 Instance in AWS:


1. Sign In: Access the AWS Management Console with your credentials.
2. Navigate to EC2: Select “EC2” from the list of AWS services.
3. Launch Instance: Start the process to create a new EC2 instance.
4. Choose AMI: Pick a pre-configured template for your instance.
5. Choose Instance Type: Select the hardware configuration for your instance.
6. Configure Instance: Set up instance details like number and network settings.
7. Add Storage: Define the storage capacity and type for your instance.
8. Add Tags (Optional): Label your instance for better organization.
9. Configure Security Group: Set up firewall rules for your instance.
10. Review and Launch: Double-check settings and initiate instance creation.
11. Create Key Pair: Generate a key pair for secure SSH access.
12. Launch Instances: Finalize and launch your EC2 instance.
13. View Instances: Monitor the status and details of your instance on the dashboard.
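
The same launch can be scripted with boto3. A minimal sketch, where the AMI ID, key pair name, and security group ID are placeholders for values from your own account:

import boto3

ec2 = boto3.client('ec2')

# Launch a single t2.micro instance (AMI ID, key pair, and security group
# are placeholders)
response = ec2.run_instances(
    ImageId='ami-0123456789abcdef0',
    InstanceType='t2.micro',
    MinCount=1,
    MaxCount=1,
    KeyName='my-key-pair',
    SecurityGroupIds=['sg-0123456789abcdef0'],
    TagSpecifications=[{
        'ResourceType': 'instance',
        'Tags': [{'Key': 'Name', 'Value': 'demo-instance'}],
    }],
)
print(response['Instances'][0]['InstanceId'])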

EXP 5:

Creating an S3 Bucket:
1. Sign In: Go to AWS Management Console and sign in.
2. Navigate to S3: Select “S3” from the services list.
3. Create Bucket: Click “Create bucket.”
4. Configure Bucket:
o Enter a unique bucket name.
o Select a region.
o Configure additional settings (versioning, logging, encryption).
5. Set Permissions (Optional): Configure bucket policies and ACLs.
6. Review and Create: Review settings and click “Create bucket.”
7. Confirmation: You’ll see a confirmation message once the bucket is created.
Uploading Files to S3 Bucket:
1. Sign In: Sign in to the AWS Management Console.
2. Navigate to S3: Select the bucket.
3. Upload File: Click “Upload,” choose the file, and follow the prompts.
Applying Permissions for Uploading Files:
1. IAM Policies: Create and attach policies to IAM users or groups.
2. Bucket Policies: Define who can upload files and under what conditions.
3. ACLs: Grant specific permissions to AWS accounts or groups.
Authentication and Authorization:
1. IAM: Use IAM roles and policies for access control.
2. Amazon Cognito: Implement user authentication and authorization.
3. Custom Solutions: Use AWS Lambda, API Gateway, and other services for custom authentication.
Example Scenario:
1. Create IAM Policy: Grant s3:PutObject permission.
2. Attach Policy: Attach to an IAM role for authenticated users.
3. Authenticate Users: Use Amazon Cognito for authentication.
4. Upload Files: Users authenticate, obtain temporary credentials, and upload files.
This process helps you manage and secure your S3 bucket and its contents efficiently.
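
A boto3 sketch of the same flow: create a bucket, upload a file, and define an IAM policy that grants only s3:PutObject. The bucket name, region, file name, and policy name are placeholders:

import json
import boto3

s3 = boto3.client('s3')
iam = boto3.client('iam')

# Create the bucket (name and region are placeholders; LocationConstraint
# is omitted when creating a bucket in us-east-1)
s3.create_bucket(
    Bucket='my-unique-bucket-name',
    CreateBucketConfiguration={'LocationConstraint': 'ap-south-1'},
)

# Upload a local file to the bucket
s3.upload_file('report.csv', 'my-unique-bucket-name', 'uploads/report.csv')

# IAM policy that allows uploads to this bucket only
policy_document = {
    'Version': '2012-10-17',
    'Statement': [{
        'Effect': 'Allow',
        'Action': 's3:PutObject',
        'Resource': 'arn:aws:s3:::my-unique-bucket-name/*',
    }],
}
iam.create_policy(
    PolicyName='AllowUploadsToMyBucket',
    PolicyDocument=json.dumps(policy_document),
)
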
EXP 7:
Setting Up and Using SageMaker Feature Store:
1. Set up AWS Environment: Ensure you have an AWS account and access to Amazon SageMaker.
2. Create a Feature Group:
Python
import time
import pandas as pd
import sagemaker
from sagemaker.feature_store.feature_group import FeatureGroup
from sagemaker.feature_store.feature_definition import FeatureDefinition, FeatureTypeEnum
from sagemaker.feature_store.inputs import FeatureValue

# Specify your SageMaker session and role
sagemaker_session = sagemaker.Session()
role = sagemaker.get_execution_role()

# Define the Feature Group name and feature definitions.
# Feature Store requires a unique record identifier and an event-time feature.
feature_group_name = 'my-feature-group'
feature_definitions = [
    FeatureDefinition(feature_name='record_id', feature_type=FeatureTypeEnum.STRING),
    FeatureDefinition(feature_name='feature1', feature_type=FeatureTypeEnum.STRING),
    FeatureDefinition(feature_name='feature2', feature_type=FeatureTypeEnum.STRING),
    FeatureDefinition(feature_name='event_time', feature_type=FeatureTypeEnum.FRACTIONAL),
    # Add more feature definitions as needed
]

# Create the Feature Group (offline store in S3, online store enabled)
feature_group = FeatureGroup(
    name=feature_group_name,
    sagemaker_session=sagemaker_session,
    feature_definitions=feature_definitions,
)
feature_group.create(
    s3_uri=f's3://{sagemaker_session.default_bucket()}/feature-store',
    record_identifier_name='record_id',
    event_time_feature_name='event_time',
    role_arn=role,
    enable_online_store=True,
)

# Wait until the Feature Group has finished creating before ingesting data
while feature_group.describe()['FeatureGroupStatus'] == 'Creating':
    time.sleep(5)

# Load feature data into a DataFrame
feature_data_df = pd.DataFrame([
    {'record_id': '1', 'feature1': 'value1', 'feature2': 'value2'},
    {'record_id': '2', 'feature1': 'value3', 'feature2': 'value4'},
    # Add more feature data as needed
])

# Ingest each record into the Feature Group
for record in feature_data_df.to_dict(orient='records'):
    feature_group.put_record(record=[
        FeatureValue(feature_name='record_id', value_as_string=record['record_id']),
        FeatureValue(feature_name='feature1', value_as_string=record['feature1']),
        FeatureValue(feature_name='feature2', value_as_string=record['feature2']),
        FeatureValue(feature_name='event_time', value_as_string=str(time.time())),
    ])

3. Verify Feature Group Creation: Check the SageMaker console or use the AWS SDK to verify.
4. Use Features in Model Training: Utilize the features in SageMaker model training jobs.

Note: Ensure your AWS IAM role has the necessary permissions. Customize the example based on your specific use
case and data schema.
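
For verification, a record can be read back from the online store with the boto3 Feature Store runtime client. This short sketch assumes the feature group name and record identifier from the example above:

import boto3

# Read a single record back from the online store
featurestore_runtime = boto3.client('sagemaker-featurestore-runtime')
response = featurestore_runtime.get_record(
    FeatureGroupName='my-feature-group',
    RecordIdentifierValueAsString='1',
)
print(response['Record'])
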
EXP 6:
Prepare Dataset using AWS Data Wrangler and Process It using scikit-learn:
1. Install AWS Data Wrangler:
pip install awswrangler
2. Install scikit-learn:
pip install scikit-learn
3. Load, Transform, and Preprocess Data, then Train and Evaluate a Model:

import pandas as pd
import awswrangler as wr
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Load data from S3 using AWS Data Wrangler
data = wr.s3.read_csv('s3://your-bucket/your-data.csv')

# Preprocess data (e.g., handle missing values, encode categorical variables)
# Assuming 'target_column' is the column you want to predict
X = data.drop(columns=['target_column'])
y = data['target_column']

# Split data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Build and train the machine learning model
model = RandomForestClassifier()
model.fit(X_train, y_train)

# Make predictions on the testing data
y_pred = model.predict(X_test)

# Evaluate model performance
accuracy = accuracy_score(y_test, y_pred)
print('Accuracy:', accuracy)
