PHP
AN INTERNSHIP REPORT
Submitted by:
TAMIZHALAGAN.T (622122601096)
MASTER OF COMPUTER APPLICATIONS
PAAVAI ENGINEERING COLLEGE
(AUTONOMOUS)
ANNA UNIVERSITY
CHENNAI 600 025
STUDENT’S DECLARATION
I, TAMIZHALAGAN.P, A STUDENT OF THE COMPUTER SCIENCE PROGRAM, ROLL NO:
622122601096, OF THE DEPARTMENT OF MASTER OF COMPUTER APPLICATIONS, PAAVAI
ENGINEERING COLLEGE, DO HEREBY DECLARE THAT I HAVE COMPLETED THE
MANDATORY INTERNSHIP FROM 04-01-2024 TO 30-01-2024 AT CAPITAL SOFT TECHNOLOGIES
PRIVATE LIMITED - BANGALORE UNDER THE FACULTY GUIDESHIP OF KUMAR B.
ENDORSEMENTS
FACULTY GUIDE
HEAD OF DEPARTMENT
(SIGNATURE AND DATE)
PHP
AN INTERNSHIP REPORT
Submitted in accordance with the requirements for the degree of MASTER OF COMPUTER APPLICATIONS
NAME OF THE COLLEGE : PAAVAI ENGINEERING COLLEGE
DEPARTMENT : MASTER OF COMPUTER APPLICATIONS
NAME OF THE FACULTY GUIDE : KUMAR B
PERIOD OF INTERNSHIP : FROM 04-01-2024 TO 30-01-2024
NAME OF THE STUDENT : NAVALADIYAN.M
PROGRAM OF STUDY : MCA
REGISTRATION NUMBER : 622122601057
DATE OF SUBMISSION :
ACKNOWLEDGEMENTS
I wish to express my thanks to the various people who were responsible for the completion of this
internship.
I would like to express my sincere thanks to my guide KUMAR.B of CAPITAL SOFT
TECHNOLOGIES PRIVATE LIMITED - BANGALORE. I would also like to convey my special thanks to all the
people who worked along with me at CAPITAL SOFT TECHNOLOGIES PRIVATE LIMITED -
BANGALORE for their patience and continuous encouragement throughout the internship.
I express my deeply felt gratitude to DR. P. MUTHUSAMY, MCA., M.E., Ph.D., Head of the
Department of Computer Applications, whose valuable guidance and unstinting encouragement enabled
me to complete my internship successfully and on time.
I affectionately acknowledge the encouragement received from my friends and from all those who were
involved in giving valuable suggestions and clarifying my doubts, which really helped me in the successful
completion of my internship.
DAY 1: 04/01/2024
ML Introduction
The first day of the internship laid the groundwork for participants by providing an introduction
to Machine Learning (ML). We began by understanding the fundamental concepts of ML, including
supervised learning, unsupervised learning, and reinforcement learning. To illustrate these concepts, we
explored a simple supervised learning example using Python.
Example Coding:
# Simple linear regression example
import numpy as np
import matplotlib.pyplot as plt
# Generate random data
np.random.seed(0)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)
# Plot the data
plt.scatter(X, y)
plt.xlabel('X')
plt.ylabel('y')
plt.title('Linear Regression Example')
plt.show()
In this example, we generated random data points representing a linear relationship between X and
y. We visualized the data using a scatter plot, setting the stage for further exploration of supervised
learning algorithms.
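As a brief extension of this example (a sketch, not part of the original exercise), the line that best fits the generated data can be estimated with NumPy's least-squares polynomial fit; the recovered intercept and slope should come out close to the true values of 4 and 3 used to generate the data.
import numpy as np
import matplotlib.pyplot as plt

# Same synthetic data as above
np.random.seed(0)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)

# Fit a degree-1 polynomial (a straight line) with least squares
slope, intercept = np.polyfit(X.ravel(), y.ravel(), 1)
print("Estimated intercept:", intercept, "slope:", slope)

# Plot the data together with the fitted line
plt.scatter(X, y)
plt.plot(X, intercept + slope * X, color='red')
plt.xlabel('X')
plt.ylabel('y')
plt.title('Fitted Regression Line')
plt.show()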
DAY 2: 05/01/2024
Python Basics
On Day 2, participants delved into the basics of Python programming, a fundamental tool in the arsenal
of a data scientist. We covered topics such as variables, data types, control structures, functions, and
modules. To reinforce these concepts, participants worked on coding exercises, including a simple
program to calculate the factorial of a number.
Example Coding:
# Function to calculate factorial recursively
def factorial(n):
    if n == 0:
        return 1
    else:
        return n * factorial(n - 1)

# Test the function
num = 5
print("Factorial of", num, "is", factorial(num))
This exercise helped participants gain practical experience in writing Python code and understanding
basic programming constructs.
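Since the day also touched on control structures and modules, the small cross-check below (a sketch using only the standard library) computes the same value with a loop and with the built-in math module.
import math

# Iterative factorial using a simple loop (a control structure covered on Day 2)
def factorial_iterative(n):
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

num = 5
print("Iterative result:", factorial_iterative(num))
print("math.factorial result:", math.factorial(num))  # standard-library module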
DAY 3: 08/01/2024
Data Preprocessing
Data preprocessing is a crucial step in any machine learning project, as it involves cleaning,
transforming, and preparing data for analysis. On Day 3, we learned various data preprocessing techniques,
including handling missing values, encoding categorical variables, and scaling features. To illustrate these
techniques, we worked on a data preprocessing example using the pandas library in Python.
Example Coding:
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Create a sample dataset
data = {'Name': ['John', 'Anna', 'Peter', 'Linda'],
        'Age': [25, 30, None, 35],
        'Gender': ['M', 'F', 'M', 'F']}
df = pd.DataFrame(data)

# Handling missing values: replace the missing age with the mean age
df['Age'] = df['Age'].fillna(df['Age'].mean())

# Encoding categorical variables: one-hot encode the Gender column
df = pd.get_dummies(df, columns=['Gender'])

# Scaling features: standardize the Age column
scaler = StandardScaler()
df[['Age']] = scaler.fit_transform(df[['Age']])

print(df)
In this example, we handled missing values by replacing them with the mean, encoded categorical
variables using one-hot encoding, and scaled features using standardization.
DAY – 4 & 5 (09/01/2024 & 10/01/2024)
Supervised Learning
Supervised learning is a type of machine learning where the model is trained on labeled data to
make predictions or decisions. On Day 4, we delved into supervised learning algorithms, including linear
regression and decision trees. To apply these algorithms in practice, we worked on a supervised learning
example using the scikit-learn library in Python.
Example Coding:
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Load the Iris dataset
iris = load_iris()
X, y = iris.data, iris.target

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train a logistic regression model (max_iter raised so the solver converges)
model = LogisticRegression(max_iter=200)
model.fit(X_train, y_train)

# Make predictions on the test set
y_pred = model.predict(X_test)

# Calculate accuracy
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
In this example, we trained a logistic regression model on the Iris dataset and evaluated its
performance using accuracy as the evaluation metric.
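Since the day also covered decision trees, the short sketch below (assuming the same Iris split as above, and added here for illustration) swaps in scikit-learn's DecisionTreeClassifier so its accuracy can be compared with the logistic regression model.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Load and split the Iris dataset as before
iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.2, random_state=42)

# Train a decision tree classifier with a limited depth
tree = DecisionTreeClassifier(max_depth=3, random_state=42)
tree.fit(X_train, y_train)

# Evaluate on the held-out test set
y_pred = tree.predict(X_test)
print("Decision tree accuracy:", accuracy_score(y_test, y_pred))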
Unsupervised Learning
Unsupervised learning is a type of machine learning where the model is trained on unlabeled
data to uncover hidden patterns or structures. On Day 5, we explored unsupervised learning algorithms,
including K-means clustering and hierarchical clustering. To apply these algorithms in practice, we
worked on an unsupervised learning example using the scikit-learn library in Python.
Example Coding:
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans
import matplotlib.pyplot as plt
# Load the Iris dataset
iris = load_iris()
X = iris.data
# Perform K-means clustering
kmeans = KMeans(n_clusters=3, random_state=42)
clusters = kmeans.fit_predict(X)
# Visualize the clusters
plt.scatter(X[:, 0], X[:, 1], c=clusters, cmap='viridis')
plt.xlabel('Sepal Length')
plt.ylabel('Sepal Width')
plt.title('K-means Clustering')
plt.show()
In this example, we performed K-means clustering on the Iris dataset and visualized the clusters
in a scatter plot.
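The day's other topic, hierarchical clustering, can be sketched in much the same way; the example below (an illustrative addition, not from the original exercise) applies scikit-learn's AgglomerativeClustering to the same Iris features.
from sklearn.datasets import load_iris
from sklearn.cluster import AgglomerativeClustering
import matplotlib.pyplot as plt

# Load the Iris features
X = load_iris().data

# Agglomerative (bottom-up) hierarchical clustering into 3 clusters
agg = AgglomerativeClustering(n_clusters=3)
clusters = agg.fit_predict(X)

# Visualize the clusters on the first two features
plt.scatter(X[:, 0], X[:, 1], c=clusters, cmap='viridis')
plt.xlabel('Sepal Length')
plt.ylabel('Sepal Width')
plt.title('Hierarchical Clustering')
plt.show()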
DAY – 6 11/01/2024
Model Evaluation
Model evaluation is a critical aspect of machine learning model development, as it allows us to
assess the performance and generalization ability of the trained models. On Day 6, we learned various
model evaluation metrics and techniques, including accuracy, precision, recall, and F1 score. To apply
these metrics in practice, we worked on a model evaluation example using the scikit-learn library in
Python.
Example Coding:
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Load the Iris dataset
iris = load_iris()
X, y = iris.data, iris.target

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train a logistic regression model (max_iter raised so the solver converges)
model = LogisticRegression(max_iter=200)
model.fit(X_train, y_train)

# Make predictions on the test set
y_pred = model.predict(X_test)

# Generate a classification report with per-class precision, recall and F1 score
report = classification_report(y_test, y_pred)
print(report)
In this example, we trained a logistic regression model on the Iris dataset and generated a
classification report to evaluate its performance using precision, recall, and F1 score.
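As a companion to the classification report, the individual metrics and the confusion matrix can also be computed directly; the short sketch below assumes the same y_test and y_pred as in the example above.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score, confusion_matrix

# y_test and y_pred come from the logistic regression example above
print("Accuracy :", accuracy_score(y_test, y_pred))
print("Precision:", precision_score(y_test, y_pred, average='macro'))
print("Recall   :", recall_score(y_test, y_pred, average='macro'))
print("F1 score :", f1_score(y_test, y_pred, average='macro'))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))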
DAY – 7 12/01/2024
Neural Networks Intro
Neural networks are a class of machine learning models inspired by the structure and function of the
human brain. On Day 7, we were introduced to the basics of artificial neural networks (ANNs), including
the architecture of a single-layer perceptron and multi-layer perceptron. To understand the training
process of neural networks, we implemented a simple feedforward neural network using the TensorFlow
library in Python.
Example Coding:
import tensorflow as tf
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load the Iris dataset and split it, as in the earlier examples
iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.2, random_state=42)

# Define the model architecture: one hidden layer, softmax output over 3 classes
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
model.fit(X_train, y_train, epochs=50, verbose=0)

# Evaluate the model
loss, accuracy = model.evaluate(X_test, y_test)
print("Accuracy:", accuracy)
In this example, we defined a feedforward neural network with one hidden layer using TensorFlow and
trained it on the Iris dataset.
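For contrast with the multi-layer network, a single-layer model (no hidden layer) can be written in the same style; this is an illustrative sketch assuming the same Iris split defined above.
import tensorflow as tf

# A single-layer model: the inputs feed directly into the softmax output layer
single_layer = tf.keras.Sequential([
    tf.keras.layers.Dense(3, activation='softmax', input_shape=(4,))
])

single_layer.compile(optimizer='adam',
                     loss='sparse_categorical_crossentropy',
                     metrics=['accuracy'])

# Train and evaluate on the same Iris split used above
single_layer.fit(X_train, y_train, epochs=50, verbose=0)
loss, accuracy = single_layer.evaluate(X_test, y_test)
print("Single-layer accuracy:", accuracy)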
DAY – 8 18/01/2024
Deep Learning Basics
Deep learning is a subfield of machine learning that focuses on neural networks with multiple layers
(deep neural networks). On Day 8, we delved into the basics of deep learning, covering topics such as
deep feedforward networks, backpropagation algorithm, and gradient descent optimization. To apply
these concepts in practice, we implemented a deep neural network for image classification using the
Keras library in Python.
Example Coding:
import tensorflow as tf

# Load the MNIST dataset
mnist = tf.keras.datasets.mnist
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Preprocess the data: scale pixel values to the range [0, 1]
X_train, X_test = X_train / 255.0, X_test / 255.0

# Define the model architecture
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
model.fit(X_train, y_train, epochs=5)

# Evaluate the model
loss, accuracy = model.evaluate(X_test, y_test)
print("Accuracy:", accuracy)
In this example, we defined a deep neural network with multiple hidden layers and trained it on the
MNIST dataset for handwritten digit classification.
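To make the backpropagation and gradient descent steps that Keras performs inside model.fit more concrete, the sketch below (an illustrative addition, assuming the same model and MNIST arrays defined above) writes out a single manual training step with tf.GradientTape.
import tensorflow as tf

# One manual gradient descent step on a single batch of MNIST images;
# `model`, `X_train` and `y_train` are assumed to be the ones defined above
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

x_batch = X_train[:32]
y_batch = y_train[:32]

with tf.GradientTape() as tape:
    predictions = model(x_batch, training=True)   # forward pass
    loss = loss_fn(y_batch, predictions)          # compute the loss

# Backpropagation: gradients of the loss with respect to every trainable weight
gradients = tape.gradient(loss, model.trainable_variables)

# Gradient descent: move each weight a small step against its gradient
optimizer.apply_gradients(zip(gradients, model.trainable_variables))
print("Batch loss:", float(loss))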
DAY – 9 19/01/2024
CNNs Fundamentals
Convolutional Neural Networks (CNNs) are a type of deep neural network particularly well-suited for
image recognition and computer vision tasks. On Day 9, we explored the fundamentals of CNNs, including
convolutional layers, pooling layers, and fully connected layers. To understand the architecture of CNNs,
we implemented a simple CNN for image classification using the TensorFlow library in Python.
Example Coding:
import tensorflow as tf

# Reuse the scaled MNIST arrays from the previous day's example and add a
# channel dimension so the images match the Conv2D input shape (28, 28, 1)
X_train = X_train.reshape(-1, 28, 28, 1)
X_test = X_test.reshape(-1, 28, 28, 1)

# Define the model architecture: stacked convolution and pooling layers
# followed by fully connected layers
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
model.fit(X_train, y_train, epochs=5)

# Evaluate the model
loss, accuracy = model.evaluate(X_test, y_test)
print("Accuracy:", accuracy)
In this example, we defined a CNN with multiple convolutional and pooling layers and trained it on the
MNIST dataset for handwritten digit classification.
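A quick way to see how the convolution and pooling layers shrink the spatial dimensions while the number of filters grows is to print the model summary for the CNN defined above.
# Inspect the CNN defined above: the summary shows how each Conv2D/MaxPooling2D
# layer changes the output shape (28x28 -> 26x26 -> 13x13 -> ...) and how many
# trainable parameters each layer contributes
model.summary()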
DAY – 10 22/01/2024
RNNs Fundamentals
Recurrent Neural Networks (RNNs) are a type of deep neural network designed to process sequential data,
making them ideal for tasks such as natural language processing (NLP) and time series analysis. On Day
10, we delved into the fundamentals of RNNs, including recurrent layers, long short-term memory
(LSTM) cells, and gated recurrent units (GRUs). To understand the architecture of RNNs, we implemented
a simple RNN for sequence prediction using the TensorFlow library in Python.
Example Coding:
import tensorflow as tf

# X_train/y_train and X_test/y_test are assumed to be sequence arrays of shape
# (samples, timesteps, 1) with matching targets, prepared beforehand

# Define the model architecture: one recurrent layer followed by a dense output
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(64, input_shape=(None, 1)),
    tf.keras.layers.Dense(1)
])

# Compile the model with mean squared error loss for regression on sequences
model.compile(optimizer='adam', loss='mse')

# Train the model
model.fit(X_train, y_train, epochs=10)

# Evaluate the model
loss = model.evaluate(X_test, y_test)
print("Loss:", loss)
In this example, we defined a simple RNN with one recurrent layer and trained it on a sequence
prediction task.
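Since the day also introduced LSTM cells and GRUs, either can be swapped in for the SimpleRNN layer with no other changes; the sketch below (assuming the same sequence data as above) shows the LSTM variant.
import tensorflow as tf

# Same structure as before, with an LSTM layer in place of the SimpleRNN;
# a GRU variant would use tf.keras.layers.GRU(64, ...) instead
lstm_model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(None, 1)),
    tf.keras.layers.Dense(1)
])

lstm_model.compile(optimizer='adam', loss='mse')
lstm_model.fit(X_train, y_train, epochs=10)
print("LSTM loss:", lstm_model.evaluate(X_test, y_test))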
DAY – 11 23/01/2024
Transfer Learning
Transfer learning is a machine learning technique where a model trained on one task is re-purposed or fine-
tuned for a different task. On Day 11, we explored the concept of transfer learning and its practical
applications in leveraging pre-trained models for new tasks. To apply transfer learning in practice, we fine-
tuned a pre-trained convolutional neural network for image classification using the TensorFlow library in
Python.
Example Coding:
import tensorflow as tf

# train_images/train_labels, val_images/val_labels and test_images/test_labels
# are assumed to be a prepared 10-class image dataset resized to 224x224

# Load the pre-trained model without its classification head
base_model = tf.keras.applications.MobileNetV2(weights='imagenet',
                                               include_top=False,
                                               input_shape=(224, 224, 3))

# Freeze the base model layers so only the new head is trained
base_model.trainable = False

# Add a classification head on top of the frozen base
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
model.fit(train_images, train_labels, epochs=5, validation_data=(val_images, val_labels))

# Evaluate the model
loss, accuracy = model.evaluate(test_images, test_labels)
print("Accuracy:", accuracy)
In this example, we fine-tuned a pre-trained MobileNetV2 model on a new dataset for image classification.
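After the new head has been trained, the usual second stage of fine-tuning is to unfreeze some of the top layers of the base model and continue training with a much lower learning rate; a sketch of that stage, using the same model and assumed data as above, is shown below.
import tensorflow as tf

# Unfreeze the base model, but keep all except its last 20 layers frozen
base_model.trainable = True
for layer in base_model.layers[:-20]:
    layer.trainable = False

# Re-compile with a much lower learning rate so the pre-trained weights
# are only adjusted slightly
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Continue training for a few more epochs
model.fit(train_images, train_labels, epochs=3,
          validation_data=(val_images, val_labels))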
DAY – 12 24/01/2024
Model Deployment
Model deployment is the process of integrating machine learning models into production environments to
make predictions or decisions in real-time. On Day 12, we learned about various model deployment
strategies and techniques, including cloud-based deployment and containerization with Docker. To
understand the deployment process, we deployed a machine learning model using a Flask web application
in Python.
Example Coding (Flask App):
from flask import Flask, request, jsonify
import joblib

app = Flask(__name__)

# Load the trained model (assumed to have been saved earlier as model.pkl)
model = joblib.load('model.pkl')

@app.route('/predict', methods=['POST'])
def predict():
    # Expect a JSON body of the form {"features": [[...], [...]]},
    # i.e. a 2-D list with one row of feature values per sample
    data = request.get_json()
    features = data['features']
    prediction = model.predict(features)
    return jsonify({'prediction': prediction.tolist()})

if __name__ == '__main__':
    app.run(debug=True)
In this example, we created a simple Flask web application to serve predictions from a trained machine
learning model.
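To show how such an endpoint would be called, the short client sketch below (assuming the Flask app is running locally on its default port 5000, and using a purely hypothetical feature layout) posts a JSON request with the requests library.
import requests

# Example request: one sample with four feature values (illustrative only;
# the real feature layout depends on how model.pkl was trained)
payload = {'features': [[5.1, 3.5, 1.4, 0.2]]}

response = requests.post('http://127.0.0.1:5000/predict', json=payload)
print(response.json())   # {'prediction': [...]}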
DAY – 13 25/01/2024
Ethics in ML
Ethical considerations are paramount in the development and deployment of machine learning
systems, as they have the potential to impact individuals and society at large. On Day 13, we explored
ethical issues such as bias, fairness, transparency, and accountability in machine learning. To understand
the ethical implications of machine learning algorithms, we discussed case studies and real-world
examples.
DAY – 14 29/01/2024
Hands-on Project
The penultimate day of the internship was dedicated to a hands-on project where participants applied the
knowledge and skills acquired throughout the internship to solve a real-world problem. Participants
worked in teams to identify a machine learning problem, collect and preprocess data, design and
implement machine learning models, and evaluate model performance. Mentors provided guidance and
feedback throughout the project, ensuring that participants applied the best practices and techniques
learned during the internship.
DAY – 15 30/01/2024
Presentation Skills
The final day of the internship focused on honing presentation skills, an essential aspect of
communicating project findings and insights to stakeholders effectively. Participants learned how to
structure presentations, craft compelling narratives, create visually appealing slides, and deliver engaging
talks. Practical exercises and mock presentations helped participants refine their communication skills and
build confidence in presenting their work.
EVALUATION BY THE SUPERVISOR OF THE INTERN ORGANISATION
Student Name : NAVALADIYAN.M
Roll Number : 622122601057
Term of Internship : From: 04-01-2024 To: 30-01-2024
Date of evaluation :
Organization Name : CAPITAL SOFT TECHNOLOGIES PRIVATE LIMITED - BANGALORE
Name & Mobile Number of the Supervisor
Supervisor