Sindhu Internship Report
BACHELOR OF TECHNOLOGY
in
CSE (ARTIFICIAL INTELLIGENCE)
by
Surampalli Sindhu Priya (Regd. No.: 20F91A4356)
DEPARTMENT OF CSE (ARTIFICIAL INTELLIGENCE)
ACKNOWLEDGEMENT
First, I would like to thank the Andhra Pradesh State Skill Development Corporation (APSSDC)
for providing me with the transformative opportunity to participate in their Data Science
Internship Program.
It is with a great sense of pleasure and immense gratitude that I acknowledge the help of these individuals.
I would like to thank my Head of the Department, Dr. K. Subba Reddy, for his constructive criticism throughout my internship.
I would like to thank Mr. K. Mohan Rao, internship coordinator, Department of CSE (AI), for his support and advice in obtaining and completing the internship at the above organization. I am extremely grateful to my department staff members and friends who helped me in the successful completion of this internship.
DEPARTMENT OF CSE (ARTIFICIAL INTELLIGENCE)
CERTIFICATE
This is to certify that the “Internship report” submitted by Surampalli Sindhu Priya (Regd. No.: 20F91A4356) is the work done by her and submitted during the 2023-24 academic year, in partial fulfilment of the requirements for the award of the degree of BACHELOR OF TECHNOLOGY in CSE (ARTIFICIAL INTELLIGENCE).
CONTENTS
CHAPTER 1: EXECUTIVE SUMMARY
CHAPTER 2: INTERNSHIP INTRODUCTION
CHAPTER 3: WEEKLY REPORT
CHAPTER 4: CONCLUSION
CHAPTER 5: ACKNOWLEDGEMENTS
CHAPTER 6: CERTIFICATIONS
CHAPTER 1: EXECUTIVE SUMMARY
The APSSDC Data Science Course equips individuals with the skills and knowledge needed to
succeed in the rapidly growing field of data science. This comprehensive program covers the
essential concepts of data analysis, machine learning, and data visualization, using cutting-edge
tools and technologies like Python, SQL, Tableau, and Power BI.
Program Overview
• Foundational Skills: The course begins with a solid foundation in data analysis
fundamentals, including data cleaning, manipulation, and exploration using Excel and
Python.
• Programming Expertise: Participants gain proficiency in Python, the lingua franca of
data science, learning syntax, data structures, libraries like NumPy and pandas, and
object-oriented programming.
• Statistical Analysis: Statistical methods and tools like SciPy and statsmodels are
covered to equip learners with the ability to analyze and interpret data effectively.
• Machine Learning: The course delves into the core concepts of machine learning, covering supervised and unsupervised learning algorithms, as well as model training and evaluation using scikit-learn.
• Data Visualization: Tableau and Power BI are introduced to enable participants to create
compelling data visualizations and dashboards that communicate insights clearly.
• SQL Proficiency: The program equips learners with the ability to query and manipulate
data stored in relational databases using SQL.
• Project-Based Learning: Throughout the course, participants work on real-world data
analysis projects, applying their newly acquired skills to solve practical problems.
Key Takeaways
• Industry-ready skills: Graduates gain the skills and knowledge sought after by
employers in the data science domain.
• Hands-on experience: The program emphasizes hands-on learning through practical
exercises, projects, and case studies.
Investment in Your Future
The APSSDC Data Science Course is an investment in your future, equipping you with the
skills and knowledge needed to thrive in the data-driven world. With its comprehensive
curriculum, experienced instructors, and industry-relevant focus, this program is the perfect
launchpad for a successful career in data science.
CHAPTER 2: INTERNSHIP INTRODUCTION
In today's data-driven world, the ability to extract meaningful insights from vast amounts of
information has become a critical differentiator for individuals and organizations alike. Data
science empowers us to solve complex problems, optimize operations, and make informed
decisions based on robust analysis. This report provides a concise overview of data science,
highlighting its significance within the APSSDC ecosystem and the opportunities it presents
for aspiring professionals.
The exponential growth of data across various industries has generated an urgent need for
skilled professionals who can translate raw information into actionable knowledge. Data
science emerges as a powerful tool bridging the gap between data and decision-making,
enabling us to uncover hidden patterns, predict future trends, and optimize processes. From
personalized healthcare recommendations to targeted marketing campaigns and efficient
resource allocation, data science applications impact nearly every facet of our lives.
At its core, data science involves a structured approach to extracting actionable insights from
data. This iterative process encompasses data collection, cleaning, exploration, analysis,
modeling, and visualization. APSSDC, recognizing the immense potential of this field, has
curated a comprehensive data science program equipping individuals with the tools and
techniques needed to navigate this data-driven landscape.
The APSSDC data science program provides a robust curriculum covering foundational
concepts like statistics, programming (Python), machine learning algorithms, and data
visualization tools like Tableau and Power BI. Through a blend of theoretical knowledge and
practical exercises, participants gain proficiency in data manipulation, analysis, and model
building, preparing them for real-world challenges.
Equipping oneself with data science skills opens doors to diverse career paths across various
industries. Graduates can pursue roles like data analyst, business intelligence specialist,
machine learning engineer, or data scientist, commanding competitive salaries and contributing
significantly to data-driven decision-making within organizations. The APSSDC program acts
as a springboard, propelling individuals towards rewarding careers in this burgeoning field.
Data science stands poised to revolutionize how we understand and interact with the world
around us. The APSSDC data science program empowers individuals to become active
participants in this data-driven future, equipping them with the skills and knowledge needed to
extract insights, solve problems, and shape a better tomorrow. We encourage those seeking to
unlock their potential and contribute to the data-driven landscape to explore the transformative
opportunities offered by this program.
CHAPTER 3: WEEKLY REPORT
3.1: Weekly report on SQL
WEEK 1: SQL Basics
DAY ACTIVITY
Learning Outcome:
Detailed Report:
Relational Database Foundation:
Imagine a library of data organized in neatly arranged tables. Each table has rows (records)
containing individual entries and columns (fields) representing specific data points. These
tables often interconnect through shared fields, forming relationships that facilitate
comprehensive data analysis. SQL operates within this structured environment, allowing you
to efficiently retrieve, analyze, and update information.
SQL's Data Manipulation Language (DML) provides a powerful toolset for interacting with your data, as the short sketch after this list illustrates:
• SELECT: This cornerstone command retrieves specific data from one or more tables.
You can filter and sort results based on your needs, focusing on relevant portions of the
information landscape.
• INSERT: Introduce new data points into your existing tables, enriching the dataset with
fresh insights.
• UPDATE: Modify existing data efficiently, correcting inaccuracies or adapting
information to reflect current realities.
• DELETE: Remove unwanted data, decluttering your tables and ensuring accurate
representation of the current state.
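To make these commands concrete, here is a minimal sketch using Python's built-in sqlite3 module (the Python wrapper is my choice for illustration; the students table, names, and marks are hypothetical, not course data):

import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
cur = conn.cursor()
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, marks INTEGER)")

# INSERT: add new rows
cur.execute("INSERT INTO students (name, marks) VALUES (?, ?)", ("Sindhu", 88))
cur.execute("INSERT INTO students (name, marks) VALUES (?, ?)", ("Ravi", 72))

# UPDATE: modify existing data
cur.execute("UPDATE students SET marks = 75 WHERE name = 'Ravi'")

# DELETE: remove unwanted rows
cur.execute("DELETE FROM students WHERE marks < 60")

# SELECT: retrieve, filter, and sort
for row in cur.execute("SELECT name, marks FROM students ORDER BY marks DESC"):
    print(row)   # ('Sindhu', 88) then ('Ravi', 75)

conn.close()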
Beyond manipulating data, SQL's Data Definition Language (DDL) empowers you to define and refine the underlying data structure (see the sketch after this list):
• CREATE: Craft new tables, specifying the columns, data types, and constraints that will
govern the information housed within.
• ALTER: Adapting to changing needs, you can modify existing tables by adding,
removing, or adjusting columns to refine your data architecture.
• DROP: Remove tables that are no longer relevant or require permanent deletion,
streamlining your database structure.
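A similarly small sketch of the DDL statements, again through sqlite3 with an illustrative employees table (all names are hypothetical):

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# CREATE: define a new table with columns, data types, and constraints
cur.execute("CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, name TEXT NOT NULL, salary REAL)")

# ALTER: adjust the structure as needs change
cur.execute("ALTER TABLE employees ADD COLUMN department TEXT")

# DROP: remove a table that is no longer needed
cur.execute("DROP TABLE employees")

conn.close()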
Week 2: Advanced SQL concepts
Day Activity
Learning Outcome:
Day 1: Learned joins and relationships
Day 2: Learned subqueries and nested SELECTs
Day 3: Worked on indexes and query optimization
Day 4: Learned database transactions and ACID properties
Day 5: Learned about views and stored procedures
Detailed report:
One of SQL's key strengths lies in its ability to combine data from multiple tables based on shared fields. JOINs, the operators responsible for this merging, come in various flavors (a brief sketch follows this list):
• INNER JOIN: This workhorse combines rows where the specified condition exists in
both tables, creating a focused intersection of data points.
• LEFT JOIN: All rows from the left table are preserved, with matching rows from the
right table joining to provide a comprehensive picture.
• RIGHT JOIN: Similar to the left join, the right table takes center stage, with matching
rows from the left table completing the picture.
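The following sketch shows the first two join types through Python's sqlite3 module with two hypothetical tables, customers and orders (a RIGHT JOIN simply swaps the roles of the two tables; older SQLite releases do not support it directly, but most other databases do):

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, cust_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Sindhu'), (2, 'Ravi'), (3, 'Priya');
    INSERT INTO orders VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 900.0);
""")

# INNER JOIN: only customers who actually placed orders
cur.execute("SELECT c.name, o.amount FROM customers c INNER JOIN orders o ON o.cust_id = c.cust_id")
print(cur.fetchall())

# LEFT JOIN: every customer, with NULL (None) where no order exists
cur.execute("SELECT c.name, o.amount FROM customers c LEFT JOIN orders o ON o.cust_id = c.cust_id")
print(cur.fetchall())

conn.close()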
Subqueries:
Imagine nesting one SQL query within another. That's the magic of subqueries! They empower
you to retrieve results from a query and use them within another, enabling complex data
filtering and analysis. For instance, you could find all orders placed by customers who also
purchased a specific product.
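A minimal sketch of that idea, with a hypothetical orders table and product names chosen purely for illustration:

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, product TEXT);
    INSERT INTO orders VALUES
        (1, 100, 'Laptop'), (2, 100, 'Mouse'),
        (3, 200, 'Keyboard'), (4, 300, 'Laptop');
""")

# The outer query lists orders; the nested SELECT finds customers who bought a Laptop.
cur.execute("""
    SELECT order_id, cust_id, product
    FROM orders
    WHERE cust_id IN (SELECT cust_id FROM orders WHERE product = 'Laptop')
""")
print(cur.fetchall())   # every order from customers 100 and 300

conn.close()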
Stored Procedures:
Think of mini-programs within your database. Stored procedures encapsulate a series of SQL
statements, allowing you to modularize your code and execute complex tasks with a single call.
Imagine calculating complex sales commissions for each employee; a stored procedure can
handle it efficiently.
3.2: Weekly report on Excel
WEEK 3: Excel Basics
DAY ACTIVITY
Learning Outcome:
Day 1: Introduction to Microsoft Excel
Day 2: Learned the Excel interface and navigation
Day 3: Learned how to enter and format data in Excel
Day 4: Learned basic formulas and functions in Excel
Day 5: Learned how to sort and filter data
Detailed report:
1. Spreadsheet Structure:
Excel operates on a grid-like structure called a spreadsheet. Rows represent records, and
columns represent fields in those records. Understanding this structure enables you to organize
and access data effectively.
2. Data Entry and Navigation:
Navigate cells using arrow keys or mouse clicks. Enter data directly into cells, format it using drop-down menus, and apply styles to enhance clarity. Utilize autofill functionality for repetitive data entry.
3. Formulas and Functions:
Excel's strength lies in its powerful formulas and functions. Learn basic arithmetic formulas
like SUM, AVERAGE, and COUNT. Explore built-in functions like VLOOKUP for data
lookup, TEXT for data formatting, and CONCATENATE for joining text.
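As a rough Python-side illustration, the openpyxl library (an assumption here; the course itself types these formulas directly into Excel) can write the same formulas into a workbook:

from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws.append(["Month", "Sales"])
for month, sales in [("Jan", 120), ("Feb", 150), ("Mar", 90)]:
    ws.append([month, sales])

ws["A5"] = "Total"
ws["B5"] = "=SUM(B2:B4)"                   # basic arithmetic formula
ws["A6"] = "Average"
ws["B6"] = "=AVERAGE(B2:B4)"
ws["A7"] = "Label"
ws["B7"] = '=CONCATENATE(A2, " sales")'    # joining text

wb.save("sales_demo.xlsx")                 # hypothetical output file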
4. Sorting, Filtering, and Conditional Formatting:
Filter data based on specific criteria to focus on relevant subsets. Highlight important data points using conditional formatting based on values, thresholds, or duplicates.
WEEK 4: Advanced Excel Techniques
DAY ACTIVITY
Learning Outcome:
Day 1: Learned advanced formulas and functions in Excel
Day 2: Pivot tables and pivot charts
Day 3: Learned data validation and formatting
Day 4: Advanced data visualization with sparklines
Day 5: Learned some Excel tips and shortcuts
Detailed report:
Transform your data into visually compelling charts and graphs. Choose appropriate chart types
like bar charts, pie charts, and line graphs to showcase trends and relationships. Customize
charts with titles, labels, and legends for clear communication.
Charts & Data Visualization Beyond the Basics:
Excel charts offer immense customization potential. Explore error bars, waterfall charts, combo
charts, and sparklines to present your data in informative and visually impactful ways.
3.3: Weekly report on NumPy
WEEK 5: NumPy Basics
DAY ACTIVITY
Learning Outcome:
Detailed report:
NumPy, short for Numerical Python, is a fundamental library for scientific computing in
Python. It empowers you to work efficiently with multidimensional arrays of data, unlocking
a world of numerical manipulations and calculations with stunning speed and versatility. Let's
delve into the fascinating world of NumPy basics!
NumPy introduces the concept of n-dimensional arrays, going beyond the limitations of
traditional Python lists. These arrays offer a structured and efficient way to store and
manipulate numerical data like scientific measurements, financial data, or image pixels.
Imagine a table of numbers, where each cell holds a data point, and rows or columns represent
different dimensions.
Creating NumPy arrays is straightforward. You can specify the array dimensions and fill them with values, utilize built-in functions like arange or ones, or even import data from external sources like text files.
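A few of these creation routines in a short sketch (the values are arbitrary):

import numpy as np

a = np.array([[1, 2, 3], [4, 5, 6]])   # from a nested Python list (2 x 3)
b = np.arange(0, 10, 2)                # evenly spaced values: [0 2 4 6 8]
c = np.ones((3, 3))                    # 3 x 3 array filled with 1.0
# d = np.loadtxt("data.txt")           # import from an external text file (hypothetical path)
print(a.shape, b.shape, c.shape)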
Navigating within these multidimensional arrays requires powerful indexing and slicing
techniques. Access specific elements using single or multiple indices, or extract entire rows or
columns using slice notation. Think of it like pinpointing specific locations within your data
table.
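For example, pinpointing locations in a small 3 x 4 array:

import numpy as np

m = np.arange(12).reshape(3, 4)   # a 3 x 4 table of the values 0..11
print(m[1, 2])                    # single element: row 1, column 2 -> 6
print(m[0])                       # entire first row
print(m[:, 1])                    # entire second column
print(m[0:2, 1:3])                # a 2 x 2 sub-block via slice notation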
NumPy offers a vast library of built-in functions for advanced mathematical operations, statistical analysis, linear algebra, and much more. Explore functions like sin, cos, exp, sqrt, sum, mean, and linalg.solve to unlock a world of numerical possibilities.
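A brief sketch of a few of these functions (the numbers are arbitrary):

import numpy as np

x = np.array([0.0, np.pi / 2, np.pi])
print(np.sin(x), np.sqrt(np.array([4.0, 9.0])))   # element-wise mathematics
print(x.sum(), x.mean())                          # aggregation and statistics

# linalg.solve: solve the linear system A @ v = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
print(np.linalg.solve(A, b))                      # -> [2. 3.]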
WEEK 6: Advanced NumPy concepts
DAY ACTIVITY
Day 1: Broadcasting in NumPy
Learning Outcome:
Detailed report:
Broadcasting:
This powerful feature allows operations between arrays of different shapes. Imagine adding a
scalar (single value) to an entire matrix – NumPy automatically expands the scalar to match
the matrix dimensions, saving you tedious loops.
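A minimal sketch of both cases, with arbitrary values:

import numpy as np

matrix = np.arange(6).reshape(2, 3)   # shape (2, 3)
print(matrix + 10)                    # the scalar is broadcast to every element

row = np.array([1, 0, -1])            # shape (3,)
print(matrix * row)                   # the row is stretched across both matrix rows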
Fancy Indexing:
Go beyond traditional slicing and indexing! Fancy indexing utilizes advanced techniques like
boolean arrays and element-wise functions to select and manipulate specific data elements with
precision. Think of picking out specific elements from different rows and columns based on
conditions, creating a custom mask for complex data extraction.
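For instance, a boolean mask and paired index arrays (values arbitrary):

import numpy as np

data = np.arange(20).reshape(4, 5)

mask = data % 2 == 0                  # boolean array acting as a custom mask
print(data[mask])                     # all even elements

rows = np.array([0, 2, 3])
cols = np.array([1, 1, 4])
print(data[rows, cols])               # picks (0, 1), (2, 1) and (3, 4) in one go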
Universal Functions (Ufuncs):
These pre-defined functions operate element-wise on arrays, enabling efficient vectorized
operations. NumPy offers a vast ufunc library, from basic arithmetic to advanced mathematical
functions. Imagine calculating complex trigonometric operations on entire matrices in a single
line, significantly boosting performance compared to loops.
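For example, trigonometric ufuncs applied to a whole array at once:

import numpy as np

angles = np.linspace(0, np.pi, 5)
print(np.sin(angles) ** 2 + np.cos(angles) ** 2)   # all ones, computed without a loop

values = np.array([1.0, 4.0, 9.0, 16.0])
print(np.add(np.sqrt(values), 10))                 # ufuncs compose element-wise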
Custom Ufuncs:
Take control and define your own specialized functions! NumPy allows you to create custom
ufuncs tailored to your specific needs. Imagine implementing a custom distance metric or
numerical algorithm directly within NumPy, enhancing your code's efficiency and flexibility.
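A small sketch using np.frompyfunc; the soft_clip operation itself is hypothetical, chosen only to show the mechanism:

import numpy as np

def soft_clip(x, limit):
    # clamp a single value into the range [-limit, +limit]
    return x if abs(x) <= limit else limit * (1 if x > 0 else -1)

soft_clip_ufunc = np.frompyfunc(soft_clip, 2, 1)   # 2 inputs, 1 output
data = np.array([-5.0, -0.5, 0.3, 2.0, 9.0])
print(soft_clip_ufunc(data, 1.0).astype(float))    # [-1.  -0.5  0.3  1.   1. ]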
Views and Advanced Slicing:
NumPy allows you to create different views of the same underlying data without copying it,
saving memory and boosting performance. Explore advanced slicing techniques like
start:stop:step and boolean indexing to extract specific data subsets efficiently.
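A short sketch contrasting a slice view with a boolean-indexed copy:

import numpy as np

base = np.arange(10)
view = base[2:8:2]            # start:stop:step slice -> a view, no data copied
view[0] = 99                  # modifying the view also changes base[2]
print(base)                   # [ 0  1 99  3  4  5  6  7  8  9]

subset = base[base > 5]       # boolean indexing returns a copy, not a view
subset[0] = -1                # so the original array is unaffected
print(base[6])                # still 6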
NumPy forms the backbone of many popular scientific computing libraries like SciPy, Pandas,
and Matplotlib. Understanding its advanced features equips you to leverage these libraries more
effectively and tackle complex data analysis tasks with confidence.
3.4: Weekly report on Tableau
WEEK 7: Tableau Basics
DAY ACTIVITY
Learning Outcome:
Detailed report:
Tableau is a revolutionary data visualization tool that empowers you to transform raw data into
stunning and insightful visuals. Whether you're a seasoned data analyst or just starting your
journey, mastering Tableau basics can unlock a powerful new way to communicate and
understand your data.
Connecting to Data:
Tableau seamlessly connects to diverse data sources, from Excel spreadsheets and CSV files to
relational databases and even cloud platforms. Drag and drop your data file into Tableau, and
watch it magically come to life!
Once your data is loaded, you'll enter the Tableau workspace. Here, you have three main
components:
• Dimensions: These represent data categories like dates, names, or product types. Think
of them as the building blocks for your chart.
• Measures: These represent numerical values like sales figures, website visits, or
temperature readings. Imagine them as the quantities you want to visualize.
• Shelves: These are designated areas where you place dimensions and measures to build
your chart. Drag and drop elements onto shelves like "Rows," "Columns," "Marks,"
and "Color" to customize your visualization.
Marks are the basic building blocks of any Tableau visualization. They represent individual
data points and come in various shapes like circles, squares, or lines. Different combinations
of dimensions and measures on the shelves result in diverse chart types like bar charts, pie
charts, or scatterplots.
Tableau empowers you to explore your data interactively. Apply filters to focus on specific
subsets of data, highlighting trends and insights within your visualization. Click on data points,
drag sliders, and adjust parameters to see your chart dynamically update, revealing hidden
patterns and correlations.
WEEK 8: Advanced Tableau Techniques
DAY ACTIVITY
Day 5: Joins in Tableau
Learning Outcome:
Detailed Report:
Calculated Fields and LOD Expressions:
Move beyond basic aggregation! Craft bespoke calculated fields to manipulate data and derive
novel insights. Level of Detail (LOD) expressions unlock granular analysis, allowing you to
aggregate across various data hierarchies within a single visualization.
Beyond Bar Charts:
Embrace Advanced Visualization Techniques: Ditch the conventional and explore powerful
chart types like heatmaps, scatter plots, network graphs, and boxplots. Dive deep into chart
customization, utilizing advanced formatting, annotations, and interactive elements to captivate
your viewers.
Layout and Storytelling Take Center Stage: Build dashboards that guide users through your
data's narrative. Master layout techniques like grid layouts, floating objects, and custom
containers to curate a visually coherent flow. Implement filters, actions, and tooltips to spark
user exploration and deeper understanding.
Enhance user interactivity and refine your analysis with sets and parameters. Create dynamic
groups of data points based on selections or slider adjustments, empowering users to tailor the
dashboard to their specific interests.
Location data holds untapped potential. Utilize Tableau's mapping capabilities to plot points,
color-code regions, and overlay data layers, revealing spatial relationships and trends with
stunning visual clarity.
Embracing Advanced Analytics and Forecasting: Graduate from descriptive statistics! Explore
forecasting tools to predict future trends, build regression models to pinpoint key drivers, and
leverage clustering techniques to uncover hidden patterns within your data.
Performance Optimization:
Prioritize a Smooth Storytelling Journey: Ensure your dashboards deliver a swift and efficient
experience. Implement best practices for data preparation, utilize efficient chart types, and
optimize queries to avoid performance bottlenecks, keeping your audience engaged and
satisfied.
3.5: Weekly report on Power BI
WEEK 9: Power BI Basics
DAY ACTIVITY
Learning Outcomes:
Detailed Report:
In today's data-driven world, transforming raw information into actionable insights is crucial.
Power BI empowers you to do just that, providing a comprehensive platform for data
visualization, analysis, and storytelling. This guide unveils the fundamental concepts of Power
BI, laying the groundwork for your data exploration journey.
Connecting to Data:
Power BI offers exceptional flexibility in data acquisition. Connect to a vast array of sources,
from local Excel files and cloud databases to live feeds and web APIs. Master the intricacies
of each connection type, ensuring seamless data access and accurate analysis.
Data Modelling:
Shaping Your Analysis: Before diving into visuals, Power BI lets you shape your data. Learn
the art of building robust data models using relationships, measures, and calculations.
Transform raw data into meaningful insights, ready for intuitive exploration and visualization.
Power BI is a visual powerhouse. Explore its diverse array of chart types, from classic bar
charts and pie charts to advanced maps and custom visuals. Master formatting options,
interactive elements, and storytelling techniques to craft visually compelling dashboards that
captivate your audience.
Behind the visual splendor lies DAX, Power BI's powerful formula language. DAX empowers
you to create custom calculations, manipulate data dynamically, and extract hidden insights
from your datasets. Learn DAX fundamentals like SUMX, FILTER, and CALCULATE to
unlock the full potential of your data models.
Power BI thrives on collaboration. Seamlessly share your dashboards and reports with
colleagues or stakeholders, enabling interactive exploration and informed decision-making.
Learn about the Power BI Service, a cloud-based platform for secure sharing, version control, and collaborative analysis.
WEEK 10: Advanced Power BI Techniques
DAY ACTIVITY
Learning Outcome:
Detailed Report:
DAX Mastery:
• Calculated Columns & Measures: Go beyond basic measures! Craft intricate formulas
to manipulate data, derive nuanced insights, and tailor your analysis to specific needs.
Master powerful functions like CALCULATE, VAR, and SWITCH to dissect your data
at any level of granularity.
• Time Intelligence: Dive deep into analyzing temporal data. Utilize time intelligence functions such as DATESYTD, DATESQTD, and SAMEPERIODLASTYEAR to analyze trends, seasonality, and year-over-year comparisons.
• Advanced DAX Techniques: Explore advanced DAX concepts like iterator functions (for example SUMX over a filtered table), variables, and sophisticated filter-context manipulation to tackle complex data challenges.
Custom Visuals, Collaboration & Administration:
• Unleashing Creativity: Break free from standard visuals! Explore a world of custom
visualizations crafted by skilled developers. Discover interactive maps, dynamic
timelines, and network graphs to present your data in unique and impactful ways.
• Developing Your Own Visuals: Take the next step and create your own custom visuals!
Learn the basics of using Python or JavaScript to design interactive elements and tailor
the visualization experience to your specific needs.
• Finding and Embedding Visuals: Browse online repositories packed with diverse
custom visuals. Implement them in your reports seamlessly, ensuring compatibility and
enriching your data storytelling capabilities.
• Sharing & Collaboration: Share your dashboards and reports securely with colleagues
and stakeholders on Power BI Service. Utilize workspace features like shared editing,
annotations, and Q&A to foster collaborative data exploration and insights.
• Governance & Security: Implement robust security measures and granular access
controls within Power BI Service. Manage user roles, control data access, and ensure
the integrity of your reports.
• Automated Refresh & Monitoring: Schedule automatic data refresh for your reports,
ensuring always-up-to-date insights. Leverage Power BI Service monitoring tools to
track performance, identify issues, and maintain a consistently smooth experience for
users.
3.6: Weekly report on Python
WEEK 11: Python Basics
DAY ACTIVITY
Learning Outcome:
Detailed Report:
Syntax:
• Python values readability, using clear and concise syntax with English-like keywords.
• Indentation is crucial, defining code blocks rather than curly braces.
• Statements typically end with a newline, but semicolons can be used for multiple
statements on one line.
• Comments start with # and are ignored by the interpreter.
Variables:
• Use meaningful names like age, name, or total_sales.
• Examples: x = 10, message = "Hello, world!", is_active = True
Conditional Statements:
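A minimal illustration of if/elif/else (the marks value is arbitrary):

marks = 72
if marks >= 75:
    grade = "Distinction"
elif marks >= 50:
    grade = "Pass"
else:
    grade = "Fail"
print(grade)   # Pass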
Loops:
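A brief sketch of the two loop forms:

# for loop: iterate over a sequence
for tool in ["SQL", "Excel", "NumPy"]:
    print(tool)

# while loop: repeat until a condition fails
count = 3
while count > 0:
    count -= 1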
Functions:
def greet(name):
    print("Hello, " + name + "!")
Modules:
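For example, importing from the standard library:

import math                      # bring in a whole module
from statistics import mean      # import a single name

print(math.sqrt(16))             # 4.0
print(mean([10, 20, 30]))        # 20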
WEEK 12: Advanced Python concepts
DAY ACTIVITY
Learning Outcome:
Detailed Report:
File Handling:
• Interacting with files: Python uses built-in functions to open, read, write, and close files (a short sketch follows this list).
o open() function: Opens a file in a specified mode (e.g., 'r' for reading, 'w' for
writing, 'a' for appending).
o read(), readline(), readlines(): Read file contents.
o write(): Writes data to a file.
o close(): Closes the file.
• Common file operations:
o Reading text files, CSV files, JSON files, etc.
o Manipulating file contents.
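A short sketch of these operations, using a hypothetical notes.txt file:

f = open("notes.txt", "w")           # 'w' creates or overwrites the file
f.write("Week 12: file handling\n")
f.close()

with open("notes.txt", "r") as f:    # 'with' closes the file automatically
    for line in f.readlines():
        print(line.strip())

with open("notes.txt", "a") as f:    # 'a' appends without overwriting
    f.write("Exception handling next\n")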
Exception Handling:
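A minimal sketch of try/except/finally (the filename is hypothetical):

try:
    with open("missing.csv") as f:
        first_line = f.readline()
except FileNotFoundError as err:
    print("Could not read the file:", err)
except Exception as err:
    print("Unexpected error:", err)
finally:
    print("Done: this block always runs")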
o Matplotlib: Data visualization.
• NumPy, the foundation for numerical computing: Efficiently handles large arrays and matrices.
• Key features:
o N-dimensional array object (ndarray) for efficient data storage and operations.
CHAPTER 4: CONCLUSION
The APSSDC data science internship wasn't simply a series of coursework; it was a transformative catalyst, propelling me into the heart of a data-driven future. Over these weeks, I've meticulously carved out a sophisticated skillset, equipping myself to confidently navigate the dynamic landscape of data analysis and solution building.
At the foundation lies SQL, the lingua franca of databases. I've become adept at crafting
incisive queries, extracting valuable insights, and shaping data into readily interpretable
formats. This foundational skill empowers me to tap into the very source of information, ready
to decipher its hidden narratives and inform intelligent decision-making.
Next, I honed my dexterity in Excel, the ubiquitous canvas for data manipulation. From
wrangling unruly datasets to crafting compelling visualizations, my Excel mastery transforms
raw numbers into impactful stories. This ensures my findings are not just accurate but also
readily understood, bridging the gap between data and actionable insights.
With NumPy in my arsenal, I wield the power of quantitative precision. Large datasets and
complex calculations are no longer daunting adversaries. I can manipulate arrays with ease,
perform sophisticated analyses, and extract hidden patterns within intricate data structures. This
quantitative prowess forms the backbone of advanced data science methodologies, opening
doors to a vast realm of possibilities.
But data is more than just numbers; it's a tapestry woven with stories waiting to be told. Tableau and Power BI ignited my inner storyteller, equipping me with the tools to transform raw data
into captivating visuals. Through vibrant dashboards and interactive reports, I can now paint a
picture with my findings, captivating audiences and driving informed action.
Finally, I embraced the versatility of Python, the scripting language that empowers data science.
From automating tedious tasks to building sophisticated models, Python is my bridge to the
cutting edge of the field. With this skill in my hands, I can delve into machine learning,
automate data pipelines, and even craft bespoke solutions for complex problems.
However, the true value of my internship transcends the technical skills I acquired. It lies in the
knowledge I gained and the journey I embarked upon. I've learned to think critically, analyse
problems from diverse perspectives, and approach challenges with a data-driven lens. This
newfound perspective will remain invaluable throughout my career, guiding me as I navigate
the ever-evolving landscape of data.
This internship is not the culmination of my data science journey; it's a springboard propelling
me towards a future brimming with opportunities. From tackling real-world challenges in
diverse fields like healthcare and finance to collaborating on cutting-edge research projects, the
possibilities are limitless. Armed with my newly acquired knowledge and fueled by a newfound
passion, I am now poised to make my mark on the world, one insightful analysis at a time.
So, looking back, my APSSDC internship wasn't just about mastering software; it was about stepping into the role of a data-driven problem solver, a storyteller, and an innovator. I go forward armed with a sophisticated skillset and a thirst for knowledge, ready to pave the way for a future where data empowers change, insights drive progress, and I stand at the forefront of it all.
CHAPTER 5: ACKNOWLEDGEMENTS
With immense gratitude, I acknowledge the Andhra Pradesh State Skill Development
Corporation (APSSDC) for providing me with the transformative opportunity to participate in
their Data Science Internship Program. This intensive and well-structured program equipped
me with the foundational skills and knowledge to confidently navigate the dynamic landscape
of data science.
I extend my sincere appreciation to the dedicated instructors who expertly guided me through
the curriculum. Their deep expertise and passionate engagement fostered a stimulating learning
environment, propelling me to consistently push my boundaries and refine my understanding.
• Structured Query Language (SQL): I mastered the art of data retrieval and manipulation
within relational databases, establishing a crucial foundation for efficient data analysis.
• Microsoft Excel: My proficiency in data wrangling, cleaning, and visualization
flourished, empowering me to transform raw data into actionable insights for diverse
stakeholders.
• NumPy: I cultivated the ability to perform complex numerical computations and
manipulate multi-dimensional arrays with ease, unlocking the power of quantitative
analysis.
• Tableau and Power BI: I embraced the art of data storytelling, learning to craft
compelling and interactive visualizations that effectively communicate insights to both
technical and non-technical audiences.
• Python: I honed my skills in this versatile scripting language, opening doors to
advanced data analysis, automation, and model building.
Beyond technical skills, the internship fostered the development of critical soft skills. I gained
valuable experience in collaborative teamwork, critical thinking, creative problem-solving, and
clear communication of complex ideas. These transferable skills will undoubtedly prove
invaluable throughout my professional journey.
The APSSDC Data Science Internship Program provided me with an exceptional learning
experience and propelled me onto a promising career path in this rapidly evolving field. I am
confident that the knowledge and skills I acquired will serve as a springboard for success as I
contribute to data-driven solutions and meaningful insights in the years to come.
CHAPTER 6: CERTIFICATIONS