
INTERNSHIP REPORT

A report submitted in partial fulfilment of the requirements for the award of the

Degree of

BACHELOR OF TECHNOLOGY
in
CSE (ARTIFICIAL INTELLIGENCE)

by

Name: S. Sindhu priya

Regd. No: 20F91A4356

DEPARTMENT OF

CSE (ARTIFICIAL INTELLIGENCE)


PRAKASAM ENGINEERING COLLEGE
Approved by AICTE, affiliated to JNTU Kakinada
O.V. Road, Kandukur,
Prakasam Dist., Andhra Pradesh - 523105

ACKNOWLEDGEMENT

First, I would like to thank the Andhra Pradesh State Skill Development Corporation (APSSDC)
for providing me with the transformative opportunity to participate in their Data Science
Internship Program.

It is indeed with a great sense of pleasure and immense gratitude that I acknowledge the help
of the following individuals.

I am highly indebted to Secretary & Correspondent Dr. K. Ramaiah, Director Dr. K. Vijay
Srinivas, and Principal Dr. CH. Ravi Kumar for the facilities provided to accomplish this
internship.

I would like to thank my Head of the Department, Dr. K. SubbaReddy, for his constructive
criticism throughout my internship.

I would like to thank Mr. K. MohanRao, Internship Coordinator, Department of CSE(AI), for his
support and advice in obtaining and completing the internship in the above-mentioned
organization. I am extremely grateful to my department staff members and friends who helped
me in the successful completion of this internship.

DEPARTMENT OF

CSE (ARTIFICIAL INTELLIGENCE)

PRAKASAM ENGINEERING COLLEGE: KANDUKUR

CERTIFICATE
This is to certify that the “Internship Report” submitted by Surampalli Sindhu priya (Regd.
No.: 20F91A4356) is the work done by her and submitted during the 2023-24 academic year, in
partial fulfilment of the requirements for the award of the degree of BACHELOR OF
TECHNOLOGY in CSE (ARTIFICIAL INTELLIGENCE).

Department Internship Coordinator HEAD OF THE DEPARTMENT

NAME: Department of CSE(AI)

SIGN OF EXTERNAL EXAMINER

CONTENTS

CHAPTER 1: EXECUTIVE SUMMARY

CHAPTER 2: INTERNSHIP INTRODUCTION

CHAPTER 3: WEEKLY REPORT

CHAPTER 4: CONCLUSION

CHAPTER 5: ACKNOWLEDGEMENTS

CHAPTER 6: CERTIFICATIONS

CHAPTER 1: EXECUTIVE SUMMARY

The APSSDC Data Science Course equips individuals with the skills and knowledge needed to
succeed in the rapidly growing field of data science. This comprehensive program covers the
essential concepts of data analysis, machine learning, and data visualization, using cutting-edge
tools and technologies like Python, SQL, Tableau, and Power BI.

Program Overview

• Foundational Skills: The course begins with a solid foundation in data analysis
fundamentals, including data cleaning, manipulation, and exploration using Excel and
Python.
• Programming Expertise: Participants gain proficiency in Python, the lingua franca of
data science, learning syntax, data structures, libraries like NumPy and pandas, and
object-oriented programming.
• Statistical Analysis: Statistical methods and tools like SciPy and statsmodels are
covered to equip learners with the ability to analyze and interpret data effectively.
• Machine Learning: The course delves into the core concepts of machine learning,
covering supervised and unsupervised learning algorithms, and model training and
evaluation using scikit-learn.
• Data Visualization: Tableau and Power BI are introduced to enable participants to create
compelling data visualizations and dashboards that communicate insights clearly.
• SQL Proficiency: The program equips learners with the ability to query and manipulate
data stored in relational databases using SQL.
• Project-Based Learning: Throughout the course, participants work on real-world data
analysis projects, applying their newly acquired skills to solve practical problems.

Key Takeaways

• Industry-ready skills: Graduates gain the skills and knowledge sought after by
employers in the data science domain.
• Hands-on experience: The program emphasizes hands-on learning through practical
exercises, projects, and case studies.

• Expert guidance: Experienced instructors and mentors provide personalized guidance
and support throughout the course.
• Career advancement: The APSSDC Data Science Course prepares individuals for
successful careers in data science, data analysis, and related fields.

Investment in Your Future

The APSSDC Data Science Course is an investment in your future, equipping you with the
skills and knowledge needed to thrive in the data-driven world. With its comprehensive
curriculum, experienced instructors, and industry-relevant focus, this program is the perfect
launchpad for a successful career in data science.

CHAPTER 2: INTERNSHIP INTRODUCTION

In today's data-driven world, the ability to extract meaningful insights from vast amounts of
information has become a critical differentiator for individuals and organizations alike. Data
science empowers us to solve complex problems, optimize operations, and make informed
decisions based on robust analysis. This report provides a concise overview of data science,
highlighting its significance within the APSSDC ecosystem and the opportunities it presents
for aspiring professionals.

The Rise of Data Science:

The exponential growth of data across various industries has generated an urgent need for
skilled professionals who can translate raw information into actionable knowledge. Data
science emerges as a powerful tool bridging the gap between data and decision-making,
enabling us to uncover hidden patterns, predict future trends, and optimize processes. From
personalized healthcare recommendations to targeted marketing campaigns and efficient
resource allocation, data science applications impact nearly every facet of our lives.

Demystifying the Data Science Landscape:

At its core, data science involves a structured approach to extracting actionable insights from
data. This iterative process encompasses data collection, cleaning, exploration, analysis,
modeling, and visualization. APSSDC, recognizing the immense potential of this field, has
curated a comprehensive data science program equipping individuals with the tools and
techniques needed to navigate this data-driven landscape.

APSSDC: Empowering Data-Driven Professionals:

The APSSDC data science program provides a robust curriculum covering foundational
concepts like statistics, programming (Python), machine learning algorithms, and data
visualization tools like Tableau and Power BI. Through a blend of theoretical knowledge and
practical exercises, participants gain proficiency in data manipulation, analysis, and model
building, preparing them for real-world challenges.

Unlocking a World of Possibilities:

Equipping oneself with data science skills opens doors to diverse career paths across various
industries. Graduates can pursue roles like data analyst, business intelligence specialist,
machine learning engineer, or data scientist, commanding competitive salaries and contributing
significantly to data-driven decision-making within organizations. The APSSDC program acts
as a springboard, propelling individuals towards rewarding careers in this burgeoning field.

Data science stands poised to revolutionize how we understand and interact with the world
around us. The APSSDC data science program empowers individuals to become active
participants in this data-driven future, equipping them with the skills and knowledge needed to
extract insights, solve problems, and shape a better tomorrow. We encourage those seeking to
unlock their potential and contribute to the data-driven landscape to explore the transformative
opportunities offered by this program.

CHAPTER 3: WEEKLY REPORT

3.1: WEEKLY REPORT ON SQL

Week 1: Fundamentals of SQL

DAY ACTIVITY

1 Introduction to Relational Databases

2 Basic SQL syntax

3 Creating and Modifying Tables

4 Retrieving Data with SELECT

5 Filtering and Sorting Data

Learning Outcome:

Day 1: Understood relational databases

Day 2: Learned the basic syntax of SQL

Day 3: Created and modified tables

Day 4: Retrieved data using the SELECT command

Day 5: Filtered and sorted data using SQL commands

Detailed Report:
Relational Database Foundation:

Imagine a library of data organized in neatly arranged tables. Each table has rows (records)
containing individual entries and columns (fields) representing specific data points. These
tables often interconnect through shared fields, forming relationships that facilitate
comprehensive data analysis. SQL operates within this structured environment, allowing you
to efficiently retrieve, analyze, and update information.

DML: The Language of Data Manipulation:

SQL's Data Manipulation Language (DML) provides a powerful toolset for interacting with
your data:

• SELECT: This cornerstone command retrieves specific data from one or more tables.
You can filter and sort results based on your needs, focusing on relevant portions of the
information landscape.
• INSERT: Introduce new data points into your existing tables, enriching the dataset with
fresh insights.
• UPDATE: Modify existing data efficiently, correcting inaccuracies or adapting
information to reflect current realities.
• DELETE: Remove unwanted data, decluttering your tables and ensuring accurate
representation of the current state.
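
As a minimal, hedged sketch using Python's built-in sqlite3 module (the employees table here is hypothetical), the four DML commands look like this:

Python
import sqlite3

# In-memory database with a small hypothetical employees table
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")

# INSERT: introduce new data points
cur.execute("INSERT INTO employees (name, salary) VALUES (?, ?)", ("Alice", 50000))
cur.execute("INSERT INTO employees (name, salary) VALUES (?, ?)", ("Bob", 45000))

# UPDATE: modify existing data
cur.execute("UPDATE employees SET salary = 52000 WHERE name = 'Alice'")

# SELECT: retrieve, filter, and sort
cur.execute("SELECT name, salary FROM employees WHERE salary > 40000 ORDER BY salary DESC")
print(cur.fetchall())  # [('Alice', 52000.0), ('Bob', 45000.0)]

# DELETE: remove unwanted rows
cur.execute("DELETE FROM employees WHERE name = 'Bob'")
conn.commit()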

DDL: Shaping the Data Landscape:

Beyond manipulating data, SQL's Data Definition Language (DDL) empowers you to define
and refine the underlying data structure:

• CREATE: Craft new tables, specifying the columns, data types, and constraints that will
govern the information housed within.
• ALTER: Adapting to changing needs, you can modify existing tables by adding,
removing, or adjusting columns to refine your data architecture.
• DROP: Remove tables that are no longer relevant or require permanent deletion,
streamlining your database structure.
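
A matching sketch of the DDL statements, again via sqlite3 (the products table is hypothetical; note that SQLite's ALTER TABLE supports only a subset of alterations, such as adding or renaming columns):

Python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# CREATE: define a new table with columns, data types, and constraints
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT NOT NULL, price REAL)")

# ALTER: refine the structure, here by adding a column
cur.execute("ALTER TABLE products ADD COLUMN category TEXT")

# DROP: permanently remove a table that is no longer needed
cur.execute("DROP TABLE products")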

Week 2: Advanced SQL concepts

Day Activity

1 Joins and Relationships

2 Subqueries and Nested SELECTs

3 Indexes and Optimization

4 Database Transactions and ACID properties

5 Views and Stored procedures

Learning Outcome:
Day 1: Learned joins and relationships
Day 2: Learned about subqueries and nested SELECTs
Day 3: Worked on indexes and optimization
Day 4: Learned database transactions and ACID properties
Day 5: Learned about views and stored procedures

Detailed report:

Joining the Pieces: Unleashing Data Relationships:

One of SQL's key strengths lies in its ability to combine data from multiple tables based on
shared fields. JOINs, the operators responsible for this merging, come in various flavors:

• INNER JOIN: This workhorse combines rows where the specified condition exists in
both tables, creating a focused intersection of data points.
• LEFT JOIN: All rows from the left table are preserved, with matching rows from the
right table joining to provide a comprehensive picture.
• RIGHT JOIN: Similar to the left join, the right table takes center stage, with matching
rows from the left table completing the picture.
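
A minimal sketch of INNER and LEFT joins over two hypothetical tables (customers and orders), run through sqlite3; RIGHT JOIN is omitted here because older SQLite versions do not support it:

Python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO orders VALUES (10, 1, 250.0);
""")

# INNER JOIN: only customers that actually have matching orders
cur.execute("""SELECT c.name, o.amount FROM customers c
               INNER JOIN orders o ON o.customer_id = c.id""")
print(cur.fetchall())  # [('Alice', 250.0)]

# LEFT JOIN: every customer is preserved, with NULL where no order matches
cur.execute("""SELECT c.name, o.amount FROM customers c
               LEFT JOIN orders o ON o.customer_id = c.id""")
print(cur.fetchall())  # [('Alice', 250.0), ('Bob', None)]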

Subqueries:
Imagine nesting one SQL query within another. That's the magic of subqueries! They empower
you to retrieve results from a query and use them within another, enabling complex data
filtering and analysis. For instance, you could find all orders placed by customers who also
purchased a specific product.
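
A hedged sketch of that exact query, using a nested SELECT over a hypothetical orders table:

Python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (id INTEGER, customer_id INTEGER, product TEXT);
INSERT INTO orders VALUES (1, 1, 'laptop'), (2, 1, 'mouse'), (3, 2, 'desk');
""")

# The inner SELECT finds customers who bought a laptop; the outer SELECT
# then returns every order placed by those customers
cur.execute("""SELECT id, product FROM orders
               WHERE customer_id IN
                   (SELECT customer_id FROM orders WHERE product = 'laptop')""")
print(cur.fetchall())  # [(1, 'laptop'), (2, 'mouse')]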

Stored Procedures:
Think of mini-programs within your database. Stored procedures encapsulate a series of SQL
statements, allowing you to modularize your code and execute complex tasks with a single call.
Imagine calculating complex sales commissions for each employee; a stored procedure can
handle it efficiently.

User-Defined Functions (UDFs):


Extend the power of SQL by creating your own custom functions! UDFs let you define reusable
logic for specific tasks, enhancing code clarity and efficiency. Imagine calculating a custom
discount based on specific criteria; a UDF can handle that calculation seamlessly within your
queries.
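
SQL dialects differ in how UDFs are defined; as one minimal sketch, SQLite lets you register a plain Python function and call it inside queries (the discount rule below is hypothetical):

Python
import sqlite3

def custom_discount(amount):
    # Hypothetical rule: 10% off orders of 100 or more, otherwise 5%
    return amount * (0.90 if amount >= 100 else 0.95)

conn = sqlite3.connect(":memory:")
conn.create_function("custom_discount", 1, custom_discount)
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (id INTEGER, amount REAL);
INSERT INTO orders VALUES (1, 150.0), (2, 40.0);
""")

# The registered function is now callable inside ordinary queries
cur.execute("SELECT id, custom_discount(amount) FROM orders")
print(cur.fetchall())  # [(1, 135.0), (2, 38.0)]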

3.2: WEEKLY REPORT ON EXCEL

WEEK 3: Excel Basics and Data Manipulation

DAY ACTIVITY

1 Introduction to Microsoft Excel

2 Excel interface and navigation

3 Data entry and formatting in Excel

4 Basic formulas and functions

5 Data sorting and filtering

Learning Outcome:
Day 1: Introduction to Microsoft Excel
Day 2: Explored the Excel interface and navigation
Day 3: Learned how to enter and format data in Excel
Day 4: Learned the basic formulas and functions in Excel
Day 5: Learned how to sort and filter data

Detailed report:

1. Spreadsheet Structure:

Excel operates on a grid-like structure called a spreadsheet. Rows represent records, and
columns represent fields in those records. Understanding this structure enables you to organize
and access data effectively.

2. Cell Navigation and Data Entry:

Navigate cells using arrow keys or mouse clicks. Enter data directly into cells, format it using
drop-down menus, and apply styles to enhance clarity. Utilize autofill functionality for
repetitive data entry.

3. Formulas and Functions:

Excel's strength lies in its powerful formulas and functions. Learn basic functions like
SUM, AVERAGE, and COUNT. Explore built-in functions like VLOOKUP for data
lookup, TEXT for data formatting, and CONCATENATE for joining text.
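
Excel formulas themselves cannot be run here, but as a hedged illustration the same operations have direct pandas equivalents (the column names below are hypothetical): sum, mean, and count mirror SUM, AVERAGE, and COUNT, while merge plays the role of VLOOKUP.

Python
import pandas as pd

sales = pd.DataFrame({"product_id": [1, 2, 1], "amount": [100.0, 250.0, 75.0]})
products = pd.DataFrame({"product_id": [1, 2], "name": ["Widget", "Gadget"]})

print(sales["amount"].sum())    # SUM     -> 425.0
print(sales["amount"].mean())   # AVERAGE -> 141.66...
print(sales["amount"].count())  # COUNT   -> 3

# merge is the pandas counterpart of a VLOOKUP keyed on product_id
print(sales.merge(products, on="product_id", how="left"))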

4. Filters and Conditional Formatting:

Filter data based on specific criteria to focus on relevant subsets. Highlight important data
points using conditional formatting based on values, thresholds, or duplicates.

WEEK 4: Advanced Excel Techniques

DAY ACTIVITY

1 Advanced formulas and functions

2 Pivot tables and pivot charts

3 Data visualization and conditional formatting

4 Advanced data visualization with sparklines

5 Excel tips and shortcuts

Learning Outcome:
Day 1: Learned advanced formulas and functions in Excel
Day 2: Pivot tables and pivot charts
Day 3: Learned data validation and conditional formatting
Day 4: Advanced data visualization with sparklines
Day 5: Learned some Excel tips and shortcuts

Detailed report:

Pivot Table Power User Techniques:


Pivot tables offer immense analytical power. Master advanced features like calculated fields,
slicers, custom filters, and timeline filters to manipulate and explore your data from diverse
angles.

Charts and Data Visualization:

Transform your data into visually compelling charts and graphs. Choose appropriate chart types
like bar charts, pie charts, and line graphs to showcase trends and relationships. Customize
charts with titles, labels, and legends for clear communication.

Charts & Data Visualization Beyond the Basics:
Excel charts offer immense customization potential. Explore error bars, waterfall charts, combo
charts, and sparklines to present your data in informative and visually impactful ways.

Data Validation & Conditional Input:


Ensure data quality and consistency with data validation tools. Define allowed values, error
messages, and dropdown lists to guide users and prevent invalid entries.

Custom Formatting & Conditional Formatting Rules:


Take data visualization to the next level with advanced formatting. Create custom number
formats, conditional formatting rules with formulas, and data bars to visually highlight trends
and outliers.

Macros & VBA:


Automate repetitive tasks and complex workflows with macros, mini-programs written in
Visual Basic for Applications (VBA). Imagine automatically formatting reports, sending emails
based on data conditions, or even creating custom functions.

3.3: Weekly report on NumPy

WEEK 5:

DAY ACTIVITY

1 Introduction to NumPy and array creation

2 NumPy array operations and attributes

3 Indexing and slicing in NumPy arrays

4 Universal functions in NumPy

5 NumPy aggregation and statistical functions

Learning Outcome:

Day 1: Introduction to NumPy and array creation

Day 2: Learned NumPy array operations and attributes

Day 3: Learned indexing and slicing in NumPy arrays

Day 4: Learned universal functions in NumPy

Day 5: Learned NumPy aggregation and statistical functions

Detailed report:

An Introduction to NumPy Basics

NumPy, short for Numerical Python, is a fundamental library for scientific computing in
Python. It empowers you to work efficiently with multidimensional arrays of data, unlocking
a world of numerical manipulations and calculations with stunning speed and versatility. Let's
delve into the fascinating world of NumPy basics!

Multidimensional Arrays: The Pillars of NumPy:

NumPy introduces the concept of n-dimensional arrays, going beyond the limitations of
traditional Python lists. These arrays offer a structured and efficient way to store and
manipulate numerical data like scientific measurements, financial data, or image pixels.
Imagine a table of numbers, where each cell holds a data point, and rows or columns represent
different dimensions.

Array Creation and Initialization:

Creating NumPy arrays is straightforward. You can specify the array dimensions and fill them
with values, utilize built-in functions like arange or ones, or even import data from external
sources like text files.
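
A minimal sketch of these creation routines:

Python
import numpy as np

a = np.array([[1, 2, 3], [4, 5, 6]])  # from a nested list: 2 rows, 3 columns
b = np.arange(0, 10, 2)               # evenly spaced values: [0 2 4 6 8]
c = np.ones((2, 3))                   # 2x3 array filled with 1.0
# d = np.loadtxt("data.txt")          # import from a (hypothetical) text file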

Array Indexing and Slicing:

Navigating within these multidimensional arrays requires powerful indexing and slicing
techniques. Access specific elements using single or multiple indices, or extract entire rows or
columns using slice notation. Think of it like pinpointing specific locations within your data
table.
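
For example:

Python
import numpy as np

m = np.arange(12).reshape(3, 4)  # a 3x4 table holding 0..11
print(m[1, 2])      # single element at row 1, column 2 -> 6
print(m[0])         # entire first row -> [0 1 2 3]
print(m[:, 1])      # entire second column -> [1 5 9]
print(m[0:2, 1:3])  # 2x2 sub-table from rows 0-1, columns 1-2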

NumPy Functions: A Library of Mathematical Powerhouses:

NumPy offers a vast library of built-in functions for advanced mathematical operations,
statistical analysis, linear algebra, and much more. Explore functions like sin, cos, exp, sqrt,
sum, mean, and linalg.solve to unlock a world of numerical possibilities.
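
A short sketch of a few of these functions in action:

Python
import numpy as np

x = np.array([1.0, 4.0, 9.0])
print(np.sqrt(x))   # [1. 2. 3.]
print(np.sum(x))    # 14.0
print(np.mean(x))   # 4.666666666666667

# Solve the linear system A @ v = b, i.e. 3x + y = 9 and x + 2y = 8
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
print(np.linalg.solve(A, b))  # [2. 3.]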

WEEK 6: Advanced NumPy concepts

DAY ACTIVITY

1 Broadcasting in NumPy

2 NumPy Linear Algebra Operations

3 Random Module in NumPy

4 NumPy file Input/Output

5 Unit testing in NumPy

Learning Outcome:

Day 1: broadcasting in NumPy

Day 2: NumPy linear algebra operations

Day 3: NumPy random module

Day 4: input/output file operations in NumPy

Day 5: NumPy unit testing

Detailed report:

Broadcasting:

This powerful feature allows operations between arrays of different shapes. Imagine adding a
scalar (single value) to an entire matrix – NumPy automatically expands the scalar to match
the matrix dimensions, saving you tedious loops.
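
A minimal sketch of both cases:

Python
import numpy as np

matrix = np.array([[1, 2, 3], [4, 5, 6]])

# The scalar 10 is stretched across every element -- no explicit loop needed
print(matrix + 10)   # [[11 12 13] [14 15 16]]

# A 1-D row is broadcast down both rows of the 2-D matrix
row = np.array([100, 200, 300])
print(matrix + row)  # [[101 202 303] [104 205 306]]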

Fancy Indexing:

Go beyond traditional slicing and indexing! Fancy indexing utilizes advanced techniques like
boolean arrays and element-wise functions to select and manipulate specific data elements with
precision. Think of picking out specific elements from different rows and columns based on
conditions, creating a custom mask for complex data extraction.
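
A small sketch using a boolean mask and integer (fancy) indexing:

Python
import numpy as np

data = np.array([[5, 12, 7], [20, 3, 15]])

# Boolean mask: select every element greater than 10
print(data[data > 10])       # [12 20 15]

# Integer (fancy) indexing: pick elements (0,1) and (1,2) in one call
print(data[[0, 1], [1, 2]])  # [12 15]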

Universal Functions (ufuncs):

These pre-defined functions operate element-wise on arrays, enabling efficient vectorized
operations. NumPy offers a vast ufunc library, from basic arithmetic to advanced mathematical
functions. Imagine calculating complex trigonometric operations on entire matrices in a single
line, significantly boosting performance compared to loops.

Custom Ufuncs:

Take control and define your own specialized functions! NumPy allows you to create custom
ufuncs tailored to your specific needs. Imagine implementing a custom distance metric or
numerical algorithm directly within NumPy, enhancing your code's efficiency and flexibility.
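
One hedged way to do this is np.frompyfunc, which wraps a plain Python function as a ufunc (the distance metric here is hypothetical, and the result is an object array, hence the cast):

Python
import numpy as np

# Hypothetical custom "distance" metric: absolute difference
dist = np.frompyfunc(lambda a, b: abs(a - b), 2, 1)

x = np.array([1, 5, 9])
y = np.array([4, 2, 10])
print(dist(x, y).astype(float))  # [3. 3. 1.]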

Memory Views and Array Slicing Tricks:

NumPy allows you to create different views of the same underlying data without copying it,
saving memory and boosting performance. Explore advanced slicing techniques like
start:stop:step and boolean indexing to extract specific data subsets efficiently.
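
A small sketch showing that a slice is a view rather than a copy:

Python
import numpy as np

a = np.arange(10)   # [0 1 2 3 4 5 6 7 8 9]
view = a[2:8:2]     # start:stop:step slice -> [2 4 6]

print(np.shares_memory(a, view))  # True: no data was copied

view[0] = 99        # writing through the view changes the original
print(a[2])         # 99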

NumPy in Other Libraries:

NumPy forms the backbone of many popular scientific computing libraries like SciPy, Pandas,
and Matplotlib. Understanding its advanced features equips you to leverage these libraries more
effectively and tackle complex data analysis tasks with confidence.

3.4: Weekly report on Tableau

WEEK 7: Introduction to Tableau Basics

DAY ACTIVITY

1 Introduction to data visualization and Tableau

2 Tableau Interface and navigation

3 Connecting to Data sources in Tableau

4 Basic Data visualization with Tableau

5 Filters and Sorting in Tableau

Learning Outcome:

Day 1: Introduction to data visualization and Tableau

Day 2: Tableau interface and navigation

Day 3: Connecting to data sources in Tableau

Day 4: Basic data visualization with Tableau

Day 5: Filtering and sorting in Tableau

Detailed report:

Tableau is a revolutionary data visualization tool that empowers you to transform raw data into
stunning and insightful visuals. Whether you're a seasoned data analyst or just starting your
journey, mastering Tableau basics can unlock a powerful new way to communicate and
understand your data.

Connecting to Data:

Tableau seamlessly connects to diverse data sources, from Excel spreadsheets and CSV files to
relational databases and even cloud platforms. Drag and drop your data file into Tableau, and
watch it magically come to life!

The Tableau Workspace:

Once your data is loaded, you'll enter the Tableau workspace. Here, you have three main
components:

• Dimensions: These represent data categories like dates, names, or product types. Think
of them as the building blocks for your chart.
• Measures: These represent numerical values like sales figures, website visits, or
temperature readings. Imagine them as the quantities you want to visualize.
• Shelves: These are designated areas where you place dimensions and measures to build
your chart. Drag and drop elements onto shelves like "Rows," "Columns," "Marks,"
and "Color" to customize your visualization.

Marks and Charts:

Marks are the basic building blocks of any Tableau visualization. They represent individual
data points and come in various shapes like circles, squares, or lines. Different combinations
of dimensions and measures on the shelves result in diverse chart types like bar charts, pie
charts, or scatterplots.

Filters and Interactivity:

Tableau empowers you to explore your data interactively. Apply filters to focus on specific
subsets of data, highlighting trends and insights within your visualization. Click on data points,
drag sliders, and adjust parameters to see your chart dynamically update, revealing hidden
patterns and correlations.

WEEK 8: Advanced Tableau Techniques

DAY ACTIVITY

1 Calculations and parameters in Tableau

2 Advanced Data Blending

3 Mapping and geographic visualization in Tableau

4 Tableau Server and Online overview

5 Joins in Tableau

Learning Outcome:

Day 1: Learned calculations and parameters in Tableau

Day 2: Advanced Data Blending

Day 3: Mapping and geographic visualization in Tableau

Day 4: Tableau server and online overview

Day 5: Joins in Tableau

Detailed Report:

Unleash the Power of Calculations and LOD Expressions:

Move beyond basic aggregation! Craft bespoke calculated fields to manipulate data and derive
novel insights. Level of Detail (LOD) expressions unlock granular analysis, allowing you to
aggregate across various data hierarchies within a single visualization.

Beyond Bar Charts: Embrace Advanced Visualization Techniques:

Ditch the conventional and explore powerful chart types like heatmaps, scatter plots, network
graphs, and boxplots. Dive deep into chart customization, utilizing advanced formatting,
annotations, and interactive elements to captivate your viewers.

Orchestrate Compelling Dashboards: Layout and Storytelling Take Center Stage:

Build dashboards that guide users through your data's narrative. Master layout techniques like
grid layouts, floating objects, and custom containers to curate a visually coherent flow.
Implement filters, actions, and tooltips to spark user exploration and deeper understanding.

Dynamic Exploration with Sets and Parameters:

Enhance user interactivity and refine your analysis with sets and parameters. Create dynamic
groups of data points based on selections or slider adjustments, empowering users to tailor the
dashboard to their specific interests.

Paint the Picture with Geospatial Analysis:

Location data holds untapped potential. Utilize Tableau's mapping capabilities to plot points,
color-code regions, and overlay data layers, revealing spatial relationships and trends with
stunning visual clarity.

Unveiling Hidden Insights: Embracing Advanced Analytics and Forecasting:

Graduate from descriptive statistics! Explore forecasting tools to predict future trends, build
regression models to pinpoint key drivers, and leverage clustering techniques to uncover
hidden patterns within your data.

Performance Optimization: Prioritize a Smooth Storytelling Journey:

Ensure your dashboards deliver a swift and efficient experience. Implement best practices for
data preparation, utilize efficient chart types, and optimize queries to avoid performance
bottlenecks, keeping your audience engaged and satisfied.

3.5: Weekly report on Power BI

WEEK 9: Power BI fundamentals

DAY ACTIVITY

1 Introduction to Power BI and interface overview

2 Connecting to data sources in Power BI

3 Data transformation and cleaning in Power BI

4 Creating Basic visualization in Power BI

5 Power BI filters and Drillthroughs

Learning Outcomes:

Day 1: introduction to Power BI and interface overview

Day 2: Connecting to data sources in Power BI

Day 3: Data transformation and cleaning in Power BI

Day 4: Creating Basic visualization in Power BI

Day 5: Power BI filters and Drillthroughs

Detailed Report:

Unveiling the Power of Interactive Data Exploration

In today's data-driven world, transforming raw information into actionable insights is crucial.
Power BI empowers you to do just that, providing a comprehensive platform for data
visualization, analysis, and storytelling. This guide unveils the fundamental concepts of Power
BI, laying the groundwork for your data exploration journey.

Connecting to Data:

Power BI offers exceptional flexibility in data acquisition. Connect to a vast array of sources,
from local Excel files and cloud databases to live feeds and web APIs. Master the intricacies
of each connection type, ensuring seamless data access and accurate analysis.
Data Modelling:

Shaping Your Analysis: Before diving into visuals, Power BI lets you shape your data. Learn
the art of building robust data models using relationships, measures, and calculations.
Transform raw data into meaningful insights, ready for intuitive exploration and visualization.

Visualizing your Story:

Power BI is a visual powerhouse. Explore its diverse array of chart types, from classic bar
charts and pie charts to advanced maps and custom visuals. Master formatting options,
interactive elements, and storytelling techniques to craft visually compelling dashboards that
captivate your audience.

DAX: Unveiling the Hidden Power:

Behind the visual splendor lies DAX, Power BI's powerful formula language. DAX empowers
you to create custom calculations, manipulate data dynamically, and extract hidden insights
from your datasets. Learn DAX fundamentals like SUMX, FILTER, and CALCULATE to
unlock the full potential of your data models.

Sharing and Collaboration:

Power BI thrives on collaboration. Seamlessly share your dashboards and reports with
colleagues or stakeholders, enabling interactive exploration and informed decision-making.
Learn about the Power BI Service, a cloud-based platform for secure sharing, version control,
and collaborative analysis.

WEEK 10: Advanced Power BI Techniques

DAY ACTIVITY

1 DAX (Data Analysis Expressions) in Power BI

2 Advanced Data Modelling in Power BI

3 Power Query Advanced Transformations

4 Power BI custom Visualizations

5 Power BI Service and collaboration

Learning Outcome:

Day 1: DAX in Power BI

Day 2: Learned advanced data modelling in Power BI

Day 3: Learned Power Query advanced transformations

Day 4: Power BI custom visualizations

Day 5: Power BI Service and collaboration

Detailed Report:

DAX Mastery:

• Calculated Columns & Measures: Go beyond basic measures! Craft intricate formulas
to manipulate data, derive nuanced insights, and tailor your analysis to specific needs.
Master powerful functions like CALCULATE, VAR, and SWITCH to dissect your data
at any level of granularity.
• Time Intelligence: Dive deep into analyzing temporal data. Utilize time intelligence
functions like DATESYTD, DATESQTD, and SAMEPERIODLASTYEAR to analyze
trends, seasonality, and year-over-year comparisons.
• Advanced DAX Techniques: Explore advanced DAX concepts like iterator functions,
variables, and sophisticated filter manipulation to tackle complex data challenges.

Power Query: Data Sculpting Powerhouse:

• M Language Scripting: Master M, the powerful data transformation language behind
Power Query. Craft custom functions, manipulate data structures, and automate
repetitive tasks for streamlined data wrangling.
• Data Cleaning & Shaping: Leverage advanced techniques like conditional splitting,
merging datasets, and error handling to transform messy data into a clean and reliable
foundation for analysis.
• Data Connectors & APIs: Expand your data reach! Utilize custom connectors and APIs
to access specialized data sources and integrate them seamlessly into your Power BI
reports.

Power BI Custom Visualizations:

• Unleashing Creativity: Break free from standard visuals! Explore a world of custom
visualizations crafted by skilled developers. Discover interactive maps, dynamic
timelines, and network graphs to present your data in unique and impactful ways.
• Developing Your Own Visuals: Take the next step and create your own custom visuals!
Learn the basics of using Python or JavaScript to design interactive elements and tailor
the visualization experience to your specific needs.

• Finding and Embedding Visuals: Browse online repositories packed with diverse
custom visuals. Implement them in your reports seamlessly, ensuring compatibility and
enriching your data storytelling capabilities.

Power BI Service: Collaboration & Deployment Nirvana:

• Sharing & Collaboration: Share your dashboards and reports securely with colleagues
and stakeholders on Power BI Service. Utilize workspace features like shared editing,
annotations, and Q&A to foster collaborative data exploration and insights.
• Governance & Security: Implement robust security measures and granular access
controls within Power BI Service. Manage user roles, control data access, and ensure
the integrity of your reports.
• Automated Refresh & Monitoring: Schedule automatic data refresh for your reports,
ensuring always-up-to-date insights. Leverage Power BI Service monitoring tools to
track performance, identify issues, and maintain a consistently smooth experience for
users.

3.6: Weekly report on Python

WEEK 11:

DAY ACTIVITY

1 Introduction to Python and installation

2 Python syntax and variables

3 Control flow: conditional statements

4 Control flow: loops

5 Functions and modules in Python

Learning Outcome:

Day 1: Introduction to Python and installation

Day 2: Learned Python syntax and variables

Day 3: Control flow: conditional statements

Day 4: Control flow: loops

Day 5: Functions and modules in Python

Detailed Report:

Syntax:

• Python values readability, using clear and concise syntax with English-like keywords.
• Indentation is crucial, defining code blocks rather than curly braces.
• Statements typically end with a newline, but semicolons can be used for multiple
statements on one line.
• Comments start with # and are ignored by the interpreter.

Variables:

• Store data values for later use.
• Declared dynamically without explicit type declaration (Python infers types).
• Use meaningful names like age, name, or total_sales.
• Examples: x = 10, message = "Hello, world!", is_active = True

Conditional Statements:

• Control program flow based on conditions.
• if, elif, and else statements are used.
• Indentation defines code blocks within conditional statements.
• Example:
Python
age = 20
if age >= 18:
    print("You are eligible to vote.")
else:
    print("You are not old enough to vote.")

Loops:

• Repeat code blocks multiple times.
• Two main types:
o for loop: Iterates over a sequence of items (list, tuple, string).
o while loop: Iterates as long as a condition is True.
• Example:
Python
for i in range(5):
    print(i)  # Prints numbers 0 to 4

Functions:

• Reusable blocks of code that perform specific tasks.
• Defined with the def keyword, followed by the function name, parentheses, and a colon.
• Can accept arguments (inputs) and return values (outputs).
• Example:
Python
def greet(name):
    print("Hello, " + name + "!")

greet("Alice")  # Output: Hello, Alice!

Modules:

• Organize code into reusable files.
• Import modules using the import keyword.
• Provide a way to structure large programs and share code.
• Example:
Python
import math

radius = 5
area = math.pi * radius**2
print(area)  # 78.53981633974483

WEEK 12: Advanced Python concepts

DAY ACTIVITY

1 File handling in Python

2 Exception handling and error types

3 Object-Oriented Programming (OOP) concepts

4 Working with libraries and external packages

5 Introduction to NumPy and data manipulation

Learning Outcome:

Day 1: File handling in Python

Day 2: Exception handling and error types

Day 3: Object-Oriented Programming (OOP) concepts

Day 4: Working with libraries and external packages

Day 5: Introduction to NumPy and data manipulation

Detailed Report:

File Handling:

• Interacting with files: Python uses built-in functions to open, read, write, and close files.
o open() function: Opens a file in a specified mode (e.g., 'r' for reading, 'w' for
writing, 'a' for appending).
o read(), readline(), readlines(): Read file contents.
o write(): Writes data to a file.
o close(): Closes the file.
• Common file operations:
o Reading text files, CSV files, JSON files, etc.
o Writing data to files.
o Manipulating file contents.
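
A minimal sketch of these calls (the file name is hypothetical; the with statement closes the file automatically):

Python
# Write to a hypothetical file, then read it back line by line
with open("notes.txt", "w") as f:
    f.write("line one\n")
    f.write("line two\n")

with open("notes.txt", "r") as f:
    for line in f.readlines():
        print(line.strip())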

Exception Handling:

• Handling errors gracefully: Prevents program crashes and provides informative messages.
• try...except block: Catches exceptions and executes alternative code.
o try: Code block that might raise exceptions.
o except: Code block that handles specific exceptions.
• Common exception types:
o ValueError: Invalid data type or value.

o TypeError: Incorrect operation on a data type.

o ZeroDivisionError: Attempt to divide by zero.

o FileNotFoundError: File doesn't exist.

o KeyError: Attempting to access a non-existent key in a dictionary.
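
A short sketch covering two of these exception types:

Python
try:
    value = int("not a number")  # raises ValueError
    result = 10 / 0              # would raise ZeroDivisionError
except ValueError:
    print("Invalid value supplied.")
except ZeroDivisionError:
    print("Cannot divide by zero.")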

Object-Oriented Programming (OOP):

• Modelling real-world objects: Organizes code into classes and objects.


• Key concepts:
o Classes: Blueprints for creating objects.
o Objects: Instances of classes, encapsulating data (attributes) and behavior
(methods).
o Inheritance: Creates new classes (subclasses) that inherit properties and
methods from existing classes (base classes).
• Encapsulation, polymorphism, code reusability, and modularity: Key benefits of OOP.
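
A compact sketch of a class, an object, inheritance, and polymorphism:

Python
class Animal:                # class: a blueprint
    def __init__(self, name):
        self.name = name     # attribute (data)

    def speak(self):         # method (behavior)
        print(self.name + " makes a sound.")

class Dog(Animal):           # inheritance: Dog is a subclass of Animal
    def speak(self):         # overriding the base method (polymorphism)
        print(self.name + " barks.")

Dog("Rex").speak()  # Output: Rex barks.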

Libraries and External Packages:

• Expanding Python's capabilities: Numerous libraries and packages offer specialized functionality.
• Installing packages: Use pip (package installer for Python).
• Popular libraries:
o NumPy: Numerical computing and data manipulation.
o Pandas: Data analysis and manipulation.

o Matplotlib: Data visualization.

o Requests: Making HTTP requests.

o Beautiful Soup: Web scraping.

NumPy and Data Manipulation:

• Foundation for numerical computing: Efficiently handles large arrays and matrices.
• Key features:
o N-dimensional array object (ndarray) for efficient data storage and operations.

o Universal functions (ufuncs) for element-wise operations on arrays.

o Mathematical functions, linear algebra operations, random number generation, and more.
• Essential for data analysis, machine learning, and scientific computing.

CHAPTER 4: CONCLUSION

The APSSDC data science internship wasn't simply a series of coursework; it was a
transformative catalyst, propelling me into the heart of a data-driven future. Over these weeks,
I've meticulously carved out a sophisticated skillset, equipping myself to confidently navigate
the dynamic landscape of data analysis and solution building.

At the foundation lies SQL, the lingua franca of databases. I've become adept at crafting
incisive queries, extracting valuable insights, and shaping data into readily interpretable
formats. This foundational skill empowers me to tap into the very source of information, ready
to decipher its hidden narratives and inform intelligent decision-making.

Next, I honed my dexterity in Excel, the ubiquitous canvas for data manipulation. From
wrangling unruly datasets to crafting compelling visualizations, my Excel mastery transforms
raw numbers into impactful stories. This ensures my findings are not just accurate but also
readily understood, bridging the gap between data and actionable insights.

With NumPy in my arsenal, I wield the power of quantitative precision. Large datasets and
complex calculations are no longer daunting adversaries. I can manipulate arrays with ease,
perform sophisticated analyses, and extract hidden patterns within intricate data structures. This
quantitative prowess forms the backbone of advanced data science methodologies, opening
doors to a vast realm of possibilities.

But data is more than just numbers; it's a tapestry woven with stories waiting to be told. Tableau
and Power BI ignited my inner storyteller, equipping me with the tools to transform raw data
into captivating visuals. Through vibrant dashboards and interactive reports, I can now paint a
picture with my findings, captivating audiences and driving informed action.

Finally, I embraced the versatility of Python, the scripting language that empowers data science.
From automating tedious tasks to building sophisticated models, Python is my bridge to the
cutting edge of the field. With this skill in my hands, I can delve into machine learning,
automate data pipelines, and even craft bespoke solutions for complex problems.

However, the true value of my internship transcends the technical skills I acquired. It lies in the
knowledge I gained and the journey I embarked upon. I've learned to think critically, analyse
problems from diverse perspectives, and approach challenges with a data-driven lens. This
newfound perspective will remain invaluable throughout my career, guiding me as I navigate
the ever-evolving landscape of data.

This internship is not the culmination of my data science journey; it's a springboard propelling
me towards a future brimming with opportunities. From tackling real-world challenges in
diverse fields like healthcare and finance to collaborating on cutting-edge research projects, the
possibilities are limitless. Armed with my newly acquired knowledge and fueled by a newfound
passion, I am now poised to make my mark on the world, one insightful analysis at a time.

So, remember: this APSSDC internship wasn't just about mastering software; it was about
stepping into the role of a data-driven problem solver, a storyteller, and an innovator. I go
forth, armed with a sophisticated skillset and a thirst for knowledge, ready to pave the way for
a future where data empowers change, insights drive progress, and I stand at the forefront of it all.

CHAPTER 5: ACKNOWLEDGEMENTS

With immense gratitude, I acknowledge the Andhra Pradesh State Skill Development
Corporation (APSSDC) for providing me with the transformative opportunity to participate in
their Data Science Internship Program. This intensive and well-structured program equipped
me with the foundational skills and knowledge to confidently navigate the dynamic landscape
of data science.

I extend my sincere appreciation to the dedicated instructors who expertly guided me through
the curriculum. Their deep expertise and passionate engagement fostered a stimulating learning
environment, propelling me to consistently push my boundaries and refine my understanding.

The program's meticulously crafted curriculum provided me with a comprehensive introduction
to essential data science tools and techniques, including:

• Structured Query Language (SQL): I mastered the art of data retrieval and manipulation
within relational databases, establishing a crucial foundation for efficient data analysis.
• Microsoft Excel: My proficiency in data wrangling, cleaning, and visualization
flourished, empowering me to transform raw data into actionable insights for diverse
stakeholders.
• NumPy: I cultivated the ability to perform complex numerical computations and
manipulate multi-dimensional arrays with ease, unlocking the power of quantitative
analysis.
• Tableau and Power BI: I embraced the art of data storytelling, learning to craft
compelling and interactive visualizations that effectively communicate insights to both
technical and non-technical audiences.
• Python: I honed my skills in this versatile scripting language, opening doors to
advanced data analysis, automation, and model building.

Beyond technical skills, the internship fostered the development of critical soft skills. I gained
valuable experience in collaborative teamwork, critical thinking, creative problem-solving, and
clear communication of complex ideas. These transferable skills will undoubtedly prove
invaluable throughout my professional journey.

The APSSDC Data Science Internship Program provided me with an exceptional learning
experience and propelled me onto a promising career path in this rapidly evolving field. I am
confident that the knowledge and skills I acquired will serve as a springboard for success as I
contribute to data-driven solutions and meaningful insights in the years to come.

Thank you once again for this incredible opportunity.

CHAPTER 6: CERTIFICATIONS
