
Immortal Brains
Deep Learning Program
SESSION 1: ESSENTIAL PYTHON FOR DEEP LEARNING
What is the pythonic way?
Python has its own conventions: code is expected to be written in an idiomatic style known as the
"pythonic" way. The key characteristics include (if this is not clear yet, you can skip ahead to slide no. 5):
1. Readability:
Following PEP 8 guidelines for code formatting and style:

# Non-Pythonic
def myfunction(x,y,z):
    result = x+y+z
    return result

# Pythonic
def my_function(x, y, z):
    result = x + y + z
    return result
Source: https://peps.python.org/pep-0008/

Please go through all the rules in the provided source


2. Use of Python's built-in features (idioms):
Using the sum() function instead of writing a custom summation loop:
# Non-Pythonic
total = 0
for num in [1, 2, 3, 4, 5]:
    total += num

# Pythonic
total = sum([1, 2, 3, 4, 5])

3. List comprehensions:
Creating a new list with a list comprehension:
# Non-Pythonic
squares = []
for num in [1, 2, 3, 4, 5]:
    squares.append(num ** 2)

# Pythonic
squares = [num ** 2 for num in [1, 2, 3, 4, 5]]
4. Get Item from Dictionary:
Use the dictionary.get() method, which combines the key-existence check and item access in a single call and accepts an optional default value.
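A minimal sketch (the dictionary and keys here are illustrative):
# Non-Pythonic
config = {"learning_rate": 0.01}
if "momentum" in config:
    momentum = config["momentum"]
else:
    momentum = 0.9

# Pythonic
momentum = config.get("momentum", 0.9)   # key missing -> falls back to 0.9
lr = config.get("learning_rate", 0.001)  # key present -> 0.01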

5. Returning multiple values from a function:


Use tuples and tuple unpacking
def foo():
    return 3, 5.5

alpha, beta = foo()

6. Get Max Index of an array:

import numpy as np

my_array = np.array([10, 30, 20, 40, 50])
max_index = np.argmax(my_array)
print("Index of maximum value:", max_index)

Source: What is pythonic?


The intent is to show that Python has idioms for performing even complex operations in very little
code; more pythonic generally means less code.
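Two more idioms you will see often are enumerate() and zip(); a minimal sketch with illustrative data:
# enumerate: index and value together, instead of range(len(...))
names = ["Alice", "Bob", "Charlie"]
for i, name in enumerate(names):
    print(i, name)

# zip: iterate over two sequences in parallel
scores = [85, 92, 78]
for name, score in zip(names, scores):
    print(name, score)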
Lists, Sets and Dictionaries
Lists are ordered collections of elements of the same or different data types.
They are suited to sequential access, which means that to find an element we
may need to traverse the length of the list. However, you can access an
element directly in O(1) time if you know its index.
my_list = [1, 2, 3, "apple", "banana"]

Sets are unordered collections of unique (hashable) values. Sets are
efficient for checking whether an element exists, in O(1) on average, and for
set operations such as union, intersection and difference. Sets use hashing,
which makes them a performant collection for lookups and set operations.
my_set = {1, 2, 3, 3, 4, 4}  # duplicates are discarded, leaving {1, 2, 3, 4}

Dictionaries: A dictionary is a collection of key-value pairs (insertion-ordered
since Python 3.7). Keys in a dictionary are unique, and they are used to access
the corresponding values. Dictionaries are useful for efficient lookups when we
associate data with labels or identifiers.
my_dict = {"name": "Alice", "age": 30, "city": "New York"}
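A minimal sketch of lookups on the three collections defined above, illustrating the complexity claims:
print("apple" in my_list)   # O(n): scans the list -> True
print(3 in my_set)          # O(1) average: hash lookup -> True
print(my_dict["name"])      # O(1) average: key lookup -> 'Alice'
print(my_set | {5, 6})      # set union -> {1, 2, 3, 4, 5, 6}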
Lists, Sets and Dictionaries - How to write your first Python program?
The quickest and easiest way to start writing Python code is by using Google Colab. No installation is needed, and it provides a cloud-based
environment for running Python notebooks. You can easily collaborate and share your notebooks by providing their links with read-only or read-write
access. You can also use Kaggle notebooks, which offer similar features.
Google Colab:
1. Access Google Colab: Visit Google Colab in your web browser.
2. Create a New Notebook: Click on "New Notebook" to create a new Python notebook.
3. Write and Run Code: Start writing Python code in the cells and run them using the play button or by pressing Shift + Enter.
Kaggle Notebooks:
1. Access Kaggle Notebooks: Visit Kaggle Notebooks on Kaggle's website.
2. Create a New Notebook: Click on "New Notebook" to create a new Python notebook.
3. Write and Run Code: Similar to Google Colab, you can write and run Python code in the cells.
Alternatively, if you prefer to work offline and have more control over your environment, you can install Anaconda and use Jupyter Notebook.
Anaconda with Jupyter Notebook:
1. Install Anaconda: Download and install Anaconda for your operating system.
2. Open Anaconda Navigator: Launch Anaconda Navigator, and you'll find Jupyter Notebook among the available tools.
3. Launch Jupyter Notebook: Click on the "Launch" button next to Jupyter Notebook in Anaconda Navigator.
4. Create a New Notebook: In the Jupyter Notebook interface, click on "New" and select "Python 3" to create a new notebook.
5. Write and Run Code: Start writing Python code in the cells and run them using Shift + Enter.
6. Save and Share: Save your Jupyter Notebooks locally, and you can share the .ipynb files with others.
Lists, Sets and Dictionaries

Complete code available in the colab link below:

https://colab.research.google.com/drive/1vxJEgoRjvQXYGx6BjTICBpOsmLafYPKj
Python is dynamically typed
Python is a dynamically typed language, which means you don't need to
explicitly declare the data type of a variable. The interpreter infers the type
at runtime. Here's an example to illustrate dynamic typing in Python:

# Dynamic Typing Example
variable_1 = 42
print("variable_1 is of type:", type(variable_1))  # Output: <class 'int'>

variable_1 = "Now I'm a string!"
print("variable_1 is of type:", type(variable_1))  # Output: <class 'str'>
Mutable and Immutable Data Structures
Lists, sets and dictionaries are mutable data structures: their contents can
be modified in place without creating a new object. Numbers, tuples, strings,
bools, etc. are immutable.
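A minimal sketch of the difference (the variables here are illustrative):
nums = [1, 2, 3]
nums[0] = 99              # fine: lists are mutable -> [99, 2, 3]

point = (1, 2, 3)
try:
    point[0] = 99         # tuples are immutable
except TypeError as e:
    print("Tuple error:", e)

name = "Alice"
try:
    name[0] = "a"         # strings are immutable
except TypeError as e:
    print("String error:", e)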

More statements in Python
# Tuples
my_tuple = (1, 2, 3, 4, 5)

# If-else-elif statement
def classify_number(num):
    if num > 0:
        return "Positive"
    elif num == 0:
        return "Zero"
    else:
        return "Negative"

# For loop
print("Squared values using for loop:")
for num in my_tuple:
    print(num ** 2)

# While loop
print("\nCubed values using while loop:")
index = 0
while index < len(my_tuple):
    print(my_tuple[index] ** 3)
    index += 1

# Range
print("\nRange example:")
for i in range(3, 8, 2):  # Start from 3, stop before 8, step by 2
    print(i)

# Map
squared_values = list(map(lambda x: x ** 2, my_tuple))
print("\nSquared values using map and lambda:", squared_values)

# Filter
even_values = list(filter(lambda x: x % 2 == 0, my_tuple))
print("Even values using filter and lambda:", even_values)

# Function
def greet(name):
    return f"Hello, {name}!"

# Lambda function
multiply = lambda x, y: x * y

# Example usage of functions and lambda
print("\nFunction example:", greet("Alice"))
print("Lambda function example:", multiply(3, 4))

# Handling division by zero using try-except
try:
    result = 10 / 0
    print("\nDivision result:", result)
except Exception as e:
    print(f"Error: {e}")
finally:
    print("I am always there")

Colab Link: https://colab.research.google.com/drive/17UXL1B9BtGJGJkpPF85iOFuV55ROqoWz?usp=sharing


NumPy (Numerical Python)
NumPy (Numerical Python) is a powerful library in Python for numerical and mathematical
operations. It provides support for large, multi-dimensional arrays and matrices, along with a
collection of high-level mathematical functions to operate on these arrays. NumPy is essential
for tasks such as linear algebra, statistical analysis, and data manipulation.
NumPy arrays are around 50x faster than traditional Python lists: lists can hold objects of multiple
data types, while NumPy arrays mandate a uniform data type, which allows optimized storage and
faster performance.

Check also: https://stackoverflow.com/questions/8385602/why-are-numpy-arrays-so-fast
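As a rough illustration, a minimal timing sketch (the exact speedup depends on the operation, array size and hardware):
import timeit
import numpy as np

py_list = list(range(1_000_000))
np_array = np.arange(1_000_000)

# Double every element: pure-Python loop vs. NumPy vectorized operation
list_time = timeit.timeit(lambda: [x * 2 for x in py_list], number=10)
numpy_time = timeit.timeit(lambda: np_array * 2, number=10)
print(f"List comprehension: {list_time:.3f}s, NumPy vectorized: {numpy_time:.3f}s")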
Here are some key aspects of NumPy with code examples:

1. Installing NumPy: If you haven't installed NumPy, you can do so using the following:
!pip install numpy (In Anaconda prompt use conda install numpy)
2. Importing NumPy:
import numpy as np
3. Creating NumPy Arrays:
my_list = [1, 2, 3, 4, 5]
my_array = np.array(my_list)
print(my_array)
# Creating an array of zeros
zeros_array = np.zeros(5)
print(zeros_array)
4. NumPy Array Operations:
array1 = np.array([1, 2, 3])
array2 = np.array([4, 5, 6])

# Element-wise addition
result_addition = array1 + array2
print("Element-wise Addition:", result_addition)

# Element-wise multiplication
result_multiply = array1 * array2
print("Element-wise Multiplication:", result_multiply)

# Dot product (works for matrix products as well)
dot_product = np.dot(array1, array2)
print("Dot Product:", dot_product)
# Trigonometric functions
angles = np.array([0, np.pi/2, np.pi])
sin_values = np.sin(angles)
cos_values = np.cos(angles)
tan_values = np.tan(angles)
print("Sin Values:", sin_values)
print("Cos Values:", cos_values)
print("Tan Values:", tan_values)
NumPy (Numerical Python)
5. NumPy Array Manipulation:
original_array = np.array([[1, 2, 3], [4, 5, 6]])

# Reshape to a 1D array
reshaped_array = original_array.flatten()
print("Reshaped to 1D:", reshaped_array)

# Reshape to a different shape
reshaped_array_2d = original_array.reshape(3, 2)
print("Reshaped to 3x2:", reshaped_array_2d)

# Array Slicing
array_slice = np.array([1, 2, 3, 4, 5])
# Select elements from index 1 to 3 (exclusive)
sliced_elements = array_slice[1:3]
print("Sliced Elements:", sliced_elements)
Python data analysis using Pandas
Pandas is a Python library designed for data manipulation and analysis. It
provides data structures like Series and DataFrame, which are highly
efficient for working with structured data.
Installing Pandas:
You can install Pandas by running the following in a notebook cell (use pip install pandas
without the "!" in your terminal or command prompt):
!pip install pandas
Key Features of Pandas:
DataFrame: A two-dimensional, tabular data structure with labeled axes
(rows and columns). Resembles a spreadsheet or SQL table. Provides
powerful data manipulation and analysis capabilities.
Series: A one-dimensional labeled array capable of holding any data
type. Represents a column in a DataFrame or a single variable.
Data Input/Output: Pandas supports reading and writing data from/to
various file formats, including CSV, Excel, SQL, and more.
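A minimal sketch of a Series on its own (the data here is illustrative):
import pandas as pd

ages = pd.Series([25, 30, 35], index=['Alice', 'Bob', 'Charlie'], name='Age')
print(ages['Bob'])   # access by label -> 30
print(ages.mean())   # built-in aggregation -> 30.0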
Python data analysis using Pandas
1. Creating a DataFrame:

import pandas as pd

# Creating a DataFrame from a dictionary
data = {'Name': ['Alice', 'Bob', 'Charlie'],
        'Age': [25, 30, 35],
        'City': ['New York', 'San Francisco', 'Los Angeles']}
df = pd.DataFrame(data)

# Display the DataFrame
print("DataFrame:")
print(df)

In this example, we created a DataFrame from a dictionary where keys
represent column names and values are lists containing the column data.
The resulting DataFrame is a tabular structure with rows and columns.
2. Reading Data from a File:
# Reading data from a CSV file
csv_file_path = 'path/to/your/file.csv'
df_from_csv = pd.read_csv(csv_file_path)

# Reading data from an Excel file
excel_file_path = 'path/to/your/file.xlsx'
df_from_excel = pd.read_excel(excel_file_path)

# Display the DataFrames
print("DataFrame from CSV:")
print(df_from_csv)
print("\nDataFrame from Excel:")
print(df_from_excel)

3. Basic DataFrame Operations:
print("DataFrame Info:")
print(df.info())

print("\nDescriptive Statistics:")
print(df.describe())

# Selecting specific columns
selected_columns = df[['Name', 'Age']]
print("\nSelected Columns:")
print(selected_columns)
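Rows can also be selected by label or by position; a minimal sketch using the df created above:
print(df.loc[0])        # row with index label 0
print(df.iloc[-1])      # last row by integer position
print(df.loc[df['Age'] >= 30, ['Name', 'Age']])  # boolean row selection with chosen columns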
4. Data Cleaning and Manipulation:
# Adding a new column
df['Gender'] = ['Female', 'Male', 'Male']

# Dropping a column
df = df.drop(columns='City')

# Filtering rows based on a condition
filtered_df = df[df['Age'] > 30]

print("DataFrame after Manipulation:")
print(df)
print("\nFiltered DataFrame:")
print(filtered_df)

5. Grouping and Aggregation:
# Grouping by 'Gender' and calculating mean age for each group
grouped_df = df.groupby('Gender')['Age'].mean()
print("Grouped and Aggregated DataFrame:")
print(grouped_df)
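Several statistics per group can be computed at once with agg(); a minimal sketch on the same df:
# Mean, min, max and count of 'Age' for each gender group
summary = df.groupby('Gender')['Age'].agg(['mean', 'min', 'max', 'count'])
print(summary)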
Textbooks to learn Python

Further in the series
1. Exploratory Data Analysis using Python libraries: next on 19th November, 2023.
Learn the techniques of data analysis and preprocessing needed before training the data.
2. Perceptron, MLP, Understanding Human Intelligence:
Explore the foundational concepts of Perceptrons and Multi-Layer Perceptrons (MLPs) and gain insights into how these models emulate aspects of
human intelligence.
3. ANN, Feed Forward Neural Networks:
Understand the architecture and applications of Feed Forward Neural Networks.
4. Back propagation using Gradient Descent with essential Linear Algebra, following a top down approach:
Understand the mechanism of Backpropagation and its relationship with Gradient Descent, underpinned by essential Linear Algebra.

5. Regularization to improve the performance of Deep Neural Networks:
Discover techniques to enhance the performance of Deep Neural Networks, making your models robust and accurate.
6. Convolutional Neural Network for Computer Vision:
Learn how CNNs use filters, convolution and pooling techniques to enable image processing and recognition.
7. Recurrent Neural Network for Sequential problems.
Tackle sequential data challenges using Recurrent Neural Networks (RNNs) and understand their importance in applications like time series analysis and
natural language processing.
8. NLP using deep learning, word embeddings, contextual embedding
Dive into Natural Language Processing (NLP) and delve into word embeddings and contextual embeddings to gain a deeper understanding of text data
analysis.
9. Attention Mechanism, Transformers and LLM
Explore advanced techniques in text processing, including Attention Mechanisms, Transformers, and Large Language Models (LLMs) that have
revolutionized the field of language understanding.
10. VAE and GANs for Image generations
Delve into creative AI as you discover Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs), which can generate lifelike images
and artwork.
11. What is Generative AI?
Explore the concept of Generative AI and its practical applications, from creating art to generating synthetic data, and understand how machines become
creators in their own right.
Thank You