Python
🐞 Python Debugger
Helps you pause execution, step through code, inspect variables, and find bugs.
Built into VS Code, with useful panels such as:
o Variables: Shows current variable values.
o Call Stack: Shows current function call hierarchy.
o Watch: Lets you monitor specific expressions or variables in real time.
o Breakpoints: Pauses execution at specific lines.
o Debug Console: Run Python expressions live while paused.
Ternary Operator:
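A minimal sketch of Python's ternary (conditional expression) syntax, using made-up values:

```python
# Syntax: value_if_true if condition else value_if_false
age = 20
status = "adult" if age >= 18 else "minor"
print(status)  # adult
```

Unlike an if/else statement, the whole expression evaluates to a value, so it can be used inline in assignments and function calls.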
Logical Operators:
While loop:
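A small while-loop example (countdown values are arbitrary):

```python
count = 3
while count > 0:
    print(count)   # 3, 2, 1
    count -= 1
else:
    # the optional else clause runs when the loop exits without a break
    print("liftoff")
```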
Functions in Python:
By default, all Python functions implicitly return None, a special constant that
represents the absence of a value and is falsy.
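The implicit None return can be seen directly (function name and values are made up for illustration):

```python
def greet(name):
    print(f"Hello, {name}!")  # no return statement

result = greet("Ada")
print(result)           # None
print(result is None)   # True
if not result:          # None is falsy
    print("greet returned a falsy value")
```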
Keyword Arguments:
For better readability, Python lets us pass arguments to a function by explicitly
naming the parameter along with its value; the order doesn't matter as long as every
parameter is named.
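A short sketch of positional vs. keyword arguments (the function and values are hypothetical):

```python
def describe_pet(name, species, age):
    return f"{name} is a {age}-year-old {species}"

# positional: order matters
print(describe_pet("Rex", "dog", 4))

# keyword: order doesn't matter, since every parameter is named
print(describe_pet(age=4, name="Rex", species="dog"))
```

Both calls produce the same string; the keyword form is often easier to read when a function takes several parameters.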
PIP
Pip is the standard Python package manager, used to install and manage software packages.
Pip looks packages up in the Python Package Index (PyPI), which is like a giant library of Python packages.
Python Virtual Environments:
Python virtual environments are isolated directories that contain a specific Python interpreter and its
packages, allowing you to manage dependencies independently for different projects and preventing
conflicts.
Creating Environment:
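The usual way is from the command line with `python -m venv .venv` (then activate it with `source .venv/bin/activate` on Linux/macOS or `.venv\Scripts\activate` on Windows, and install packages with `pip install <package>`). The same thing can be sketched from Python using the standard-library venv module; the directory name here is arbitrary:

```python
import os
import tempfile
import venv

# Create an isolated environment in a temporary directory.
# Command-line equivalent:  python -m venv .venv
env_dir = os.path.join(tempfile.mkdtemp(), ".venv")
venv.create(env_dir, with_pip=False)  # with_pip=True would also bootstrap pip

# The environment gets its own interpreter plus a pyvenv.cfg config file
print(os.path.exists(os.path.join(env_dir, "pyvenv.cfg")))  # True
```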
Underfitting: The model is too simple and fails to learn the underlying patterns in the data. This can
happen if our data set is small or we didn't train for enough epochs.
Overfitting: The model is too complex and learns even the irrelevant noise along with the pattern, in
effect memorizing the data set instead of learning from it. It therefore performs well on the training
data but poorly on new data; perhaps we trained for too many epochs.
1. Input layer: A fixed-size layer; if we have an image of 784 pixels (e.g. 28 × 28), each pixel is fed
to one neuron of the input layer.
2. Connection channels: Neurons of one layer are connected to the next through channels, and each
channel is assigned a numerical value called a weight. Each input is multiplied by its weight and
sent to the next neuron.
3. Hidden layer: Each neuron in the hidden layer receives the outputs from all the neurons in the
input layer. Each of these neurons also has a variable called the bias associated with it. The
weighted input from the previous layer plus the bias is passed to the activation function,
which determines whether the neuron should be activated and pass its output to the next layer. Only
activated neurons transfer data to the next layer.
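The steps above can be sketched for a single neuron in plain Python; the sigmoid activation and all the numbers below are made-up choices for illustration:

```python
import math

def sigmoid(x):
    # one common activation function: squashes any real number into (0, 1)
    return 1 / (1 + math.exp(-x))

def neuron(inputs, weights, bias):
    # weighted sum of the inputs plus the bias, passed through the activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# a hidden neuron receiving three inputs over three weighted channels
inputs  = [0.5, 0.1, 0.9]
weights = [0.4, -0.6, 0.2]
bias    = 0.1
out = neuron(inputs, weights, bias)
print(out)  # a value between 0 and 1
```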
During training, the actual (expected) values are also transmitted to the neural network; for every
output, it compares its own output with the actual value and adjusts its weights to better match
the expected output.
Backpropagation: The information about how far off our result was from the expected result is fed
back from the output layer all the way to the input layer, passing through the hidden layers.
The weights are adjusted, and the whole process is repeated for a certain number of epochs to train the model.
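A minimal sketch of this adjust-and-repeat loop, assuming a single linear neuron trained with gradient descent; the data, learning rate, and epoch count are all made up:

```python
# Train y = w*x + b on points that follow y = 2x + 1.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = 0.0, 0.0
lr = 0.05  # learning rate

for epoch in range(200):          # repeat for a fixed number of epochs
    for x, target in data:
        pred = w * x + b          # forward pass
        error = pred - target     # how far off we are from the expected value
        # gradient of the squared error, fed back to adjust weight and bias
        w -= lr * error * x
        b -= lr * error

print(round(w, 2), round(b, 2))   # approaches 2 and 1
```

Real backpropagation applies this same idea layer by layer, using the chain rule to propagate the error from the output back to every weight.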
Types of Neurons:
o Forget Gate: decides what information stored in the state is no longer needed and
can be forgotten.
o Input Gate: decides what new information should be added to the working state.
o Output Gate: decides which part of all the information stored in the state should be
outputted. Each gate takes a value between 0 and 1, indicating whether it is open or closed.
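The three gates can be sketched for a single LSTM cell with scalar values; the sigmoid makes each gate a number in (0, 1), and all the weights below are made-up placeholders:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, p):
    # each gate is a sigmoid in (0, 1): near 0 = closed, near 1 = open
    f = sigmoid(p["wf"] * x + p["uf"] * h_prev + p["bf"])  # forget gate
    i = sigmoid(p["wi"] * x + p["ui"] * h_prev + p["bi"])  # input gate
    o = sigmoid(p["wo"] * x + p["uo"] * h_prev + p["bo"])  # output gate
    c_tilde = math.tanh(p["wc"] * x + p["uc"] * h_prev + p["bc"])  # candidate info
    c = f * c_prev + i * c_tilde   # forget part of the old state, add new info
    h = o * math.tanh(c)           # output gate decides what to emit
    return h, c

# arbitrary scalar weights, just to show the data flow
params = {k: 0.5 for k in
          ["wf", "uf", "bf", "wi", "ui", "bi", "wo", "uo", "bo", "wc", "uc", "bc"]}
h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.0, p=params)
print(h, c)
```

In a real LSTM the weights are vectors/matrices learned during training, but the gating arithmetic is the same.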