matplotlib_CS
1. Introduction to Matplotlib
Matplotlib is a powerful Python library used for creating static, animated, and
interactive visualizations. The pyplot module in Matplotlib provides a simple
interface for creating plots, much like the plotting features in MATLAB.
Here's a quick start to see how it works:
import matplotlib.pyplot as plt
import numpy as np
# Sample data
x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 25]
# Simulated data
epochs = list(range(1, 11))
train_loss = [0.9, 0.8, 0.6, 0.5, 0.4, 0.35, 0.3, 0.25, 0.2, 0.15]
val_loss = [1.0, 0.9, 0.75, 0.65, 0.6, 0.55, 0.5, 0.45, 0.4, 0.35]
# Plotting
plt.plot(epochs, train_loss, label="Training Loss")
plt.plot(epochs, val_loss, label="Validation Loss", linestyle='--')
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.title("Loss Over Epochs")
plt.legend()
plt.show()
This plot shows how the loss changes over epochs, helping you understand if
your model is learning well.
# Example weight matrix; random values stand in here for learned weights
weights = np.random.randn(5, 5)
# Heatmap of weights
plt.imshow(weights, cmap='viridis')
plt.colorbar()
plt.title("Neural Network Weights Heatmap")
plt.show()
The color intensity in a heatmap can represent different values (e.g., connection
strengths in neural networks), making it a handy visualization for complex,
interconnected data.
# Input values and three common activation functions
x = np.linspace(-5, 5, 100)
relu = np.maximum(0, x)
sigmoid = 1 / (1 + np.exp(-x))
tanh = np.tanh(x)
# Plotting
plt.plot(x, relu, label="ReLU")
plt.plot(x, sigmoid, label="Sigmoid")
plt.plot(x, tanh, label="Tanh")
plt.xlabel("Input")
plt.ylabel("Activation")
plt.title("Activation Functions")
plt.legend()
plt.show()
Visualizing these functions helps in understanding their behavior, especially
when comparing different types of neurons in a network.
By following these steps, you’ll have a solid foundation for using Matplotlib to
visualize cognitive processes, neural networks, and more complex data associated
with cognitive modeling. Let me know if you'd like further explanations on any
specific technique or example!
In the context of training neural networks or other machine learning models,
train_loss and val_loss are metrics that measure how well the model is
performing on two different datasets: the training set and the validation set.
Here’s a detailed breakdown of each:
• Training loss (train_loss): the error computed on the training set after
each epoch. It shows how well the model fits the data it is learning from.
• Validation loss (val_loss): the error computed on a held-out validation set
that the model never trains on. It shows how well the model generalizes to
unseen data.
During training, the goal is to reach a balance where both losses decrease
without the validation loss starting to increase (which would mean overfitting).
Example Scenario
Imagine you’re training a neural network for image classification and tracking
both losses over a number of epochs (illustrative values are sketched in the
code after this list). In this example:
• From epochs 1 to 10, both train loss and validation loss are decreasing,
which is good.
• After epoch 10, train loss continues to decrease, but validation loss increases.
This indicates that the model is likely overfitting; you may need to apply
regularization techniques (like dropout) or stop training at an earlier epoch.
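Here is a minimal sketch of such a scenario, using made-up loss values that
follow the pattern described above and marking the epoch where validation loss
stops improving:
import matplotlib.pyplot as plt
# Illustrative (made-up) loss values: both decrease until epoch 10,
# then validation loss starts climbing while training loss keeps falling
epochs = list(range(1, 21))
train_loss = [0.9, 0.8, 0.7, 0.6, 0.52, 0.45, 0.4, 0.35, 0.3, 0.27,
              0.24, 0.21, 0.19, 0.17, 0.15, 0.13, 0.12, 0.11, 0.10, 0.09]
val_loss = [1.0, 0.9, 0.8, 0.7, 0.62, 0.56, 0.5, 0.46, 0.43, 0.41,
            0.42, 0.44, 0.46, 0.49, 0.52, 0.55, 0.58, 0.61, 0.64, 0.67]
# Epoch with the lowest validation loss (after this, the model overfits)
best_epoch = epochs[val_loss.index(min(val_loss))]
plt.plot(epochs, train_loss, label="Training Loss")
plt.plot(epochs, val_loss, label="Validation Loss", linestyle='--')
plt.axvline(best_epoch, color='gray', linestyle=':', label=f"Best epoch ({best_epoch})")
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.title("Validation Loss Rises After Epoch 10")
plt.legend()
plt.show()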
Interpreting a heatmap of the weights in a neural network can give insights into
how the network is processing information, and potentially where adjustments
might improve performance. Here’s a step-by-step guide on how to interpret it:
• Magnitude and Sign: Large positive or negative weights represent strong
connections between neurons, either reinforcing or inhibiting signals.
• Symmetry: Patterns like symmetry in weights may sometimes indicate
redundancy, which might need pruning. Symmetry in specific layers (like
convolutional layers) can also indicate similar feature detections happening
across different neurons.
• Uniformity or Sparsity:
– If weights are fairly uniform, with few distinct values, the layer might
not be learning meaningful patterns or may be poorly initialized.
– If weights are sparse (e.g., many zero or near-zero values), it can
indicate that the network is learning to ignore certain connections,
which is expected when using techniques like L1 regularization.
Example Interpretation
Imagine a heatmap with a 5x5 weight matrix where darker squares represent
negative weights, lighter squares represent positive weights, and mid-tone squares
represent weights close to zero:
• A row with all light squares suggests a neuron with highly positive connec-
tions, influencing the next layer positively.
• A column with mostly dark or neutral squares indicates that this neuron
in the next layer receives lower or inhibitory input, likely making it less
active.
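To make this concrete, here is a small sketch using a hand-picked 5x5 matrix
(standing in for real trained weights) and a grayscale colormap, so that dark
squares are negative, light squares are positive, and mid-gray squares are
close to zero:
import numpy as np
import matplotlib.pyplot as plt
# Hand-picked 5x5 matrix standing in for one layer's weights:
# row 0 is strongly positive; column 4 is mostly negative or near zero
weights = np.array([
    [ 0.9,  0.8,  0.7,  0.9,  0.8],
    [ 0.1, -0.3,  0.4, -0.2, -0.6],
    [-0.5,  0.2, -0.1,  0.3, -0.4],
    [ 0.4, -0.2,  0.1, -0.5, -0.1],
    [-0.3,  0.5, -0.4,  0.2, -0.7],
])
# Grayscale mapping: dark = negative, light = positive, mid-gray = near zero
plt.imshow(weights, cmap='gray', vmin=-1, vmax=1)
plt.colorbar(label="Weight value")
plt.xlabel("Neuron in next layer")
plt.ylabel("Neuron in current layer")
plt.title("5x5 Weight Matrix")
plt.show()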
This approach can make it easier to decide on adjustments, whether that means
regularizing, pruning, or retraining layers for better network performance.
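As one simple illustration, magnitude-based pruning can be sketched in a couple
of lines, continuing with the weights array from the sketch above (the threshold
here is an arbitrary choice):
# Zero out weights whose magnitude falls below an (arbitrary) threshold
threshold = 0.2
pruned_weights = np.where(np.abs(weights) < threshold, 0.0, weights)
print(f"Pruned {np.sum(pruned_weights == 0)} of {weights.size} weights")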
Subplots in Matplotlib allow you to display multiple plots within the same
figure. This can be especially useful for comparing different types of data side-by-
side, like the training and validation losses over epochs or visualizing activation
functions together.
Here's a basic guide to creating subplots.
# Sample data
x = np.linspace(0, 10, 100)
y1 = np.sin(x) # Sine wave
y2 = np.cos(x) # Cosine wave
# Create a figure
plt.figure(figsize=(10, 4))
# First subplot
plt.subplot(1, 2, 1) # 1 row, 2 columns, plot 1
plt.plot(x, y1, label="Sine")
plt.title("Sine Wave")
plt.xlabel("x")
plt.ylabel("sin(x)")
plt.legend()
# Second subplot
plt.subplot(1, 2, 2) # 1 row, 2 columns, plot 2
plt.plot(x, y2, color="orange", label="Cosine")
plt.title("Cosine Wave")
plt.xlabel("x")
plt.ylabel("cos(x)")
plt.legend()
plt.tight_layout()
plt.show()
A few things to note:
• The subplot(1, 2, 1) call means "1 row, 2 columns, first plot."
• The subplot(1, 2, 2) call specifies the second plot in the same row.
• plt.tight_layout() ensures the plots don’t overlap.
Another way to create subplots is the object-oriented interface, plt.subplots,
which returns a figure and an array of axes. Here’s how it works (a minimal
sketch follows the list below):
• fig, axs = plt.subplots(2, 2) creates a 2x2 grid of subplots.
• axs[0, 0] accesses the first subplot, axs[0, 1] the second, and so on.
• Each subplot can have its own settings, labels, and titles.
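Here is a minimal sketch of that pattern (the specific curves shown in each
panel are just illustrative choices):
import numpy as np
import matplotlib.pyplot as plt
x = np.linspace(0, 10, 100)
# Create a 2x2 grid of subplots
fig, axs = plt.subplots(2, 2, figsize=(10, 8))
axs[0, 0].plot(x, np.sin(x))
axs[0, 0].set_title("Sine")
axs[0, 1].plot(x, np.cos(x), color="orange")
axs[0, 1].set_title("Cosine")
axs[1, 0].plot(x, np.sin(x) * np.exp(-0.1 * x), color="green")
axs[1, 0].set_title("Damped Sine")
axs[1, 1].plot(x, np.cos(x) ** 2, color="red")
axs[1, 1].set_title("Cosine Squared")
plt.tight_layout()
plt.show()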
3. Combining Different Types of Plots
You can use subplots to combine different types of plots for a better overview,
such as line plots, bar plots, and heatmaps.
# Sample data
x = np.linspace(0, 10, 100)
y = np.sin(x)
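Continuing from the sample data just above (and the imports from earlier), here
is a minimal sketch that places a line plot, a bar plot, and a heatmap side by
side; the bar counts and heatmap values are made-up illustrative data:
categories = ["A", "B", "C", "D"]
counts = [3, 7, 5, 2]              # made-up bar data
grid = np.random.rand(5, 5)        # made-up heatmap data
fig, axs = plt.subplots(1, 3, figsize=(12, 4))
# Line plot
axs[0].plot(x, y)
axs[0].set_title("Line Plot")
# Bar plot
axs[1].bar(categories, counts, color="orange")
axs[1].set_title("Bar Plot")
# Heatmap with its own colorbar
im = axs[2].imshow(grid, cmap="viridis")
fig.colorbar(im, ax=axs[2])
axs[2].set_title("Heatmap")
plt.tight_layout()
plt.show()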
This should give you flexibility for laying out multiple plots, helping you compare
and contrast data in cognitive modeling or neural network visualizations! Let
me know if you'd like further help with this.
The linspace function in NumPy generates an array of evenly spaced numbers
over a specified range. It’s particularly useful in plotting and mathematical
computations when you need a sequence of values within a range.
Syntax
np.linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None)
Parameters
• start: The beginning value of the sequence.
• stop: The end value of the sequence.
• num: The number of samples to generate (default is 50).
• endpoint: If True (default), stop is the last value in the sequence. If
False, the sequence goes up to but does not include stop.
• retstep: If True, returns a tuple of (array, step), where step is the spacing
between samples.
• dtype: The data type of the output array.
Example
import numpy as np
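# A short illustration of the parameters described above (values chosen arbitrarily)
values = np.linspace(0, 1, 5)
print(values)                                  # [0.   0.25 0.5  0.75 1.  ]
# endpoint=False excludes stop, so the spacing becomes (stop - start) / num
print(np.linspace(0, 1, 5, endpoint=False))    # [0.  0.2 0.4 0.6 0.8]
# retstep=True also returns the step size between samples
arr, step = np.linspace(0, 1, 5, retstep=True)
print(step)                                    # 0.25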