Quiz Sol
Usman Nazir
Quiz Max Marks: 100
Date: Dec 4, 2024 Time Allowed: 20 min
• Question 1: (Remember)
Define the Universal Approximation Theorem and explain its significance in neural
networks. (Marks: 5)
• Question 2: (Understand)
Describe the role of each component in a neural network (weights, biases, activation
functions). How do these components collectively enable learning? (Marks: 10)
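A single-neuron sketch can make these roles concrete; the input, weight, and bias values below are illustrative, not from the quiz:

```python
import math

# Weights scale each input, the bias shifts the weighted sum, and the
# activation (sigmoid here) introduces the non-linearity that lets
# stacked neurons learn non-linear functions.
def neuron(x, w, b):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # weighted sum plus bias
    return 1 / (1 + math.exp(-z))                 # sigmoid activation

print(neuron([1.0, 2.0], [0.5, -0.25], 0.1))     # z = 0.1, output ~ 0.525
```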
• Question 3: (Apply)
Given the 2D transformation matrix A = [[2, 1], [1, 3]], apply this transformation to the vector v = [1, 2] and interpret the result in the context of neural network transformations. (Marks: 10)
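As a worked sketch of the computation (each output entry is a weighted sum of the inputs, exactly what one linear layer of a network computes):

```python
# Apply A = [[2, 1], [1, 3]] to v = [1, 2] by hand: row i of the output is
# the dot product of row i of A with v.
A = [[2, 1], [1, 3]]
v = [1, 2]
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
print(Av)  # [4, 7]: v is stretched and rotated, as a weight matrix does to activations
```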
• Question 5: (Analyze)
Consider a simple quadratic loss function 𝐿 = (𝑦 − 𝑦̂)², where 𝑦̂ = 𝑤𝑥 + 𝑏. Analyze
how gradient descent updates the parameters w and b to minimize L. Include partial
derivative computations. (Marks: 15)
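The update rule can be sketched numerically for a single data point; the values of x, y, the initial parameters, and the learning rate below are illustrative. From L = (y − ŷ)² with ŷ = wx + b, the chain rule gives ∂L/∂w = −2x(y − ŷ) and ∂L/∂b = −2(y − ŷ):

```python
# Gradient descent on L = (y - y_hat)^2 for one (x, y) pair.
x, y = 2.0, 7.0          # illustrative data point
w, b, lr = 0.0, 0.0, 0.1  # illustrative initial parameters and learning rate

for step in range(100):
    y_hat = w * x + b
    dw = -2 * x * (y - y_hat)  # partial derivative of L w.r.t. w
    db = -2 * (y - y_hat)      # partial derivative of L w.r.t. b
    w -= lr * dw               # step opposite the gradient
    b -= lr * db

print(w * x + b)  # prediction converges toward y = 7.0
```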
• Question 6: (Understand)
Differentiate between batch gradient descent, stochastic gradient descent (SGD), and
mini-batch gradient descent, citing their pros and cons in terms of computational
efficiency and convergence. (Marks: 10)
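The core distinction is how much data feeds each parameter update; a minimal sketch with an illustrative dataset of 100 examples and batch size 10:

```python
# The three gradient-descent variants differ in updates per epoch:
# batch uses the whole dataset per update (smooth but slow), SGD uses one
# example (noisy but cheap), mini-batch is the compromise used in practice.
data = list(range(100))  # 100 training examples (illustrative)
batch_size = 10

updates_batch = 1                        # full dataset -> 1 update per epoch
updates_sgd = len(data)                  # 1 example    -> 100 updates per epoch
updates_mini = len(data) // batch_size   # 10 examples  -> 10 updates per epoch
print(updates_batch, updates_sgd, updates_mini)  # 1 100 10
```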
• Question 7: (Evaluate)
Compare the performance of convolutional layers and fully connected layers in image
recognition tasks. Which is better for spatial data, and why? (Marks: 10)
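One concrete angle for the comparison is parameter count: a convolutional layer shares its small kernel across all spatial positions, while a fully connected layer wires every pixel to every unit. A sketch for a 28×28 single-channel input (layer sizes are illustrative):

```python
# Conv layer: 32 kernels of size 3x3 over 1 input channel, plus 32 biases.
conv_params = 3 * 3 * 1 * 32 + 32   # weight sharing keeps this small
# FC layer: every one of the 28*28 pixels connects to each of 32 units.
fc_params = 28 * 28 * 32 + 32       # no weight sharing, no spatial structure
print(conv_params, fc_params)       # 320 25120
```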
• Question 8: (Understand)
Discuss the importance of pooling layers in CNNs. How do max pooling and average
pooling differ in their effects on feature extraction? (Marks: 10)
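The difference between the two pooling modes can be shown by hand on a small feature map (the 4×4 values below are illustrative):

```python
# 2x2 pooling with stride 2 on a 4x4 feature map: max pooling keeps the
# strongest activation in each window; average pooling smooths the window.
fmap = [
    [1, 3, 2, 0],
    [4, 2, 1, 5],
    [0, 1, 3, 2],
    [2, 2, 4, 1],
]

def pool(fmap, op):
    out = []
    for i in range(0, 4, 2):
        row = []
        for j in range(0, 4, 2):
            window = [fmap[i + di][j + dj] for di in range(2) for dj in range(2)]
            row.append(op(window))
        out.append(row)
    return out

max_pooled = pool(fmap, max)                   # preserves sharp features
avg_pooled = pool(fmap, lambda w: sum(w) / 4)  # preserves overall intensity
print(max_pooled)  # [[4, 5], [2, 4]]
print(avg_pooled)  # [[2.5, 2.0], [1.25, 2.5]]
```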
• Question 9: (Create)
Design a CNN architecture using PyTorch for a digit classification task (e.g., MNIST
dataset). Specify the number of layers, activation functions, and loss function used,
and include the training process. (Marks: 20)
import torch
import torch.nn as nn
import torch.optim as optim

# Simple CNN: two conv blocks (Conv -> ReLU -> MaxPool) and a linear classifier
model = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 7 * 7, 10),       # 28x28 input pooled twice -> 7x7 maps
)
criterion = nn.CrossEntropyLoss()    # loss for 10-way digit classification
optimizer = optim.Adam(model.parameters(), lr=0.001)

# Training loop (train_loader is a DataLoader over the MNIST training set)
for epoch in range(10):
    for inputs, labels in train_loader:
        optimizer.zero_grad()
        outputs = model(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
Training Process:
• Batch Size: 64.
• Epochs: 10.
• Learning Rate: 0.001.
• Output: Model learns to classify digits with increasing accuracy over epochs.