Answer:
1. Definition:
2. Purpose:
o Introduces non-linearity, so the network can model relationships that a purely linear combination of inputs cannot.
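Assuming the point above refers to activation functions (the original question text is missing here), a minimal sketch of a few common choices and the non-linearity they add:

import numpy as np

# Common activation functions (illustrative sketch, not tied to any specific framework).
def step(x):
    # Binary step: used in McCulloch-Pitts neurons and the classic perceptron.
    return np.where(x >= 0, 1, 0)

def sigmoid(x):
    # Smooth function that squashes its input into (0, 1); introduces non-linearity.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: zero for negative input, identity for positive input.
    return np.maximum(0, x)

# Without a non-linear activation, stacked layers collapse into one linear mapping.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(step(x), sigmoid(x), relu(x), sep="\n")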
Q1b) Explain the architecture of an Artificial Neural Network with a neat diagram.
Answer:
1. Definition: An Artificial Neural Network (ANN) is a computational model inspired by the biological brain, made up of interconnected processing units (neurons) arranged in layers.
2. Components of ANN: Input layer, one or more hidden layers, output layer, weighted connections, biases, and activation functions.
3. Working: Each neuron multiplies its inputs by the connection weights, adds a bias, applies an activation function, and passes the result to the next layer.
4. Diagram: Show an input layer, hidden layer(s), and an output layer joined by weighted connections.
5. Applications: Image and speech recognition, forecasting, medical diagnosis, and control systems.
Answer:
1. Introduction: The McCulloch-Pitts neuron (1943) is the earliest mathematical model of a neuron.
2. Structure: Binary inputs with fixed weights feed a summing unit that compares the total against a threshold and produces a binary output.
3. Mathematical Model:
o Output y = 1 if the sum of the inputs is greater than or equal to the threshold, otherwise y = 0.
o Threshold: 2 (e.g., a two-input AND unit fires only when both inputs are 1; a sketch follows this answer).
5. Limitations:
o No learning mechanism: the weights and threshold must be set by hand.
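A minimal sketch of the McCulloch-Pitts unit described above, assuming the threshold of 2 refers to the usual two-input AND example:

def mcculloch_pitts(inputs, threshold):
    # Fires (outputs 1) only when the sum of the binary inputs reaches the threshold.
    return 1 if sum(inputs) >= threshold else 0

# Two-input AND: threshold = 2, so both inputs must be 1 for the unit to fire.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mcculloch_pitts((x1, x2), threshold=2))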
Answer:
Feature               McCulloch-Pitts   Perceptron      Adaline
Activation Function   Step Function     Step Function   Linear Activation
Weight Adjustment     Fixed             Adjusted        Adjusted using gradient descent
Answer:
1. Definition:
2. Structure:
3. Working Mechanism:
5. Applications in AI:
Answer:
1. Definition:
2. Steps:
3. Mathematical Representation:
4. Example - OR Gate:
o Inputs: (0,0), (0,1), (1,0), (1,1); Targets: 0, 1, 1, 1 (a worked sketch follows this answer).
5. Limitations:
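Assuming this answer describes the perceptron learning rule (the OR targets above match its standard worked example), a minimal sketch with an assumed learning rate of 0.1:

import numpy as np

# Perceptron learning rule on the OR gate (illustrative sketch).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
targets = np.array([0, 1, 1, 1])           # OR truth table targets

w = np.zeros(2)                            # weights
b = 0.0                                    # bias
eta = 0.1                                  # learning rate (assumed value)

for epoch in range(10):                    # a few passes over the data suffice for OR
    for x, t in zip(X, targets):
        y = 1 if np.dot(w, x) + b >= 0 else 0   # step activation
        w += eta * (t - y) * x             # move the weights in the direction of the error
        b += eta * (t - y)

print("weights:", w, "bias:", b)
print("outputs:", [1 if np.dot(w, x) + b >= 0 else 0 for x in X])

Because OR is linearly separable, the rule converges after a few passes; the same loop cannot learn XOR, which is the limitation usually cited for a single perceptron.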
Answer:
1. Definition:
2. Components:
3. Working:
4. Advantages:
o Improves accuracy.
5. Applications:
Answer:
1. Initialization: Set all weights and biases to small random values.
2. Forward Propagation: Pass the inputs through the network layer by layer to compute the output.
3. Compute Error: Compare the network output with the target value (for example, using the squared error).
4. Backward Propagation: Propagate the error back through the network and adjust each weight in the direction that reduces it (see the sketch below).
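A minimal sketch of these four steps for a small network with one hidden layer of sigmoid units, assuming squared-error loss and plain gradient descent:

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy XOR data, purely to exercise the four steps listed above.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# 1. Initialization: small random weights for a 2-4-1 network.
W1, b1 = rng.normal(0, 1.0, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1.0, (4, 1)), np.zeros(1)
eta = 0.5

for epoch in range(10000):
    # 2. Forward propagation through the hidden and output layers.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # 3. Compute error (difference between output and target, squared-error loss).
    error = Y - T

    # 4. Backward propagation: chain rule through the sigmoids, then gradient-descent updates.
    dY = error * Y * (1 - Y)            # delta at the output layer
    dH = (dY @ W2.T) * H * (1 - H)      # delta at the hidden layer
    W2 -= eta * H.T @ dY
    b2 -= eta * dY.sum(axis=0)
    W1 -= eta * X.T @ dH
    b1 -= eta * dH.sum(axis=0)

# With enough epochs the outputs usually approach the 0/1 targets
# (exact behaviour depends on the random initialization).
print("outputs:", Y.round(2).ravel())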
Answer:
1. Definition:
2. Structure:
3. Working:
o Forward propagation computes the output from the inputs.
o The error between output and target is computed and used to adjust the weights.
4. Advantages:
5. Applications:
o Deep learning, pattern recognition.
Answer:
1. Definition:
2. Techniques:
o Gradient Descent: iteratively adjusts the weights in the direction that reduces the error.
o Weight Regularization: penalises large weights to reduce overfitting (a combined sketch follows this answer).
3. Importance:
o Improves accuracy and generalisation on unseen data.
4. Methods:
5. Conclusion:
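A minimal sketch combining the two techniques listed above, assuming a simple one-weight linear model with squared error and an L2 penalty (all values illustrative):

import numpy as np

rng = np.random.default_rng(1)

# Toy regression data: y = 2*x plus noise (purely illustrative).
x = rng.uniform(-1, 1, 50)
y = 2 * x + rng.normal(0, 0.1, 50)

w = 0.0          # single weight to learn
eta = 0.1        # learning rate (assumed)
lam = 0.01       # L2 regularization strength (assumed)

for step in range(200):
    y_hat = w * x
    # Gradient of the mean squared error plus the L2 penalty lam * w**2.
    grad = 2 * np.mean((y_hat - y) * x) + 2 * lam * w
    w -= eta * grad          # gradient-descent update

print("learned weight:", round(w, 3))   # close to 2, pulled slightly toward 0 by the penalty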
Answer:
3. Advantages:
o Avoids Vanishing Gradient: Unlike sigmoid and tanh, it does not saturate in positive regions.
4. Disadvantages:
o Dying ReLU Problem: Neurons can become inactive if inputs remain negative.
5. Applications:
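A minimal sketch of ReLU together with the leaky variant commonly used to reduce the dying-ReLU problem mentioned above:

import numpy as np

def relu(x):
    # Identity for positive inputs, zero otherwise; the gradient is 1 in the positive
    # region, so it does not saturate the way sigmoid or tanh do.
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # A small slope for negative inputs keeps otherwise "dead" neurons trainable.
    return np.where(x >= 0, x, alpha * x)

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(relu(x))        # zeros for negative inputs, identity for positive inputs
print(leaky_relu(x))  # small negative values instead of exact zeros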
Q1c) Explain the architecture of an Artificial Neural Network with a neat diagram.
Answer:
1. Definition: An ANN is a layered network of simple processing units (artificial neurons) connected by weighted links.
2. Layers of ANN: Input layer, one or more hidden layers, and an output layer.
3. Working Mechanism:
o Inputs are multiplied by weights and summed.
o The weighted sum passes through an activation function, and the result is forwarded to the next layer (a short sketch follows this answer).
4. Diagram:
o Include an input layer, hidden layers, and an output layer with weighted connections.
5. Applications:
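A minimal sketch of the working mechanism in step 3 above, assuming one hidden layer and a tanh activation (sizes and values are illustrative):

import numpy as np

rng = np.random.default_rng(3)

def forward(x, weights, biases):
    # Each layer: multiply the inputs by the weights, add the bias, apply an activation.
    for W, b in zip(weights, biases):
        x = np.tanh(x @ W + b)
    return x

# A small 3-4-2 network with random parameters, purely to illustrate the data flow.
weights = [rng.normal(size=(3, 4)), rng.normal(size=(4, 2))]
biases = [np.zeros(4), np.zeros(2)]

x = np.array([0.5, -1.0, 2.0])   # one input vector with three features
print(forward(x, weights, biases))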
Q2a) Draw the structure of the biological neuron and explain its working briefly.
Answer:
1. Structure:
o Dendrites, cell body (soma), axon, and synaptic terminals.
2. Working Mechanism:
o Dendrites collect incoming signals, which are summed in the cell body; if the combined signal is strong enough, the neuron fires.
o The signal travels along the axon to synapses, transmitting to the next neuron.
3. Significance:
o The biological neuron inspired the artificial neuron: synaptic strengths correspond to weights and the firing condition to the activation function.
5. Applications:
Answer:
o Adaline (Adaptive Linear Neuron) is similar to the perceptron but uses the Mean Squared Error (MSE) of the linear output for weight updates.
2. Steps:
o Compute the linear output, compare it with the target, and adjust the weights in proportion to the error (the delta/LMS rule; a sketch follows this answer).
4. Advantages:
5. Limitations:
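A minimal sketch of the Adaline (delta/LMS) update described above, reusing the OR data that appears elsewhere in these answers:

import numpy as np

# Adaline / LMS sketch: the update uses the linear output, not the thresholded one.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 1, 1, 1], dtype=float)    # OR targets

w = np.zeros(2)
b = 0.0
eta = 0.1                                  # learning rate (assumed)

for epoch in range(100):
    for x_i, t_i in zip(X, t):
        y_lin = np.dot(w, x_i) + b         # linear activation used during training
        err = t_i - y_lin                  # error driving the MSE objective
        w += eta * err * x_i               # delta rule update
        b += eta * err

pred = (X @ w + b >= 0.5).astype(int)      # threshold the linear output for classification
print("weights:", w.round(2), "bias:", round(b, 2), "predictions:", pred)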
Answer:
1. Definition:
2. Types of Errors:
4. Importance:
5. Applications:
Answer:
1. Definition:
2. Layers:
3. Working:
4. Advantages:
5. Applications:
Answer:
1. Learning:
2. Memory:
4. Examples:
5. Applications:
Answer:
Aspect    Forward Propagation               Backpropagation
Purpose   Passes data through the network   Minimizes error using weight updates
Answer:
1. Batch Gradient Descent (BGD): Uses the entire training set for each weight update (a sketch contrasting batch and per-sample updates follows this answer).
o OR Truth Table: inputs (0,0), (0,1), (1,0), (1,1) with targets 0, 1, 1, 1.
2. Steps:
o Initialize the weights (and bias) to small values.
o Compute the output for each input and adjust the weights according to the error.
3. Calculations:
4. Result:
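A minimal sketch contrasting the batch gradient descent update mentioned in point 1 with a per-sample (stochastic) update, assuming a one-weight linear model and squared error (illustrative values only):

import numpy as np

rng = np.random.default_rng(2)

# Toy data for a linear model y = w * x (illustrative only).
x = rng.uniform(-1, 1, 100)
y = 3 * x + rng.normal(0, 0.1, 100)
eta = 0.1

# Batch Gradient Descent: one update per pass, using the gradient over the whole dataset.
w_batch = 0.0
for epoch in range(100):
    grad = 2 * np.mean((w_batch * x - y) * x)
    w_batch -= eta * grad

# Stochastic Gradient Descent: one update per training sample.
w_sgd = 0.0
for epoch in range(100):
    for x_i, y_i in zip(x, y):
        grad_i = 2 * (w_sgd * x_i - y_i) * x_i
        w_sgd -= eta * grad_i

print("batch GD weight:", round(w_batch, 3), "SGD weight:", round(w_sgd, 3))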