Neural Network Assignment Enhanced
If the weighted sum of the inputs plus the bias crosses a certain threshold, the
perceptron outputs 1; otherwise, it outputs 0. This is known as binary classification.
### Example:
Imagine you are trying to classify an email as "spam" or "not spam" based on features like
the number of suspicious words, presence of attachments, etc. A perceptron can take these
features as input and decide whether the email is spam.
- **Activation Function**: Step function in basic perceptron; modern networks use sigmoid,
tanh, or ReLU.
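The activation functions named above can be sketched in a few lines of NumPy (a minimal illustration; the function names here are ours, not a library's):

```python
import numpy as np

def step(x):
    # Basic perceptron activation: 1 once the input crosses the 0 threshold
    return np.where(x >= 0, 1, 0)

def sigmoid(x):
    # Smooth S-shaped curve squashing any input into (0, 1)
    return 1 / (1 + np.exp(-x))

def relu(x):
    # Passes positive values through unchanged, zeroes out negatives
    return np.maximum(0, x)

x = np.array([-2.0, 0.0, 2.0])
print(step(x))     # [0 1 1]
print(np.tanh(x))  # squashes into (-1, 1)
print(relu(x))     # [0. 0. 2.]
```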
### Limitation:
A single perceptron can only solve linearly separable problems. For example, it can separate
points with a straight line but cannot solve the XOR problem.
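This limitation is easy to see in code. The sketch below trains a perceptron with the classic learning rule (weights and hyperparameters are illustrative): it fits AND, which is linearly separable, but can never fit XOR:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    # Classic perceptron learning rule with a step activation
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b >= 0 else 0
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
    return w, b

def predict(X, w, b):
    return np.where(X @ w + b >= 0, 1, 0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
and_y = np.array([0, 0, 0, 1])   # linearly separable
xor_y = np.array([0, 1, 1, 0])   # NOT linearly separable

w, b = train_perceptron(X, and_y)
print(predict(X, w, b))  # [0 0 0 1] -- matches AND

w, b = train_perceptron(X, xor_y)
print(predict(X, w, b))  # never matches XOR, however long we train
```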
### Structure:
- **Input Layer**: Receives the raw data (e.g., pixel values of an image).
- **Hidden Layers**: Intermediate layers that learn features from the input.
- **Output Layer**: Produces the final prediction (e.g., a class label or a score).
Each layer uses activation functions to introduce non-linearity, allowing the network to
solve problems like XOR that a single-layer perceptron cannot.
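To make this concrete, here is a tiny two-layer network that solves XOR with hand-picked weights (a sketch for intuition, not a trained model): one hidden unit computes OR, the other NAND, and the output unit ANDs them together.

```python
import numpy as np

def step(x):
    return np.where(x >= 0, 1, 0)

def xor_net(x1, x2):
    # Hidden layer: hand-picked weights so one unit is OR, the other NAND
    h_or = step(x1 + x2 - 0.5)
    h_nand = step(-x1 - x2 + 1.5)
    # Output layer: AND of the two hidden units yields XOR
    return step(h_or + h_nand - 1.5)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, int(xor_net(a, b)))  # 0, 1, 1, 0
```

The extra layer is what buys the non-linearity: neither hidden unit alone solves XOR, but their composition does.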
### Example:
Imagine recognizing handwritten digits. The first hidden layer might detect edges, the
second might detect shapes, and the final output layer classifies the digit (0–9).
- **Hierarchical Learning**: Early layers learn basic features, deeper layers learn complex
patterns.
- **Generalization**: The model can learn from data without manually defining rules.
1. **Forward Pass**: Compute the prediction by passing inputs through the network.
2. **Loss Calculation**: Measure how far the prediction is from the actual result.
3. **Backward Pass**: Propagate the error backward to compute each weight's gradient.
4. **Weight Update**: Adjust weights to minimize the loss using gradient descent.
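The steps above can be traced by hand for a single linear neuron, y_hat = w*x + b, with squared-error loss (all numbers below are illustrative):

```python
w, b = 0.5, 0.0          # current parameters
x, y = 2.0, 10.0         # one training example
lr = 0.01                # learning rate

# 1. Forward pass
y_hat = w * x + b        # prediction = 1.0

# 2. Loss calculation (squared error)
loss = (y_hat - y) ** 2  # (1 - 10)^2 = 81.0

# 3. Backward pass: gradients of the loss w.r.t. w and b
grad_w = 2 * (y_hat - y) * x   # -36.0
grad_b = 2 * (y_hat - y)       # -18.0

# 4. Weight update (gradient descent step)
w -= lr * grad_w         # 0.5 + 0.36 = 0.86
b -= lr * grad_b         # 0.18

print(w, b, loss)
```

Repeating this loop over many examples is all that "training" means: each pass nudges the weights in the direction that shrinks the loss.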
### Example:
Suppose a neural network is predicting house prices. If the actual price is ₹50 lakhs and the
predicted price is ₹40 lakhs, backpropagation computes this error and updates the weights
to reduce it in future predictions.
Backpropagation is like a student checking their exam answers, finding out which ones
were wrong, understanding why, and improving in the next exam.
This technique is essential for training deep networks, making it the core of modern AI
systems like speech assistants and recommendation engines.
### 4. Different Types of Neural Network Architectures
Neural networks come in various forms, each suited to a specific type of data or task. Here
are some popular architectures:
**Example (Feedforward Network)**: Predicting student exam scores based on study hours and attendance.
**Example (Convolutional Neural Network)**: Used by Facebook for face detection and in medical imaging to detect tumors.
**Example (Recurrent Neural Network)**: Used in Google Translate, stock price prediction, and chatbot conversations.
### 5. Transformer
**Example**: Used in ChatGPT and BERT for text generation, translation, and
summarization.
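The transformer's core operation is scaled dot-product attention, softmax(QK^T / sqrt(d)) V. A minimal NumPy sketch (shapes and names are illustrative, not any particular library's API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)   # how much each query attends to each key
    return softmax(scores) @ V      # weighted average of the value vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query vectors of dimension 4
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Each output row is a mixture of the value vectors, weighted by how relevant each key is to that query; this is how a transformer lets every token "look at" every other token.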