AI Midterm Review
Basic Information
Exam Structure
• 5 short Q&A
• 5 multiple choice questions
• 1 coding exercise
3. What is a cost function in machine learning? A: The cost function measures the difference
between predicted values and actual values, helping optimize the model by adjusting parameters
to minimize error.
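For concreteness, the mean squared error cost commonly used in linear regression can be sketched as follows (a hypothetical helper for illustration, not part of the review itself):

```python
import numpy as np

def mse_cost(X, y, w, b):
    # Mean squared error between predictions X @ w + b and targets y.
    # A lower cost means the parameters fit the data better; Gradient
    # Descent adjusts w and b to drive this value down.
    residuals = X @ w + b - y
    return np.mean(residuals ** 2)
```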
Types of Gradient Descent:
• Batch Gradient Descent: Uses the entire dataset to compute the gradient.
• Stochastic Gradient Descent (SGD): Updates parameters using a single randomly chosen example.
• Mini-Batch Gradient Descent: Updates parameters using small batches of data.
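The three variants differ only in how much data feeds each parameter update. A schematic sketch (function names, learning rate, and batch size are illustrative choices, not from the review):

```python
import numpy as np

def batch_step(X, y, w, lr):
    # Batch GD: gradient of the MSE cost over the full dataset.
    grad = (2 / len(y)) * X.T @ (X @ w - y)
    return w - lr * grad

def sgd_step(X, y, w, lr, rng):
    # SGD: gradient from a single randomly chosen example.
    i = rng.integers(len(y))
    xi, yi = X[i], y[i]
    grad = 2 * xi * (xi @ w - yi)
    return w - lr * grad

def minibatch_step(X, y, w, lr, rng, batch_size=2):
    # Mini-batch GD: gradient over a small random subset.
    idx = rng.choice(len(y), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    grad = (2 / batch_size) * Xb.T @ (Xb @ w - yb)
    return w - lr * grad
```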
Which of the following is used to predict a continuous output value?
• A) Clustering
• B) Reinforcement Learning
• C) Regression
• D) Anomaly Detection
Answer: C) Regression
Which equation represents a simple Linear Regression model?
• A) y = ax^2 + bx + c
• B) y = wx + ε
• C) y = ax^3 + bx^2 + cx + d
• D) y = e^(bx)
Answer: B) y = wx + ε
4. What is the time complexity of the Normal Equation for Linear Regression?
• A) O(n)
• B) O(n^2)
• C) O(n^3)
• D) O(log n)
Answer: C) O(n^3)
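As a quick illustration (a minimal NumPy sketch, not part of the exam), the Normal Equation computes the weights in closed form; inverting the n × n matrix XᵀX is the O(n^3) step, where n is the number of features:

```python
import numpy as np

def normal_equation(X, y):
    # Closed-form least-squares solution: w = (X^T X)^(-1) X^T y.
    # Inverting the n-by-n matrix X.T @ X costs O(n^3) in the number
    # of features n, which is why Gradient Descent is preferred
    # when n is large.
    return np.linalg.inv(X.T @ X) @ X.T @ y
```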
• A) Linear
• B) Quadratic
• C) Exponential
• D) Cubic
Answer: C) Exponential
Coding Exercise
Write a Python function to implement Gradient Descent for a simple linear regression problem.
The function should update the weights until the cost function is minimized.
import numpy as np
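One way to complete the exercise (a minimal sketch; the function name `gradient_descent`, learning rate, and iteration count are illustrative choices, not specified by the exercise):

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, n_iters=1000):
    # Fit linear-regression weights via batch Gradient Descent.
    # X: (m, n) feature matrix, y: (m,) target vector.
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(n_iters):
        y_pred = X @ w + b
        error = y_pred - y
        # Gradients of the mean-squared-error cost
        grad_w = (2 / m) * X.T @ error
        grad_b = (2 / m) * error.sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

A fixed iteration count is used here for simplicity; a tolerance-based stopping rule (stop when the cost change falls below a threshold) would more literally match "until the cost function is minimized."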
This function uses Gradient Descent to update the weights for a given input matrix X and
output vector y, iterating to minimize the cost function.
Details
Q: What are the types of Machine Learning? A: The main types of Machine Learning are
Supervised Learning, Unsupervised Learning, and Reinforcement Learning.
Q: What is a cost function in Linear Regression? A: The cost function measures the error
between the predicted values and actual values. It helps optimize the model by adjusting the
parameters to minimize this error.
Q: What are the types of Gradient Descent? A:
• Batch Gradient Descent: Uses the entire dataset to compute the gradient.
• Stochastic Gradient Descent (SGD): Updates parameters using one random training example.
• Mini-Batch Gradient Descent: Uses a small batch of training examples for updates.
Q: What are the advantages and disadvantages of the Normal Equation compared to Gradient
Descent? A: