
Instructions For How To Solve Assignment

The document provides instructions for Assignments 1 and 2 of a Neural Network course. It includes 7 questions for Assignment 1 on topics such as neural networks, perceptrons, and classification, and 6 questions for Assignment 2 on topics such as linear regression, cost functions, the bias-variance dilemma, and the perceptron learning rule. Students are instructed to answer each question on separate pages with page numbers and to submit by the deadline posted on the learning management system.


Instructions for how to solve Assignment

1. Each assignment must be solved separately on white sheets (or in a register).
2. Page numbers must be clearly marked on each assignment.
3. The last date for submitting each assignment is shown on MS Teams.
Neural Network (CS-1733)
Assignment-1
Q1. What is a neural network? Explain its benefits.

Q2. State and explain the perceptron convergence theorem.
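For reference, a sketch of one commonly stated form of the theorem, in standard notation that is assumed here and may differ from the course text's wording:

```latex
\textbf{Perceptron convergence (sketch).} Suppose there exists a unit vector
$w^{*}$ and a margin $\gamma > 0$ such that $d(n)\, w^{*\mathsf{T}} x(n) \ge \gamma$
for every training sample, and $\lVert x(n) \rVert \le R$ for all $n$. Then the
perceptron learning rule makes at most
\[
  k_{\max} \le \frac{R^{2}}{\gamma^{2}}
\]
weight updates before it converges to a weight vector that separates the data.
```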

Q3. The perceptron may be used to perform numerous logic functions. Demonstrate the
implementation of the binary logic functions AND, OR, and COMPLEMENT.
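One way to sketch such an implementation in code, with illustrative weight and bias values (these particular values are one valid choice among many, not the only answer):

```python
# Sketch: AND, OR, and COMPLEMENT as single step-activation perceptrons
# over binary {0, 1} inputs. The weights/biases below are illustrative.

def perceptron(weights, bias, inputs):
    """Step-activation perceptron: fires 1 if w.x + b > 0, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

def AND(x, y):
    return perceptron([1, 1], -1.5, [x, y])   # fires only when x + y > 1.5

def OR(x, y):
    return perceptron([1, 1], -0.5, [x, y])   # fires when x + y > 0.5

def COMPLEMENT(x):
    return perceptron([-1], 0.5, [x])         # fires only when x = 0

for x in (0, 1):
    for y in (0, 1):
        print(x, y, AND(x, y), OR(x, y))
```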

Q4. A basic limitation of the perceptron is that it cannot implement the EXCLUSIVE OR
function. Explain the reason for this limitation.
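The limitation can also be checked empirically. The following sketch (an illustration, not part of the assignment) searches a grid of weights and biases and finds a separating threshold unit for AND but none for XOR:

```python
import itertools

def separable(truth_table):
    """Search a coarse grid of (w1, w2, b) for a linear threshold unit
    that reproduces the given binary truth table."""
    grid = [v / 2 for v in range(-8, 9)]  # -4.0 .. 4.0 in steps of 0.5
    for w1, w2, b in itertools.product(grid, repeat=3):
        if all((1 if w1 * x + w2 * y + b > 0 else 0) == out
               for (x, y), out in truth_table.items()):
            return True
    return False

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
print(separable(AND))  # True: a separating line exists (e.g. w=(1,1), b=-1.5)
print(separable(XOR))  # False: XOR is not linearly separable
```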

Q5. Consider two one-dimensional, Gaussian-distributed classes that have a common
variance equal to 1. Their mean values are:

These two classes are essentially linearly separable. Design a classifier that separates the
two classes.
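The mean values did not survive extraction above. As a sketch of the approach only, assume hypothetical means μ1 = -1 and μ2 = +1: with equal variances and equal priors, the minimum-error decision boundary lies at the midpoint of the means.

```python
# Sketch with ASSUMED means mu1 = -1, mu2 = +1; substitute the actual
# values from the original assignment. With equal variances and equal
# priors, the minimum-error decision threshold is the midpoint.
mu1, mu2 = -1.0, 1.0          # hypothetical class means
threshold = (mu1 + mu2) / 2   # = 0.0 for these assumed values

def classify(x):
    """Assign x to class 1 if it falls on mu1's side of the threshold."""
    return 1 if x < threshold else 2

print(classify(-0.7))  # -> 1
print(classify(0.3))   # -> 2
```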

Q6. Give the weights and bias for a McCulloch-Pitts (M-P) neuron with inputs x, y, and z,
whose output is z if x = -1 and y = 1, and is -1 otherwise.
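One candidate solution can be checked exhaustively. The sketch below assumes bipolar inputs in {-1, +1} and a sign activation, and uses w = (-1, 1, 1) with bias b = -2: when x = -1 and y = 1 the x, y, and bias contributions cancel to 0, so the net input has the sign of z; in every other case the net input is at most -1.

```python
import itertools

def mp_neuron(x, y, z, w=(-1, 1, 1), b=-2):
    """Sign-activation M-P neuron; candidate weights w=(-1,1,1), b=-2."""
    net = w[0] * x + w[1] * y + w[2] * z + b
    return 1 if net > 0 else -1

# Check all 8 bipolar input combinations against the specification.
for x, y, z in itertools.product((-1, 1), repeat=3):
    expected = z if (x == -1 and y == 1) else -1
    print(x, y, z, mp_neuron(x, y, z), expected)
```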

Q7. Given the following 3-class classification problem:

and the following single layer perceptron:

(a) Can the net learn to separate the samples, given that you want: if x ∈ Ci then yi = 1
and yj = -1 for j ≠ i. No need to solve for the weights, but justify your answer.

(b) Add the sample (-1, 6) to C1. Repeat part (a).


Neural Network (CS-1733)
Assignment-2
Q1. Discuss the basic differences between the maximum a posteriori and maximum-likelihood
estimates of the parameter vector in a linear regression model.

Q2. Starting with the cost function of Eq. (2.36), e(w),

derive the formula of Eq. (2.29)

by minimizing the cost function with respect to the unknown parameter vector w.

Q3. Elaborate on the following statement: A network architecture, constrained through the
incorporation of prior knowledge, addresses the bias–variance dilemma by reducing variance
at the expense of increased bias.

Q4. Given the following input points and corresponding desired outputs:

X = {-0.5, -0.2, -0.1, 0.3, 0.4, 0.5, 0.7}
D = {-1, 1, 2, 3.2, 3.5, 5, 6}

write down the cost function with respect to w (setting the bias to zero). Compute the
gradient at the point w = 2 using both direct differentiation and the LMS approximation
(averaging over all data samples in both cases), and check whether they agree.
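A numerical sketch of the check, assuming the mean-squared-error cost J(w) = (1/2N) Σᵢ (dᵢ - w·xᵢ)² (the assumed cost form, so that both the direct gradient and the sample-averaged LMS gradient come out as -(1/N) Σᵢ (dᵢ - w·xᵢ)·xᵢ):

```python
# Sketch assuming the cost J(w) = (1/2N) * sum_i (d_i - w*x_i)^2.
X = [-0.5, -0.2, -0.1, 0.3, 0.4, 0.5, 0.7]
D = [-1, 1, 2, 3.2, 3.5, 5, 6]
N = len(X)
w = 2.0

# Direct differentiation: dJ/dw = -(1/N) * sum_i (d_i - w*x_i) * x_i
direct = -sum((d - w * x) * x for x, d in zip(X, D)) / N

# LMS approximation: average the per-sample gradients -(d_i - w*x_i) * x_i
lms = sum(-(d - w * x) * x for x, d in zip(X, D)) / N

print(direct, lms)  # both are approximately -0.94: the two agree
```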

Q5. For the following training samples:

Plot them in input space. Apply the perceptron learning rule to the above samples one at a
time to obtain weights that separate the training samples. Set η to 0.5. Work in the space with
the bias as another input element. Use w(0) = (0, 0, 0)T. Write the expression for the resulting
decision boundary.
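The training samples themselves did not survive extraction. As a sketch of the procedure only, using hypothetical 2-D samples with labels ±1, the bias folded in as a constant third input, η = 0.5, and w(0) = 0:

```python
# Sketch of the one-at-a-time perceptron rule. The samples below are
# HYPOTHETICAL stand-ins; substitute the assignment's actual data.
samples = [((1.0, 2.0), 1), ((2.0, 1.0), 1), ((-1.0, -1.0), -1), ((-2.0, 0.0), -1)]
eta = 0.5
w = [0.0, 0.0, 0.0]  # (w1, w2, bias weight), i.e. w(0) = 0

def predict(w, x):
    net = w[0] * x[0] + w[1] * x[1] + w[2] * 1.0  # bias as extra input of 1
    return 1 if net >= 0 else -1

changed = True
while changed:  # sweep until a full pass makes no update
    changed = False
    for x, d in samples:
        y = predict(w, x)
        if y != d:  # misclassified: w <- w + eta*(d - y)*(x, 1)
            for i, xi in enumerate((x[0], x[1], 1.0)):
                w[i] += eta * (d - y) * xi
            changed = True

print(w)  # final weights; boundary: w[0]*x1 + w[1]*x2 + w[2] = 0
```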

Q6. XOR. For x2, x3 ∈ C1 and x1, x4 ∈ C2, describe your observation when you apply the
perceptron learning rule following the same procedure as in (a).
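A sketch of what one would observe, assuming the standard XOR corners x1 = (0,0), x2 = (0,1), x3 = (1,0), x4 = (1,1), with x2, x3 labeled +1 and x1, x4 labeled -1 (the corner assignment is an assumption): because the classes are not linearly separable, every epoch makes at least one correction, so the rule never converges.

```python
# Sketch: the perceptron rule on XOR never stops updating.
# ASSUMED corners: x1=(0,0), x2=(0,1), x3=(1,0), x4=(1,1),
# with x2, x3 labeled +1 and x1, x4 labeled -1.
samples = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]
eta = 0.5
w = [0.0, 0.0, 0.0]

def predict(w, x):
    net = w[0] * x[0] + w[1] * x[1] + w[2]  # bias as third weight
    return 1 if net >= 0 else -1

for epoch in range(50):  # cap the sweeps: convergence never happens
    updates = 0
    for x, d in samples:
        y = predict(w, x)
        if y != d:
            w[0] += eta * (d - y) * x[0]
            w[1] += eta * (d - y) * x[1]
            w[2] += eta * (d - y)
            updates += 1
    if updates == 0:
        break
print(epoch, updates)  # updates stays nonzero: no epoch is mistake-free
```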
