Instructions For How To Solve Assignment
Q3 The perceptron may be used to perform numerous logic functions. Demonstrate the
implementation of the binary logic functions AND, OR, and COMPLEMENT.
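
One way to check a candidate answer is to verify the truth tables directly. The sketch below (Python) assumes binary {0, 1} inputs and a hard-threshold unit; the weight and bias values shown are one possible choice, not the only valid one.

# Verify candidate perceptron weights for AND, OR, and COMPLEMENT.
# Inputs are taken as binary {0, 1}; the unit fires (outputs 1) when the
# weighted sum plus bias exceeds zero.
def perceptron(inputs, weights, bias):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# AND: both inputs must be on; OR: at least one input on.
gates = {
    "AND": (lambda x, y: x and y, [1.0, 1.0], -1.5),
    "OR":  (lambda x, y: x or y,  [1.0, 1.0], -0.5),
}
for name, (truth, w, b) in gates.items():
    for x in (0, 1):
        for y in (0, 1):
            assert perceptron([x, y], w, b) == int(truth(x, y)), name
    print(name, "realised by w =", w, ", b =", b)

# COMPLEMENT (NOT) uses a single input with a negative weight.
for x in (0, 1):
    assert perceptron([x], [-1.0], 0.5) == 1 - x
print("COMPLEMENT realised by w = [-1], b = 0.5")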
Q4. A basic limitation of the perceptron is that it cannot implement the EXCLUSIVE OR
function. Explain the reason for this limitation.
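
A sketch of the standard argument, assuming inputs in {0, 1} and a single hard-limiter unit with weights w1, w2 and bias b: realising XOR would require

    w1·0 + w2·0 + b ≤ 0
    w1·1 + w2·0 + b > 0
    w1·0 + w2·1 + b > 0
    w1·1 + w2·1 + b ≤ 0

Adding the two middle inequalities gives w1 + w2 + 2b > 0, while adding the first and last gives w1 + w2 + 2b ≤ 0, a contradiction. No single linear decision boundary can separate the two XOR classes.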
Q5. Consider two one-dimensional, Gaussian-distributed classes C1 and C2 that have a common
variance equal to 1 and mean values μ1 and μ2, respectively. These two classes are essentially
linearly separable. Design a classifier that separates these two classes.
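
With equal priors and a common variance, the optimal classifier reduces to thresholding at the midpoint of the two means. A minimal sketch follows; μ1 = -10 and μ2 = +10 are placeholders only, since the actual mean values are not reproduced above.

import numpy as np

mu1, mu2 = -10.0, 10.0          # placeholder means (substitute the given values)
sigma = 1.0                     # common variance of 1
threshold = (mu1 + mu2) / 2.0   # midpoint decision boundary

rng = np.random.default_rng(0)
x1 = rng.normal(mu1, sigma, 1000)   # samples drawn from class C1
x2 = rng.normal(mu2, sigma, 1000)   # samples drawn from class C2

# Decide C2 when x >= threshold, C1 otherwise, and estimate the error rates.
err1 = np.mean(x1 >= threshold)
err2 = np.mean(x2 < threshold)
print(f"boundary x = {threshold}, estimated error rates: {err1:.4f}, {err2:.4f}")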
Q6. Give the weights and bias for a McCulloch-Pitts (M-P) neuron with inputs x, y, and z
whose output is z if x = -1 and y = 1, and is -1 otherwise.
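
One way to answer is to propose a weight/bias set and verify it exhaustively. The sketch below assumes bipolar values in {-1, +1} and the activation "output +1 if net > 0, else -1"; the values w = (-1, 1, 1) and b = -2 are one choice that satisfies the specification, not necessarily the intended answer.

from itertools import product

w, b = (-1, 1, 1), -2           # candidate weights for (x, y, z) and bias

for x, y, z in product((-1, 1), repeat=3):
    net = w[0] * x + w[1] * y + w[2] * z + b
    out = 1 if net > 0 else -1
    desired = z if (x == -1 and y == 1) else -1   # required behaviour
    assert out == desired, (x, y, z)
print("w = (-1, 1, 1), b = -2 reproduces the required truth table")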
(a) Can the net learn to separate the samples, given that you want yi = 1 and yj = -1 for
j ≠ i whenever x ∈ Ci? You do not need to solve for the weights, but justify your answer.
by minimizing the cost function with respect to the unknown parameter vector w.
Q3. Elaborate on the following statement: A network architecture, constrained through the
incorporation of prior knowledge, addresses the bias–variance dilemma by reducing variance
at the expense of increased bias.
Q4. Given the following input points and corresponding desired outputs:
X = {-0.5, -0.2, -0.1, 0.3, 0.4, 0.5, 0.7}
D = {-1, 1, 2, 3.2, 3.5, 5, 6}
Write down the cost function with respect to w (setting the bias to zero). Compute the
gradient at the point w = 2 using both direct differentiation and the LMS approximation
(averaging over all data samples in both cases), and check whether the two results agree.
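
A sketch of the computation, assuming the cost is the mean squared error J(w) = (1/2N) Σ (d_i - w x_i)^2 with the bias fixed at zero (the scaling convention is an assumption; it does not affect whether the two results agree):

import numpy as np

X = np.array([-0.5, -0.2, -0.1, 0.3, 0.4, 0.5, 0.7])
D = np.array([-1.0, 1.0, 2.0, 3.2, 3.5, 5.0, 6.0])
w = 2.0

# Direct differentiation of J(w): dJ/dw = -(1/N) * sum_i (d_i - w*x_i) * x_i
grad_direct = -np.mean((D - w * X) * X)

# LMS view: the instantaneous gradient estimate for sample i is
# -(d_i - w*x_i) * x_i; average these estimates over all samples.
grad_lms = np.mean([-(d - w * x) * x for x, d in zip(X, D)])

print(f"direct: {grad_direct:.6f}, averaged LMS: {grad_lms:.6f}")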
Q5. Plot them in input space. Apply the perceptron learning rule to the above samples one at a
time to obtain weights that separate the training samples. Set η to 0.5. Work in the augmented
space with the bias as an additional input element. Use w(0) = (0, 0, 0)^T. Write the expression
for the resulting decision boundary.
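
A sketch of the one-at-a-time perceptron procedure in the augmented space (bias folded in as a constant input of 1). The four sample points below are placeholders, since the actual samples for this question are not reproduced above; substitute the given points.

import numpy as np

# Placeholder samples as (augmented input [x1, x2, 1], desired label ±1).
samples = [(np.array([1.0, 0.5, 1.0]), +1),
           (np.array([2.0, 1.5, 1.0]), +1),
           (np.array([-1.0, -0.5, 1.0]), -1),
           (np.array([-2.0, -1.0, 1.0]), -1)]

eta = 0.5
w = np.zeros(3)                    # w(0) = (0, 0, 0)^T

for epoch in range(100):
    errors = 0
    for x, d in samples:           # present the samples one at a time
        y = 1 if w @ x > 0 else -1
        if y != d:                 # update only on a misclassification
            w = w + eta * d * x
            errors += 1
    if errors == 0:                # converged: every sample is classified
        break

print("final weights:", w)
print(f"decision boundary: {w[0]}*x1 + {w[1]}*x2 + {w[2]} = 0")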
Q6. XOR. For x2, x3 ∈ C1 and x1, x4 ∈ C2, describe your observation when you apply the
perceptron learning rule following the same procedure as in (a).
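
A sketch of the experiment for this question, assuming the usual XOR placement of x1, ..., x4 on the corners of the unit square (an assumption, since the coordinates are not restated here). The expected observation is that the error count never reaches zero and the weights keep cycling:

import numpy as np

# Corners of the unit square; the two diagonals form the two classes.
samples = [(np.array([0.0, 0.0, 1.0]), -1),   # x1 ∈ C2
           (np.array([1.0, 0.0, 1.0]), +1),   # x2 ∈ C1
           (np.array([0.0, 1.0, 1.0]), +1),   # x3 ∈ C1
           (np.array([1.0, 1.0, 1.0]), -1)]   # x4 ∈ C2

eta = 0.5
w = np.zeros(3)

for epoch in range(20):
    errors = 0
    for x, d in samples:
        y = 1 if w @ x > 0 else -1
        if y != d:
            w = w + eta * d * x
            errors += 1
    print(f"epoch {epoch + 1}: {errors} misclassified, w = {w}")
    if errors == 0:
        break
# The loop never reaches zero errors: no line separates the two diagonals.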