
UNIT I – ARTIFICIAL NEURAL NETWORK – ASSIGNMENT 1

1. Which function is commonly used as the activation function in the output layer of i) a binary
classification network, ii) a multi-class classification network, and iii) a regression neural network?
2. Assume that there are L input nodes plus a bias, M hidden nodes plus a bias, and N output
nodes. How many weights are there i) between the input layer and the hidden layer and ii) between
the hidden layer and the output layer?
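
One way to check the counting in question 2 (a sketch, not part of the original hand-out, with each bias treated as one extra node on its side):

\[
\#W_{\text{input}\to\text{hidden}} = (L+1)\,M,
\qquad
\#W_{\text{hidden}\to\text{output}} = (M+1)\,N .
\]
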
3. Consider a neuron with 2 inputs, 1 output, and a threshold activation function. If the two weights
are w1 = 1 and w2 = 1, and the bias is b = −1.5, then what is the output for input (0,0)? What about
for inputs (1,0), (0,1), and (1,1)? Draw the discriminant function for this neuron and write down
its equation. Does it correspond to any particular logic gate?
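
A minimal sketch for checking question 3 by hand, assuming the unit fires (outputs 1) when the weighted sum plus bias reaches 0:

# Sketch for question 3: threshold unit with w1 = w2 = 1, b = -1.5.
# Assumption: output 1 when w1*x1 + w2*x2 + b >= 0, otherwise 0.
w1, w2, b = 1.0, 1.0, -1.5

def threshold_unit(x1, x2):
    v = w1 * x1 + w2 * x2 + b      # weighted sum plus bias
    return 1 if v >= 0 else 0      # hard threshold at 0

for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print((x1, x2), "->", threshold_unit(x1, x2))
# Only (1,1) produces 1, so the unit behaves like an AND gate; the discriminant
# is the line x1 + x2 = 1.5.
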
4. Work out the perceptrons that compute the logical OR and the logical AND of their inputs.
5. Work out the perceptrons that compute the logical NOT, NAND, and NOR of their inputs.
6. Below is a diagram of a single artificial neuron (unit):

The node has three inputs x = (x1, x2, x3) that receive only binary signals (either 0 or 1). How many
different input patterns can this node receive? What if the node had four inputs? Five? Can you
give a formula that computes the number of binary input patterns for a given number of inputs?
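
For question 6, the count follows from the fact that each input independently takes one of two values; a quick check:

# Sketch for question 6: a node with n binary inputs can receive 2**n distinct patterns.
for n in (3, 4, 5):
    print(n, "inputs ->", 2 ** n, "binary input patterns")
# General formula: 2**n patterns for n binary inputs.
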
7. Consider the unit shown in the previous figure. Suppose that the weights corresponding to the three
inputs have the following values:

and the activation of the unit is given by the step function:

Calculate the output value y of the unit for each of the following input patterns:

8. Logical operators (e.g. NOT, AND, OR, XOR) are the building blocks of any computational
device. Logical functions return only two possible values, true or false, based on the truth values
of their arguments. For example, the operator AND returns true only when all of its arguments are
true; otherwise (if any of the arguments is false) it returns false. If we denote true by 1 and false by
0, then the logical function AND can be represented by the following table:

This function can be implemented by a single unit with two inputs:

if the weights are w1 = 1 and w2 = 1 and the activation function is:

Note that the threshold level is 2 (v ≥ 2).


a) Test how the neural AND function works.
b) Suggest how to change either the weights or the threshold level of this single unit in
order to implement the logical OR function (true when at least one of the arguments
is true).
c) Do you think it is possible to implement the XOR function using a single unit? What
about a network of several units?
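
A minimal sketch for parts (a) and (b) of question 8, assuming the step activation outputs 1 when the weighted sum v reaches the threshold:

# Sketch for question 8: single unit with w1 = w2 = 1 and a step activation
# that outputs 1 when v >= theta, otherwise 0.
def unit(x1, x2, theta):
    v = 1 * x1 + 1 * x2
    return 1 if v >= theta else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print("AND (theta = 2):", [unit(x1, x2, 2) for x1, x2 in inputs])
# (b) Lowering the threshold to 1 while keeping the weights turns the unit into OR:
print("OR  (theta = 1):", [unit(x1, x2, 1) for x1, x2 in inputs])
# (c) XOR is not linearly separable, so no single threshold unit can realise it,
# but a small network (e.g. one hidden layer of two units) can.
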
9. The following diagram represents a feedforward neural network with one hidden layer:

A weight on the connection between nodes i and j is denoted by wij; for example, w13 is the weight on the
connection between nodes 1 and 3. The following table lists all the weights in the network:

Each of the nodes 3, 4, 5 and 6 uses the following activation function:

where v denotes the weighted sum of a node. Each of the input nodes (1 and 2) can only receive
binary values (either 0 or 1). Calculate the output of the network (y5 and y6) for each of the input
patterns:
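
The weight table and activation function for question 9 are not reproduced above, so the sketch below only illustrates the shape of the forward-pass computation; every numeric weight is a placeholder and a step activation is assumed:

# Sketch for question 9: forward pass through the 2-input, 2-hidden, 2-output
# network. w[(i, j)] stands for w_ij, the weight from node i to node j; all
# values here are hypothetical placeholders for the assignment's table.
# Assumption: each of nodes 3-6 applies a step activation, 1 if v >= 0 else 0.
w = {(1, 3): 0.5, (2, 3): -0.5,   # into hidden node 3 (hypothetical)
     (1, 4): -0.5, (2, 4): 0.5,   # into hidden node 4 (hypothetical)
     (3, 5): 1.0, (4, 5): 1.0,    # into output node 5 (hypothetical)
     (3, 6): -1.0, (4, 6): 1.0}   # into output node 6 (hypothetical)

def step(v):
    return 1 if v >= 0 else 0

def forward(x1, x2):
    y3 = step(w[(1, 3)] * x1 + w[(2, 3)] * x2)
    y4 = step(w[(1, 4)] * x1 + w[(2, 4)] * x2)
    y5 = step(w[(3, 5)] * y3 + w[(4, 5)] * y4)
    y6 = step(w[(3, 6)] * y3 + w[(4, 6)] * y4)
    return y5, y6

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", forward(x1, x2))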

10. Assuming that the neurons have a sigmoid activation function, perform a forward pass and a
backward pass on the network. Assume that the actual output of y is 5 and the learning rate is 1.

11. Assuming that the neurons have a sigmoid activation function, perform a forward pass and a
backward pass on the network. Assume that the actual output of y is 1 and the learning rate is 0.9.
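
Questions 10 and 11 refer to network diagrams that are not reproduced here; the sketch below only illustrates one forward and one backward pass on a generic 2-2-1 sigmoid network, with placeholder inputs, weights, target t and learning rate eta to be replaced by the values from the diagrams:

import math

# Sketch for questions 10 and 11: one forward pass and one backward pass on a
# small 2-2-1 sigmoid network trained by gradient descent on squared error.
def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

x = [0.35, 0.9]                       # hypothetical inputs
w_hid = [[0.1, 0.8], [0.4, 0.6]]      # w_hid[j][i]: input i -> hidden unit j (hypothetical)
w_out = [0.3, 0.9]                    # hidden unit j -> output unit (hypothetical)
t, eta = 0.5, 1.0                     # target output and learning rate (placeholders)

# Forward pass
h = [sigmoid(sum(w_hid[j][i] * x[i] for i in range(2))) for j in range(2)]
y = sigmoid(sum(w_out[j] * h[j] for j in range(2)))

# Backward pass (delta rule)
delta_out = (t - y) * y * (1 - y)                              # output-unit error term
delta_hid = [h[j] * (1 - h[j]) * delta_out * w_out[j]          # hidden-unit error terms
             for j in range(2)]
w_out = [w_out[j] + eta * delta_out * h[j] for j in range(2)]  # update hidden->output weights
for j in range(2):
    for i in range(2):
        w_hid[j][i] += eta * delta_hid[j] * x[i]               # update input->hidden weights

print("forward output y =", round(y, 4))
print("updated hidden->output weights:", [round(v, 4) for v in w_out])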

12. Consider the XOR Boolean function, which has four patterns (0,0), (0,1), (1,0) and (1,1) in a 2-D input space,
with radius r = 1.414. Construct an RBFNN as shown in the figure that classifies the input patterns.
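
A minimal sketch of the RBF feature mapping behind question 12, assuming Gaussian units centred at (0,0) and (1,1) with the stated spread r = 1.414:

import math

# Sketch for question 12: map the four XOR patterns through two Gaussian RBF
# units centred at (0,0) and (1,1); in this feature space the two XOR classes
# become linearly separable.
centres = [(0.0, 0.0), (1.0, 1.0)]
r = 1.414

def rbf(x, c):
    d2 = (x[0] - c[0]) ** 2 + (x[1] - c[1]) ** 2   # squared distance to the centre
    return math.exp(-d2 / (r ** 2))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", [round(rbf(x, c), 3) for c in centres])
# (0,1) and (1,0) map to the same feature point, while (0,0) and (1,1) sit at
# the two ends, so a single linear output unit can now separate XOR.
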

13. Consider an ART-1 network with 5 input units and 3 cluster units. After some training the network attains the
bottom-up and top-down weight matrices shown below.

Show the behavior of the network if it is presented with the training pattern S = [0, 1, 1, 1, 1]. Assume
L = 2 and vigilance ρ = 0.8.
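
The trained weight matrices for question 13 are not reproduced above, so the sketch below only illustrates the ART-1 search/vigilance cycle with placeholder matrices; S, L = 2 and ρ = 0.8 follow the question:

# Sketch of the ART-1 search/vigilance cycle for question 13. The bottom-up (b)
# and top-down (t) matrices below are hypothetical placeholders.
S = [0, 1, 1, 1, 1]
L, rho = 2, 0.8

b = [[0.0, 0.4, 0.4, 0.4, 0.0],     # b[j][i]: bottom-up weight, input i -> cluster j (hypothetical)
     [0.2, 0.2, 0.2, 0.2, 0.2],
     [0.67, 0.0, 0.0, 0.0, 0.67]]
t = [[0, 1, 1, 1, 0],               # t[j][i]: top-down template of cluster j (hypothetical)
     [1, 1, 1, 1, 1],
     [1, 0, 0, 0, 1]]

norm_S = sum(S)
inhibited = set()
while True:
    # 1. Bottom-up competition among clusters that have not been reset.
    scores = {j: sum(b[j][i] * S[i] for i in range(5))
              for j in range(3) if j not in inhibited}
    if not scores:
        print("no cluster passes the vigilance test; a new cluster would be needed")
        break
    J = max(scores, key=scores.get)                    # winning cluster unit
    # 2. Vigilance test: |S AND t_J| / |S| >= rho ?
    match = [S[i] * t[J][i] for i in range(5)]
    if sum(match) / norm_S >= rho:
        # 3. Resonance: fast-learning update of the winner's weights.
        t[J] = match
        b[J] = [L * match[i] / (L - 1 + sum(match)) for i in range(5)]
        print("pattern accepted by cluster", J)
        break
    inhibited.add(J)                                   # reset the winner and search again
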
14. Construct an ART-1 network for clustering 4 input vectors into three clusters with a low vigilance parameter
of 0.4. The four input vectors are [0 0 0 1], [0 1 0 1], [0 0 1 1] and [1 0 0 0]. Assume the learning parameter
L = 2 and the top-down weights tji(0) = 1.
