
Artificial Intelligence

Mid-II Exam, Spring 2016


Date: April 1, 2016 Max Points: 45 Time: 60 min.

Name: -----------------------------------------------------

Registration No: -------------------------

Section: --------------

Attempt All Questions:

Note: Show All your working on the answer sheet.

Problem 1: Decision Tree Learning [15 Minutes] [5 + 3 + 2 Points]

In a binary classification task we measured the color, shape and size of various objects and prepared the following training data, where the class/label of each object is either + or -.

Color Shape Size Class


Red Square Big +
Blue Square Big +
Red Circle Big +
Red Circle Small -
Green Square Small -
Green Square Big -

It has been decided to use a very simple single-node decision tree (identification tree) classifier to predict the class of an object. Further, it has also been decided that the entropy-based measure of information gain should be used to build the decision tree.

Part a) Build a single node decision tree from the training data provided above.

Part b) Compute the training accuracy of the decision tree built in part a.

Part c) Explain how real-valued features can be used in decision tree learning.
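The entropy-based information gain used in Part a can be sketched as follows (a minimal Python illustration using the training table above; the code itself is not part of the exam):

```python
import math

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    total = len(labels)
    h = 0.0
    for c in set(labels):
        p = labels.count(c) / total
        h -= p * math.log2(p)
    return h

# Training data from the table above: (Color, Shape, Size, Class)
data = [
    ("Red",   "Square", "Big",   "+"),
    ("Blue",  "Square", "Big",   "+"),
    ("Red",   "Circle", "Big",   "+"),
    ("Red",   "Circle", "Small", "-"),
    ("Green", "Square", "Small", "-"),
    ("Green", "Square", "Big",   "-"),
]

def info_gain(data, attr):
    """Information gain of splitting the data on attribute index attr."""
    labels = [row[-1] for row in data]
    remainder = 0.0
    for v in set(row[attr] for row in data):
        subset = [row[-1] for row in data if row[attr] == v]
        remainder += len(subset) / len(data) * entropy(subset)
    return entropy(labels) - remainder

for i, name in enumerate(["Color", "Shape", "Size"]):
    print(name, round(info_gain(data, i), 3))
```

The attribute with the highest information gain becomes the single decision node asked for in Part a.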

Problem 2: Trees and Ensemble Learning [15 Minutes] [5 + 5 Points]

For both parts of this question assume that the following three single-node decision trees (given in the form of if-then-else rules) have been selected.

Tree 1:
    If Size == Small Then Class is +
    Else Class is -

Tree 2:
    If Color == Red Then Class is +
    Else If Color == Green Then Class is -
    Else Class is +

Tree 3:
    If Shape == Square Then Class is +
    Else Class is -

_________________________________________________________________________________________
Department of Computer Science
National University of Computer & Emerging Sciences, Lahore Page 1 of 3
Artificial Intelligence
Mid-II Exam, Spring 2016
Date: April 1, 2016 Max Points: 45 Time: 60 min.
Part a) [Bagging] Compute the training accuracy of the classifier that uses a simple majority vote of the
three classifiers to predict the class of an object.
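The majority-vote ensemble of Part a can be sketched in Python (the three trees and the Problem 1 training table are transcribed below; `majority_vote` is an illustrative helper name):

```python
# The three single-node trees from above, written as Python functions.
def tree1(color, shape, size):
    return "+" if size == "Small" else "-"

def tree2(color, shape, size):
    if color == "Red":
        return "+"
    if color == "Green":
        return "-"
    return "+"

def tree3(color, shape, size):
    return "+" if shape == "Square" else "-"

def majority_vote(example):
    """Predict the class chosen by at least two of the three trees."""
    votes = [t(*example) for t in (tree1, tree2, tree3)]
    return "+" if votes.count("+") >= 2 else "-"

# Training data from Problem 1: (color, shape, size) with true class.
data = [
    (("Red",   "Square", "Big"),   "+"),
    (("Blue",  "Square", "Big"),   "+"),
    (("Red",   "Circle", "Big"),   "+"),
    (("Red",   "Circle", "Small"), "-"),
    (("Green", "Square", "Small"), "-"),
    (("Green", "Square", "Big"),   "-"),
]

accuracy = sum(majority_vote(x) == y for x, y in data) / len(data)
print(accuracy)
```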

Part b) [Boosting] If we assume that the above trees (Tree 1, Tree 2 and Tree 3) have been selected during three iterations of the AdaBoost algorithm respectively, then compute

i. The weight distribution over the training examples during the three iterations

ii. Training accuracy of the resulting ensemble of these three decision trees.
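The AdaBoost weight-distribution update needed for (i) can be sketched generically (a minimal illustration not tied to the specific trees; `adaboost_round` is a hypothetical helper and labels are assumed to be +1/-1):

```python
import math

def adaboost_round(weights, predictions, targets):
    """One AdaBoost iteration: returns the classifier weight alpha and the
    re-normalized example-weight distribution for the next iteration."""
    error = sum(w for w, p, t in zip(weights, predictions, targets) if p != t)
    alpha = 0.5 * math.log((1 - error) / error)
    # increase the weight of misclassified examples, decrease the rest
    new = [w * math.exp(alpha if p != t else -alpha)
           for w, p, t in zip(weights, predictions, targets)]
    z = sum(new)
    return alpha, [w / z for w in new]
```

Starting from the uniform distribution (1/6 per training example), this update is applied once per selected tree, feeding in that tree's predictions on the training set.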

Problem 3: Neural Networks [20 Minutes] [8 + 7 Points]

The following training data has been prepared to learn the weights of a single PERCEPTRON, where X0 represents the bias term and T is the target value.

X0 X1 X2 X3 T
-1 0 0 0 1
-1 0 0 1 1
-1 0 1 0 1
-1 0 1 1 1
-1 1 0 0 1
-1 1 0 1 -1
-1 1 1 0 1
-1 1 1 1 1
Part a)

i. What would be the learned weights of the perceptron if the simple perceptron learning
rule is used to learn the weights? Show all your working.

ii. Use the gradient descent algorithm to derive the learning rule for a single perceptron with a sigmoid activation function.
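The simple perceptron learning rule of (i) can be sketched on the training table above (a minimal illustration; `train_perceptron` is an illustrative name, and the learning rate and epoch limit are arbitrary choices):

```python
def sign(v):
    """Threshold activation with outputs +1 / -1 (sign(0) taken as +1)."""
    return 1 if v >= 0 else -1

# Training data from the table above: rows are [x0, x1, x2, x3]
X = [[-1, 0, 0, 0], [-1, 0, 0, 1], [-1, 0, 1, 0], [-1, 0, 1, 1],
     [-1, 1, 0, 0], [-1, 1, 0, 1], [-1, 1, 1, 0], [-1, 1, 1, 1]]
T = [1, 1, 1, 1, 1, -1, 1, 1]

def train_perceptron(X, T, lr=1.0, max_epochs=1000):
    """Simple perceptron rule: w <- w + lr*(t - o)*x on each mistake."""
    w = [0.0] * len(X[0])
    for _ in range(max_epochs):
        mistakes = 0
        for x, t in zip(X, T):
            o = sign(sum(wi * xi for wi, xi in zip(w, x)))
            if o != t:
                mistakes += 1
                w = [wi + lr * (t - o) * xi for wi, xi in zip(w, x)]
        if mistakes == 0:  # converged: no mistakes in a full pass
            break
    return w

w = train_perceptron(X, T)
```

Since the data is linearly separable (only one target is -1), the rule converges to weights that classify every training row correctly.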

Part b) Consider the feed-forward neural network shown below. The following activation function is used for each neuron:

f(x) = 1 / (1 + e^-x)

The weights of the neurons are as follows [top neuron weights to bottom neuron weights; each weight vector has one entry per input to that neuron]:

Input Neuron Weights    Hidden Neuron Weights    Output Neuron Weights
[0, 1]                  [0.5, -0.5, 0.5]         [1, 0]
[1, 0]                  [-0.5, 0.5, -0.5]        [0, 1]
[1, 1]                                           [-1, 0.5]

i. Compute the final outputs of the network for the example with inputs X = [0 1] and targets T = [0 0 1].
ii. Compute the error for each hidden neuron as computed by the backpropagation algorithm.
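A forward pass and the backpropagation error terms can be sketched as follows. Note that the layer wiring is inferred from the lengths of the weight vectors (three input neurons with two weights each, two hidden neurons with three weights each, three output neurons with two weights each); this is an assumption, since the network figure is not reproduced here:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weight_rows):
    """Fully connected layer: one sigmoid neuron per weight row."""
    return [sigmoid(sum(w * i for w, i in zip(row, inputs))) for row in weight_rows]

# Weights from Part b, top neuron first in each layer.
W_in     = [[0, 1], [1, 0], [1, 1]]
W_hidden = [[0.5, -0.5, 0.5], [-0.5, 0.5, -0.5]]
W_out    = [[1, 0], [0, 1], [-1, 0.5]]

x = [0, 1]
h0 = layer(x,  W_in)       # input-layer activations
h1 = layer(h0, W_hidden)   # hidden-layer activations
y  = layer(h1, W_out)      # final network outputs (part i)

# Backprop error terms (part ii), using squared error and sigmoid units:
# output layer: delta_k = y_k * (1 - y_k) * (t_k - y_k)
T = [0, 0, 1]
delta_out = [yk * (1 - yk) * (tk - yk) for yk, tk in zip(y, T)]
# hidden layer: delta_j = h_j * (1 - h_j) * sum_k w_kj * delta_k
delta_hidden = [hj * (1 - hj) * sum(W_out[k][j] * delta_out[k]
                                    for k in range(len(W_out)))
                for j, hj in enumerate(h1)]
```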

Problem 4: Genetic Algorithms [15 Minutes] [2 + 2 + 2 + 4 Points]

A budget airline company operates 3 planes and employs 5 cabin crews. Only one crew can operate any given plane on a single day, and no crew can work for more than two days in a row. The company uses all planes every day and wants to divide the work equally among the crew members.

The AI class at FAST-NU suggested using Genetic Algorithms to work out the best weekly schedule for the company.

i. Suggest a suitable chromosome representation (sequence of genes) for this problem.


ii. Suggest a fitness function for this problem.
iii. How many candidate solutions does this problem have?
iv. Is it necessary to use Genetic Algorithms to solve this problem? What if the company operated 100 planes and employed 150 crews?

To solve an important confidential problem using genetic algorithms the following initial population
of chromosomes has been generated at random with each chromosome represented as a sequence of
nine bits.

For this initial population, determine the next generation/population of chromosomes that results
after one iteration. Assume that two selected chromosomes survive into the next generation as is
while the remaining chromosomes are created using single point crossover operation. Also assume
that fitness-proportionate selection is being used to select chromosomes whenever needed and that
the Mutation rate is set to 0.1.

If you need random numbers, choose in order from the following array/list row wise and repeat from
beginning if you need more.
0.69 0.67 0.55 0.64 0.38 0.41 0.79 0.12 0.03 0.96 0.88
0.18 0.30 0.67 0.84 0.76 0.41 0.59 0.07 0.21 0.98 0.46
0.69 0.44 0.60 0.91 0.13 0.63 0.73 0.21 0.38 0.28 0.82
0.29 0.28 0.07 0.55 0.26 0.06 0.42 0.67 0.14 0.76 0.79
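The operators named in the question — fitness-proportionate (roulette-wheel) selection, single-point crossover, and bit-flip mutation — can be sketched as follows. The function names are illustrative, and `roulette_select` takes the random number as an argument so that the supplied list above can be consumed in order:

```python
import random

def roulette_select(population, fitnesses, r):
    """Fitness-proportionate selection driven by a random number r in [0, 1)."""
    total = sum(fitnesses)
    acc = 0.0
    for chrom, f in zip(population, fitnesses):
        acc += f / total
        if r < acc:
            return chrom
    return population[-1]

def single_point_crossover(a, b, point):
    """Swap the tails of two chromosomes after the given crossover point."""
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(chrom, rate, rand=random.random):
    """Flip each bit independently with the given mutation probability."""
    return [(1 - g) if rand() < rate else g for g in chrom]
```

With nine-bit chromosomes represented as lists of 0/1 genes, the random numbers above drive the selection decisions, the choice of crossover point, and the per-gene mutation tests.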
Registration No.___________________________

National University of Computer and Emerging Sciences, Lahore Campus


Course: Artificial Intelligence Course Code: CS 401
Program: BS (Computer Science) Semester: Fall 2016
Duration: 20 Minutes Total Marks: 100
Paper Date: 26-Oct-16 Weight 2%
Section: N/A Page(s): 2
Exam: Quiz 2

Problem [Topics (Perceptrons)]

Part a) [5 Points]

A perceptron, a model of a neuron in the human nervous system, is shown in the figure below. As discussed in class, such a unit can easily be used to represent several different functions. In this question your job is to find a representation of the two-input Boolean function NAND using the perceptron learning rule.

For this purpose you must create four training examples to specify the output for each possible
input. [2 Points]

Use the four training examples to learn weights of the required perceptron. [3 Points]

Remember the perceptron you will design/learn must have a bias term x0 set equal to 1.
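The requested procedure can be sketched as follows, assuming a step (threshold) activation with outputs 1/0 and the standard perceptron rule w <- w + lr*(t - o)*x; the learning rate and epoch count are arbitrary choices:

```python
def threshold(v):
    """Step activation: output 1 if the net input is non-negative, else 0."""
    return 1 if v >= 0 else 0

# Four training examples for two-input NAND; x0 = 1 is the bias input.
examples = [([1, 0, 0], 1), ([1, 0, 1], 1), ([1, 1, 0], 1), ([1, 1, 1], 0)]

def train(examples, lr=1.0, epochs=50):
    """Perceptron learning rule starting from zero weights."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, t in examples:
            o = threshold(sum(wi * xi for wi, xi in zip(w, x)))
            w = [wi + lr * (t - o) * xi for wi, xi in zip(w, x)]
    return w

w = train(examples)
```

NAND is linearly separable, so the rule converges to weights that reproduce the full truth table.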




Part b) [Perceptron Learning Using Gradient Descent] [5 Points]

As discussed in class, the weights of a perceptron can be learned from training examples, where each training example (Xi, Yi) consists of an input vector Xi and the corresponding output Yi. We also discussed a method called gradient descent that can be used to learn the weights of a given perceptron from training examples. In this part your job is to derive a similar learning rule for a single perceptron that uses the sigmoid function instead of a threshold to compute its final output.

For this purpose you must, first of all, define an error function to be used for deriving the weight
update rule. [2 Points]

Then find the derivative of the error function to obtain the weight update rule. [3 Points]
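For reference, the standard derivation with a squared-error function (one common choice; a different error function may equally be used) proceeds as:

```latex
E(\mathbf{w}) = \tfrac{1}{2}\sum_i \bigl(Y_i - o_i\bigr)^2,
\qquad o_i = \sigma(\mathbf{w}\cdot X_i),
\qquad \sigma(z) = \frac{1}{1+e^{-z}}.

Since $\sigma'(z) = \sigma(z)\,\bigl(1-\sigma(z)\bigr)$,

\frac{\partial E}{\partial w_j}
  = -\sum_i (Y_i - o_i)\, o_i\, (1 - o_i)\, x_{ij},

so the gradient-descent update with learning rate $\eta$ is

\Delta w_j = \eta \sum_i (Y_i - o_i)\, o_i\, (1 - o_i)\, x_{ij}.
```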




National University of Computer and Emerging Sciences, Lahore Campus


Course: Artificial Intelligence Course Code: CS 401
Program: BS (Computer Science) Semester: Fall 2016
Duration: 20 Minutes Total Marks: 100
Paper Date: 26-Oct-16 Weight 2%
Section: N/A Page(s): 2
Exam: Quiz 2

Problem [Simple Threshold Unit (Perceptrons)]

Part a) [5 Points]

A perceptron, a model of a neuron in the human nervous system, is shown in the figure below. As discussed in class, such a unit can easily be used to represent several different functions. In this question your job is to find a representation of the two-input Boolean function NOR using the perceptron learning rule.

For this purpose you must create four training examples to specify the output for each possible
input. [2 Points]

Use the four training examples to learn weights of the required perceptron. [3 Points]

Remember the perceptron you will design/learn must have a bias term x0 set equal to 1.




Part b) [Perceptron Learning Using Gradient Descent] [5 Points]

As discussed in class, the weights of a perceptron can be learned from training examples, where each training example (Xi, Yi) consists of an input vector Xi and the corresponding output Yi. We also discussed a method called gradient descent that can be used to learn the weights of a given perceptron from training examples. In this part your job is to derive a similar learning rule for a single perceptron that uses the sigmoid function instead of a threshold to compute its final output.

For this purpose you must, first of all, define an error function to be used for deriving the weight
update rule. [2 Points]

Then find the derivative of the error function to obtain the weight update rule. [3 Points]

