
Course: CSC14003 – Introduction to Artificial Intelligence

Class 21CLC – Term II/2022-2023

Homework 04

Submission Notices:
● Conduct your homework by filling answers into the placeholders in this file (in Microsoft Word format).
Questions are shown in black color, instructions/hints are shown in italics and blue color, and your content
should use any color that is different from those.
● After completing your homework, prepare the file for submission by exporting the Word file (filled with
answers) to a PDF file, whose filename follows the following format,
<StudentID-1>_<StudentID-2>_HW04.pdf (Student IDs are sorted in ascending order)
E.g., 2112001_2112002_HW04.pdf
and then submit the file to Moodle directly WITHOUT any kinds of compression (.zip, .rar, .tar, etc.).
● Note that you will get zero credit for any careless mistake, including, but not limited to, the following
things.
1. Wrong file/filename format, e.g., not a PDF file, using “-” instead of “_” as the separator, etc.
2. Problems and answers presented out of order
3. Answers not written in English
4. Cheating, i.e., copying other students’ works or letting other students copy your work.

Problem 1. (2pts) Identify each of the following activation functions.

# [Figures: eight 2-D plots of activation functions, labeled a through h]
Please fill your answer in the table below

# | a | b | c | d
Function name | Sigmoid or Logistic Activation Function | Tanh or Hyperbolic Tangent Activation Function | Rectified Linear Unit (ReLU) Activation Function | Binary Step Activation Function

# | e | f | g | h
Function name | Symmetric Hard-Limit Transfer Activation Function | Leaky Rectified Linear Unit (Leaky ReLU) Activation Function | Linear or Identity Activation Function | Gaussian Activation Function
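As a quick sanity check (my own sketch, not part of the required answer), the eight functions named above can be written in plain Python:

```python
import math

def sigmoid(z):            # logistic: squashes to (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):               # hyperbolic tangent: squashes to (-1, 1)
    return math.tanh(z)

def relu(z):               # rectified linear unit: max(0, z)
    return max(0.0, z)

def binary_step(z):        # 0 below the threshold, 1 at/above it
    return 1.0 if z >= 0 else 0.0

def hardlims(z):           # symmetric hard limit: -1 or +1
    return 1.0 if z >= 0 else -1.0

def leaky_relu(z, a=0.01): # small slope a for negative inputs
    return z if z >= 0 else a * z

def linear(z):             # identity
    return z

def gaussian(z):           # bell curve peaked at z = 0
    return math.exp(-z * z)
```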

Problem 2. (1pt) Present two objective metrics that can be used to evaluate the attributes for a node on
the decision tree. For each metric, present the formula, identify its domain (i.e., range of values), and
explain every term in the formula.

Please fill your answer in the table below


Metric name Formula Explanation

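Entropy and the Gini index are two common choices here. A minimal sketch of both impurity measures (illustrative only, not a substitute for the written formula, domain, and term explanations the problem asks for):

```python
import math
from collections import Counter

def entropy(labels):
    """H(S) = -sum_i p_i * log2(p_i); domain [0, log2(k)] for k classes."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini(S) = 1 - sum_i p_i^2; domain [0, 1 - 1/k] for k classes."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())
```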
Problem 3. (2pts) You are given the following tables, which represent the outcomes of some functions.
The functions take two values x and y and output the outcomes of the operations. Please identify at
least two models for each function that perfectly represent it for some choice of parameters. Justify
your answer. Note: there are no constraints on the architecture (e.g., the number of neurons, activation
function, or the best splitting criterion), and decision-tree depth is 0-indexed.

a) (1pt) f(x, y) = x ⊕ y

x | y | x ⊕ y
0 | 0 | 0
0 | 1 | 1
1 | 0 | 1
1 | 1 | 0

 A neural network with no hidden layer


 A neural network with a single hidden layer
 A decision tree of depth one
 A decision tree of depth two

Explanation:

First model: A neural network with a single hidden layer.

 This is the truth table of the logical XOR operation. It is impossible to implement the
EXCLUSIVE-OR function y = x1 ⊕ x2 with a single unit, since the data is not linearly separable.

In addition, the XOR operation can be written in terms of AND, OR, and NOT:

x XOR y = (¬(x ∧ y)) ∧ (x ∨ y), which means we must compose multiple logical operations by using a
hidden layer to represent the XOR function. Therefore, we choose the neural network with a single
hidden layer.

Below is the model of a multi-layer neural network:

To make an XOR gate, node h1 performs the ¬(x ∧ y) (NAND) operation, node h2 performs the
(x ∨ y) (OR) operation, and node z performs the (h1 ∧ h2) (AND) operation.
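This construction can be checked numerically. A minimal sketch with a binary-step activation and hand-picked weights (the specific weight values are my own illustrative choice, not from the assignment):

```python
def step(z):
    """Binary step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def xor_net(x, y):
    h1 = step(-x - y + 1.5)     # NAND: fires unless both inputs are 1
    h2 = step(x + y - 0.5)      # OR: fires if at least one input is 1
    return step(h1 + h2 - 1.5)  # AND of h1 and h2

# Verify against the truth table.
for x in (0, 1):
    for y in (0, 1):
        assert xor_net(x, y) == (x ^ y)
```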

Second model: A decision tree of depth two.

 At the root we split on x and at depth one we split on y, giving two options (0 and 1) at each
decision node and four leaves in total, one for each row of the truth table.

Below is the decision tree representing the XOR function:
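The same tree can be written as nested conditionals (an illustrative sketch mirroring the drawing above; with 0-indexed depth, the root is at depth 0 and the leaves at depth 2):

```python
def xor_tree(x, y):
    # Root (depth 0) splits on x; the two depth-1 nodes split on y.
    if x == 0:
        return 0 if y == 0 else 1
    else:
        return 1 if y == 0 else 0

for x in (0, 1):
    for y in (0, 1):
        assert xor_tree(x, y) == (x ^ y)
```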

b) (1pt) f(x, y) = ¬(x ∨ y)

x | y | ¬(x ∨ y)
0 | 0 | 1
0 | 1 | 0
1 | 0 | 0
1 | 1 | 0

 A neural network with no hidden layer
 A neural network with a single hidden layer
 A decision tree of depth one
 A decision tree of depth two

Explanation:

First model: A neural network with no hidden layer.

 The data points formed by the NOR function are (0,0):1, (0,1):0, (1,0):0, and (1,1):0. These are
linearly separable: there are infinitely many lines dividing the plane into two regions such that (0,0)
lies in the region with output 1 and the other three points lie in the region with output 0. Therefore,
we can use one perceptron to represent the NOR function.

Below is the perceptron that performs the f(x, y) = ¬(x ∨ y) function:
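A quick numeric check with one illustrative choice of separating line (weights w1 = w2 = -1 and bias 0.5 are my own; any line separating (0,0) from the other three points works):

```python
def step(z):
    """Binary step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def nor_perceptron(x, y):
    # Single unit: fires only when both inputs are 0.
    return step(-1 * x - 1 * y + 0.5)

for x in (0, 1):
    for y in (0, 1):
        assert nor_perceptron(x, y) == (0 if (x or y) else 1)
```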

Second model: A decision tree of depth two.

 ¬(x ∨ y) can be rewritten as ¬x ∧ ¬y. At the root decision node there are two options, 0 and 1.
If x takes value 1, then ¬x is 0 and the output is always 0 regardless of the value of y. If x takes
value 0, the output depends on the value of y, so we add a decision node on y.

Below is the decision tree representing the f(x, y) = ¬(x ∨ y) function:

Problem 4. (2pts) Consider the following training dataset, in which Transportation is the target
attribute. Show calculations to choose an attribute for the root node of the ID3 decision tree.

Gender | Car Ownership | Travel Cost | Income Level | Transportation
Male | 0 | Cheap | Low | Bus
Male | 1 | Cheap | Medium | Bus
Female | 1 | Cheap | Medium | Train
Female | 0 | Cheap | Low | Bus
Male | 1 | Cheap | Medium | Bus
Male | 0 | Standard | Medium | Train
Female | 1 | Standard | Medium | Train
Female | 1 | Expensive | High | Car
Male | 2 | Expensive | Medium | Car
Female | 2 | Expensive | High | Car

Please fill your answer in the table below (Bus, Car, and Train are class counts; H, AE, and IG are the metric values)

Attribute | Values | Bus | Car | Train | H | AE | IG
 | Whole | | | | | |
Gender (0.5pt) | Female | | | | | |
 | Male | | | | | |
Car Ownership (0.5pt) | 0 | | | | | |
 | 1 | | | | | |
 | 2 | | | | | |
Travel Cost (0.5pt) | Cheap | | | | | |
 | Expensive | | | | | |
 | Standard | | | | | |
Income Level (0.5pt) | Low | | | | | |
 | Medium | | | | | |
 | High | | | | | |
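The hand calculations can be cross-checked with a short script (my own helper code, not required by the assignment); it computes the entropy H of the whole set, the weighted average entropy AE of each split, and the information gain IG = H - AE:

```python
import math
from collections import Counter

# (Gender, Car Ownership, Travel Cost, Income Level, Transportation)
rows = [
    ("Male", 0, "Cheap", "Low", "Bus"),
    ("Male", 1, "Cheap", "Medium", "Bus"),
    ("Female", 1, "Cheap", "Medium", "Train"),
    ("Female", 0, "Cheap", "Low", "Bus"),
    ("Male", 1, "Cheap", "Medium", "Bus"),
    ("Male", 0, "Standard", "Medium", "Train"),
    ("Female", 1, "Standard", "Medium", "Train"),
    ("Female", 1, "Expensive", "High", "Car"),
    ("Male", 2, "Expensive", "Medium", "Car"),
    ("Female", 2, "Expensive", "High", "Car"),
]

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(col):
    """IG(S, A) = H(S) - sum_v (|S_v| / |S|) * H(S_v)."""
    target = [r[-1] for r in rows]
    ae = 0.0
    for v in set(r[col] for r in rows):
        subset = [r[-1] for r in rows if r[col] == v]
        ae += len(subset) / len(rows) * entropy(subset)
    return entropy(target) - ae

gains = {name: info_gain(i) for i, name in
         enumerate(["Gender", "Car Ownership", "Travel Cost", "Income Level"])}
```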

Problem 5. (3pts) Consider the following neural network, which includes 3 input neurons, 2 hidden
neurons, and 1 output neuron.

Initial input, weight, and bias values are

x1 | x2 | x3 | w14 | w15 | w24 | w25 | w34 | w35 | w46 | w56 | θ4 | θ5 | θ6
1 | 0 | 1 | 0.2 | –0.3 | 0.4 | 0.1 | –0.5 | 0.2 | –0.3 | –0.2 | –0.4 | 0.2 | 0.1

The expected output value is 1. The learning rate is 0.9.


Knowing that the actual output at some neuron j is calculated as follows:

y_j(p) = sigmoid( Σ_{i=1}^{n} x_i(p) × w_ij(p) + θ_j )

where n is the number of inputs of neuron j, w_ij is the weight of the link from neuron i in the previous
layer to neuron j, and θ_j is the bias at neuron j.

Present all calculations required to perform the backpropagation once (i.e., one forward pass and one
backward pass) on the given neural network in the following cases.

a) Ignore all biases (precision to 3 decimal places).

(0.25pt) Ignore all biases – Forward

Neuron | 4 | 5 | 6
Output | 0.426 | 0.475 | 0.445

(1pt) Ignore all biases – Backward

Weight | w46 | w56 | w14 | w15 | w24 | w25 | w34 | w35
Value | -0.247 | -0.141 | 0.191 | -0.306 | 0.4 | 0.1 | -0.509 | 0.194
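These numbers can be reproduced with a short script (my own sketch of standard backpropagation with the sigmoid derivative y(1 − y); the dictionary-based variable names are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x1, x2, x3 = 1, 0, 1
w = {"14": 0.2, "15": -0.3, "24": 0.4, "25": 0.1,
     "34": -0.5, "35": 0.2, "46": -0.3, "56": -0.2}
target, lr = 1.0, 0.9

# Forward pass (biases ignored).
y4 = sigmoid(x1 * w["14"] + x2 * w["24"] + x3 * w["34"])
y5 = sigmoid(x1 * w["15"] + x2 * w["25"] + x3 * w["35"])
y6 = sigmoid(y4 * w["46"] + y5 * w["56"])

# Backward pass: compute all error gradients first (using the OLD weight
# w46/w56 when propagating delta6 back), then update the weights.
d6 = y6 * (1 - y6) * (target - y6)
d4 = y4 * (1 - y4) * d6 * w["46"]
d5 = y5 * (1 - y5) * d6 * w["56"]

w["46"] += lr * d6 * y4
w["56"] += lr * d6 * y5
for name, xi, d in [("14", x1, d4), ("24", x2, d4), ("34", x3, d4),
                    ("15", x1, d5), ("25", x2, d5), ("35", x3, d5)]:
    w[name] += lr * d * xi
```

Note that w24 and w25 are unchanged because their input x2 = 0 makes their gradients zero.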

b) Consider all biases, such that each bias is treated as a neuron and is therefore also updated
(precision to 3 decimal places).

(0.25pt) Consider all biases – Forward

Neuron | 4 | 5 | 6
Output | | |

(1.5pt) Consider all biases – Backward

Weight | w46 | w56 | w14 | w15 | w24 | w25 | w34 | w35 | θ4 | θ5 | θ6
Value | | | | | | | | | | |
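The same sketch extends to part b by adding the bias θ_j at each neuron and updating each bias like a weight on a constant input of 1 (again my own illustrative code; run it to obtain values to compare with your own hand calculation):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x1, x2, x3 = 1, 0, 1
w = {"14": 0.2, "15": -0.3, "24": 0.4, "25": 0.1,
     "34": -0.5, "35": 0.2, "46": -0.3, "56": -0.2}
theta = {"4": -0.4, "5": 0.2, "6": 0.1}
target, lr = 1.0, 0.9

# Forward pass, now adding the bias theta_j at each neuron.
y4 = sigmoid(x1 * w["14"] + x2 * w["24"] + x3 * w["34"] + theta["4"])
y5 = sigmoid(x1 * w["15"] + x2 * w["25"] + x3 * w["35"] + theta["5"])
y6 = sigmoid(y4 * w["46"] + y5 * w["56"] + theta["6"])

# Backward pass; gradients use the OLD w46/w56 values.
d6 = y6 * (1 - y6) * (target - y6)
d4 = y4 * (1 - y4) * d6 * w["46"]
d5 = y5 * (1 - y5) * d6 * w["56"]

w["46"] += lr * d6 * y4
w["56"] += lr * d6 * y5
for name, xi, d in [("14", x1, d4), ("24", x2, d4), ("34", x3, d4),
                    ("15", x1, d5), ("25", x2, d5), ("35", x3, d5)]:
    w[name] += lr * d * xi
# Each bias behaves like a weight whose input is always 1.
theta["4"] += lr * d4
theta["5"] += lr * d5
theta["6"] += lr * d6
```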
