IML Major

The document outlines various topics related to neural networks, including an overview of feedforward networks, the modeling of non-linear decision boundaries, and the learning process in neural networks. It also discusses convolutional neural networks (CNNs), ensemble learning techniques, data clustering methods, time-series modeling, dimensionality reduction, and support vector machines (SVM). Each section includes specific questions and tasks related to these concepts.

Uploaded by

atanu29th

Questions:

1. Neural Networks [8]

(a) Provide an overview of a 3-layer feedforward neural network with the figure and the net's equations. [4]
(b) Explain how multi-layer neural networks can model non-linear decision boundaries while the basic building block, the perceptron (/neuron), is just a linear classifier. [4]
(c) Briefly explain how learning is performed in neural networks. [3+2]
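A minimal sketch for parts (b) and (c), not an answer key: a 2-layer network trained by plain gradient descent (backpropagation) on the XOR labels, which no single linear perceptron can separate. The hidden-layer width, learning rate, and iteration count below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])          # XOR: not linearly separable

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden layer
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(p):
    return float(((p - y) ** 2).mean())

loss_before = mse(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2))

for _ in range(5000):                  # gradient descent via backpropagation
    h = sigmoid(X @ W1 + b1)           # forward pass: hidden activations
    p = sigmoid(h @ W2 + b2)           # forward pass: output
    dp = (p - y) * p * (1 - p)         # backward: output delta (constants folded into lr)
    dh = (dp @ W2.T) * h * (1 - h)     # backward: hidden delta via chain rule
    W2 -= 0.5 * h.T @ dp;  b2 -= 0.5 * dp.sum(axis=0)
    W1 -= 0.5 * X.T @ dh;  b1 -= 0.5 * dh.sum(axis=0)

loss_after = mse(p)
print(loss_before, "->", loss_after)   # training error drops
```

The single perceptron corresponds to dropping the hidden layer; with it gone, no weight setting can fit XOR, which is the point of part (b).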
2. CNN

(a) Provide brief description of the Convolution operation with the formula for 1D convolution of data. [4]
(b) Explain how CNNs improved upon neural networks while training for high dimensional data such as images. [2]
(c) The input image to the CNN is of size 29 × 29. It is convoluted with the kernel/filter of size 5 × 5 with a stride of 1 and zero padding of 1 pixel on each side of the image. Compute the dimension of the output feature embedding. [3]
(d) Briefly explain the concept of vanishing gradients in deep learning.
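Part (c) follows from the standard output-size formula for a convolution, sketched below (the function name is mine, not from the paper):

```python
def conv_output_size(n, k, s=1, p=0):
    """Output spatial size of a convolution along one axis:
    floor((n + 2p - k) / s) + 1, for input size n, kernel size k,
    stride s, and zero padding p on each side."""
    return (n + 2 * p - k) // s + 1

# Question 2(c): 29x29 input, 5x5 kernel, stride 1, zero padding 1
side = conv_output_size(29, 5, s=1, p=1)
print(side, "x", side)   # (29 + 2 - 5)/1 + 1 = 27, so 27 x 27
```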

3. Ensemble Learning

(a) Briefly explain 3 key problems addressed by ensemble learning. [3]
(b) Give short overview of at least two ensemble learning techniques. [4]

[P. T. O.]
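One of the techniques asked for in (b) can be sketched as follows: bagging (bootstrap aggregating) trains T copies of a weak learner on bootstrap resamples and combines them by majority vote. The decision-stump learner and the toy data here are illustrative choices, not part of the question.

```python
import numpy as np

def fit_stump(X, y):
    """Pick the (feature, threshold) split minimising training error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            err = ((X[:, j] >= t).astype(int) != y).mean()
            if best is None or err < best[0]:
                best = (err, j, t)
    _, j, t = best
    return lambda Z, j=j, t=t: (Z[:, j] >= t).astype(int)

def bagging(X, y, T=11, seed=0):
    rng = np.random.default_rng(seed)
    stumps = []
    for _ in range(T):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap resample
        stumps.append(fit_stump(X[idx], y[idx]))
    # combine the T stumps by majority vote
    return lambda Z: (np.mean([s(Z) for s in stumps], axis=0) >= 0.5).astype(int)

X = np.array([[0.1], [0.4], [0.6], [0.9]])
y = np.array([0, 0, 1, 1])
predict = bagging(X, y)
print(predict(X))
```

Averaging over resamples reduces the variance of the unstable base learner, which is one of the "key problems addressed" that part (a) asks about.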

4. Data Clustering

(a) Give visual construction of agglomerative clustering technique. [4]
(b) Provide pseudo code of K-means clustering algorithm. [4]
(c) List at least 5 limitations of K-means clustering algorithm. [5]
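The pseudo code asked for in (b) can be made concrete as below (a sketch of Lloyd's algorithm; the two-blob test data is my own illustration):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """K-means (Lloyd's algorithm): 1. pick k initial centroids,
    2. assign each point to its nearest centroid, 3. recompute each
    centroid as the mean of its cluster, 4. repeat until converged."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]   # step 1
    for _ in range(n_iters):
        # step 2: nearest-centroid assignment (squared Euclidean distance)
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # step 3: move each centroid to the mean of its assigned points
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):                        # converged
            break
        centroids = new
    return centroids, labels

# two well-separated blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
centroids, labels = kmeans(X, k=2)
print(labels)
```

The `else centroids[j]` branch handles the empty-cluster case, one of the limitations part (c) asks you to list (others include sensitivity to initialisation and the need to fix k in advance).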
5. Time-series Models

(a) Provide a sample state transition probability matrix and draw the associated state transition diagram (assume at least 3 states in the Markov process). [3+3]
(b) Draw graphical visualization of RNN unfolding with time with appropriate notations for input, output and parameters. [4]
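A sample matrix of the kind (a) asks for, with made-up state names and probabilities: A is row-stochastic, with A[i, j] = P(next state = j | current state = i), and the transition diagram would draw one arrow per nonzero entry.

```python
import numpy as np

states = ["Rain", "Cloudy", "Sunny"]          # 3 illustrative states
A = np.array([[0.5, 0.3, 0.2],                # each row sums to 1
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])
assert np.allclose(A.sum(axis=1), 1.0)        # valid transition probabilities

# simulate a short state sequence from the chain
rng = np.random.default_rng(0)
s = 0                                         # start in "Rain"
seq = [states[s]]
for _ in range(5):
    s = rng.choice(3, p=A[s])                 # next state ~ row A[s]
    seq.append(states[s])
print(" -> ".join(seq))
```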
6. Dimensionality Reduction & KNN Classifier

(a) Provide key steps of computing PCA based dimensionality reduction given the input d-dimensional data matrix X consisting of n samples. [6]
(b) Provide formula of three similarity measures that can be used in KNN classifier. [3]
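The key steps in (a) can be sketched directly in NumPy (the synthetic data at the bottom is my own, for illustration only):

```python
import numpy as np

def pca(X, m):
    """PCA on an n x d matrix X: 1. centre the data, 2. form the d x d
    covariance matrix, 3. eigendecompose it, 4. sort eigenvectors by
    decreasing eigenvalue, 5. project onto the top-m eigenvectors."""
    Xc = X - X.mean(axis=0)                 # step 1: centre each column
    C = (Xc.T @ Xc) / (len(X) - 1)          # step 2: covariance (d x d)
    eigvals, eigvecs = np.linalg.eigh(C)    # step 3: symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]       # step 4: descending variance
    W = eigvecs[:, order[:m]]               # top-m principal directions
    return Xc @ W                           # step 5: m-dimensional embedding

# nearly 1-dimensional data embedded in 3 dimensions
rng = np.random.default_rng(0)
t = rng.normal(size=(100, 1))
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(100, 3))
Z = pca(X, m=1)
print(Z.shape)   # (100, 1): reduced to the dominant direction
```

For (b), common choices are Euclidean distance, Manhattan distance, and cosine similarity between feature vectors.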
7. SVM & Kernel Trick

(a) Give graphical visualization of an example of how kernelization can help in solving non-linear classification problems. [6]
(b) Provide dual formulation of SVM. [5]
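For reference when answering (b), the standard dual of the soft-margin SVM, with kernel $k$ and regularisation parameter $C$, can be written as:

```latex
\max_{\alpha} \;\; \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
    \alpha_i \alpha_j \, y_i y_j \, k(\mathbf{x}_i, \mathbf{x}_j)
\quad \text{s.t.} \quad
0 \le \alpha_i \le C \;\; \forall i, \qquad \sum_{i=1}^{n} \alpha_i y_i = 0
```

The data enter only through $k(\mathbf{x}_i, \mathbf{x}_j)$, which is exactly what makes the kernel trick of (a) possible: replacing inner products with a kernel implicitly maps the data to a space where a linear separator suffices.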

Good Luck!
