Untitled Notebook
ML: how to automatically learn from, and make predictions about, data.

ML sits at the intersection of computer science, statistics, and neuroscience.
Two flavors of learning:
- supervised learning: learning from labeled examples (data)
  → regression, classification
- unsupervised learning: finding patterns / structure in data
  → clustering
Also: parallel computing for large-scale applications (ML / neural networks).
ML vs classical statistics:
- statistical inference: estimate the value of an unknown quantity
- classically low-dimensional, i.e., few parameters to estimate from the data
- (video: predicting a trajectory)
Ingredients of machine learning:
-
1) observables:
- measurable quantities = "features", x_j ∈ R, j = 1, ..., p
- p = # of features (assumed known a priori)
- inputs X ∈ R^{n×p}: the n observations x^(1), ..., x^(n) (each x^(i) ∈ R^p) stacked as rows
- n = # of data points (samples)
- example: learning laws of motion
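A minimal sketch of this notation, assuming NumPy (all values are made up for illustration):

```python
import numpy as np

# n = 4 data points, p = 2 features each (illustrative numbers)
x1 = np.array([1.0, 2.0])   # one sample x^(i) in R^p
x2 = np.array([3.0, 4.0])
x3 = np.array([5.0, 6.0])
x4 = np.array([7.0, 8.0])

# stack the samples as rows: X in R^{n x p}
X = np.stack([x1, x2, x3, x4])
n, p = X.shape   # n = 4 data points, p = 2 features
```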
2) parametrized model:
- output alphabet Y:
  - y ∈ R: regression (e.g., weight(age, height))
  - y discrete, e.g., y ∈ {0, 1, ...}: classification
- f_θ: x ↦ f_θ(x), with θ ∈ R^m the parameters of the model
- {f_θ : θ ∈ R^m}: hypothesis class
- f_θ: function to predict the output label from the input sample

The learning problem: finding the "best" f_θ:
- loss ℓ(y, f_θ(x)) ∈ R_{≥0}
- θ̂ = argmin_θ Σ_i ℓ(y^(i), f_θ(x^(i)))
- e.g., squared error: ℓ(y_i, f_θ(x_i)) = (y_i - f_θ(x_i))²
  → assumes the noise is Gaussian
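A sketch of this learning problem for a linear model f_θ(x) = x·θ with squared loss, solved by least squares; the data and `theta_true` are illustrative, not from the notes:

```python
import numpy as np

# synthetic data: y = x . theta_true + Gaussian noise (all numbers made up)
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))                    # inputs X in R^{n x p}
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + 0.1 * rng.normal(size=n)  # labels y^(i)

# theta_hat = argmin_theta sum_i (y^(i) - f_theta(x^(i)))^2, f_theta(x) = x . theta
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With little noise, `theta_hat` lands close to `theta_true`.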
Slides: dogs vs cats (on Keynote) → discuss all the ingredients; "perceptron learning".
Q: how to design a good predictive model?
0) split the data: D = D_train ∪ D_test
1) learn θ̂ on the "training" set:
   θ̂ = argmin_θ Σ_{i=1}^{N_train} ℓ(y^(i), f_θ(x^(i))),  with D_train = {(y^(i), x^(i))}_{i=1}^{N_train}
"E_out": out-of-sample error, a.k.a. generalization error:
   E_out = (1/|D_test|) Σ_{j=1}^{N_test} ℓ(y^(j), f_θ̂(x^(j))),  with D_test = {(y^(j), x^(j))}_{j=1}^{N_test}
   E_out ≥ E_in
"E_in": in-sample error, a.k.a. training error
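A sketch of computing E_in and E_out, continuing the linear-model / squared-loss setting (names, sizes, and noise level are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 5
theta_true = rng.normal(size=p)

def make_data(n):
    """Illustrative linear data with Gaussian noise."""
    X = rng.normal(size=(n, p))
    y = X @ theta_true + 0.5 * rng.normal(size=n)
    return X, y

X_train, y_train = make_data(40)   # D_train
X_test, y_test = make_data(10)     # D_test

# fit theta_hat on the training set only
theta_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# E_in: mean loss over D_train; E_out: mean loss over D_test
E_in = np.mean((y_train - X_train @ theta_hat) ** 2)
E_out = np.mean((y_test - X_test @ theta_hat) ** 2)
```

Because θ̂ is optimized on D_train, E_out typically comes out larger than E_in.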
→ If there are multiple candidate models, the "best" model is the one that minimizes the out-of-sample error → "cross-validation".
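A minimal k-fold cross-validation sketch for choosing between candidate models; here the candidates are polynomial degrees, and the data, degrees, and k are all illustrative assumptions:

```python
import numpy as np

# illustrative 1-d data: quadratic truth plus noise
rng = np.random.default_rng(2)
n = 60
x = rng.uniform(-1.0, 1.0, size=n)
y = 2.0 * x - x ** 2 + 0.1 * rng.normal(size=n)

def cv_error(degree, k=5):
    """Average held-out MSE of a degree-`degree` polynomial fit over k folds."""
    folds = np.array_split(rng.permutation(n), k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(np.arange(n), fold)
        coeffs = np.polyfit(x[train], y[train], degree)  # fit on the other folds
        pred = np.polyval(coeffs, x[fold])               # predict on the held-out fold
        errs.append(np.mean((y[fold] - pred) ** 2))
    return float(np.mean(errs))

# the "best" candidate minimizes the estimated out-of-sample error
errors = {d: cv_error(d) for d in (1, 2, 5, 9)}
best_degree = min(errors, key=errors.get)
```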
Sum up
ML: "how to learn from and make predictions about data"
1) data: D = {(x^(i), y^(i))}_{i=1}^N
   - x ∈ R^p: input; y ∈ R: label
   - each (x^(i) ∈ R^p, y^(i)) is a (data point, label) pair
   - p = # of features
   - y^(i) ∈ R: regression;  y^(i) ∈ N: classification
2) parametrized model:
   - f_θ(x); {f_θ : θ ∈ R^m}: hypothesis class → "model"
   - f_θ̂: function used to predict new, unseen, "fresh" data points
     → allows to "generalize"
3) cost function = energy functional (notion of "optimality" / "goodness" of f):
   - loss ℓ: (y ∈ R, f_θ(x) ∈ R) ↦ ℓ(y, f_θ(x)) ∈ R_{≥0}
   - E(θ) = (1/N) Σ_{i=1}^N ℓ(y^(i), f_θ(x^(i)))
   - e.g., squared loss: ℓ(y, f_θ(x)) = (y - f_θ(x))²
     → MSE(y, f_θ) = (1/N) Σ_{i=1}^N (f_θ(x^(i)) - y^(i))²  (mean squared error)
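The MSE formula written out directly, assuming NumPy (the labels and predictions are made-up numbers):

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0])   # labels y^(i)
f = np.array([1.5, 1.5, 3.5])   # predictions f_theta(x^(i))

# MSE(y, f_theta) = (1/N) sum_i (f_theta(x^(i)) - y^(i))^2
mse = np.mean((f - y) ** 2)
print(mse)  # 0.25
```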
4) learning problem (opt. pb):
   θ̂_ERM = argmin_θ E(θ | D_train),  e.g., argmin_θ Σ_{i ∈ D_train} (y^(i) - f_θ(x^(i)))²
   ("|": conditioning, "given" D_train)
   - D = D_train ∪ D_test, with D_train ∩ D_test = ∅
   - D_train = {(x^(i) ∈ R^p, y^(i) ∈ R)}_{i=1}^{N_train}
   - N = N_train + N_test: the split is your choice
   - N_train > N_test, typically N_train ~ 0.9 N
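A sketch of this split with N_train ≈ 0.9·N, assuming NumPy (dataset is random filler):

```python
import numpy as np

rng = np.random.default_rng(3)
N, p = 100, 4
X = rng.normal(size=(N, p))   # made-up dataset D
y = rng.normal(size=N)

perm = rng.permutation(N)          # shuffle so the split is random
N_train = int(0.9 * N)             # N_train ~ 0.9 N (your choice)
train_idx, test_idx = perm[:N_train], perm[N_train:]

X_train, y_train = X[train_idx], y[train_idx]   # D_train
X_test, y_test = X[test_idx], y[test_idx]       # D_test, disjoint from D_train
```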
5) evaluate your model!
   - E_out: "out-of-sample" error ≡ generalization error:
     E_out = E(θ̂ | D_test)
   - E_in: "in-sample" error ≡ "training" error:
     E_in = E(θ̂ | D_train)
   - learning: θ̂ = argmin_θ E_in(θ)
   - often E_out > E_in