Untitled Notebook

Notes

Uploaded by

drdsmith08

Introduction

ML: how to automatically learn from, and make predictions about, data.

[Diagram: ML sits at the intersection of computer science, statistics, and neuroscience.]

Supervised learning: learning from labeled examples.
- regression
- classification

Unsupervised learning: finding patterns / structure in the data.
- clustering
- dimensionality reduction (PCA)

Why ML is so popular since the 2010s

- era of "big data": DL needs large datasets
- exponential growth in memory and computational capabilities
  (GPUs: graphics processing units, used as parallel computing for
  large-scale applications)
- ML / neural networks: the theory started back in the 1960s-'70s
ML vs classical statistics

Statistical inference: how to use data to estimate the value of an
unknown quantity; classically low-dimensional, i.e., few parameters
to estimate from the data.

ML: how to use data to predict on new, unseen data.
Ex: video recordings of a falling apple; 1st: get the data,
2nd: predict the trajectory.
Ingredients of machine learning

1) Observables: measurable quantities x ∈ R^p, the "features";
   p = # of features, assumed a priori; x_j, j = 1, ..., p,
   chosen as important to describe the system of interest.

   Dataset: D = (X, y) = {(x^(i) ∈ R^p, y^(i) ∈ R)}_{i=1}^n

   - labels: y ∈ R^n
   - inputs: X ∈ R^{n×p}, X = [x^(1), ..., x^(n)]^T
   - n = # of data points (samples)

   Ex: learning laws of motion.
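A minimal sketch of these objects in code (numpy; the feature and label values are invented purely for illustration):

```python
import numpy as np

# toy dataset: n = 4 samples, p = 2 features (values invented for illustration)
n, p = 4, 2
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.5],
              [4.0, 3.0]])          # inputs X in R^{n x p}: one row per data point
y = np.array([1.2, 0.7, 1.9, 3.1])  # labels y in R^n

print(X.shape, y.shape)  # → (4, 2) (4,)
```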
2) Parametrized model:

   f: x ∈ R^p, θ ∈ R^m ↦ f_θ(x) = ŷ ∈ Y

   Output alphabet Y:
   - ŷ ∈ R (real): regression, e.g. weight(age, height)
   - ŷ ∈ {0, 1, ...} (discrete): classification

   θ: parameters of the model; {f_θ : θ ∈ R^m} is the hypothesis
   class, i.e. the functions used to predict the output label from
   an input sample.

   The learning problem: finding the "best" parameters θ̂ given
   the data D.

3) Cost function ("energy"):

   ℓ(y, f_θ(x)) ∈ R≥0

   → used to rate the performance / quality of the model on the
   dataset D. Learning = optimizing based on the cost:

   θ̂ = argmin_{θ ∈ R^m} Σ_i ℓ(y^(i), f_θ(x^(i)))

   Ex: ℓ(y_i, f_θ(x_i)) = (y_i − f_θ(x_i))², the squared error
   → the noise is assumed Gaussian.
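With the squared-error cost and a linear model, the argmin has a closed form; a sketch on synthetic data (the true parameters and noise level are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 2
X = rng.normal(size=(n, p))
theta_true = np.array([2.0, -1.0])
y = X @ theta_true + 0.1 * rng.normal(size=n)  # Gaussian noise, as assumed

# theta_hat = argmin_theta sum_i (y_i - f_theta(x_i))^2, solved via lstsq
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
cost = np.sum((y - X @ theta_hat) ** 2)
print(theta_hat, cost)
```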

(Slides on Keynote: Dogs vs Cats → discuss all the ingredients;
we'll understand perceptron learning.)
How to design a good predictive model?

Split the data: D = D_train ∪ D_test, with D_train ∩ D_test = ∅
(~90% / ~10%).

Do the splitting before anything else and never change it afterwards,
even if the results are poor; otherwise it creates bias.
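The 90/10 split can be sketched with a random permutation (numpy; the data here is just a placeholder):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))   # placeholder data
y = rng.normal(size=n)

# random (unbiased) 90/10 split, done once, before anything else
perm = rng.permutation(n)
n_train = int(0.9 * n)
X_train, y_train = X[perm[:n_train]], y[perm[:n_train]]
X_test, y_test = X[perm[n_train:]], y[perm[n_train:]]
print(X_train.shape, X_test.shape)  # → (90, 3) (10, 3)
```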
1) Learn on the training set:

   θ̂(D_train) = argmin_θ Σ_{i ∈ D_train} ℓ(y^(i), f_θ(x^(i))),
   with D_train = {(x^(i), y^(i))}_{i=1}^{N_train};

   f_θ̂ is the learnt model.

2) Evaluate on the test data D_test:

   E_out: "out-of-sample error", "generalisation error":

   E_out = (1/|D_test|) Σ_{j ∈ D_test} ℓ(y^(j), f_θ̂(x^(j)))

   E_out ≥ E_in, where E_in is the "in-sample error"
   ("training error").

   If there are multiple candidate models f_θ, f_θ', ...: the "best"
   model is the one that minimizes the out-of-sample error
   → "cross-validation".
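A sketch of the two errors for a least-squares linear model on synthetic data (all values invented; on any single split E_out ≥ E_in typically holds but is not guaranteed):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 2
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0]) + 0.5 * rng.normal(size=n)

n_train = int(0.9 * n)  # 90/10 split (this data is already in random order)
X_tr, y_tr, X_te, y_te = X[:n_train], y[:n_train], X[n_train:], y[n_train:]

theta_hat, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)  # learn on D_train only
E_in = np.mean((y_tr - X_tr @ theta_hat) ** 2)   # in-sample / training error
E_out = np.mean((y_te - X_te @ theta_hat) ** 2)  # out-of-sample error
print(E_in, E_out)
```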

Sum-up of yesterday's lecture

ML: "how to learn from, and make predictions about, data".

Supervised ML: learn from examples, i.e. labeled data.

ML ingredients:

1) Data: D = (X, y), with inputs X ∈ R^{n×p} and labels y ∈ R^n;
   D = {(x^(i), y^(i))}: each sample is a (data point, label) pair
   with x^(i) ∈ R^p, y^(i) ∈ R; p = # of features.
   y^(i) ∈ R: regression; y^(i) ∈ N: classification.
2) Parametrized model:

   f: x ∈ R^p, θ ∈ R^m ↦ f_θ(x) ∈ R

   {f_θ : θ ∈ R^m}: hypothesis class → the model.

   f_θ̂: the function used to predict (for new, unseen, "fresh"
   data points) → allows to "generalise".
3) Cost function ("energy functional"): gives a notion of
   "optimality" ("goodness" of f).

   ℓ: y ∈ R, f_θ(x) ∈ R ↦ ℓ(y, f_θ(x)) ∈ R≥0

   Mean square error (y ∈ R^n, squared loss):

   MSE(y, f_θ) = (1/n) Σ_{i=1}^n ℓ(y^(i), f_θ(x^(i)))
               = (1/n) Σ_{i=1}^n (y^(i) − f_θ(x^(i)))²
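The MSE written out as code (numpy):

```python
import numpy as np

def mse(y, y_pred):
    # MSE(y, f_theta) = (1/n) * sum_i (y_i - f_theta(x_i))^2
    y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
    return np.mean((y - y_pred) ** 2)

print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]))  # → (0 + 0 + 1) / 3 ≈ 0.333
```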
Learning problem (an optimization problem):

   θ̂(D_train) = argmin_{θ ∈ R^m} Σ_{(x^(i), y^(i)) ∈ D_train} (y^(i) − f_θ(x^(i)))²

   The learnt model is f_θ̂ with θ̂ = θ̂(D_train); the notation
   θ̂(· | D_train) reads as conditioning, "given" the training data.

Before doing anything else:

   D = D_train ∪ D_test, with D_train ∩ D_test = ∅
   D_train = {(x^(i) ∈ R^p, y^(i) ∈ R)}_{i=1}^{N_train}
   N = N_train + N_test: your choice;
   N_train ≥ N_test, typically N_train ~ 0.9 N.

   - The splitting should never be changed a posteriori.
   - The splitting into D = D_train ∪ D_test must be random
     (unbiased).
   - The learning is done on D_train only.

Evaluate your model!

   E_out: "out-of-sample error", "generalisation error":

   E_out = (1/N_test) Σ_{i ∈ D_test} ℓ(y^(i), f_θ̂(x^(i)))

   E_in: "in-sample error", "training error":

   E_in = (1/N_train) Σ_{i ∈ D_train} ℓ(y^(i), f_θ̂(x^(i)))

   The learning step is θ̂ = argmin_θ E_in(θ); often E_out ≥ E_in.
