
Deep Learning Theory and Applications

Lecture 2: Machine Learning Basics


and Linear Regression

Author: Ta Viet Cuong, Ph.D.


HMI laboratory, University of Engineering and Technology
Slides adapted from: https://github.com/udlbook/udlbook/
Spring, 2025
Outline

1. Machine Learning Foundations

2. Linear regression

3. Examples

1. Machine Learning

Deep Learning vs ML

Why is Deep Learning popular?
- Re-defined the AI approach: DATA + MODEL + GPU(s) = SOLVED
- Has a very flexible framework: supervised, unsupervised, reinforcement, semi-supervised, …
- Transferable from task to task within related domains

Supervised Learning: Regression or Classification

Regression: Predict House Price
Supervised learning overview
• Supervised learning model = mapping from one or more inputs to one or more outputs
• Model is a mathematical equation (more precisely, a family of equations)
• Computing the outputs from the inputs = inference
• Example:
  • Input is a student’s CV
  • Output is estimated salary
• Model also includes parameters
  • Parameters affect the outcome of the equation
• Training a model = finding parameters that predict outputs “well” from inputs for a training dataset of input/output pairs
Notation:
• Input: x
  • Variables are always Roman letters
  • Normal = scalar, Bold = vector, Capital Bold = matrix
• Output: y
• Model: y = f[x]
  • Functions always use square brackets
  • Normal = returns scalar, Bold = returns vector, Capital Bold = returns matrix
Notation example:
• Input (structured or tabular data):
  x = [age, area, mileage, year]ᵀ
• Output:
  y = [price]
• Model: y = f[x]
Model
• Parameters: φ
  • Parameters are always Greek letters
• Model: y = f[x, φ]
Loss function
• Training dataset of I pairs of input/output examples: {xᵢ, yᵢ}, i = 1, …, I
• Loss function or cost function measures how bad the model is:
  L[φ, f[x, φ], {xᵢ, yᵢ}]
  (a function of the model and the training data)
  or for short: L[φ]
  • Returns a scalar that is smaller when the model maps inputs to outputs better
Training
• Loss function: L[φ]
  • Returns a scalar that is smaller when the model maps inputs to outputs better
• Find the parameters that minimize the loss:
  φ̂ = argmin_φ L[φ]
Testing
• To test the model, run on a separate test dataset of input / output
pairs

• See how well it generalizes to new data


Outline

1. Machine Learning Foundations

2. Linear regression

3. Examples

Example: 1D Linear regression model
• Model:
  y = f[x, φ]
    = φ₀ + φ₁x
• Parameters:
  φ = [φ₀, φ₁]ᵀ
  • φ₀ = y-offset
  • φ₁ = slope
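A minimal sketch of this model as code (NumPy; the names f and phi simply mirror the slide notation and are not from the lecture materials):

```python
import numpy as np

def f(x, phi):
    """1D linear model: y = phi[0] + phi[1] * x (phi[0] = y-offset, phi[1] = slope)."""
    return phi[0] + phi[1] * x

# Evaluate the model at a few inputs with hand-picked parameters.
phi = np.array([0.5, 1.0])
print(f(np.array([0.0, 1.0, 2.0]), phi))   # -> [0.5 1.5 2.5]
```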
Example: 1D Linear regression training data
x = [0.03, 0.19, 0.34, 0.46, 0.78, 0.81, 1.08, 1.18, 1.39, 1.60, 1.65, 1.90]
y = [0.67, 0.85, 1.05, 1.00, 1.40, 1.50, 1.30, 1.54, 1.55, 1.68, 1.73, 1.60]
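To visualize this training set, a small plotting sketch (assuming Matplotlib, which is listed as a requirement later in these slides; the styling choices are arbitrary):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([0.03, 0.19, 0.34, 0.46, 0.78, 0.81, 1.08, 1.18, 1.39, 1.60, 1.65, 1.90])
y = np.array([0.67, 0.85, 1.05, 1.00, 1.40, 1.50, 1.30, 1.54, 1.55, 1.68, 1.73, 1.60])

plt.scatter(x, y)
plt.xlabel("input x")
plt.ylabel("output y")
plt.title("1D linear regression training data")
plt.show()
```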
Example: 1D Linear regression loss function

Loss function:
  L[φ] = Σᵢ (f[xᵢ, φ] − yᵢ)²   (sum over i = 1, …, I)
       = Σᵢ (φ₀ + φ₁xᵢ − yᵢ)²

“Least squares loss function”
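A short sketch of the least squares loss applied to the training data above (the function name least_squares_loss is just illustrative):

```python
import numpy as np

x = np.array([0.03, 0.19, 0.34, 0.46, 0.78, 0.81, 1.08, 1.18, 1.39, 1.60, 1.65, 1.90])
y = np.array([0.67, 0.85, 1.05, 1.00, 1.40, 1.50, 1.30, 1.54, 1.55, 1.68, 1.73, 1.60])

def least_squares_loss(phi, x, y):
    """L[phi] = sum_i (phi0 + phi1 * x_i - y_i)^2."""
    pred = phi[0] + phi[1] * x
    return np.sum((pred - y) ** 2)

# Two candidate parameter settings: the second describes the data better, so its loss is smaller.
print(least_squares_loss(np.array([0.0, 1.0]), x, y))   # larger loss
print(least_squares_loss(np.array([0.7, 0.6]), x, y))   # smaller loss
```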
Example: 1D Linear regression training

This technique is known as gradient descent
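A minimal sketch of gradient descent for this model (the initialization, learning rate, and iteration count below are hand-picked assumptions, not values from the lecture):

```python
import numpy as np

x = np.array([0.03, 0.19, 0.34, 0.46, 0.78, 0.81, 1.08, 1.18, 1.39, 1.60, 1.65, 1.90])
y = np.array([0.67, 0.85, 1.05, 1.00, 1.40, 1.50, 1.30, 1.54, 1.55, 1.68, 1.73, 1.60])

phi = np.array([0.0, 0.0])   # initial [y-offset, slope]
alpha = 0.01                 # learning rate (assumed)

for step in range(2000):
    residual = phi[0] + phi[1] * x - y
    # Gradient of L[phi] = sum_i (phi0 + phi1 * x_i - y_i)^2 with respect to phi0 and phi1.
    grad = np.array([2.0 * residual.sum(), 2.0 * (residual * x).sum()])
    phi = phi - alpha * grad

print("phi0 (y-offset), phi1 (slope):", phi)
```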


Possible objections
• But you can fit the line model in closed form! (see the sketch after this list)
• Yes – but we won’t be able to do this for more complex models
• But we could exhaustively try every slope and intercept combo!
• Yes – but we won’t be able to do this when there are a million parameters
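For reference, the closed-form fit mentioned in the first objection can be computed from the normal equations (a sketch, not part of the original slides):

```python
import numpy as np

x = np.array([0.03, 0.19, 0.34, 0.46, 0.78, 0.81, 1.08, 1.18, 1.39, 1.60, 1.65, 1.90])
y = np.array([0.67, 0.85, 1.05, 1.00, 1.40, 1.50, 1.30, 1.54, 1.55, 1.68, 1.73, 1.60])

# Design matrix with a constant column for the y-offset phi0.
X = np.stack([np.ones_like(x), x], axis=1)

# Normal equations: (X^T X) phi = X^T y, solved without forming an explicit inverse.
phi = np.linalg.solve(X.T @ X, X.T @ y)
print("closed-form phi0, phi1:", phi)
```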
Example: 1D Linear regression testing
• Test with a different set of paired input/output data
• Measure performance
• The degree to which test performance matches training performance = generalization
• Might not generalize well because:
  • Model too simple
  • Model too complex: fits to statistical peculiarities of the data; this is known as overfitting
Where are we going?
• Shallow neural networks (a more flexible model)
• Deep neural networks (an even more flexible model)
• Loss functions (where did least squares come from?)
• How to train neural networks (gradient descent and variants)
• How to measure performance of neural networks (generalization)
Outline

1. Machine Learning Foundations

2. Linear regression

3. Examples

Simple Regression
https://github.com/udlbook/udlbook/blob/main/Notebooks/Chap02/2_1_Supervised_Learning.ipynb

Simple Regression
Required background:
- NumPy: https://cs231n.github.io/python-numpy-tutorial/
- Matplotlib
- Pandas

Platform:
- Local
- Google Colab
- Kaggle
- Others
Homework

Source: Chapter 2 - https://github.com/udlbook/udlbook/


Code Practice

https://www.kaggle.com/competitions/house-prices-advanced-regression-techniques
Other Implementations*
Try to understand and reimplement the sklearn API for linear regression:
https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html
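As a point of comparison before reimplementing it, a minimal usage sketch of this scikit-learn estimator on the 1D data from the earlier slides:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

x = np.array([0.03, 0.19, 0.34, 0.46, 0.78, 0.81, 1.08, 1.18, 1.39, 1.60, 1.65, 1.90])
y = np.array([0.67, 0.85, 1.05, 1.00, 1.40, 1.50, 1.30, 1.54, 1.55, 1.68, 1.73, 1.60])

model = LinearRegression()
model.fit(x.reshape(-1, 1), y)   # sklearn expects a 2D (n_samples, n_features) input
print("slope:", model.coef_[0], "y-offset:", model.intercept_)
print("prediction at x = 1.0:", model.predict(np.array([[1.0]]))[0])
```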

From a numerical computation view:

https://cims.nyu.edu/~donev/Teaching/NMI-Fall2014/Lecture-LU.handout.pdf

Example notebook (from the ML course):
https://colab.research.google.com/drive/1-WvL4qg_Q-gPAGJ3sfJ6PoT5QuIiPIZp?usp=sharing

Thank you for listening!
