
StatQuest!!! (https://statquest.org/)
An epic journey through statistics and machine learning
SIGML – Errata

These are the corrections for The StatQuest Illustrated Guide to Machine Learning (Landscape Edition) purchased before November 2022:

Page 39, Step 6. The last equation should be 0.7*0.7*0.3 instead of 0.7*0.7*0.5.
Page 119, Step 2. It should say that the people who ate an intermediate amount do love Troll 2, not that they do not.
Page 143, Step 2. The equation for Specificity should have 112 in it instead of 115.
Page 144, Step 1. Both equations are for Precision.
Page 135. The numerators for the two equations should be swapped.
Page 259, Step 13. It should say -0.5 instead of 0.5.
Page 265, Step 2. The right-hand side of the first equation should not be squared.
Page 293, Step 6. The -1 should be placed inside parentheses.
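The corrections on pages 143 and 144 both come down to which cells of the confusion matrix appear in each formula, so here is a minimal sketch of the three metrics involved. The confusion-matrix counts below are hypothetical, chosen only so that Specificity's denominator works out to the 112 + 3 = 115 pattern the erratum hints at; they are not the book's values.

```python
# Hypothetical confusion-matrix counts (not the book's values).
tp, fp, tn, fn = 9, 3, 112, 5

# Sensitivity (recall): of the actual positives, how many were caught?
sensitivity = tp / (tp + fn)

# Specificity: of the actual negatives, how many were correctly rejected?
# Note that the true-negative count (here 112) sits in the numerator.
specificity = tn / (tn + fp)

# Precision: of the predicted positives, how many were correct?
precision = tp / (tp + fp)

print(round(sensitivity, 3), round(specificity, 3), round(precision, 3))
```

The key distinction behind the page 144 erratum: Precision and Specificity share no cells in the numerator, so two equations that both compute tp / (tp + fp) cannot be one of each.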
These are the corrections for The StatQuest Illustrated Guide to Machine Learning (Portrait Edition) purchased after November 2022:

Page 67. For consistency's sake, the graphs in the second box should be labeled SSR(straight line) and SSR(parabola). However, it should be noted that in the wild world of statistics, SSR and RSS are often used interchangeably.
Page 81. Instead of being called "Logistic Regression: Weaknesses", the page should be called "Beyond Linear Regression".
Page 102, Step 3. The second line of the derivative of the SSR with respect to the slope should be -2*2.3*(1.9 – (0 + 0.5*2.3)) instead of -2*1.9*(2.3 – (0 + 0.5*1.9)).
Page 135. The numerators for the two equations should be swapped.
Page 265, Step 2. The right-hand side of the first equation should not be squared.
Page 293, Step 6. The -1 should be placed inside parentheses.
