Week 12
• This sequential process is slow and can lose finer details over long distances.
Self-attention mechanism
• Transformer models modify this process by incorporating a self-attention mechanism (a minimal sketch follows below).
• https://magazine.sebastianraschka.com/p/understanding-and-coding-self-attention
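The sketch below shows what a single self-attention step computes: every token attends to every other token in one matrix operation, which is why there is no sequential bottleneck and long-range pairs are reached directly. This is a minimal NumPy illustration of scaled dot-product self-attention; the sequence length, embedding size, and the projection matrices W_q, W_k, W_v are illustrative assumptions, not values from the slides (the linked article works through a fuller PyTorch version).

```python
# Minimal sketch of scaled dot-product self-attention (illustrative values,
# not from the slides). Projections are randomly initialized here; in a real
# model they are learned.
import numpy as np

rng = np.random.default_rng(0)

# Toy input: a sequence of 4 tokens, each an 8-dimensional embedding.
X = rng.normal(size=(4, 8))

d_k = 8  # query/key dimension in this sketch

# Hypothetical projection matrices for queries, keys, and values.
W_q = rng.normal(size=(8, d_k))
W_k = rng.normal(size=(8, d_k))
W_v = rng.normal(size=(8, d_k))

Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Pairwise similarity between all positions at once: no step-by-step pass
# over the sequence, so distant tokens interact directly.
scores = Q @ K.T / np.sqrt(d_k)                     # shape (4, 4)

# Row-wise softmax turns scores into attention weights that sum to 1.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Each output token is a weighted mix of all value vectors.
output = weights @ V                                # shape (4, d_k)

print(weights.round(2))  # each row sums to 1
print(output.shape)      # (4, 8)
```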
Summary
• Generative AI