Stanford University CS224d - Deep Learning For Natural Language Processing - Syllabus

This document summarizes the schedule and syllabus for CS224d: Deep Learning for Natural Language Processing at Stanford University. It lists the date and topic of each lecture, from introductory material on NLP and deep learning through more advanced topics such as word embeddings, neural networks, recurrent neural networks, recursive neural networks, and convolutional neural networks. It also gives the release and due dates for the three problem sets, the in-class midterm, and the course project proposal, milestone, poster presentation, and final report. Suggested readings are listed for each lecture.



CS224d: Deep Learning for Natural Language Processing



Schedule and Syllabus


Unless otherwise specified, the course lectures and meeting times are:

Tuesday, Thursday 3:00-4:20


Location: Gates B1

Lecture (Mar 29): Intro to NLP and Deep Learning
Suggested Readings:
1. [Linear Algebra Review (http://cs229.stanford.edu/section/cs229-linalg.pdf)]
2. [Probability Review (http://cs229.stanford.edu/section/cs229-prob.pdf)]
3. [Convex Optimization Review (http://cs229.stanford.edu/section/cs229-cvxopt.pdf)]
4. [More Optimization (SGD) Review (http://cs231n.github.io/optimization-1/)]
5. [From Frequency to Meaning: Vector Space Models of Semantics (http://www.jair.org/media/2934/live-2934-4846-jair.pdf)]
[Lecture Notes 1 (lecture_notes/notes1.pdf)]
[python tutorial (http://cs231n.github.io/python-numpy-tutorial/)] [slides (lectures/CS224d-Lecture1.pdf)]

Lecture (Mar 31): Simple Word Vector representations: word2vec, GloVe
Suggested Readings:
1. [Distributed Representations of Words and Phrases and their Compositionality (http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf)]
2. [Efficient Estimation of Word Representations in Vector Space (http://arxiv.org/pdf/1301
[slides (lectures/CS224d-Lecture2.pdf)]
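
A minimal NumPy sketch of the skip-gram negative-sampling update behind word2vec, one of the models this lecture introduces. It is illustrative only, not course code: the vocabulary size, dimensions, and the sgns_step helper are assumptions made for the example.

# Hypothetical sketch (not course code): one SGD step of skip-gram with
# negative sampling, the word2vec objective discussed in this lecture.
import numpy as np

rng = np.random.default_rng(0)
V, d = 1000, 50                             # vocabulary size, embedding dimension
W_in = 0.01 * rng.standard_normal((V, d))   # "input" (center) vectors
W_out = 0.01 * rng.standard_normal((V, d))  # "output" (context) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives, lr=0.05):
    """Update vectors for one (center, context) pair plus sampled negatives."""
    v = W_in[center]
    # positive pair: push sigmoid(u_o . v_c) toward 1
    u_pos = W_out[context]
    g = sigmoid(u_pos @ v) - 1.0            # gradient of -log sigmoid(u . v)
    grad_v = g * u_pos
    W_out[context] -= lr * g * v
    # negative samples: push sigmoid(u_k . v_c) toward 0
    for k in negatives:
        u_neg = W_out[k]
        g = sigmoid(u_neg @ v)              # gradient of -log sigmoid(-u . v)
        grad_v += g * u_neg
        W_out[k] -= lr * g * v
    W_in[center] -= lr * grad_v

sgns_step(center=3, context=17, negatives=rng.integers(0, V, size=5))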

A1 released (Apr 4): Pset #1 released
[Pset 1 (assignment1/index.html)] [Pset 1 Solutions (assignment1/assignment1_soln)] [Pset 1 Code (assignment1/assignment1_sol.zip)]

Lecture (Apr 5): Advanced word vector representations: language models, softmax, single layer networks
Suggested Readings:
1. [GloVe: Global Vectors for Word Representation (http://nlp.stanford.edu/pubs/glove.pdf)]
2. [Improving Word Representations via Global Context and Multiple Word Prototypes (http://www.aclweb.org/anthology/P12-1092)]
[Lecture Notes 2 (lecture_notes/notes2.pdf)]
[slides (lectures/CS224d-Lecture3.pdf)]

Lecture (Apr 7): Neural Networks and backpropagation -- for named entity recognition
Suggested Readings:
1. [UFLDL tutorial (http://ufldl.stanford.edu/wiki/index.php/Backpropagation_Algorithm)]
2. [Learning Representations by Backpropagating Errors (http://www.iro.umontreal.ca/~vincentp/ift3395/lectures/backprop_old.pdf)]
[Lecture Notes 3 (lecture_notes/notes3.pdf)]
[slides (lectures/CS224d-Lecture4.pdf)]
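
As a small companion to this lecture's topic, here is an illustrative NumPy sketch (not course code) of the forward and backward pass for a one-hidden-layer classifier; the layer sizes and the forward_backward helper are assumptions.

# Hypothetical sketch (not course code): forward and backward pass for a
# one-hidden-layer network with a cross-entropy loss, the kind of
# backpropagation derivation this lecture walks through.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_classes = 20, 10, 5
W1, b1 = 0.1 * rng.standard_normal((n_hidden, n_in)), np.zeros(n_hidden)
W2, b2 = 0.1 * rng.standard_normal((n_classes, n_hidden)), np.zeros(n_classes)

def forward_backward(x, y):
    # forward pass
    z1 = W1 @ x + b1
    h = np.tanh(z1)
    scores = W2 @ h + b2
    p = np.exp(scores - scores.max()); p /= p.sum()       # softmax
    loss = -np.log(p[y])
    # backward pass (chain rule, layer by layer)
    dscores = p.copy(); dscores[y] -= 1.0                 # dL/dscores
    dW2, db2 = np.outer(dscores, h), dscores
    dh = W2.T @ dscores
    dz1 = dh * (1.0 - h ** 2)                             # tanh'(z1)
    dW1, db1 = np.outer(dz1, x), dz1
    return loss, (dW1, db1, dW2, db2)

loss, grads = forward_backward(rng.standard_normal(n_in), y=2)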

Lecture (Apr 12): Project Advice, Neural Networks and Back-Prop (in full gory detail)
Suggested Readings:
1. [Natural Language Processing (almost) from Scratch (http://arxiv.org/pdf/1103.0398v1
2. [A Neural Network for Factoid Question Answering over Paragraphs (https://cs.umd.edu/~miyyer/pubs/2014_qb_rnn.pdf)]
3. [Grounded Compositional Semantics for Finding and Describing Images with Sentences (http://nlp.stanford.edu/~socherr/SocherKarpathyLeManningNg_TACL2013.pdf)]
4. [Deep Visual-Semantic Alignments for Generating Image Descriptions (http://cs.stanford.edu/people/karpathy/deepimagesent/devisagen.pdf)]
5. [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank (http://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)]
[slides (lectures/CS224d-Lecture5.pdf)]


Lecture (Apr 14): Practical tips: gradient checks, overfitting, regularization, activation functions, details
Suggested Readings:
1. [Practical recommendations for gradient-based training of deep architectures (http://arxiv.org/abs/1206.5533)]
2. [UFLDL page on gradient checking (http://ufldl.stanford.edu/wiki/index.php/Gradient_checking_and_advanced_optimization)]
[slides (lectures/CS224d-Lecture6.pdf)]
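
To make the "gradient checks" topic concrete, here is a small illustrative sketch (not course code) of a centered-difference gradient check; the logistic-loss example and helper names are assumptions.

# Hypothetical sketch (not course code): a centered-difference gradient check,
# the debugging technique this lecture recommends before trusting backprop.
import numpy as np

def loss_and_grad(w, x, y):
    """Logistic loss for a single example, with its analytic gradient."""
    p = 1.0 / (1.0 + np.exp(-(w @ x)))
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)), (p - y) * x

def gradient_check(w, x, y, eps=1e-5):
    _, analytic = loss_and_grad(w, x, y)
    numeric = np.zeros_like(w)
    for i in range(w.size):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        numeric[i] = (loss_and_grad(w_plus, x, y)[0]
                      - loss_and_grad(w_minus, x, y)[0]) / (2 * eps)
    # relative error should be tiny (around 1e-7) if the analytic gradient is right
    return np.max(np.abs(analytic - numeric)
                  / np.maximum(1e-8, np.abs(analytic) + np.abs(numeric)))

rng = np.random.default_rng(0)
print(gradient_check(rng.standard_normal(5), rng.standard_normal(5), y=1))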

A1 Due (Apr 19): Pset #1 due

Lecture (Apr 19): Introduction to TensorFlow
Suggested Readings:
1. [TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems (http://download.tensorflow.org/paper/whitepaper2015.pdf)]
[slides (lectures/CS224d-Lecture7.pdf)] [AWS Tutorial (supplementary/aws-tutorial-2.pdf)] [AWS Tutorial Supplementary (lectures/CS224D-Lecture7-2.pdf)] [AWS Tutorial Video (https://youtu.be/zdnM
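
A short illustrative sketch (not course material) of the graph-and-session workflow in TensorFlow 1.x, roughly the API of the era this lecture covers; current TensorFlow releases use a different API, so treat the exact calls here as assumptions about a 1.x install.

# Hypothetical sketch (not course code): linear regression in the
# graph-and-session style of TensorFlow 1.x.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
W = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1]))
pred = tf.matmul(x, W) + b
loss = tf.reduce_mean(tf.square(pred - y))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

data_x = np.random.rand(100, 1).astype(np.float32)
data_y = 3.0 * data_x + 1.0                      # target: W ~ 3, b ~ 1

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op, feed_dict={x: data_x, y: data_y})
    print(sess.run([W, b]))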

A2 released (Apr 20): Pset #2 released
[Pset 2 (assignment2/index.html)] [Pset 2 Solutions (assignment2/assignment2_sol.pdf)] [Pset 2 Code (assignment2/assignment2_dev.zip)]

Lecture (Apr 21): Recurrent neural networks -- for language modeling and other tasks
Suggested Readings:
1. [Recurrent neural network based language model (http://www.fit.vutbr.cz/research/groups/speech/publi/2010/mikolov_interspeech2010
2. [Extensions of recurrent neural network language model (http://www.fit.vutbr.cz/research/groups/speech/publi/2011/mikolov_icassp2011_5528
3. [Opinion Mining with Deep Recurrent Neural Networks (http://www.cs.cornell.edu/~oirs
[slides (lectures/CS224d-Lecture8.pdf)] [minimal net example (karpathy) (http://cs231n.github.io/neural-networks-case-study/)] [vanishing grad example (notebooks/vanishing_grad_example.html)] [vanishing grad notebook (notebooks/vanishing_grad_example.ipynb)]
[Lecture Notes 4 (lecture_notes/notes4.pdf)]
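
For orientation on this lecture's topic, an illustrative NumPy sketch (not course code) of a plain recurrent language model scoring a token sequence; the vocabulary, dimensions, and helper name are assumptions.

# Hypothetical sketch (not course code): one forward pass of a plain (Elman)
# recurrent language model over a short token sequence.
import numpy as np

rng = np.random.default_rng(0)
V, d, H = 50, 16, 32                        # vocab size, embedding dim, hidden size
E = 0.1 * rng.standard_normal((V, d))       # word embeddings
Wh = 0.1 * rng.standard_normal((H, H))
Wx = 0.1 * rng.standard_normal((H, d))
Wo = 0.1 * rng.standard_normal((V, H))

def rnn_log_likelihood(tokens):
    """Sum of log P(next token | history) under the (untrained) model."""
    h = np.zeros(H)
    total = 0.0
    for cur, nxt in zip(tokens[:-1], tokens[1:]):
        h = np.tanh(Wh @ h + Wx @ E[cur])   # recurrent state update
        scores = Wo @ h
        p = np.exp(scores - scores.max()); p /= p.sum()
        total += np.log(p[nxt])
    return total

print(rnn_log_likelihood([3, 7, 1, 42, 5]))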

Lecture (Apr 26): GRUs and LSTMs -- for machine translation
Suggested Readings:
1. [Long Short-Term Memory (http://web.eecs.utk.edu/~itamar/courses/ECE-692/Bobby_
2. [Gated Feedback Recurrent Neural Networks (http://arxiv.org/pdf/1502.02367v3.pdf)]
3. [Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling (http://arxiv.org/pdf/1412.3555v1.pdf)]
[slides (lectures/CS224d-Lecture9.pdf)]
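
Alongside the readings above, a compact NumPy sketch (illustrative only, not course code) of a single GRU cell update, following the standard gating equations; weight names and sizes are assumptions.

# Hypothetical sketch (not course code): one update of a GRU cell, the gated
# recurrent unit this lecture compares with LSTMs.
import numpy as np

rng = np.random.default_rng(0)
d, H = 16, 32                       # input dim, hidden size
Wz, Uz = 0.1 * rng.standard_normal((H, d)), 0.1 * rng.standard_normal((H, H))
Wr, Ur = 0.1 * rng.standard_normal((H, d)), 0.1 * rng.standard_normal((H, H))
Wh, Uh = 0.1 * rng.standard_normal((H, d)), 0.1 * rng.standard_normal((H, H))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev):
    z = sigmoid(Wz @ x + Uz @ h_prev)                 # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)                 # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))     # candidate state
    return (1.0 - z) * h_prev + z * h_tilde           # interpolate old and new state

h = gru_step(rng.standard_normal(d), np.zeros(H))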

Proposal due (Apr 28): Course Project Proposal due [proposal description (project.html#proposal)]

Lecture (Apr 28): Recursive neural networks -- for parsing
Suggested Readings:
1. [Parsing with Compositional Vector Grammars (http://nlp.stanford.edu/pubs/SocherBauerManningNg_ACL2013.pdf)]
2. [Subgradient Methods for Structured Prediction (http://repository.cmu.edu/cgi/viewcontent.cgi?article=1054&context=robotics)]
3. [Parsing Natural Scenes and Natural Language with Recursive Neural Networks (http://www-nlp.stanford.edu/pubs/SocherLinNgManning_ICML2011.pdf)]
[Lecture Notes 5 (lecture_notes/LectureNotes5.pdf)]
[slides (lectures/CS224d-Lecture10.pdf)]
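
A brief illustrative sketch in NumPy (not course code) of how a recursive neural network composes phrase vectors bottom-up over a binary parse tree; the single-matrix composition and the toy tree are assumptions.

# Hypothetical sketch (not course code): composing phrase vectors bottom-up
# with a single-matrix recursive neural network over a tiny binary tree.
import numpy as np

rng = np.random.default_rng(0)
d = 10                                    # dimension shared by words and phrases
W = 0.1 * rng.standard_normal((d, 2 * d))
b = np.zeros(d)

def compose(node):
    """A node is either a word vector (leaf) or a (left, right) pair of nodes."""
    if isinstance(node, np.ndarray):
        return node
    left, right = node
    children = np.concatenate([compose(left), compose(right)])
    return np.tanh(W @ children + b)      # parent vector from its two children

# e.g. the tree ((the, cat), sat) with random stand-in word vectors
the, cat, sat = (rng.standard_normal(d) for _ in range(3))
phrase = compose(((the, cat), sat))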

Lecture (May 3): Recursive neural networks -- for different tasks (e.g. sentiment analysis)
Suggested Readings:
1. [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank (http://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)]
2. [Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection (http://papers.nips.cc/paper/4204-dynamic-pooling-and-unfolding-recursive-autoencoders-for-paraphrase-detection.pdf)]
3. [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks (http://arxiv.org/pdf/1503.00075v2.pdf)]
[slides (lectures/CS224d-Lecture11.pdf)]

A2 Due (May 5): Pset #2 due

Lecture (May 5): Review Session for Midterm
Suggested Readings: N/A
[slides (lectures/CS224d-Lecture12.pdf)]


Midterm (May 10): In-class midterm [midterm solutions (midterm/midterm_solutions.pdf)]

A3 released (May 12): Pset #3 released
[Pset 3 (assignment3/index.html)] [Pset 3 Solutions (assignment3/pset3_soln.pdf)] [Pset 3 Solution Code (assignment3/pset3_code.zip)]

Lecture (May 12): Convolutional neural networks -- for sentence classification
Suggested Readings:
1. [A Convolutional Neural Network for Modelling Sentences (http://nal.co/papers/Kalchbrenner_DCNN_ACL14)]
[slides (lectures/CS224d-Lecture13.pdf)]
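
To illustrate this lecture's topic, a small NumPy sketch (not course code) of a convolution-plus-max-pooling feature extractor over word embeddings; the filter sizes, window width, and helper name are assumptions.

# Hypothetical sketch (not course code): a single convolution + max-pooling
# layer of the kind used in sentence-classification CNNs.
import numpy as np

rng = np.random.default_rng(0)
d, k, n_filters = 50, 3, 8                   # embedding dim, window width, filters
filters = 0.1 * rng.standard_normal((n_filters, k * d))
sentence = rng.standard_normal((10, d))      # 10 words, each a d-dim embedding

def conv_max_pool(sent):
    """Slide each filter over every window of k words, then max-pool over time."""
    windows = np.stack([sent[i:i + k].ravel() for i in range(len(sent) - k + 1)])
    feature_maps = np.maximum(0.0, windows @ filters.T)   # ReLU activations
    return feature_maps.max(axis=0)                       # one feature per filter

print(conv_max_pool(sentence).shape)         # (n_filters,)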

Milestone (May 15): Course Project Milestone [milestone description (project.html#milestone)]

Lecture (May 17): Guest Lecture with Andrew Maas (http://ai.stanford.edu/~amaas/): Speech recognition
Suggested Readings:
1. [Deep Neural Networks for Acoustic Modeling in Speech Recognition (papers/maas_pa
[slides (lectures/CS224d-Lecture14.pdf)]

Lecture (May 19): Guest Lecture with Thang Luong (http://stanford.edu/~lmthang/): Machine Translation
Suggested Readings:
1. [Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models (papers/achieving.pdf)]
2. [Addressing the Rare Word Problem in Neural Machine Translation (papers/addressing
3. [Advances in natural language processing (papers/advances.pdf)]
4. [Neural machine translation by jointly learning to align and translate (papers/neural_ma
[slides (lectures/CS224d-Lecture15.pdf)]

A3 Due (May 21): Pset #3 due

Lecture (May 24): Guest Lecture with Quoc Le (http://cs.stanford.edu/~quocle/): Seq2Seq and Large Scale DL
Suggested Readings:
1. [Sequence to Sequence Learning with Neural Networks (papers/seq2seq.pdf)]
2. [Neural Machine Translation by Jointly Learning to Align and Translate (papers/nmt.pdf)]
3. [A Neural Conversation Model (papers/ancm.pdf)]
4. [Neural Programmer: Inducing Latent Programs with Gradient Descent (papers/npil.pdf)]
[slides (lectures/CS224d-Lecture16.pdf)]

Lecture (May 26): The future of Deep Learning for NLP: Dynamic Memory Networks
Suggested Readings:
1. [Ask Me Anything: Dynamic Memory Networks for NLP (http://arxiv.org/abs/1506.07285)]
[slides (lectures/CS224d-Lecture17.pdf)]

Poster Presentation (June 1): Final project poster presentations, 2-5 pm, Gates patio

Final Project Due (Jun 3): Final course project due date [project description]

