1. The document discusses various statistical and neural-network-based models for representing words and modeling semantics, including LSI, PLSI, LDA, word2vec, and neural network language models.
2. These models represent words by their distributional properties and contexts, using techniques such as matrix factorization, probabilistic modeling, and neural networks to learn vector representations.
3. Recent models such as word2vec use neural networks to learn word embeddings that capture linguistic regularities and can be applied to tasks such as solving analogies and machine translation (see the sketch below).
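To make point 3 concrete, here is a minimal sketch of training word embeddings and probing the classic analogy regularity. It uses the gensim library as one possible implementation (the document does not name a specific toolkit); the corpus and hyperparameters are illustrative placeholders, not values from the source.

```python
# Minimal sketch: train word2vec embeddings and query an analogy.
# Assumes gensim is installed; corpus and parameters are illustrative only.
from gensim.models import Word2Vec

# Toy corpus; in practice this would be a large tokenized text collection.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Train skip-gram embeddings (sg=1); vector_size and window are arbitrary here.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# The classic analogy probe: vector("king") - vector("man") + vector("woman")
# should land near vector("queen"). On a toy corpus the result is not
# meaningful; this line only illustrates the query pattern.
print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```

With a realistic corpus, `most_similar` ranks words by cosine similarity to the combined vector, which is how the linguistic regularities mentioned in point 3 are typically demonstrated.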