
Navigating the Graph: Unraveling the Power of ML in Stanford's Machine Learning with Graphs Course

As an avid learner and someone deeply fascinated by the interconnected world of data and
algorithms, I recently had the pleasure of diving into the captivating realm of graphs in machine
learning through Stanford University's comprehensive course. Stanford's Machine Learning with
Graphs course is where graphs become not just nodes and edges, but the very backbone of machine
learning magic. It is a highly recommended course, available for free on YouTube.
Course Overview:
The course masterfully unveils the synergy between graph theory and machine learning, unraveling
the profound impact of connectedness in shaping intelligent systems. Each lecture is a doorway to a
new dimension, where graphs emerge as powerful tools in the hands of machine learning
enthusiasts.
Lecture Highlights:
• Lecture 1 — Why Graphs?
• Introduction and overview of graphs
• Applications and use cases
• Choice of graph representation: directed, undirected, weighted, and bipartite graphs; adjacency matrices and adjacency lists (both representations are sketched below)
• Link to Lecture 1
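To ground the representation choices, here is a minimal sketch of my own (not from the course) showing the same small undirected graph as an adjacency list and as an adjacency matrix:

```python
import numpy as np

# The same undirected graph in the two representations from Lecture 1.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Adjacency list: compact for sparse graphs.
adj_list = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(adj_list)

# Adjacency matrix: O(n^2) memory but constant-time edge lookup.
n = 4
A = np.zeros((n, n), dtype=int)
for u, v in edges:
    A[u, v] = A[v, u] = 1   # symmetric because the graph is undirected
print(A)
```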

• Lecture 2 — Traditional Methods of Graph Feature Extraction


• Overview of node-level, link-level, and graph-level prediction tasks
• Node features: degree, centrality, clustering coefficient, graphlets (a few of these are sketched in code below)
• Edge features: shortest-path distance, common neighbors, local neighborhood overlap, global neighborhood overlap
• Graph features using kernel methods: graphlet kernel, Weisfeiler-Lehman kernel
• Link to Lecture 2
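To make these hand-crafted features concrete, here is a minimal sketch (my own illustration, not the course's code) that computes a few of the node-level and link-level features above with NetworkX:

```python
import networkx as nx

# Small social graph to illustrate the traditional features from Lecture 2.
G = nx.karate_club_graph()

# Node-level features
degree = dict(G.degree())                  # node degree
centrality = nx.betweenness_centrality(G)  # one of several centrality measures
clustering = nx.clustering(G)              # local clustering coefficient

# Link-level features for a candidate edge (u, v)
u, v = 0, 33
common = len(list(nx.common_neighbors(G, u, v)))        # common neighbors
distance = nx.shortest_path_length(G, u, v)             # shortest-path distance
jaccard = next(nx.jaccard_coefficient(G, [(u, v)]))[2]  # local neighborhood overlap

print(degree[u], centrality[u], clustering[u], common, distance, jaccard)
```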

• Lecture 3 — ML-based Feature Learning (Shallow Embeddings)


• Why we need ML-based feature representation learning
• Overview of embeddings and the embedding space
• Shallow embedding algorithms such as random walks and node2vec (a random-walk sketch follows this list)
• Different notions of node similarity: naive (two nodes are similar if they are connected), neighborhood overlap, and random-walk approaches
• How to embed a whole graph using the embedding techniques above
• Link to Lecture 3
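As a rough sketch of the random-walk idea behind DeepWalk and node2vec (my own simplification; node2vec additionally biases the walk with return and in-out parameters), walks are generated per node and then fed to a skip-gram model as if they were sentences:

```python
import random
import networkx as nx

def random_walk(G, start, length):
    """One unbiased random walk: the core primitive of shallow embeddings."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = list(G.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return walk

random.seed(0)
G = nx.karate_club_graph()

# Several walks per node become the "corpus"; training a skip-gram model
# (e.g. word2vec) on it yields embeddings where co-visited nodes end up close.
walks = [random_walk(G, n, length=10) for n in G.nodes() for _ in range(5)]
print(walks[0])
```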

• Lecture 4 — Popular Graph Algorithms


• In-depth discussion of the PageRank algorithm (a power-iteration sketch follows this list)
• In-depth discussion of random walk with restart
• Link to Lecture 4
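Here is a minimal power-iteration sketch of PageRank (my own, on a hypothetical toy adjacency matrix): with probability beta the random surfer follows an out-link, otherwise it teleports to a random node:

```python
import numpy as np

def pagerank(A, beta=0.85, iters=100):
    """Power iteration on a column-stochastic transition matrix built from
    adjacency matrix A, where A[i, j] = 1 means an edge from j to i."""
    n = A.shape[0]
    out_degree = A.sum(axis=0)
    out_degree[out_degree == 0] = 1   # guard against dangling nodes
    M = A / out_degree                # normalize each column by its out-degree
    r = np.full(n, 1.0 / n)           # start from a uniform rank vector
    for _ in range(iters):
        r = beta * M @ r + (1 - beta) / n   # follow links or teleport
    return r

A = np.array([[0, 0, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # tiny three-node directed graph
print(pagerank(A))                      # ranks sum to ~1
```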

• Lecture 5 — Important Concepts for Graph Neural Networks


• Covers message passing, the building block of graph neural networks (a one-step sketch follows this list)
• Intro to semi-supervised learning
• Talks about three types of classification: collective classification, local classification, and relational classification
• Two ways correlations form in a graph: homophily and influence
• Link to Lecture 5
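A single round of message passing can be sketched in a few lines (my own minimal version, using plain mean aggregation over a toy adjacency matrix):

```python
import numpy as np

def message_passing_step(A, H):
    """Each node replaces its features with the mean of its neighbors'
    features: messages are passed along edges, then aggregated."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1                 # avoid division by zero for isolated nodes
    return (A @ H) / deg

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # node 0 connected to nodes 1 and 2
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])               # initial node features
print(message_passing_step(A, H))        # node 0 now averages nodes 1 and 2
```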

• Lecture 6 — Introduction to Deep Learning


• Covers the basics of deep learning to better understand graph neural networks
• Tries to build intuition for GNNs by relating them to deep neural networks
• Link to Lecture 6
• Lecture 7 — Introduction to Graph Neural Networks
• A general perspective on GNNs
• Dives into a single layer of a GNN (sketched in code below)
• Dives into multi-layer GNNs
• Link to Lecture 7
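As a sketch of what a single layer computes (my own NumPy rendition of a GCN-style layer; real implementations would use a framework such as PyTorch Geometric), each node aggregates normalized neighbor features and applies a learned transform:

```python
import numpy as np

def gnn_layer(A, H, W):
    """One GCN-style layer: H' = ReLU(D^-1 (A + I) H W).
    Self-loops keep each node's own signal; D^-1 averages over neighbors."""
    A_hat = A + np.eye(A.shape[0])
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    return np.maximum(0.0, D_inv @ A_hat @ H @ W)

rng = np.random.default_rng(0)
A = np.array([[0, 1], [1, 0]], dtype=float)  # two connected nodes
H = rng.normal(size=(2, 4))                  # input node features
W = rng.normal(size=(4, 3))                  # learnable weights
print(gnn_layer(A, H, W).shape)              # (2, 3): new 3-dim embeddings
# Stacking such layers yields a multi-layer GNN with a growing receptive field.
```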

• Lecture 8 — GNN Preprocessing and Training


• Graph augmentation
• How graph neural networks are trained
• How a GNN can be implemented for node classification, link prediction, and graph classification (the three task heads are sketched below)
• Link to Lecture 8
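The three prediction heads can be sketched on top of hypothetical node embeddings Z produced by a trained GNN (my own illustration, not the course's code):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(5, 8))        # hypothetical node embeddings (n x d)

# Node classification: a linear head over each node embedding.
W_node = rng.normal(size=(8, 3))
node_logits = Z @ W_node           # (n, num_classes)

# Link prediction: score a candidate edge, e.g. by a dot product.
link_score = Z[0] @ Z[3]

# Graph classification: pool node embeddings, then classify the pooled vector.
graph_embedding = Z.mean(axis=0)   # mean pooling over nodes
W_graph = rng.normal(size=(8, 2))
graph_logits = graph_embedding @ W_graph

print(node_logits.shape, link_score, graph_logits.shape)
```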

• Lecture 9 — GNN Expressive Power


• Talks about how expressive GNN embeddings are at distinguishing different graph structures
• To generate a node embedding, a GNN uses a computational graph corresponding to a subtree rooted at each node
• A GNN can fully distinguish different subtree structures if every step of its neighbor aggregation is injective
• In short, the expressive power of a GNN depends on its neighborhood aggregation function
• Covers the expressive power of different neighborhood aggregation functions such as max-pooling and mean-pooling (compared in code below)
• Covers the injective aggregation function used in the GIN architecture, which is far more expressive than the other functions
• Link to Lecture 9
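A tiny numeric illustration (my own) of why the aggregation function matters: mean and max pooling cannot tell two different neighbor multisets apart, while the sum aggregation at the heart of GIN can:

```python
import numpy as np

# Two different neighbor multisets, written as rows of one-hot features.
a = np.array([[1.0, 0.0],
              [0.0, 1.0]])                  # multiset {x, y}
b = np.array([[1.0, 0.0], [1.0, 0.0],
              [0.0, 1.0], [0.0, 1.0]])      # multiset {x, x, y, y}

print(a.mean(axis=0), b.mean(axis=0))  # identical -> mean is not injective
print(a.max(axis=0), b.max(axis=0))    # identical -> max is not injective
print(a.sum(axis=0), b.sum(axis=0))    # different -> sum distinguishes them
```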

• Lecture 10 — Heterogeneous Graphs


• What heterogeneous graphs are and their practical applications
• Covers how GNNs can be implemented for heterogeneous graphs using RGCN
• What knowledge graphs are
• Talks about the knowledge graph completion task and how it appears in many real-world problems
• How a GNN can perform knowledge graph completion by modeling it as a link prediction task (a TransE scoring sketch follows this list)
• Intro to algorithms such as TransE, TransR, DistMult, and ComplEx, which can be used for knowledge graph completion
• Link to Lecture 10
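TransE's scoring idea fits in a few lines (a sketch with hypothetical random embeddings): a relation is modeled as a translation, so for a true triple (h, r, t) the vector h + r should land near t:

```python
import numpy as np

def transe_score(h, r, t):
    """Negative distance between h + r and t: higher means more plausible."""
    return -np.linalg.norm(h + r - t)

rng = np.random.default_rng(0)
h = rng.normal(size=16)                           # head entity embedding
r = rng.normal(size=16)                           # relation embedding
t_good = h + r + rng.normal(scale=0.01, size=16)  # tail consistent with (h, r)
t_bad = rng.normal(size=16)                       # random, implausible tail

print(transe_score(h, r, t_good))  # close to 0
print(transe_score(h, r, t_bad))   # much more negative
```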

• Lecture 11 — Deeper Dive into Knowledge Graph Applications


• Question answering over knowledge graphs: ask complex questions of your knowledge graph using knowledge graph completion techniques
• Overview of the query2box technique for answering complex queries over a knowledge graph
• Link to Lecture 11

• Lecture 12 — Neural Subgraph Matching


• Intro to neural subgraph matching; subgraph matching is a classical computer science problem
• Applications of subgraph matching
• How GNNs can be used to perform neural subgraph matching (an order-embedding sketch follows this list)
• Link to Lecture 12
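The core trick, as I understand it, is an order embedding: a GNN is trained so that if graph Q is a subgraph of graph T, then Q's embedding is elementwise less than or equal to T's. A minimal sketch of that decision rule, with hypothetical embeddings:

```python
import numpy as np

def predicts_subgraph(z_query, z_target, margin=0.0):
    """Order-embedding test: the query is predicted to be a subgraph of the
    target if its embedding is elementwise <= the target's (up to a margin)."""
    return bool(np.all(z_query <= z_target + margin))

z_q = np.array([0.2, 0.1, 0.4])
z_t = np.array([0.5, 0.3, 0.9])
print(predicts_subgraph(z_q, z_t))  # True: z_q is "below" z_t elementwise
print(predicts_subgraph(z_t, z_q))  # False: the relation is not symmetric
```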

• Lecture 13 — Network Communities


• What network communities are and their applications
• Covers concepts important for community detection in graphs, such as modularity
• Covers Louvain, a widely used, scalable, and fast community detection algorithm (demonstrated below)
• Link to Lecture 13
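Louvain is easy to try out; NetworkX (version 2.8 or newer) ships an implementation, so a quick sketch on a toy graph looks like this:

```python
import networkx as nx

G = nx.karate_club_graph()

# Louvain greedily moves nodes between communities to increase modularity,
# then collapses communities into super-nodes and repeats.
communities = nx.community.louvain_communities(G, seed=42)
print(communities)

# Modularity: how much denser the found communities are than expected by chance.
print(nx.community.modularity(G, communities))
```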

• Lecture 14–15 — Generative Models


• What generative models are and why we need them
• Starts with the properties of real-world graphs that we want to synthesize
• Covers traditional generative models, then moves on to deep graph generative models (two traditional models are sketched below)
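For flavor, two classic traditional generative models are one-liners in NetworkX (my own example, not the course's code):

```python
import networkx as nx

# Erdős–Rényi: every possible edge appears independently with probability p.
er = nx.erdos_renyi_graph(n=100, p=0.05, seed=0)

# Barabási–Albert: preferential attachment, which reproduces the heavy-tailed
# degree distributions observed in many real-world graphs.
ba = nx.barabasi_albert_graph(n=100, m=2, seed=0)

# The BA graph tends to grow hubs with a much higher maximum degree.
print(max(d for _, d in er.degree()), max(d for _, d in ba.degree()))
```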

• Lecture 16 — Limitations of Graph Neural Networks


• A perfect GNN should embed different graph structures distinctly and similar structures closely
• But in some real-world tasks the position of a node within the graph matters, and this is where a traditional GNN fails: it treats all positions within the same structure identically, so it works best for structure-aware tasks rather than position-aware ones
• The second issue arises when graph structures contain cycles
• Describes these limitations in detail, then delves into solutions
• Intro to position-aware GNNs to combat the first limitation (the anchor-distance idea is sketched after this list)
• A better message-passing technique to tackle cyclic structures: an intro to identity-aware GNNs
• Link to Lecture 16
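The anchor-distance intuition behind position-aware GNNs can be sketched as follows (a simplification of my own; the actual P-GNN samples multiple anchor sets and learns how to aggregate the distances):

```python
import random
import networkx as nx
import numpy as np

random.seed(0)
G = nx.karate_club_graph()

# Distances to a few random anchor nodes give two structurally identical
# nodes in different positions different representations.
anchors = random.sample(list(G.nodes()), 4)
features = np.array([
    [nx.shortest_path_length(G, node, a) for a in anchors]
    for node in G.nodes()
])
print(features[:3])  # each row: one node's distances to the 4 anchors
```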

• Lecture 17 — Scaling GNNs


• Unlike traditional ML problems, where the dataset can be split into batches and only a few batches need to sit in memory at a time, a GNN cannot be naively mini-batched: its vertices are connected, so examples are not independent
• Covers various GNN sampling techniques and their limitations
• Two methods perform message passing over small subgraphs, so only one subgraph needs to be loaded onto the GPU at a time (neighbor sampling is sketched after this list):
• Neighbor sampling
• Cluster-GCN
• One method simplifies the GNN into a feature preprocessing step so that it can run even on a CPU:
• Simplified GCN
• Link to Lecture 17
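Neighbor sampling can be sketched in plain Python (my own GraphSAGE-flavored simplification): instead of a node's full multi-hop neighborhood, keep at most a fixed number of random neighbors per hop, which bounds the size of every mini-batch:

```python
import random
import networkx as nx

def sample_computation_graph(G, node, fanout=(3, 2)):
    """Sample at most fanout[k] neighbors per node at hop k, so the
    computation graph for one node stays small enough for GPU memory."""
    layers = [[node]]
    for limit in fanout:
        frontier = []
        for u in layers[-1]:
            neighbors = list(G.neighbors(u))
            frontier.extend(random.sample(neighbors, min(limit, len(neighbors))))
        layers.append(frontier)
    return layers

random.seed(0)
G = nx.karate_club_graph()
print(sample_computation_graph(G, node=0))  # [[0], <=3 hop-1 nodes, <=6 hop-2 nodes]
```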

Conclusion
The fusion of graph theory with machine learning reveals a transformative synergy, unlocking the
immense potential hidden within the connectedness of data structures. The course not only imparts
a deep understanding of graph representations, algorithms, and neural networks but also showcases
how machine learning leverages graph structure, creating a dynamic interplay that reshapes the landscape of
intelligent systems.
The power of connectedness in machine learning is vividly exemplified throughout the lectures.
From the foundational insights of why graphs matter to the intricacies of graph embeddings, each
module underscores the critical role that interconnected data plays in enhancing the capabilities of
machine learning models. The lectures on graph neural networks (GNNs) shed light on how the
inherent structure of graphs, coupled with advanced learning techniques, results in models that
surpass traditional approaches.
