
EVERYTHING IS CONNECTED:

GRAPH NEURAL NETWORKS


Seminar Presentation

Presented by: Vishnu Prasad S
Guide: Dr Jina Varghese
Table of Contents

Introduction
Literature review papers
Methodology/Techniques used
Advantages
Disadvantages
Applications
Conclusion
References
Introduction

Imagine you're faced with a dataset that isn't just a simple table of
rows and columns, but a network of interconnected nodes and
edges, like social networks, recommendation systems, or biological
networks. How do you extract meaningful information from such
data? That's where Graph Neural Networks come into play.
Literature Review

Paper 1
Title: "On the Bottleneck of Graph Neural Networks and its Practical Implications"
Authors: Uri Alon and Eran Yahav
Overview: The paper addresses the inherent problem of over-squashing in Graph Neural Networks (GNNs): GNNs struggle to propagate information between distant nodes in a graph because of a bottleneck that arises when messages are aggregated across a long path.
Advantages/Techniques: The authors demonstrate that popular GNNs such as GCN and GIN, which absorb incoming edges equally, are more susceptible to over-squashing than GAT and GGNN.
Limitations: The paper sheds light on the challenges GNNs face when propagating messages between distant nodes and presents strategies to overcome these limitations. However, this is an ongoing area of research, and there may be additional limitations or future work to explore.

Paper 2
Title: "Interaction Networks for Learning about Objects, Relations and Physics"
Authors: Peter W. Battaglia, Razvan Pascanu, Matthew Lai, Danilo Jimenez Rezende, and Koray Kavukcuoglu (published in 2016)
Overview: The paper introduces a model called the interaction network that can reason about how objects in complex systems interact, supporting dynamical predictions as well as inferences about the abstract properties of the system.
Advantages/Techniques: It can reason about how objects in complex systems interact. It supports dynamical predictions. It can make inferences about the abstract properties of the system. It takes graphs as input and performs object- and relation-centric reasoning.

Paper 3
Title: "Relational Inductive Biases, Deep Learning, and Graph Networks"
Authors: Peter W. Battaglia, Jessica B. Hamrick, Victor Bapst, Alvaro Sanchez-Gonzalez, Vinicius Zambaldi, Mateusz Malinowski, Andrea Tacchetti, David Raposo, et al.
Overview: The paper explores how relational inductive biases within deep learning architectures can facilitate learning about entities, relations, and rules for composing them.
Advantages/Techniques: Facilitates learning about entities, relations, and rules for composing them. Generalizes and extends various approaches for neural networks operating on graphs. Provides a straightforward interface for manipulating structured knowledge and producing structured behaviors.

Paper 4
Title: "Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges"
Authors: Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković
Overview: The paper explores how geometric principles can expose regularities in high-dimensional data and facilitate learning about entities, relations, and rules for composing them.
Advantages/Techniques: Facilitates learning about entities, relations, and rules for composing them. Generalizes and extends various approaches for neural networks operating on graphs. Provides a straightforward interface for manipulating structured knowledge and producing structured behaviors.
What is a Graph?
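A graph G = (V, E) is a set of nodes (vertices) connected by edges; nodes, and sometimes edges, carry feature vectors describing the entities and their relationships. As a purely illustrative sketch (the node count, features, and edges below are made up), a small graph can be stored as an edge list plus a node-feature matrix:

```python
import numpy as np

# A toy undirected graph with 4 nodes; each undirected edge is listed in
# both directions so that message-passing code can use it directly.
edges = np.array([
    [0, 1], [1, 0],
    [1, 2], [2, 1],
    [2, 3], [3, 2],
    [0, 2], [2, 0],
])

# One 3-dimensional feature vector per node (random placeholders).
node_features = np.random.rand(4, 3)

# Dense adjacency matrix built from the edge list.
adjacency = np.zeros((4, 4))
adjacency[edges[:, 0], edges[:, 1]] = 1.0
```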
Methodology
Node classification in Graph Neural Networks (GNNs) is the task of assigning labels to
nodes within a graph.
This process leverages the graph's structure and the features associated with each node.
GNNs, through layers of graph convolutions and non-linear transformations, learn to
capture intricate relationships and dependencies among nodes.
This approach finds applications in various domains, including recommendation systems,
social network analysis, and biology, where understanding the properties and roles of
interconnected entities is crucial.
By combining local and global information, GNNs excel at making predictions and
classifications in graph-structured data, unlocking insights that traditional machine
learning approaches may miss.
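As a minimal, illustrative sketch of node classification (not code from the presentation or the cited papers), the snippet below stacks two graph convolutions of the form H' = σ(Â·H·W), where Â is the symmetrically normalized adjacency matrix with self-loops, and trains on the few labelled nodes; all sizes, labels, and names are placeholders:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I, i.e. D^{-1/2} (A + I) D^{-1/2}."""
    adj_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = adj_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * adj_hat * d_inv_sqrt.unsqueeze(0)

class SimpleGCN(nn.Module):
    """Two stacked graph convolutions producing one class-score vector per node."""
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden_dim)
        self.w2 = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj_norm):
        # Each layer mixes a node's features with those of its neighbours.
        h = F.relu(adj_norm @ self.w1(x))
        return adj_norm @ self.w2(h)   # per-node logits

# Toy usage: 4 nodes, 3 input features, 2 classes, labels on only 2 nodes.
adj = torch.tensor([[0, 1, 1, 0],
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float)
x = torch.randn(4, 3)
model = SimpleGCN(3, 8, 2)
logits = model(x, normalize_adjacency(adj))
loss = F.cross_entropy(logits[[0, 3]], torch.tensor([0, 1]))  # semi-supervised
loss.backward()
```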
Link prediction in Graph Neural Networks (GNNs) involves predicting the existence and
often the type of edges between nodes in a graph.
GNNs, equipped with their ability to capture intricate relationships and dependencies
within graph data, are particularly adept at this task.
They find applications in diverse fields, including social networks, recommendation
systems, and biology, aiding in tasks like friend recommendation, content link prediction,
and interaction analysis.
By harnessing the power of GNNs, we can uncover hidden connections and make
informed decisions in various network-based applications.
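One common (though not the only) way to set up link prediction is to score a candidate edge (u, v) by the similarity of the two node embeddings produced by a GNN encoder, for instance a dot product squashed through a sigmoid, trained against observed edges and sampled non-edges. The sketch below is illustrative; the embeddings, pairs, and labels are placeholders:

```python
import torch
import torch.nn.functional as F

# Node embeddings from any GNN encoder (e.g. the SimpleGCN sketch above
# without its final classification layer); random placeholders here.
embeddings = torch.randn(4, 8)

def edge_score(emb, pairs):
    """Predicted probability that each (source, target) pair is a real edge."""
    src, dst = pairs[:, 0], pairs[:, 1]
    return torch.sigmoid((emb[src] * emb[dst]).sum(dim=1))

# Positives are observed edges; negatives are sampled node pairs with no edge.
positive_pairs = torch.tensor([[0, 1], [2, 3]])
negative_pairs = torch.tensor([[0, 3], [1, 3]])

scores = edge_score(embeddings, torch.cat([positive_pairs, negative_pairs]))
labels = torch.tensor([1.0, 1.0, 0.0, 0.0])
loss = F.binary_cross_entropy(scores, labels)
```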
Graph classification in Graph Neural Networks (GNNs) is the process of assigning a label or
category to an entire graph, based on its structural properties and the relationships between
its nodes and edges.
This task is crucial in various domains, such as molecule classification in chemistry, fraud
detection in financial networks, and document categorization in natural language processing.
GNNs excel at graph classification by aggregating information from individual nodes and
edges, enabling the model to understand and classify entire graphs according to their unique
patterns and features.
This capability makes GNNs a valuable tool for solving problems where the data is inherently
structured as graphs, providing insights and solutions that traditional machine learning
methods may struggle to achieve.
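For graph classification, the node embeddings are typically pooled into a single permutation-invariant graph-level vector (mean pooling in this sketch) and passed to an ordinary classifier. Again a hedged, illustrative example rather than any specific library's API:

```python
import torch
import torch.nn as nn

class GraphClassifier(nn.Module):
    """Mean-pool node embeddings into one vector per graph, then classify."""
    def __init__(self, node_dim, num_classes):
        super().__init__()
        self.readout = nn.Linear(node_dim, num_classes)

    def forward(self, node_embeddings):
        # node_embeddings: (num_nodes, node_dim) for a single graph.
        graph_vector = node_embeddings.mean(dim=0)   # permutation-invariant pooling
        return self.readout(graph_vector)            # class logits for the whole graph

# Toy usage: classify one molecule-like graph with 5 nodes into 3 categories.
node_embeddings = torch.randn(5, 8)                  # e.g. output of a GNN encoder
logits = GraphClassifier(8, 3)(node_embeddings)
```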
Advantages

Modeling Complex Relationships

Node and Graph-Level Representations

Transferability

Flexibility

Interpretable Representations
Disadvantages
Computational Complexity

Over-smoothing

Limited Handling of Graph Evolution

Data Sparsity

Generalization Challenges

Scalability
Applications

Social Network Analysis

Recommendation Systems

Biological and Chemical Sciences

Natural Language Processing (NLP)

Fraud Detection

Knowledge Graph Completion

Traffic Analysis and Routing


Conclusion

In conclusion, Graph Neural Networks (GNNs) represent a groundbreaking
advancement in the field of machine learning, offering a powerful framework
for modeling and understanding complex relationships within graph-structured data.
GNNs have found applications in a wide array of domains, from social
network analysis and recommendation systems to biology, finance, and
beyond.
Their ability to capture local and global information, provide interpretable
representations, and adapt to various graph structures makes them a
versatile tool for tackling real-world problems.
References
Alon, U., Yahav, E., 2021. On the bottleneck of graph neural networks and its practical
implications, in: International Conference on Learning Representations.

Battaglia, P., Pascanu, R., Lai, M., Jimenez Rezende, D., et al., 2016. Interaction
networks for learning about objects, relations and physics. Advances in neural
information processing systems 29.

Battaglia, P.W., Hamrick, J.B., Bapst, V., Sanchez-Gonzalez, A., Zambaldi, V.,
Malinowski, M., Tacchetti, A., Raposo, D., Santoro, A., Faulkner, R., et al., 2018.
Relational inductive biases, deep learning, and graph networks. arXiv preprint
arXiv:1806.01261.

Bronstein, M.M., Bruna, J., Cohen, T., Veličković, P., 2021. Geometric deep learning:
Grids, groups, graphs, geodesics, and gauges. arXiv preprint arXiv:2104.13478.
