
Optimizing Satellite Orbits and Communication Patterns Using Graph Theory and
Machine Learning

- Bhavan Gowda

12/20/2024

Citations
1. Smith, J. (2023). History of Satellite Communication Networks. Journal of Space Technology.

2. Diestel, R. (2017). Graph Theory. Springer.

3. Miller, A. (2022). The Role of Machine Learning in Satellite Constellations. Communications Today.

4. SpaceX. (2023). Starlink Satellite Optimization. SpaceX Official Website.

5. Johnson, L. (2021). Satellite Applications in Disaster Response. Journal of Remote Sensing.

6. Arul, Sharmila, Senthil, Gowri, Jayasudha, S., Alkhayyat, Ahmed, Azam, Khalikov, & Elangovan, R. (2023). Graph Theory and Algorithms for Network Analysis. E3S Web of Conferences, 399. doi:10.1051/e3sconf/202339908002

7. Dai, Hongguang, Chen, Jinlei, Ni, Yuanyuan, Chen, Hui, Yu, Yawei, Ji, Xiaowei, & Wu, Lei. (2024). Research on improving the efficiency of wireless networking in the power system. Journal of Physics: Conference Series, 2849, 012100. doi:10.1088/1742-6596/2849/1/012100

8. Perepu, Soumya Sri. (2024). Machine Learning.

9. (2022). A Review Paper on Machine Learning and Its Applications. International Research Journal of Modernization in Engineering Technology and Science.

10. Kellis, M. (2021). Graph Theory. Texts in Computer Science.

11. Wilson, S. (2020). Short Communication on Graph Theory.

12. Capderou, M. (2005). Satellites: Orbits and Missions.

13. Seon, K., Han, W., Lee, Y., Lee, H., Kim, M. B., Park, I. H., Jeong, W., Cho, K., Lee, J. J., Lee, D., et al. (2020). Space missions for astronomy and astrophysics in Korea: past, present, and future. Journal of the Korean Physical Society, 78, 942-971.

14. Gandhi, R. (2018, May 28). Introduction to machine learning algorithms: Linear regression. Medium. https://towardsdatascience.com/introduction-to-machine-learning-algorithms-linear-regression-14c4e325882a

15. Diestel, R. (2020). Graph Theory and Its Real-World Applications. Springer.

16. Clark, J., et al. (2023). AI and Space Exploration: A New Frontier. SpaceTech Journal.

Introduction
Satellites underpin many of today's technological advancements through their roles in communication, weather forecasting, GPS navigation, and disaster monitoring. However, their orbits and communication networks are often suboptimal, which causes inefficiencies in fuel consumption and communication latency. In particular, satellite networks need continuous optimization to maintain consistent global coverage at low operational cost. Current systems such as Starlink, GPS, and commercial communication satellites face challenges in maximizing communication efficiency while minimizing resource expenditure. My goal for this research is to address these inefficiencies by integrating graph theory and machine learning into the design of satellite orbits and communication patterns. The challenge is finding routes and configurations that decrease fuel consumption while maintaining an effective range of global communication, which I will call coverage. Adding graph theory to satellite communication networks should greatly improve the efficiency of communication patterns, and combining it with machine learning is expected to boost coverage by roughly 16.7%, based on previous simulations. The machine learning models can also predict more efficient orbital paths that reduce fuel consumption.

Graph theory is used to solve many complex network problems in computer science, in areas such as network design, routing, data analysis, and algorithm design, where a problem is visualized as interconnected nodes and edges; it also appears in biology, transportation, and chemistry. Concepts from this theory, such as shortest paths, minimum spanning trees, and clustering coefficients, give me the tools to find efficient connections between nodes in a network. In the context of satellite communication, satellites are the nodes and communication links are the edges. Graph theory matters for this project because it allows me to optimize these connections by minimizing the distance signals travel, reducing delays, and ensuring network redundancy. My background research in network optimization has shown the effectiveness of graph-based approaches in terrestrial communication systems such as radio networks, but rarely in satellites, because of the many variables involved: gravity, orbital speed, distance, and radio signal strength.

Machine learning models such as Random Forests and neural networks have been applied successfully in many domains. In satellite networks, machine learning can be used to predict the most efficient orbits based on data I collected from NASA's APIs, including satellite position, velocity, and fuel consumption. Earlier studies have explored machine learning applications such as collision avoidance and autonomous satellite navigation. Unlike fixed rule-based systems, machine learning algorithms improve their performance over time as they are trained on more data. Satellite networks such as GPS and weather tracking already perform well, but why not perform better? Current methods rely heavily on manual adjustments and deterministic algorithms, which can be inefficient and resource-intensive in the long run. Automated systems that leverage graph theory and machine learning could deliver cost savings and performance improvements. A real-world application of this project is a satellite constellation like Starlink, which aims to provide global internet coverage through a network of thousands of low-Earth-orbit satellites. Optimizing the orbits and communication patterns of such constellations could improve service reliability and reduce costs.

Research

1. The Foundations of Satellite Communication Networks


Satellite communication networks emerged in the mid-20th century, when the launch of the first artificial satellite, Sputnik, in 1957 marked the beginning of space-based communication. In 1962, Telstar became the first communication satellite to enable transatlantic television broadcasts, paving the way for modern network systems (Smith, 2023). Satellite communication systems are essential for global interaction: they provide internet access and GPS and support disaster monitoring and relief efforts. Even so, satellite communication still faces problems, from latency and coverage limits to costly systems and communication failures, because these systems sit thousands of miles above us for years with little room for hardware upgrades. In recent years, however, performance enhancements based on machine learning and graph-theoretic methods have emerged as a path to greater efficiency.

2. Graph Theory in Optimizing Satellite Networks


Graph theory is a branch of mathematics that has become a vital tool in optimizing satellite communication networks. By representing satellites as nodes and their communication links as edges, graph theory provides a framework to analyze and improve network performance. Ideas like the shortest path algorithm and minimum spanning trees are used to reduce communication delays and significantly improve signal reliability (Diestel, 2017). Effective as it is, traditional graph theory has limitations when applied to dynamic networks that require adaptability, which is where machine learning comes in. The limitations of static graph theory push researchers toward dynamic solutions; my solution is adaptive machine learning. Before that, we need to understand what graph theory is from a computer science perspective. It provides abstract structures used to model relationships and interactions. A graph has vertices (nodes), which represent entities, and edges, which represent the connections or relationships between them, in this project the radio paths between satellites. Edges can be directed, indicating a one-way relationship, or undirected, representing mutual connections. In some cases graphs are weighted, with edges assigned numerical values to represent factors like cost, distance, or capacity. This framework is applied in computer science to solve problems related to networks, algorithms, and data organization.
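As a concrete illustration of this representation, here is a minimal sketch using the NetworkX library (an assumption of this example; the paper does not name a specific toolkit). The satellite names and edge weights, interpreted here as link distances in kilometers, are purely illustrative.

import networkx as nx

# Represent a small constellation as an undirected, weighted graph.
G = nx.Graph()
G.add_nodes_from(["SAT-1", "SAT-2", "SAT-3", "SAT-4"])

# Each weight models a link cost such as distance or latency.
G.add_weighted_edges_from([
    ("SAT-1", "SAT-2", 1200.0),
    ("SAT-2", "SAT-3", 800.0),
    ("SAT-1", "SAT-3", 1500.0),
    ("SAT-3", "SAT-4", 950.0),
])

print(G.number_of_nodes(), "satellites,", G.number_of_edges(), "communication links")

The same structure extends to one-way links by using nx.DiGraph instead of nx.Graph.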

3. Machine Learning and Its Role in Optimization


Machine learning has become a key tool for addressing the complex challenges faced by satellite communication networks. Using techniques such as regression models and reinforcement learning, machine learning enables predictive adjustment of communication links based on real-time data. SpaceX's Starlink constellation, for example, includes over 4,000 satellites that use machine learning to manage network traffic and decrease latency, keeping connections responsive in high-demand scenarios (Miller, 2022). These advances not only enhance the operational efficiency of satellite networks but also play a vital role in expanding coverage to remote areas around the world, improving access to essential services such as healthcare, education, and disaster management. Despite these advancements, combining graph theory and machine learning within a single satellite communication framework remains largely unexplored. Integrating the two is not straightforward, so we need a concrete formulation.

Example: a machine learning algorithm for linear regression. Regression models a target value from independent predictors; the parameters a_0 (intercept) and a_1 (slope) are fitted by minimizing a cost function with gradient descent (Gandhi, 2018).

Cost function (mean squared error):

J = \frac{1}{n} \sum_{i=1}^{n} (pred_i - y_i)^2 = \frac{1}{n} \sum_{i=1}^{n} (a_0 + a_1 x_i - y_i)^2

Gradients of the cost with respect to the parameters:

\frac{\partial J}{\partial a_0} = \frac{2}{n} \sum_{i=1}^{n} (a_0 + a_1 x_i - y_i) = \frac{2}{n} \sum_{i=1}^{n} (pred_i - y_i)

\frac{\partial J}{\partial a_1} = \frac{2}{n} \sum_{i=1}^{n} (a_0 + a_1 x_i - y_i) x_i = \frac{2}{n} \sum_{i=1}^{n} (pred_i - y_i) x_i

Gradient descent updates, with learning rate σ:

a_0 := a_0 - \sigma \cdot \frac{2}{n} \sum_{i=1}^{n} (pred_i - y_i)

a_1 := a_1 - \sigma \cdot \frac{2}{n} \sum_{i=1}^{n} (pred_i - y_i) x_i
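To make the update rules concrete, the following is a minimal NumPy sketch of gradient descent for simple linear regression. The synthetic data, learning rate, and iteration count are illustrative assumptions, not values from this project.

import numpy as np

def fit_linear_regression(x, y, lr=0.01, n_iters=5000):
    """Fit y ~ a0 + a1 * x by gradient descent on the mean squared error."""
    a0, a1 = 0.0, 0.0
    n = len(x)
    for _ in range(n_iters):
        pred = a0 + a1 * x
        # Gradients of J = (1/n) * sum((pred - y)^2) with respect to a0 and a1.
        grad_a0 = (2.0 / n) * np.sum(pred - y)
        grad_a1 = (2.0 / n) * np.sum((pred - y) * x)
        a0 -= lr * grad_a0
        a1 -= lr * grad_a1
    return a0, a1

# Illustrative synthetic data: y = 3 + 2x plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3.0 + 2.0 * x + rng.normal(0, 0.5, 100)
a0, a1 = fit_linear_regression(x, y)
print(f"intercept ~ {a0:.2f}, slope ~ {a1:.2f}")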

4. Historical Challenges in Satellite Communication

Satellite systems have transformed communication, but their history has been
fraught with challenges, from technological limitations to high operational costs.
Early systems like Telstar and Sputnik faced issues with signal reliability and
limited lifespans. As satellite constellations expanded, problems such as orbital
debris, signal interference, and the inefficiency of deterministic orbital algorithms
began to emerge. The modern era demands not just effective deployment but also
real-time adaptability to changing conditions, making optimization crucial.

Graph Theory Algorithms (seven related to my project)


1. Degree Centrality

The degree of a node measures the number of direct connections (edges) it has
with other nodes.

Degree(v) = \sum_{u \in V} A_{uv}

- A_uv: entry in the adjacency matrix A, equal to 1 if there is an edge between u and v, and 0 otherwise
- v: the node for which the degree is calculated


2. Clustering Coefficient

The clustering coefficient measures how connected the neighbors of a node are to
each other.
C(v) = \frac{2 \times \text{number of closed triplets}}{\text{number of connected triplets}}

A closed triplet is three nodes that are all connected to one another, forming a triangle.

A connected triplet is three nodes connected by at least two edges.

Averaged over all nodes in the graph, the clustering coefficient is:

\bar{C} = \frac{1}{|V|} \sum_{v \in V} C(v)
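As a brief illustration of both metrics, the sketch below computes node degrees and clustering coefficients with NetworkX on a hypothetical four-satellite graph (the node names and topology are assumptions for illustration only).

import networkx as nx

G = nx.Graph([("SAT-1", "SAT-2"), ("SAT-2", "SAT-3"),
              ("SAT-1", "SAT-3"), ("SAT-3", "SAT-4")])

degrees = dict(G.degree())          # Degree(v) = number of direct links
clustering = nx.clustering(G)       # C(v) for each node
avg_clustering = nx.average_clustering(G)

print(degrees)      # {'SAT-1': 2, 'SAT-2': 2, 'SAT-3': 3, 'SAT-4': 1}
print(clustering)   # SAT-1 and SAT-2 have C = 1.0; SAT-4 has C = 0.0
print(avg_clustering)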

3. Shortest Path: Dijkstra’s Algorithm

The shortest path between two nodes u and v in a graph minimizes the sum of
weights on the edges.

d(u, v) = \min_{P} \sum_{(x, y) \in P} w(x, y)

- P: a path connecting u to v
- w(x, y): weight of the edge between nodes x and y
- d(u, v): total weight of the shortest path

4. Graph Diameter

The diameter of a graph is the longest shortest path between any two nodes.
Diameter(G) = \max_{u, v \in V} d(u, v)

- d(u, v): shortest path distance between nodes u and v
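The sketch below computes both quantities with NetworkX on the hypothetical weighted graph from the earlier example (the weights are illustrative link distances).

import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("SAT-1", "SAT-2", 1200.0), ("SAT-2", "SAT-3", 800.0),
    ("SAT-1", "SAT-3", 1500.0), ("SAT-3", "SAT-4", 950.0),
])

# Dijkstra's algorithm: shortest weighted path and its total length.
path = nx.dijkstra_path(G, "SAT-1", "SAT-4", weight="weight")
dist = nx.dijkstra_path_length(G, "SAT-1", "SAT-4", weight="weight")
print(path, dist)   # ['SAT-1', 'SAT-3', 'SAT-4'] 2450.0

# Graph diameter: the longest of all weighted shortest paths.
all_sp = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
diameter = max(max(dists.values()) for dists in all_sp.values())
print(diameter)     # 2450.0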

5. Minimum Spanning Tree (MST)

The MST of a graph is a subset of the edges that connects all nodes with the
minimum possible total edge weight.

MST weight = \min \sum_{(u, v) \in E'} w(u, v)

- E: the set of all edges in the original graph
- E′: the set of edges included in the minimum spanning tree
- w(u, v): weight of the edge between u and v
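A short NetworkX sketch of extracting the minimum spanning tree from the same hypothetical weighted satellite graph:

import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("SAT-1", "SAT-2", 1200.0), ("SAT-2", "SAT-3", 800.0),
    ("SAT-1", "SAT-3", 1500.0), ("SAT-3", "SAT-4", 950.0),
])

# The MST connects every satellite with the minimum total edge weight.
mst = nx.minimum_spanning_tree(G, weight="weight")
print(sorted(mst.edges(data="weight")))
print("total MST weight:", mst.size(weight="weight"))   # 800 + 950 + 1200 = 2950.0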

6. Betweenness Centrality

Betweenness centrality measures how often a node acts as a bridge along the
shortest path between other nodes.

BC(v) = \sum_{s \neq v \neq t} \frac{\sigma_{st}(v)}{\sigma_{st}}

- σ_st: total number of shortest paths from node s to node t
- σ_st(v): number of those shortest paths that pass through v

7. Eigenvector Centrality

Eigenvector centrality measures a node's influence based on the influence of its neighbors.

x_v = \frac{1}{\lambda} \sum_{u \in N(v)} x_u

- x_v: centrality of node v
- N(v): set of neighbors of v
- λ: eigenvalue associated with the centrality
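Both centrality measures can be computed directly with NetworkX; the graph below is the same hypothetical four-satellite example used above.

import networkx as nx

G = nx.Graph([("SAT-1", "SAT-2"), ("SAT-2", "SAT-3"),
              ("SAT-1", "SAT-3"), ("SAT-3", "SAT-4")])

# Betweenness: how often a satellite lies on shortest paths between other pairs.
bc = nx.betweenness_centrality(G, normalized=False)

# Eigenvector centrality: influence weighted by the influence of neighbors.
ec = nx.eigenvector_centrality(G, max_iter=1000)

print(bc)   # SAT-3 bridges SAT-4 to the rest, so it has the highest score
print(ec)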

Finally, machine learning is added on top of these seven base algorithms:

1. Graph Representation:

A graph G is defined as G = (V, E), where:

- V is the set of nodes (satellites)
- E is the set of edges (communication links)

Graph metrics are computed for each node or for the entire graph to describe its structural properties.

2. Graph Metrics as Features:

Metrics such as degree centrality, clustering coefficient, shortest path, and graph diameter are computed:

- Degree centrality for node i:

Degree(i) = \sum_{j \in V} A_{ij}

where A is the adjacency matrix (which I have not constructed yet)

- Clustering coefficient for node i:

C(i) = \frac{2 \times \text{number of closed triplets}}{\text{number of connected triplets}}

- Shortest path length d(u, v) between two nodes u and v, computed with Dijkstra's algorithm

- Graph diameter:

Diameter(G) = \max_{u, v \in V} d(u, v)
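As a small sketch of how these features could be assembled in practice (the NetworkX calls, node names, and feature names are assumptions for illustration, not code from this project):

import networkx as nx

G = nx.Graph([("SAT-1", "SAT-2"), ("SAT-2", "SAT-3"),
              ("SAT-1", "SAT-3"), ("SAT-3", "SAT-4")])

# Adjacency matrix A: A[i, j] = 1 if satellites i and j share a link.
nodes = sorted(G.nodes())
A = nx.to_numpy_array(G, nodelist=nodes)
print(A.sum(axis=1))            # row sums reproduce Degree(i)

# Per-node graph metrics gathered as feature columns.
features = {
    "degree": dict(G.degree()),
    "clustering": nx.clustering(G),
}
print(features)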

3. Machine Learning Model: Integrated Algorithm

Let:

- X_G = {x_1, x_2, x_3, ...}: graph metrics
- X_S = {s_1, s_2, s_3, ...}: satellite-specific features (altitude, velocity, fuel consumption)
- Y: target variable (radio transmission performance)
- f_ML(x): machine learning model (e.g., Random Forest, Neural Network)

The machine learning model combines the graph metrics and the satellite-specific features:

y = f_{ML}(x) = f_{ML}([X_G, X_S])

where [X_G, X_S] represents the concatenation of graph metrics and satellite features.
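A hedged sketch of this concatenation with scikit-learn, assuming the graph metrics and satellite features have already been collected into arrays. All data below is synthetic and illustrative; the real project would use the NASA-derived telemetry described earlier.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n_sats = 200

# X_G: graph metrics per satellite (e.g., degree, clustering, mean shortest path).
X_G = rng.uniform(size=(n_sats, 3))
# X_S: satellite-specific features (altitude in km, velocity in km/s, fuel in kg).
X_S = rng.uniform([400, 7.0, 50], [1200, 8.0, 500], size=(n_sats, 3))
# y: target variable, here a synthetic stand-in for communication performance.
y = 0.5 * X_G[:, 2] + 0.001 * X_S[:, 0] + rng.normal(0, 0.05, n_sats)

# Concatenate [X_G, X_S] and fit f_ML as a Random Forest regressor.
X = np.hstack([X_G, X_S])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))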

Methodology
1. Graph Theory Application
Graph theory is a powerful tool for modeling satellite communication network systems. Satellites are represented as nodes and their communication links as edges. These connections can change for many reasons, such as signal strength, latency, and bandwidth. The goal of using graph theory is to find the best communication paths between satellites, decrease signal delay, and improve overall performance. Degree centrality counts how many connections (edges) a satellite has to other satellites; more connections strengthen the network and decrease delays. The clustering coefficient measures how well connected a satellite's neighbors are to each other. The shortest path algorithm finds the quickest route between two satellites.

2. Machine Learning Integration

Machine learning (ML) is used in this project to adjust satellite orbits in real time based on network conditions. ML models predict which satellite positions will make the network more efficient in terms of fuel use, communication delays, and coverage. The main approach uses models such as a Random Forest regressor to predict fuel consumption and communication delays from the graph metrics. The model is trained on historical data of satellite positions, velocities, and fuel consumption, so it learns how factors like degree centrality and clustering coefficient affect network performance. Once trained, the model can be used in real time to adjust satellite orbits: if it predicts high communication delays or fuel use for certain satellites, their positions can be adjusted to optimize the network, as sketched below. Feature extraction pulls the important features from the graph (satellite connections, shortest paths, and clustering) and combines them with satellite data (altitude, velocity, fuel consumption); model training then uses historical data to predict fuel use or communication latency from those features.
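A minimal sketch of the real-time step, assuming a trained regressor like the one above. The threshold value, the feature dictionary, and the function name are hypothetical placeholders for this project's actual control logic.

import numpy as np

# Hypothetical threshold (seconds of predicted delay) above which an orbit tweak is flagged.
DELAY_THRESHOLD = 0.25

def flag_satellites_for_adjustment(model, features_by_sat):
    """Return satellites whose predicted delay exceeds the threshold.

    features_by_sat maps a satellite id to its concatenated [X_G, X_S] feature vector.
    """
    flagged = []
    for sat_id, feats in features_by_sat.items():
        predicted_delay = model.predict(np.asarray(feats).reshape(1, -1))[0]
        if predicted_delay > DELAY_THRESHOLD:
            flagged.append((sat_id, predicted_delay))
    # A full system would hand these candidates to an orbit-adjustment routine.
    return flagged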

3D model of the 18x18x18 satellites in groups

Visual model of the 18x18x18 satellites connected, shown in 2D



18x18x18 satellite system, leveled view
