Optimizing Satellite Orbits and Communication Patterns Using Graph Theory and Machine Learning
- Bhavan Gowda
12/20/2024
Introduction
Satellites drive many modern technological advancements through their roles in
communication, weather forecasting, GPS navigation, and disaster monitoring.
However, their communication networks and orbits are often suboptimal, causing
inefficiencies in fuel consumption and communication latency. Satellite networks
therefore need continuous optimization to maintain consistent global coverage at
low operational cost.
Current satellite systems such as Starlink, GPS, and communication satellites face
challenges in maximizing communication efficiency while minimizing resource
expenditure. My goal in this research is to address these inefficiencies by
applying graph theory and machine learning to satellite orbits and communication
patterns. The challenge is finding optimal routes and configurations that decrease
fuel consumption while maintaining an effective range of global communication,
which I will call coverage. Applying graph theory to satellite communication
networks should greatly improve the efficiency of communication patterns, and
combining it with machine learning boosted coverage by roughly 16.7% based on
previous simulations. The machine learning models can predict more efficient
orbital paths that greatly reduce fuel consumption.

Graph theory is used to solve many complex network problems in computer science,
in areas like network design, routing, data analysis, and algorithm design, where
problems are visualized as interconnected nodes and edges; it also appears in
biology, transportation, and chemistry. Concepts such as shortest paths, minimum
spanning trees, and clustering coefficients make it possible to find the most
efficient connections between nodes in a network. In the context of satellite
communication, satellites are the nodes, and communication links are the edges.
Graph theory matters for this project because it lets me optimize these
connections by minimizing the distance signals travel, reducing delays, and
ensuring network redundancy. My background research in network optimization has
shown the effectiveness of graph-based approaches in terrestrial communication
systems such as radio networks, but rarely in satellite systems, because of their
many variables such as gravity, orbital speed, distance, and radio signal
strength.

Machine learning models like Random Forests and neural networks have been applied
successfully in many domains. In satellite networks, machine learning can be used
to predict the most efficient orbits based on data I have collected from NASA's
APIs, which included satellite position, velocity, and fuel consumption.
Previous studies have explored machine learning applications such as collision
avoidance and autonomous satellite navigation. Unlike fixed, deterministic
algorithms, machine learning algorithms improve their performance over time as
they are trained on more data, which gives them a significant advantage.
Satellite networks like GPS and weather tracking already perform well, but they
could perform better. Current methods rely heavily on manual adjustments and
deterministic algorithms, which can be inefficient and resource-intensive in the
long run. Automated systems that leverage graph theory and machine learning could
deliver cost savings and performance improvements. A real-world application of
this project is satellite constellations like Starlink, which aims to provide
global internet coverage through a network of thousands of low-Earth-orbit
satellites. Optimizing the orbits and communication patterns of these
constellations could improve service reliability and reduce costs.
Research
Combining graph theory and machine learning in satellite communication systems is
an area that has not yet been thoroughly explored. Integrating these two tools is
not trivial, so we need a formal framework.
Satellite systems have transformed communication, but their history has been
fraught with challenges, from technological limitations to high operational costs.
Early systems like Telstar and Sputnik faced issues with signal reliability and
limited lifespans. As satellite constellations expanded, problems such as orbital
debris, signal interference, and the inefficiency of deterministic orbital algorithms
began to emerge. The modern era demands not just effective deployment but also
real-time adaptability to changing conditions, making optimization crucial.
1. Degree
The degree of a node measures the number of direct connections (edges) it has
with other nodes:

Degree(v) = \sum_{u \in V} A_{uv}
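As a minimal sketch, the degree formula can be computed directly from an adjacency matrix; the 4-node example graph below is an illustrative assumption, not data from this project:

```python
# Degree of each node from an adjacency matrix A, following
# Degree(v) = sum over u in V of A[u][v].
# The 4-node example graph is an illustrative assumption.

A = [
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
]

def degree(A, v):
    """Number of direct connections (edges) node v has."""
    return sum(A[u][v] for u in range(len(A)))

degrees = [degree(A, v) for v in range(len(A))]
print(degrees)  # node 1 has the most links
```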
2. Clustering Coefficient
The clustering coefficient measures how connected the neighbors of a node are to
each other:

C(v) = \frac{2 \times \text{Number of closed triplets}}{\text{Number of connected triplets}}

- A closed triplet involves three nodes that are all connected, forming a
triangle.
- A connected triplet involves three nodes connected by at least two edges.
For all nodes in the graph, the average clustering coefficient is:

\bar{C} = \frac{1}{|V|} \sum_{v \in V} C(v)
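The per-node and average clustering coefficients can be sketched in a few lines. Here the local coefficient is computed as the fraction of neighbor pairs that are themselves linked, which is equivalent to the triplet ratio above; the example adjacency sets are assumed for illustration:

```python
# Local clustering coefficient: fraction of a node's neighbor pairs
# that are themselves connected (equivalent to the triplet ratio).
# Undirected graph stored as adjacency sets; example graph assumed.

from itertools import combinations

adj = {
    0: {1, 2},
    1: {0, 2, 3},
    2: {0, 1},
    3: {1},
}

def clustering(adj, v):
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0  # fewer than two neighbors: no triplets possible
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2 * links / (k * (k - 1))

# Average clustering coefficient over all nodes.
avg = sum(clustering(adj, v) for v in adj) / len(adj)
print(avg)
```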
3. Shortest Path
The shortest path between two nodes u and v in a graph minimizes the sum of
weights on the edges:

d(u, v) = \min_{P} \sum_{(x, y) \in P} w(x, y)

- P: a path connecting u to v
- w(x, y): weight of the edge between nodes x and y
- d(u, v): total weight of the shortest path
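One standard way to compute d(u, v) is Dijkstra's algorithm. The sketch below assumes a small weighted graph; the weights could represent link latency between satellites:

```python
# Dijkstra's algorithm for the weighted shortest path d(u, v),
# minimizing the sum of edge weights w(x, y) along the path.
# The small weighted graph is an illustrative assumption.

import heapq

graph = {
    "A": {"B": 4, "C": 1},
    "B": {"A": 4, "C": 2, "D": 5},
    "C": {"A": 1, "B": 2, "D": 8},
    "D": {"B": 5, "C": 8},
}

def dijkstra(graph, source):
    """Shortest-path distances from source to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already improved
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

print(dijkstra(graph, "A"))
```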
4. Graph Diameter
The diameter of a graph is the longest shortest path between any two nodes:

Diameter(G) = \max_{u, v \in V} d(u, v)

- d(u, v): shortest-path distance between nodes u and v
5. Minimum Spanning Tree (MST)
The MST of a graph is a subset of the edges that connects all nodes with the
minimum possible total edge weight.
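A common way to build the MST is Kruskal's algorithm: sort edges by weight and keep each edge that does not create a cycle. The node and edge lists below are illustrative assumptions:

```python
# Kruskal's algorithm: connect all nodes with minimum total edge weight.
# Uses a union-find structure to detect cycles. Example data assumed.

def kruskal(nodes, edges):
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n

    mst = []
    for w, u, v in sorted(edges):  # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:               # skip edges that would close a cycle
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

nodes = ["A", "B", "C", "D"]
edges = [(4, "A", "B"), (1, "A", "C"), (2, "B", "C"),
         (5, "B", "D"), (8, "C", "D")]
tree = kruskal(nodes, edges)
print(tree, sum(w for _, _, w in tree))
```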
6. Betweenness Centrality
Betweenness centrality measures how often a node acts as a bridge along the
shortest paths between other nodes:

BC(v) = \sum_{s \neq v \neq t} \frac{\sigma_{st}(v)}{\sigma_{st}}

- \sigma_{st}: number of shortest paths from s to t
- \sigma_{st}(v): number of those paths that pass through v
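Betweenness can be sketched by counting shortest paths with breadth-first search in an unweighted graph: sigma_st(v) equals sigma_sv * sigma_vt whenever v lies on a shortest s-t path. The example graph is assumed for illustration, and the function sums over ordered pairs (halve the result for the undirected convention):

```python
# Betweenness centrality via shortest-path counting (unweighted graph).
# BC(v) sums sigma_st(v) / sigma_st over ordered pairs s != t != v.
# The 5-node example graph is an illustrative assumption.

from collections import deque

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def bfs_counts(adj, s):
    """Distances and shortest-path counts from s."""
    dist, sigma = {s: 0}, {s: 1}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                sigma[v] = 0
                q.append(v)
            if dist[v] == dist[u] + 1:
                sigma[v] += sigma[u]  # every shortest path to u extends to v
    return dist, sigma

def betweenness(adj, v):
    """BC(v) over ordered pairs (s, t)."""
    counts = {s: bfs_counts(adj, s) for s in adj}
    total = 0.0
    for s in adj:
        ds, ss = counts[s]
        for t in adj:
            if s == t or v in (s, t) or t not in ds:
                continue
            dt, st = counts[t]
            # v lies on a shortest s-t path iff distances add up exactly.
            if v in ds and v in dt and ds[v] + dt[v] == ds[t]:
                total += ss[v] * st[v] / ss[t]
    return total

print(betweenness(adj, 3))  # node 3 bridges node 4 to the rest
```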
7. Eigenvector Centrality
Eigenvector centrality scores a node by the centrality of its neighbors:

x_v = \frac{1}{\lambda} \sum_{u \in N(v)} x_u

- x_v: centrality of node v
- N(v): set of neighbors of v
- \lambda: eigenvalue associated with the centrality vector
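Eigenvector centrality is typically computed by power iteration: repeatedly replace each node's score with the sum of its neighbors' scores and normalize, converging to the leading eigenvector of the adjacency matrix. The example graph is an illustrative assumption:

```python
# Eigenvector centrality by power iteration. Each update sets a node's
# score to the sum of its neighbors' scores, then normalizes, so the
# vector converges to the leading eigenvector of the adjacency matrix.
# Example graph assumed for illustration.

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}

def eigenvector_centrality(adj, iters=200):
    x = {v: 1.0 for v in adj}
    for _ in range(iters):
        new = {v: sum(x[u] for u in adj[v]) for v in adj}
        norm = sum(val * val for val in new.values()) ** 0.5
        x = {v: val / norm for v, val in new.items()}
    return x

x = eigenvector_centrality(adj)
print(x)  # the best-connected node (1) should score highest
```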
Finally, machine learning is added on top of these seven base graph algorithms:
1. Graph Representation:
Graph metrics are computed for each node or for the entire graph to capture its
structural properties. Metrics such as degree centrality, clustering
coefficient, shortest path, and graph diameter are computed, for example the
graph diameter:

Diameter(G) = \max_{u, v \in V} d(u, v)

The machine learning model then combines these graph metrics with
satellite-specific features:

y = f_{ML}(x) = f_{ML}([X_G, X_S])

where [X_G, X_S] represents the concatenation of graph metrics and satellite
features.
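A minimal sketch of assembling the input vector x = [X_G, X_S] follows; the feature names and values are illustrative assumptions, not values from the project's dataset:

```python
# Building the model input x = [X_G, X_S]: graph metrics concatenated
# with satellite-specific features. All names and values are
# illustrative assumptions.

graph_features = {          # X_G: per-satellite graph metrics
    "degree": 3,
    "clustering": 0.33,
    "avg_shortest_path": 2.1,
}
satellite_features = {      # X_S: physical state from telemetry
    "altitude_km": 550.0,
    "velocity_km_s": 7.6,
    "fuel_kg": 12.4,
}

# Concatenate into one feature vector with a fixed column ordering,
# so training and prediction always see features in the same positions.
feature_names = list(graph_features) + list(satellite_features)
x = list(graph_features.values()) + list(satellite_features.values())
print(feature_names, x)
```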
Methodology
1. Graph Theory Application
Graph theory is a powerful tool for modeling satellite communication networks. Satellites
are represented as nodes and their communication links as edges. These connections can be
affected by many factors, such as signal strength, latency, and bandwidth. The goal of
using graph theory is to find the best communication paths between satellites, decrease
signal delay, and improve overall performance. Degree centrality counts how many
connections (edges) a satellite has to other satellites; more connections improve the
network's resilience and decrease delays. The clustering coefficient measures how well
connected a satellite's neighbors are to each other. The shortest-path algorithm finds the
quickest route between two satellites.
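The network described above can be sketched as a weighted graph in plain Python; the satellite names and latency values are illustrative assumptions:

```python
# Satellite network as a graph: satellites are nodes, communication
# links are weighted edges (weights here model link latency in ms).
# Satellite names and latencies are illustrative assumptions.

links = [
    ("SAT-1", "SAT-2", 12.0),
    ("SAT-1", "SAT-3", 7.5),
    ("SAT-2", "SAT-3", 4.2),
    ("SAT-2", "SAT-4", 9.8),
]

graph = {}
for a, b, latency in links:
    graph.setdefault(a, {})[b] = latency
    graph.setdefault(b, {})[a] = latency  # links are bidirectional

# Degree centrality: number of direct links per satellite.
degree = {sat: len(nbrs) for sat, nbrs in graph.items()}
print(degree)  # SAT-2 is the best-connected satellite
```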
2. Machine Learning Application
Machine learning (ML) is used in this project to adjust satellite orbits in real time based
on network conditions. ML models predict which satellite positions will make the network
more efficient in terms of fuel use, communication delays, and coverage. The main approach
uses models like the Random Forest Regressor to predict fuel consumption and communication
delays from graph metrics. The model is trained on historical data of satellite positions,
velocities, and fuel consumption, so it learns how factors like degree centrality and
clustering coefficient affect network performance. Once trained, the model can be used in
real time to adjust satellite orbits: if it predicts high communication delays or fuel use
for certain satellites, their positions can be adjusted to optimize the network. Feature
extraction pulls important features from the graph, such as satellite connections, shortest
paths, and clustering, and combines them with satellite data (altitude, velocity, fuel
consumption). Model training then uses historical data to predict fuel use or communication
latency from these features.
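The train-then-predict loop can be sketched without dependencies. Here a 1-nearest-neighbor regressor stands in for the Random Forest Regressor so the example runs anywhere, and the training rows are synthetic illustrative values, not the project's NASA data:

```python
# Dependency-free sketch of the methodology's train/predict loop.
# A 1-nearest-neighbor regressor stands in for the Random Forest
# Regressor; training rows (graph metrics + satellite state -> fuel
# use) are synthetic illustrative assumptions.

train_X = [  # [degree, clustering, altitude_km, velocity_km_s]
    [2, 0.00, 550.0, 7.6],
    [3, 0.33, 560.0, 7.6],
    [4, 0.50, 540.0, 7.7],
]
train_y = [5.1, 4.2, 3.8]  # target: fuel consumption (kg/day)

def predict(x):
    """Return the target of the closest training example (1-NN)."""
    def sq_dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    best = min(range(len(train_X)), key=lambda i: sq_dist(train_X[i], x))
    return train_y[best]

# A well-connected satellite is predicted to burn less fuel; if the
# prediction were high, its orbit would be flagged for adjustment.
print(predict([4, 0.45, 541.0, 7.7]))
```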