
ELK Stack

1. What is ELK Stack?


➔ Elasticsearch, Logstash and Kibana are components of the Elastic Stack (often called the ELK Stack), used for
logging, searching, and data visualization.
o Elasticsearch – An analytics and search engine used for indexing, storing, and quickly searching
through large volumes of data, such as logs. It stores data in an efficient, distributed way that
enables high-speed searches, aggregations, and analytics.
o Logstash – Logstash handles data ingestion (the process of collecting data from various sources
and moving it to a single location for storage and analysis) and then sends the data to Elasticsearch. It can
handle logs from diverse sources, such as servers, applications, and databases, and supports various
data formats.
o Kibana – Kibana is a visualization tool that connects to Elasticsearch, allowing users to explore and
visualize the data stored there. It also provides an interface for creating dashboards, graphs, and
reports based on data indexed in Elasticsearch.
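
To make the three roles concrete, a minimal Logstash pipeline tying them together might look like the following. The port, host, and index name are assumptions for a default local setup, not values from this document:

```
input {
  beats {
    port => 5044            # receive events from Beats agents
  }
}

filter {
  grok {
    # parse raw web-server lines into structured fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "web-logs-%{+YYYY.MM.dd}"   # one index per day
  }
}
```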

2. How do the Elasticsearch, Logstash and Kibana components work together?


➔ Each component plays a specific role in creating a balanced system that makes data accessible, searchable,
and understandable. The roles are:
o Logstash (data ingestion and transformation) - It collects data from various sources
(logs, databases, and application metrics), transforms and enriches it (improving quality, visibility,
and depth), and sends the processed data to Elasticsearch.
o Elasticsearch (search and analytics engine) - Once the processed data is received from Logstash, it is
indexed for efficient storage and fast querying. Essentially, it acts as a repository where data is stored
to enable quick retrieval and advanced analytics on large datasets.
o Kibana (visualization layer) - It connects to Elasticsearch to visualize and explore data through
charts, dashboards, and interactive queries.
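
As a concrete example of this correlation: Logstash writes events into an index, and Kibana (or any client) retrieves them through Elasticsearch's Query DSL. The snippet below, in Kibana Dev Tools syntax, shows such a search; the index and field names are illustrative, not from this document:

```
GET /web-logs-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "status": 500 } },
        { "range": { "@timestamp": { "gte": "now-1h" } } }
      ]
    }
  }
}
```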

3. Does ELK work without Logstash?


➔ Yes, ELK can work without Logstash for core functionality, but there will be some limitations in data
management and processing, such as:
o No complex data transformation (for unstructured logs)
o Limited filtering and enrichment (e.g., searching logs by status code, IP address, or error message)
o Data ingestion limitations (restricted to input sources such as syslog, Kafka, HTTP, and databases)

Let us understand when to skip Logstash: when logs are simple and do not require much processing, and
there is no need for complex data routing, filtering, or enrichment, Beats or other agents can send
data directly to Elasticsearch without transformation.
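
A minimal sketch of that direct Beats-to-Elasticsearch path, as a filebeat.yml fragment (the log path and host are assumptions for a default local setup):

```yaml
filebeat.inputs:
  - type: filestream          # tail plain log files
    paths:
      - /var/log/nginx/access.log

output.elasticsearch:         # ship straight to Elasticsearch, no Logstash
  hosts: ["http://localhost:9200"]
```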

4. What are Beats? Which Beat is most commonly used?


➔ Beats are lightweight agents that collect and ship data (such as logs, metrics, etc.) from servers. There
are various types of Beats, each designed for a specific use case. These agents help collect different kinds of
data and send them to Elasticsearch or Logstash for further processing.

Beat          Type of data               Common use cases

Filebeat      Log files                  Shipping logs from files to Elasticsearch or Logstash.
Metricbeat    System and app metrics     Collecting server/host metrics (CPU, RAM usage).
Packetbeat    Network traffic            Monitoring network protocols and transactions.
Auditbeat     Security and audit logs    Monitoring file integrity and user activity for security.
Winlogbeat    Windows event logs         Collecting Windows event logs for security and monitoring.
Heartbeat     Service availability       Monitoring uptime of websites and services.
Journalbeat   Systemd journal logs       Collecting logs from Linux systems running systemd.
Functionbeat  Serverless function logs   Collecting logs from serverless environments.
Filebeat is most commonly used because it is highly efficient at shipping logs from various sources, such as
application logs, web logs, and system logs, to Logstash or Elasticsearch.

Note that Filebeat has two main deployment patterns in relation to Logstash and Elasticsearch:

1. Filebeat → Logstash → Elasticsearch


a. Filebeat collects log files from various sources and sends them to Logstash. Logstash then
processes the logs (e.g., parsing, filtering, adding metadata) and forwards the processed logs to
Elasticsearch.

2. Filebeat → Elasticsearch (without Logstash)


a. Filebeat reads log files, optionally applies lightweight parsing through Filebeat modules, and
sends the logs directly to Elasticsearch for indexing.
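
The "parsing" step mentioned in both patterns can be illustrated with a short Python sketch of the kind of transformation Logstash (or a Filebeat module) performs: turning a raw access-log line into structured fields that Elasticsearch can index and filter on. The log format, regex, and field names here are illustrative stand-ins, not an actual grok pattern from the stack:

```python
import re

# Illustrative pattern for a combined-access-log-style line; a real pipeline
# would use a grok pattern or a Filebeat module instead of this hand-rolled regex.
LOG_PATTERN = re.compile(
    r'(?P<client_ip>\S+) - - \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+)'
)

def parse_access_log(line: str):
    """Turn one raw access-log line into a structured event dict, or None."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None  # a real pipeline might tag such events as parse failures
    event = match.groupdict()
    # numeric fields become integers so Elasticsearch can range-filter on them
    event["status"] = int(event["status"])
    event["bytes"] = int(event["bytes"])
    return event

line = '203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
print(parse_access_log(line))
```

Once parsed this way, a query such as "all requests with status 500 in the last hour" becomes a simple filter on structured fields instead of a full-text scan.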

5. Could you please provide the flow?


➔ I have set up the ELK stack with single-node components for each service (Elasticsearch, Logstash, and
Kibana) to support a small-scale infrastructure. However, for medium or large infrastructures, this setup can
be expanded by increasing the number of nodes. Additionally, Beats can be added as needed, though I have
not included them in this setup.
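
A sketch of that single-node setup as a docker-compose file might look like the following. The version tag, disabled security, and port mappings are assumptions for a local demo, not production settings and not part of this document's setup:

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.4
    environment:
      - discovery.type=single-node        # single-node cluster
      - xpack.security.enabled=false      # demo only; keep security on in production
    ports:
      - "9200:9200"

  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.4
    depends_on:
      - elasticsearch
    ports:
      - "5044:5044"                       # Beats input, if agents are added later

  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.4
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    ports:
      - "5601:5601"
```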
