Module V: Security Analytics
Techniques in Analytics
For a security analytics solution to work properly, it must be able to handle both structured and
unstructured data to arrive at an accurate assessment.
Attackers are becoming more dynamic, using increasingly complex techniques and tactics.
With security analytics you can conduct root cause investigations to pinpoint their patterns
and store your findings for future use.
Attackers are aware of this and are looking to target and disrupt those findings.
Protecting this information, prioritizing threats, and keeping pace with attacker efforts is a
must.
Techniques in Security Analytics
Advanced security analytics tools are crucial for effective threat detection and
response in today's fast-paced cybersecurity landscape.
However, not all companies have data that is informative enough for an anomaly detection
algorithm to reliably recognize a deviation from normal activity.
Machine learning allows the system to observe elements of your IT infrastructure to
determine baselines and construct a more robust detection model.
Organizations can train their ML algorithms with a wide variety of methods
for anomaly detection and prevention. Some of the most common anomaly
detection techniques are listed below (a brief sketch of one approach follows the list):
Density-based algorithms
Cluster-based algorithms
Bayesian-network algorithms
Neural network algorithms
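As an illustration of the density-based approach listed above, the following is a minimal sketch using scikit-learn's LocalOutlierFactor to flag observations that sit in unusually sparse regions of the data. The feature matrix is hypothetical; in practice each row would be built from whatever telemetry the organization actually collects (for example, per-host counts of failed logins, bytes transferred and distinct ports used).

# Minimal density-based anomaly detection sketch (assumes scikit-learn is installed).
# Each hypothetical row: [failed logins, bytes transferred (KB), distinct ports used].
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

X = np.array([
    [2, 1_200, 3],
    [3, 1_150, 4],
    [1, 1_300, 3],
    [2, 1_250, 2],
    [45, 98_000, 60],   # unusual: many failures, heavy transfer, port scanning
])

# LOF compares each point's local density to that of its neighbours;
# points in much sparser regions are labelled -1 (outlier), the rest +1.
lof = LocalOutlierFactor(n_neighbors=3, contamination=0.2)
labels = lof.fit_predict(X)

for row, label in zip(X, labels):
    status = "ANOMALY" if label == -1 else "normal"
    print(row, "->", status)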
Event management and real-time response
A common problem for organisations that implement IDS is that they lack an appropriate incident
response capability.
Identifying a problem is only half the battle; knowing how to respond appropriately, and having the
resources in place to do so, is equally important.
Effective incident response requires skilled security personnel with the knowledge of how to swiftly
remediate threats, as well as robust procedures to address issues without impacting day-to-day
operations.
In many organisations there is a big disconnect between the people charged with monitoring alerts and
those managing infrastructure, meaning that swift remediation can be difficult to achieve.
To highlight the importance of having an appropriate incident response plan in place, the General Data
Protection Regulation (GDPR) requires organisations that process any type of personal data to have
appropriate controls in place to report breaches to a relevant authority within 72 hours, or risk a large
fine.
How to address your IDS challenges
Before deploying an intrusion detection system, organisations should consider
commissioning an independent risk assessment to better understand their environment,
including the key assets requiring protection.
Being armed with this knowledge will help to ensure that an IDS is properly scoped to
ensure that it offers the greatest value and benefits.
Given the challenges of ongoing system maintenance, monitoring and alert investigation,
many organisations may wish to consider enlisting a managed service to perform all the
heavy lifting.
A managed IDS service avoids the need to recruit dedicated security personnel, and if
necessary, can also include all requisite technology, circumventing the need for upfront
capital expenditure.
Log analysis
Log analysis is the process of reviewing computer-generated event logs to proactively identify
bugs, security threats or other risks.
Log analysis can also be used more broadly to ensure compliance with regulations or review
user behavior.
A log is a comprehensive file that captures activity within the operating system, software
applications or devices.
The log file automatically documents any information designated by the system administrators,
including: messages, error reports, file requests, file transfers and sign-in/out requests.
The activity is also timestamped, which helps IT professionals and developers establish an audit
trail in the event of a system failure, breach or other outlying event.
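As a small illustration of the kind of timestamped record described above, the sketch below parses a syslog-style entry into its timestamp, host, process and message. The log format and field layout are assumptions for the example; real formats vary by system and application.

# Parse a hypothetical syslog-style entry into timestamp, host, process and message.
import re
from datetime import datetime

LOG_LINE = "Jan 12 10:01:03 web01 sshd[411]: Accepted password for alice from 10.0.0.5"
PATTERN = re.compile(r"^(\w{3} +\d+ [\d:]+) (\S+) (\S+?): (.*)$")

match = PATTERN.match(LOG_LINE)
if match:
    raw_ts, host, process, message = match.groups()
    # Classic syslog omits the year, so the current year is attached for the audit trail.
    timestamp = datetime.strptime(raw_ts, "%b %d %H:%M:%S").replace(year=datetime.now().year)
    print(timestamp.isoformat(), host, process, message)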
Why is log analysis important?
In many cases, log analysis is a matter of law.
Organizations must adhere to specific regulations that dictate how data is archived and analyzed.
Beyond regulatory compliance, log analysis, when done effectively, can unlock many benefits for
the business.
These include:
Improved troubleshooting
Organizations that regularly review and analyze logs are typically able to identify errors more
quickly. With an advanced log analysis tool, the business may even be able to pinpoint
problems before they occur, which greatly reduces the time and cost of remediation.
The log also helps the log analyzer review the events leading up to the error, which may make the
issue easier to troubleshoot, as well as prevent in the future.
Enhanced cybersecurity
Effective log analysis dramatically strengthens the organization’s cybersecurity
capabilities.
Regular review and analysis of logs helps organizations more quickly detect anomalies,
contain threats and prioritize responses.
A typical log analysis workflow involves the following steps:
Ingestion: Installing a log collector to gather data from a variety of sources, including
the OS, applications, servers, hosts and each endpoint, across the network
infrastructure.
This helps simplify the analysis process and increases the speed at which data can be
applied throughout the business.
Search and analysis: Leveraging a combination of AI/ML-enabled log analytics and human
resources to review and analyze known errors, suspicious activity or other anomalies within
the system. Given the vast amount of data available within the log, it is important to
automate as much of the log file analysis process as possible. It is also recommended to
use visualization tools that help the IT team see each log entry, its timing and its interrelations.
Monitoring and alerts: The log management system should leverage advanced log
analytics to continuously monitor the log for any log event that requires attention or human
intervention. The system can be programmed to automatically issue alerts when certain events
take place or specified conditions are met (a minimal monitoring-and-alerting sketch follows the
Reporting step below).
Reporting: Finally, the LMS should provide a streamlined report of all events as well as an
intuitive interface that the log analyzer can leverage to get additional information from the
log.
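To make the monitoring-and-alerts step concrete, here is a minimal sketch that scans a batch of log lines for a few alert conditions. The patterns, threshold and alert handling are illustrative assumptions, not the behaviour of any particular log management system.

# Minimal log monitoring and alerting sketch (hypothetical patterns and thresholds).
import re
from collections import Counter

# Hypothetical alert rules: a name and a regular expression to match against each entry.
ALERT_RULES = {
    "failed_login": re.compile(r"authentication failure|failed password", re.IGNORECASE),
    "disk_error": re.compile(r"I/O error|disk failure", re.IGNORECASE),
}
FAILED_LOGIN_THRESHOLD = 5  # alert if this many failures appear in one batch

def monitor(log_lines):
    """Scan a batch of log lines and return a list of alert messages."""
    hits = Counter()
    alerts = []
    for line in log_lines:
        for name, pattern in ALERT_RULES.items():
            if pattern.search(line):
                hits[name] += 1
    if hits["failed_login"] >= FAILED_LOGIN_THRESHOLD:
        alerts.append(f"ALERT: {hits['failed_login']} failed logins in this batch")
    if hits["disk_error"] > 0:
        alerts.append(f"ALERT: {hits['disk_error']} disk error(s) detected")
    return alerts

# Example usage with a few synthetic entries.
sample = ["Jan 12 10:01:03 host sshd[411]: failed password for root"] * 6
for message in monitor(sample):
    print(message)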
The limitations of indexing
Many log management software solutions rely on indexing to organize the log. While this was considered
an effective solution in the past, indexing can be a very computationally-expensive activity, causing
latency between data entering a system and then being included in search results and visualizations.
As the speed at which data is produced and consumed increases, this is a limitation that could have
devastating consequences for organizations that need real-time insight into system performance and
events.
Further, with index-based solutions, search patterns are also defined based on what was indexed. This is
another critical limitation, particularly when an investigation is needed and the available data can’t be
searched because it wasn’t properly indexed.
Leading solutions offer free-text search, which allows the IT team to search any field in any log. This
capability helps to improve the speed at which the team can work without compromising performance.
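As a toy illustration of free-text search, the sketch below scans raw log lines for an arbitrary term without relying on any pre-built index. The sample lines and search terms are hypothetical.

# Toy free-text search over raw log lines: no index, any field can be matched.
def free_text_search(log_lines, term):
    """Return every log line containing the term, regardless of which field it appears in."""
    term = term.lower()
    return [line for line in log_lines if term in line.lower()]

logs = [
    "2024-05-01T09:14:02Z host=web01 user=alice action=login status=success",
    "2024-05-01T09:15:40Z host=web02 user=bob action=upload file=report.pdf",
    "2024-05-01T09:16:11Z host=web01 user=alice action=login status=failure",
]
print(free_text_search(logs, "failure"))   # matches on the status field
print(free_text_search(logs, "web02"))     # matches on the host field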
Log analysis methods
Given the massive amount of data being created in today’s digital world, it has become
impossible for IT professionals to manually manage and analyze logs across a sprawling
tech environment.
As such, they require an advanced log management system and techniques that automate
key aspects of the data collection, formatting and analysis processes.
Normalization
Normalization is a data management technique that ensures all data and attributes, such
as IP addresses and timestamps, within the transaction log are formatted in a consistent
way.
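The following is a minimal sketch of normalization, assuming log entries arrive as dictionaries with inconsistently formatted timestamps and IP addresses; the input formats shown are hypothetical.

# Normalize timestamps to ISO 8601 (UTC) and IP addresses to canonical form.
from datetime import datetime, timezone
import ipaddress

def normalize(entry):
    """Return a copy of a raw log entry with consistent timestamp and IP formats."""
    normalized = dict(entry)
    # Accept a couple of hypothetical timestamp formats and emit ISO 8601 in UTC.
    for fmt in ("%d/%m/%Y %H:%M:%S", "%Y-%m-%d %H:%M:%S"):
        try:
            ts = datetime.strptime(entry["timestamp"], fmt).replace(tzinfo=timezone.utc)
            normalized["timestamp"] = ts.isoformat()
            break
        except ValueError:
            continue
    # ipaddress validates the address and renders it in canonical text form
    # (for example, compressed IPv6 notation).
    normalized["ip"] = str(ipaddress.ip_address(entry["ip"]))
    return normalized

print(normalize({"timestamp": "01/05/2024 09:14:02", "ip": "10.0.0.5", "event": "login"}))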
Pattern recognition
Pattern recognition involves comparing incoming log messages against a set of known
patterns so that routine entries can be identified and anomalies surfaced.
Classification and tagging
Classification and tagging is the process of tagging events with key words
and classifying them by group so that similar or related events can be
reviewed together.
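A minimal sketch of keyword-based tagging follows; the tag names and keyword lists are hypothetical.

# Tag log events by keyword so related entries can be grouped and reviewed together.
TAG_KEYWORDS = {
    "auth": ["login", "logout", "password", "authentication"],
    "network": ["connection", "timeout", "dns"],
    "storage": ["disk", "quota", "filesystem"],
}

def tag_event(message):
    """Return the set of tags whose keywords appear in the log message."""
    lowered = message.lower()
    return {tag for tag, words in TAG_KEYWORDS.items() if any(w in lowered for w in words)}

print(tag_event("Failed password for root from 10.0.0.5"))      # {'auth'}
print(tag_event("DNS lookup timeout while mounting filesystem")) # network and storage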
Correlation analysis
Correlation analysis is the process of gathering log data from several different
sources and identifying the events across them that relate to a single incident.
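As a rough sketch of correlation, the snippet below groups hypothetical events from different log sources by the source IP they mention, so that related activity can be examined together; the event structure is an assumption for illustration.

# Group events from different log sources by a shared attribute (here, source IP).
from collections import defaultdict

events = [
    {"source": "firewall", "ip": "10.0.0.5", "message": "blocked outbound connection"},
    {"source": "web",      "ip": "10.0.0.5", "message": "repeated 401 responses"},
    {"source": "web",      "ip": "10.0.0.9", "message": "normal page request"},
]

by_ip = defaultdict(list)
for event in events:
    by_ip[event["ip"]].append((event["source"], event["message"]))

# An IP that appears in several sources may indicate a single underlying incident.
for ip, related in by_ip.items():
    if len(related) > 1:
        print(ip, "->", related)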
Artificial ignorance
Artificial ignorance refers to the active disregard of entries that are not
material to system health or performance, so that routine, expected messages
do not obscure meaningful events.
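A minimal sketch of artificial ignorance follows, using a hypothetical ignore list of routine message patterns.

# Drop routine, expected log entries so that only potentially meaningful ones remain.
import re

# Hypothetical patterns for messages known to be harmless noise.
IGNORE_PATTERNS = [
    re.compile(r"session opened for user \w+"),
    re.compile(r"health check (passed|ok)", re.IGNORECASE),
]

def interesting(log_lines):
    """Yield only the lines that do not match any known-harmless pattern."""
    for line in log_lines:
        if not any(p.search(line) for p in IGNORE_PATTERNS):
            yield line

logs = [
    "sshd: session opened for user alice",
    "kernel: I/O error on device sda1",
    "nginx: health check passed",
]
print(list(interesting(logs)))  # only the I/O error survives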