Logpoint Search Query Language
LogPoint Language
V 6.12.0 (latest)
October 1, 2021
CONTENTS
2 Simple Search 2
2.1 Single word . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2.2 Multiple words . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2.3 Phrases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2.4 Field values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.5 Logical operators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.6 Parentheses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.7 Wildcards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.8 Step . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.9 Lower and Upper . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.10 Time Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.11 List . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.12 Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3 Aggregators 12
3.1 chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
3.2 timechart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.3 count() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.4 distinct_count() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.5 sum() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.6 max() and min() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.7 avg() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.8 var() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.9 distinct_list() . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4 One-to-One Commands 22
4.1 rex . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.2 norm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.3 fields . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
4.4 rename . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5 Process Commands 27
5.1 String Concat . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
5.2 Domain Lookup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
5.3 Difference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
5.4 Summation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.5 Experimental Median Quartile Quantile . . . . . . . . . . . . . . . . . . . 30
5.6 Process lookup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
5.7 GEOIP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
5.8 Codec . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.9 InRange . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
5.10 Regex . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
5.11 DNS Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
5.12 Compare . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
5.13 IP Lookup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
5.14 Compare Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
5.15 Clean Char . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
5.16 Current Time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
5.17 Count Char . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
5.18 DNS Cleanup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
5.19 Grok . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
5.20 AsciiConverter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.21 WhoIsLookup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.22 Eval . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
5.23 toList . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
5.24 toTable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
6 Filtering Commands 44
6.1 search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
6.2 filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
6.3 latest . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
6.4 order by . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
6.5 limit <number> . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
7 Pattern Finding 49
7.1 Single Stream . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
7.2 Multiple Streams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
8 Chaining of commands 56
9 Additional Notes 57
9.1 Process or Count . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
9.2 Conditional Expression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
9.3 Forward Slash Expression . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
9.4 norm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
9.5 timechart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
9.6 Capturing normalized field values . . . . . . . . . . . . . . . . . . . . . . . 58
9.7 Grok Patterns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
CHAPTER
ONE
LogPoint’s Query Language is extensive, intuitive, and user-friendly. It covers all the
search commands, functions, arguments, and clauses. You can search the log messages
in various formats depending on the query you use.
LogPoint also supports chaining of commands and multi-line queries. Use a pipe (|) to
chain the commands and press Shift + Enter to add a new line in the query. The search
keywords are not case-sensitive.
Note: The examples of some search queries provided in this section may not yield any
result as the relevant logs may not be available in your system.
This guide provides the following information that you need to use the LogPoint Query
Language:
• Learn about the types of simple queries to familiarize yourself with the LogPoint
Query Language. Refer to Simple Search.
• Learn how to aggregate fields with chart and timechart commands. Refer to
Aggregators.
• Learn how to find one or multiple streams and patterns of data to correlate a
particular event. Refer to Pattern Finding.
• Learn how to chain multiple commands into a single query. Refer to Chaining of
commands.
CHAPTER
TWO
SIMPLE SEARCH
You can use the following types of simple queries to familiarize yourself with the
LogPoint Query Language.
2.1 Single word

login
This query searches for all the logs containing the word login in the message.
2.2 Multiple words

account locked
This query searches for all the logs containing both the search terms account and locked
in the message.
2.3 Phrases
Phrase Search lets you search for the exact phrase in the logs. You must enclose the words inside double quotes (" ").
"account locked"
This query searches for all the logs containing the exact phrase account locked.
2.4 Field values

user = Bob
This query searches for all the logs from the user Bob.
device_ip = 192.168.2.1
This query searches for all the logs coming from the device with the IP Address
192.168.2.1.
You can combine multiple field value pairs as:
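For example, a sketch combining two field value pairs, reusing field names from the examples above:
device_ip = 192.168.2.1 user = Bob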
2.5 Logical operators

2.5.1 And
Use the logical operator and to search for the logs containing both the specified parameters.
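For example, the query described below would look similar to:
login and successful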
This query searches for all the messages containing the word login and the word
successful.
The and operator can also be used for key-value search queries as follows:
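A hedged sketch using field values that appear elsewhere in this guide:
user = Bob and event_id = 528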
2.5.2 Or
Use the logical operator or to search for the logs containing either of the specified
parameters.
login or logout
This query searches for all the messages containing either the word login or the word
logout.
This operator can also be used with the key-value search query as follows:
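A hedged sketch using the device IPs from earlier examples:
device_ip = 192.168.2.1 or device_ip = 192.168.2.243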
2.5.3 Not
You can use the hyphen (-) symbol for the logical negation in your searches.
login -Bob
This query searches for the log messages containing the word login but not the word
Bob.
-device_ip = 192.168.2.243
This query returns the logs containing all the device_ips except 192.168.2.243.
Note:
• While searching with field-names, you can also use != and NOT to denote negation.
device_ip != 192.168.2.243
• By default, the or operator binds stronger than the and operator. Therefore, for the query login or logout and MSWinEventLog, LogPoint returns the log messages containing either login or logout, and also containing MSWinEventLog.
2.6 Parentheses
In LogPoint, the or operator has a higher precedence by default. You can use
parentheses to override the default binding behavior when using the logical operators
in the search query.
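A sketch of such a query; the exact search terms are assumptions based on the description below:
(login failed) or (denied and locked)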
This query returns the log messages containing login failed or both denied and locked.
2.7 Wildcards
You can use wildcards as replacements for a part of the query string. Use the question mark (?) to match a single character and the asterisk (*) to match multiple characters.
If you want all the log messages containing the word login or logon, use the following:
log?n
Note: This query also searches for the log messages containing other variations such
as logan, logbn, and logcn.
log*
This query returns the logs containing the words starting with log such as logpoint,
logout, and login.
Note: You can also use Wildcards while forming a search query with field names. To
get all the usernames that end in t, use the following.
username = *t
2.8 Step
You can use the step function to group field values into buckets. For example, you can see the log messages with destination_port grouped in steps of 100 as follows:
destination_port count
0 - 100 50
100 - 200 32
This query searches for all the log messages containing the field destination_port, and
groups them in steps of 100. The value at the end of the query specifies the starting
value of the destination_port for grouping.
Note: You can use the step to group using multiple field names.
2.10 Time Functions

LogPoint provides the following time functions:
• second
• minute
• hour
• day
• day of week
• month
The arguments taken by these functions are numeric fields containing Unix timestamps, which the functions parse.
Note: Unix time is a system for describing instants in time, defined as the number of
seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC), Thursday,
1 January 1970, not counting leap seconds. It is used widely in Unix-like and many other
operating systems and file formats.
Example: 1384409898 is the Unix time equivalent of Thu, 14 Nov 2013 06:18:18 GMT
In LogPoint, col_ts and log_ts carry Unix timestamps. However, you can create your
own fields which contain the Unix timestamps using the rex or norm commands.
2.10.1 second
You can use the second function to search for the logs generated or collected in
seconds.
The generic syntax for second is:
second(field) = value
second(log_ts) = 23
This query searches for the logs generated during the twenty-third second.
2.10.2 minute
You can use the minute function to search for the logs generated or collected in minutes.
The values for the minute range from 0 to 59.
minute(col_ts) = 2
This query searches for the logs generated during the second minute.
minute() can also be used in aggregation functions.
2.10.3 hour
You can use the hour function to search for the logs generated or collected in hours.
The values for the hour range from 1 to 24.
Example:
hour(col_ts) = 1
This query displays the logs generated during the first hour.
2.10.4 day
You can use the day function to search for the logs generated or collected in days.
Example:
day(col_ts) = 4
This query displays the logs generated on the fourth day of the month.
day_of_week(col_ts) = 7 OR day_of_week(col_ts) = 1
This query displays the logs generated on off days, i.e., Saturday and Sunday.
2.10.5 month
You can use the month function to search the logs generated or collected in months.
The value of month ranges from 1 (January) to 12 (December).
Example:
month(col_ts) = 6
Note: You can use the relational operators (>, <, = and !=) with the time commands
to create a sensible time-range for your search queries.
2.11 List
You can create a static list with a number of values, and use this list in the search query
instead of keying in all the values.
For example, if you create a list EMPLOYEES with the names of all the employees in
a company, you can check whether a single user has logged into the system using the
following query.
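A hedged sketch, assuming the list-membership syntax field IN LIST_NAME:
user IN EMPLOYEES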
The search query matches the value of the field user with all the values in the
EMPLOYEES list.
You can also use an Inline List while executing a search query.
The generic syntax for inline list is:
In cases where the values have multiple words in the inline List, use quotation marks as
shown below.
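A hedged sketch of the generic inline list syntax, and an example with quoted multi-word values (the exact syntax is an assumption):
field IN [value1, value2]
user IN ["John Doe", "Jane Roe"]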
2.12 Table
Tables are external data sources that contain information you may choose to associate with a search result. The supported formats for tables are CSV, ODBC, LDAP, and Threat Intelligence. The information obtained is prefixed with the table alias in the log messages.
For example:
IPList is a CSV table containing fields such as Address, IP, Name, and SN. To view the
content of this external CSV table, use the following query:
table "IPList"
To view all student entries in a table called studentResult, which contains student_name,
student_roll, and percentage as fields, use:
table "studentResult"
To search for all the student entries in the table studentResult who have passed with
distinction:
To search for all the student entries in the table studentResult who have failed:
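Hedged sketches of the two queries described above; the percentage thresholds for distinction and failure are assumptions:
table "studentResult" percentage >= 80
table "studentResult" percentage < 40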
Note: In systems with the Data Privacy Module enabled, the table query only shows the values of the search results in encrypted form. You cannot request decryption for these values.
CHAPTER
THREE
AGGREGATORS
Aggregation functions are used with the chart and the timechart commands to aggregate the fields. The search results can be formatted using the fields, chart, or timechart commands.
Note: An aggregation function in a search query returns at most 10000 search results by default. Use the limit <number> command with a higher limit to display more search results.
3.1 chart
With the chart command, you can display log messages in chart form. If you want to see all the messages containing login grouped by device_ip, you can use the following query.
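A sketch of the query described next:
login | chart count() by device_ip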
This query searches for all the log messages containing the word login, and groups them
by device_ip. It then displays the number of log messages for each device_ip.
You can also count by multiple fields. The log message count is then displayed for each combination of the fields.
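A hedged sketch using the fields named in the next sentence:
| chart count() by destination_address, destination_port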
In this case, the count of the log messages for every combination of
destination_address and destination_port is grouped and the corresponding count is
shown.
You can use other aggregation functions such as max and min in place of count.
You can also display the chart in different forms such as Column, Bar, Line, and Area.
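The query that the next paragraph refers to might look like the following sketch (the grouping field is an assumption):
| chart count(action=permitted) by device_ip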
In this query, only the log messages containing action=permitted are counted. You can
write the same query as:
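The same filter can presumably also be written before the pipe, and the two-column query described next might combine two filtered counts; both are hedged sketches:
action=permitted | chart count() by device_ip
| chart count(action=permitted) as permitted, count(action=blocked) as blocked by device_ip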
This query displays two columns. The first is the count of the connections with the
permitted action and the second is the count of blocked actions.
3.2 timechart
You can use the timechart command to chart the log messages as time series data. It displays the log results according to either the collection timestamp (col_ts) or the log generation time (log_ts), as selected in the system.
The terms log_ts and col_ts differ as follows:
log_ts: Denotes the time present in the log message itself.
col_ts: Denotes the time when the log was actually collected in LogPoint.
For example, you can timechart all the messages containing login as shown below.
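A sketch of such a query:
login | timechart count()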
This plots the count of all the messages containing the word login into a graph with the
horizontal axis as time. The total time-span is the time selected for the search query.
This query plots the count of the logs based on the log_ts field.
You can also use the timechart command to plot the data on a fixed time-interval. To
have a timechart with bars for every 20 minutes, use the following query:
login | timechart count() every 20 minutes
You can use every x minutes, every x hours, or every x days with the timechart command.
Note: When the limit of timechart() is not specified, the number of bars of the timechart depends on the nature of the query.
• The number of bars is always equal to 30 if the time-range is less than 30 units. For example, if you provide a time span of 10 minutes, LogPoint displays 30 bars, each spanning 20 seconds.
• If the time-range is greater than 30 units, the number of bars is equal to the time-range. This holds true until the upper limit of the number of bars is reached, which is 59.
• There are also some special cases for the number of bars. For example, the number of bars may equal the number of seconds specified, and a time span of 1 day displays 24 bars, each spanning one hour.
Aggregation functions are used with the chart and the timechart commands by joining
them with the | symbol.
The following aggregators are available in LogPoint:
• count()
• distinct_count()
• sum()
• min()
• max()
• avg()
• var()
Note: The aggregators are pluggable from LogPoint 5.2.4. This means LogPoint can
create such functions on request.
3.3 count()
You can use the count function to get the total number of logs in the search results. For
example,
| chart count()
This query displays the total number of log messages in the search results.
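The query described next would look similar to:
login | chart count() by device_ip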
This query searches for all the log messages containing the word login. It then groups the logs by their device_ips and shows the count of the log messages for each device IP.
You can also give filters to the count() function as shown below.
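A hedged sketch of such a filtered count, based on the description that follows:
login | chart count(event_id=528) by device_ip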
This query looks for all the log messages containing the word login. It then groups them by their device_ips and shows the count of the messages containing the field value event_id = 528.
3.4 distinct_count()
You can use the distinct_count() function to get the number of distinct values of the specified field. For example,
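A hedged sketch of such a query, using the fields discussed below:
| chart distinct_count(destination_port) by destination_address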
In this case, though different ports may have multiple counts, distinct_count() returns
the count of the distinct ports for every destination address.
If the search results for a particular destination address had the following data:
port count
21 20
25 30
901 15
Here, the result of distinct_count() is 3 (the distinct ports 21, 25, and 901), whereas the result of count() is 65.
3.5 sum()
You can use the sum() function to sum the values of the specified fields.
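A sketch of the query described next:
| chart sum(datasize) by device_ip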
This query displays the sum of all the datasize fields for each device_ip.
You can also give filters to the sum() function.
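A hedged sketch, assuming the filter is written inside the function:
| chart sum(datasize > 500) by device_ip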
This query only sums a datasize if it is greater than 500. The expression can be any valid
query string but must not contain any view modifiers.
3.6 max() and min()

You can use the max() and min() functions to find the largest and smallest values of the specified field. For example, the first query sketched below displays the maximum severity value for each device_ip. The second looks for all the log messages containing the word login, groups the search results by their device_ips and the col_type, and shows the count of the log messages and the latest col_ts for each group.
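Hedged sketches of the two queries described above (field names taken from the descriptions):
| chart max(severity) by device_ip
login | chart count(), max(col_ts) by device_ip, col_type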
The max() and min() functions also support filter expressions.
3.7 avg()
You can use the avg() function to calculate the average of all the values of the specified
field.
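A hedged sketch, assuming a numeric field such as datasize:
| chart avg(datasize) by device_ip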
3.8 var()
You can use the var() function to calculate the variance of the field values. Variance
describes how far the values are spread out from the mean value.
Execute the following query for proper visualization of how the data fluctuates around
the average value.
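A hedged sketch that places the variance next to the average, as suggested above:
| chart avg(datasize), var(datasize) by device_ip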
Note: You can use +, -, *, /, and ^ to add, subtract, multiply, divide, and to raise the
power in the min(), max(), sum(), avg(), and var() functions.
Example:
avg(field1/field2^2+field3)
Warning: While using functions such as avg() and min(), use a proper filter to discard log messages that do not contain the specified fields.
3.9 distinct_list()
You can use the distinct_list() function to return the list of all the distinct values of the
field. For example, if you want to view all the distinct values of the field action in the
system, you can use the following query:
| chart distinct_list(action)
You can use a grouping parameter to group the distinct list. For example:
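A hedged sketch; the actions alias and the device_name field are assumptions based on the description below:
| chart distinct_list(action) as actions, distinct_list(device_name) as machines by user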
The above query returns the list of every distinct value of the field action in the actions
column grouped by the grouping parameter user. You can use this example to view all
the actions performed and machines used by every user in the system.
You can also use this function with other aggregation functions. For example:
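A hedged sketch combining distinct_list() with count() for the user mentioned below:
user = Jolly | chart distinct_list(action), count(action) by user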
The above query returns the list of all the distinct actions with their counts for the user
Jolly.
CHAPTER
FOUR
ONE-TO-ONE COMMANDS
The One-to-one commands take one value as input and provide one output.
For example, you can use the rex and the norm commands to extract specific parts
of the log messages into an ad-hoc field name. This is equivalent to normalizing log
messages during the search. However, the extracted values are not saved.
The rex and norm commands do not filter the log messages. They list all the log
messages returned by the query and add the specified ad-hoc key-value pairs if possible.
4.1 rex
You can use the rex command to extract fields using regular expression patterns in the re2 format. The extracted variable is retained only for the current search scope. The result also shows the log messages that are not matched by the rex expression.
Example Log:
You can use the rex command to extract the protocol id into a field protocol_id with the following syntax:
| rex Protocol:\s*(?P<protocol_id>\d+)
Warning: The (?P< >) expression is part of the rex syntax to specify the field name.
You can also extract multiple fields from a single rex operation as shown below.
The extracted values can be used to chart your results. For example,
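For example, the extracted protocol_id could presumably be charted as:
| rex Protocol:\s*(?P<protocol_id>\d+) | chart count() by protocol_id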
Since the rex command acts on the search results, you can add it to a query string as
shown below:
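A sketch of such a combined query, reusing the device IP from earlier examples:
device_ip = 192.168.2.1 | rex Protocol:\s*(?P<protocol_id>\d+)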
Note: Use single quotes for inline normalization when using square brackets. For example:
This syntax works: | norm on user <my_user:\S+> | chart count() by my_user.
But this does not: | norm on user <my_user:[A-Z]+> | chart count() by my_user.
If you use square brackets ( [, ] ), single quotes around them are necessary in the syntax.
4.2 norm
You can use the norm command to extract variables from the search results into a field. The difference between the rex and norm commands is that norm supports both the LogPoint normalization syntax and the re2 syntax, whereas rex only supports the re2 syntax.
Example Log:
To extract the value of the user into the field user, use the following syntax:
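A hedged sketch, assuming the log carries a user=<value> pair and using the word definer shown elsewhere in this guide:
| norm user=<user:word>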
You can also use the norm command to extract multiple key-value pairs as shown below:
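A hedged sketch extracting two fields at once; the field names and log layout are assumptions:
| norm user=<user:word> action=<action:word>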
Note:
• For the list of definers (simplified regular expressions), refer to the List of Definers.
• Use single quotes for inline normalization when using square brackets. For example:
This syntax works: | norm on user <my_user:\S+> | chart count() by my_user.
But this does not: | norm on user <my_user:[A-Z]+> | chart count() by my_user.
If you use square brackets ( [, ] ), single quotes around them are necessary in the syntax.
4.3 fields
You can use the fields command to display the search results in a tabular form. The
table is constructed with headers according to the field-names you specify. LogPoint
returns null if the logs do not contain the specified fields.
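A sketch of the command, using field names that appear elsewhere in this guide:
| fields device_ip, user, action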
4.4 rename
You can use the rename command to rename the original field names.
Example:
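A hedged sketch of the rename syntax:
| rename source_address as ip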
When multiple fields of a log are renamed to the same name, the rightmost field takes precedence over the others and only that field is renamed.
Example:
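A hedged sketch matching the description below:
| rename source_address as ip, destination_address as ip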
Here, if both the source_address and destination_address fields are present in a log,
only the destination_address field is renamed as ip in search results.
The log messages after normalization can have different field-names for information carrying similar values. For example, different logs may have name, username, u_name, or user_name as keys for the same information (the username). To aggregate all the results and analyze them properly, you can use the rename command.
In some cases, the field names can also be made more informative with the rename command.
CHAPTER
FIVE
PROCESS COMMANDS
You can use the process command to execute different one-to-one functions which produce one output for each input.
Some of the default process commands available in LogPoint are described below.

5.1 String Concat

Example:

5.2 Domain Lookup

Example:
5.3 Difference
This process command calculates the difference between two numerical field values of
a search.
Syntax:
Example:
5.4 Summation
This process command calculates the sum of two numerical field values of a search.
Syntax:
| chart sum(fieldname)
Example:
5.5 Experimental Median Quartile Quantile

Median
Example:
Quartile
Syntax:
Example:
Quantile
Syntax:
| process quantile(fieldname)
Example:
doable_mps=* | process quantile(doable_mps)
|search quantile>0.99
|chart count() by doable_mps order by doable_mps desc
5.6 Process lookup

Example:
| process lookup(lookup_table, device_ip)
5.7 GEOIP
This process command gives the geographical information of a public IP address. For private IP addresses (per RFC 1918, Address Allocation for Private Internets), it sets the value "internal" in all the generated fields.
Syntax:
Example:
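Hedged sketches of the likely generic form and a concrete example (the exact argument is an assumption):
| process geoip(fieldname)
source_address=* | process geoip(source_address)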
5.8 Codec
The Codec process command encodes the field values to ASCII characters or decodes
the ASCII characters to their text value.
Syntax:
Example:
5.9 InRange
The InRange process command determines whether a certain field-value falls within the
range of two given values. The processed query returns TRUE if the value is in the range.
Syntax:
where,
• endpoint1 and endpoint2 are the endpoint fields of the range,
• field is the field name whose value is checked against the range,
• result is the user-provided field to hold the result (TRUE or FALSE),
• inclusion specifies whether the range is inclusive or exclusive of the given endpoint values. When this parameter is TRUE, the endpoints are included; when it is FALSE, they are excluded.
Example:
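A hedged sketch of how the command might be invoked; the command name in_range, the argument order, and the example field names are assumptions based on the parameter list above:
| process in_range(endpoint1, endpoint2, field, result, inclusion)
| process in_range(min_port, max_port, destination_port, port_in_range, TRUE)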
5.10 Regex
The Regex process command extracts specific parts of the log messages into custom
field names.
Syntax:
Example:
| process regex("(?P<type>\S*)",msg)
Example:
5.12 Compare
This process command compares two values to check if they match or not.
Syntax:
Example:
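A hedged sketch of the likely form, by analogy with compare_network shown later (the command name and arguments are assumptions):
| process compare(fieldname1, fieldname2)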
5.13 IP Lookup
This process command enriches the log messages with the Classless Inter-Domain
Routing (CIDR) address details. A list of CIDRs is uploaded in the CSV format during
the configuration of the plugin. For any IP Address type within the log messages, it
matches the IP with the content of the user-defined Lookup table and then enriches the
search results by adding the CIDR details.
Syntax:
Example:
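A hedged sketch matching the description below; the command name ip_lookup and the argument order are assumptions:
| process ip_lookup(lookup_table_A, IP, device_ip)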
This command compares the IP column of the lookup_table_A with the device_ip field
of the log and if matched, the search result is enriched.
5.14 Compare Network

Syntax:
| process compare_network(fieldname1,fieldname2)

Example:
source_address=* destination_address=*
| process compare_network (source_address, destination_address)
| chart count() by source_address_public, destination_address_public,
same_network, source_address, destination_address
5.15 Clean Char

5.16 Current Time

5.17 Count Char

5.18 DNS Cleanup
5.19 Grok
The grok process command enables you to extract key-value pairs from logs at query runtime using Grok patterns. Grok patterns are patterns defined using regular expressions that match words, numbers, IP addresses, and other data formats. Refer to Grok Patterns for a list of all the Grok patterns and their corresponding regular expressions.
Syntax:
| process grok("<signature>")
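A hedged sketch of a signature built from standard Grok patterns and the field names mentioned below (the exact pattern choice is an assumption):
| process grok("%{IP:ip_address_in_log} %{WORD:method_in_log} %{URIPATHPARAM:url_in_log}")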
Using this command adds the ip_address_in_log, method_in_log, and url_in_log fields
and their respective values to the log if it matches the signature pattern.
5.20 AsciiConverter
This process command converts hexadecimal (hex) and decimal (dec) values of various fields to their corresponding readable ASCII values. The application supports the Extended ASCII Table for processing decimal values.
Hexadecimal to ASCII
Syntax:
| process ascii_converter(fieldname,hex) as string
Example:
| process ascii_converter(sig_id,hex) as alias_name
Decimal to ASCII
Syntax:
| process ascii_converter(fieldname,dec) as string
Example:
| process ascii_converter(sig_id,dec) as alias_name
5.21 WhoIsLookup
The whoislookup process command enriches the search result with information related to the given field name from the WHOIS database. The WHOIS database contains information about registered Internet resources, such as domain names and IP address assignments.

Syntax:
| process whoislookup(field_name)

Example:
5.22 Eval
This process command evaluates mathematical, boolean and string expressions. It
places the result of the evaluation in an identifier as a new field.
Syntax:
| process eval("identifier=expression")
Example:
| process eval("Revenue=unit_sold*Selling_price")
Note: Refer to the Evaluation Process Plugin Manual for more details.
5.23 toList
This process command populates the dynamic list with the field values of the search
result.
Syntax:
Example:
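A hedged sketch; the argument order (list name, then field name) is an assumption:
| process toList(list_name, field_name)
| process toList(EMPLOYEES, user)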
5.24 toTable
This process command populates the dynamic table with the fields and field values of
the search result.
Syntax:
Example:
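A hedged sketch, analogous to toList; the argument order is an assumption:
| process toTable(table_name, field_name)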
CHAPTER
SIX
FILTERING COMMANDS
6.1 search
Using the search command, you can conduct further searches on the search results. The search command also works on dynamic fields returned from the norm, rex, and table commands.
To search for users who have logged in more than 5 times:
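A hedged sketch, counting logins per user and then filtering on the aggregated count (the alias name is an assumption):
login | chart count() as login_count by user | search login_count > 5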
If you create a dynamic field, for example new_field, using the norm command, you can view the logs in which new_field has the value 100 by using the search command as:
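A hedged sketch, assuming the dynamic field is named new_field:
| search new_field = 100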
We recommend using the search command only in the following cases:
• When you need to filter the results for simple search (non key-value search).
For example:
| search error
• When you need to filter the results using the or logical operator.
For example:
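A hedged sketch (the second keyword is a placeholder):
| search error or warning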
Note: It is not advised to use the search command unless absolutely necessary because it uses heavy resources. It is always better to apply any other kind of filtering before using the search command.
6.2 filter
The filter command lets you further filter the logs retrieved in the search results.
Syntax:
For example, if you want to display only the domains that have more than 10 events
associated with them in the search results, use the following query:
| filter events > 10
The query searches for all the logs containing the fields url and norm_id with the value
of norm_id having Firewall at the end. It then adds a new field domain to the logs
based on the respective URLs and groups the results by their domains. Finally, the
filter command limits the results to only those domains that have more than 10 events
associated with them.
The filter command does not index the intermediate fields, and thus, is computationally
more efficient than the search command. Therefore, LogPoint uses the filter command
to drill-down on the search results, which significantly speeds up the drill-down process.
Note:
• The filter command filters the results based on dynamic fields returned from the
norm, rex, and table commands as well.
• The filter command only works with expressions having the =, >, <, >=, and <=
operators.
• To filter the results with more than one condition, you must chain multiple filter
expressions.
6.3 latest
The latest command finds the most recent log messages for every unique combination
of provided field values.
This query searches for the latest logs of all the devices.
This query searches for all the latest devices based on the log_ts field whose web server
running on the port number 80 is down.
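A hedged sketch of the first example above; the grouping field is an assumption:
| latest by device_ip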
6.4 order by
Use the order by command to sort the search results based on a numeric field. You can sort the results in either ascending or descending order.
Examples:
This query searches for all the syslog messages generated from the device named John
Doe and sorts them in the ascending order of their col_ts values.
This query searches for the logs from all the devices in the system and sorts them in the
descending order of their log_ts values.
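Hedged sketches of the two examples above; the filter fields and values are assumptions:
col_type=syslog device_name="John Doe" | order by col_ts asc
device_ip=* | order by log_ts desc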
Note: The sorting order of the search results is inconsistent when a search query does not contain a sorting command. Use the order by command to make it consistent.
6.5 limit <number>

You can use the limit <number> command to limit the number of results displayed for an aggregation query.

Note:
• The feature to display the Top-10 and the Rest graphs is supported for the
aggregation queries.
• While using the limit <number> command to retrieve a large volume of logs, make
sure that your system has enough resources to load and render the data.
Example:
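A hedged sketch matching the description below:
destination_address=* | chart count() by source_address limit 10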
This query searches for all the logs having a destination address, displays the top 10 results by their source address, and rolls up all the remaining results into the eleventh row, where the source_address field displays the word other.
CHAPTER
SEVEN
PATTERN FINDING
Pattern finding is a method of finding one or multiple streams and patterns of data to
correlate a particular event. For example: five failed logins, followed by a successful
login. It can be performed on the basis of the count and the time of occurrence of
the stream. Use the Pattern Finding rules to detect complex event patterns in a large
number of logs.
Correlation is the ability to track multiple types of logs and deduce meanings from
them. It lets you look for a collection of events that make up a suspicious behavior
and investigate further.
7.1 Single Stream

Syntax: Description
[ ]: For single streams, square brackets contain a stream of events.
within: Keyword to denote the notion of a time frame.
having same: Keyword to group events by the same field values.
The following are working examples of pattern finding using a single stream:
To find 5 login attempts:
[5 login]
To find 10 login attempts by the same user from the same source_address (multiple
fields) within 5 minutes:
[10 action = "logged on" having same user, source_address within 5 minutes]
The time format for specifying timeframe are: second(s), minute(s), hour(s) and day(s).
[error] as E
This query finds the logs with errors. It then aliases the result as E and displays the fields
prefixed with E such as E.severity, and E.device_ip. You can then use the aliased fields
as shown below:
[2 login | norm <username:word> login successful having same username within 10 seconds]
7.2 Multiple Streams

Example:
[table event_prob] as s1
left join [event = * | chart count() by event] as s2
on s1.event = s2.event
7.2.3 Join
Join queries are used to link the results from different sources. The link between two
streams must have an on condition. The link between two lookup sources, or between a lookup and a stream, does not require a time-range. Join, as part of a search string, can
link one data-set to another based on one or more common fields. For instance, two
completely different data-sets can be linked together based on a username or event ID
field present in both the data-sets.
The syntax for joining multiple patterns is as follows:
[stream 1] <aliased as s1> <JOIN> [stream 2] <aliased as s2> on <Join_conditions> |
additional filter query.
To find the events where a reserved port of an Operating System (inside the
PORT_MACHINE table) is equal to the blocked port (inside the BLOCKED_PORT table):
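A hedged sketch, assuming both tables expose a port column:
[table PORT_MACHINE] as s1 join [table BLOCKED_PORT] as s2 on s1.port = s2.port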
To find 5 login attempts by the same user within 1 minute followed by 5 failed login
attempts by the same user within 1 minute
[5 login having same user within 1 minute] as s1
followed by
[5 failed having same user within 1 minute]
To find 5 login attempts by the same user within 1 minute followed by 5 failed attempts
by the same user within 1 minute and users from both result are same
[5 login having same user within 1 minute] as s1
followed by
[5 failed having same username within 1 minute] as s2
on
s1.username = s2.username
7.2.4 Followed by
Pattern Finding by followed by is useful when two sequential streams are connected to
an action.
For example:
[2 login success having same user] AS stream1
followed by
[login failure] as stream2
ON
stream1.user = stream2.user
Here,
Syntax: Description
[ ] AS stream1: A simple pattern finding query aliased as stream1.
followed by: Keyword.
[ ] AS stream2: A simple search aliased as stream2.
ON: Keyword.
stream1.user = stream2.user: Matching field from the two streams.
• [stream 1] <aliased as s1> <followed by> [stream 2] <aliased as s2> <within time
limit> on <Join_conditions>| additional filter query.
• [stream 1] as s1 followed by [stream2] as s2 within time_interval on s1.field = s2.field
• [stream 1] as s1 followed by [stream2] as s2 on s1.field = s2.field
• Streams can be labeled using alias. Here, the first stream is labeled as s1. This
labeling is useful while setting the join conditions in the join query.
• The operation between multiple streams is carried out using “followed by” or
“join”.
• Use the join keyword to view additional information in the final search. The join
syntax is mostly used with tables for enriching the data.
• Join conditions are simple mathematical operations between the data-sets of two
streams.
• Use an additional filter query to mitigate false positives, which are generally created while joining a stream and a table. Searching with a distinct key from the table gives a result without errors.
This query does not display histogram but displays the log table.
This query displays both the histogram and the log table.
Note:
• All the reserved keywords such as on, join, as, and chart are not case-sensitive.
• If you want to use reserved keywords in simple search or some other contexts, put
them in quotes.
CHAPTER
EIGHT
CHAINING OF COMMANDS
You can chain multiple commands into a single query by using the pipe (|) character.
Any command except fields can appear before or after any other command. The fields
command must always appear at the end of the command chain.
Example:
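A hedged sketch of the chained query described below (the alias name is an assumption):
| chart count() as log_count by device_name | search log_count > 1000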
This query displays the number of logs with the same device_name appearing more
than 1000 times.
(label = logoff) AND hour(log_ts) > 8 AND hour(log_ts) < 16 |
latest by user |
timechart count() by user
This query captures all the log messages labeled as logoff that were generated between 8 AM and 4 PM. It then displays the timechart of the most recent users for the selected time-frame.
CHAPTER
NINE
ADDITIONAL NOTES
Similarly,
source_name = "/opt/immune/var/log/audit/webserver.log"
| chart count() by source_address
9.4 norm
| norm doable_mps=<dmps:'['0-9']'+>
9.5 timechart
The limit <number> command does not work with timechart.
9.6 Capturing normalized field values

source_name = /opt/immune/var/log/benchmarker
Now, if you want to capture the first two words of the path,
you can write the query as follows:
In the example above, the rex command is used on a field which captures email
addresses. The email address is then broken into account and domain using the
corresponding regex.
9.7 Grok Patterns

Networking-related Patterns
Path-related patterns
Syslog patterns
Log formats