Creating new event lanes¶
Here you can learn how to create a new event lane and select the proper parsing rules for it.
An event lane and its properties (parsing rules, content, log categorization, etc.) are based on data streams. TeskaLabs LogMan.io automatically assigns an event lane based on the name of the data stream and event lane templates.
There are two cases for connecting a new log source:
- When you connect a log source for which the content is already provided in the Library, an event lane is created based on the relevant event lane template and the data stream name you provide.
- When you connect a log source for which the content is not provided, an event lane is created, but it does not provide any content, categorization, or parsing rules. These have to be created and specified in the event lane declaration.
Data streams¶
Data in TeskaLabs LogMan.io is organized into data streams. A data stream is a collection of logs (events) from a log source that share the same type and structure.
Logs in a single data stream are not necessarily connected to a single log device. There can be many log devices that produce logs of the same type.
For example, you can have five different log devices that produce data in the same (or similar) format (e.g., Linux servers):
<123> Dec 12 12:20:00 host-1 process: ...
<123> Dec 12 12:20:00 host-2 process: ...
<123> Dec 12 12:20:00 host-3 process: ...
<123> Dec 12 12:20:00 host-4 process: ...
<123> Dec 12 12:20:00 host-5 process: ...
In this case, you can select a single data stream for these log devices and consider all log devices a single log source. However, when one of the log devices produces logs of a different type:
<123>1 2024-12-12T12:20:00.000001 host-x process tag: ...
you should consider it a different log source and create a separate data stream for it, so that different parsing rules can be applied.
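For illustration, separating the two formats means classifying them into two streams at the collector. Below is a minimal sketch of such a classification, in which the stream names and IP addresses are only illustrative (the full collector configuration is covered later on this page):

classification:
  syslog-1514: &syslog-1514
    # BSD-style syslog devices share one data stream (illustrative names and IPs)
    linux-rsyslog-1:
      - { ip: 10.0.0.1 }
      - { ip: 10.0.0.2 }
    # The device sending the newer syslog format gets its own data stream
    linux-rsyslog5424-1:
      - { ip: 10.0.0.9 }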
How to name a data stream? (1/2)
Stream names are used as a reference for LogMan.io to choose the correct event lane. Therefore, selecting a proper stream name is crucial!
A stream name is typically composed of a vendor name, technology, and numeric identification. Here are some examples of stream names:
cisco-asa-1
cisco-asa-2
linux-rsyslog-1
microsoft-365-1
synology-nas-1
Read more about how to select a proper stream name below.
Generic data stream¶
After you install LogMan.io Collector and connect log sources, all data is collected into a single generic stream and stored in Archive and Elasticsearch. No specific parsing rules are applied to events in the generic stream.
To apply specific parsing rules and categorization of the events, create new data streams.
How to create a new data stream¶
In this example, we will guide you through creating a new data stream. Suppose you want to create a data stream for logs incoming from IP address 127.0.0.1.
1. Open the TeskaLabs LogMan.io web application. Navigate to Archive.
2. Select the generic stream. You will see a list of raw logs from various IP addresses. Each log entry shows:
- Collected: The time when the log was collected by LogMan.io Collector.
- Received: The time when the log was received by LogMan.io Receiver.
- Source: Information about the log origin, composed of IP address, port, and protocol.
The protocol can have one of the following values:
- S: Stream / TCP protocol
- D: Datagram / UDP protocol
- T: TLS / SSL protocol
3. Open Log sources >> Collectors. Find the collector by its label or identity.
4. Open the Custom tab. Here you can select data streams based on IP addresses, ports, and protocols. See the LogMan.io Collector configuration documentation for details. Note that the name of the stream is important.

Example configuration of LogMan.io Collector:

classification:
  syslog-1514: &syslog-1514  # Stream names
    my-stream-1:
      - { ip: 127.0.0.1 }

input:SmartDatagram:smart-udp-1514:
  address: '1514'
  output: smart
  smart: *syslog-1514

input:SmartStream:smart-tcp-1514:
  address: '1514'
  output: smart
  smart: *syslog-1514

output:CommLink:smart: {}
Click on Apply.
5. From now on, the logs incoming from IP address 127.0.0.1 are sent to the data stream my-stream-1. Open Archive. Find the stream my-stream-1. You will see incoming logs from the stream.

Info
Once a log is stored in Archive, it cannot be moved to a different data stream (Archive is an immutable database).
How to create an event lane¶
Event lanes are created from event lane templates. An event lane template describes the properties of a data stream that will be applied to the event lane. It has its declaration in the Library:
---
define:
  type: lmio/event-lane-template
  name: Linux Rsyslog  # Human-readable name of the data stream
  stream: linux-rsyslog-*  # Reference to the stream name

# Categorization of the log source
logsource:
  product:
    - linux
  service:
    - syslog

# Parsing rules to apply for the event lane
parsec:
  name: /Parsers/Linux/Common/

# Content to enable when the event lane is created
content:
  dashboards: /Dashboards/Linux/Common/
After a new data stream is found, it is matched with one of the event lane templates. A new event lane will be created, and it will inherit the properties of that template. A new declaration for the event lane will be created in the Library:
---
define:
  type: lmio/event-lane
  name: Linux Rsyslog (1)
  template: /Templates/EventLanes/Linux/linux-rsyslog.yaml

logsource:
  product:
    - linux
  service:
    - syslog

parsec:
  name: /Parsers/Linux/Common/
  instances: 2

content:
  dashboards: /Dashboards/Linux/Common/

kafka:
  received:
    topic: received.company.linux-rsyslog-1
  events:
    topic: events.company.linux-rsyslog-1
  others:
    topic: others.company.linux-rsyslog-1

elasticsearch:
  events:
    index: lmio-company-events-linux-rsyslog-1
  others:
    index: lmio-company-others
The event lane declaration copies the properties of the data stream from its template and adds information about:
- what Kafka topics and Elasticsearch indices will be used
- how many instances of the LogMan.io Parsec microservice will run inside the LogMan.io cluster
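For example, if an event lane needs more parsing capacity, the number of Parsec instances can be changed directly in the event lane declaration. A minimal sketch based on the parsec section of the declaration above; the value 4 is only illustrative:

parsec:
  name: /Parsers/Linux/Common/
  instances: 4  # run four LogMan.io Parsec instances for this event lane (illustrative value)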
How to name a data stream? (2/2)
When you select a stream name that matches one of the event lane templates, the corresponding event lane will inherit its properties (parsing rules, Library content, categorization, etc.).
Suppose you want to connect a log source of type Linux Rsyslog. You can find the suitable technology in the event lane templates:
---
define:
  type: lmio/event-lane-template
  name: Linux Rsyslog  # Human-readable name of the data stream
  stream: linux-rsyslog-*  # Reference to the stream name
The star * at the end of the stream option matches any number. Therefore, you can name your data stream linux-rsyslog-1, linux-rsyslog-2, etc.
The corresponding event lane will then inherit the name of the stream:
---
define:
  type: lmio/event-lane
  name: Linux Rsyslog (1)
  template: /Templates/EventLanes/Linux/linux-rsyslog.yaml
Parsing rules and other event lane properties can be changed manually, so there is no risk in choosing an incorrect stream name on the first try.
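For instance, to switch an event lane to different parsing rules, point the parsec section of its declaration to another parser group in the Library. A sketch, assuming a hypothetical parser path:

parsec:
  name: /Parsers/Linux/Audit/   # hypothetical parser group in the Library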
Generic templates
When no event lane template is found for a given data stream name, an event lane for the corresponding data stream is created from the Generic event lane template. The generic template applies only the most basic parsing rules to the data stream and provides no particular information about a log source.
When you connect a log source of unknown type or a type that does not exist in the event lane templates, the event lane will be derived from the generic template. You can modify the properties of that event lane later.
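For orientation, an event lane derived from the generic template follows the same structure as the declaration shown earlier, but carries no log source categorization and only basic parsing rules. A minimal sketch, in which the template path, parser path, tenant, and stream name are all illustrative:

---
define:
  type: lmio/event-lane
  name: My Stream (1)
  template: /Templates/EventLanes/Generic/generic.yaml   # illustrative path

parsec:
  name: /Parsers/Generic/   # illustrative path to the basic parsing rules

kafka:
  received:
    topic: received.company.my-stream-1
  events:
    topic: events.company.my-stream-1
  others:
    topic: others.company.my-stream-1

elasticsearch:
  events:
    index: lmio-company-events-my-stream-1
  others:
    index: lmio-company-others

You can later add log source categorization, specific parsing rules, and Library content to this declaration as you learn more about the log source.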