ZLog is an open source project that uses performance-optimized databases to enable searching and filtering large amounts of log data in real time.
The latest release uses the following stack:
Elasticsearch 7.13.4
: Database used for storing log data.

Solr 8.4.1
: Database used for storing log data.

Logstash 7.17.13
: Used to parse, filter, and upload data to Elasticsearch and Solr.

Filebeat 7.13.4
: Used to harvest data from a directory and send it to Logstash.

Kibana 7.13.4 (optional)
: GUI to view and analyze data from Elasticsearch. Solr is supported too, but Kibana needs additional configuration for it to work properly.
- Docker Engine
- Docker Compose
- 1.5 GB of RAM
Note
Especially on Linux, make sure your user has the required permissions to interact with the Docker daemon.
By default, the stack exposes the following ports:
- 9200: Elasticsearch HTTP
- 8983 & 9983: Solr HTTP
- 5044: Logstash Beats input
- 50000: Logstash TCP input
- 9600: Logstash monitoring API
- 5601: Kibana
- 3000: ZLog GUI
- 8090: ZLog Gateway
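
Once the stack is up, you can check that a component is reachable on its port; for example, Elasticsearch answers on its HTTP port:

```sh
curl http://localhost:9200
```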
Execute the following command in the root folder to build the client and gateway:

```sh
./mvnw install
```
With the `docker-compose.yml` file you can bring up the whole stack with a single command:

```sh
docker-compose up
```

If you want to change the components, you can edit them in this file.
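
As a rough sketch of what such an entry in `docker-compose.yml` looks like (service names, image tags, and layout are assumptions based on the versions and ports listed above):

```yaml
# Sketch only: service names and layout are assumptions.
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.13.4
    ports:
      - "9200:9200"   # Elasticsearch HTTP
  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.13
    ports:
      - "5044:5044"   # Beats input
      - "50000:50000" # TCP input
      - "9600:9600"   # Monitoring API
    depends_on:
      - elasticsearch
```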
The ZLog Client requires that the retrieved data matches a specific format with certain fields; otherwise the client won't show any log data.
The following fields and values are needed for the GUI to initially work:
`time`
: Timestamp in the format `YYYY-MM-DDTHH:MM:SS.000Z`. Example value: `2016-01-28T04:30:43.000Z`

`level`
: One of the following strings: `INFO`, `DEBUG`, `WARN`, `ERROR`

`message`
: Example value: `This is the log message`

`component`
: Example value: `ComponentTest`

`application`
: Example value: `ApplicationName`
Other additional fields can be defined; they will be shown in a detailed view when you click a log entry in the GUI.
Examples:

`logger`
: Example value: `Log4J`

`hostName`
: Example value: `addesso.server`

`threadName`
: Example value: `main`

`XXX`
: Example value: `XXX`
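
Put together, a single log entry with the required and some additional fields might look like this (values taken from the examples above):

```json
{
  "time": "2016-01-28T04:30:43.000Z",
  "level": "INFO",
  "message": "This is the log message",
  "component": "ComponentTest",
  "application": "ApplicationName",
  "logger": "Log4J",
  "hostName": "addesso.server",
  "threadName": "main"
}
```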
There are two possibilities to make this work:

- The uploaded logfiles are already formatted and contain all the required fields.
- The uploaded logfiles are not formatted. In this case you need to create a Logstash filter for every logfile format that will be used. Logstash then parses the files according to the filter and saves them into Elastic/Solr.
Important
Configuration is not reloaded dynamically; you will need to restart the affected components after any configuration change.
Please refer to the following documentation page for more details about how to configure Elasticsearch inside Docker containers: Install Elasticsearch with Docker.
Please refer to the following documentation page for more details about how to configure Solr inside Docker containers: Install Solr with Docker.
Please refer to the following documentation page for more details about how to configure Kibana inside Docker containers: Install Kibana with Docker.
The Logstash configuration is stored in `logstash/config/logstash.yml`.

The Logstash pipelines and filters are stored in `logstash/pipeline/`.
Please refer to the following documentation page for more details about how to configure Logstash inside Docker containers: Configuring Logstash for Docker.
The Filebeat default configuration is stored in `filebeat/config/filebeat.yml`.
Please refer to the following documentation page for more details about how to configure Filebeat inside Docker containers: Configuring Filebeat for Docker.
- `Constants_env.js` stores the build configs (local, development, production). When building the project, make sure the correct config is used.
- `application.properties` configures general parameters, for example whether Elastic or Solr should be used (see the sketch after this list).
- `application-elastic.properties` stores the configuration for the Elasticsearch connection.
- `application-solr.properties` stores the configuration for the Solr connection.
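
Given the `application-<backend>.properties` naming, the backend is presumably selected via a Spring profile; the following is only a sketch under that assumption, so check the actual files for the real keys:

```properties
# application.properties -- sketch; assumes Spring profiles select the backend
spring.profiles.active=elastic
# or, to use Solr instead:
# spring.profiles.active=solr
```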
ZLog supports two security modes: security on (OAuth2) and security off. You need to make sure that you configure both the GUI and the gateway to the same security mode when building the project; otherwise the GUI won't be able to connect to the gateway!
- The config for the GUI is located in `package.json` and `Constants_env.js`.
- The config for the gateway is located in `application.properties`. Here you can set `security.enabled` to `true` or `false`.
- When security is enabled, you need to make sure the required parameters for the OAuth2 provider are correct. These are stored in `application.yml`; a sketch follows below.
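
What those parameters could look like in `application.yml`, assuming a standard Spring Security OAuth2 client setup (all registration and provider values are placeholders):

```yaml
# Sketch: assumes Spring Security OAuth2; all values are placeholders.
spring:
  security:
    oauth2:
      client:
        registration:
          zlog:
            provider: zlog
            client-id: <your-client-id>
            client-secret: <your-client-secret>
            authorization-grant-type: authorization_code
        provider:
          zlog:
            issuer-uri: https://auth.example.com/realms/zlog
```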
There are different tools that allow you to import your logfiles into Elasticsearch and Solr. One of the most popular is Logstash in combination with Filebeat.
Filebeat simply looks for logs in a specific path (input) and writes all found data to a defined output (in our case Logstash).
By default, all files located in `/var/log/*.log` will automatically be sent to Logstash for parsing, but you can change this default behaviour to something that suits your needs.
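
A minimal sketch of such a Filebeat configuration (port 5044 is the Beats input listed above; the Logstash host name is an assumption based on the Docker service name):

```yaml
# Sketch of filebeat.yml: harvest /var/log/*.log and ship it to Logstash.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log

output.logstash:
  hosts: ["logstash:5044"]  # "logstash" is assumed to be the service name
```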
Logstash is a powerful tool that gives you the ability to parse and customize logs before importing them into Elasticsearch and Solr.
For this to work, Logstash needs to know what the incoming logfile looks like and how to parse it.
We need to write a filter for every different logfile we want to import into Elastic or Solr.
An example filter is located in `logstash/pipeline/logstash.conf`.
It parses and filters the Windows_2k log located in `FileConfiguration/Windows_2k.log`.
So if you want to parse your own logfile, you need to write a filter for it, put it in the `logstash/pipeline/` directory, and make sure it gets correctly picked up by Logstash.
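
As a sketch of what such a filter can look like (the grok pattern is an assumption and must be adapted to your actual log format; only the Elasticsearch output is shown):

```conf
# Sketch of a Logstash pipeline; adapt the grok pattern to your log format.
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match     => { "message" => "%{TIMESTAMP_ISO8601:time} %{LOGLEVEL:level} %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]  # "elasticsearch" is assumed to be the service name
  }
}
```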
For more information check out:

- https://www.elastic.co/guide/en/logstash/current/index.html
- https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html
- https://www.elastic.co/guide/en/logstash/current/filter-plugins.html
To add plugins to any component you have to:

- Add a `RUN` statement to the corresponding `Dockerfile` (e.g. `RUN logstash-plugin install logstash-filter-json`); see the sketch after this list.
- Add the associated plugin configuration to the service configuration (e.g. Logstash input/output).
- Rebuild the images using the `docker-compose build` command.
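
For example, extending the Logstash image with a plugin could look like this (the base image tag matches the Logstash version above; treat it as a sketch):

```Dockerfile
# Sketch: install an extra filter plugin into the Logstash image.
FROM docker.elastic.co/logstash/logstash:7.17.13
RUN logstash-plugin install logstash-filter-json
```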
- Fix the data import from Logstash into Solr.
- Secure the connection between the gateway and Elastic/Solr (for example with username/password).
- Upgrade the stack to the newest version, including the connectors and libraries.
- Refactor / redo the ZLog Client (deprecated libraries and security issues).