
Adesso ZLog

ZLog is an open-source project that uses performance-optimized databases to search and filter large amounts of log data in real time.

The latest release uses the following stack:

(Screenshot: overview of the stack)


Contents

  1. Requirements
  2. Usage
  3. Configuration
  4. File import with Logstash and Filebeat
  5. Extensibility
  6. Future and TODOs

Requirements

Host setup

Note

Especially on Linux, make sure your user has the required permissions to interact with the Docker daemon.
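For example, on most Linux distributions you can check access and, if necessary, add your user to the docker group (a sketch; assumes a standard Docker Engine installation):

docker info                      # fails with a permission error if your user lacks access
sudo usermod -aG docker "$USER"  # add your user to the docker group
# Log out and back in (or run `newgrp docker`) for the change to take effect.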

By default, the stack exposes the following ports:

  • 9200: Elasticsearch HTTP
  • 8983 & 9983: Solr HTTP
  • 5044: Logstash Beats input
  • 50000: Logstash TCP input
  • 9600: Logstash monitoring API
  • 5601: Kibana
  • 3000: ZLog GUI
  • 8090: ZLog Gateway

Usage

Build the project

Execute the following command in the root folder to build the client and gateway:

./mvnw install

Run the complete stack

With the docker-compose.yml file you can start the whole stack with a single command:

docker-compose up

If you want to change which components are started, edit this file.
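Once the containers are up, you can verify that the core services respond on their default ports (assuming Docker runs locally with the default port mappings listed above):

curl http://localhost:9200          # Elasticsearch HTTP
curl http://localhost:8983/solr/    # Solr admin UI
curl http://localhost:9600/_node    # Logstash monitoring API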

Data Schema

The ZLog client requires the retrieved data to match a specific format with certain fields; otherwise the client won't show any log data.

The following fields and values are needed for the GUI to initially work:

  • time in format YYYY-MM-DDTHH:MM:SS.000Z Example Value: 2016-01-28T04:30:43.000Z
  • level Allowed values: INFO, DEBUG, WARN, ERROR
  • message Example Value: This is the log message
  • component Example Value: ComponentTest
  • application Example Value: ApplicationName

Other additional fields can be defined; they will be shown in a detailed view when you click a log entry in the GUI. A complete example entry is sketched after the list below.

Examples:

  • logger Example Value: Log4J
  • hostName Example Value: adesso.server
  • threadName Example Value: main
  • XXX Example Value: XXX
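Putting it all together, a log entry that the GUI can display might look like this (all values taken from the examples above):

{
  "time": "2016-01-28T04:30:43.000Z",
  "level": "INFO",
  "message": "This is the log message",
  "component": "ComponentTest",
  "application": "ApplicationName",
  "logger": "Log4J",
  "hostName": "adesso.server",
  "threadName": "main"
}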

There are two ways to make this work:

  • The uploaded log files are already formatted and contain all the required fields.
  • The uploaded log files are not formatted. In this case you need to create a Logstash filter for every log file format that is going to be used; Logstash then parses the files according to the filter and writes the result to Elasticsearch / Solr (see the example filter in the Logstash section below).

Configuration

Important

Configuration is not dynamically reloaded; you will need to restart the affected components after any configuration change.

How to configure Elasticsearch

Please refer to the following documentation page for more details about how to configure Elasticsearch inside Docker containers: Install Elasticsearch with Docker.

How to configure Solr

Please refer to the following documentation page for more details about how to configure Solr inside Docker containers: Install Solr with Docker.

How to configure Kibana

Please refer to the following documentation page for more details about how to configure Kibana inside Docker containers: Install Kibana with Docker.

How to configure Logstash

The Logstash configuration is stored in logstash/config/logstash.yml.

The Logstash pipelines and filters are stored in logstash/pipeline/.

Please refer to the following documentation page for more details about how to configure Logstash inside Docker containers: Configuring Logstash for Docker.

How to configure Filebeat

The Filebeat default configuration is stored in filebeat/config/filebeat.yml.

Please refer to the following documentation page for more details about how to configure Filebeat inside Docker containers: Configuring Filebeat for Docker.

How to configure ZLog

Client

  • The build configurations (local, development, production) are stored in Constants_env.js. When building the project, make sure the correct configuration is used.

Gateway

Security

ZLog supports two security modes: security on (OAuth2) and security off. Make sure you configure both the GUI and the gateway with the same security mode when building the project; otherwise the GUI won't be able to connect to the gateway!

  • The config for the GUI is located in package.json and Constants_env.js.
  • The config for the gateway is located in application.properties. Here you can set security.enabled to true or false (see the snippet after this list).
  • When security is enabled, make sure the required parameters for the OAuth2 provider are correct. These are stored in application.yml.
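As a minimal sketch, disabling security in the gateway's application.properties looks like this:

# application.properties -- run the gateway without OAuth2
security.enabled=false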

File import with Logstash and Filebeat

There are various tools for importing your log files into Elasticsearch and Solr. One of the most popular is Logstash in combination with Filebeat.

Filebeat

Filebeat simply looks for logs in a specific path (input) and writes all found data to a defined output (in our case Logstash). By default, all files matching /var/log/*.log are automatically sent to Logstash for parsing, but you can change this behaviour to suit your needs.
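A minimal Filebeat configuration along those lines could look like this (a sketch for recent Filebeat versions; the logstash host name assumes the Docker Compose service name, and 5044 is the Beats port listed above):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log

output.logstash:
  hosts: ["logstash:5044"]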

Logstash

Logstash is a powerful tool that can parse and customize logs before importing them into Elasticsearch and Solr. For this to work, Logstash needs to know what the incoming log file looks like and how to parse it, so a filter has to be written for every log file format that should be imported. An example filter is located in logstash/pipeline/logstash.conf; it parses and filters the Windows_2k log located in FileConfiguration/Windows_2k.log.

So if you want to parse your own log file, you need to write a filter for it, put it in the logstash/pipeline directory, and make sure it is picked up correctly by Logstash.
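As an illustration only, a pipeline that receives events from Filebeat, extracts the required fields with a hypothetical grok pattern, and writes them to Elasticsearch could look roughly like this (the pattern, host names, and the application value are placeholders you must adapt to your log format):

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    # Illustrative pattern for lines like "2016-01-28T04:30:43.000Z INFO ComponentTest - some message"
    match => { "message" => "%{TIMESTAMP_ISO8601:time} %{LOGLEVEL:level} %{DATA:component} - %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }
  mutate {
    # "application" is required by the GUI; set it to a fixed placeholder here
    add_field => { "application" => "ApplicationName" }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}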

For more information, check out:

https://www.elastic.co/guide/en/logstash/current/index.html

https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html

https://www.elastic.co/guide/en/logstash/current/filter-plugins.html

Extensibility

How to add plugins

To add plugins to any component you have to:

  1. Add a RUN statement to the corresponding Dockerfile (e.g. RUN logstash-plugin install logstash-filter-json); see the sketch after this list
  2. Add the associated plugin configuration to the service configuration (e.g. Logstash input/output)
  3. Rebuild the images using the docker-compose build command
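For Logstash, step 1 could look like this (a sketch; the base image tag is a placeholder for the version you actually use):

FROM docker.elastic.co/logstash/logstash:<version>
# Install an additional filter plugin at image build time
RUN logstash-plugin install logstash-filter-json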

Future and TODOs

  • Fix the data import from Logstash into Solr.
  • Secure the connection between the gateway and Elasticsearch/Solr (for example with username/password).
  • Upgrade the stack to the newest version, including the connectors and libraries.
  • Refactor or redo the ZLog client (deprecated libraries and security issues).