
Logging

sepidetari edited this page Jul 10, 2018 · 38 revisions

Download and install

Download the images, then start the Elasticsearch, Kibana, and Logstash Docker containers. The order in which they are started is important:

Elasticsearch
Windows: 
docker run -d --name elasticsearch  -p 9200:9200 -p 9300:9300 -v C:/esdata:/usr/share/elasticsearch/data docker.elastic.co/elasticsearch/elasticsearch:6.0.1

Linux: 
sudo docker run -d --name elasticsearch  -p 9200:9200 -p 9300:9300 -v "/esdata/":/usr/share/elasticsearch/data docker.elastic.co/elasticsearch/elasticsearch:6.0.1

Alternatively, with the HTTP and transport hosts set explicitly:
sudo docker run -d --name elasticsearch  -p 9200:9200 -p 9300:9300 -e "http.host=0.0.0.0" -e "transport.host=127.0.0.1" -v "/esdata/":/usr/share/elasticsearch/data docker.elastic.co/elasticsearch/elasticsearch:6.0.1
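Since the startup order matters, it can help to wait until Elasticsearch actually answers on port 9200 before starting Kibana and Logstash. A minimal sketch, assuming Elasticsearch is published on localhost:9200 as in the commands above (`wait_for_es` is a hypothetical helper, not part of the stack):

```shell
# Hypothetical helper: poll Elasticsearch until it responds over HTTP.
wait_for_es() {
  local url="${1:-http://localhost:9200}"
  local tries="${2:-30}"
  for ((i = 0; i < tries; i++)); do
    # curl -s exits with 0 once Elasticsearch answers on the HTTP port
    if curl -s "$url" > /dev/null; then
      echo "Elasticsearch is up"
      return 0
    fi
    sleep 2
  done
  echo "Elasticsearch did not come up in time" >&2
  return 1
}

# Usage, after the docker run command above:
#   wait_for_es && <start Kibana and Logstash>
```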

Kibana
Windows: 
docker run --name kibana --link elasticsearch:elasticsearch -p 5601:5601 -d docker.elastic.co/kibana/kibana:6.0.1

Linux: 
sudo docker run --name kibana --link elasticsearch:elasticsearch -p 5601:5601 -d docker.elastic.co/kibana/kibana:6.0.1

Logstash configuration file:

Create a logstash.conf file with the following content and store it in C:/usr/share/logstash/pipeline/ (Windows) or /usr/share/logstash/pipeline/ (Linux):

input {
  file {
    path => "/logs/*.log"
    type => "logs"
  }
}

filter {
  grok {
    match => { "message" => "\A%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:loglevel}\s+%{NOTSPACE:thread}\s+\"%{DATA:service_name}\"\s+%{GREEDYDATA:msg}" }
    remove_field => [ "message" ]
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "sask"
  }
}
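The grok filter above splits each log line into timestamp, loglevel, thread, service_name, and msg fields. A minimal sketch of what it extracts, using a plain Bash ERE approximation of the grok pattern (the sample log line and the service name "my-service" are illustrative):

```shell
#!/usr/bin/env bash
# Hypothetical log line in the format produced by the Logback pattern defined below.
line='2018-07-10 12:00:00.123 INFO  [main] "my-service" [com.example.Foo] c.e.Foo - started'

# Simplified ERE approximation of the grok pattern in logstash.conf:
# TIMESTAMP_ISO8601, LOGLEVEL, NOTSPACE (thread), quoted DATA (service_name), GREEDYDATA (msg)
re='^([0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3})[[:space:]]+([A-Z]+)[[:space:]]+([^[:space:]]+)[[:space:]]+"([^"]*)"[[:space:]]+(.*)$'

if [[ $line =~ $re ]]; then
  echo "timestamp=${BASH_REMATCH[1]}"
  echo "loglevel=${BASH_REMATCH[2]}"
  echo "thread=${BASH_REMATCH[3]}"
  echo "service_name=${BASH_REMATCH[4]}"
  echo "msg=${BASH_REMATCH[5]}"
fi
```

Note that NOTSPACE captures the thread including its surrounding brackets (e.g. [main]), and msg keeps everything after the quoted service name.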
Logstash
Windows: 
docker run -p 5044:5044 --name logstash --link elasticsearch:elasticsearch -v C:/usr/share/logstash/pipeline/:/usr/share/logstash/pipeline/ -v c:/logs/:/logs docker.elastic.co/logstash/logstash:6.0.1 -f /usr/share/logstash/pipeline/logstash.conf

Linux: 
sudo docker run -p 5044:5044 --name logstash --link elasticsearch:elasticsearch -v /usr/share/logstash/pipeline/:/usr/share/logstash/pipeline/ -v /logs/:/logs docker.elastic.co/logstash/logstash:6.0.1 -f /usr/share/logstash/pipeline/logstash.conf

Hint:

The logs and esdata host folders mounted into the running Logstash and Elasticsearch containers must be accessible to those containers.
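One way to ensure that is to create the host folders and open up their permissions before starting the containers. A sketch, assuming the Linux host paths /esdata and /logs used in the commands above (`prepare_dirs` is a hypothetical helper; uid 1000 is the user the official Elasticsearch image runs as):

```shell
# Hypothetical helper: create the host directories and open their permissions.
prepare_dirs() {
  mkdir -p "$@"
  chmod -R a+rwX "$@"
}

# On the host, before starting the containers:
#   sudo prepare_dirs /esdata /logs
#   sudo chown -R 1000:1000 /esdata   # uid of the elasticsearch user in the image
```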

Logback configuration file:

For each new microservice, a logback-spring.xml configuration file with exactly the following content should be defined.

<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true">

	<appender name="consoleAppender" class="ch.qos.logback.core.ConsoleAppender">
		<encoder>
			<charset>UTF-8</charset>
			<Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] "@project.artifactId@" [%C] %logger{36} - %msg%n</Pattern> 
		</encoder>
	</appender>

	<appender name="FILE"
		class="ch.qos.logback.core.rolling.RollingFileAppender">
		<file>@logback.dir@@project.artifactId@.log</file>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<!-- daily rollover. -->
			<fileNamePattern>@logback.archive@@project.artifactId@.%d{yyyy-MM-dd}.%i.log
			</fileNamePattern>

			<timeBasedFileNamingAndTriggeringPolicy
				class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
				<maxFileSize>10MB</maxFileSize>
			</timeBasedFileNamingAndTriggeringPolicy>
			<!-- keep 7 days' worth of history -->
			<maxHistory>7</maxHistory>
		</rollingPolicy>

		<encoder>
			<charset>UTF-8</charset>
			<Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] "@project.artifactId@" [%C] %logger{36} - %msg%n</Pattern> 
		</encoder>
	</appender>

	<root level="@logback.level.file@">
		<appender-ref ref="FILE" />
	</root>
	<root level="@logback.level.console@">
		<appender-ref ref="consoleAppender" />
	</root>
</configuration>
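The @...@ tokens in this file are Maven resource-filtering placeholders, substituted at build time. A sketch of the pom.xml fragment they assume (property names other than project.artifactId are taken from the config above; the values shown are purely illustrative):

```xml
<!-- Enable filtering so @...@ placeholders in src/main/resources are substituted -->
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
    </resource>
  </resources>
</build>

<properties>
  <!-- Illustrative values; adjust per service -->
  <logback.dir>/logs/</logback.dir>
  <logback.archive>/logs/archive/</logback.archive>
  <logback.level.file>INFO</logback.level.file>
  <logback.level.console>INFO</logback.level.console>
</properties>
```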
The Kibana UI is currently accessible at the following address:

project-group-search-extraction-staging.cs.uni-paderborn.de:5601/app/kibana