7/30/2017

Sending Logs to ELK Stack through Logback


[Screenshot: Sample Kibana Dashboard]



Prerequisites & Setup Steps

1. A Java application already configured with Logback as its logging framework, with the configuration written in Groovy.

Sample Java Startup Script

java \
-Xms512m -Xmx1024m \
-XX:+HeapDumpOnOutOfMemoryError \
-XX:HeapDumpPath="/home/uranadh/opensource/kafka_connect_config/heap-dump.hprof" \
-cp "distributed-services-1.0.0.jar:lib/*" \
-Dlogback.configurationFile=/home/uranadh/opensource/kafka_connect_config/logback.groovy \
org.reactor.monitoring.application.internal.Member
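
For context, here is a minimal sketch of the kind of class being launched above. The body of Member is an assumption (the real class is not shown in this post); what matters is that the application logs through the SLF4J API, and Logback picks up the logback.groovy file passed via -Dlogback.configurationFile:

package org.reactor.monitoring.application.internal;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical entry point; only the SLF4J calls are relevant here.
public class Member {

    // SLF4J facade, backed by Logback on the classpath.
    private static final Logger LOG = LoggerFactory.getLogger(Member.class);

    public static void main(String[] args) {
        // Both statements match the org.reactor.monitoring logger rule in
        // the Groovy config below, so they are shipped straight to Logstash.
        LOG.info("Member starting up");
        LOG.debug("Initializing cluster membership...");
    }
}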

2. Configure the logback.groovy file with the Logstash appender.

Note that we use the Logstash TCP appender from the logstash-logback-encoder project:

https://github.com/logstash/logstash-logback-encoder#pattern-json-provider
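
The appender ships in the logstash-logback-encoder library, so it has to be on the application classpath (e.g. dropped into lib/ above). With Maven that is roughly the following; the version shown is an assumption, use whatever release is current:

<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <!-- version is an assumption; pick the current release -->
  <version>4.11</version>
</dependency>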

Groovy file


import ch.qos.logback.classic.AsyncAppender

import static ch.qos.logback.classic.Level.DEBUG
import static ch.qos.logback.classic.Level.INFO

import net.logstash.logback.appender.LogstashTcpSocketAppender
import net.logstash.logback.encoder.LogstashEncoder

// Ships log events as JSON lines over TCP to Logstash.
appender("STASH", LogstashTcpSocketAppender) {
  println "Setting [destination] property to 127.0.0.1:5000"
  destination = "127.0.0.1:5000"
  encoder(LogstashEncoder) {
  }
}

// Wraps STASH so that logging never blocks application threads.
appender("ASYNC", AsyncAppender) {
  discardingThreshold = 0  // 0 = never proactively discard TRACE/DEBUG/INFO events
  queueSize = 500          // in-memory buffer between the app and the TCP appender
  neverBlock = true        // drop events instead of blocking when the queue is full
  appenderRef("STASH")
}

//root(DEBUG, ["ASYNC"])
root(INFO, ["ASYNC"])

// Application packages log at DEBUG directly to STASH; additivity=false
// stops these events from also propagating to the root (ASYNC) appender.
logger("org.reactor.monitoring", DEBUG, ["STASH"], false)

3. Install & Run Elasticsearch

https://www.elastic.co/downloads/elasticsearch

elasticsearch-5.5.0/bin$ ./elasticsearch
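
A quick sanity check that the node is up; this should return a small JSON document with the cluster name and version:

curl http://localhost:9200/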


4. Install, Configure & Run Logstash


https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

Sample Logstash Configuration: logstash-filter.conf


input {
  tcp {
    port => 5000
    codec => "json"
  }
}

filter {
  # COMBINEDAPACHELOG targets Apache access logs; plain application log
  # lines will not match and are tagged with _grokparsefailure (see the
  # sample console output below), but they are still indexed.
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dlogs-%{+YYYY.MM.dd}"
    document_type => "log"
  }
  stdout { codec => rubydebug }
}

Run Logstash

logstash-5.5.1$ bin/logstash -f logstash-filter.conf
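
Before starting the Java application, you can test the pipeline end to end by pushing a hand-written JSON event into the TCP input with nc (the event fields here are made up for illustration):

echo '{"message":"hello from nc","level":"INFO"}' | nc localhost 5000

The event should be printed by the stdout/rubydebug output and indexed into today's dlogs-* index.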


Sample Console Output

{
     "@timestamp" => 2017-07-30T15:28:37.792Z,
          "level" => "INFO",
           "port" => 52778,
    "thread_name" => "hz.ShutdownThread",
    "level_value" => 20000,
       "@version" => 1,
           "host" => "127.0.0.1",
    "logger_name" => "com.hazelcast.core.LifecycleService",
        "message" => "[10.180.35.234]:8701 [hibernate] [3.7.3] [10.180.35.234]:8701 is SHUTDOWN",
           "tags" => [
        [0] "_grokparsefailure"
    ]
}


5. Install & Run Kibana

https://www.elastic.co/guide/en/kibana/current/install.html

kibana-5.5.1-linux-x86_64$ ./bin/kibana 

6. Go to the Kibana Dashboard

On first use, create an index pattern matching dlogs-* under Management > Index Patterns; the Discover view then shows the incoming events:

http://localhost:5601/app/kibana#/discover?_g=(refreshInterval:(display:Off,pause:!f,value:0),time:(from:now-15m,mode:quick,to:now))&_a=(columns:!(_source),index:'dlogs-*',interval:auto,query:(query_string:(analyze_wildcard:!t,query:'*')),sort:!('@timestamp',desc))
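
Kibana reads from the same indices, so you can also confirm documents are landing by querying Elasticsearch directly:

curl 'localhost:9200/dlogs-*/_search?q=level:INFO&pretty'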

