31 January 2014

Postfix log centralization and analysis in realtime with Fluentd (td-agent), Elasticsearch and Kibana - part 4

<< Back to part 3 <<

4. Configure Kibana to display Postfix log events

So far so good: from the previous posts we already have log events stored in Elasticsearch. Now we will use Kibana to display some interesting dashboards.

Kibana is a great tool that connects directly to Elasticsearch. It is built entirely from HTML, JavaScript and CSS, with no server-side scripting: Kibana runs directly in your browser and connects your browser to Elasticsearch on port tcp:9200.
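That browser-to-Elasticsearch connection is defined in Kibana's config.js. A minimal sketch, assuming the Kibana 3 style of config file and that Elasticsearch runs on Server2 (the exact host/port must match your own setup):

```javascript
// config.js (Kibana 3) - sketch; adjust the host and port to your environment
define(['settings'],
function (Settings) {
  return new Settings({
    // The browser talks directly to the Elasticsearch HTTP API on tcp:9200
    elasticsearch: "http://10.90.7.195:9200",
    // Dashboard loaded when no other route is given
    default_route: '/dashboard/file/default.json',
  });
});
```

Since the browser itself makes this request, tcp:9200 on the Elasticsearch host must be reachable from wherever you open Kibana, not just from the Kibana web server.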

30 January 2014

Postfix log centralization and analysis in realtime with Fluentd (td-agent), Elasticsearch and Kibana - part 3

<< Back to part 2 <<

3. Configure Elasticsearch

Elasticsearch is a very strong and flexible search engine based on Apache Lucene. It supports load balancing and failover through its shards and replicas technique, which helps it scale out to very large deployments with multiple clustered nodes.

In this post, Elasticsearch will act as the search engine and the final destination of the log stream. The installation and configuration are quite simple and easy.
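The shard and replica settings mentioned above live in elasticsearch.yml. A minimal sketch (the cluster name here is a made-up example; the shard/replica values shown are the defaults of that era):

```
# /etc/elasticsearch/elasticsearch.yml - sketch
cluster.name: mail-log-cluster       # hypothetical name; nodes sharing a name join the same cluster
index.number_of_shards: 5            # how each index is split, for scaling out across nodes
index.number_of_replicas: 1          # extra copies kept on other nodes, for failover
```

With a single node as in this tutorial the defaults are fine; shards and replicas only start to matter once you add more nodes to the cluster.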

24 January 2014

Postfix log centralization and analysis in realtime with Fluentd (td-agent), Elasticsearch and Kibana - part 2

<< Back to part 1 <<

2. Configure Fluentd (td-agent) to receive the log stream from Postfix

Fluentd (td-agent) is a really good log transporter and parser. It has a very clean modular model, supports a lot of log formats - including custom ones - and offers many plugins covering multiple database types.

Fluentd (td-agent) also supports high availability (HA) and log stream load balancing, which helps it scale out to handle very heavy traffic.
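The receive-and-forward pipeline can be sketched in td-agent.conf as a syslog input plus an Elasticsearch output. This is a sketch, not the series' exact config: the listen port 5140 and the tag name are assumptions, and the match block requires the fluent-plugin-elasticsearch plugin to be installed:

```
# /etc/td-agent/td-agent.conf - sketch
# Receive the syslog stream from Postfix (port 5140 is a hypothetical choice)
<source>
  type syslog
  port 5140
  tag postfix
</source>

# Forward parsed events to Elasticsearch (needs fluent-plugin-elasticsearch)
<match postfix.**>
  type elasticsearch
  host 10.90.7.195
  port 9200
  logstash_format true    # daily logstash-YYYY.MM.DD indices, which Kibana expects
  flush_interval 10s
</match>
```

The logstash_format option is what makes the data show up in Kibana with a proper @timestamp field and time-based indices.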

23 January 2014

Postfix log centralization and analysis in realtime with Fluentd (td-agent), Elasticsearch and Kibana

Preface

This tutorial will walk you through how to build a Mail Log Centralized system with Postfix, Fluentd, Elasticsearch and Kibana.

As a result, you will be able to watch log events happening in realtime, inspect the details of a log record, and do some analysis such as Top Senders, Top Receivers, Top Status ...

Some screenshots :

Events happening in realtime.

Detail of a log event.

Term analytics such as Top Senders, Top Receivers, Top Relays ...

List of events

How will all these things work?

In this setup, I will use 2 servers to keep things easy and simple:

- [Server1 - 10.90.7.194 : running Postfix as the SMTP server - this acts as the source log generator]
- [Server2 - 10.90.7.195 : running Fluentd (td-agent) as the log receiver/parser + Elasticsearch as the search engine + Kibana as the web GUI front-end]
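To move the Postfix log from Server1 to Server2, one common approach is to have the local syslog daemon forward the mail facility over the network. A sketch, assuming rsyslog on Server1 and a hypothetical listen port 5140 on the receiver (the series may use a different mechanism):

```
# /etc/rsyslog.conf on Server1 - sketch
# Forward all mail-facility messages (Postfix logs via syslog) over UDP.
# Use @@ instead of @ for TCP; port 5140 is a made-up example.
mail.*    @10.90.7.195:5140
```

Restart rsyslog after editing, and make sure the receiver on Server2 is listening on the same port and protocol.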

16 January 2014

Resume an ssh session with the Linux screen command

First look

You can extend the ssh connection timeout with the ServerAliveInterval parameter, as I mentioned before. But if the connection is accidentally dropped (maybe by a network problem), how can you resume your ssh session?

The Linux screen command comes to the rescue in these situations. With screen you can create as many virtual terminal sessions as you want.

Right after a successful ssh logon, I start screen immediately to get into a virtual terminal. If any problem causes the connection to be lost, I can easily resume the right terminal.
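The workflow can be sketched as the following commands; the session name "work" is a made-up example:

```
screen -S work        # start a new named session right after logging in
# ... work normally; if the ssh connection drops, the session keeps running ...
# (to leave it running on purpose, detach manually with Ctrl-a d)

screen -ls            # after reconnecting, list the sessions still alive
screen -r work        # reattach to the named session
screen -dr work       # or, if it is still attached elsewhere, detach it first and reattach here
```

Naming sessions with -S pays off once you keep several of them around, since screen -r alone is ambiguous when more than one detached session exists.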

Configure the ssh client to keep the session alive so it does not expire too soon

As a system admin, I use ssh to connect to my servers almost every day. I noticed that by default an ssh session will expire quite soon if you don't touch the keyboard for a while (for example, if you just go to the restroom for a few minutes).

How can we prevent this from happening?

The answer is very simple; I will show you how to keep a long-lived session with the sshd server.

Just open the ssh (client) config file: # vim /etc/ssh/ssh_config

and insert the line: ServerAliveInterval 30

This will send a keep-alive packet to the server every 30 seconds.
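Put together, the relevant part of the client config might look like the sketch below. ServerAliveCountMax is an optional extra I am adding here, not part of the original one-liner; it controls how many unanswered keep-alives are tolerated before the client gives up:

```
# /etc/ssh/ssh_config (system-wide) or ~/.ssh/config (per-user) - sketch
Host *
    ServerAliveInterval 30    # send a keep-alive probe every 30 seconds
    ServerAliveCountMax 3     # drop the connection after 3 unanswered probes
```

Using ~/.ssh/config instead of the system-wide file keeps the change scoped to your own account, which is handy on shared machines.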

From now on your ssh connection will stay alive until you disconnect or log out of the session; don't worry about timeouts anymore.