01 September 2014

OpenVPN with OTP one time password by google authenticator working all the time part 4

<< Back to part 3 <<

5. Testing with pamtester :

We will need a tool named pamtester for debugging when working with PAM modules.

# yum install pamtester

By default, the Linux PAM package ships with configuration for a lot of services :
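You can inspect those service configurations and then exercise one directly with pamtester. A minimal sketch (the `openvpn` service name and the user `alice` are examples from this series; the pamtester call is commented out because it prompts interactively for credentials):

```shell
# See which services ship a PAM configuration by default:
ls /etc/pam.d
# Then exercise a service directly, without involving the real application:
# pamtester openvpn alice authenticate
```

This lets you debug the PAM stack in isolation before wiring it into OpenVPN.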

OpenVPN with OTP one time password by google authenticator working all the time part 3

<< Back to part 2 <<

4. Install google authenticator PAM module 

We need to install the pam-devel package to build the Google Authenticator PAM module.

# yum install pam-devel

Go to http://code.google.com/p/google-authenticator/ to download the libpam-google-authenticator source code.

# wget http://google-authenticator.googlecode.com/files/libpam-google-authenticator-1.0-source.tar.bz2

# tar xvf libpam-google-authenticator-1.0-source.tar.bz2

# cd libpam-google-authenticator-1.0

# make install
...........
...........
cp pam_google_authenticator.so /lib64/security
cp google-authenticator /usr/local/bin
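Once the module is installed, the PAM service that OpenVPN will use can reference it. A minimal sketch of such a service file (the `openvpn` service name and module lines here are assumptions for illustration, not from the original post):

```
# /etc/pam.d/openvpn (sketch)
auth    required    pam_google_authenticator.so
account required    pam_permit.so
```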

24 August 2014

OpenVPN with OTP one time password by google authenticator working all the time part 2

<< Back to part 1 <<

3. Create user (Alice) certificate for authentication :

Generate Alice private key

# openssl genrsa -out alice.key 2048

Generate Alice certificate request

Notice : the common name (CN) should match the username = alice

# openssl req -out alice.req -key alice.key -new

Sign Alice request by using rootCA

# openssl x509 -in alice.req -out alice.cert -days 365 -req -CA rootCA.cert -CAkey rootCA.key -CAcreateserial
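As a quick sanity check that the signed certificate chains back to the CA, the whole flow can be replayed and verified in one self-contained demo (key sizes and file names mirror the article, but the rootCA files are regenerated here, so run this in a scratch directory):

```shell
# Build a demo CA, then a user key/request, sign it, and verify the chain
openssl genrsa -out rootCA.key 2048
openssl req -x509 -new -key rootCA.key -out rootCA.cert -days 365 -subj "/CN=Demo Root CA"
openssl genrsa -out alice.key 2048
openssl req -out alice.req -key alice.key -new -subj "/CN=alice"
openssl x509 -in alice.req -out alice.cert -days 365 -req \
    -CA rootCA.cert -CAkey rootCA.key -CAcreateserial
openssl verify -CAfile rootCA.cert alice.cert   # expect: alice.cert: OK
```

If the verify step does not print OK, the server will reject Alice's certificate during the TLS handshake.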

23 August 2014

OpenVPN with OTP one time password by google authenticator working all the time

So why OpenVPN and not L2TP, PPTP, or IPsec ?

The good things :

- It can run on tcp:80 or tcp:443, so there is no need to worry about firewalls or being behind NAT.

- It can also be used through an HTTP proxy.

- SSLv3 + PKI certificate authentication.

- Totally free.

- Multiple platforms supported : Windows, Linux, Mac, iOS, Android ...

The bad things :

- Some VPN solutions like L2TP or PPTP are natively supported by Windows; with OpenVPN you need to install the client manually.
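To illustrate the firewall and proxy points above, a minimal client config can be sketched like this (the hostname, proxy address, and ports are placeholders; the certificate file names follow this series):

```
# client.ovpn (sketch)
client
dev tun
proto tcp
remote vpn.example.com 443
# when behind an HTTP proxy, uncomment:
# http-proxy proxy.example.com 3128
ca rootCA.cert
cert alice.cert
key alice.key
```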

14 July 2014

Fix yahoo smtp mail error code 554 Message not allowed PH01 not accepted for policy reasons

Blocked by yahoo smtp server

For some reason, my mail server has been denied by the Yahoo SMTP server and I can not send email to the yahoo.com domain. The mail log looks something like this :

B078B14A8008: to=someone@yahoo.com, relay=mta5.am0.yahoodns.net[66.196.118.34]:25, delay=2.9, delays=0.04/0/0.78/2.1, dsn=5.0.0, status=bounced (host mta5.am0.yahoodns.net[66.196.118.34] said: 554 Message not allowed - [PH01] Email not accepted for policy reasons. Please visit http://postmaster.yahoo.com/errors/postmaster-27.html [120] (in reply to end of DATA command))

I have tried moving my mail server to another IP but had no luck. I also tried contacting the Yahoo postmaster but got no response.

05 May 2014

Config log4net send log to elasticsearch with fluentd and kibana - realtime and centralization - part 2

<< Back to part 1 <<

1. Config Log4net to push logs in syslog format :

On the Log4net machine, config the appender like this :

<appender name="UdpAppender" type="log4net.Appender.UdpAppender">
  <remoteAddress value="10.90.7.195" />
  <remotePort value="5140" />
  <layout type="log4net.Layout.PatternLayout, log4net">
   <conversionPattern value="&lt;190&gt;%date{MMM dd HH:mm:ss} %P{log4net:HostName} %logger: %thread %level %logger Inside-Log %P{log4net:HostName} [[%message" />
  </layout>
  <filter type="log4net.Filter.LevelRangeFilter">
   <param name="LevelMin" value="INFO" />
   <param name="LevelMax" value="ERROR" />
  </filter>
</appender>
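On the receiving side, a matching Fluentd (td-agent) source could look like this sketch (the tag name is an assumption; port 5140 matches the appender above):

```
# /etc/td-agent/td-agent.conf (sketch)
<source>
  type syslog
  port 5140
  bind 0.0.0.0
  tag log4net
</source>
```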

Config log4net send log to elasticsearch with fluentd and kibana - realtime and centralization.

This post will help you to config the Apache log4net library to output logs to a centralized logging system, which can be used to watch realtime log events + searching + analysis ...

I will use 2 machines in this scenario :

- The Log4net machine with IP = 10.90.7.194 : this machine acts as the source log generator (it runs an application which uses the Apache Log4net library). Log4net will be configured to send the log stream in syslog format over UDP.

17 March 2014

Postfix log centralize and analysis in realtime with fluentd elasticsearch and kibana - part 5

<< Back to Part 4 <<

5. Advanced configuration :

If you followed the previous parts, you should now have Postfix log events displaying in Kibana, but there are some advanced tricks that you may find interesting.

31 January 2014

Postfix log centralize and analysis in realtime with fluentd tdagent elasticsearch and kibana - part 4

<< Back to part 3 <<

4. Config Kibana to show Postfix log event

So far so good : from the previous posts we already have event logs stored in Elasticsearch, and now we need to use Kibana to display interesting dashboards.

Kibana is a great tool which connects directly to Elasticsearch. It is all based on HTML, JavaScript and CSS, without any server-side scripting : Kibana runs directly in your browser and connects your browser to Elasticsearch on port tcp:9200.

30 January 2014

Postfix log centralize and analysis in realtime with fluentd tdagent elasticsearch and kibana - part 3

<< Back to part 2 <<

3. Config Elasticsearch

Elasticsearch is a very strong and flexible search engine based on Apache Lucene. It supports load balancing and failover through its shards and replicas technique, which helps it scale out to very big deployments with multiple clustered nodes.

In this post, Elasticsearch will act as the search engine and the final destination of the log stream. The installation and configuration are quite simple and easy.
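The shard and replica settings mentioned above can be sketched in the main config file (the cluster name and values here are illustrative, not from the original post):

```
# /etc/elasticsearch/elasticsearch.yml (sketch)
cluster.name: logging-cluster
index.number_of_shards: 5
index.number_of_replicas: 1
```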

24 January 2014

Postfix log centralize and analysis in realtime with fluentd tdagent elasticsearch and kibana - part 2

<< Back to part 1 <<

2. Config Fluentd (td-agent) to receive log stream from Postfix

Fluentd (td-agent) is a really good log transporter and parser. It has a very clear modular model, supports a lot of log formats - including custom formats - and has a lot of plugins which support multiple database types.

Fluentd (td-agent) also supports H.A (high availability) and log stream load balancing, which helps it scale out to very heavy traffic models.
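The H.A setup can be sketched with Fluentd's built-in forward output (the standby IP and the tag pattern are placeholders for illustration):

```
# td-agent.conf (sketch): forward the stream to a primary and a standby aggregator
<match postfix.**>
  type forward
  <server>
    host 10.90.7.195
    port 24224
  </server>
  <server>
    host 10.90.7.196
    port 24224
    standby
  </server>
</match>
```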

23 January 2014

Postfix log centralize and analysis in realtime with fluentd tdagent elasticsearch and kibana

Preface

This tutorial will walk you through how to build a Mail Log Centralized system with Postfix, Fluentd, Elasticsearch and Kibana.

As a result, you will be able to see log events happening in realtime, view the details of a log record, and do some analysis like Top Senders, Top Receivers, Top Status ...

Some screenshots :

Events happening in realtime.

Details of a log event.

Term analytic as Top Senders, Top Receivers, Top Relays ...

List of events

How do all these things work ?

In this setup, I will use 2 servers to keep things easy and simple :

- [Server1 - 10.90.7.194 : running Postfix as SMTP server - This act as the source log generator]
- [Server2 - 10.90.7.195 : running Fluentd (td-agent) as log receiver/parser + ElasticSearch as SearchEngine + Kibana as Web GUI Front-end]

16 January 2014

Resume ssh session with linux screen command

First look

You can extend the ssh connection timeout with the ServerAliveInterval parameter, as I have mentioned before. But if the connection is accidentally dropped (maybe by a network problem), how can you resume your ssh session ?

The Linux screen command is the rescue in situations like these. With the screen command you can create as many virtual terminal sessions as you want.

Right after a successful ssh logon, I start screen immediately to get into a virtual terminal. If any problem causes the connection to be lost, I can easily resume the right terminal.
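That workflow can be sketched like this (the session name is an example; the reattach step is commented out because it is interactive):

```shell
# Demo of the screen workflow (skipped entirely if screen is not installed)
if command -v screen >/dev/null 2>&1; then
    screen -S mysession -d -m     # start a new detached virtual terminal
    screen -ls                    # "mysession" should appear in the list
    # screen -r mysession         # reattach after a dropped ssh connection
    screen -S mysession -X quit   # terminate the demo session
fi
```

In practice you would just run plain `screen` right after logging in, and `screen -r` after reconnecting.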

Config ssh client to keep alive with server so the session does not expire too soon

As a system admin, I use ssh to connect to my servers almost every day. I realize that, by default, an ssh session will soon expire if you don't touch the keyboard for a while (for example if you just go to the rest room for a few minutes).

How to prevent this happen ?

The answer is very simple : I will show you how to keep a long-lived session with the sshd server.

Just open the ssh (client) config file at : #vim /etc/ssh/ssh_config

and insert a line : ServerAliveInterval 30

This will send a keep-alive packet to the server every 30 seconds.
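The same idea in config form, together with the related ServerAliveCountMax option (3 is the OpenSSH default; the Host block applies it to every server):

```
# /etc/ssh/ssh_config  (or per-user ~/.ssh/config)
Host *
    ServerAliveInterval 30
    ServerAliveCountMax 3
```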

From now on your ssh connection will stay alive until you disconnect or log out of the session, so don't worry about timeouts anymore.