  How to Do Linux Log Analysis
     
  Add Date : 2016-09-13      
         
         
         
  Logs contain a lot of the information you need, though it is not always easy to extract. In this article we cover some examples of basic log analysis you can do right away (just searching). We also cover some more advanced analysis that requires some upfront setup but saves a lot of time later. Examples of advanced analysis include generating summary counts and filtering on field values.

First we'll show you how to use several different tools at the command line, and then show how a log management tool can automate much of the heavy lifting and make log analysis easier.

Search with Grep

Searching through text is the most basic way to find information. The most common text-search tool is grep. This command-line tool is available in most Linux distributions and lets you search your logs with regular expressions. A regular expression is a pattern written in a special language that matches text. The simplest pattern is to put the string you are searching for in quotes.

Regular Expressions

Here is an example of searching for "user hoover" in the authentication log on an Ubuntu system:

$ Grep "user hoover" /var/log/auth.log
Accepted password for hoover from10.0.2.2 port 4792 ssh2
pam_unix (sshd: session): session opened for user hoover by (uid = 0)
pam_unix (sshd: session): session closed for user hoover
It can be hard to construct precise regular expressions. For example, if we wanted to search for a number like the port "4792", it could also match timestamps, URLs, and other unwanted data. In the Ubuntu example below, it also matched an Apache log line that we did not want.

$ Grep "4792" /var/log/auth.log
Accepted password for hoover from10.0.2.2 port 4792 ssh2
74.91.21.46 - [31 / Mar / 2015: 19: 44: 32 + 0000] "? GET / scripts / samples / search q = 4972 HTTP / 1.0" 404545 "-" "-"
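One simple way to reduce false positives like this is to include more of the surrounding text in the pattern, so the number only matches in the context you care about. A small sketch using the same log as above:

$ grep 'port 4792' /var/log/auth.log
Accepted password for hoover from 10.0.2.2 port 4792 ssh2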
Surround Search

Another useful tip is that you can do surround search with grep. This shows you what happened a few lines before or after a match, which can help you debug what led up to a particular problem or error. The -B option gives you the lines before a match, and -A gives you the lines after. For example, we know that when someone fails to log in as an admin and their IP does not reverse resolve, it means they may not have a valid domain name. This is very suspicious!

$ grep -B 3 -A 2 'Invalid user' /var/log/auth.log
Apr 28 17:06:20 ip-172-31-11-241 sshd[12545]: reverse mapping checking getaddrinfo for 216-19-2-8.commspeed.net [216.19.2.8] failed - POSSIBLE BREAK-IN ATTEMPT!
Apr 28 17:06:20 ip-172-31-11-241 sshd[12545]: Received disconnect from 216.19.2.8: 11: Bye Bye [preauth]
Apr 28 17:06:20 ip-172-31-11-241 sshd[12547]: Invalid user admin from 216.19.2.8
Apr 28 17:06:20 ip-172-31-11-241 sshd[12547]: input_userauth_request: invalid user admin [preauth]
Apr 28 17:06:20 ip-172-31-11-241 sshd[12547]: Received disconnect from 216.19.2.8: 11: Bye Bye [preauth]
Tail

You can also use grep in combination with tail to get the last few lines of a file, or to follow the log and print lines in real time. This is useful when you are making interactive changes, such as starting a server or testing a code change.

$ tail -f /var/log/auth.log | grep 'Invalid user'
Apr 30 19:49:48 ip-172-31-11-241 sshd[6512]: Invalid user ubnt from 219.140.64.136
Apr 30 19:49:49 ip-172-31-11-241 sshd[6514]: Invalid user admin from 219.140.64.136
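If you only care about recent activity, you can also limit the search to the last lines of the file. As a quick sketch, the following counts how many "Invalid user" attempts appear in the most recent 100 lines (the -c option makes grep print only the number of matching lines):

$ tail -n 100 /var/log/auth.log | grep -c 'Invalid user'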
A full introduction to grep and regular expressions is outside the scope of this guide, but Ryan's Tutorials covers them in more depth.

Log management systems have higher performance and more powerful search capabilities. They often index their data and run queries in parallel, so you can search gigabytes or terabytes of logs in seconds. In contrast, grep can take minutes or, in extreme cases, hours. Log management systems also use query languages like Lucene, which offer a simpler syntax for retrieving numbers, fields, and more.
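As a rough illustration only, a Lucene-style field query for the earlier port search might look like the line below. The field names are assumptions about how the events were parsed, and the exact syntax varies between log management systems; the point is that it matches only the parsed port field rather than any occurrence of the number:

user:hoover AND port:4792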

Parsing with Cut, AWK, and Grok

Command-Line Tools

Linux offers several command-line tools for text parsing and analysis. They are very useful when you want to quickly parse a small amount of data, but they can take a long time when processing large volumes.

Cut

The cut command allows you to parse fields from delimited logs. A delimiter is a character, such as an equals sign or a comma, that separates fields or key-value pairs.

Suppose we want to parse the user out of the following log line:

pam_unix(su:auth): authentication failure; logname=hoover uid=1000 euid=0 tty=/dev/pts/0 ruser=hoover rhost= user=root
We can use the cut command like the example below to get the text of the eighth field after splitting the line on the equals sign. This is an example on an Ubuntu system:

$ Grep "authentication failure" /var/log/auth.log | cut -d '=' - f 8
root
hoover
root
nagios
nagios
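Building on this example, you can pipe the extracted usernames through sort and uniq to produce the kind of summary counts mentioned at the start of this article. This sketch prints each user together with the number of authentication failures, most frequent first:

$ grep "authentication failure" /var/log/auth.log | cut -d '=' -f 8 | sort | uniq -c | sort -rn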
AWK

Alternatively, you can use awk, which offers more powerful field parsing. It provides a scripting language that lets you filter out almost everything that is not relevant.

For example, suppose we have the following log line on an Ubuntu system and we want to extract the username that failed to log in:

Mar 24 08:28:18 ip-172-31-11-241 sshd[32701]: input_userauth_request: invalid user guest [preauth]
You can use an awk command like the one below. First, the regular expression /sshd.*invalid user/ matches the sshd invalid user lines. Then { print $9 } prints the ninth field, using the default delimiter of a space. This outputs the usernames.

$ awk '/sshd.*invalid user/ { print $9 }' /var/log/auth.log
guest
admin
info
test
ubnt
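Because awk is a full scripting language, you can also do the counting inside awk itself instead of piping to other tools. This sketch tallies each invalid username and prints the totals when it reaches the end of the file:

$ awk '/sshd.*invalid user/ { count[$9]++ } END { for (u in count) print count[u], u }' /var/log/auth.log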
You can read more about how to use regular expressions and print fields in the Awk User's Guide.

Log Management Systems

Log management systems make parsing easier and let users quickly analyze large numbers of log files. They can automatically parse standard log formats, such as common Linux server logs and web logs. This saves a lot of time, because you do not have to write your own parsing logic when troubleshooting a system problem.

Here is an example sshd log message parsed into its remoteHost and user fields. This is a screenshot from Loggly, a cloud-based log management service.

You can also do custom parsing for non-standard formats. A commonly used tool is Grok, which uses a library of common regular expressions to parse raw text into structured JSON. Here is an example Grok configuration for Logstash that parses kernel log files:

filter {
  grok {
    match => { "message" => "%{CISCOTIMESTAMP:timestamp} %{HOST:host} %{WORD:program}%{NOTSPACE} %{NOTSPACE} %{NUMBER:duration}%{NOTSPACE} %{GREEDYDATA:kernel_logs}" }
  }
}
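As another illustration, a Grok filter for the sshd "Invalid user" lines shown earlier might look roughly like the sketch below. It uses patterns from Grok's standard library, but the field names are our own choice and the pattern may need adjusting to your exact message format:

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:host} sshd\[%{POSINT:pid}\]: Invalid user %{USERNAME:user} from %{IP:remote_host}" }
  }
}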

Filtering with Rsyslog and AWK

Filtering lets you search on a specific field value instead of doing a full-text search. This makes your log analysis more accurate, because it ignores matches from irrelevant parts of the log message. To search on a field value, you first need to parse your logs, or at least have a way of searching based on the event structure.

How to Filter on One App

Often, you only want to see the logs from a single application. This is easy if your application always logs to one file. It is more complicated if you need to filter one application out of an aggregated or centralized log. Here are several ways to do it:

Use the rsyslog daemon to parse and filter logs. The example below writes logs from the sshd application to a file named sshd-messages, then discards the event so it is not repeated elsewhere. You can try this example by adding it to your rsyslog.conf file.

:programname, isequal, "sshd" /var/log/sshd-messages
& ~
Use a command-line tool like awk to extract the value of a particular field, such as the sshd username. Here is an example from an Ubuntu system:

$ awk '/sshd.*invalid user/ { print $9 }' /var/log/auth.log
guest
admin
info
test
ubnt
Use a log management system to automatically parse the logs, then click to filter on the desired application name. Here is a screenshot from the Loggly log management service showing the extracted syslog fields, filtered on the app name "sshd".

How to Filter on Errors

One of the things people most want to see in their logs is errors. Unfortunately, the default syslog configuration does not output the severity of errors directly, which makes them difficult to filter on.

There are two ways to solve this problem. First, you can modify your rsyslog configuration to output the severity in the log file, making it easier to read and search. In your rsyslog configuration you can add a template with pri-text, like this:

"<% Pri-text%>:% timegenerated%,% HOSTNAME%,% syslogtag%,% msg% n"
This example gives you output in the following format. You can see that the severity in this message is err:

<authpriv.err> : Mar 11 18:18:00,hoover-VirtualBox,su[5026]:, pam_authenticate: Authentication failure
You can retrieve the error messages with awk or grep. In this Ubuntu example, we match on syntax that is unique to the severity field, such as the . and the >, so we only match that field.

$ grep '.err>' /var/log/auth.log
<authpriv.err> : Mar 11 18:18:00,hoover-VirtualBox,su[5026]:, pam_authenticate: Authentication failure
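The same match also gives you a quick count of how many error-severity events are in the file, since grep -c prints only the number of matching lines:

$ grep -c '.err>' /var/log/auth.log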
Your second option is to use a log management system. Good log management systems automatically parse syslog messages and extract the severity field. They also allow you to filter on log messages of a specific severity with a single click.

Screenshot: Loggly highlighting the error severity in the syslog fields, showing that we are filtering on errors.
     
         
         
         