How to use ELK Stack for log analysis in a Linux environment?


1. Introduction to ELK Stack
ELK Stack is a log analysis platform composed of three open-source components: Elasticsearch, Logstash, and Kibana. Elasticsearch is a distributed real-time search and analytics engine, Logstash is a tool for collecting, processing, and forwarding logs, and Kibana is an interface for visualizing and analyzing logs.

2. Install ELK Stack

  1. Install Elasticsearch
    (1) Download the Elasticsearch installation package (version 7.15.2 is used in this article):
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.15.2-linux-x86_64.tar.gz

(2) Unzip the installation package:

tar -zxvf elasticsearch-7.15.2-linux-x86_64.tar.gz

(3) Run Elasticsearch:

cd elasticsearch-7.15.2/bin
./elasticsearch
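
If you want Elasticsearch to keep running in the background, you can start it as a daemon instead; the -d (daemonize) and -p (pid file) options are supported by the elasticsearch startup script:

# run Elasticsearch in the background and write its process ID to ./pid
./elasticsearch -d -p pid
# stop the daemonized process later
kill $(cat pid)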

(4) Verify that Elasticsearch is running: visit http://localhost:9200 in a browser. If information similar to the following is returned, the installation was successful:

{
  "name" : "xxxx",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "xxxx",
  "version" : {
    "number" : "7.15.2",
    "build_flavor" : "default",
    "build_type" : "tar",
    "build_hash" : "xxxx",
    "build_date" : "xxxx",
    "build_snapshot" : false,
    "lucene_version" : "xxxx",
    "minimum_wire_compatibility_version" : "xxxx",
    "minimum_index_compatibility_version" : "xxxx"
  },
  "tagline" : "You Know, for Search"
}
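
You can perform the same check from the command line as well, for example with curl:

curl http://localhost:9200
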
  2. Install Logstash
    (1) Download the Logstash installation package (version 7.15.2 is used here):
wget https://artifacts.elastic.co/downloads/logstash/logstash-7.15.2.tar.gz

(2) Unzip the installation package:

tar -zxvf logstash-7.15.2.tar.gz

(3) Create a Logstash configuration file, such as logstash.conf:

input {
  file {
    path => "/var/log/nginx/access.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nginx-access-log"
  }
  stdout { codec => rubydebug }
}

The above configuration file specifies the input log path, uses a Grok pattern to parse the log format, sends the processed logs to Elasticsearch, and prints debugging information to the terminal through the stdout plugin.
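
For reference, with this configuration the stdout/rubydebug output for a single nginx access-log line looks roughly like the following (the values here are purely illustrative, and the exact set of fields depends on your log format and Logstash version):

{
       "message" => "127.0.0.1 - - [29/Jul/2023:16:53:11 +0800] \"GET /index.html HTTP/1.1\" 200 612 \"-\" \"Mozilla/5.0\"",
      "clientip" => "127.0.0.1",
     "timestamp" => "29/Jul/2023:16:53:11 +0800",
          "verb" => "GET",
       "request" => "/index.html",
      "response" => "200",
         "bytes" => "612",
          "path" => "/var/log/nginx/access.log"
}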

(4) Run Logstash:

cd logstash-7.15.2/bin
./logstash -f logstash.conf

Note: The settings in logstash.conf need to be adjusted to match your actual environment.
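
You can also ask Logstash to check the configuration syntax and exit without starting the pipeline; the --config.test_and_exit flag does this:

./logstash -f logstash.conf --config.test_and_exit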

  3. Install Kibana
    (1) Download the Kibana installation package (version 7.15.2 is used here):
wget https://artifacts.elastic.co/downloads/kibana/kibana-7.15.2-linux-x86_64.tar.gz

(2) Unzip the installation package:

tar -zxvf kibana-7.15.2-linux-x86_64.tar.gz

(3) Modify the config/kibana.yml file and set the address of Elasticsearch:

elasticsearch.hosts: ["http://localhost:9200"]
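
By default Kibana only listens on localhost. If you need to reach it from another machine, you can additionally set the server.host option in the same config/kibana.yml file, for example:

server.host: "0.0.0.0"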

(4) Run Kibana:

cd kibana-7.15.2/bin
./kibana

(5) Visit http://localhost:5601 in a browser. If you can see the Kibana interface, the installation was successful.

3. Use ELK Stack for log analysis
After the ELK Stack is installed, you can start log analysis.

  1. Collect logs
    In the Logstash configuration file, you can collect logs from multiple sources, such as files and network inputs. Modify the input section to point at the correct log sources and their formats.
  2. Processing and forwarding logs
    Logstash is a powerful log processing tool that can process and forward logs through its built-in plugins. In the filter section of the configuration file, you can chain plugins to parse, filter, and format logs (see the configuration sketch after this list).
  3. Storing and Indexing Logs
    In the output section of the Logstash configuration file, you can configure how logs are stored and indexed. Elasticsearch is a distributed search engine that can quickly store and retrieve large amounts of data. By configuring the hosts and index parameters of the elasticsearch output, the processed logs are stored in the corresponding index.
  4. Visualizing and analyzing logs
    Kibana is the visualization tool of ELK Stack. It provides rich charts and dashboards to display and analyze log data. In Kibana, various charts and reports can be customized to meet different needs by creating index patterns, visualizations, and dashboards.
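
As a concrete illustration of these four steps, below is a minimal sketch of a fuller Logstash pipeline: it collects two log sources, parses and timestamps the nginx events, and writes everything to daily indices in Elasticsearch, which can then be explored in Kibana after creating a matching index pattern. The file paths, type names, and index pattern here are assumptions for the example and should be adapted to your own logs:

input {
  file {
    path => "/var/log/nginx/access.log"
    start_position => "beginning"
    type => "nginx-access"
  }
  file {
    path => "/var/log/syslog"
    start_position => "beginning"
    type => "syslog"
  }
}

filter {
  if [type] == "nginx-access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      # use the timestamp from the log line instead of the ingest time
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # one index per day and per source, e.g. logs-nginx-access-2023.07.29
    index => "logs-%{type}-%{+YYYY.MM.dd}"
  }
}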

4. Summary
ELK Stack is a powerful and flexible log analysis platform that can help us collect, process, store, visualize and analyze log data. It only takes a few simple steps to install and configure ELK Stack in a Linux environment, and then you can perform log analysis according to actual needs. In this way, we can better understand and utilize log data to optimize system performance, identify potential problems, and improve user experience.
