This project sets up a robust logging infrastructure to capture and analyze Nginx logs. The setup uses Filebeat to ship logs to the ELK stack, which includes Elasticsearch for storage and search, Logstash for processing, and Kibana for visualization.
- Docker and Docker Compose installed on your system.
- Basic knowledge of Nginx, Elasticsearch, Logstash, and Kibana.
Clone the repository:
git clone https://github.com/yourusername/elk-logging-infrastructure.git
cd elk-logging-infrastructure
Start the ELK stack using Docker Compose:
docker-compose up -d
Verify that the services are running:
docker-compose ps
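The Compose file itself is not shown above; the following is a minimal sketch of what a docker-compose.yml for this stack might look like. The image versions, port mappings, and the disabled-security setting are illustrative assumptions, not project requirements.

```yaml
# Hypothetical docker-compose.yml sketch for a single-node ELK stack.
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      - discovery.type=single-node      # no cluster formation
      - xpack.security.enabled=false    # dev-only: no TLS/auth
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.0
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5044:5044"                     # Beats input
  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.0
    ports:
      - "5601:5601"
```

With a file like this in place, `docker-compose ps` should list all three services as running.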
Configure Filebeat to read Nginx logs:
- Edit the filebeat.yml configuration file to specify the paths to your Nginx log files.
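As a reference point, a filebeat.yml input section for this setup might look like the sketch below. The log paths and the Logstash host are assumptions; adjust them to where your Nginx instance actually writes its logs and to wherever Logstash is reachable.

```yaml
# Sketch of a filebeat.yml for shipping Nginx logs to Logstash.
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/nginx/access.log   # assumed default Nginx log locations
      - /var/log/nginx/error.log

output.logstash:
  hosts: ["localhost:5044"]          # must match the Logstash beats input port
```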
Configure Logstash to process the logs:
- Edit the logstash.conf file to define the input, filter, and output plugins.
Configure Kibana to visualize the logs:
- Access Kibana at http://localhost:5601 and set up your index patterns.
- Start Nginx and generate some logs.
- Filebeat will ship the logs to Logstash.
- Logstash will process the logs and send them to Elasticsearch.
- Use Kibana to visualize and analyze the logs.
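To exercise the pipeline without real traffic, you can append synthetic lines in Nginx's default combined log format to a file Filebeat is watching. The path /tmp/access.log below is a hypothetical example; point your Filebeat input at the same file.

```shell
# Append one sample access-log line in Nginx's default "combined" format.
LOG=/tmp/access.log
printf '%s - - [%s] "GET / HTTP/1.1" 200 612 "-" "curl/8.0"\n' \
  "127.0.0.1" "$(date '+%d/%b/%Y:%H:%M:%S %z')" >> "$LOG"
cat "$LOG"
```

Each run adds one line, which Filebeat should pick up and ship through Logstash into Elasticsearch within a few seconds.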
Contributions are welcome! Please fork the repository and submit a pull request.
This project is licensed under the MIT License.