DevOps – Configure ELK Stack to Monitor Java Application on AWS
Sun, 20 Jul 2025

Here is complete DevOps documentation for configuring the ELK Stack and monitoring a Java application on AWS. It covers setting up Elasticsearch, Logstash, Kibana, and Filebeat, then integrating them with a Java app hosted on an EC2 instance.
The objective is to deploy and configure the ELK Stack (Elasticsearch, Logstash, Kibana) on AWS EC2 instances and use Filebeat to ship logs from a Java application for centralized monitoring and analysis.
[ Java App on EC2 ] ---> [ Filebeat ] ---> [ Logstash ] ---> [ Elasticsearch ] ---> [ Kibana Dashboard ]
AWS Account
2–3 EC2 Instances (Ubuntu 20.04 preferred)
1 for ELK Stack
1 for Java App
Open ports:
Elasticsearch: 9200
Kibana: 5601
Logstash: 5044 (for Filebeat input)
Java application with logs written to a file (e.g., app.log)
IAM roles (if using AWS services like CloudWatch in future)
Security Group to allow communication between instances
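If you manage the Security Group from the CLI, rules like the following open the required ports; the group ID and CIDR ranges here are placeholders for illustration, so scope them to your own VPC and admin network:
# Hypothetical security group ID -- replace with your own
SG_ID=sg-0123456789abcdef0
# Kibana UI (5601) and Elasticsearch HTTP (9200) from your admin IP range
aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port 5601 --cidr 203.0.113.0/24
aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port 9200 --cidr 203.0.113.0/24
# Logstash Beats input (5044) from the Java app instance's network
aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port 5044 --cidr 10.0.0.0/16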
Launch 2 EC2 Ubuntu instances:
Instance A: ELK Stack (t3.medium or higher)
Instance B: Java App
Update packages:
sudo apt update && sudo apt upgrade -y
Install Elasticsearch:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list
sudo apt update && sudo apt install elasticsearch -y
Edit config:
sudo nano /etc/elasticsearch/elasticsearch.yml
Set:
network.host: 0.0.0.0
discovery.type: single-node
Start service:
sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch
Test:
curl http://localhost:9200
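A healthy single-node install answers with a JSON summary similar to the following (abridged; the node name, cluster name, and exact version will differ on your instance):
{
  "name" : "ip-172-31-0-10",
  "cluster_name" : "elasticsearch",
  "version" : { "number" : "7.17.10" },
  "tagline" : "You Know, for Search"
}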
Install Logstash:
sudo apt install logstash -y
Create input config:
sudo nano /etc/logstash/conf.d/beats-input.conf
input {
  beats {
    port => 5044
  }
}
Create filter (optional):
sudo nano /etc/logstash/conf.d/logstash-filter.conf
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
}
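This pattern expects each log line to begin with an ISO-8601 timestamp followed by the log level. As an assumption about the app's logging setup, a Logback pattern such as %d{ISO8601} %level %msg%n produces lines of this shape, which grok splits into timestamp, level, and log_message fields:
2025-07-20 10:15:32,481 INFO Application started on port 8080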
Create output config:
sudo nano /etc/logstash/conf.d/logstash-output.conf
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "java-app-logs"
  }
}
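Before starting the service, the pipeline files can be syntax-checked with Logstash's config test flag (assuming the default Debian package layout):
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit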
Start Logstash:
sudo systemctl start logstash
sudo systemctl enable logstash
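To confirm the Beats input is listening on port 5044 (startup can take a minute or so):
sudo ss -tlnp | grep 5044
sudo journalctl -u logstash --no-pager | tail -n 20   # check for pipeline startup errors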
Install Kibana:
sudo apt install kibana -y
Edit config:
sudo nano /etc/kibana/kibana.yml
Set:
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]
Start Kibana:
sudo systemctl start kibana
sudo systemctl enable kibana
Access Kibana:
Visit http://<ELK_PUBLIC_IP>:5601 in a browser, where <ELK_PUBLIC_IP> is the public IP of the ELK instance.
On the Java application instance (Instance B), assuming the Java app logs to /var/log/java-app/app.log
Install Filebeat:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.17.10-amd64.deb
sudo dpkg -i filebeat-7.17.10-amd64.deb
Edit config:
sudo nano /etc/filebeat/filebeat.yml
Update:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/java-app/app.log
output.logstash:
  hosts: ["<LOGSTASH_PRIVATE_IP>:5044"]
Enable and start:
sudo systemctl enable filebeat
sudo systemctl start filebeat
Test:
sudo filebeat test output
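Once a few log lines have shipped, the index should appear in Elasticsearch; from the ELK instance:
curl "http://localhost:9200/_cat/indices?v" | grep java-app-logs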
Go to Kibana → Management → Index Patterns
Create an index pattern: java-app-logs*
Choose @timestamp as the time filter field
Go to Discover tab to view incoming logs
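In Discover, KQL queries can filter on the fields extracted by the grok filter, for example showing only warnings and errors:
level : "ERROR" or level : "WARN"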
Optional: Create Dashboards and Visualizations
Use Metricbeat to monitor system metrics (CPU, RAM) of the Java EC2 (see the sketch after this list)
Secure ELK stack with SSL and authentication using Nginx reverse proxy or Elastic X-Pack
Configure alerts in Kibana using Watcher or Elastic Alerting
Centralize logs from multiple Java instances
Ship logs from Dockerized apps using Docker logging drivers
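For the Metricbeat item above, a minimal sketch on the Java instance (assuming the Elastic 7.x APT repository from earlier is also configured there):
sudo apt install metricbeat -y
sudo metricbeat modules enable system      # CPU, memory, load, disk, network
sudo nano /etc/metricbeat/metricbeat.yml   # point the output at the ELK instance
sudo systemctl enable metricbeat
sudo systemctl start metricbeat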
| Component | Purpose |
|---|---|
| Elasticsearch | Storage & search engine |
| Logstash | Log processing & filtering |
| Kibana | Visualization & dashboards |
| Filebeat | Lightweight log shipper (agent) |
| Java App | Log generation source |