Logstash doesn't write to logs
Logstash can't write log info because the log file is currently owned by root.
You should change the owner of the log file with the command below:
chown logstash:logstash logstash.log
I assume the reason is that you started Logstash as the root user, whereas the Logstash service runs as the logstash user
(See the contents of this file /etc/init.d/logstash)
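To verify who owns the file before changing it, something like the following can be used. This is only a sketch: a temp file stands in for the real log, which on package installs usually lives under /var/log/logstash/.

```shell
# Sketch only: a temp file stands in for logstash.log.
f=$(mktemp)                  # in practice: e.g. /var/log/logstash/logstash.log
stat -c '%U:%G' "$f"         # prints owner:group; "root:root" means root owns it
# The actual fix (needs root privileges):
# chown logstash:logstash "$f"
rm -f "$f"
```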
Why doesn't logstash produce logs?
First of all, you have a few configuration issues:
- The hosts option of the elasticsearch output should be an array, e.g. hosts => ["myHost:myPort"] (see the doc)
- A file path on Windows using a wildcard should use forward slashes, not backslashes (see this issue)
- Your date filter is looking for a field "logdate" when it should look for the field "TimeStamp" (given your log file)
- One setting I would add for convenience is sincedb_path, as Logstash will not try to parse a file it has already parsed (it checks a .sincedb file, located at $HOME/.sincedb by default, to see whether a file was already parsed; you need to delete it between runs when you test with the same log file)
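So when re-testing with the same log file, remove the sincedb between runs. A minimal sketch, assuming the default location (a custom sincedb_path changes it):

```shell
# Delete the default sincedb so Logstash re-reads files it already parsed.
rm -f "$HOME"/.sincedb*
```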
That's why, after some research (actually a lot, not being a Windows user), I came up with this config that works:
input {
  file {
    path => "C:/some/log/dir/*"
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "NIL" # easier to remove from the current directory, the file will be NIL.sincedb
  }
}
filter {
  grok {
    match => { "message" => "TimeStamp=%{TIMESTAMP_ISO8601:TimeStamp} CorrelationId=%{UUID:correlationId} Level=%{LOGLEVEL:logLevel} Message=%{GREEDYDATA:logMessage}" }
  }
  # set the event timestamp from the log
  # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
  date {
    match => [ "TimeStamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {}
}
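To sanity-check the grok pattern outside Logstash, a rough regex equivalent can be run against a sample line with grep. The sample line and the simplified stand-ins for TIMESTAMP_ISO8601, UUID, LOGLEVEL and GREEDYDATA below are invented for illustration:

```shell
# Invented sample line in the TimeStamp=... CorrelationId=... Level=... Message=... format.
line='TimeStamp=2017-03-24 13:01:55.616 CorrelationId=d9ab68d5-1c7a-4bd2-8b0e-8f1e9a9a6c2b Level=INFO Message=Processing started'

# Simplified equivalents of the grok sub-patterns, chained in the same order.
echo "$line" | grep -qE \
  'TimeStamp=[0-9]{4}-[0-9]{2}-[0-9]{2}[ T][0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3} CorrelationId=[0-9a-fA-F-]{36} Level=[A-Z]+ Message=.+' \
  && echo "pattern matches"
```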
Logstash logs are read but are not pushed to elasticsearch
Fixed the issue by using the json codec instead of json_lines, and also removing start_position, ignore_older and sincedb_path:
input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
  }
}
output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
Also, the json_lines codec seems to be incompatible with the file input (the \n separator does not work as expected).
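With the json codec on a file input, each line of the input file is expected to be one complete JSON object. A sketch generating such a file (the path and the fields are invented; a temp file stands in for /etc/logstash/input.log):

```shell
# One complete JSON object per line, as codec => "json" on a file input expects.
f=$(mktemp)                      # stands in for /etc/logstash/input.log
printf '%s\n' \
  '{"level":"info","message":"service started"}' \
  '{"level":"error","message":"connection refused"}' > "$f"
cat "$f"
rm -f "$f"
```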