Old Logs Are Not Imported into ES by Logstash

When Logstash reads an input log file, it keeps a record of how far it has read in that file; this record is called the sincedb.

The sincedb_path option controls where the sincedb database (which keeps track of the current position of monitored log files) is written.
By default, Logstash writes sincedb files to a path matching "$HOME/.sincedb*".

So, if you want to import old log files, you must delete all the .sincedb* files in your $HOME directory.
Then you need to set

start_position => "beginning"

in the file input of your configuration file.
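
As a minimal sketch (the path below is only an illustration, not from the original question), such a file input could look like the following; setting sincedb_path to /dev/null is an alternative to deleting the sincedb files, since it stops Logstash from persisting the read position at all:

input {
  file {
    # illustrative path; point this at your own old log file
    path => "/var/log/myapp/old.log"
    # read from the start of the file instead of tailing the end
    start_position => "beginning"
    # optional: never remember the read position, so the file is re-read on every run
    sincedb_path => "/dev/null"
  }
}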

Hope this can help you.

How to process old Logs with Logstash

Try this:

file {
  path => "/Users/Auyer/ELK/ServerLogs/cowrie.json"
  start_position => "beginning"
  sincedb_path => "/dev/null"
  codec => json_lines
  type => "cowrie"
}

I had a similar problem a few days back. Setting sincedb_path to /dev/null fixed it: Logstash then never persists the read position, so the file is read from the beginning on every run.

Is it possible Logstash pushes the same content from a log file to Elasticsearch

The file input documentation contains a whole paragraph about how well it handles rotation:

File rotation is detected and handled by this input, regardless of whether the file is rotated via a rename or a copy operation. To support programs that write to the rotated file for some time after the rotation has taken place, include both the original filename and the rotated filename (e.g. /var/log/syslog and /var/log/syslog.1) in the filename patterns to watch (the path option).

Since tail mode is the default, your path parameter should use a glob pattern to catch all of those files, exactly as you did. So you're all set. Happy tailing!
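
For reference, a sketch of such a glob in the file input (the syslog paths are only illustrative, borrowed from the documentation's example above):

input {
  file {
    # the glob matches the live file and its rotated siblings,
    # e.g. /var/log/syslog and /var/log/syslog.1
    path => "/var/log/syslog*"
  }
}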

Logstash is skipping records while inserting records into Elasticsearch

With jdbc_paging_enabled, Logstash fetches the result set page by page, and PostgreSQL does not guarantee that rows come back in the same order for each page query, so rows can shift between pages and be skipped. Add an ORDER BY clause to the query and it will solve your issue.
You can try the configuration below; it works:

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://ip:5432/dbname"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
    jdbc_driver_library => "/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_paging_enabled => true
    jdbc_page_size => 25000
    statement => "select * from source_table order by id desc"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "sample"
    document_type => "docs"
    document_id => "%{id}"
  }
}

Logstash not importing files due to missing index error

It seems like nothing is making it into Elasticsearch at the moment.

For the current version of ES (0.90.5), I had to use the elasticsearch_http output. The elasticsearch output seemed to be too closely tied to 0.90.3.

For example, here is my config for shipping log4j-format logs to Elasticsearch:

input {
  file {
    path => "/srv/wso2/wso2am-1.4.0/repository/logs/wso2carbon.log"
    path => "/srv/wso2/wso2as-5.1.0/repository/logs/wso2carbon.log"
    path => "/srv/wso2/wso2is-4.1.0/repository/logs/wso2carbon.log"
    type => "log4j"
  }
}

output {
  stdout { debug => true debug_format => "ruby" }

  elasticsearch_http {
    host => "localhost"
    port => 9200
  }
}

For my file format, I have a grok filter as well - to parse it properly.

filter {
  if [message] !~ "^[ \t\n]+$" {
    # if the line is a log4j type
    if [type] == "log4j" {
      # parse out fields from log4j line
      grok {
        match => [ "message", "TID:%{SPACE}\[%{BASE10NUM:thread_name}\]%{SPACE}\[%{WORD:component}\]%{SPACE}\[%{TIMESTAMP_ISO8601:timestamp}\]%{SPACE}%{LOGLEVEL:level}%{SPACE}{%{JAVACLASS:java_file}}%{SPACE}-%{SPACE}%{GREEDYDATA:log_message}" ]
        add_tag => ["test"]
      }

      if "_grokparsefailure" not in [tags] {
        mutate {
          replace => ["message", " "]
        }
      }
      multiline {
        pattern => "^TID|^ $"
        negate => true
        what => "previous"
        add_field => {"additional_log" => "%{message}"}
        remove_field => ["message"]
        remove_tag => ["_grokparsefailure"]
      }
      mutate {
        strip => ["additional_log"]
        remove_tag => ["test"]
        remove_field => ["message"]
      }
    }
  } else {
    drop {}
  }
}

Also, I would install the elasticsearch-head plugin to monitor your content in Elasticsearch, so you can easily verify the data and the state it is in.


