I'm new to the ELK stack. I've got Logstash sending data from MySQL to Elasticsearch, and in the terminal it looks like all 40,000 records were sent, but when I look in Kibana only 200 records show up.
Here is the Logstash configuration file I used:
# file: simple-out.conf
input {
  jdbc {
    # MySQL JDBC connection string to our database, tweets_articles
    jdbc_connection_string => "jdbc:mysql://localhost:3306/tweets_articles"
    # The path to the downloaded JDBC driver
    jdbc_driver_library => "/etc/elasticsearch/elasticsearch-jdbc-2.3.3.1/lib/mysql-connector-java-5.1.38.jar"
    # The name of the MySQL driver class
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # The user we wish to execute our statement as
    jdbc_user => "**"
    jdbc_password => "***"
    # Our query
    statement => "SELECT * FROM tweets"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
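To rule out Kibana's time filter, I also want to count the documents directly in Elasticsearch. Something like this should do it (I'm assuming the default logstash-* index name here, since I don't set an index in the output):

curl -XGET 'localhost:9200/logstash-*/_count?pretty'

That should at least tell me how many documents actually made it into the index, independent of whatever time range Kibana is showing.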
Could this be an issue with the dates? In MySQL, when I print the time of a record, it's in this format:
+---------------------+
| PUBLISHED_AT |
+---------------------+
| 2017-03-06 03:43:51 |
| 2017-03-06 03:43:45 |
| 2017-03-06 03:43:42 |
| 2017-03-06 03:43:30 |
| 2017-03-06 03:43:00 |
+---------------------+
5 rows in set (0.00 sec)
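That sample came from a quick query along these lines (just pulling the PUBLISHED_AT column from my tweets table):

SELECT PUBLISHED_AT FROM tweets LIMIT 5;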
But when I look at the output from Logstash in the terminal, it looks like this:
"id" => 41298,
"author" => "b'Terk'",
"retweet_count" => "0",
"favorite_count" => "0",
"followers_count" => "49",
"friends_count" => "23",
"body" => "create an ad",
"published_at" => "2017-03-06T07:30:47.000Z",
"@version" => "1",
"@timestamp" => "2017-03-06T06:44:04.756Z"
Can anyone see why I'm not getting all 40,000 records in?
Thanks.