Wednesday, 29 June 2016

Logstash: MySQL to Elasticsearch (large table)?

I am attempting to import a rather chunky MySQL table into Elasticsearch. It has about 4 million rows across two columns (VARCHAR(250) and INT(20)).

When I run Logstash with this logstash.conf and add a LIMIT 0,100 to the SQL statement, the import runs without any problems: all of the rows are printed by Logstash in the terminal, and I can then see them in the relevant index in Elasticsearch.
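For reference, the limited version of the statement was along these lines (same columns as in the config below):

    statement => "SELECT name, id FROM master_table LIMIT 0,100"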

When I try to run all of the rows through Logstash at once, it outputs:

Settings: Default pipeline workers: 1
Pipeline main started

And nothing more happens.

How do I add such a large table into Elasticsearch?

Here's my logstash.conf script:

input {
  jdbc {
    jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://<ip number>:3306/database"
    jdbc_validate_connection => true
    jdbc_user => "elastic"
    jdbc_password => "password"
    # run the query every minute
    schedule => "* * * * *"
    statement => "SELECT name, id FROM master_table"
  }
}

output {
  elasticsearch {
    index => "search"
    document_type => "name"
    document_id => "%{id}"
    hosts => "127.0.0.1:9200"
  }
  stdout { codec => json_lines }
}
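One option I have come across is the paging support in the jdbc input (jdbc_paging_enabled / jdbc_page_size), which is supposed to split the query into smaller batches instead of pulling all 4 million rows back in one result set. A rough sketch of what I think that input would look like (the page size of 50000 is just a guess on my part):

  jdbc {
    jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://<ip number>:3306/database"
    jdbc_user => "elastic"
    jdbc_password => "password"
    # fetch the table in pages rather than as one huge result set
    jdbc_paging_enabled => true
    jdbc_page_size => 50000
    statement => "SELECT name, id FROM master_table"
  }

Is that the right approach here, or is there a better way to load a table of this size?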
