by Anurag Srivastava, Mar 9, 2019, 8:20:38 AM | 3 minutes |

Configuring Logstash to send MongoDB data into Elasticsearch

In this blog, I will explain how to push your MongoDB data into Elasticsearch. You may be wondering what the benefit of doing this is, so let me describe a couple of scenarios where you may want to push MongoDB data into Elasticsearch:

  • You want to apply robust full-text search to your data.
  • You want to collect your data in a central Elasticsearch cluster for analysis and visualization.

To push the data into Elasticsearch, you need the "logstash-input-mongodb" input plugin for Logstash. So let us see how to install this input plugin and configure Logstash to push MongoDB data into Elasticsearch.

If you want to learn the basics of Logstash, please refer to the "Introduction to Logstash" blog, where I have explained them.

First, log in as the root user:

sudo su


Then go to the Logstash installation directory (based on your operating system):

cd /usr/share/logstash


Now execute the following command:

bin/logstash-plugin install logstash-input-mongodb


You will get the following response:


Validating logstash-input-mongodb
Installing logstash-input-mongodb
Installation successful
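
You can confirm that the plugin is now available by listing the installed plugins (assuming you are still in the Logstash installation directory):

```shell
# Filter the installed-plugin list for the MongoDB input plugin
bin/logstash-plugin list | grep mongodb
```

If the installation succeeded, this should print logstash-input-mongodb.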


Now create a configuration file, for example /etc/logstash/conf.d/mongodata.conf, to take MongoDB data as input:

input {
        mongodb {
                uri => 'mongodb://username:password@anurag-00-00-no6gn.mongodb.net:27017/anurag?ssl=true'
                placeholder_db_dir => '/opt/logstash-mongodb/'
                placeholder_db_name => 'logstash_sqlite.db'
                collection => 'users'
                batch_size => 5000
        }
}
filter {

}
output {
        stdout {
                codec => rubydebug
        }
        elasticsearch {
                action => "index"
                index => "mongo_log_data"
                hosts => ["localhost:9200"]
        }
}
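
Before starting the pipeline, you can ask Logstash to validate the configuration file and exit; a malformed config file is a common source of startup errors:

```shell
# Parse the config file and exit without starting the pipeline
bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/mongodata.conf
```

Logstash reports whether the configuration is valid and then exits, without fetching any data.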

After creating the configuration file, execute the following command to pull the data:

bin/logstash -f /etc/logstash/conf.d/mongodata.conf

This command starts fetching data from the "users" collection, as that is the collection name given in the configuration file. It also creates an Elasticsearch index named "mongo_log_data" and pushes the MongoDB data into it.

You will also see the documents on the terminal, since the configuration has two output blocks: stdout for terminal output and elasticsearch for pushing the data into the "mongo_log_data" index of Elasticsearch.
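
To verify that documents are actually arriving, you can query the index directly (assuming Elasticsearch is listening on localhost:9200, as in the output block above):

```shell
# Ask Elasticsearch how many documents the new index holds
curl -s 'http://localhost:9200/mongo_log_data/_count?pretty'
```

The response is a small JSON document whose "count" field should grow as Logstash pushes batches.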

Other Blogs on Elastic Stack:
Introduction to Elasticsearch

Elasticsearch Installation and Configuration on Ubuntu 14.04
Log analysis with Elastic stack 
Elasticsearch Rest API
Basics of Data Search in Elasticsearch
Wildcard and Boolean Search in Elasticsearch
Configure Logstash to push MySQL data into Elasticsearch
Load CSV Data into Elasticsearch
Metrics Aggregation in Elasticsearch
Bucket Aggregation in Elasticsearch
How to create Elasticsearch Cluster

In case of any doubt, please leave your comments. You can also follow me on Twitter: https://twitter.com/anubioinfo


If you found this article interesting, you can explore the “Mastering Kibana 6.0”, “Kibana 7 Quick Start Guide”, “Learning Kibana 7”, and “Elasticsearch 7 Quick Start Guide” books to get more insight into the Elastic Stack, how to perform data analysis, and how you can create dashboards for key performance indicators using Kibana.

About Author

Anurag Srivastava

Author of the “Mastering Kibana 6.x”, “Kibana 7 Quick Start Guide”, “Learning Kibana 7”, and “Elasticsearch 7 Quick Start Guide” books, and AWS Certified Solutions Architect.


Comments (13)

  • Johana Hernandez
    Jun 14, 2019, 3:21:09 PM

    Hello Anurag, I appreciate your input on this topic. I would like to know if you could help me with an error when executing the integration of MongoDB to ES. I followed your recommendation in the configuration of the file, but at the moment of reading from Mongo it remains in an infinite reading cycle. Annexed are the configuration file and the fragment of the cycle.

    input { mongodb { uri => 'mongodb://user:password@server:27017/admin?connectTimeoutMS=10000&authSource=admin&authMechanism=SCRAM-SHA-1' placeholder_db_dir => '/Logstash/logstash-7.1.1/db_dir' placeholder_db_name => 'logstash_sqlite.db' collection => 'Monedas' batch_size => 3 generateId => true } } filter { mutate { remove_field => [ "_id" ] } } output { elasticsearch { action => "index" index => "indice_prueba" hosts => ["localhost:9200"] } }

    This fragment of error repeats itself incessantly:

    [2019-06-14T10:32:13.006000 #35556] DEBUG -- : MONGODB | server:27017 | admin.listCollections | SUCCEEDED | 0.181s
    D, [2019-06-14T10:32:13.341000 #35556] DEBUG -- : MONGODB | server:27017 | admin.listCollections | STARTED | {"listCollections"=>1, "cursor"=>{}, "nameOnly"=>true, "filter"=>{"name"=>{"$not"=>/system\.|\$/}}}

    Do you have any idea why this happens? Regards!

  • Anurag Srivastava
    Aug 14, 2019, 3:55:49 AM

    @johana: Please provide the Logstash configuration file content. Although it is quite late, please let me know if this issue was resolved.

  • PADHIRE REDDY
    Jun 15, 2020, 3:52:45 PM

    @anurag could you please help me out, I'm getting the same error. This fragment of error repeats itself incessantly:

    [2019-06-14T10:32:13.006000 #35556] DEBUG -- : MONGODB | server:27017 | admin.listCollections | SUCCEEDED | 0.181s
    D, [2019-06-14T10:32:13.341000 #35556] DEBUG -- : MONGODB | server:27017 | admin.listCollections | STARTED | {"listCollections"=>1, "cursor"=>{}, "nameOnly"=>true, "filter"=>{"name"=>{"$not"=>/system\.|\$/}}}

  • Anurag Srivastava
    Jun 15, 2020, 5:51:32 PM

    @padhire: Please share the Logstash configuration file content.

  • PADHIRE REDDY
    Jun 16, 2020, 4:14:18 AM

    @Anurag this is my config file input { mongodb { uri => "mongodb://localhost:27017/test" placeholder_db_dir => "D:/elk/logstash-7.7.1/place_holder_db_dir" placeholder_db_name => "logstash_sqlite.db" collection => "employee" batch_size => 5 } } output { stdout { codec => rubydebug } elasticsearch { hosts => "localhost:9200" index => "employeemd" "document_type" => "data" } }

  • PADHIRE REDDY
    Jun 16, 2020, 4:46:35 AM

    input { mongodb { uri => "mongodb://localhost:27017/test" placeholder_db_dir => "D:/elk/logstash-7.7.1/place_holder_db_dir" placeholder_db_name => "logstash_sqlite.db" collection => "employee" batch_size => 50 } } filter { } output { stdout { codec => rubydebug } elasticsearch { action => "index" hosts => "localhost:9200" index => "employeemd" "document_type" => "data" } }

  • Anurag Srivastava
    Jun 16, 2020, 3:58:53 PM

    Hey, this seems OK to me. Have you tried a new collection after adding some data in MongoDB? Probably we are getting success from the MongoDB side but it is not picking up the records, so it is good to try a new collection, or to add some data to the existing collection after executing the Logstash configuration.

  • Todd K
    Jun 22, 2020, 8:12:03 PM

    Hi, here is my input with the associated error. Any help you can provide would be greatly appreciated!

    input { mongodb { uri => 'mongodb://localhost:27017/test_phoenix' placeholder_db_dir => 'C:/Users/fdub_user/Documents/logstash-7.6.2' placeholder_db_name => 'logstash_sqlite.db' collection => 'cell_towers' batch_size => 5000 } } filter { } output { stdout { codec => rubydebug } elasticsearch { action => "index" index => "mongo_cell_towers" hosts => ["localhost:9200"] user => "elastic" password => "1234!@#$" } }

    Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"{\" at line 2, column 13 (byte 22) after input {\r\n uri ", :backtrace=>["C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2580:in `map'", "C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:161:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/java_pipeline.rb:27:in `initialize'", "C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "C:/Users/fdub_user/Documents/logstash-7.6.2/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}

  • Anurag Srivastava
    Jun 24, 2020, 4:57:02 AM

    Hi Todd, can you please check if your configuration file is proper: sudo bin/logstash --config.test_and_exit -f <path_to_config_file>

  • Todd K
    Jun 25, 2020, 8:25:22 AM

    Thanks Anurag, I was able to get it to start pulling from Mongo and pushing to ELK, but I got an error where the _id from Mongo wasn't allowing it to ingest the data. I then changed it; now it says it connects to Mongo successfully, but it does not push the data to ELK.

  • Anurag Srivastava
    Jun 25, 2020, 9:16:08 AM

    Have you tried adding some fresh entries in MongoDB after executing the configuration? Probably Logstash has updated the pointer. Try once and let me know if it works; in case of any issues, please share the error message you are getting.

  • maniprakash muthukrishnan
    Jun 29, 2020, 9:40:10 PM

    Hi Anurag, I am also facing the same issue. It started reading from Mongo, remains in an infinite reading cycle, and the index is created without any documents.

    D, [2020-06-29T16:31:01.055046 #105989] DEBUG -- : MONGODB | [77] 127.0.0.1:27017 | test.find | SUCCEEDED | 0.002s
    D, [2020-06-29T16:31:01.090770 #105989] DEBUG -- : MONGODB | [78] 127.0.0.1:27017 #1 | test.listCollections | STARTED | {"listCollections"=>1, "cursor"=>{}, "nameOnly"=>true, "$db"=>"test", "lsid"=>{"id"=><BSON::Binary:0x2022 type=uuid data=0xed5399691cb74d71...>}}
    D, [2020-06-29T16:31:01.094120 #105989] DEBUG -- : MONGODB | [78] 127.0.0.1:27017 | test.listCollections | SUCCEEDED | 0.002s
    D, [2020-06-29T16:31:06.133707 #105989] DEBUG -- : MONGODB | [79] 127.0.0.1:27017 #1 | test.find | STARTED | {"find"=>"p1", "filter"=>{"_id"=>{"$gt"=>BSON::ObjectId('5efa152e4c3cc8a8b1ad6d7e')}}, "limit"=>5, "$db"=>"test", "lsid"=>{"id"=><BSON::Binary:0x2022 type=uuid data=0xed5399691cb74d71...>}}
    D, [2020-06-29T16:31:06.145513 #105989] DEBUG -- : MONGODB | [79] 127.0.0.1:27017 | test.find | SUCCEEDED | 0.009s
    D, [2020-06-29T16:31:06.190825 #105989] DEBUG -- : MONGODB | [80] 127.0.0.1:27017 #1 | test.listCollections | STARTED | {"listCollections"=>1, "cursor"=>{}, "nameOnly"=>true, "$db"=>"test", "lsid"=>{"id"=><BSON::Binary:0x2022 type=uuid data=0xed5399691cb74d71...>}}
    D, [2020-06-29T16:31:06.218383 #105989] DEBUG -- : MONGODB | [80] 127.0.0.1:27017 | test.listCollections | SUCCEEDED | 0.017s

  • Leonardo Percoco
    Jul 1, 2020, 12:01:17 PM

    I have the same problem as maniprakash...


Related Blogs

Configuring Logstash to push MySQL data into Elasticsearch

Feb 9, 2019, 12:06:18 PM | Anurag Srivastava

Introduction to Logstash

Dec 20, 2019, 11:38:31 AM | Anurag Srivastava

Loading CSV Data into Elasticsearch

Feb 9, 2019, 6:34:22 PM | Anurag Srivastava

Execute Commands on Remote Machines using sshpass

Jul 16, 2018, 5:00:02 PM | Anurag Srivastava

Log analysis with Elastic stack

Jan 31, 2018, 6:11:29 AM | Anurag Srivastava

Snapshot and Restore Elasticsearch Indices

Sep 16, 2019, 5:55:06 AM | Anurag Srivastava

How to create Elasticsearch Cluster

Apr 6, 2019, 8:41:41 PM | Anurag Srivastava

Introduction to Elastic APM

Jan 7, 2020, 7:15:34 PM | Anurag Srivastava

Why monitoring is important?

Jan 6, 2020, 7:30:13 PM | Anurag Srivastava

Configuring Django application with Elastic APM

Jan 14, 2020, 10:22:34 AM | Anurag Srivastava

Top Blogs

Configure SonarQube Scanner with Jenkins

Jun 21, 2018, 4:58:11 AM | Anurag Srivastava

Deploying Angular code using Python script

Jun 26, 2018, 4:50:18 PM | Anurag Srivastava

Configure Jenkins for Automated Code Deployment

Jun 13, 2018, 3:44:01 PM | Anurag Srivastava

Execute Commands on Remote Machines using sshpass

Jul 16, 2018, 5:00:02 PM | Anurag Srivastava

SonarQube installation on Ubuntu

May 12, 2018, 4:47:07 PM | Anurag Srivastava

Configuring Logstash to send MongoDB data into Elasticsearch

Mar 9, 2019, 8:20:38 AM | Anurag Srivastava

Wildcard and Boolean Search in Elasticsearch

Aug 10, 2018, 7:14:40 PM | Anurag Srivastava

Why SonarQube is important for IT projects ?

Apr 24, 2018, 2:52:28 PM | Anurag Srivastava

Elasticsearch Rest API

Jul 31, 2018, 6:16:42 PM | Anurag Srivastava

Analyze your project with SonarQube

Jun 2, 2018, 10:49:54 AM | Anurag Srivastava