Storm-kafka-mongoDB integration

I am continuously reading about 500 MB of random tuples from a Kafka producer, and in a Storm topology I am inserting them into MongoDB using the Mongo Java driver. The problem is that I am getting really low throughput, around 4-5 tuples per second.

Without the DB insert, if I write a simple print statement instead, I get a throughput of 684 tuples per second. I am planning to run 1 million records through Kafka and check the throughput with the Mongo insert.

I tried tuning with the setMaxSpoutPending and setMessageTimeoutSecs parameters in the Kafka config:



    final SpoutConfig kafkaConf = new SpoutConfig(zkrHosts, kafkaTopic, zkRoot, clientId);
    kafkaConf.ignoreZkOffsets = false;
    kafkaConf.useStartOffsetTimeIfOffsetOutOfRange = true;
    kafkaConf.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
    kafkaConf.stateUpdateIntervalMs = 2000;
    kafkaConf.scheme = new SchemeAsMultiScheme(new StringScheme());

    final TopologyBuilder topologyBuilder = new TopologyBuilder();
    topologyBuilder.setSpout("kafka-spout", new KafkaSpout(kafkaConf), 1);
    topologyBuilder.setBolt("print-messages", new MyKafkaBolt()).shuffleGrouping("kafka-spout");

    Config conf = new Config();
    conf.setDebug(true);
    conf.setMaxSpoutPending(1000);
    conf.setMessageTimeoutSecs(30);


The execute method of the bolt:



    JSONObject jObj = new JSONObject();
    jObj.put("key", input.getString(0));

    if (jObj != null && jObj.size() > 0) {
        final DBCollection quoteCollection = dbConnect.getConnection().getCollection("stormPoc");
        if (quoteCollection != null) {
            BasicDBObject dbObject = new BasicDBObject();
            dbObject.putAll(jObj);
            quoteCollection.insert(dbObject);
            // logger.info("inserted in Collection !!!");
        } else {
            logger.info("Error while inserting data in DB!!!");
        }
    }
    collector.ack(input);









Tags: mongodb apache-kafka performance-testing apache-storm






asked 2 days ago by PPB (edited 2 days ago)






















1 Answer
































          There is a storm-mongodb module for integration with Mongo. Does it not do the job? https://github.com/apache/storm/tree/b07413670fa62fec077c92cb78fc711c3bda820c/external/storm-mongodb



You shouldn't use storm-kafka for Kafka integration; it is deprecated. Use storm-kafka-client instead.

Setting conf.setDebug(true) will also hurt your throughput, as Storm logs a fairly large amount of text per tuple.
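Putting those points together, a minimal wiring sketch might look like the following. This assumes the storm-kafka-client and storm-mongodb artifacts are on the classpath; the broker address, topic name, Mongo URL, and the "key" field are placeholders, not taken from the question, and SimpleMongoMapper/MongoInsertBolt come from the storm-mongodb module:

```java
// Sketch only: requires the storm-kafka-client and storm-mongodb dependencies.
// Broker/Mongo addresses and the mapped field name are illustrative placeholders.
KafkaSpoutConfig<String, String> spoutConf =
        KafkaSpoutConfig.builder("localhost:9092", "my-topic").build();

// storm-mongodb maps tuple fields to document fields and handles the insert.
MongoMapper mapper = new SimpleMongoMapper().withFields("key");
MongoInsertBolt mongoBolt =
        new MongoInsertBolt("mongodb://localhost:27017/stormPocDb", "stormPoc", mapper);

TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConf), 1);
builder.setBolt("mongo-bolt", mongoBolt).shuffleGrouping("kafka-spout");

Config conf = new Config();
conf.setDebug(false);        // per-tuple debug logging throttles throughput
conf.setMaxSpoutPending(1000);
```

This replaces both the hand-rolled Mongo insert in the bolt and the deprecated storm-kafka spout, while keeping debug logging off for the throughput run.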






answered 2 days ago by Stig Rohde Døssing




























