
I'm new here and also new to RethinkDB. First of all, sorry for my bad English. I have a question about RethinkDB's update performance.

I'm using Node.js with RethinkDB's native JS driver. I read data from a file stream, filter it into batches of 1,000 rows, and send each batch to RethinkDB as updates. A Node.js server with Socket.IO then reacts to the changefeed.

It takes about 1 second per 1,000 update transactions (on an SSD drive).

r.table('mds')
  .getAll(data.symbol, { index: 'symbol' })
  .update({ price: data.price,
            update_date: moment().format('YYYY-MM-DD HH:mm:ss') },
          { returnChanges: false })
  .run(conn, function (err, result) {
    if (err) throw err;
    // ...
  });
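For context, one common way to cut per-transaction overhead in situations like this is to send each batch of 1,000 rows as a single ReQL query, using `r.expr(...).forEach(...)`, instead of issuing one `update` per row. A hedged sketch (the function names, the `rows` shape, and passing the timestamp in as a parameter are my assumptions, not from the question):

```javascript
// Sketch: one batched update per 1,000-row chunk instead of 1,000
// single-row updates. Assumes `r` is the rethinkdb driver, `conn` is an
// open connection, and `rows` looks like [{ symbol: '...', price: 1.23 }].
function batchUpdate(r, conn, rows, updateDate, cb) {
  return r.expr(rows).forEach(function (row) {
    return r.table('mds')
      .getAll(row('symbol'), { index: 'symbol' })
      .update({ price: row('price'), update_date: updateDate },
              { returnChanges: false });
  }).run(conn, cb);
}

// Pure helper: split the filtered stream data into batches of a given size.
function chunk(arr, size) {
  var out = [];
  for (var i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}
```

Usage would be something like `chunk(filteredRows, 1000).forEach(batch => batchUpdate(r, conn, batch, now, cb))`; whether this helps depends on how much of the 1 s/1,000 cost is per-query round-trip overhead versus actual write cost.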

Is this normal update performance for RethinkDB? Could it be faster, or am I doing something wrong with the query or configuration?

Pikachu
  • Did you ever find an answer to this question? I am also facing the same issue of slow update queries. – Dipak Feb 13 '18 at 06:13

1 Answer


How many rows are returned by one of those getAll calls? Depending on the number of rows modified in each transaction, 1000 transactions per second might or might not be reasonable.

If the number of rows in each transaction is small, you should probably be getting better performance. One thing you could try is turning on soft durability for the writes. If that doesn't help (or if you need hard durability), the only other thing to do would be to add more RethinkDB servers to your cluster and shard your table across them.
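Soft durability can be enabled per query via the `durability` optarg on the write (or on `run`). A minimal sketch of the suggestion above, assuming the same table and an open `conn` (the function name and passing the timestamp as a parameter are illustrative):

```javascript
// Sketch: acknowledge the write once it is in memory rather than waiting
// for it to be flushed to disk. Assumes `r` (rethinkdb driver) and an
// open connection `conn`; not a complete program.
function softUpdate(r, conn, data, updateDate, cb) {
  return r.table('mds')
    .getAll(data.symbol, { index: 'symbol' })
    .update({ price: data.price, update_date: updateDate },
            { durability: 'soft', returnChanges: false })
    .run(conn, cb);
}
```

The trade-off is that acknowledged writes can be lost if the server crashes before flushing, so this only fits workloads that can tolerate losing the most recent updates.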

mlucy
  • "How many rows are returned by one of those getAll calls?" – It's about 2k rows. I cannot use soft durability because I need the changefeed, and then it would not return the results. I'm also running RethinkDB in a Docker container, so I will set up another container to test a cluster. Thank you for your suggestion. – Pikachu Jun 23 '16 at 02:43