
I want to send 1000 records at once, in batches of 500. Once they are sent, I want the tail recursion to continue. With `numberOfEventsPerSecond = 1000` and `putRecordLimit = 500`, the recursive call runs before the `futureCall` futures complete. Is there any way to make sure `futureCall` has completed before `sendBatches` is called again?



  @scala.annotation.tailrec
  final def sendBatches(
    buffer: Seq[File],
    numberOfEventsPerSecond: Int
  ): Seq[PutRecordsRequestEntry] = {
    val (listToSend, remaining) = buffer.splitAt(numberOfEventsPerSecond)
    val listRes = listToSend
      .grouped(putRecordLimit)
      .toList
      .filter(_.nonEmpty)

    // Fires off all batches; does NOT wait for the futures to complete
    listRes.map { list =>
      futureCall(list.filter(_ != null)) map { putDataResult =>
        println("Sent")
      }
    }
    if (remaining.nonEmpty) sendBatches(remaining, numberOfEventsPerSecond)
    else Seq.empty
  }
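For context, here is a minimal runnable sketch of the behaviour being asked for: each batch is forced to complete before the loop recurses. `futureCall`, the record type, and `sendBatchesBlocking` are stand-ins for the real code, which is not shown in full.

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration.Duration
import ExecutionContext.Implicits.global

// Hypothetical stand-ins for the asker's futureCall and record type
val sent = scala.collection.mutable.ArrayBuffer.empty[Int]
def futureCall(batch: Seq[String]): Future[Unit] =
  Future { sent.synchronized { sent += batch.size } }

val putRecordLimit = 500

@scala.annotation.tailrec
def sendBatchesBlocking(buffer: Seq[String], numberOfEventsPerSecond: Int): Unit =
  if (buffer.nonEmpty) {
    val (listToSend, remaining) = buffer.splitAt(numberOfEventsPerSecond)
    // Block until each batch in this window is sent before recursing
    listToSend.grouped(putRecordLimit).foreach { batch =>
      Await.result(futureCall(batch), Duration.Inf)
    }
    sendBatchesBlocking(remaining, numberOfEventsPerSecond)
  }

sendBatchesBlocking((1 to 1000).map(_.toString), 1000)
println(sent.toList)  // List(500, 500)
```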

Avenger
    **Lists** & **Futures** aren't a good idea when you want batching and parallelism and things like that. I would suggest you to take a look to streaming libraries, like `Akka Streams`, `fs2`, `monix` or `zio zstreams`. – Luis Miguel Mejía Suárez Apr 22 '20 at 12:40

1 Answer


I guess what you want is to use the future synchronously. You can do this with `Await.result(future, duration)`:

import scala.concurrent.Await
import scala.concurrent.duration.Duration

val future = futureCall(list.filter(_ != null))
val futureResult = Await.result(future, Duration.Inf)
sendBatches(remaining, numberOfEventsPerSecond)

Await.result should be used very carefully, and only when it is absolutely necessary.

More information about futures: https://docs.scala-lang.org/overviews/core/futures.html

More information about why you should avoid Await.result: How risky is it to call Await.result on db calls
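If you want to avoid blocking entirely, the same sequencing can be expressed by chaining the futures with `flatMap`, so each batch is sent only after the previous one completes. This is a sketch with a hypothetical `futureCall` that just returns the batch size; only the final `Await.result` at the edge of the program blocks.

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration.Duration
import ExecutionContext.Implicits.global

// Hypothetical stand-in for the asker's futureCall
def futureCall(batch: Seq[String]): Future[Int] = Future(batch.size)

// Chain the batches with flatMap: each send starts only after the previous one completes
def sendAll(records: Seq[String], putRecordLimit: Int): Future[Vector[Int]] =
  records.grouped(putRecordLimit).foldLeft(Future.successful(Vector.empty[Int])) {
    (acc, batch) => acc.flatMap(results => futureCall(batch).map(results :+ _))
  }

val results = Await.result(sendAll((1 to 1000).map(_.toString), 500), Duration.Inf)
println(results)  // Vector(500, 500)
```

The `foldLeft` builds one future that runs the batches strictly in order, which keeps the recursion (and the blocking) out of the hot path.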

Kobotan