I need to send 1 million HTTP requests concurrently, in batches, and read the responses, with no more than 100 requests in flight at a time.

Which approach is better, recommended, idiomatic?

  • Send 100, wait for all of them to finish, send another 100, wait for those, and so on.

  • Send 100. As soon as any request among the 100 finishes, add a new one into the pool: “done — add a new one; done — add a new one”. As a stream.
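The second option (a sliding window that is kept full) can be sketched with a fixed pool of asyncio workers draining one shared iterator. This is only an illustration under assumptions: `fetch` here is a hypothetical stand-in for a real HTTP call (in practice you would use an HTTP client such as aiohttp), and the request count is kept small:

```python
import asyncio

async def fetch(i: int) -> int:
    # Hypothetical stand-in for a real HTTP request (e.g. via aiohttp).
    await asyncio.sleep(0)
    return i

async def run_all(n_requests: int, limit: int = 100) -> list[int]:
    ids = iter(range(n_requests))
    results: list[int] = []

    async def worker() -> None:
        # Each worker starts its next request the moment its previous
        # one finishes, so up to `limit` requests stay in flight at
        # all times -- "done, add a new one", rather than lockstep
        # batches that drain to zero before refilling.
        for i in ids:
            results.append(await fetch(i))

    # `limit` workers sharing one iterator form the sliding window.
    await asyncio.gather(*(worker() for _ in range(limit)))
    return results

results = asyncio.run(run_all(1_000, limit=100))
```

A worker pool like this also avoids creating a million task objects up front, which matters at the scale in the question; an `asyncio.Semaphore` around each request is a common alternative when creating all the tasks at once is acceptable.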

  • catacomb@beehaw.org · 9 months ago

    Where did you get 100 from? I’m just asking if it’s a real limit or a guess at “some manageable number” under one million.

    It can be worth experimenting with and tuning this value. You might even find that fewer than 100 works better.