I need to send 1 million HTTP requests concurrently, in batches, and read the responses, with no more than 100 requests in flight at a time.

Which approach is better, recommended, idiomatic?

  • Send 100 requests, wait for them all to finish, send another 100, wait for those to finish, and so on.

  • Send 100 requests. As each request among the 100 finishes, add a new one to the pool: "done, add a new one; done, add a new one". Like a stream.

  • Borger@lemmy.blahaj.zone · 7 points · 9 months ago
    The second option. With the first option you'll end up with spare compute/network capacity sitting unused whenever the remaining requests in the current batch of 100 are still being handled by other threads/worker processes.
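    The "done, add a new one" pattern is usually implemented with a concurrency limiter rather than explicit batches. A minimal sketch in Python asyncio, assuming a semaphore with 100 slots and a simulated request in place of the real HTTP call (swap in an HTTP client such as aiohttp for the actual workload):

    ```python
    import asyncio
    import random

    CONCURRENCY = 100  # at most 100 requests in flight at once
    TOTAL = 1_000      # use 1_000_000 for the real workload

    async def fetch(i: int, sem: asyncio.Semaphore) -> int:
        # The semaphore is the "done - add a new one" mechanism:
        # a new request starts the moment any earlier one frees a slot,
        # so there is never idle capacity waiting for a batch to drain.
        async with sem:
            # Placeholder for the real HTTP call (e.g. an aiohttp GET).
            await asyncio.sleep(random.uniform(0, 0.01))
            return i

    async def main() -> list[int]:
        sem = asyncio.Semaphore(CONCURRENCY)
        tasks = [fetch(i, sem) for i in range(TOTAL)]
        # gather preserves submission order in its result list.
        return await asyncio.gather(*tasks)

    results = asyncio.run(main())
    ```

    Note that creating all 1,000,000 task objects up front costs memory; at that scale you may prefer feeding URLs through an asyncio.Queue to a fixed set of 100 worker tasks, which bounds memory as well as concurrency.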

      • catacomb@beehaw.org · 2 points · 9 months ago
        Where did you get 100 from? I’m just asking if it’s a real limit or a guess at “some manageable number” under one million.

        It can be worth experimenting with and tuning this value. You might even find that fewer than 100 works better.
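        Tuning the limit can be as simple as timing the same workload at a few candidate values. A hedged sketch, with a fixed 5 ms sleep standing in for a request (against a real server the sweet spot depends on latency and server limits, not just this arithmetic):

        ```python
        import asyncio
        import time

        async def run_batch(limit: int, n: int = 200) -> float:
            # Run n simulated requests under a given concurrency limit
            # and return the elapsed wall-clock time.
            sem = asyncio.Semaphore(limit)

            async def one() -> None:
                async with sem:
                    await asyncio.sleep(0.005)  # stand-in for one request

            start = time.perf_counter()
            await asyncio.gather(*(one() for _ in range(n)))
            return time.perf_counter() - start

        async def main() -> dict[int, float]:
            # Try a few candidate concurrency limits and record timings.
            return {limit: await run_batch(limit) for limit in (10, 50, 100)}

        timings = asyncio.run(main())
        ```

        With real requests, plot throughput against the limit and stop raising it once the curve flattens or error rates climb.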