This variation on req_perform() performs multiple requests in parallel. Unlike req_perform(), it always succeeds: it will never throw an error. Instead, it returns error objects, which are your responsibility to handle.

Exercise caution when using this function; it's easy to pummel a server with many simultaneous requests. Only use it with hosts designed to serve many files at once.

Usage

multi_req_perform(reqs, paths = NULL, pool = NULL, cancel_on_error = FALSE)

Arguments

reqs

A list of requests.

paths

An optional list of paths, if you want to download the request bodies to disk. If supplied, must be the same length as reqs.

pool

Optionally, a curl pool made by curl::new_pool(). Supply this if you want to override the defaults for total concurrent connections (100) or concurrent connections per host (6).

cancel_on_error

Should all pending requests be cancelled when you hit an error? Set this to TRUE to stop all requests as soon as one fails. Responses that were never performed will have class httr2_cancelled in the result.
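For example, to run many requests with tighter connection limits than the defaults, you can supply your own pool. A minimal sketch (the specific limits chosen here are illustrative, not recommendations):

```r
library(httr2)

# Allow at most 20 connections in total and 2 per host,
# instead of the defaults of 100 and 6
pool <- curl::new_pool(total_con = 20, host_con = 2)

reqs <- list(
  request("https://httpbin.org/get"),
  request("https://httpbin.org/get")
)
resps <- multi_req_perform(reqs, pool = pool)
```

Lowering host_con is the polite option when all requests target the same server.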

Value

A list the same length as reqs where each element is either a response or an error.

Limitations

  • Will not retrieve a new OAuth token if it expires part way through the requests.

  • Does not perform throttling with req_throttle().

  • Does not attempt retries as described by req_retry().

  • Consults the cache set by req_cache() before/after all requests.

In general, where req_perform() might make multiple requests due to retries or OAuth failures, multi_req_perform() will only make one.
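Because multi_req_perform() does not honour req_retry(), transient failures have to be retried by hand if you need them. A minimal sketch (the helper name and the fixed retry policy are our own, not part of httr2):

```r
library(httr2)

# Hypothetical helper: rerun only the requests whose results are
# errors, up to `max_tries` attempts in total per request
multi_perform_with_retry <- function(reqs, max_tries = 3) {
  resps <- multi_req_perform(reqs)
  for (i in seq_len(max_tries - 1)) {
    failed <- vapply(resps, inherits, "error", FUN.VALUE = logical(1))
    if (!any(failed)) break
    resps[failed] <- multi_req_perform(reqs[failed])
  }
  resps
}
```

Note that this retries all failures indiscriminately; a real policy would retry only transient errors (e.g. HTTP 429 or 503) and back off between attempts.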

Examples

# Requesting these 4 pages one at a time would take four seconds:
reqs <- list(
  request("https://httpbin.org/delay/1"),
  request("https://httpbin.org/delay/1"),
  request("https://httpbin.org/delay/1"),
  request("https://httpbin.org/delay/1")
)
# But it's much faster if you request in parallel
system.time(resps <- multi_req_perform(reqs))
#>    user  system elapsed 
#>   0.659   0.404   1.062 

reqs <- list(
  request("https://httpbin.org/status/200"),
  request("https://httpbin.org/status/400"),
  request("FAILURE")
)
# multi_req_perform() will always succeed
resps <- multi_req_perform(reqs)
# you'll need to inspect the results to figure out which requests failed
fail <- vapply(resps, inherits, "error", FUN.VALUE = logical(1))
resps[fail]
#> [[1]]
#> <error/httr2_http_400>
#> Error in `resp_abort()`:
#> ! HTTP 400 Bad Request.
#> 
#> [[2]]
#> <error/httr2_failed>
#> Error:
#> ! Could not resolve host: FAILURE
#>