Recently I was working on a project where I was sending ~15 requests to external APIs, each of which took ~1s, meaning the total time until the user received a response was around 15 seconds.
I wanted to make this faster, so I researched sending these external requests concurrently and was able to reduce the total response time to ~1 second. Here is an example of what I did:
Create a fresh project:
laravel new concurrent_requests_example
We will use Guzzle to send the requests:
composer require guzzlehttp/guzzle
I’ll make a controller:
php artisan make:controller ExampleController
To compare the difference between sequential and concurrent requests, I'll make an endpoint for each:
// imports needed for the examples below
use GuzzleHttp\Client;
use GuzzleHttp\Promise;

class ExampleController extends Controller
{
    /**
     * Sends requests sequentially
     */
    public function sequential()
    {
    }

    /**
     * Sends requests concurrently
     */
    public function concurrent()
    {
    }
}
And I put the routes into routes/web.php:
/* laravel 8 syntax */
Route::get('/sequential', [ExampleController::class, 'sequential']);
Route::get('/concurrent', [ExampleController::class, 'concurrent']);

/* pre-laravel 8 syntax */
Route::get('/sequential', 'ExampleController@sequential');
Route::get('/concurrent', 'ExampleController@concurrent');
Now I looked up 10 public APIs at random and made sequential requests to them in my sequential() method:
/**
 * Sends requests sequentially
 */
public function sequential()
{
    $time_start = microtime(true);

    $client = new Client();

    $responses = [];
    $responses['archive'] = $client->get('https://archive.org/advancedsearch.php?q=subject:google+sheets&output=json');
    $responses['cat_facts'] = $client->get('https://cat-fact.herokuapp.com/facts');
    $responses['coin_gecko'] = $client->get('https://api.coingecko.com/api/v3/exchange_rates');
    $responses['universities'] = $client->get('http://universities.hipolabs.com/search?country=United+Kingdom');
    $responses['countries'] = $client->get('https://restcountries.eu/rest/v2/all');
    $responses['randomuser'] = $client->get('https://randomuser.me/api/');
    $responses['punkapi'] = $client->get('https://api.punkapi.com/v2/beers');
    $responses['publicapis'] = $client->get('https://api.publicapis.org/entries');
    $responses['openlibrary'] = $client->get('https://openlibrary.org/api/volumes/brief/isbn/9780525440987.json');
    $responses['food_facts'] = $client->get('https://world.openfoodfacts.org/api/v0/product/737628064502.json');

    foreach ($responses as $key => $result)
    {
        echo $key . "<br>";

        // $result is a Guzzle Response object
        // do stuff with $result
        // e.g. $result->getBody()
    }

    echo 'Sequential execution time in seconds: ' . (microtime(true) - $time_start);
}
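If you want to actually consume one of those responses, the body is a PSR-7 stream you can cast to a string. A minimal sketch, assuming the endpoint returns JSON (the 'cat_facts' key is just one of the responses collected above):

// Decode a JSON body from a Guzzle response into a plain PHP array
$data = json_decode((string) $responses['cat_facts']->getBody(), true);

// $data can now be used like any array, e.g. count($data)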
And for the concurrent() method I used the same endpoints:
/**
 * Sends requests concurrently
 */
public function concurrent()
{
    $time_start = microtime(true);

    $client = new Client();

    $promises = [
        'archive' => $client->getAsync('https://archive.org/advancedsearch.php?q=subject:google+sheets&output=json'),
        'cat_facts' => $client->getAsync('https://cat-fact.herokuapp.com/facts'),
        'coin_gecko' => $client->getAsync('https://api.coingecko.com/api/v3/exchange_rates'),
        'universities' => $client->getAsync('http://universities.hipolabs.com/search?country=United+Kingdom'),
        'countries' => $client->getAsync('https://restcountries.eu/rest/v2/all'),
        'randomuser' => $client->getAsync('https://randomuser.me/api/'),
        'punkapi' => $client->getAsync('https://api.punkapi.com/v2/beers'),
        'publicapis' => $client->getAsync('https://api.publicapis.org/entries'),
        'openlibrary' => $client->getAsync('https://openlibrary.org/api/volumes/brief/isbn/9780525440987.json'),
        'food_facts' => $client->getAsync('https://world.openfoodfacts.org/api/v0/product/737628064502.json'),
    ];

    $responses = Promise\Utils::settle($promises)->wait();

    foreach ($responses as $key => $response)
    {
        echo $key . "<br>";

        // response state is either 'fulfilled' or 'rejected'
        if ($response['state'] === 'rejected')
        {
            // handle rejected
            continue;
        }

        // $result is a Guzzle Response object
        $result = $response['value'];

        // do stuff with $result
        // e.g. $result->getBody()
    }

    echo 'Concurrent execution time in seconds: ' . (microtime(true) - $time_start);
}
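For the rejected case, each settled entry exposes the exception that caused the failure under the 'reason' key, so one way to fill in the "handle rejected" branch could be this sketch:

if ($response['state'] === 'rejected')
{
    // $response['reason'] holds the failure, typically a
    // GuzzleHttp\Exception\RequestException or ConnectException
    echo $key . ' failed: ' . $response['reason']->getMessage() . "<br>";
    continue;
}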
Sequential output:
archive
cat_facts
coin_gecko
universities
countries
randomuser
punkapi
publicapis
openlibrary
food_facts
Sequential execution time in seconds: 19.31148982048
Concurrent output:
archive
cat_facts
coin_gecko
countries
food_facts
openlibrary
publicapis
punkapi
randomuser
universities
Concurrent execution time in seconds: 4.8118050098419
As you can see, there is a significant time saving of roughly 14.5 seconds. The more requests you have, the more significant the gap becomes.
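One caveat if the request count grows much larger: firing hundreds of requests at once can overwhelm your server or the remote APIs, so you may want to cap how many are in flight at a time. Guzzle ships a Pool helper for this. A rough sketch, where the URL list and the concurrency limit of 5 are just placeholder values:

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use Psr\Http\Message\ResponseInterface;

$client = new Client();

// Placeholder list of URLs to fetch
$urls = [
    'https://randomuser.me/api/',
    'https://api.publicapis.org/entries',
    // ...
];

// Generator that lazily yields one promise per URL
$requests = function () use ($client, $urls) {
    foreach ($urls as $url) {
        yield function () use ($client, $url) {
            return $client->getAsync($url);
        };
    }
};

$pool = new Pool($client, $requests(), [
    'concurrency' => 5, // at most 5 requests in flight at once
    'fulfilled' => function (ResponseInterface $response, $index) {
        // handle a successful response
    },
    'rejected' => function ($reason, $index) {
        // handle a failed request
    },
]);

// Kick off the pool and wait for every request to finish
$pool->promise()->wait();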
Hopefully this helps somebody!