Multiple, Sequential Fetch() Promise

You can use recursion:

function fetchNextJson(json_url) {
  return fetch(json_url, {
    method: 'get'
  })
    .then(function(response) {
      return response.json();
    })
    .then(function(json) {
      results.push(json);
      return json.Pagination.NextPage.Href
        ? fetchNextJson(json.Pagination.NextPage.Href)
        : results;
    })
    .catch(function(err) {
      console.log('error: ' + err);
    });
}

var next_json_url = 'http://localhost:3000/one';
var results = [];

fetchNextJson(next_json_url).then(function(res) {
  console.log(res);
});
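The same recursive pattern can avoid the shared `results` variable by passing the accumulator as a parameter. In this sketch, `getPage` and the in-memory `pages` object are invented stand-ins for the real `fetch(url).then(r => r.json())` call:

```javascript
// Sketch: the recursive pattern with the accumulator passed as a parameter
// instead of living in module scope. `getPage` and `pages` are stand-ins
// for the real fetch(url).then(r => r.json()) call.
const pages = {
  '/one': { value: 1, Pagination: { NextPage: { Href: '/two' } } },
  '/two': { value: 2, Pagination: { NextPage: { Href: null } } }
};

function getPage(url) {
  return Promise.resolve(pages[url]); // swap in fetch(url).then(r => r.json())
}

function fetchAllPages(url, results = []) {
  return getPage(url).then(json => {
    results.push(json);
    return json.Pagination.NextPage.Href
      ? fetchAllPages(json.Pagination.NextPage.Href, results)
      : results;
  });
}

fetchAllPages('/one').then(all => console.log(all.map(p => p.value))); // [ 1, 2 ]
```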

How to chain multiple fetch() promises?

The best way to go about this is to use Promise.all() and map().

In this context, map returns an array of the promises produced by fetch.

await then pauses execution until every one of those promises has resolved before the code continues.

The problem with using forEach here is that it doesn't wait for an asynchronous request to complete before it moves on to the next item.
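A minimal, self-contained illustration of that difference, with a timer-based `slowDouble` standing in for a real fetch call (both function names are invented for this sketch):

```javascript
// Demo: forEach fires its callbacks and returns immediately, while
// Promise.all over a mapped array waits for every promise.
const slowDouble = n =>
  new Promise(resolve => setTimeout(() => resolve(n * 2), 10));

async function withForEach(items) {
  const out = [];
  // forEach does not await the async callbacks; nothing has been pushed yet
  items.forEach(async n => { out.push(await slowDouble(n)); });
  return out;
}

async function withPromiseAll(items) {
  // map collects the promises; Promise.all waits for all of them
  return Promise.all(items.map(slowDouble));
}

withForEach([1, 2, 3]).then(out => console.log(out));    // [] — returned too early
withPromiseAll([1, 2, 3]).then(out => console.log(out)); // [ 2, 4, 6 ]
```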

The code that you should be using here is:

fetch(API_URL_DIARY)
  .then(response => response.json())
  .then(data => {
    console.log("old", data);
    return data;
  })
  .then(async data => {
    await Promise.all(data.map((e, index, array) => {
      return fetch(API_URL_FOOD_DETAILS + e.foodid)
        .then(response => response.json())
        .then(data => {
          array[index] = { ...e, ...data };
          console.log("update");
        });
    }));

    console.log("new", data);
  });

How to do multiple fetch requests

This can be done nicely by

  1. Creating an array of URLs to resolve
  2. Iterating the returned array data
  3. Passing that data into your setters

For example, create some helper functions in some module

// fetch-helpers.js

// performs a request and resolves with JSON
export const fetchJson = async (url, init = {}) => {
  const res = await fetch(url, init);
  if (!res.ok) {
    throw new Error(`${res.status}: ${await res.text()}`);
  }
  return res.json();
};

// get JSON from multiple URLs and pass to setters
export const fetchAndSetAll = async (collection) => {
  // fetch all data first
  const allData = await Promise.all(
    collection.map(({ url, init }) => fetchJson(url, init))
  );

  // iterate setters and pass in data
  collection.forEach(({ setter }, i) => {
    setter(allData[i]);
  });
};

and in your component...

import { useEffect, useState } from "react";
import { fetchAndSetAll } from "./fetch-helpers";

export const MyComponent = () => {
  // initialise state to match the API response data types
  const [timeData, setTimeData] = useState([]);
  const [functionData, setFunctionData] = useState([]);

  useEffect(() => {
    fetchAndSetAll([
      {
        url: "/api/...",
        setter: setTimeData,
      },
      {
        url: "/api/...",
        setter: setFunctionData,
      },
    ]).catch(console.error);
  }, []);

  return <>{/* ... */}</>;
};

Resolve promises one after another (i.e. in sequence)?

Update 2017: I would use an async function if the environment supports it:

async function readFiles(files) {
  for (const file of files) {
    await readFile(file);
  }
}

If you'd like, you can defer reading the files until you need them using an async generator (if your environment supports it):

async function* readFiles(files) {
  for (const file of files) {
    yield await readFile(file);
  }
}

Update: On second thought, I might use a loop instead:

var readFiles = function(files) {
  var p = Promise.resolve(); // Q() in q

  files.forEach(file => {
    p = p.then(() => readFile(file));
  });
  return p;
};

Or more compactly, with reduce:

var readFiles = function(files) {
  return files.reduce((p, file) => {
    return p.then(() => readFile(file));
  }, Promise.resolve()); // initial
};

In other promise libraries (like when and Bluebird) you have utility methods for this.

For example, Bluebird would be:

var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs"));

var readAll = Promise.resolve(files).map(fs.readFileAsync, { concurrency: 1 });
// if the order matters, you can use Promise.each instead and omit the concurrency param

readAll.then(function(allFileContents) {
  // all file contents are available here
});

Although there is really no reason not to use async/await today.

Using a fetch inside another fetch in JavaScript

fetch returns a promise, and promises can be chained, so you can use the result of the 1st request in the 2nd request, and so on.

This example uses the SpaceX API to get the info of the latest launch, find the rocket's id, and fetch the rocket's info.

const url = 'https://api.spacexdata.com/v4';

const result = fetch(`${url}/launches/latest`, { method: 'get' })
  .then(response => response.json()) // pass the data as a promise to the next then block
  .then(data => {
    const rocketId = data.rocket;

    console.log(rocketId, '\n');

    return fetch(`${url}/rockets/${rocketId}`); // make a 2nd request and return a promise
  })
  .then(response => response.json())
  .catch(err => {
    console.error('Request failed', err);
  });

// I'm using the result const to show that you can continue to extend the chain from the returned promise
result.then(r => {
  console.log(r.first_stage); // first_stage property from the 2nd request's result
});

Build an event handler using elements from two fetch/then chains

You are running into what's called a race condition. The order of resolution for promises is nondeterministic.

You could wait for both fetch requests to resolve using Promise.all():

Promise.all([
  fetch('data/select.json').then(r => r.json()),
  fetch('data/chartist.json').then(r => r.json())
]).then(results => {
  // populate select options from results[0]
  Object.keys(results[1]).forEach(x => {
    let chart = new Chartist.Line(`#${x}`, { "series": results[1][x] });
    chart.on('created', event => {
      // get the values selected in the <select> and use it
    });
  });
});

But as with nesting the fetches, this method adds unnecessary delay into your application. Your select element data could be received well before the chart data, but instead of updating the UI ahead of time, you will have to wait for the second promise to resolve. Additionally, both requests must succeed or your code will not run. You could use Promise.allSettled() and handle any error conditions manually, but you'd still have to wait for both requests to complete before your code executed.
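For reference, the Promise.allSettled() variant mentioned above looks like this — each result carries a `status` plus a `value` or `reason`, so one failure no longer rejects the whole batch. The two promises here are invented stand-ins for the fetch calls:

```javascript
// Sketch of the Promise.allSettled() variant: plain promises replace the
// two fetch calls so the example is self-contained.
const selectData = Promise.resolve({ options: ['a', 'b'] });
const chartData = Promise.reject(new Error('chart data unavailable'));

Promise.allSettled([selectData, chartData]).then(results => {
  results.forEach((r, i) => {
    if (r.status === 'fulfilled') {
      console.log(`request ${i} ok:`, r.value);
    } else {
      console.log(`request ${i} failed:`, r.reason.message);
    }
  });
});
```

Note that allSettled still waits for every promise to settle, which is exactly the delay the event-driven approach below avoids.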

A better way to handle this would be to take advantage of the web's event driven nature by extending EventTarget and dispatching a CustomEvent when your select element is populated.

Since we know it's possible for our custom populated event to fire before the Chartist fetch has resolved, we also need a couple of booleans to track the status. If we missed the event, just configure the chart; otherwise, attach an EventListener and wait, but only if there is not already a listener queued up.

In a new file:

export class YourController extends EventTarget {
  constructor() {
    super();
  }

  populated = false;

  async getData(path) {
    return fetch(path).then(res => res.json());
  }

  async getSelectOptions() {
    let data = await this.getData('data/select.json');
    // populate select options here
    this.populated = true;
    this.dispatchEvent(new CustomEvent('populated'));
  }

  async getChartData() {
    let data = await this.getData('data/chartist.json');
    Object.keys(data).forEach(x => this.createChart(x, data[x]));
  }

  createChart(id, data) {
    let chart = new Chartist.Line(`#${id}`, { series: data });
    chart.on('created', event => this.onCreated(chart));
  }

  onCreated(chart) {
    if (this.populated) { this.configure(chart); return; }
    if (chart.waiting) { return; }
    chart.waiting = true;
    this.addEventListener('populated', event => this.configure(chart), { once: true });
  }

  configure(chart) {
    // get and use selected options here
    chart.waiting = false;
    this.dispatchEvent(new CustomEvent('configured', { detail: chart }));
  }
}

In your app:

import { YourController } from './your-controller.js';

const c = new YourController();

c.addEventListener('configured', event => {
  console.log('configured:', event.detail);
});

c.getSelectOptions();
c.getChartData();

Setting the { once:true } option in addEventListener() will automatically remove the listener after the first time the populated event fires. Since the created event can fire multiple times, this will prevent an endless stack of calls to .configure(chart) from piling up. I've also updated the script to prevent extra listeners from being added if created is fired more than once before populated.
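The { once:true } behaviour is easy to verify in isolation with a bare EventTarget (available in browsers and modern Node):

```javascript
// Minimal check of { once: true }: the listener runs a single time even
// though the event is dispatched twice.
const target = new EventTarget();
let calls = 0;

target.addEventListener('populated', () => { calls++; }, { once: true });

target.dispatchEvent(new Event('populated'));
target.dispatchEvent(new Event('populated'));

console.log(calls); // 1
```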

Fetch multiple promises, return only one

Let me throw a nice compact entry into the mix.

It uses Promise.all, but every inner promise catches its own errors and simply resolves to false in that case, so Promise.all never rejects. Any fetch that completes but does not have license.valid also resolves to false.

The array Promise.all resolves with is then processed, filtering out the false values and returning the first (which, from the question's description, should be the ONLY) valid JSON response.

const urls = [
  'https://testurl.com',
  'https://anotherurl.com',
  'https://athirdurl.com' // This is the valid one
];

export function validate(key) {
  return Promise.all(urls.map(url =>
    fetch(`${url}/${key}/validate`)
      .then(response => response.json())
      .then(json => json.license && json.license.valid && json)
      .catch(error => false)
  ))
    .then(results => results.filter(result => !!result)[0] || Promise.reject('no matches found'));
}

How can I fetch an array of URLs with Promise.all?

Yes, Promise.all is the right approach, but you actually need it twice if you want to first fetch all urls and then get all texts from them (which again are promises for the body of the response). So you'd need to do

Promise.all(urls.map(u => fetch(u))).then(responses =>
  Promise.all(responses.map(res => res.text()))
).then(texts => {

});

Your current code is not working because forEach returns nothing (neither an array nor a promise).
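You can see that difference directly — forEach always returns undefined, while map hands back the array of promises that Promise.all needs (plain resolved promises stand in for fetch calls here):

```javascript
// forEach returns undefined; map returns the array Promise.all needs.
const urls = ['a', 'b'];

const fromForEach = urls.forEach(u => Promise.resolve(u));
const fromMap = urls.map(u => Promise.resolve(u));

console.log(typeof fromForEach); // 'undefined'
console.log(fromMap.length);     // 2
```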

Of course you can simplify that and start with getting the body from each response right after the respective fetch promise fulfilled:

Promise.all(urls.map(url =>
  fetch(url).then(resp => resp.text())
)).then(texts => {

});

or the same thing with await:

const texts = await Promise.all(urls.map(async url => {
  const resp = await fetch(url);
  return resp.text();
}));

