
I think the ideal thing to do in a lot of cases is to adhere to the spirit of the warning and actually try to handle the rejections in some way.

In this case I'd want to log the error, or display a notice that the chapter couldn't be loaded. For this we need to catch the rejections and turn them into values we can iterate over with for await/of.

Promise.allSettled() does this for us, but forces us to wait until all promises are... settled. A streaming version would give back an async iterable so we could handle promises in order as soon as they're ready. Or we can just convert into the same return type as allSettled() ourselves:

    async function showChapters(chapterURLs) {
      const chapterPromises = chapterURLs.map(async (url) => {
        try {
          const response = await fetch(url);
          return {status: "fulfilled", value: await response.json()};
        } catch (e) {
          return {status: "rejected", reason: e};
        }
      });
    
      for await (const chapterData of chapterPromises) {
        // Make sure this function renders or logs something for rejections
        appendChapter(chapterData);
      }
    }



I'm laughing - I just posted basically the exact same thing. This is the right way to solve this.

He's trying to shove error handling into the wrong place (The loop should not be responsible for handling failed requests it knows nothing about).


I don't think that is an absolute. Let's say it's paragraphs, not chapters: the page needs to know it didn't get all the paragraphs to build the story, so it can know the page has failed to complete. That's the perfect time to throw an exception and handle it somewhere that knows how to let the application continue to run sanely.


Sure - I don't think anything is really an absolute in programming, and depending on the use case there are all sorts of ways to structure this.

But in general, if you're making a contract where a promise might reject and using await, you need to be wrapping that code in a try/catch somewhere.

That's the dealio with async/await and promises: you're trading the callback .then().catch() format for try{ await ... }catch(){}.

If you don't want to use try/catch - you need to be making a contract where the failure is not a rejection but an error object. If you want to use rejections, you need to be catching them (in one form or another).
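
For example, a rough sketch of the error-object kind of contract (the names here are made up):

    // A loader that never rejects: failures come back as ordinary values,
    // so callers branch on the result instead of using try/catch.
    async function loadChapter(url) {
      try {
        const response = await fetch(url);
        return {ok: true, chapter: await response.json()};
      } catch (error) {
        return {ok: false, error};
      }
    }

    // const result = await loadChapter(url);
    // result.ok ? appendChapter(result.chapter) : showError(result.error);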


Agreed, though there is no need to complicate it further by returning objects with status and value fields.

The requirements here seem a bit rare (ignore failures individually, machine gun requests, render incrementally), but the ideal solution is actually glossed over in the post, if you consider that you'll probably want to log those request errors in production:

    async function showChapters(chapterURLs) {
      const requests = chapterURLs.map(url => {
        return fetch(url).then(r => r.json()).catch((err) => {
          MyMonitoring.report(err)
        })
      })
    
      for await (const data of requests) {
          appendChapter(data);
      }
    }
In this scenario (and the post doesn't mention this either), appendChapter() needs to deal with empty data, as the .catch() handler resolves to undefined.

This could be easily abstracted into a 'fetchAll' function.
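
A rough sketch of what that might look like (the reporting hook is just a placeholder):

    // Hypothetical fetchAll: kick off every request, report failures,
    // and resolve each entry to parsed JSON, or to undefined on failure.
    function fetchAll(urls, report = console.error) {
      return urls.map(url =>
        fetch(url)
          .then(r => r.json())
          .catch(err => { report(err); })
      );
    }

    // for await (const data of fetchAll(chapterURLs)) {
    //   if (data !== undefined) appendChapter(data);
    // }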

There is absolutely nothing "blunt" or wrong about it. The engine warned us about unhandled promises, we handle them, everybody is happy!


But now some of your requests resolve with undefined, rather than correctly reject (it's exceptional behaviour).

appendChapter(undefined) doesn't seem right to me. Feels like you'll have to do something like if (data === undefined) throw… which kinda shows how hacky this is.


Hi Jake! That is exactly what the code in the article does, isn't it?

     promise.catch(() => {})
It does follow up with "this doesn't change the promises other than marking them as 'handled'. They're still rejected promises" but that doesn't seem right - they become fulfilled promises with a value of undefined, and those values will definitely show up in the for loop later. Is there something I'm missing with the nested promises?

My suggestion was in the spirit of keeping this behaviour the same, except I think explicitly adding the catch handler to fetch() is the right way to do it vs an auxiliary function to 'silence' them.


> they become fulfilled promises with a value of undefined

No they don't, which is why I made that clarification in the post :D

    const promise1 = Promise.reject(Error("wow"));
    // promise1 is rejected, and unhandled

    const promise2 = promise1.catch(() => {});
    // promise2 will be fulfilled, and is unhandled
    // promise1 is still rejected, but now handled


I don't think this is right. One code smell here is that your `chapterPromises` is no longer a set of promises for chapters. You've had to turn exceptions into objects to work around the unhandled promise behaviour.

It also looks like it'll display chapter 3 if chapter 2 fails to load, which seems wrong. At that point you've failed to display the sequence of chapters.


> I don't think this is right. One code smell here is that your `chapterPromises` is no longer a set of promises for chapters. You've had to turn exceptions into objects to work around the unhandled promise behaviour.

I mean, or you do any sort of sensible thing, like automatically retry the failed request (with backoff and a cap) directly in the catch, where the error is local, understood, and you have the information to handle it.
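
For instance, a rough retry sketch (attempt count and delays are arbitrary):

    // Hypothetical helper: retry a fetch a few times with exponential
    // backoff, rethrowing the last error if every attempt fails.
    async function fetchWithRetry(url, attempts = 3, baseDelayMs = 500) {
      let lastError;
      for (let i = 0; i < attempts; i++) {
        try {
          const response = await fetch(url);
          if (!response.ok) throw new Error(`HTTP ${response.status}`);
          return response;
        } catch (err) {
          lastError = err;
          if (i < attempts - 1) {
            await new Promise(res => setTimeout(res, baseDelayMs * 2 ** i));
          }
        }
      }
      throw lastError;
    }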

But in either case - you're right that it's no longer `Promise<chapter>`; it's now `Promise<Option<chapter, err>>`. But... that's not a problem. And it's a much better way to indicate to other sections of the code what has happened than changing code paths completely with an exception or unhandled rejection.

Using something like an `Option` (or `Either`, or `Result` - depending on what library/languages you're familiar with) is a pretty sound way to handle errors in a pipeline, and is routinely found across basically every functional language out there.
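
In plain JS that can be as small as this sketch (not any particular library's API):

    // Poor man's Result: the wrapped promise always fulfills with either
    // {ok: true, value} or {ok: false, error}, and never rejects.
    const toResult = (promise) =>
      promise.then(
        value => ({ok: true, value}),
        error => ({ok: false, error})
      );

    // const results = chapterURLs.map(url => toResult(fetch(url).then(r => r.json())));
    // for await (const r of results) {
    //   r.ok ? appendChapter(r.value) : showChapterError(r.error);
    // }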


It depends so much on the app and behavior you want - to show partial data or not, to do things in sequence. What do chapters stand in for in this example?

Yeah, in this case I thought it might be nice to show a "failed to load" message for each failure. Maybe this is a chronological Mastodon timeline, or a photo gallery. Do you want to error the whole UI for one failed object? Maybe.

But trying to do something useful with all failures is a general rule that I think guides towards good thinking about UX and fewer unhandled rejection warnings.


For book chapters this is probably fine, but if there's any chance you might have a lot of things to deal with at once, you probably want to manage a pool of promises.

You might be okay in normal usage, but if a backlog accumulates, you could find yourself consuming all available resources and self-destructing over and over.

I've tried it and do not recommend it.
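
A rough sketch of that kind of limiter (the concurrency number is arbitrary, and per-task error handling is left to the task itself):

    // Hypothetical pool: run at most `limit` tasks at a time.
    async function mapWithLimit(items, limit, task) {
      const results = new Array(items.length);
      let next = 0;
      async function worker() {
        while (next < items.length) {
          const i = next++; // claimed synchronously, so workers never collide
          results[i] = await task(items[i], i);
        }
      }
      const workers = Array.from(
        {length: Math.min(limit, items.length)},
        () => worker()
      );
      await Promise.all(workers);
      return results;
    }

    // const chapters = await mapWithLimit(chapterURLs, 4,
    //   url => fetch(url).then(r => r.json()));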


This code does not reproduce the desired behavior.

Desired: when the load from chapter 10 fails, even if chapter 2 is taking 30s to load, I want to immediately transition the UI into the failure state.

Your code: waits until chapters 1-9 have loaded to report the failure state on chapter 10.

(I posted what I believe is a correct answer "without the hack" to the OP.)


Does the article’s final code behave in the desired way? To my understanding, it simply catches rejections, but the for-await block still waits for every promise in order, up until the rejected one. Your `finally` block gets called too late, unless I'm missing something obvious.

To make this loop early-cancellable on any out-of-order rejection, I’d create a helper generator and a “semaphore” promise, which would block the yield until there’s a next resolved value or any rejection.

  // Results land here, indexed by their original position
  const arr = []
  arr.length = promises.length
  // The "semaphore": a promise that resolves when any input settles
  let step, sem
  function rearm() {
    sem = new Promise(res => {step = res})
  }
  rearm()
  promises.forEach((p, i) => {
    p.then(v => step({i, v}))
    .catch(e => step({i, e}))
  })
  async function* values() {
    let next = 0
    while (next < arr.length) {
      // Emit everything that's already arrived, in order
      if (arr[next]) {
        yield arr[next++].v
        continue
      }
      // Otherwise wait for the next settlement, wherever it lands
      const {i, v, e} = await sem
      rearm()
      if (e) {
        // Any rejection aborts the iteration immediately
        yield Promise.reject(e)
        return
      }
      arr[i] = {v}
    }
  }
  try {
    for await (… of values()) {
      …
I've barely ever used JS generators specifically, so this code is more of a concept (e.g. I suspect a race around rearm()), but you get the idea.


Not sure what `Promise.race(Promise.all(promises), loop());` is supposed to be doing. Promise.race only takes one arg, no? Also not sure what partialUpdate is supposed to be doing. Is that assuming some React-like diff reconciliation?

Personally, I'd go with something like this:

    async function showChapters(chapterURLs) {
      return Promise.all(
        chapterURLs
          .map((url) => fetch(url).then(r => r.json()))
          .map((p, i) => p.then(data => appendChapter(data, i), e => displayError(e, i)))
      );
    }
With this, you can control exactly where and how list items are rendered, regardless of order of promise resolution, and things are rendered as soon as they are available, instead of getting stuck behind slow requests.


I'm not trying to say there's only one way to do this, especially if we are talking about user affordances for failure states. I think there could be a debate about showing what data is available vs not showing partial data.

My point is just that trying to do something with the failures often removes the warning and leads to better UX, or at least conscious decisions about it, and usually leaves some clue in the code about what you intend to happen.


How can you create that behavior?


Like I said, I added a solution in the Disqus thread on that blog post... the English summary is that you Promise.race() the Promise.all() with the for-await loop.
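
Roughly, in sketch form (note Promise.race takes a single iterable, so the two go in an array):

    // Sketch of the idea: append chapters in order, but fail fast, because
    // Promise.all rejects as soon as any chapter request rejects.
    async function showChapters(chapterURLs) {
      const promises = chapterURLs.map(url => fetch(url).then(r => r.json()));
      async function loop() {
        for await (const data of promises) appendChapter(data);
      }
      return Promise.race([Promise.all(promises), loop()]);
    }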


Does that work? Isn't it possible for the monitor() function to return the result of Promise.all() rather than loop(), since they are racing?

    return Promise.race(Promise.all(promises), loop());
edit: I guess they are the same in this specific case, but probably wouldn't be in real code, making this function very clever but maybe too rigid.


I think that it works because when loop() rejects, Promise.all() rejects too so it doesn't matter who wins.


I mean in the case where nothing rejects.

Say for example the loop was doing `result.push({success: true, value: item})` instead of just `result.push(item)`. Then the value returned from loop() is different from the one returned by Promise.all(promises), and your overall monitor() function might return either.


I also posted the same thing (and have had this discussion with Jake on Twitter :))


Nice. `Promise.iterateSettled` would be cool.
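
Something like this hypothetical sketch, maybe (it yields results in the order they settle):

    // Hypothetical: an async generator that yields allSettled-style
    // {status, value/reason} objects as soon as each promise settles.
    async function* iterateSettled(promises) {
      const pending = new Map(promises.map((p, i) => [
        i,
        p.then(
          value => ({i, result: {status: "fulfilled", value}}),
          reason => ({i, result: {status: "rejected", reason}})
        ),
      ]));
      while (pending.size) {
        const {i, result} = await Promise.race(pending.values());
        pending.delete(i);
        yield result;
      }
    }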



