
Conversation

@stephenplusplus
Contributor

This is really just two types of changes in many places:

  1. Use...

    if (err) {
      callback(err);
      return;
    }

    ...as opposed to anything else.

  2. Use process.nextTick in two places to simulate an async operation (a sketch combining both patterns follows).
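To make the two patterns concrete, here is a minimal sketch of how they combine. The names (getCredentials, fetchCredentials, cachedCredentials) are invented for illustration and are not the library's actual API:

    // Hypothetical example; the names are invented and are not the
    // library's actual API.
    var cachedCredentials;

    // Stand-in for whatever async call actually retrieves credentials.
    function fetchCredentials(callback) {
      setTimeout(function () {
        callback(null, { token: 'fake-token' });
      }, 10);
    }

    function getCredentials(callback) {
      if (cachedCredentials) {
        // Already cached: defer the callback so this method is async on
        // every code path (pattern 2).
        process.nextTick(function () {
          callback(null, cachedCredentials);
        });
        return;
      }

      fetchCredentials(function (err, credentials) {
        if (err) {
          // Pattern 1: return early on error instead of nesting the
          // success path in an else branch.
          callback(err);
          return;
        }

        cachedCredentials = credentials;
        callback(null, credentials);
      });
    }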


@ryanseys
Contributor

ryanseys commented Sep 2, 2014

We should be using setImmediate instead of process.nextTick when executing callbacks. See here for more information.

Pasted here for easy reading:

Callbacks passed to process.nextTick will usually be called at the end of the current flow of execution, and are thus approximately as fast as calling a function synchronously. Left unchecked, this would starve the event loop, preventing any I/O from occurring. setImmediates are queued in the order created, and are popped off the queue once per loop iteration. This is different from process.nextTick which will execute process.maxTickDepth queued callbacks per iteration. setImmediate will yield to the event loop after firing a queued callback to make sure I/O is not being starved.
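A small standalone sketch of the difference being described, assuming nothing about this library's code: a zero-delay timer competes with a recursive deferral. With setImmediate the timer fires within the first few iterations; the commented-out process.nextTick variant holds it off until the whole tick queue drains.

    // Illustrative only; not code from this PR.
    var iterations = 0;

    setTimeout(function () {
      console.log('timer fired after ' + iterations + ' iterations');
    }, 0);

    (function spin() {
      if (iterations++ < 100000) {
        setImmediate(spin);
        // process.nextTick(spin); // try this instead to see the starvation
        //                         // (old node may also warn about maxTickDepth)
      }
    })();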

@stephenplusplus
Contributor Author

From the link:

So in a case where you're trying to break up a long running, CPU-bound job using recursion, you would now want to use setImmediate rather than process.nextTick to queue the next iteration as otherwise any I/O event callbacks wouldn't get the chance to run between iterations.

We're just executing a callback. I'm inclined to stick with nextTick for convention reasons, and because it's the model scenario for async-ifying sync methods according to the docs: process.nextTick.
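For context, the pattern the quoted passage describes looks roughly like this (an illustrative sketch, not code from this PR): a CPU-bound loop is processed in chunks, with setImmediate queuing the next chunk so pending I/O callbacks get a chance to run in between.

    // Illustrative sketch of chunking CPU-bound work with setImmediate,
    // as in the quoted passage. Not code from this PR.
    function sumRange(n, callback) {
      var total = 0;
      var i = 0;

      (function nextChunk() {
        var stop = Math.min(i + 10000, n);

        for (; i < stop; i++) {
          total += i;
        }

        if (i < n) {
          setImmediate(nextChunk); // yield so pending I/O callbacks can run
        } else {
          callback(null, total);
        }
      })();
    }

    sumRange(1000000, function (err, total) {
      console.log(total); // 499999500000
    });

As the comment notes, this PR only defers a single callback rather than recursing, so the sketch is only meant to show what the quote is referring to.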

@ryanseys
Contributor

ryanseys commented Sep 2, 2014

process.nextTick has the opportunity to starve the event loop and cause a stack overflow when process.maxTickDepth maxes out. I've seen it happen with streaming before, due to a misuse of process.nextTick that is still unresolved in node 0.10.x streams.

We should only use process.nextTick in places where setImmediate isn't available. This is even done in the async library. Considering we are only supporting node 0.10.x for this library, we should be using setImmediate everywhere. For all intents and purposes, process.nextTick and setImmediate will do the same thing in this use case, with setImmediate having the added benefit of not shitting the bed if it's called recursively.

@stephenplusplus
Contributor Author

My only reasons for nextTick over setImmediate are based purely on convention and the nextTick documentation. I haven't seen setImmediate used to async-ify a sync method, and would feel more comfortable if I had. I do see your points; however, I don't think they affect us. It would take a very serious misuse of our library to stack a nextTick queue so high it would choke. Additionally, the async link doesn't do what you say: it simply uses setImmediate if available in a browser, where there is no nextTick. In that case, setImmediate is much better than setTimeout(fn, 0).
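For reference, the fallback behavior described here looks roughly like this (a paraphrase, not the actual async source):

    // Paraphrase of the fallback behavior described above; not the actual
    // async source. Node keeps process.nextTick; a browser, which has no
    // process.nextTick, gets setImmediate when available, else setTimeout.
    var defer;

    if (typeof process !== 'undefined' && process.nextTick) {
      defer = function (fn) { process.nextTick(fn); };
    } else if (typeof setImmediate === 'function') {
      defer = function (fn) { setImmediate(fn); };
    } else {
      defer = function (fn) { setTimeout(fn, 0); };
    }

    defer(function () {
      console.log('deferred');
    });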

But, yolo. I'll switch it to setImmediate, 'cause why not. It doesn't seem to hurt anything, other than a minimal performance loss. Feels weird, but what doesn't in Node?

(@rakyll - feel free to chime in if you prefer it to stick to nextTick)

@stephenplusplus changed the title from connection: use process.nextTick to connection: use setImmediate on Sep 2, 2014
@ryanseys
Contributor

ryanseys commented Sep 2, 2014

Yeah, before either of us makes a decision, I would like to hear what @rakyll thinks. She knows the Node ropes better than I do.

@rakyll
Contributor

rakyll commented Sep 2, 2014

With node v0.11, the immediate queue will be able to tick between process ticks. I think that is going to solve the performance concern mentioned above. Otherwise, LGTM.

See this test: https://github.com/joyent/node/blob/857975d5e7e0d7bf38577db0478d9e5ede79922e/test/simple/test-timers-immediate-queue.js

@ryanseys
Contributor

ryanseys commented Sep 2, 2014

@stephenplusplus You can probably merge this whenever.

@stephenplusplus
Contributor Author

Woo! Thanks for saving us, Ryan. setImmediate for the win!

stephenplusplus added a commit that referenced this pull request Sep 2, 2014
connection: use setImmediate to simulate async.
@stephenplusplus stephenplusplus merged commit f919a6b into googleapis:master Sep 2, 2014
@ryanseys
Contributor

ryanseys commented Sep 2, 2014

😄 👍

sofisl pushed a commit that referenced this pull request Nov 11, 2022
samples: pull in latest typeless bot, clean up some comments

Source-Link: https://togithub.com/googleapis/synthtool/commit/0a68e568b6911b60bb6fd452eba4848b176031d8
Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-nodejs:latest@sha256:5b05f26103855c3a15433141389c478d1d3fe088fb5d4e3217c4793f6b3f245e
sofisl pushed a commit that referenced this pull request Nov 16, 2022
- [ ] Regenerate this pull request now.

PiperOrigin-RevId: 439356405

Source-Link: googleapis/googleapis@afa2ba1

Source-Link: googleapis/googleapis-gen@3e40c17
Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiM2U0MGMxN2UxNTEwYzk1ZmFiNThmYzIxNDNjY2I2MWNjZWNhNTk4OSJ9
sofisl pushed a commit that referenced this pull request Nov 17, 2022
sofisl pushed a commit that referenced this pull request Nov 30, 2022
Committer: @miraleung
PiperOrigin-RevId: 310415142

Source-Author: Google APIs <noreply@google.com>
Source-Date: Thu May 7 12:34:02 2020 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 684dfea7decfeca7a7526ea96a8e9256694dd5d8
Source-Link: googleapis/googleapis@684dfea
miguelvelezsa pushed a commit that referenced this pull request Jul 23, 2025
miguelvelezsa pushed a commit that referenced this pull request Jan 14, 2026
GautamSharda pushed a commit that referenced this pull request Jan 14, 2026
GautamSharda pushed a commit that referenced this pull request Jan 15, 2026