fix(performance): pre-compute and lazily initialize endpoint methods #622
gr2m merged 3 commits into octokit:main
Conversation
We have observed that octokit initialisation is quite slow, especially in combination with [Probot](https://github.com/probot/probot), which creates a new octokit instance for each incoming request. This blocks the main event loop for a considerable time. With our change, we moved the preparation of the endpoints API object into module scope and use a Proxy object to defer the initialisation of octokit defaults and decorations until the first API call per method. Although we have not measured it, we believe the overhead of the proxied method call is insignificant compared to the network latency of an API call. Co-authored-by: Markus Wolf <mail@markus-wolf.de>
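The idea described above can be sketched roughly like this (a minimal illustration, not the actual octokit code; the endpoint definition and `build` logic are hypothetical):

```javascript
// Pre-computed once at module scope, shared by every instance
// (hypothetical endpoint data for illustration).
const endpointDefaults = {
  listForRepo: { method: "GET", url: "/repos/{owner}/{repo}/issues" },
};

function lazyEndpoints(octokit) {
  const cache = {};
  return new Proxy(endpointDefaults, {
    get(target, name) {
      if (!(name in cache)) {
        // The expensive per-method decoration runs only on first access,
        // instead of eagerly for every method in the constructor.
        cache[name] = (...args) => octokit.request(target[name], ...args);
      }
      return cache[name];
    },
  });
}
```

This way, instantiation only pays for creating the Proxy, and each method's closure is built at most once, on its first use.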
gr2m
left a comment
Fascinating. I haven't had a chance to use Proxies yet, but I think the usage makes a lot of sense here. I'm always reluctant about internal caching like this because the first call of a method behaves differently than the second, and if there is a rare error condition it's very hard to debug. But given that the methods in this package are very monotonous, and the improvement is quite significant, I think it's worth it.
Could you please add a comment with a short description of how it works and why this optimization was done, with a reference to this pull request?
@ZauberNerd could you do that please? No worries if not, I'd say we can add that ourselves and merge the PR

@gr2m sorry for the delay. I've added a comment and fixed a typo.

Thanks @ZauberNerd! The new release should be out any moment. It will be a fix release, so if you reset your lock file you should get the update. Let us know if it all works as expected.

🎉 This PR is included in version 7.2.3 🎉 The release is available on: Your semantic-release bot 📦🚀
I'm using this in a probot setup and probot is still on v5 of this dependency. |
🎉 This PR is included in version 8.0.0-beta.3 🎉 The release is available on: Your semantic-release bot 📦🚀
The implementation of octokit#622 only implemented a `get` Proxy trap. This worked fine for the simple use case of invoking the methods, but failed when users tried to mock functions with Jest or Sinon. It also no longer listed all functions when pressing tab-tab in a REPL. This commit implements the further Proxy traps required for those use cases and uses the cache object for mutating operations. Fixes: octokit#683
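A sketch of what those additional traps might look like (an assumed shape for illustration, not the library's exact code; `lazyMethods` and `build` are hypothetical names):

```javascript
function lazyMethods(definitions, build) {
  const cache = {};
  const handler = {
    get(target, name) {
      if (name in cache) return cache[name];
      if (!(name in target)) return undefined;
      // Lazy path: the method is materialized on first access.
      cache[name] = build(name, target[name]);
      return cache[name];
    },
    // Lets `name in api` work without materializing the method.
    has(target, name) {
      return name in target;
    },
    // Makes REPL tab-tab and Object.keys() list every endpoint method again.
    ownKeys(target) {
      return Reflect.ownKeys(target);
    },
    // Needed so enumeration (Object.keys, spread) sees real properties.
    getOwnPropertyDescriptor(target, name) {
      if (!(name in target)) return undefined;
      return {
        value: handler.get(target, name),
        writable: true,
        enumerable: true,
        configurable: true,
      };
    },
    // Mutating operations go to the cache object, so Jest/Sinon can
    // assign a mock in place of the lazily built method.
    set(target, name, value) {
      cache[name] = value;
      return true;
    },
    deleteProperty(target, name) {
      delete cache[name];
      return true;
    },
  };
  return new Proxy(definitions, handler);
}
```

With `set` and `deleteProperty` writing to the cache rather than the shared definitions object, one instance's mocks cannot leak into another instance.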
🎉 This issue has been resolved in version 7.2.3 🎉 The release is available on: Your semantic-release bot 📦🚀
We have profiled our change with a flamegraph generated with Clinic.js.
The two screenshots below show the before and after state of 10k octokit initialisations:
Before

After
