
feat(issues): add queue-based issue creation with BullMQ #250

Merged
l1ttps merged 4 commits into main from add-queue-issue-creation on Jan 16, 2026

Conversation

@l1ttps (Member) commented Jan 15, 2026

No description provided.

@gemini-code-assist (Contributor) commented:

Summary of Changes

Hello @l1ttps, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the scalability and reliability of the issue creation system by implementing a queue-based approach using BullMQ. Instead of directly processing issue creation requests, the system now adds these requests to a dedicated queue, which are then handled by a specialized background processor. This change ensures that the application remains responsive under load and provides robust error handling and retry mechanisms for issue creation.

Highlights

  • Asynchronous Issue Creation: The issue creation process has been refactored to be asynchronous, utilizing a BullMQ queue to handle the creation logic in the background.
  • Dedicated Issue Creation Processor: A new IssueCreationProcessor has been introduced within the jobs-registry module to specifically manage the database operations for creating issues and their initial comments.
  • BullMQ Queue Configuration: A new BullMQ queue named ISSUE_CREATION is registered with default job options including automatic removal on completion/failure, multiple retry attempts, and exponential backoff.
  • Decoupled Service Logic: The IssuesService now enqueues issue creation requests and waits for the job to complete, effectively decoupling the API request from the direct database write operations.
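Taken together, the highlights describe a queue registration along the lines of the following sketch. The attempt count and backoff delay here are illustrative assumptions (the PR's exact numbers are not quoted in this summary); the removal options mirror the values quoted in the review comments below.

```typescript
// Sketch of the ISSUE_CREATION queue registration described in the highlights.
// `attempts: 3` and the 1000ms backoff delay are illustrative assumptions,
// not the PR's exact values.
BullModule.registerQueue({
  name: BullMQName.ISSUE_CREATION,
  defaultJobOptions: {
    attempts: 3, // multiple retry attempts
    backoff: { type: 'exponential', delay: 1000 }, // exponential backoff
    removeOnComplete: { age: 5 }, // auto-remove completed jobs
    removeOnFail: true, // auto-remove failed jobs
  },
}),
```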



@gemini-code-assist bot left a comment

Code Review

This pull request introduces a queue-based mechanism for issue creation using BullMQ, moving the logic to a dedicated processor. This is a good architectural improvement. However, the current implementation in IssuesService synchronously waits for the job to complete by polling, which negates the benefits of a queue. My review includes critical feedback to refactor this into a truly asynchronous operation. I've also identified a race condition in the current polling logic and provided a fix. Additionally, there are suggestions to improve the BullMQ configuration for better debuggability and to clean up redundant code.

Comment on lines +220 to +254
// Wait for the job to complete and return the result
// We'll use a promise-based approach to wait for job completion
return new Promise<Issue>((resolve, reject) => {
const checkJob = () => {
this.issueCreationQueue
.getJob(job.id!)
.then(async (updatedJob) => {
if (updatedJob) {
const state = await updatedJob.getState();
if (state === 'completed') {
// For now, we'll query the database to get the created issue
// since the processor returns the issue object
const lastIssue = await this.issuesRepository.findOne({
where: { workspaceId },
order: { no: 'DESC' },
});
if (lastIssue) {
resolve(lastIssue);
} else {
reject(new Error('Created issue not found in database'));
}
} else if (state === 'failed') {
reject(new Error('Issue creation failed'));
} else {
// Still processing, check again in 100ms
setTimeout(checkJob, 100);
}
} else {
reject(new Error('Job not found'));
}
})
.catch(reject);
};
checkJob();
});
critical

This polling mechanism to wait for job completion has several significant issues and should be refactored.

  1. Blocks the request: Holding the HTTP request open while a background job runs defeats the purpose of using a queue, which is to improve responsiveness. This can lead to request timeouts and poor scalability. The recommended pattern is to return a 202 Accepted response immediately and notify the client of completion asynchronously (e.g., via WebSockets).

  2. Race condition: When the job completes, the code fetches the lastIssue. This is a race condition. If multiple issues are created concurrently for the same workspace, this could return the wrong issue.

  3. Inefficient: Polling with setTimeout creates unnecessary load on Redis.

If blocking the request is a temporary necessity, you should at least fix the race condition by using the job's return value. The processor already returns the created issue, which is available in job.returnvalue.
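If blocking really is required for now, the polling loop can also be replaced with event-driven waiting. Real BullMQ provides this via a `QueueEvents` instance and `job.waitUntilFinished(queueEvents)`; the sketch below models the same pattern with Node's `EventEmitter` standing in for the queue's event stream, so it runs without Redis.

```typescript
// Self-contained sketch of event-driven job completion, as an alternative to
// polling getState() in a loop. Node's EventEmitter stands in here for
// BullMQ's QueueEvents so the example needs no Redis connection.
import { EventEmitter } from "node:events";

interface Issue {
  id: string;
  title: string;
}

// Stand-in for the queue's event stream: emits ("completed", jobId, returnvalue)
// and ("failed", jobId, failedReason).
const queueEvents = new EventEmitter();

function waitUntilFinished(jobId: string, timeoutMs: number): Promise<Issue> {
  return new Promise<Issue>((resolve, reject) => {
    const timer = setTimeout(() => {
      cleanup();
      reject(new Error(`Job ${jobId} timed out after ${timeoutMs}ms`));
    }, timeoutMs);

    const onCompleted = (id: string, returnvalue: Issue) => {
      if (id !== jobId) return;
      cleanup();
      // Resolve with the job's own return value -- no "latest issue" DB query,
      // so concurrent creations cannot resolve with the wrong issue.
      resolve(returnvalue);
    };

    const onFailed = (id: string, failedReason: string) => {
      if (id !== jobId) return;
      cleanup();
      reject(new Error(`Issue creation failed: ${failedReason}`));
    };

    const cleanup = () => {
      clearTimeout(timer);
      queueEvents.off("completed", onCompleted);
      queueEvents.off("failed", onFailed);
    };

    queueEvents.on("completed", onCompleted);
    queueEvents.on("failed", onFailed);
  });
}

// Simulated processor completing the job shortly after it is enqueued.
setTimeout(() => {
  queueEvents.emit("completed", "job-1", { id: "abc", title: "New issue" });
}, 50);

waitUntilFinished("job-1", 1000).then((issue) => {
  console.log(issue.title); // prints "New issue"
});
```

In the real service, `await job.waitUntilFinished(queueEvents)` (with a `QueueEvents` instance bound to the same queue) replaces the hand-rolled promise above, and the processor's created issue arrives as the job's return value rather than via a follow-up query.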

```typescript
        removeOnComplete: {
          age: 5,
        },
        removeOnFail: true,
```
high

Using removeOnFail: true will cause failed jobs to be removed immediately after all retries are exhausted. This makes it very difficult to debug failures in production, as all information about the failed job (including the error reason) is lost. It's highly recommended to keep failed jobs for a period to allow for inspection.

```typescript
        removeOnFail: {
          age: 24 * 3600, // keep failed jobs for 24 hours
        },
```

Comment on lines +232 to +240
const lastIssue = await this.issuesRepository.findOne({
where: { workspaceId },
order: { no: 'DESC' },
});
if (lastIssue) {
resolve(lastIssue);
} else {
reject(new Error('Created issue not found in database'));
}
high

To fix the race condition and inefficient database query, you should use the returnvalue from the completed job, which contains the Issue object returned by the processor. Also, it's good practice to include the failure reason when a job fails.

```typescript
                const createdIssue = updatedJob.returnvalue as Issue;
                if (createdIssue) {
                  resolve(createdIssue);
                } else {
                  reject(new Error('Created issue not found in job result'));
                }
              } else if (state === 'failed') {
                reject(new Error(`Issue creation failed: ${updatedJob.failedReason}`));
```

Comment on lines +24 to +26

```typescript
        removeOnComplete: {
          age: 5,
        },
```

medium

Removing completed jobs after only 5 seconds is quite aggressive. This leaves a very short window to inspect recently completed jobs for auditing or debugging. It's advisable to use a longer duration. For example, 1 hour (3600 seconds).

Suggested change

```diff
-        removeOnComplete: {
-          age: 5,
-        },
+        removeOnComplete: {
+          age: 3600, // 1 hour
+        },
```

Comment on lines +36 to +38

```typescript
    BullModule.registerQueue({
      name: BullMQName.ISSUE_CREATION,
    }),
```

medium

The ISSUE_CREATION queue is being registered here, but it has already been registered in IssuesModule. Since both IssuesModule and JobsRegistryModule are global, this registration is redundant. It's cleaner to register a queue in a single location, typically in the module that is the primary producer of jobs for that queue.
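One way to act on this suggestion is the sketch below: keep the `registerQueue()` call only in the producing module and drop the duplicate from `JobsRegistryModule`. The module's providers and exports here are assumptions based on the names mentioned in the review, not the PR's actual module contents.

```typescript
// Sketch: register the ISSUE_CREATION queue once, in IssuesModule (the job
// producer). JobsRegistryModule then omits its own registerQueue() call and,
// since both modules are @Global per the review, still has access to the queue.
@Global()
@Module({
  imports: [
    BullModule.registerQueue({ name: BullMQName.ISSUE_CREATION }),
  ],
  providers: [IssuesService],
  exports: [IssuesService],
})
export class IssuesModule {}
```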

l1ttps merged commit 88aa006 into main on Jan 16, 2026
9 checks passed
l1ttps deleted the add-queue-issue-creation branch on February 1, 2026