
Fix/connect ps data channel after default data channel connected #4798

Merged
JudyZhuHz merged 20 commits into webex:next from mickelr:fix/connect_ps_after_default on Mar 30, 2026

Conversation

@mickelr
Contributor

@mickelr commented Mar 24, 2026

COMPLETES #< INSERT LINK TO ISSUE >

This pull request addresses

When a panelist joins a webinar in which a practice session is running, the panelist calls voicea.announce() before cantina sets up its voicea event listener, so cantina cannot receive the announcement response.

Sometimes when leaving the webinar, "this.transcription" in meeting/index is undefined, so the voicea plugin's cleanup work is skipped and the announce status is left in a non-"idle" state, which blocks the next announce() call on rejoin.

Data channel tokens are only present in the join response and are removed in later locus messages.

by making the following changes

Make sure the default data channel is connected, then connect the PS data channel.
Remove the condition "if (this.transcription)" so voicea is always cleaned up.
Save the data channel tokens from the join response for later use.
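The first change above can be sketched as a one-shot wait on the llm plugin's 'online' event. In this sketch, `llm` is a stand-in for `this.webex.internal.llm`, which per the diff is assumed to emit 'online' once the default session's data channel connects:

```typescript
import {EventEmitter} from 'node:events';

// Stand-in for this.webex.internal.llm; assumption: it emits 'online' once
// the default session's data channel is connected.
const llm = new EventEmitter();

let defaultConnected = false;

// One-shot wait, mirroring the PR's approach: the listener removes itself
// so a later 'online' cannot re-trigger the practice-session connect.
const waitForDefaultOnline = new Promise<void>((resolve) => {
  const onDefaultSessionConnected = () => {
    llm.off('online', onDefaultSessionConnected);
    defaultConnected = true;
    resolve();
  };
  llm.on('online', onDefaultSessionConnected);
});

waitForDefaultOnline.then(() => {
  // registerAndConnect() for the PS data channel would run here.
});

llm.emit('online'); // default channel comes online
llm.emit('online'); // no-op: the one-shot listener already removed itself
```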

Change Type

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update
  • Tooling change
  • Internal code refactor

The following scenarios were tested

Web client panelist joins a 5k webinar in which the PS has already started; check whether it can get the caption language list.
Web client leaves the webinar and joins again; check whether it can get the caption language list.

The GAI Coding Policy And Copyright Annotation Best Practices

  • GAI was not used (or, no additional notation is required)
  • Code was generated entirely by GAI
  • GAI was used to create a draft that was subsequently customized or modified
  • Coder created a draft manually that was non-substantively modified by GAI (e.g., refactoring was performed by GAI on manually written code)
  • Tool used for AI assistance (GitHub Copilot / Other - specify)
    • GitHub Copilot
    • Other - Please Specify
  • This PR is related to
    • Feature
    • Defect fix
    • Tech Debt
    • Automation

I certified that

  • I have read and followed contributing guidelines
  • I discussed changes with code owners prior to submitting this pull request
  • I have not skipped any automated checks
  • All existing and new tests passed
  • I have updated the documentation accordingly

Make sure to have followed the contributing guidelines before submitting.

@mickelr requested review from a team as code owners on March 24, 2026 03:23
@mickelr added the "validated" label (If the pull request is validated for automation.) on Mar 24, 2026

@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 351fb9599a

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment on lines +212 to +216
await new Promise<void>((resolve) => {
const onDefaultSessionConnected = () => {
// @ts-ignore - Fix type
this.webex.internal.llm.off('online', onDefaultSessionConnected);
resolve();


P1 Badge Re-check practice state after waiting for default online

This code computes isPracticeSession/URLs before the await and then proceeds to registerAndConnect after the wait without re-validating state. If the panelist leaves or is demoted while the default LLM session is still offline, cleanupPSDataChannel() can run, but this suspended call will later resume on 'online' and reconnect the practice-session channel with stale meeting data, re-attaching relay listeners unexpectedly. Re-evaluate meeting/practice eligibility after the wait (or cancel pending waiters during cleanup) before connecting.


@aws-amplify-us-east-2

This pull request is automatically being deployed by Amplify Hosting (learn more).

Access this pull request here: https://pr-4798.d3m3l2kee0btzx.amplifyapp.com


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: b212b5c8cf


this.updatePSDataChannel();
};
// @ts-ignore - Fix type
this.webex.internal.llm.on('online', onDefaultSessionConnected);


P1 Badge Deduplicate deferred online listener before scheduling reconnect

When updatePSDataChannel() runs while the default LLM session is offline, this line adds a new 'online' listener every time with no guard for an existing pending waiter. In practice, role/locus updates can invoke updatePSDataChannel() multiple times before default comes online, so all queued handlers fire and each re-enters updatePSDataChannel(), causing concurrent registerAndConnect/cleanup flows for the same practice-session ID; this can race and leave the practice-session connection state inconsistent or trigger unexpected disconnect/reconnect behavior. Track a single pending listener (or use a one-shot registration with cancellation in cleanup) instead of registering unbounded handlers.

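One way to implement the single-pending-waiter guard the review suggests (the class shape and names are assumptions for illustration, not the SDK's):

```typescript
import {EventEmitter} from 'node:events';

// Illustrative manager: at most one deferred 'online' waiter at a time.
class PSChannelManager {
  private pendingOnlineListener: (() => void) | null = null;
  llmOnline = false;
  connectCalls = 0;

  constructor(private llm: EventEmitter) {}

  updatePSDataChannel(): void {
    if (!this.llmOnline) {
      // A waiter is already queued: don't stack another listener.
      if (this.pendingOnlineListener) return;
      this.pendingOnlineListener = () => {
        this.pendingOnlineListener = null;
        this.updatePSDataChannel();
      };
      this.llm.once('online', this.pendingOnlineListener);
      return;
    }
    this.connectCalls += 1; // registerAndConnect() would run here
  }
}

const llm = new EventEmitter();
const mgr = new PSChannelManager(llm);

// Several role/locus updates arrive before default comes online...
mgr.updatePSDataChannel();
mgr.updatePSDataChannel();
mgr.updatePSDataChannel();

// ...but only one waiter was queued, so connect runs exactly once.
mgr.llmOnline = true;
llm.emit('online');
```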


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: 08cb77f62f



// Ensure the default session data channel is connected before connecting the practice session
// @ts-ignore - Fix type
if (!this.webex.internal.llm.isConnected()) {


P1 Badge Avoid missing default-session online transition

updatePSDataChannel() checks llm.isConnected() before subscribing to 'online', which introduces a check-then-subscribe race: if the default session comes online between the check and llm.on('online', ...), Mercury’s one-time 'online' emission is already gone (it is emitted via trigger in internal-plugin-mercury/src/mercury.js and not replayed), so this call returns and never retries the practice-session connect. In that window, panelists can stay disconnected from the practice-session channel until an unrelated later update happens to call updatePSDataChannel() again.

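A common fix for this check-then-subscribe race is to subscribe first and then check, removing the redundant listener if the session is already connected. Here the `connected` flag stands in for `llm.isConnected()`:

```typescript
import {EventEmitter} from 'node:events';

// Subscribe-then-check: the 'online' emission cannot slip through the gap
// between the connectivity check and the subscription.
function connectWhenOnline(
  llm: EventEmitter & {connected: boolean},
  connect: () => void
): void {
  const onOnline = () => connect();
  llm.once('online', onOnline);
  if (llm.connected) {
    // Already online: the one-time event won't fire again, so connect now
    // and drop the now-redundant listener.
    llm.off('online', onOnline);
    connect();
  }
}

// Already-online path: connects immediately, no listener left behind.
const llmOnline = Object.assign(new EventEmitter(), {connected: true});
let calls = 0;
connectWhenOnline(llmOnline, () => { calls += 1; });

// Offline path: waits for the event.
const llmOffline = Object.assign(new EventEmitter(), {connected: false});
let offlineCalls = 0;
connectWhenOnline(llmOffline, () => { offlineCalls += 1; });
llmOffline.emit('online');
```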

@mickelr
Contributor Author

mickelr commented Mar 25, 2026

@codex review

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. Already looking forward to the next diff.



@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: 45f0c5a841


Comment on lines +6314 to +6316
const datachannelToken = this.webex.internal.llm.getDatachannelToken(
DataChannelTokenType.Default
);


P1 Badge Restore locus-token fallback for default LLM session

updateLLMConnection() now reads the default token only from llm.getDatachannelToken(...) and no longer falls back to this.locusInfo.self.datachannelToken, so any locus-driven reconnect path that runs before/without saveDataChannelToken(join) leaves datachannelToken undefined even when Locus already has a valid token. In token-auth mode this makes registerAndConnect() send no Data-Channel-Auth-Token header, causing default-session data-channel connect failures for flows like lobby admission or early meeting-info updates.


Contributor Author


Before saveDataChannelToken() is called, isJoined will be false, so we will not actually call registerAndConnect().
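For reference, the cache-then-locus fallback discussed in this thread (which the author argues is unnecessary pre-join) can be sketched as follows; the `TokenSources` shape is an illustration, not the SDK's real types:

```typescript
// Illustrative token sources: the llm plugin's cache and the locus self object.
interface TokenSources {
  cachedToken?: string;    // would come from llm.getDatachannelToken(...)
  locusSelfToken?: string; // would come from locusInfo.self.datachannelToken
}

// Prefer the cached token; fall back to the locus token when the cache is empty.
function resolveDataChannelToken(src: TokenSources): string | undefined {
  return src.cachedToken ?? src.locusSelfToken;
}

const fromCache = resolveDataChannelToken({cachedToken: 'jwt-a', locusSelfToken: 'jwt-b'});
const fromLocus = resolveDataChannelToken({locusSelfToken: 'jwt-b'});
```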

Comment on lines +177 to 179
const practiceSessionDatachannelToken = this.webex.internal.llm.getDatachannelToken(
DataChannelTokenType.PracticeSession
);


P1 Badge Recover practice-session token from locus when cache is empty

updatePSDataChannel() now sources the practice-session token only from llm.getDatachannelToken(...) and removed the previous fallback to meeting.locusInfo.self.practiceSessionDatachannelToken. If the practice-session token is first delivered by a later locus update (or cache was cleared), this code attempts registerAndConnect() with undefined, so in JWT-token mode the practice-session channel is opened without auth header and fails to connect even though Locus has the token.


Contributor Author


Same as above comment


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: 7100da0354


Comment on lines +177 to 179
const practiceSessionDatachannelToken = this.webex.internal.llm.getDatachannelToken(
DataChannelTokenType.PracticeSession
);


P1 Badge Restore token fallback before PS reconnect

After a panelist is demoted, cleanupPSDataChannel() disconnects the practice-session LLM connection, and that path clears the cached practice token in LLM; on a later promotion in the same meeting, this code now reads only getDatachannelToken(...) and no longer falls back to meeting.locusInfo.self.practiceSessionDatachannelToken, so registerAndConnect() is called with undefined. In token-auth mode this sends no Data-Channel-Auth-Token header and can block practice-session reconnection until a full rejoin/refresh repopulates the cache. Fresh evidence versus earlier discussion: this failure is reproducible even after join succeeds, because demotion cleanup itself empties the token cache before the next promotion.



@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: 4e2386afe1


Comment on lines +6314 to +6316
const datachannelToken = this.webex.internal.llm.getDatachannelToken(
DataChannelTokenType.Default
);


P1 Badge Restore locus-token fallback when default cache is empty

updateLLMConnection() now reads only llm.getDatachannelToken(DataChannelTokenType.Default) and no longer falls back to locusInfo.self.datachannelToken, so when the cache is empty it calls registerAndConnect(..., undefined). This is reachable after lobby flows: SELF_ADMITTED_GUEST re-invokes updateLLMConnection() but that path does not call saveDataChannelToken(), so in token-auth mode the request is sent without Data-Channel-Auth-Token even though locus already has a valid token. Fresh evidence vs the earlier thread is this post-admission call path, where isJoined() is true and the cache can still be unset.



@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: 2212f6fe66


Comment on lines 280 to 283
this.disconnect(options, sessionId).then(() => {
// Clean up sessions data
this.connections.delete(sessionId);
this.datachannelTokens[sessionId] = undefined;
});


P2 Badge Clear token cache when disconnecting a meeting session

disconnectLLM() no longer clears the cached data-channel token, which leaves stale auth state in the shared LLM service after teardown-only paths. In the code path I checked (MeetingUtil.cleanUp() -> cleanupLLMConneciton()), cleanup can run without clearMeetingData(), so the old meeting token remains cached; because updateLLMConnection() now reads only the LLM cache, a later meeting can send the previous meeting’s Data-Channel-Auth-Token and fail token-auth connection until a fresh token is explicitly saved.


Contributor Author


In my testing, clearMeetingData is always called when leaving the meeting.
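The teardown the review asks for can be sketched as deleting both records together. In the SDK this would run in the `.then()` of `disconnect(options, sessionId)`; the class here is illustrative:

```typescript
// Illustrative session store: connection records plus cached auth tokens.
class LLMSessionStore {
  connections = new Map<string, object>();
  datachannelTokens = new Map<string, string>();

  disconnectSession(sessionId: string): void {
    // In the SDK this cleanup runs after disconnect(options, sessionId)
    // resolves; dropping both entries leaves no stale auth state behind.
    this.connections.delete(sessionId);
    this.datachannelTokens.delete(sessionId);
  }
}

const store = new LLMSessionStore();
store.connections.set('meeting-1', {});
store.datachannelTokens.set('meeting-1', 'jwt-old');
store.disconnectSession('meeting-1');
```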


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review


Reviewed commit: 7fb7706018


Comment on lines 280 to 283
this.disconnect(options, sessionId).then(() => {
// Clean up sessions data
this.connections.delete(sessionId);
this.datachannelTokens[sessionId] = undefined;
});


P2 Badge Reset cached token when disconnecting a session

disconnectLLM() now removes only the connection record and leaves datachannelTokens intact, which leaks auth state across teardown-only flows. Fresh evidence versus the earlier thread: there are cleanup paths that do not run clearMeetingData(). Meeting#index#setUpLocusInfoMeetingListener (the shouldLeave === false branch) and Meetings#destroy both call MeetingUtil.cleanUp() directly, so the stale token persists after meeting destruction. Because this commit also switched updateLLMConnection()/updatePSDataChannel() to rely on cached LLM tokens, later reconnects can send an old meeting token and fail data-channel auth.


if (this.config.enableAutomaticLLM) {
// @ts-ignore
this.webex.internal.llm.on('online', this.handleLLMOnline);
this.saveDataChannelToken(join);
Collaborator


I'm wondering if this call should maybe be done earlier, at line 6154. I'm not entirely sure what the enableAutomaticLLM config flag means; if it is false, can you still have an LLM connection and might you need the token? I see there are places where updateLLMConnection() is called without checking the enableAutomaticLLM config flag...

Contributor Author


I suppose enableAutomaticLLM should always be true in a Webex meeting? But I agree doing it earlier would be safer.

* @param {Object} join - The parsed join response (from MeetingUtil.parseLocusJoin)
* @returns {void}
*/
saveDataChannelToken(join: any): void {
Collaborator


It would be nice to use the LocusDTO type here, because we know that join.locus will be of that type, right? We could also add the 2 tokens to the self type; I'm not sure if we have a self type declared already. I think I added one here: https://github.com/webex/webex-js-sdk/pull/4742/changes#diff-10d3bc06a1bbd80411c0671b115c896747942db210526c4771457e083783d5e7R41 - it's a draft PR that I stopped working on because the change probably wasn't needed, but we could still take the type from that PR.

Contributor Author


We have to save the tokens into the llm lib anyway, because there is refresh logic where the client needs to call a fetch API to get a new token; the new token is then saved in llm to replace the old one.
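The save/replace behavior described here can be sketched as a simple keyed cache where a refreshed token overwrites the old one (class and method names are assumptions, not the SDK's):

```typescript
// Illustrative token cache keyed by channel type (e.g. default vs practice session).
class LLMTokenCache {
  private tokens = new Map<string, string>();

  saveDatachannelToken(type: string, token: string): void {
    this.tokens.set(type, token); // a refreshed token replaces the old one
  }

  getDatachannelToken(type: string): string | undefined {
    return this.tokens.get(type);
  }
}

const cache = new LLMTokenCache();
cache.saveDatachannelToken('default', 'jwt-initial');
cache.saveDatachannelToken('default', 'jwt-refreshed'); // refresh path overwrites
```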

const datachannelToken = join?.locus?.self?.datachannelToken;
const practiceSessionDatachannelToken = join?.locus?.self?.practiceSessionDatachannelToken;

if (datachannelToken) {
Collaborator

@marcin-bazyl Mar 29, 2026


saveDataChannelToken() is only called on join response. Is it not possible for Locus to revoke or to change any of the data channel tokens later after join?

Contributor Author


Confirmed with Vishal: the client needs to refresh the token if it expires.

Collaborator

@marcin-bazyl left a comment


approving as my comments are not blocking and can be addressed in subsequent PRs

@JudyZhuHz merged commit 2ddffe0 into webex:next on Mar 30, 2026
12 checks passed
@github-actions

🎉 Your changes are now available!
Released in: v3.12.0-next.3
📖 View full changelog →
Packages updated:
  • webex 3.12.0-next.3
  • @webex/internal-plugin-llm 3.12.0-next.1
  • @webex/internal-plugin-voicea 3.12.0-next.1
  • @webex/plugin-meetings 3.12.0-next.2
  • @webex/test-webex-node 0.0.1-next.1
  • webex-node 3.12.0-next.1

Thank you for your contribution!
🤖 This is an automated message. For questions, please refer to the documentation.
