[Inference] Handle stack connector IDs that resolve to inference endpoints #259656
Merged
viduni94 merged 4 commits into elastic:main, Mar 26, 2026
Conversation
Contributor
Pinging @elastic/obs-ai-team (Team:obs-ai)
Contributor
Approvability Verdict: Needs human review. This PR changes runtime behavior in how stack connectors that resolve to inference endpoints are handled, adding a new routing path in the chat completion API. The author does not own any of the changed files, which are owned by @elastic/search-kibana and @elastic/obs-ai-team/@elastic/security-generative-ai. You can customize Macroscope's approvability policy.
x-pack/platform/plugins/shared/inference/server/chat_complete/callback_api.ts
spong
approved these changes
Mar 25, 2026
Member
left a comment
Evals code changes LGTM! 👍
Thanks @sphilipse for the feedback here and thanks @viduni94 for the fix! 🙌
Contributor
⏳ Build in-progress, with failures
Failed CI Steps
Test Failures
cc @viduni94
shahargl
pushed a commit
to shahargl/kibana
that referenced
this pull request
Mar 26, 2026
…oints (elastic#259656)

Closes elastic#259641

## Summary

When a stack connector ID (e.g. a preconfigured `.inference` connector) is passed to the inference plugin's `chatComplete` API, `getConnectorById` may resolve it to an `InferenceConnector` with `isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only checked the `endpointIdCache` to decide whether to use the inference endpoint execution path. If the cache didn't contain the ID, it fell through to the stack connector adapter path, which then failed because there is no adapter for the `.inference` connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found` error when callers (e.g. `kbn-evals`) passed preconfigured `.inference` connector IDs, because the code attempted to execute the ES inference endpoint ID as a Kibana saved-object action.

### Changes

- `callback_api.ts`: after `getInferenceExecutor` resolves the connector in the stack connector branch, check `connector.isInferenceEndpoint`; if true, redirect to the inference endpoint execution path (`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` + `inferenceEndpointAdapter`).
- `api.test.ts`: added tests covering the "stack connector resolving to inference endpoint" path.
- `create_connector_fixture.ts` (`kbn-evals`): removed the client-side workaround that extracted the `inferenceId` from `.inference` connectors; it is no longer needed now that the inference plugin handles this server-side.
- `create_connector_fixture.test.ts`: removed the corresponding workaround tests.

## Related

- elastic#258530 introduced unified connector listing via `getConnectorList()`/`getConnectorById()`, which returns inference endpoints with `isInferenceEndpoint: true`
- elastic#259446 added a temporary client-side workaround in `kbn-evals` (now removed by this PR)

### Checklist

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
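The pre-fix fall-through described in the commit message above can be sketched as follows. Only `endpointIdCache` and `isInferenceEndpoint` are names quoted in this PR; everything else here is a hypothetical stand-in, not the actual Kibana code.

```typescript
// Hypothetical sketch of the pre-fix behavior: only the endpointIdCache was
// consulted, so a preconfigured `.inference` connector ID that was absent
// from the cache fell through to the stack connector adapter path.

type Path = 'inference-endpoint' | 'stack-connector-adapter';

function resolvePathBeforeFix(
  connectorId: string,
  endpointIdCache: Set<string>
): Path {
  // `connector.isInferenceEndpoint` was never checked here.
  return endpointIdCache.has(connectorId)
    ? 'inference-endpoint'
    : 'stack-connector-adapter';
}

// A preconfigured `.inference` connector ID unknown to the cache:
const path = resolvePathBeforeFix('my-preconfigured-inference', new Set());
// Falls through to the adapter path, which later fails with
// `Saved object [action/my-preconfigured-inference] not found`.
console.log(path); // "stack-connector-adapter"
```

The executor then tried to load the ES inference endpoint ID as a Kibana saved-object action, which is exactly the failure mode the PR removes.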
jeramysoucy
pushed a commit
to jeramysoucy/kibana
that referenced
this pull request
Apr 1, 2026
…oints (elastic#259656)
paulinashakirova
pushed a commit
to paulinashakirova/kibana
that referenced
this pull request
Apr 2, 2026
…oints (elastic#259656)
Closes #259641
Summary
When a stack connector ID (e.g. a preconfigured `.inference` connector) is passed to the inference plugin's `chatComplete` API, `getConnectorById` may resolve it to an `InferenceConnector` with `isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only checked the `endpointIdCache` to decide whether to use the inference endpoint execution path. If the cache didn't contain the ID, it fell through to the stack connector adapter path, which then failed because there is no adapter for the `.inference` connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found` error when callers (e.g. `kbn-evals`) passed preconfigured `.inference` connector IDs, because the code attempted to execute the ES inference endpoint ID as a Kibana saved-object action.

Changes

- `callback_api.ts`: after `getInferenceExecutor` resolves the connector in the stack connector branch, check `connector.isInferenceEndpoint`; if true, redirect to the inference endpoint execution path (`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` + `inferenceEndpointAdapter`).
- `api.test.ts`: added tests covering the "stack connector resolving to inference endpoint" path.
- `create_connector_fixture.ts` (`kbn-evals`): removed the client-side workaround that extracted the `inferenceId` from `.inference` connectors; it is no longer needed now that the inference plugin handles this server-side.
- `create_connector_fixture.test.ts`: removed the corresponding workaround tests.

Related

- #258530 introduced unified connector listing via `getConnectorList()`/`getConnectorById()`, which returns inference endpoints with `isInferenceEndpoint: true`
- #259446 added a temporary client-side workaround in `kbn-evals` (now removed by this PR)

Checklist

- [x] Unit or functional tests were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the guidelines
- [x] Review the backport guidelines and apply applicable `backport:*` labels.
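The redirect added in `callback_api.ts` can be sketched as below. The names quoted in this PR (`isInferenceEndpoint`, `endpointIdCache`, `resolveInferenceEndpoint`, `inferenceEndpointAdapter`, `resolveAndCreatePipeline`) are kept; the types and the rest of the structure are illustrative assumptions, not the actual Kibana implementation.

```typescript
// Hypothetical sketch of the fix: after resolving a stack connector,
// detect connectors that are really inference endpoints and reroute
// them onto the inference endpoint execution path.

interface InferenceConnector {
  connectorId: string;
  isInferenceEndpoint: boolean;
}

type Pipeline =
  | { adapter: 'inferenceEndpointAdapter'; endpointId: string }
  | { adapter: 'stackConnectorAdapter'; connectorId: string };

// Stand-in for `resolveInferenceEndpoint` + `createInferenceEndpointExecutor`.
function resolveInferenceEndpoint(endpointId: string): Pipeline {
  return { adapter: 'inferenceEndpointAdapter', endpointId };
}

function resolveAndCreatePipeline(
  connector: InferenceConnector,
  endpointIdCache: Set<string>
): Pipeline {
  // Existing fast path: IDs already known to be inference endpoints.
  if (endpointIdCache.has(connector.connectorId)) {
    return resolveInferenceEndpoint(connector.connectorId);
  }
  // The fix: a resolved stack connector may itself be an inference
  // endpoint (e.g. a preconfigured `.inference` connector). Redirect it
  // instead of falling through to the nonexistent `.inference` adapter.
  if (connector.isInferenceEndpoint) {
    return resolveInferenceEndpoint(connector.connectorId);
  }
  return { adapter: 'stackConnectorAdapter', connectorId: connector.connectorId };
}
```

With this server-side check in place, `kbn-evals` no longer needs to extract the inference ID on the client, which is why the fixture workaround could be deleted.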