
[Inference] Handle stack connector IDs that resolve to inference endpoints#259656

Merged
viduni94 merged 4 commits intoelastic:mainfrom
viduni94:fix-inference-plugin-connector-id-resolution
Mar 26, 2026

Conversation


@viduni94 viduni94 commented Mar 25, 2026

Closes #259641

Summary

When a stack connector ID (e.g. a preconfigured `.inference` connector) is passed to the inference plugin's `chatComplete` API, `getConnectorById` may resolve it to an `InferenceConnector` with `isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only checked the `endpointIdCache` to decide whether to use the inference endpoint execution path. If the cache didn't contain the ID, the code fell through to the stack connector adapter path, which then failed because there is no adapter for the `.inference` connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found` error when callers (e.g. `kbn-evals`) passed preconfigured `.inference` connector IDs, because the code attempted to execute the ES inference endpoint ID as a Kibana saved-object action.
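The failure mode and the fix can be sketched roughly as follows. All names here (`resolvePipeline`, `endpointIdCache`, the connector shape) are illustrative stand-ins mirroring the description above, not the actual Kibana source:

```typescript
// Hypothetical sketch of the routing decision described in this PR.
// Names are illustrative, not the real Kibana implementation.

interface InferenceConnector {
  connectorId: string;
  type: string;
  isInferenceEndpoint: boolean;
}

// IDs that are already known to be ES inference endpoints.
const endpointIdCache = new Set<string>(['cached-endpoint-id']);

function getConnectorById(id: string): InferenceConnector {
  // A preconfigured `.inference` stack connector resolves to an
  // inference endpoint, but its ID may not be in the cache.
  return { connectorId: id, type: '.inference', isInferenceEndpoint: true };
}

function resolvePipeline(id: string): 'inference-endpoint' | 'stack-adapter' {
  // Before the fix: only the cache decided the execution path.
  if (endpointIdCache.has(id)) {
    return 'inference-endpoint';
  }

  // After the fix: also inspect the resolved connector itself.
  const connector = getConnectorById(id);
  if (connector.isInferenceEndpoint) {
    return 'inference-endpoint';
  }

  // Previously this fallthrough tried to execute the ID as a saved-object
  // action, producing "Saved object [action/<id>] not found".
  return 'stack-adapter';
}

console.log(resolvePipeline('preconfigured-inference-connector')); // → "inference-endpoint"
```

The key point is that the connector's own `isInferenceEndpoint` flag, not just cache membership, now routes execution to the inference endpoint path.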

Changes

  • `callback_api.ts`: After `getInferenceExecutor` resolves the connector in the stack connector branch, check `connector.isInferenceEndpoint`. If true, redirect to the inference endpoint execution path (`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` + `inferenceEndpointAdapter`).
  • `api.test.ts`: Added tests covering the "stack connector resolving to inference endpoint" path.
  • `create_connector_fixture.ts` (`kbn-evals`): Removed the client-side workaround that extracted `inferenceId` from `.inference` connectors; it is no longer needed now that the inference plugin handles this server-side.
  • `create_connector_fixture.test.ts`: Removed the corresponding workaround tests.
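For context, the removed `kbn-evals` workaround corresponded roughly to logic like this (a hypothetical reconstruction; the function and field names are assumptions, not the actual fixture code):

```typescript
// Illustrative sketch of the client-side workaround this PR removes.
interface Connector {
  id: string;
  type: string;
  config?: { inferenceId?: string };
}

// Before: the fixture unwrapped `.inference` connectors itself, passing the
// underlying ES inference endpoint ID so the plugin's endpoint path was hit.
function connectorIdForEvalsOld(connector: Connector): string {
  if (connector.type === '.inference' && connector.config?.inferenceId) {
    return connector.config.inferenceId;
  }
  return connector.id;
}

// After: the plugin resolves this server-side, so callers can simply pass
// the stack connector ID unchanged.
function connectorIdForEvalsNew(connector: Connector): string {
  return connector.id;
}
```

Moving the unwrapping server-side means every caller of `chatComplete` benefits, not just `kbn-evals`.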

Related

  • elastic#258530 introduced unified connector listing via `getConnectorList()`/`getConnectorById()`, which returns inference endpoints with `isInferenceEndpoint: true`
  • elastic#259446 added a temporary client-side workaround in `kbn-evals` (now removed by this PR)

Checklist

  • [x] Unit or functional tests were updated or added to match the most common scenarios
  • [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the guidelines
  • [x] Review the backport guidelines and apply applicable `backport:*` labels.

@viduni94 viduni94 self-assigned this Mar 25, 2026
@viduni94 viduni94 requested a review from a team as a code owner March 25, 2026 20:30
@viduni94 viduni94 added the release_note:skip Skip the PR/issue when compiling release notes label Mar 25, 2026
@viduni94 viduni94 requested review from a team as code owners March 25, 2026 20:30
@viduni94 viduni94 added backport:skip This PR does not require backporting Team:Search Team:obs-ai Observability AI team 9.4.0 labels Mar 25, 2026
@elasticmachine

Pinging @elastic/obs-ai-team (Team:obs-ai)

@viduni94 viduni94 added (deprecated) evals:streams-sigevents This label is deprecated. Use `evals:significant-events` to run the Significant Events eval suite. models:eis/anthropic-claude-4.6-sonnet Run LLM evals against model: eis/anthropic-claude-4.6-sonnet models:judge:eis/google-gemini-3.1-pro Override LLM-as-a-judge connector for evals: eis/google-gemini-3.1-pro labels Mar 25, 2026

macroscopeapp bot commented Mar 25, 2026

Approvability

Verdict: Needs human review

This PR changes runtime behavior in how stack connectors that resolve to inference endpoints are handled, adding a new routing path in the chat completion API. The author does not own any of the changed files, which are owned by @elastic/search-kibana and @elastic/obs-ai-team/@elastic/security-generative-ai.

You can customize Macroscope's approvability policy.


@spong spong left a comment


Evals code changes LGTM! 👍

Thanks @sphilipse for the feedback here and thanks @viduni94 for the fix! 🙌

@elasticmachine

⏳ Build in-progress, with failures

Failed CI Steps

Test Failures

  • [job] [logs] Jest Integration Tests #11 / scripts/generate_plugin builds a generated plugin into a viable archive

cc @viduni94

@viduni94 viduni94 removed (deprecated) evals:streams-sigevents This label is deprecated. Use `evals:significant-events` to run the Significant Events eval suite. models:eis/anthropic-claude-4.6-sonnet Run LLM evals against model: eis/anthropic-claude-4.6-sonnet models:judge:eis/google-gemini-3.1-pro Override LLM-as-a-judge connector for evals: eis/google-gemini-3.1-pro labels Mar 25, 2026

@sphilipse sphilipse left a comment


LGTM!

@viduni94 viduni94 merged commit 493503a into elastic:main Mar 26, 2026
18 checks passed
shahargl pushed a commit to shahargl/kibana that referenced this pull request Mar 26, 2026
jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Apr 1, 2026
paulinashakirova pushed a commit to paulinashakirova/kibana that referenced this pull request Apr 2, 2026


Development

Successfully merging this pull request may close these issues.

[Inference] resolveAndCreatePipeline should handle stack connector IDs that resolve to inference endpoints

5 participants