
[Inference] Consolidate LLM connector listing via inference plugin #258530

Merged
sphilipse merged 49 commits into elastic:main from sphilipse:inference-load-connectors-hook
Mar 24, 2026

Conversation

@sphilipse
Member

Summary

Moves all Kibana features that list LLM connectors to go through the inference plugin's getConnectorList(), which now returns both Kibana stack connectors and native Elasticsearch inference endpoints in a unified InferenceConnector[] format. This ensures the whole Kibana platform lists and consumes connectors and inference endpoints in the same way, as part of our effort to migrate to a unified model settings page.

Changes:

  • inference-common: adds isPreconfigured to InferenceConnector and RawConnector interfaces; exposes isInferenceEndpoint flag for native ES endpoints
  • inference plugin: getConnectorList() now fetches both stack connectors (via actions plugin) and chat_completion ES inference endpoints, merging them into a single list
  • New @kbn/inference-connectors package: shared useLoadConnectors hook that fetches and deduplicates connectors + native endpoints for frontend use
  • Simplified server-side connector routes in observability_ai_assistant, gen_ai_settings, and streams to delegate to inferenceStart.getConnectorList() instead of duplicating filtering logic
  • Updated client-side connector selectors in agent_builder, automatic_import, workplace_ai_app, and security_solution to use useLoadConnectors from @kbn/inference-connectors
  • Fixed kbn-ai-assistant's chat_actions_menu and chat_body to use connectorId (InferenceConnector) instead of id (ActionConnector); the mismatch had broken the connector selector display and selection
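As a rough sketch of the consolidation (shapes simplified; the endpoint field names here are assumptions, not the actual `@kbn/inference-common` definitions), the merge that `getConnectorList()` performs looks like:

```typescript
// Simplified stand-ins for the real types in @kbn/inference-common.
interface InferenceConnector {
  connectorId: string;
  name: string;
  isPreconfigured: boolean;
  isInferenceEndpoint: boolean;
}

interface StackConnector {
  id: string;
  name: string;
  isPreconfigured: boolean;
}

interface EsInferenceEndpoint {
  inference_id: string;
}

// Sketch of the unification: stack connectors (via the actions plugin)
// and chat_completion ES inference endpoints are mapped into the same
// InferenceConnector shape and returned as one list.
function mergeConnectorLists(
  stackConnectors: StackConnector[],
  endpoints: EsInferenceEndpoint[]
): InferenceConnector[] {
  const fromStack: InferenceConnector[] = stackConnectors.map((c) => ({
    connectorId: c.id,
    name: c.name,
    isPreconfigured: c.isPreconfigured,
    isInferenceEndpoint: false,
  }));
  const fromEndpoints: InferenceConnector[] = endpoints.map((e) => ({
    connectorId: e.inference_id,
    name: e.inference_id,
    isPreconfigured: true,
    isInferenceEndpoint: true,
  }));
  return [...fromStack, ...fromEndpoints];
}
```

Note that `connectorId` for an inference endpoint is the ES `inference_id` itself, which is why downstream consumers must not assume it is a Kibana saved-object ID.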

Checklist

Check that the PR satisfies the following conditions.

Reviewers should verify this PR satisfies this list as well.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@sphilipse sphilipse added the backport:skip (This PR does not require backporting) label Mar 19, 2026
@sphilipse sphilipse requested a review from a team as a code owner March 19, 2026 12:25
@sphilipse sphilipse added the release_note:feature (Makes this part of the condensed release notes) label Mar 19, 2026
@sphilipse sphilipse requested review from a team as code owners March 19, 2026 12:25
@botelastic botelastic bot added the ci:project-deploy-observability (Create an Observability project), Team:obs-ai (Observability AI team), and Team:One Workflow (Team label for One Workflow (Workflow automation)) labels Mar 19, 2026
@elasticmachine
Contributor

Pinging @elastic/obs-ai-team (Team:obs-ai)

@github-actions
Contributor

🤖 GitHub comments


Just comment with:

  • /oblt-deploy : Deploy a Kibana instance using the Observability test environments.
  • run docs-build : Re-trigger the docs validation. (use unformatted text in the comment!)

@coderabbitai
Contributor

coderabbitai bot commented Mar 19, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

Adds a new shared package @kbn/inference-connectors (sources, types, jest/tsconfig, manifest) that exports a useLoadConnectors hook and AIConnector type. Introduces isPreconfigured on InferenceConnector and optional isPreconfigured on raw connector types. Centralizes connector listing by switching many consumers to import the new hook and by having server routes return InferenceConnector[] via the inference plugin's getConnectorList. Adjusts connector identity usage from id to connectorId in multiple UI areas and updates tests/fixtures and TypeScript project/path references accordingly.


kubasobon pushed a commit that referenced this pull request Mar 24, 2026
…258530)

## Summary

Moves all Kibana features that list LLM connectors to go through the
inference plugin's getConnectorList(), which now returns both Kibana
stack connectors and native Elasticsearch inference endpoints in a
unified InferenceConnector[] format. This makes sure the whole Kibana
platform consumes and lists connectors and inference endpoints in the
same way, which is part of our efforts to migrate to a unified model
settings page.

Changes:
- inference-common: adds isPreconfigured to InferenceConnector and
RawConnector interfaces; exposes isInferenceEndpoint flag for native ES
endpoints
- inference plugin: getConnectorList() now fetches both stack connectors
(via actions plugin) and chat_completion ES inference endpoints, merging
them into a single list
- New @kbn/inference-connectors package: shared useLoadConnectors hook
that fetches and deduplicates connectors + native endpoints for frontend
use
- Simplified server-side connector routes in observability_ai_assistant,
gen_ai_settings, and streams to delegate to
inferenceStart.getConnectorList() instead of duplicating filtering logic
- Updated client-side connector selectors in agent_builder,
automatic_import, workplace_ai_app, and security_solution to use
useLoadConnectors from @kbn/inference-connectors
- Fixed kbn-ai-assistant's chat_actions_menu and chat_body to use
connectorId (InferenceConnector) instead of id (ActionConnector), which
broke the connector selector display and selection

### Checklist

Check that the PR satisfies the following conditions.

Reviewers should verify this PR satisfies this list as well.

- [x] Any text added follows [EUI's writing
guidelines](https://elastic.github.io/eui/#/guidelines/writing), uses
sentence case text and includes [i18n
support](https://github.com/elastic/kibana/blob/main/src/platform/packages/shared/kbn-i18n/README.md)
- [x] [Unit or functional
tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html)
were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section,
and the correct `release_note:*` label is applied per the
[guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport
guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing)
and apply applicable `backport:*` labels.

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
viduni94 added a commit that referenced this pull request Mar 25, 2026
…ector resolution (#259446)

Closes #259472

## Summary

Fixes two issues breaking `kbn-evals` runs (both local and CI):

### 1. APM / OpenTelemetry tracing conflict
A recent validation in `initTelemetry`
(#258303,
#258663) throws when Elastic APM
and OpenTelemetry tracing are both active. The `evals_tracing` Scout
config enables OTel tracing but didn't explicitly disable APM, causing
Kibana (and the Playwright worker) to crash on startup.

Fix:
- Added a `coerceCliValue` helper in `applyConfigOverrides`
(`kbn-apm-config-loader`) that converts 'true'/'false' to booleans and
numeric strings to numbers before they're set in the config object.
- Added `--elastic.apm.active=false` and
`--elastic.apm.contextPropagationOnly=false` to the `evals_tracing`
Scout server config and to `require_init_apm.js` (for the Playwright
worker when `TRACING_EXPORTERS` is set).
- Updated the `kbn-evals` README to document the required APM settings
when configuring tracing in `kibana.dev.yml`.
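The coercion described above can be sketched minimally as follows (the real helper lives in `kbn-apm-config-loader`; this version is an assumption about its behavior, not its actual source):

```typescript
// CLI override values arrive as strings, so 'true'/'false' and numeric
// strings are coerced before being written into the config object.
// Anything else is passed through unchanged.
function coerceCliValue(value: string): string | number | boolean {
  if (value === 'true') return true;
  if (value === 'false') return false;
  const asNumber = Number(value);
  // Guard against '' coercing to 0, and reject non-numeric strings.
  if (value.trim() !== '' && !Number.isNaN(asNumber)) return asNumber;
  return value;
}
```

Without this, an override like `--elastic.apm.active=false` would land in the config as the string `'false'`, which is truthy.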

### 2. Inference endpoint connector resolution
#258530 consolidated LLM connector
listing through the inference plugin's `getConnectorList()`, which now
returns inference endpoint IDs (e.g.:
`.anthropic-claude-4.6-opus-chat_completion`) instead of Kibana stack
connector keys (e.g.: `elastic-llm-claude-46-opus`). `kbn-evals` was
still passing the stack connector key to the inference API, which then
tried to execute it as a Kibana action - resulting in "Saved object
`[action/.anthropic-claude-4.6-opus-chat_completion]` not found".

Fix:
- `createConnectorFixture` now detects `.inference-type` connectors and
extracts their `inferenceId` from the config, using the ES inference
endpoint ID directly. This bypasses the Kibana actions framework and
aligns with the unified connector model from
[#258530](#258530).
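A minimal sketch of that detection (field names are assumptions based on the description above, not the actual `kbn-evals` fixture code):

```typescript
interface ConnectorFixture {
  id: string;
  actionTypeId: string;
  config?: { inferenceId?: string };
}

// For `.inference`-type connectors, prefer the ES inference endpoint ID
// from the connector config over the Kibana stack connector key, so the
// ID can be passed straight to the inference API.
function resolveConnectorExecutionId(connector: ConnectorFixture): string {
  if (connector.actionTypeId === '.inference' && connector.config?.inferenceId) {
    return connector.config.inferenceId;
  }
  return connector.id;
}
```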

### Checklist

- [x] [Unit or functional
tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html)
were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section,
and the correct `release_note:*` label is applied per the
[guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport
guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing)
and apply applicable `backport:*` labels.
jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Mar 26, 2026
…lastic#258530)

jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Mar 26, 2026
…ector resolution (elastic#259446)

markov00 pushed a commit to markov00/kibana that referenced this pull request Mar 26, 2026
…ector resolution (elastic#259446)

viduni94 added a commit that referenced this pull request Mar 26, 2026
…oints (#259656)

Closes #259641

## Summary

When a stack connector ID (e.g.: a preconfigured `.inference` connector)
is passed to the inference plugin's `chatComplete` API,
`getConnectorById` may resolve it to an `InferenceConnector` with
`isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only
checked the `endpointIdCache` to decide whether to use the inference
endpoint execution path. If the cache didn't contain the ID, it fell
through to the stack connector adapter path, which then failed because
there's no adapter for the .inference connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found`
error when callers (e.g.: `kbn-evals`) passed preconfigured `.inference`
connector IDs, because the code attempted to execute the ES inference
endpoint ID as a Kibana saved-object action.

### Changes
- `callback_api.ts`: After `getInferenceExecutor` resolves the connector
in the stack connector branch, check `connector.isInferenceEndpoint`. If
true, redirect to the inference endpoint execution path
(`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` +
`inferenceEndpointAdapter`).
- `api.test.ts`: Added tests covering the "stack connector resolving to
inference endpoint" path.
- `create_connector_fixture.ts` (`kbn-evals`): Removed the client-side
workaround that was extracting inferenceId from `.inference` connectors
- no longer needed now that the inference plugin handles this
server-side.
- `create_connector_fixture.test.ts`: Removed corresponding workaround
tests.
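The routing change can be sketched as follows (function and type names here are placeholders, not the real inference plugin internals):

```typescript
type ExecutionPath = 'inference-endpoint' | 'stack-connector';

interface ResolvedConnector {
  connectorId: string;
  isInferenceEndpoint: boolean;
}

// Before: only the endpointIdCache was consulted, so a stack connector
// that resolved to an inference endpoint fell through to the (missing)
// .inference adapter. After: the resolved connector's own flag also
// routes execution to the inference endpoint path.
function choosePipeline(
  connector: ResolvedConnector,
  endpointIdCache: Set<string>
): ExecutionPath {
  if (endpointIdCache.has(connector.connectorId) || connector.isInferenceEndpoint) {
    return 'inference-endpoint';
  }
  return 'stack-connector';
}
```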

## Related
- #258530 introduced unified connector listing via
`getConnectorList()`/`getConnectorById()`, which returns inference
endpoints with `isInferenceEndpoint: true`
- #259446 added a temporary client-side workaround in `kbn-evals` (now
removed by this PR)

### Checklist

- [x] [Unit or functional
tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html)
were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section,
and the correct `release_note:*` label is applied per the
[guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport
guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing)
and apply applicable `backport:*` labels.
shahargl pushed a commit to shahargl/kibana that referenced this pull request Mar 26, 2026
…oints (elastic#259656)

logeekal added a commit that referenced this pull request Mar 27, 2026
…ent Builder (#259840)

## Summary

Aligns Automatic Migrations (SIEM Migrations) with the same connector
resolution scheme used by Assistant and Agent Builder.

After #258530, Automatic Migrations was still using the old connector
loading path (`loadAllActions` from triggers-actions-ui), which returns
preconfigured connector YAML keys (e.g., `Anthropic-Claude-Opus-4-6`) as
connector IDs. This meant the `connectorId` passed to
[`naturalLanguageToEsql`](https://github.com/elastic/kibana/blob/main/x-pack/solutions/security/plugins/security_solution/server/lib/siem_migrations/common/task/util/esql_knowledge_base.ts#L24)
was a `model_id` (old scheme) instead of the `inference_id`, resulting
in a `Saved object [action/.anthropic-claude-4.6-opus] not found` error.

This PR switches Automatic Migrations to use the inference plugin's
connector resolution (same as Assistant and Agent Builder), which
returns `inference_id`-based connector IDs.

## Changes

- **Frontend:** Replaced `useAIConnectors` / `loadAiConnectors` with
`useLoadConnectors` / `loadConnectorsForFeature` from
`@kbn/inference-connectors` (scoped to `featureId: 'siem_migrations'`)
- **Backend:** Replaced `actionsClient.get()` with
`inferenceClient.getConnectorById()` in rules and dashboards start
routes
- **`@kbn/inference-connectors`:** Extracted `loadConnectorsForFeature`
as a reusable stateless function from `useLoadConnectors`
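The point of extracting a stateless function is that both React and non-React callers can share the core logic. A hypothetical sketch of the dedupe step (the real `loadConnectorsForFeature` also fetches and scopes by `featureId`; this only illustrates the stateless core):

```typescript
interface AIConnector {
  connectorId: string;
  name: string;
}

// Stateless core: dedupe a fetched connector list by connectorId,
// keeping the first occurrence. A hook like useLoadConnectors would
// wrap this with fetching and React state management.
function dedupeConnectors(connectors: AIConnector[]): AIConnector[] {
  const seen = new Set<string>();
  return connectors.filter((c) => {
    if (seen.has(c.connectorId)) return false;
    seen.add(c.connectorId);
    return true;
  });
}
```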

## Test plan
- [ ] Select a preconfigured inference connector in the SIEM Migrations
onboarding card
- [ ] Start a rule migration — should succeed without "saved object not
found" error
- [ ] Start a dashboard migration — same
- [ ] Verify connector selection persists across modal reopens (via
localStorage)

--
Made with Cursor

---------

Co-authored-by: Alex Szabo <alex.szabo@elastic.co>
SoniaSanzV pushed a commit to SoniaSanzV/kibana that referenced this pull request Mar 30, 2026
…ent Builder (elastic#259840)

sphilipse added a commit to sphilipse/kibana that referenced this pull request Mar 30, 2026
After PR elastic#258530 consolidated connector listing, EIS connectors are
returned with inference endpoint IDs (e.g. anthropic-claude-3.7-sonnet-
chat_completion) as their connectorId rather than Kibana saved object
IDs. Attack Discovery and Defend Insights passed these IDs to
ActionsClientLlm which used actionsClient.execute(), failing with
"Saved object [action/...] not found".

Fix: Thread an InferenceClient from route handlers to ActionsClientLlm.
When actionTypeId is .inference and an inference client is available,
ActionsClientLlm uses its existing isInferenceEndpoint path which calls
inferenceClient.chatComplete() instead of actionsClient.execute(),
correctly handling inference endpoint IDs.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
sphilipse added a commit that referenced this pull request Mar 30, 2026
)

## Summary

After PR #258530 consolidated connector listing, EIS connectors are
returned with inference endpoint IDs (e.g. anthropic-claude-3.7-sonnet-
chat_completion) as their connectorId rather than Kibana saved object
IDs. Attack Discovery and Defend Insights passed these IDs to
ActionsClientLlm which used actionsClient.execute(), failing with "Saved
object [action/...] not found".

This PR uses a new path when actionTypeId is .inference and an inference
client is available. ActionsClientLlm uses its existing
isInferenceEndpoint path which calls inferenceClient.chatComplete()
instead of actionsClient.execute().
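The branch described above amounts to an executor selection like this (names are illustrative; the actual ActionsClientLlm wiring differs):

```typescript
type Executor = 'inferenceClient.chatComplete' | 'actionsClient.execute';

// Route .inference connectors to the inference client when one is
// available, so inference endpoint IDs never hit the saved-object
// actions path ("Saved object [action/...] not found").
function selectExecutor(actionTypeId: string, hasInferenceClient: boolean): Executor {
  if (actionTypeId === '.inference' && hasInferenceClient) {
    return 'inferenceClient.chatComplete';
  }
  return 'actionsClient.execute';
}
```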

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
e40pud pushed a commit to e40pud/kibana that referenced this pull request Mar 30, 2026
…tic#260268)

pheyos pushed a commit that referenced this pull request Mar 31, 2026
) (#260345)

## Summary

Cherry pick #260268 to
`deploy-fix@1774851440`


cc @sphilipse @pheyos

Co-authored-by: Sander Philipse <94373878+sphilipse@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Apr 1, 2026
…ector resolution (elastic#259446)

Closes elastic#259472

## Summary

Fixes two issues breaking `kbn-evals` runs (both local and CI):

### 1. APM / OpenTelemetry tracing conflict
A recent validation in `initTelemetry`
(elastic#258303,
elastic#258663) throws when Elastic APM
and OpenTelemetry tracing are both active. The `evals_tracing` Scout
config enables OTel tracing but didn't explicitly disable APM, causing
Kibana (and the Playwright worker) to crash on startup.

Fix:
- Added a `coerceCliValue` helper in `applyConfigOverrides`
(`kbn-apm-config-loader`) that converts 'true'/'false' to booleans and
numeric strings to numbers before they're set in the config object.
- Added `--elastic.apm.active=false` and
`--elastic.apm.contextPropagationOnly=false` to the `evals_tracing`
Scout server config and to `require_init_apm.js` (for the Playwright
worker when `TRACING_EXPORTERS` is set).
- Updated the `kbn-evals` README to document the required APM settings
when configuring tracing in `kibana.dev.yml`.
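
A plausible sketch of the `coerceCliValue` helper described above: CLI override values arrive as strings, so `'true'`/`'false'` become booleans and numeric strings become numbers before they are set on the config object. The real signature and edge-case handling in `kbn-apm-config-loader` may differ.

```typescript
// Sketch under assumption: coerce string CLI override values so that
// `--elastic.apm.active=false` sets a real boolean, not the string 'false'.
function coerceCliValue(value: string): string | number | boolean {
  if (value === 'true') return true;
  if (value === 'false') return false;
  // Coerce plain numeric strings ('8080', '1.5'); Number('') is 0, so
  // guard against empty/whitespace-only input.
  if (value.trim() !== '' && !Number.isNaN(Number(value))) return Number(value);
  return value;
}
```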

### 2. Inference endpoint connector resolution
elastic#258530 consolidated LLM connector
listing through the inference plugin's `getConnectorList()`, which now
returns inference endpoint IDs (e.g.:
`.anthropic-claude-4.6-opus-chat_completion`) instead of Kibana stack
connector keys (e.g.: `elastic-llm-claude-46-opus`). `kbn-evals` was
still passing the stack connector key to the inference API, which then
tried to execute it as a Kibana action, resulting in "Saved object
`[action/.anthropic-claude-4.6-opus-chat_completion]` not found".

Fix:
- `createConnectorFixture` now detects `.inference-type` connectors and
extracts their `inferenceId` from the config, using the ES inference
endpoint ID directly. This bypasses the Kibana actions framework and
aligns with the unified connector model from
[elastic#258530](elastic#258530).
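
The fixture logic described above can be sketched like this. The `StackConnector` shape and the helper name are assumptions for illustration; the real `createConnectorFixture` works on Kibana's actual connector types.

```typescript
// Illustrative sketch: for `.inference` stack connectors, use the ES
// inference endpoint ID from the connector config instead of the Kibana
// connector key, bypassing the actions framework entirely.
interface StackConnector {
  id: string;
  actionTypeId: string;
  config?: { inferenceId?: string };
}

function resolveConnectorIdForInference(connector: StackConnector): string {
  if (connector.actionTypeId === '.inference' && connector.config?.inferenceId) {
    // Pass the endpoint ID straight to the inference API.
    return connector.config.inferenceId;
  }
  // Non-inference connectors keep their Kibana connector ID.
  return connector.id;
}
```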

### Checklist

- [x] [Unit or functional
tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html)
were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section,
and the correct `release_note:*` label is applied per the
[guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport
guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing)
and apply applicable `backport:*` labels.
jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Apr 1, 2026
…oints (elastic#259656)

Closes elastic#259641

## Summary

When a stack connector ID (e.g.: a preconfigured `.inference` connector)
is passed to the inference plugin's `chatComplete` API,
`getConnectorById` may resolve it to an `InferenceConnector` with
`isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only
checked the `endpointIdCache` to decide whether to use the inference
endpoint execution path. If the cache didn't contain the ID, it fell
through to the stack connector adapter path, which then failed because
there's no adapter for the `.inference` connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found`
error when callers (e.g.: `kbn-evals`) passed preconfigured `.inference`
connector IDs, because the code attempted to execute the ES inference
endpoint ID as a Kibana saved-object action.

### Changes
- `callback_api.ts`: After `getInferenceExecutor` resolves the connector
in the stack connector branch, check `connector.isInferenceEndpoint`. If
true, redirect to the inference endpoint execution path
(`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` +
`inferenceEndpointAdapter`).
- `api.test.ts`: Added tests covering the "stack connector resolving to
inference endpoint" path.
- `create_connector_fixture.ts` (`kbn-evals`): Removed the client-side
workaround that was extracting inferenceId from `.inference` connectors
- no longer needed now that the inference plugin handles this
server-side.
- `create_connector_fixture.test.ts`: Removed corresponding workaround
tests.
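
A simplified sketch of the fixed resolution logic: the inference endpoint path is now taken either on an `endpointIdCache` hit or when the resolved connector reports `isInferenceEndpoint`, instead of relying on the cache alone. Names and shapes here are illustrative, not the actual `callback_api.ts` code.

```typescript
// Sketch under assumption: decide which execution path to take for a
// connector ID, mirroring the fix described above.
interface ResolvedConnector {
  connectorId: string;
  isInferenceEndpoint: boolean;
}

function pickExecutionPath(
  connectorId: string,
  endpointIdCache: Set<string>,
  resolve: (id: string) => ResolvedConnector
): 'inference-endpoint' | 'stack-connector' {
  // Fast path: the ID is already known to be an inference endpoint.
  if (endpointIdCache.has(connectorId)) return 'inference-endpoint';
  const connector = resolve(connectorId);
  // The fix: a cache miss no longer falls straight through to the stack
  // connector adapter when the resolved connector is an inference endpoint.
  if (connector.isInferenceEndpoint) return 'inference-endpoint';
  return 'stack-connector';
}
```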

## Related
- elastic#258530 introduced unified connector listing via
`getConnectorList()`/`getConnectorById()`, which returns inference
endpoints with `isInferenceEndpoint: true`
- elastic#259446 added a temporary client-side workaround in `kbn-evals` (now
removed by this PR)

### Checklist

- [x] [Unit or functional
tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html)
were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section,
and the correct `release_note:*` label is applied per the
[guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport
guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing)
and apply applicable `backport:*` labels.
jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Apr 1, 2026
…ent Builder (elastic#259840)

## Summary

Aligns Automatic Migrations (SIEM Migrations) with the same connector
resolution scheme used by Assistant and Agent Builder.

After elastic#258530, Automatic Migrations was still using the old connector
loading path (`loadAllActions` from triggers-actions-ui), which returns
preconfigured connector YAML keys (e.g., `Anthropic-Claude-Opus-4-6`) as
connector IDs. This meant the `connectorId` passed to
[`naturalLanguageToEsql`](https://github.com/elastic/kibana/blob/main/x-pack/solutions/security/plugins/security_solution/server/lib/siem_migrations/common/task/util/esql_knowledge_base.ts#L24)
was a `model_id` (old scheme) instead of the `inference_id`, resulting
in a `Saved object [action/.anthropic-claude-4.6-opus] not found` error.

This PR switches Automatic Migrations to use the inference plugin's
connector resolution (same as Assistant and Agent Builder), which
returns `inference_id`-based connector IDs.

## Changes

- **Frontend:** Replaced `useAIConnectors` / `loadAiConnectors` with
`useLoadConnectors` / `loadConnectorsForFeature` from
`@kbn/inference-connectors` (scoped to `featureId: 'siem_migrations'`)
- **Backend:** Replaced `actionsClient.get()` with
`inferenceClient.getConnectorById()` in rules and dashboards start
routes
- **`@kbn/inference-connectors`:** Extracted `loadConnectorsForFeature`
as a reusable stateless function from `useLoadConnectors`
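
The extraction described above can be sketched as follows: the hook becomes a thin wrapper around a reusable stateless function that can also be called outside React. The fetcher parameter and `InferenceConnector` shape are assumptions for illustration, not the actual `@kbn/inference-connectors` API.

```typescript
// Sketch under assumption: a stateless loader that fetches connectors for a
// feature and deduplicates by connectorId, since stack connectors and native
// ES endpoints may overlap in the merged list.
interface InferenceConnector {
  connectorId: string;
  name: string;
}

type Fetcher = (featureId: string) => InferenceConnector[];

function loadConnectorsForFeature(featureId: string, fetch: Fetcher): InferenceConnector[] {
  const seen = new Set<string>();
  return fetch(featureId).filter((c) => {
    if (seen.has(c.connectorId)) return false;
    seen.add(c.connectorId);
    return true;
  });
}
```

A `useLoadConnectors`-style hook would then just call this function inside its effect, keeping the dedup logic in one place.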

## Test plan
- [ ] Select a preconfigured inference connector in the SIEM Migrations
onboarding card
- [ ] Start a rule migration — should succeed without "saved object not
found" error
- [ ] Start a dashboard migration — same
- [ ] Verify connector selection persists across modal reopens (via
localStorage)

--
Made with Cursor

---------

Co-authored-by: Alex Szabo <alex.szabo@elastic.co>
jeramysoucy pushed a commit to jeramysoucy/kibana that referenced this pull request Apr 1, 2026
…tic#260268)

paulinashakirova pushed a commit to paulinashakirova/kibana that referenced this pull request Apr 2, 2026
…ector resolution (elastic#259446)

paulinashakirova pushed a commit to paulinashakirova/kibana that referenced this pull request Apr 2, 2026
…oints (elastic#259656)

paulinashakirova pushed a commit to paulinashakirova/kibana that referenced this pull request Apr 2, 2026
…ent Builder (elastic#259840)

paulinashakirova pushed a commit to paulinashakirova/kibana that referenced this pull request Apr 2, 2026
…tic#260268)


Labels

agent-builder:skip-smoke-tests backport:skip This PR does not require backporting ci:project-deploy-observability Create an Observability project release_note:feature Makes this part of the condensed release notes Team:obs-ai Observability AI team Team:One Workflow Team label for One Workflow (Workflow automation) Team:Search v9.4.0
