[Inference] Consolidate LLM connector listing via inference plugin#258530
sphilipse merged 49 commits into `elastic:main`
Conversation
Moves all Kibana features that list LLM connectors to go through the inference plugin's `getConnectorList()`, which now returns both Kibana stack connectors and native Elasticsearch inference endpoints in a unified `InferenceConnector[]` format.

Changes:

- inference-common: adds `isPreconfigured` to the `InferenceConnector` and `RawConnector` interfaces; exposes an `isInferenceEndpoint` flag for native ES endpoints
- inference plugin: `getConnectorList()` now fetches both stack connectors (via the actions plugin) and `chat_completion` ES inference endpoints, merging them into a single list
- New `@kbn/inference-connectors` package: shared `useLoadConnectors` hook that fetches and deduplicates connectors and native endpoints for frontend use
- Simplified server-side connector routes in observability_ai_assistant, gen_ai_settings, and streams to delegate to `inferenceStart.getConnectorList()` instead of duplicating filtering logic
- Updated client-side connector selectors in agent_builder, automatic_import, workplace_ai_app, and security_solution to use `useLoadConnectors` from `@kbn/inference-connectors`
- Fixed kbn-ai-assistant's chat_actions_menu and chat_body to use `connectorId` (`InferenceConnector`) instead of `id` (`ActionConnector`), which had broken the connector selector display and selection

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
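The merge that `getConnectorList()` is described to perform can be sketched as follows. This is an illustrative sketch, not the actual plugin code: the input shapes and any field not named in the description above (`connectorId`, `isPreconfigured`, `isInferenceEndpoint`) are assumptions.

```typescript
// Hypothetical unified shape approximating InferenceConnector.
interface InferenceConnector {
  connectorId: string;
  name: string;
  isPreconfigured: boolean;
  isInferenceEndpoint: boolean;
}

// Sketch: merge stack connectors (from the actions plugin) and
// chat_completion ES inference endpoints into one list.
function mergeConnectorList(
  stackConnectors: Array<{ id: string; name: string; isPreconfigured: boolean }>,
  inferenceEndpoints: Array<{ inference_id: string }>
): InferenceConnector[] {
  const fromStack: InferenceConnector[] = stackConnectors.map((c) => ({
    connectorId: c.id,
    name: c.name,
    isPreconfigured: c.isPreconfigured,
    isInferenceEndpoint: false,
  }));
  // Native endpoints carry their ES inference endpoint ID as connectorId.
  const fromEndpoints: InferenceConnector[] = inferenceEndpoints.map((e) => ({
    connectorId: e.inference_id,
    name: e.inference_id,
    isPreconfigured: true,
    isInferenceEndpoint: true,
  }));
  return [...fromStack, ...fromEndpoints];
}
```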
…258530)

## Summary

Moves all Kibana features that list LLM connectors to go through the inference plugin's `getConnectorList()`, which now returns both Kibana stack connectors and native Elasticsearch inference endpoints in a unified `InferenceConnector[]` format. This makes sure the whole Kibana platform consumes and lists connectors and inference endpoints in the same way, which is part of our efforts to migrate to a unified model settings page.

Changes:

- inference-common: adds `isPreconfigured` to the `InferenceConnector` and `RawConnector` interfaces; exposes an `isInferenceEndpoint` flag for native ES endpoints
- inference plugin: `getConnectorList()` now fetches both stack connectors (via the actions plugin) and `chat_completion` ES inference endpoints, merging them into a single list
- New `@kbn/inference-connectors` package: shared `useLoadConnectors` hook that fetches and deduplicates connectors and native endpoints for frontend use
- Simplified server-side connector routes in observability_ai_assistant, gen_ai_settings, and streams to delegate to `inferenceStart.getConnectorList()` instead of duplicating filtering logic
- Updated client-side connector selectors in agent_builder, automatic_import, workplace_ai_app, and security_solution to use `useLoadConnectors` from `@kbn/inference-connectors`
- Fixed kbn-ai-assistant's chat_actions_menu and chat_body to use `connectorId` (`InferenceConnector`) instead of `id` (`ActionConnector`), which had broken the connector selector display and selection

### Checklist

Check the PR satisfies the following conditions. Reviewers should verify this PR satisfies this list as well.

- [x] Any text added follows [EUI's writing guidelines](https://elastic.github.io/eui/#/guidelines/writing), uses sentence case text, and includes [i18n support](https://github.com/elastic/kibana/blob/main/src/platform/packages/shared/kbn-i18n/README.md)
- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
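The `useLoadConnectors` hook described above fetches and deduplicates connectors and native endpoints. The deduplication step might look like the following minimal sketch; the shape and the first-occurrence-wins policy are assumptions, not the package's actual implementation.

```typescript
// Minimal connector shape for illustration.
interface ConnectorLike {
  connectorId: string;
  name: string;
}

// Keep the first occurrence of each connectorId; when a stack connector
// and a native endpoint surface the same ID, only one entry survives.
function dedupeConnectors(connectors: ConnectorLike[]): ConnectorLike[] {
  const seen = new Set<string>();
  return connectors.filter((c) => {
    if (seen.has(c.connectorId)) return false;
    seen.add(c.connectorId);
    return true;
  });
}
```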
…ector resolution (#259446)

Closes #259472

## Summary

Fixes two issues breaking `kbn-evals` runs (both local and CI):

### 1. APM / OpenTelemetry tracing conflict

A recent validation in `initTelemetry` (#258303, #258663) throws when Elastic APM and OpenTelemetry tracing are both active. The `evals_tracing` Scout config enables OTel tracing but didn't explicitly disable APM, causing Kibana (and the Playwright worker) to crash on startup.

Fix:

- Added a `coerceCliValue` helper in `applyConfigOverrides` (`kbn-apm-config-loader`) that converts `'true'`/`'false'` to booleans and numeric strings to numbers before they're set in the config object.
- Added `--elastic.apm.active=false` and `--elastic.apm.contextPropagationOnly=false` to the `evals_tracing` Scout server config and to `require_init_apm.js` (for the Playwright worker when `TRACING_EXPORTERS` is set).
- Updated the `kbn-evals` README to document the required APM settings when configuring tracing in `kibana.dev.yml`.

### 2. Inference endpoint connector resolution

#258530 consolidated LLM connector listing through the inference plugin's `getConnectorList()`, which now returns inference endpoint IDs (e.g. `.anthropic-claude-4.6-opus-chat_completion`) instead of Kibana stack connector keys (e.g. `elastic-llm-claude-46-opus`). `kbn-evals` was still passing the stack connector key to the inference API, which then tried to execute it as a Kibana action, resulting in "Saved object `[action/.anthropic-claude-4.6-opus-chat_completion]` not found".

Fix:

- `createConnectorFixture` now detects `.inference`-type connectors and extracts their `inferenceId` from the config, using the ES inference endpoint ID directly. This bypasses the Kibana actions framework and aligns with the unified connector model from [#258530](#258530).

### Checklist

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
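The `coerceCliValue` behavior described above (string CLI overrides like `'false'` or `'5000'` coerced before being set on the APM config object) can be sketched as follows. The exact signature and edge-case handling in `kbn-apm-config-loader` are assumptions.

```typescript
// Sketch: coerce CLI override strings into typed config values.
function coerceCliValue(value: string): string | number | boolean {
  if (value === 'true') return true;
  if (value === 'false') return false;
  // Coerce purely numeric strings; leave everything else untouched.
  if (value.trim() !== '' && !Number.isNaN(Number(value))) {
    return Number(value);
  }
  return value;
}
```

With this in place, `--elastic.apm.active=false` lands in the config as the boolean `false` rather than the truthy string `'false'`, which is what makes the APM-disable flags above actually take effect.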
…oints (#259656)

Closes #259641

## Summary

When a stack connector ID (e.g. a preconfigured `.inference` connector) is passed to the inference plugin's `chatComplete` API, `getConnectorById` may resolve it to an `InferenceConnector` with `isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only checked the `endpointIdCache` to decide whether to use the inference endpoint execution path. If the cache didn't contain the ID, it fell through to the stack connector adapter path, which then failed because there's no adapter for the `.inference` connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found` error when callers (e.g. `kbn-evals`) passed preconfigured `.inference` connector IDs, because the code attempted to execute the ES inference endpoint ID as a Kibana saved-object action.

### Changes

- `callback_api.ts`: after `getInferenceExecutor` resolves the connector in the stack connector branch, check `connector.isInferenceEndpoint`. If true, redirect to the inference endpoint execution path (`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` + `inferenceEndpointAdapter`).
- `api.test.ts`: added tests covering the "stack connector resolving to inference endpoint" path.
- `create_connector_fixture.ts` (`kbn-evals`): removed the client-side workaround that extracted `inferenceId` from `.inference` connectors; no longer needed now that the inference plugin handles this server-side.
- `create_connector_fixture.test.ts`: removed the corresponding workaround tests.

## Related

- #258530 introduced unified connector listing via `getConnectorList()`/`getConnectorById()`, which returns inference endpoints with `isInferenceEndpoint: true`
- #259446 added a temporary client-side workaround in `kbn-evals` (now removed by this PR)

### Checklist

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
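The routing decision this PR fixes reduces to one check after connector resolution. The sketch below illustrates that check in isolation; the names are illustrative, not the real plugin internals, which also consult an endpoint-ID cache first.

```typescript
// Minimal resolved-connector shape for illustration.
interface ResolvedConnector {
  connectorId: string;
  isInferenceEndpoint: boolean;
}

// After resolution, route inference endpoints to the endpoint executor
// instead of falling through to the stack connector adapters (which
// have no adapter for the .inference type).
function pickExecutionPath(
  connector: ResolvedConnector
): 'inference_endpoint' | 'stack_adapter' {
  return connector.isInferenceEndpoint ? 'inference_endpoint' : 'stack_adapter';
}
```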
…ent Builder (#259840)

## Summary

Aligns Automatic Migrations (SIEM Migrations) with the same connector resolution scheme used by Assistant and Agent Builder.

After #258530, Automatic Migrations was still using the old connector loading path (`loadAllActions` from triggers-actions-ui), which returns preconfigured connector YAML keys (e.g. `Anthropic-Claude-Opus-4-6`) as connector IDs. This meant the `connectorId` passed to [`naturalLanguageToEsql`](https://github.com/elastic/kibana/blob/main/x-pack/solutions/security/plugins/security_solution/server/lib/siem_migrations/common/task/util/esql_knowledge_base.ts#L24) was a `model_id` (old scheme) instead of the `inference_id`, resulting in a `Saved object [action/.anthropic-claude-4.6-opus] not found` error.

This PR switches Automatic Migrations to use the inference plugin's connector resolution (same as Assistant and Agent Builder), which returns `inference_id`-based connector IDs.

## Changes

- **Frontend:** replaced `useAIConnectors` / `loadAiConnectors` with `useLoadConnectors` / `loadConnectorsForFeature` from `@kbn/inference-connectors` (scoped to `featureId: 'siem_migrations'`)
- **Backend:** replaced `actionsClient.get()` with `inferenceClient.getConnectorById()` in the rules and dashboards start routes
- **`@kbn/inference-connectors`:** extracted `loadConnectorsForFeature` as a reusable stateless function from `useLoadConnectors`

## Test plan

- [ ] Select a preconfigured inference connector in the SIEM Migrations onboarding card
- [ ] Start a rule migration; it should succeed without a "saved object not found" error
- [ ] Start a dashboard migration; same
- [ ] Verify connector selection persists across modal reopens (via localStorage)

-- Made with Cursor

---------

Co-authored-by: Alex Szabo <alex.szabo@elastic.co>
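`loadConnectorsForFeature` is described above as a stateless function scoped to a `featureId`. One way such scoping could work is a plain filter over the unified list; this is purely a guess at the mechanism, and the `features` field here is an invented stand-in for whatever metadata the real package uses.

```typescript
// Hypothetical connector shape with an assumed per-feature allowlist.
interface ScopedConnector {
  connectorId: string;
  name: string;
  features?: string[];
}

// Sketch: connectors with no explicit feature list are visible to every
// feature; otherwise the featureId must appear in the list.
function filterConnectorsForFeature(
  connectors: ScopedConnector[],
  featureId: string
): ScopedConnector[] {
  return connectors.filter((c) => !c.features || c.features.includes(featureId));
}
```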
After PR elastic#258530 consolidated connector listing, EIS connectors are returned with inference endpoint IDs (e.g. `anthropic-claude-3.7-sonnet-chat_completion`) as their `connectorId` rather than Kibana saved object IDs. Attack Discovery and Defend Insights passed these IDs to `ActionsClientLlm`, which used `actionsClient.execute()`, failing with "Saved object [action/...] not found".

Fix: thread an `InferenceClient` from the route handlers to `ActionsClientLlm`. When `actionTypeId` is `.inference` and an inference client is available, `ActionsClientLlm` uses its existing `isInferenceEndpoint` path, which calls `inferenceClient.chatComplete()` instead of `actionsClient.execute()`, correctly handling inference endpoint IDs.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
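The condition described above gates which client executes the call. A minimal sketch of that decision, with illustrative names rather than the actual `ActionsClientLlm` internals:

```typescript
// Sketch: choose the execution client for an LLM call. Inference-type
// connectors go through the inference client's chatComplete path when
// one is available; everything else stays on actionsClient.execute().
function chooseLlmPath(opts: {
  actionTypeId: string;
  hasInferenceClient: boolean;
}): 'inference_chat_complete' | 'actions_execute' {
  return opts.actionTypeId === '.inference' && opts.hasInferenceClient
    ? 'inference_chat_complete'
    : 'actions_execute';
}
```

Note the fallback: without an inference client, `.inference` connectors still take the actions path, so threading the client through the route handlers is what makes the fix effective.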
) (#260345)

## Summary

Cherry pick #260268 to `deploy-fix@1774851440`

After PR #258530 consolidated connector listing, EIS connectors are returned with inference endpoint IDs (e.g. `anthropic-claude-3.7-sonnet-chat_completion`) as their `connectorId` rather than Kibana saved object IDs. Attack Discovery and Defend Insights passed these IDs to `ActionsClientLlm`, which used `actionsClient.execute()`, failing with "Saved object [action/...] not found".

This PR uses a new path when `actionTypeId` is `.inference` and an inference client is available: `ActionsClientLlm` uses its existing `isInferenceEndpoint` path, which calls `inferenceClient.chatComplete()` instead of `actionsClient.execute()`.

cc @sphilipse @pheyos

Co-authored-by: Sander Philipse <94373878+sphilipse@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
…ector resolution (elastic#259446) Closes elastic#259472 ## Summary Fixes two issues breaking `kbn-evals` runs (both local and CI): ### 1. APM / OpenTelemetry tracing conflict A recent validation in `initTelemetry` (elastic#258303, elastic#258663) throws when Elastic APM and OpenTelemetry tracing are both active. The `evals_tracing` Scout config enables OTel tracing but didn't explicitly disable APM, causing Kibana (and the Playwright worker) to crash on startup. Fix: - Added a `coerceCliValue` helper in `applyConfigOverrides` (`kbn-apm-config-loader`) that converts 'true'/'false' to booleans and numeric strings to numbers before they're set in the config object. - Added `--elastic.apm.active=false` and `--elastic.apm.contextPropagationOnly=false` to the `evals_tracing` Scout server config and to `require_init_apm.js` (for the Playwright worker when `TRACING_EXPORTERS` is set). - Updated the `kbn-evals` README to document the required APM settings when configuring tracing in `kibana.dev.yml`. ### 2. Inference endpoint connector resolution elastic#258530 consolidated LLM connector listing through the inference plugin's `getConnectorList()`, which now returns inference endpoint IDs (e.g.: `.anthropic-claude-4.6-opus-chat_completion`) instead of Kibana stack connector keys (e.g.: `elastic-llm-claude-46-opus`). `kbn-evals` was still passing the stack connector key to the inference API, which then tried to execute it as a Kibana action - resulting in "Saved object `[action/.anthropic-claude-4.6-opus-chat_completion]` not found". Fix: - `createConnectorFixture` now detects `.inference-type` connectors and extracts their `inferenceId` from the config, using the ES inference endpoint ID directly. This bypasses the Kibana actions framework and aligns with the unified connector model from [elastic#258530](elastic#258530). 
### Checklist - [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios - [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process) - [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
…oints (elastic#259656) Closes elastic#259641 ## Summary When a stack connector ID (e.g.: a preconfigured `.inference` connector) is passed to the inference plugin's `chatComplete` API, `getConnectorById` may resolve it to an `InferenceConnector` with `isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only checked the `endpointIdCache` to decide whether to use the inference endpoint execution path. If the cache didn't contain the ID, it fell through to the stack connector adapter path, which then failed because there's no adapter for the .inference connector type. This caused a `Saved object [action/<inference-endpoint-id>] not found` error when callers (e.g.: `kbn-evals`) passed preconfigured `.inference` connector IDs, because the code attempted to execute the ES inference endpoint ID as a Kibana saved-object action. ### Changes - `callback_api.ts`: After `getInferenceExecutor` resolves the connector in the stack connector branch, check `connector.isInferenceEndpoint`. If true, redirect to the inference endpoint execution path (`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` + `inferenceEndpointAdapter`). - `api.test.ts`: Added tests covering the "stack connector resolving to inference endpoint" path. - `create_connector_fixture.ts` (`kbn-evals`): Removed the client-side workaround that was extracting inferenceId from `.inference` connectors - no longer needed now that the inference plugin handles this server-side. - `create_connector_fixture.test.ts`: Removed corresponding workaround tests. 
## Related - elastic#258530 introduced unified connector listing via `getConnectorList()`/`getConnectorById()`, which returns inference endpoints with `isInferenceEndpoint: true` - elastic#259446 added a temporary client-side workaround in `kbn-evals` (now removed by this PR) ### Checklist - [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios - [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process) - [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
…ent Builder (elastic#259840) ## Summary Aligns Automatic Migrations (SIEM Migrations) with the same connector resolution scheme used by Assistant and Agent Builder. After elastic#258530, Automatic Migrations was still using the old connector loading path (`loadAllActions` from triggers-actions-ui), which returns preconfigured connector YAML keys (e.g., `Anthropic-Claude-Opus-4-6`) as connector IDs. This meant the `connectorId` passed to [`naturalLanguageToEsql`](https://github.com/elastic/kibana/blob/main/x-pack/solutions/security/plugins/security_solution/server/lib/siem_migrations/common/task/util/esql_knowledge_base.ts#L24) was a `model_id` (old scheme) instead of the `inference_id`, resulting in a `Saved object [action/.anthropic-claude-4.6-opus] not found` error. This PR switches Automatic Migrations to use the inference plugin's connector resolution (same as Assistant and Agent Builder), which returns `inference_id`-based connector IDs. ## Changes - **Frontend:** Replaced `useAIConnectors` / `loadAiConnectors` with `useLoadConnectors` / `loadConnectorsForFeature` from `@kbn/inference-connectors` (scoped to `featureId: 'siem_migrations'`) - **Backend:** Replaced `actionsClient.get()` with `inferenceClient.getConnectorById()` in rules and dashboards start routes - **`@kbn/inference-connectors`:** Extracted `loadConnectorsForFeature` as a reusable stateless function from `useLoadConnectors` ## Test plan - [ ] Select a preconfigured inference connector in the SIEM Migrations onboarding card - [ ] Start a rule migration — should succeed without "saved object not found" error - [ ] Start a dashboard migration — same - [ ] Verify connector selection persists across modal reopens (via localStorage) -- Made with Cursor --------- Co-authored-by: Alex Szabo <alex.szabo@elastic.co>
…ector resolution (elastic#259446)

Closes elastic#259472

## Summary

Fixes two issues breaking `kbn-evals` runs (both local and CI):

### 1. APM / OpenTelemetry tracing conflict

A recent validation in `initTelemetry` (elastic#258303, elastic#258663) throws when Elastic APM and OpenTelemetry tracing are both active. The `evals_tracing` Scout config enables OTel tracing but didn't explicitly disable APM, causing Kibana (and the Playwright worker) to crash on startup.

Fix:
- Added a `coerceCliValue` helper in `applyConfigOverrides` (`kbn-apm-config-loader`) that converts 'true'/'false' to booleans and numeric strings to numbers before they're set in the config object.
- Added `--elastic.apm.active=false` and `--elastic.apm.contextPropagationOnly=false` to the `evals_tracing` Scout server config and to `require_init_apm.js` (for the Playwright worker when `TRACING_EXPORTERS` is set).
- Updated the `kbn-evals` README to document the required APM settings when configuring tracing in `kibana.dev.yml`.

### 2. Inference endpoint connector resolution

elastic#258530 consolidated LLM connector listing through the inference plugin's `getConnectorList()`, which now returns inference endpoint IDs (e.g. `.anthropic-claude-4.6-opus-chat_completion`) instead of Kibana stack connector keys (e.g. `elastic-llm-claude-46-opus`). `kbn-evals` was still passing the stack connector key to the inference API, which then tried to execute it as a Kibana action, resulting in a "Saved object `[action/.anthropic-claude-4.6-opus-chat_completion]` not found" error.

Fix:
- `createConnectorFixture` now detects `.inference`-type connectors and extracts their `inferenceId` from the config, using the ES inference endpoint ID directly. This bypasses the Kibana actions framework and aligns with the unified connector model from [elastic#258530](elastic#258530).
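The fixture workaround can be sketched as follows. This is a minimal TypeScript sketch: the connector shape and field names are assumptions for illustration, not the actual `kbn-evals` types.

```typescript
// Hypothetical connector shape; the real kbn-evals fixture types differ.
interface StackConnector {
  id: string;
  actionTypeId: string;
  config?: { inferenceId?: string };
}

// For `.inference` connectors, use the underlying ES inference endpoint ID
// directly so execution bypasses the Kibana actions framework.
function resolveEvalConnectorId(connector: StackConnector): string {
  if (connector.actionTypeId === '.inference' && connector.config?.inferenceId) {
    return connector.config.inferenceId;
  }
  return connector.id;
}
```

Non-`.inference` connectors keep their Kibana stack connector ID, so only the inference path changes behavior.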
### Checklist

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
…oints (elastic#259656)

Closes elastic#259641

## Summary

When a stack connector ID (e.g. a preconfigured `.inference` connector) is passed to the inference plugin's `chatComplete` API, `getConnectorById` may resolve it to an `InferenceConnector` with `isInferenceEndpoint: true`. Previously, `resolveAndCreatePipeline` only checked the `endpointIdCache` to decide whether to use the inference endpoint execution path. If the cache didn't contain the ID, it fell through to the stack connector adapter path, which then failed because there is no adapter for the `.inference` connector type.

This caused a `Saved object [action/<inference-endpoint-id>] not found` error when callers (e.g. `kbn-evals`) passed preconfigured `.inference` connector IDs, because the code attempted to execute the ES inference endpoint ID as a Kibana saved-object action.

### Changes

- `callback_api.ts`: After `getInferenceExecutor` resolves the connector in the stack connector branch, check `connector.isInferenceEndpoint`. If true, redirect to the inference endpoint execution path (`resolveInferenceEndpoint` + `createInferenceEndpointExecutor` + `inferenceEndpointAdapter`).
- `api.test.ts`: Added tests covering the "stack connector resolving to inference endpoint" path.
- `create_connector_fixture.ts` (`kbn-evals`): Removed the client-side workaround that was extracting `inferenceId` from `.inference` connectors - no longer needed now that the inference plugin handles this server-side.
- `create_connector_fixture.test.ts`: Removed the corresponding workaround tests.
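The routing decision in the `callback_api.ts` change can be sketched roughly as below. Names and shapes are hypothetical simplifications; the real pipeline-creation code in the inference plugin differs.

```typescript
// Hypothetical simplification of the resolution flow described above.
interface ResolvedConnector {
  connectorId: string;
  isInferenceEndpoint: boolean;
}

type ExecutionPath = 'inference-endpoint' | 'stack-connector-adapter';

// Previously only the endpoint ID cache was consulted; now the resolved
// connector's `isInferenceEndpoint` flag also redirects execution, so an
// ES inference endpoint ID is never executed as a saved-object action.
function pickExecutionPath(
  endpointIdCache: Set<string>,
  connector: ResolvedConnector
): ExecutionPath {
  if (endpointIdCache.has(connector.connectorId) || connector.isInferenceEndpoint) {
    return 'inference-endpoint';
  }
  return 'stack-connector-adapter';
}
```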
## Related

- elastic#258530 introduced unified connector listing via `getConnectorList()`/`getConnectorById()`, which returns inference endpoints with `isInferenceEndpoint: true`
- elastic#259446 added a temporary client-side workaround in `kbn-evals` (now removed by this PR)

### Checklist

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [x] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.
…ent Builder (elastic#259840)

## Summary

Aligns Automatic Migrations (SIEM Migrations) with the same connector resolution scheme used by Assistant and Agent Builder.

After elastic#258530, Automatic Migrations was still using the old connector loading path (`loadAllActions` from triggers-actions-ui), which returns preconfigured connector YAML keys (e.g. `Anthropic-Claude-Opus-4-6`) as connector IDs. This meant the `connectorId` passed to [`naturalLanguageToEsql`](https://github.com/elastic/kibana/blob/main/x-pack/solutions/security/plugins/security_solution/server/lib/siem_migrations/common/task/util/esql_knowledge_base.ts#L24) was a `model_id` (old scheme) instead of the `inference_id`, resulting in a `Saved object [action/.anthropic-claude-4.6-opus] not found` error.

This PR switches Automatic Migrations to use the inference plugin's connector resolution (same as Assistant and Agent Builder), which returns `inference_id`-based connector IDs.

## Changes

- **Frontend:** Replaced `useAIConnectors` / `loadAiConnectors` with `useLoadConnectors` / `loadConnectorsForFeature` from `@kbn/inference-connectors` (scoped to `featureId: 'siem_migrations'`)
- **Backend:** Replaced `actionsClient.get()` with `inferenceClient.getConnectorById()` in the rules and dashboards start routes
- **`@kbn/inference-connectors`:** Extracted `loadConnectorsForFeature` as a reusable stateless function from `useLoadConnectors`

## Test plan

- [ ] Select a preconfigured inference connector in the SIEM Migrations onboarding card
- [ ] Start a rule migration — should succeed without a "saved object not found" error
- [ ] Start a dashboard migration — same
- [ ] Verify connector selection persists across modal reopens (via localStorage)

-- Made with Cursor

---------

Co-authored-by: Alex Szabo <alex.szabo@elastic.co>
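The hook/function split described above can be sketched as follows. The endpoint path, query shape, and types are assumptions for illustration; the real `@kbn/inference-connectors` API differs.

```typescript
// Hypothetical shapes; real @kbn/inference-connectors types differ.
interface InferenceConnector {
  connectorId: string;
  name: string;
}

interface HttpGet {
  get: (path: string, options?: { query?: Record<string, string> }) => Promise<unknown>;
}

// Stateless loader, callable both from the `useLoadConnectors` React hook
// and from non-React code. The route path is an assumption.
async function loadConnectorsForFeature(
  http: HttpGet,
  featureId: string
): Promise<InferenceConnector[]> {
  const result = await http.get('/internal/inference/connectors', {
    query: { featureId },
  });
  return result as InferenceConnector[];
}
```

Keeping the fetch logic stateless means the hook only adds React state management on top, and server-adjacent callers avoid pulling in React.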
…tic#260268)

## Summary

After PR elastic#258530 consolidated connector listing, EIS connectors are returned with inference endpoint IDs (e.g. `anthropic-claude-3.7-sonnet-chat_completion`) as their `connectorId` rather than Kibana saved object IDs. Attack Discovery and Defend Insights passed these IDs to `ActionsClientLlm`, which used `actionsClient.execute()`, failing with `Saved object [action/...] not found`.

This PR uses a new path when `actionTypeId` is `.inference` and an inference client is available: `ActionsClientLlm` uses its existing `isInferenceEndpoint` path, which calls `inferenceClient.chatComplete()` instead of `actionsClient.execute()`.

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
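The execution-path choice can be sketched as a minimal TypeScript function. This is a hypothetical simplification of the routing described above; the real `ActionsClientLlm` internals differ.

```typescript
type ChatFn = (message: string) => Promise<string>;

// Hypothetical dependency bundle; real ActionsClientLlm wiring differs.
interface LlmDeps {
  actionTypeId: string;
  inferenceChatComplete?: ChatFn; // inference client path (chatComplete)
  actionsExecute: ChatFn; // Kibana actions framework path (execute)
}

// Route `.inference` connectors to the inference client when available,
// so inference endpoint IDs are never looked up as saved-object actions.
function chooseChatFn(deps: LlmDeps): ChatFn {
  if (deps.actionTypeId === '.inference' && deps.inferenceChatComplete) {
    return deps.inferenceChatComplete;
  }
  return deps.actionsExecute;
}
```

Note the fallback: without an inference client, `.inference` connectors still take the actions path, preserving the pre-existing behavior.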
## Summary

Moves all Kibana features that list LLM connectors to go through the inference plugin's `getConnectorList()`, which now returns both Kibana stack connectors and native Elasticsearch inference endpoints in a unified `InferenceConnector[]` format. This makes sure the whole Kibana platform consumes and lists connectors and inference endpoints in the same way, which is part of our efforts to migrate to a unified model settings page.
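The unified listing can be sketched as a merge of the two sources. This is a minimal sketch under assumed type shapes; the inference plugin's actual merge logic and interfaces differ.

```typescript
// Hypothetical shapes; real inference-common types differ.
interface InferenceConnector {
  connectorId: string;
  name: string;
  isPreconfigured: boolean;
  isInferenceEndpoint: boolean;
}

interface StackConnector {
  id: string;
  name: string;
  isPreconfigured: boolean;
}

interface InferenceEndpoint {
  inference_id: string;
  task_type: string;
}

// Merge Kibana stack connectors with chat_completion ES inference
// endpoints into a single InferenceConnector[] list.
function mergeConnectorList(
  stack: StackConnector[],
  endpoints: InferenceEndpoint[]
): InferenceConnector[] {
  const fromStack: InferenceConnector[] = stack.map((c) => ({
    connectorId: c.id,
    name: c.name,
    isPreconfigured: c.isPreconfigured,
    isInferenceEndpoint: false,
  }));
  const fromEndpoints: InferenceConnector[] = endpoints
    .filter((e) => e.task_type === 'chat_completion')
    .map((e) => ({
      connectorId: e.inference_id,
      name: e.inference_id,
      isPreconfigured: true,
      isInferenceEndpoint: true,
    }));
  return [...fromStack, ...fromEndpoints];
}
```

Consumers then branch on `isInferenceEndpoint` instead of guessing from the ID format, which is what the follow-up PRs above rely on.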
## Checklist

Check that the PR satisfies the following conditions. Reviewers should verify this PR satisfies this list as well.

- [ ] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [ ] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.