## Problem
After #258530, `getConnectorById` can resolve a Kibana stack connector ID (e.g. `elastic-llm-claude-46-opus`) to an inference endpoint. However, `resolveAndCreatePipeline` in `callback_api.ts` makes its executor routing decision before resolution, based on the raw `connectorId`:
```ts
return from(endpointIdCache.has(connectorId)).pipe(
  switchMap((isInferenceEndpoint) => {
    const resolve: () => Promise<ResolvedPipelineContext> = isInferenceEndpoint
```
When a stack connector ID is passed, `endpointIdCache.has()` returns `false` (the cache only contains ES inference endpoint IDs), so the actions-based path is chosen. `getConnectorById` then resolves it to an inference endpoint with `isInferenceEndpoint: true`, but `createInferenceExecutor` still calls `actionsClient.execute()` with the resolved endpoint ID, which doesn't exist as a Kibana saved object:
```
Saved object [action/.anthropic-claude-4.6-opus-chat_completion] not found
```
## Expected behavior
The inference plugin should accept both stack connector IDs and inference endpoint IDs. When a stack connector ID resolves to an inference endpoint, execution should be routed to `createInferenceEndpointExecutor` (the ES `_inference` API path) instead of `createInferenceExecutor` (the Kibana actions path).
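A minimal, self-contained sketch of what deferred routing could look like. All names here (`resolveConnector`, `chooseExecutor`, the stubbed connector data) are hypothetical illustrations, not the actual Kibana code; the real change would live in `resolveAndCreatePipeline`:

```ts
// Hypothetical model of the fix: pick the executor *after* resolution,
// based on the resolved connector's isInferenceEndpoint flag, instead of
// keying off the raw connectorId before resolution.

interface ResolvedConnector {
  connectorId: string;
  isInferenceEndpoint: boolean;
  // For inference endpoints, the ES _inference endpoint ID.
  endpointId?: string;
}

// Stub standing in for getConnectorById + endpointIdCache (illustrative data).
const stubConnectors: Record<string, ResolvedConnector> = {
  // A stack connector that resolves to an ES inference endpoint.
  'elastic-llm-claude-46-opus': {
    connectorId: 'elastic-llm-claude-46-opus',
    isInferenceEndpoint: true,
    endpointId: '.anthropic-claude-4.6-opus-chat_completion',
  },
  // A plain stack connector that should keep using the actions path.
  'my-openai-connector': {
    connectorId: 'my-openai-connector',
    isInferenceEndpoint: false,
  },
};

async function resolveConnector(connectorId: string): Promise<ResolvedConnector> {
  const resolved = stubConnectors[connectorId];
  if (!resolved) throw new Error(`Connector [${connectorId}] not found`);
  return resolved;
}

type ExecutorKind = 'inferenceEndpoint' | 'actions';

// Route only after resolution, so stack connector IDs that map to
// inference endpoints take the ES _inference path rather than
// actionsClient.execute() with a nonexistent saved-object ID.
async function chooseExecutor(connectorId: string): Promise<ExecutorKind> {
  const resolved = await resolveConnector(connectorId);
  return resolved.isInferenceEndpoint ? 'inferenceEndpoint' : 'actions';
}
```

With this shape, `chooseExecutor('elastic-llm-claude-46-opus')` resolves to `'inferenceEndpoint'` even though the caller passed a stack connector ID, which is the behavior described above.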
## Related
Current workaround: #259446. `kbn-evals` detects `.inference` connectors and swaps the ID to `config.inferenceId` before calling the inference API. This aligns with the migration direction but shouldn't be necessary; the inference plugin should handle both ID formats.