[Feature Request] Support ingest pipeline execution in pull-based ingestion #20875
Closed
Labels
Indexing, enhancement, lucene, untriaged
Description
Is your feature request related to a problem? Please describe
Implement final_pipeline execution in the pull-based ingestion path.
Describe the solution you'd like
Documents consumed from streaming sources (e.g., Kafka, Kinesis) should be transformed by the configured ingest pipelines before being written to Lucene.
Flow
Message → Mapper (extract _id, _version, _op_type) → Pipeline (transform _source) → Engine.Index
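The flow above can be sketched in Python. This is a hypothetical illustration of the proposed ordering, not actual OpenSearch code; the function names (`extract_metadata`, `run_pipeline`, `index_message`) and the mapper/processor behavior are assumptions modeled on the example below.

```python
# Hypothetical sketch of: Message → Mapper → Pipeline → Engine.Index
from datetime import datetime, timezone

def extract_metadata(message):
    # Mapper step: pull document metadata (_id, _op_type) out of the
    # raw streaming message, as a field_mapping mapper might.
    return {
        "_id": message.get("user_id"),
        "_op_type": message.get("_op_type", "index"),
    }

def run_pipeline(source):
    # Pipeline step: mimic the "set" and "rename" processors from the
    # example pipeline (enrich-docs).
    source = dict(source)
    source["ingested_at"] = datetime.now(timezone.utc).isoformat()
    if "old_name" in source:
        source["new_name"] = source.pop("old_name")
    return source

def index_message(message):
    # Engine.Index step stands in for the actual Lucene write: metadata
    # is extracted first, then the transformed _source is indexed.
    meta = extract_metadata(message)
    doc = run_pipeline(message)
    return meta, doc

meta, doc = index_message({"user_id": "u1", "old_name": "alice"})
```

The key ordering shown: the mapper reads metadata from the raw message before the pipeline mutates `_source`, so pipeline transforms cannot break `_id` extraction.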
Example
PUT _ingest/pipeline/enrich-docs
{
"processors": [
{ "set": { "field": "ingested_at", "value": "{{_ingest.timestamp}}" } },
{ "rename": { "field": "old_name", "target_field": "new_name" } }
]
}
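The pipeline above can be sanity-checked with the standard `_simulate` ingest API before attaching it to an index; the sample document here is illustrative.

```
POST _ingest/pipeline/enrich-docs/_simulate
{
  "docs": [
    { "_source": { "user_id": "u1", "old_name": "alice" } }
  ]
}
```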
PUT /my-index
{
"settings": {
"index.final_pipeline": "enrich-docs",
"ingestion_source": {
"type": "kafka",
"mapper_type": "field_mapping",
"mapper_settings.id_field": "user_id"
}
}
}
Related component
Indexing
Describe alternatives you've considered
No response
Additional context
No response