What is the bug?
I added a neural_query_enricher search request processor with a default model ID to my index's default search pipeline. This should make the model_id optional in neural search queries. However, for queries that wrap the neural clause in constant_score, the search fails with an error saying a model ID is required.
How can one reproduce the bug?
Steps to reproduce the behavior.
- Create a search pipeline with the neural_query_enricher request processor:
```
PUT _search/pipeline/my_search_pipeline
{
  "description": "example search pipeline",
  "request_processors": [
    {
      "neural_query_enricher": {
        "default_model_id": "some_model_id"
      }
    }
  ]
}
```
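To confirm the processor registered with the expected default model ID, the pipeline can be retrieved (a sanity check, not part of the original reproduction):

```
GET _search/pipeline/my_search_pipeline
```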
- Create a k-NN index with my_search_pipeline as its default search pipeline:
```
PUT my_index
{
  "settings": {
    "index": {
      "knn": true,
      "search": {
        "default_pipeline": "my_search_pipeline"
      }
    }
  }
}
```
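The reproduction assumes my_index also has a my_vector field of type knn_vector. A minimal mapping sketch for completeness; the dimension of 768 is an assumption and must match the embedding size of the model behind some_model_id:

```
PUT my_index/_mapping
{
  "properties": {
    "my_vector": {
      "type": "knn_vector",
      "dimension": 768
    }
  }
}
```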
- Run a search query that wraps the neural clause in constant_score. Note that no model_id is specified in the neural clause, since the pipeline should supply it:
```
POST my_index/_search
{
  "query": {
    "bool": {
      "minimum_should_match": 0,
      "must": {
        "constant_score": {
          "filter": {
            "bool": {
              "minimum_should_match": 1,
              "should": [
                {
                  "neural": {
                    "my_vector": {
                      "query_text": "nike",
                      "k": 5
                    }
                  }
                }
              ]
            }
          }
        }
      }
    }
  }
}
```
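For comparison, the same neural clause without the constant_score wrapper does pick up the pipeline's default model ID, so the failure appears specific to constant_score:

```
POST my_index/_search
{
  "query": {
    "neural": {
      "my_vector": {
        "query_text": "nike",
        "k": 5
      }
    }
  }
}
```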
What is the expected behavior?
The query should run without error and return hits, with the pipeline's default_model_id applied to the neural clause.
What is your host/environment?
AWS Managed Service
OpenSearch 2.17
Do you have any screenshots?
Do you have any additional context?