Hello,
The Custom AWS Logs integration is one of the few integrations intended for ingesting custom logs, so it has write permissions to `logs-*-*`, and in the integration configuration the user can change the dataset to a custom one.

At the same time, `event.dataset` is mapped as a `constant_keyword` with the value `aws_logs.generic`, which can lead to events being dropped.
Recently I configured this integration to collect Session Manager logs from S3 + SQS, but I changed the dataset to `aws_logs.ssm` instead of `aws_logs.generic`. I also cloned the integration's index template and changed it to match the pattern `logs-aws_logs.ssm-*`, added a new component template with my custom mappings, and edited the integration's custom pipeline to call another custom pipeline based on the data stream name.
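For context, the pipeline-chaining part looked roughly like this (the pipeline names here are placeholders I chose for illustration, not the exact names from my setup): a `pipeline` processor in the integration's `@custom` pipeline that conditionally hands off to a dataset-specific pipeline.

```json
PUT _ingest/pipeline/logs-aws_logs@custom
{
  "processors": [
    {
      "pipeline": {
        "name": "logs-aws_logs.ssm-custom",
        "if": "ctx?.data_stream?.dataset == 'aws_logs.ssm'",
        "description": "Route events with the custom dataset to their own pipeline"
      }
    }
  ]
}
```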
But no events were being ingested. Looking at the agent logs, we saw errors saying that it failed to index the events, pointing us to the events file, where we found the reason the events were being dropped:
```
{status=400): {"type":"document_parsing_exception","reason":"[1:50915] failed to parse field [event.dataset] of type [constant_keyword] indocument with id '1d6c6918b1-000000000000'. Preview of field's value: 'aws_logs.ssm'","caused_by":{"type":"illegal_argument_exception","reason":"[constant_keyword] field [event.dataset] only accepts values that are equal to the value defined in the mappings [aws_logs.generic], but got [aws_logs.ssm]"}}, dropping event!"
```
I'm not sure where this `event.dataset` field comes from; I'm assuming it gets its value from the Dataset name configuration.
There are two workarounds in this case: one is to add a custom mapping for this field in the custom component template to override its value; the other is to use a `set` processor to change `event.dataset` back to `aws_logs.generic`. But I couldn't find either of them in the documentation.
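Minimal sketches of both workarounds, assuming the standard `@custom` naming convention (the template and pipeline names below are my assumptions, adjust to your setup). First, overriding the mapping so the `constant_keyword` value matches the custom dataset:

```json
PUT _component_template/logs-aws_logs.ssm@custom
{
  "template": {
    "mappings": {
      "properties": {
        "event": {
          "properties": {
            "dataset": {
              "type": "constant_keyword",
              "value": "aws_logs.ssm"
            }
          }
        }
      }
    }
  }
}
```

Alternatively, forcing the field back to the value the shipped mapping expects:

```json
PUT _ingest/pipeline/logs-aws_logs.ssm@custom
{
  "processors": [
    {
      "set": {
        "field": "event.dataset",
        "value": "aws_logs.generic",
        "override": true
      }
    }
  ]
}
```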
So I think the documentation for this integration needs to be improved, and since the integration is used to collect generic logs and has permissions on `logs-*-*`, `event.dataset` should not be a `constant_keyword`.