This package is deprecated; please use the Custom Logs (Filestream) package instead.
WARNING: Migrating from the "Custom Logs (Deprecated)" to "Custom Logs (Filestream)" will cause files to be re-ingested because the state is not migrated.
An automated way to migrate the state is expected in a future release, but it is not available at the moment.
Until then, the best way to minimize data duplication while migrating to "Custom Logs (Filestream)" is to use the 'Ignore Older' or 'Exclude Files' options.
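As a rough illustration of what 'Ignore Older' does conceptually, the sketch below skips files whose last modification time falls outside a cutoff window, so files that predate the migration are not picked up again. The function name and the use of a plain seconds value (rather than the integration's duration strings) are simplifying assumptions for illustration.

```python
import os
import time


def should_ingest(path: str, ignore_older_seconds: float) -> bool:
    """Return True if the file was modified within the cutoff window.

    Hypothetical helper mimicking the 'Ignore Older' idea: files whose
    mtime is older than the cutoff are skipped instead of re-ingested.
    """
    return (time.time() - os.path.getmtime(path)) <= ignore_older_seconds
```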
The Custom Logs package is used to ingest arbitrary log files and parse their contents using Ingest Pipelines. Follow the steps below to set up and use this package.
1. **Install Elastic Agent**

   Install an Elastic Agent on the machine from which you want to collect logs.

2. **Identify the Log Location**

   Identify the log location on that machine, for example, `/tmp/custom.log`.
   - If you need to include multiple log files or an entire directory, consider using wildcard patterns such as `/tmp/*.log` to capture all `.log` files, or `/tmp/*` to include all file types.
   - Note that the System integration ingests `/var/log/*.log`. You do not need to add this path if the System integration is in use.
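To see which files these shell-style wildcard patterns select, here is a small illustration using Python's `fnmatch`, which follows the same globbing rules; the file names are hypothetical:

```python
from fnmatch import fnmatch

# Hypothetical file names, used only to illustrate shell-style globbing.
files = ["/tmp/custom.log", "/tmp/app.log", "/tmp/notes.txt"]

# "/tmp/*.log" captures only .log files.
print([f for f in files if fnmatch(f, "/tmp/*.log")])
# → ['/tmp/custom.log', '/tmp/app.log']

# "/tmp/*" captures all file types.
print([f for f in files if fnmatch(f, "/tmp/*")])
# → ['/tmp/custom.log', '/tmp/app.log', '/tmp/notes.txt']
```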
3. **Enroll the Custom Logs Integration**

   - Add the Custom Logs integration to your installed Elastic Agent.
   - Provide an Integration name. A descriptive name will make managing this integration in the Kibana UI more intuitive.
   - Configure the path to match the location(s) identified in the previous step.
   - Provide a dataset name that reflects the purpose of your logs (for example, `python` for Python application logs).
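The dataset name you choose becomes part of the backing data stream's name, which follows Elastic's `<type>-<dataset>-<namespace>` naming scheme. A minimal sketch of how the pieces combine (the helper function is hypothetical, and `default` is the namespace unless you configure another):

```python
def data_stream_name(dataset: str, namespace: str = "default") -> str:
    """Build a data stream name following Elastic's <type>-<dataset>-<namespace> scheme."""
    return f"logs-{dataset}-{namespace}"


print(data_stream_name("python"))  # → logs-python-default
```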
4. **Verify Data in Discover**

   - Open Discover in Kibana and filter the `logs-*` indices to your dataset name (e.g., `logs-python`) to confirm that the raw log data is being ingested.
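If you prefer to verify ingestion outside the Kibana UI, you can query the `_search` API directly. A minimal sketch, assuming an unauthenticated cluster at `localhost:9200` and the dataset name `python`:

```python
import json
import urllib.request

ES_URL = "http://localhost:9200"  # assumption: local cluster, no auth


def recent_docs_request(dataset: str, size: int = 5) -> urllib.request.Request:
    """Build a _search request for the newest documents in the dataset's data streams."""
    body = {"size": size, "sort": [{"@timestamp": {"order": "desc"}}]}
    return urllib.request.Request(
        f"{ES_URL}/logs-{dataset}-*/_search",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )


req = recent_docs_request("python")
print(req.full_url)  # → http://localhost:9200/logs-python-*/_search
# urllib.request.urlopen(req)  # uncomment to run against a reachable cluster
```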
5. **Configure Parsing Rules**

   - Use Ingest Pipelines to define parsing rules.
   - See Parse and route logs for examples of how to extract structured fields and reroute log data to specific data streams.
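As a hedged sketch of what such a parsing rule can look like, the ingest pipeline below uses a `dissect` processor to split lines like `2024-05-01T12:00:00Z INFO message text` into structured fields. The pipeline name, log format, and target field names are assumptions for illustration; adapt them to your own logs.

```python
import json

# Example ingest pipeline definition; the log format and field names
# are illustrative assumptions, not a prescribed layout.
pipeline = {
    "description": "Parse custom app logs (example)",
    "processors": [
        {
            "dissect": {
                "field": "message",
                "pattern": "%{@timestamp} %{log.level} %{event.message}",
            }
        }
    ],
}

# Create it with: PUT _ingest/pipeline/custom-logs-example
# (e.g., via Kibana Dev Tools), then reference it from the integration.
print(json.dumps(pipeline, indent=2))
```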
6. **Create a Custom Dashboard**

   - Use Kibana to build a dashboard for analyzing incoming log data based on your specific needs.
This integration includes an ECS Dynamic Template, so any fields following the ECS schema will automatically receive the correct index field mappings without additional manual configuration.