[[use-filebeat-modules-kafka]]
=== Example: Set up {filebeat} modules to work with Kafka and {ls}

This section shows how to set up {filebeat}
{filebeat-ref}/filebeat-modules-overview.html[modules] to work with {ls} when
you are using Kafka in between {filebeat} and {ls} in your publishing pipeline.
The main goal of this example is to show how to load ingest pipelines from
{filebeat} and use them with {ls}.

The examples in this section show simple configurations with topic names hard
coded. For a full list of configuration options, see documentation about
configuring the <<plugins-inputs-kafka,Kafka input plugin>>. Also see
{filebeat-ref}/kafka-output.html[Configure the Kafka output] in the _{filebeat}
Reference_.

==== Set up and run {filebeat}

. If you haven't already set up the {filebeat} index template and sample {kib}
dashboards, run the {filebeat} `setup` command to do that now:
+
[source,shell]
----------------------------------------------------------------------
filebeat -e setup
----------------------------------------------------------------------
+
The `-e` flag is optional and sends output to standard error instead of syslog.
+
A connection to {es} and {kib} is required for this one-time setup
step because {filebeat} needs to create the index template in {es} and
load the sample dashboards into {kib}. For more information about configuring
the connection to {es}, see the Filebeat modules
{filebeat-ref}/filebeat-modules-quickstart.html[quick start].
+
After the template and dashboards are loaded, you'll see the message `INFO
{kib} dashboards successfully loaded. Loaded dashboards`.
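+
To confirm that the index template was created, you can query {es} directly.
This check is an addition to the original steps; it assumes {es} is reachable
at `localhost:9200` and that your {filebeat} version uses legacy index
templates:
+
[source,shell]
----------------------------------------------------------------------
curl -XGET 'http://localhost:9200/_template/filebeat-*?filter_path=*.index_patterns'
----------------------------------------------------------------------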

. Run the `modules enable` command to enable the modules that you want to run.
For example:
+
[source,shell]
----------------------------------------------------------------------
filebeat modules enable system
----------------------------------------------------------------------
+
You can further configure the module by editing the config file under the
{filebeat} `modules.d` directory. For example, if the log files are not in the
location expected by the module, you can set the `var.paths` option.
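+
For example, a `modules.d/system.yml` that overrides the syslog file location
might look like this (the path shown is an assumption; substitute the location
of your own log files):
+
[source,yaml]
-----
- module: system
  syslog:
    enabled: true
    var.paths: ["/var/log/custom-syslog*"]
-----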

. Run the `setup` command with the `--pipelines` and `--modules` options
specified to load ingest pipelines for the modules you've enabled. This step
also requires a connection to {es}. If you want to use a {ls} pipeline instead
of ingest node to parse the data, skip this step.
+
[source,shell]
----------------------------------------------------------------------
filebeat setup --pipelines --modules system
----------------------------------------------------------------------
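+
To verify that the pipelines were loaded, you can list them through the ingest
API. This check is an addition to the original steps and assumes {es} is
reachable at `localhost:9200`:
+
[source,shell]
----------------------------------------------------------------------
curl -XGET 'http://localhost:9200/_ingest/pipeline/filebeat-*?filter_path=*.description'
----------------------------------------------------------------------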

. Configure {filebeat} to send log lines to Kafka. To do this, in the
`filebeat.yml` config file, disable the {es} output by commenting it out, and
enable the Kafka output. For example:
+
[source,yaml]
-----
#output.elasticsearch:
  #hosts: ["localhost:9200"]
output.kafka:
  hosts: ["kafka:9092"]
  topic: "filebeat"
  codec.json:
    pretty: false
-----
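+
If your Kafka cluster does not auto-create topics, create the `filebeat` topic
before starting {filebeat}. This command is an addition to the original steps;
the broker address, partition count, and replication factor are assumptions for
a single-broker test setup:
+
[source,shell]
----------------------------------------------------------------------
bin/kafka-topics.sh --create --topic filebeat --bootstrap-server kafka:9092 \
  --partitions 1 --replication-factor 1
----------------------------------------------------------------------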

. Start {filebeat}. For example:
+
[source,shell]
----------------------------------------------------------------------
filebeat -e
----------------------------------------------------------------------
+
{filebeat} will attempt to send messages to Kafka and continue until Kafka is
available to receive them.
+
NOTE: Depending on how you've installed {filebeat}, you might see errors
related to file ownership or permissions when you try to run {filebeat} modules.
If you do, see {beats-ref}/config-file-permissions.html[Config File Ownership and Permissions]
in the _Beats Platform Reference_.
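+
To confirm that events are arriving in Kafka, you can read a few messages back
from the topic. This check is an addition to the original steps and assumes the
standard Kafka console consumer and a broker at `kafka:9092`:
+
[source,shell]
----------------------------------------------------------------------
bin/kafka-console-consumer.sh --bootstrap-server kafka:9092 \
  --topic filebeat --from-beginning --max-messages 5
----------------------------------------------------------------------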


==== Create and start the {ls} pipeline

. On the system where {ls} is installed, create a {ls} pipeline configuration
that reads from a Kafka input and sends events to an {es} output:
+
--
[source,yaml]
-----
input {
  kafka {
    bootstrap_servers => "myhost:9092"
    topics => ["filebeat"]
    codec => json
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "https://myEShost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}" <1>
      user => "elastic"
      password => "secret"
    }
  } else {
    elasticsearch {
      hosts => "https://myEShost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "secret"
    }
  }
}
-----
<1> Set the `pipeline` option to `%{[@metadata][pipeline]}`. This setting
configures {ls} to select the correct ingest pipeline based on metadata
passed in the event.

If you want to use a {ls} pipeline instead of ingest node to parse the data, see
the `filter` and `output` settings in the examples under
<<logstash-config-for-filebeat-modules>>.
--

. Start {ls}, passing in the pipeline configuration file you just defined. For
example:
+
[source,shell]
----------------------------------------------------------------------
bin/logstash -f mypipeline.conf
----------------------------------------------------------------------
+
{ls} should start a pipeline and begin receiving events from the Kafka input.

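To confirm that events are being indexed, you can count the documents in the
{filebeat} indices. This check is an addition to the original steps; the host
and credentials match the example pipeline configuration above and are
assumptions:

[source,shell]
----------------------------------------------------------------------
curl -u elastic:secret -XGET 'https://myEShost:9200/filebeat-*/_count'
----------------------------------------------------------------------
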
==== Visualize the data

To visualize the data in {kib}, launch the {kib} web interface by pointing your
browser to port 5601. For example, http://127.0.0.1:5601[http://127.0.0.1:5601].
Click *Dashboards*, then view the {filebeat} dashboards.