This repository was archived by the owner on Sep 17, 2024. It is now read-only.

[8.1](backport #2203) Separate ES workload from Agent spec and make it req for the scenarios #2230

Merged
ChrsMark merged 3 commits into 8.1 from mergify/bp/8.1/pr-2203 on Mar 15, 2022

Conversation

@mergify
Contributor

@mergify mergify bot commented Mar 14, 2022

This is an automatic backport of pull request #2203 done by Mergify.


Mergify commands and options

More conditions and actions can be found in the documentation.

You can also trigger Mergify actions by commenting on this pull request:

  • @Mergifyio refresh will re-evaluate the rules
  • @Mergifyio rebase will rebase this PR on its base branch
  • @Mergifyio update will merge the base branch into this PR
  • @Mergifyio backport <destination> will backport this PR on <destination> branch

Additionally, on Mergify dashboard you can:

  • look at your merge queues
  • generate the Mergify configuration with the config editor.

Finally, you can contact us on https://mergify.com

@elasticmachine
Contributor

elasticmachine commented Mar 14, 2022

💚 Build Succeeded


Build stats

  • Start Time: 2022-03-15T10:27:28.542+0000

  • Duration: 53 min 13 sec

Test stats 🧪

Test Results
Failed 0
Passed 272
Skipped 0
Total 272

💚 Flaky test report

Tests succeeded.

🤖 GitHub comments

To re-run your PR in the CI, just comment with:

  • /test : Re-trigger the build.

@mdelapenya
Contributor

Mmm, it seems it is not resolved for this branch, as I can still see the same errors for 8.1.

Do you think it could be a real bug on 8.1? 🤔

@ChrsMark
Member

I will try to reproduce it locally.

@ChrsMark
Member

ChrsMark commented Mar 15, 2022

UPDATE: checking the logs of Filebeat run inside the Agent Pod I found the following:

{"log.level":"error","@timestamp":"2022-03-15T08:11:08.305Z","log.logger":"publisher_pipeline_output","log.origin":{"file.name":"pipeline/client_worker.go","file.line":150},"message":"Failed to connect to backoff(elasticsearch(http://elasticsearch:9200)): Connection marked as failed because the onConnect callback failed: Elasticsearch is too old. Please upgrade the instance. If you would like to connect to older instances set output.elasticsearch.allow_older_versions to true. ES=8.1.0, Beat=8.2.0.","service.name":"filebeat","ecs.version":"1.6.0"}

It seems that the ES version is 8.1 while Beat+Agent is 8.2; I'm wondering how these versions are set at the top level. However, I'm not sure this is the cause of the CI failure, since locally I only set BEAT_VERSION. @mdelapenya any ideas?
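For context, the check behind that error boils down to a version comparison. Here is a minimal shell sketch of the rule (illustrative only, not the actual Beats implementation, which can be relaxed via output.elasticsearch.allow_older_versions):

```shell
# Illustrative approximation of the Beats "Elasticsearch is too old" check:
# connecting fails when the ES version sorts strictly older than the Beat
# version (unless output.elasticsearch.allow_older_versions is enabled).
es_too_old() {
  es="$1"; beat="$2"
  # sort -V orders version strings; the oldest comes first
  oldest=$(printf '%s\n%s\n' "$es" "$beat" | sort -V | head -n1)
  [ "$oldest" = "$es" ] && [ "$es" != "$beat" ]
}

if es_too_old "8.1.0" "8.2.0"; then
  echo "ES too old for this Beat"   # this branch is taken for 8.1.0 vs 8.2.0
fi
```

With the versions from the log above (ES=8.1.0, Beat=8.2.0) the check trips, which matches the connection error.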

@mdelapenya
Contributor

I've verified that the initial variables are properly set to 8.1:

[2022-03-15T06:58:03.217Z] INFO [2022-03-15T06:58:03Z] Initial artifact versions defined BeatVersion=8.1.1-da392dcd-SNAPSHOT BeatVersionBase=8.1.1-da392dcd-SNAPSHOT ElasticAgentVersion=8.1.1-da392dcd-SNAPSHOT KibanaVersion=8.1.1-da392dcd-SNAPSHOT StackVersion=8.1.1-da392dcd-SNAPSHOT

Where are you seeing that 8.2 version for Beats/Agent?

@ChrsMark
Member

ChrsMark commented Mar 15, 2022

With export ELASTIC_AGENT_VERSION=8.1.0 and export BEAT_VERSION=8.1.0 locally the suite passes:

5 scenarios (5 passed)
22 steps (22 passed)
6m48.240438565s
testing: warning: no tests to run
PASS
ok  	github.com/elastic/e2e-testing/e2e/_suites/kubernetes-autodiscover	408.655s

It happens that the first time I execute the suite for a specific version, it exceeds the deadline because the image is not present. Maybe we hit something similar here?

I re-triggered the tests on this PR to check if it was an issue with the download time.

@mdelapenya
Contributor

With export ELASTIC_AGENT_VERSION=8.1.0 and export BEAT_VERSION=8.1.0 locally the suite passes:

In this branch, what happens if you do not export the variables? Since we are on the 8.1 branch, it should use the hashed versions (daily snapshots) for 8.1.

@ChrsMark
Member

@mdelapenya Not sure why #2141 did not make it to 8.1 but it seems to be ok now.

@mdelapenya
Contributor

@mdelapenya Not sure why #2141 did not make it to 8.1 but it seems to be ok now.

Interesting... It's possible that the 8.1 branch was created in the same time window in which we were doing the backports, and it was missed for that reason: I see that we intentionally added the backport labels for 7.16, 7.17 and 8.0 in that PR, which could mean that the 8.1 branch did not exist yet.

This also explains why it consistently fails for 8.1.

OTOH, why does installing jq in the pod fix the issue?

@ChrsMark
Member

jq is used to filter JSON responses from ES indices. If jq is not installed, the curl and "write-to-file" step at

index=$(curl --user elastic:changeme 'http://elasticsearch:9200/_cat/indices/.ds-logs-generic-default*?format=json' | jq -r '.[0].index');
curl --user elastic:changeme \
  "http://elasticsearch:9200/${index}/_search?format=json" \
  | jq -r '.hits.hits[]._source' >> /tmp/beats-events

will fail, so the scenario will not find the required events/fields in the expected file and will wait forever. Depending on jq is quite error-prone and the approach is quite hacky, but this would be solved by #1655.
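Incidentally, the jq dependency could be avoided by parsing the JSON with python3 instead. A hypothetical sketch, shown against a sample payload rather than a live ES so the snippet is self-contained:

```shell
# Hypothetical jq-free variant: python3 parses the _cat/indices JSON.
# In the real step the payload would come from the curl call; a sample
# payload stands in here.
sample='[{"index":".ds-logs-generic-default-2022.03.15-000001"}]'
index=$(printf '%s' "$sample" \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)[0]["index"])')
echo "$index"   # → .ds-logs-generic-default-2022.03.15-000001
```

That said, as noted above, #1655 would be the proper fix.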

@ChrsMark ChrsMark merged commit 510c1ff into 8.1 Mar 15, 2022
@mergify mergify bot deleted the mergify/bp/8.1/pr-2203 branch March 15, 2022 11:31
@ChrsMark
Member

Checking #2231 too.



3 participants