
[System] Adding condition to 0.9.2 package => 0.9.3 #674

Merged
ycombinator merged 2 commits into elastic:master from ycombinator:system-093 on Feb 9, 2021

Conversation

@ycombinator (Contributor) commented Feb 8, 2021

What does this PR do?

This PR effectively performs two steps:

  1. Reverts all changes in the system package back to the 0.9.2 version of the package (commit SHA: 1f5013c).
  2. Adds a condition to the load data stream and bumps the package version to 0.9.3.

However, due to step 1, the diff in this PR is very large. To get a truer sense of the changes (i.e. just step 2), check out this PR and run git diff 1f5013cef1a441e3d76adcfe687b035c1390617c -- packages/system/.

Immediately following this PR will be a PR that undoes step 1, bringing the system package back to its current state in this repo, i.e. as of commit SHA 293d243.

Why is this change important?

The latest released 0.9.x version of the system package is 0.9.2. We need to fix an issue in this series (by adding the condition field). For this, we need to release 0.9.3 of the package with this change.

Checklist

  • I have reviewed tips for building integrations and this pull request is aligned with them.
  • I have verified that all data streams collect metrics or logs.

@ycombinator ycombinator changed the title Adding condition [System] Adding condition to 0.9.2 package => 0.9.3 Feb 8, 2021
@ycombinator ycombinator requested a review from mtojek February 8, 2021 19:24
@fearful-symmetry (Contributor) commented:
@ycombinator so is the plan to let the automation put in the SNAPSHOT PR for this change, then revert?

@ycombinator (Contributor, Author) commented Feb 8, 2021

> @ycombinator so is the plan to let the automation put in the SNAPSHOT PR for this change, then revert?

@fearful-symmetry yep, exactly! That way we end up following all the usual processes and have a record of all changes in the integrations repo as well.

@elasticmachine commented Feb 8, 2021

💚 Build Succeeded


Build stats

  • Build Cause: Pull request #674 updated

  • Start Time: 2021-02-09T13:25:21.615+0000

  • Duration: 7 min 23 sec

  • Commit: 2f8d31b

Test stats 🧪

Test Results
  Failed:  0
  Passed:  89
  Skipped: 0
  Total:   89

Trends 🧪: build-time and test trend charts (images omitted)

@fearful-symmetry (Contributor) left a review comment:


Assuming this is good. In the future we might want to find a less multi-step way to do this, since we're definitely going to see more cases where a stack release is constrained to certain integration releases.

@ycombinator (Contributor, Author) commented Feb 8, 2021

Welp, we have a problem! CI is failing with this error:

[2021-02-08T19:30:32.915Z] Error: error running package asset tests: could not complete test run: could not install package: could not install package; API status code = 400; response body = {"statusCode":400,"error":"Bad Request","message":"system-0.9.3 is out-of-date and cannot be installed or updated"}

This is because the package registry distribution used for tests by CI contains a newer version of the system package.

Even if we merged this PR with the failing CI 😬, I don't think that's going to help us because I suspect we'll see the same error on the master build that follows. And once that errors out, it won't automatically publish the 0.9.3 version of the system package to https://github.com/elastic/package-storage/tree/snapshot. 😞

I'm wondering now if we should instead make a PR directly to https://github.com/elastic/package-storage/tree/snapshot with a new folder for 0.9.3 (copied from 0.9.2) with the necessary condition change. The down side of this approach is that we won't have a copy of the 0.9.3 system package in the integrations repo and we'd bypass any checks that normally happen on integrations repo PRs. 🤔

@mtojek do you have any opinions or other ideas? Please see the PR description to understand what we are trying to accomplish with this PR.

@ycombinator (Contributor, Author) commented Feb 8, 2021

I chatted with @jen-huang off-PR about the error we're seeing in CI:

> [2021-02-08T19:30:32.915Z] Error: error running package asset tests: could not complete test run: could not install package: could not install package; API status code = 400; response body = {"statusCode":400,"error":"Bad Request","message":"system-0.9.3 is out-of-date and cannot be installed or updated"}

Since we are trying to install the system-0.9.3 package for system testing purposes, I think it's okay to ignore the package version check. In other words, the system test runner needs to be able to install the package version under test so it can run the test, regardless of what other versions of the package are present in the registry.

@jen-huang informed me that this is, in fact, possible with the Fleet API by passing {force: true} in the API's request payload. I'm going to implement this change in elastic-package now: elastic/elastic-package#247.
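For reference, a sketch of what such a forced install request looks like against the Kibana Fleet API. The route shape and headers below follow the 7.x-era Fleet EPM endpoint; treat the exact path and package key format as assumptions rather than something confirmed in this thread:

```
POST /api/fleet/epm/packages/system-0.9.3
kbn-xsrf: true
Content-Type: application/json

{"force": true}
```

With force set, Fleet skips the "is this the latest available version?" check that produced the 400 error above and installs the requested version anyway.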

@andresrc andresrc added the Team:Integrations label Feb 9, 2021
@elasticmachine commented:
Pinging @elastic/integrations (Team:Integrations)

@mtojek (Contributor) left a review comment:


> I'm wondering now if we should instead make a PR directly to https://github.com/elastic/package-storage/tree/snapshot with a new folder for 0.9.3 (copied from 0.9.2) with the necessary condition change. The down side of this approach is that we won't have a copy of the 0.9.3 system package in the integrations repo and we'd bypass any checks that normally happen on integrations repo PRs. 🤔

I ran into a case in the past where there was a condition on the Kibana version in the package. Since no user could install that package, I simply modified the broken source in package-storage. I understand that in this case it is possible to install the package, so your idea is correct; hence LGTM for this PR and for the fix in elastic-package.

I would also consider fixing broken packages directly in package-storage in such emergency cases. It's much easier to handle operationally.

@ycombinator (Contributor, Author) commented Feb 9, 2021

Thanks for weighing in, @mtojek .

I understand your point about broken package versions: because such versions cannot be installed anyway by the user, we can assume no one has installed them and, therefore, a fix to them will not require a re-install. So we can safely fix them directly in package-storage.

When you did this in the past, did you also fix the source package version in the integrations repo afterwards? Otherwise there is no record of that fix for that version in the integrations repo, right? If you did fix it in the integrations repo afterwards, how did you do it, considering that the package in the repo @ master could probably have been far ahead of the broken version in terms of its changes? Did you end up with a "messy" PR like the one here, plus a follow-up PR to restore the latest version back in master?

@ycombinator (Contributor, Author) commented:

Just to be clear, the above discussion with @mtojek is not blocking this PR. I'm waiting to rebase this PR on master once #679 is merged.

@mtojek (Contributor) commented Feb 9, 2021

> When you did this in the past, did you also fix the source package version in the integrations repo afterwards? Otherwise there is no record of that fix for that version in the integrations repo, right? If you did fix it in the integrations repo afterwards, how did you do it, considering that the package in the repo @ master could probably have been far ahead of the broken version in terms of its changes?

In my case it was an issue with an incompatible import of Kibana dashboards: elastic/package-storage@0767f7a. I fixed it only in package-storage (the package must remain compatible with Kibana 7.10), because the Apache package's version had been bumped in Integrations together with a bumped Kibana constraint (7.11.0; see https://github.com/elastic/integrations/blob/master/packages/apache/manifest.yml#L12). As we're working with the latest master in Integrations, I couldn't alter the Git history there.
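The Kibana constraint mtojek refers to lives in the package's top-level manifest.yml. A sketch of the relevant fragment (the value matches the linked Apache manifest line; the field layout is per the package spec of that era and should be checked against the spec version in use):

```yaml
# Fragment of a package-level manifest.yml (illustrative).
# A constraint like this prevents older Kibana versions from
# installing the package at all, which is why a broken release
# behind such a constraint can be fixed in place in package-storage.
conditions:
  kibana.version: '^7.11.0'
```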

@ycombinator (Contributor, Author) commented:

> As we're working with the latest master in Integrations, I couldn't alter the Git history there.

Right, this is what I'm facing with this PR here as well. It's why this PR is so messy (so many changes!) and why we will need a follow-up PR to restore the system package in master to its state prior to this PR.

I think, in general, we need to have a policy that covers how to fix older versions of packages so we are fixing them consistently and all package authors are aware of it. I'll file a separate issue to take this discussion off this PR.

@ycombinator (Contributor, Author) commented Feb 9, 2021

So just to recap, the next steps with this PR are as follows:

  1. Now that this PR here is approved and CI is green, I will merge the PR. This will cause the master branch of the integrations repo to temporarily contain 0.9.3 of the system package. CI will build master, which will result in 0.9.3 of the system package being released (i.e. CI will make a PR to the snapshot branch of package-storage with 0.9.3 of the system package).

  2. Once the PR to package-storage resulting from step 1 looks good, I will put up another PR to the master branch of the integrations repo to restore the system package as it stands just prior to this PR here, effectively undoing this PR.

@fearful-symmetry @ChrsMark Please take note of the above, as you two come up as recent editors of the system package. You might want to hold off on further PRs for the system package to the integrations repo until the above two steps are completed. I will post a comment here once both steps are completed and the resulting PRs merged.

@ycombinator ycombinator merged commit 6255798 into elastic:master Feb 9, 2021
@ycombinator ycombinator deleted the system-093 branch February 9, 2021 13:34
ycombinator added a commit that referenced this pull request Feb 9, 2021
@ycombinator (Contributor, Author) commented:

> I think, in general, we need to have a policy that covers how to fix older versions of packages so we are fixing them consistently and all package authors are aware of it. I'll file a separate issue to take this discussion off this PR.

Created issue for discussion: #682.

@ycombinator (Contributor, Author) commented:

> So just to recap, the next steps with this PR are as follows:
>
> 1. Now that this PR here is approved and CI is green, I will merge the PR. This will cause the master branch of the integrations repo to temporarily contain 0.9.3 of the system package. CI will build master, which will result in 0.9.3 of the system package being released (i.e. CI will make a PR to the snapshot branch of package-storage with 0.9.3 of the system package).

This is done now. The PR to package-storage is up: elastic/package-storage#869.

> 2. Once the PR to package-storage resulting from step 1 looks good, I will put up another PR to the master branch of the integrations repo to restore the system package as it stands just prior to this PR here, effectively undoing this PR.

This is also done now. The PR to revert this PR here is up: #681.

@mtojek @fearful-symmetry I've requested your reviews on both PRs above.

ycombinator added a commit that referenced this pull request Feb 9, 2021

Labels: Team:Integrations Label for the Integrations team

5 participants