
[es] disable wildcards in destructive actions#88986

Merged
spalger merged 5 commits into elastic:master from spalger:implement/es-disable-destructive-wildcard
Feb 3, 2021

Conversation

@spalger
Contributor

@spalger spalger commented Jan 21, 2021

In elastic/elasticsearch#66908 ES is going to change the default value of the action.destructive_requires_name setting to true, which will remove our ability to use wildcards in destructive API calls like indices.delete(). This PR enables the setting when running Kibana in most dev scenarios as well as on CI, so that we can make sure we're ready for it.
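For reference, the setting being flipped can be enabled explicitly today via elasticsearch.yml:

```yaml
# elasticsearch.yml -- opt in to the upcoming default ahead of the change
action.destructive_requires_name: true
```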

Many places in the functional tests were deleting indices with wildcards. Those instances have been migrated to use the esDeleteAllIndices service, which resolves the wildcard to a list of concrete indices, deletes them, and loops until the wildcard no longer resolves to any indices.
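The resolve-delete-repeat strategy can be sketched as follows. `EsClient` here is a hypothetical minimal interface standing in for the real client, not the actual esDeleteAllIndices implementation:

```typescript
// Hypothetical minimal interface standing in for the real ES client
// (the actual service uses the Kibana Elasticsearch client).
interface EsClient {
  // Expand a wildcard pattern to the concrete index names it matches.
  resolveIndices(pattern: string): Promise<string[]>;
  deleteIndices(names: string[]): Promise<void>;
}

// Resolve the wildcard, delete the concrete indices, and repeat until the
// wildcard no longer resolves to anything -- the strategy described above.
async function deleteAllIndices(es: EsClient, pattern: string): Promise<void> {
  while (true) {
    const names = await es.resolveIndices(pattern);
    if (names.length === 0) {
      return;
    }
    await es.deleteIndices(names);
  }
}
```

Because the wildcard is re-resolved on every pass, indices created while the loop is running are picked up and deleted too.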

The detection engine code seems to be the only place where we supported deleting wildcard indices server-side, and that code has been updated to use a similar strategy.

Release note:

Enables support for ES clusters using action.destructive_requires_name=true.

Contributor

Is it possible to add a refresh: wait_for here since we're looping now? It should be fine for us and our users and not a big breaking change. It might make tests and other things less nondeterministic and give us a better chance of getting this right on the first loop.

Contributor Author

@spalger spalger Feb 1, 2021

The index delete API doesn't support refresh, and AFAIK it is a "synchronous" action: the entire cluster will immediately refuse to search across shards of deleted indices.

Contributor

@banderror banderror Feb 3, 2021

"synchronous" action and the entire cluster will refuse to search across shards for deleted indexes immediately

Do you mean that when we execute a request like DELETE /delme-idx-1,delme-idx-2, every node in the cluster must acknowledge the deletion and disable all other actions on those indices, synchronously, before "acknowledged" : true is returned? To me "acknowledged" semantically means "ok, received your request, will process it" without any timing guarantees, so I wouldn't personally rely on that. But I might be wrong.

I think my question here is: after deleting indices via await callWithRequest('indices.delete', …), can we still get the same indices back from await callWithRequest('indices.getAlias', …) on the next iteration, for example if the cluster is under load? Could we hit an error due to a race condition? Because if that's possible, then on the next iteration, when trying to delete an index that was already scheduled for deletion, we could receive an error.

I tried this:

PUT delme-idx-1/_doc/1
{ "foo": "bar" }

DELETE /delme-idx-1,delme-idx-2

where deletion returns an error

{
  "error" : {
    "root_cause" : [
      {
        "type" : "index_not_found_exception",
        "reason" : "no such index [delme-idx-2]",
        "index_uuid" : "_na_",
        "resource.type" : "index_or_alias",
        "resource.id" : "delme-idx-2",
        "index" : "delme-idx-2"
      }
    ],
    "type" : "index_not_found_exception",
    "reason" : "no such index [delme-idx-2]",
    "index_uuid" : "_na_",
    "resource.type" : "index_or_alias",
    "resource.id" : "delme-idx-2",
    "index" : "delme-idx-2"
  },
  "status" : 404
}

and then

GET /delme-idx-*/_alias

returns

{
  "delme-idx-1" : {
    "aliases" : { }
  }
}

Which means that DELETE /delme-idx-1,delme-idx-2 did nothing, not even deleting delme-idx-1, which did exist.

Is a similar situation possible due to a race condition? And would await callWithRequest('indices.delete', …) throw an exception in this case?

Contributor

And just thinking out loud… maybe specifying a timeout plus an exponential wait would do the job?

namesToDelete = get index names matching the pattern
send the delete request

while (not timed out) {
  existingNames = get index names
  if (none of namesToDelete are in existingNames) {
    return
  }
  get next delay // exponential
  wait
}

throw timed out

Here the difference is: this deletes only the specific indices that existed at the moment the request was received. So if a new index matching the pattern is added during the deletion (the while loop), it will be ignored, which can be good or bad depending on the case.
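The pseudocode above could look roughly like this in TypeScript. `listIndices`/`deleteIndices` are hypothetical stand-ins for the real client calls, and the initial delay and cap are arbitrary:

```typescript
// Hypothetical minimal client interface for this sketch.
interface EsClient {
  listIndices(pattern: string): Promise<string[]>;
  deleteIndices(names: string[]): Promise<void>;
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Delete only the indices that matched the pattern at call time, then poll
// with exponential backoff until they are gone or the timeout elapses.
async function deleteWithBackoff(
  es: EsClient,
  pattern: string,
  timeoutMs = 30_000
): Promise<void> {
  const namesToDelete = await es.listIndices(pattern);
  if (namesToDelete.length === 0) return;
  await es.deleteIndices(namesToDelete);

  const deadline = Date.now() + timeoutMs;
  let delay = 50; // arbitrary starting delay
  while (Date.now() < deadline) {
    const existing = new Set(await es.listIndices(pattern));
    if (!namesToDelete.some((name) => existing.has(name))) return;
    await sleep(delay);
    delay = Math.min(delay * 2, 5_000); // exponential, capped
  }
  throw new Error(`timed out waiting for ${pattern} indices to be deleted`);
}
```

Note how `namesToDelete` is captured once up front, so an index created mid-loop never extends the wait.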

Contributor Author

@banderror You definitely have a point. I'm pretty confident that a successful response from Elasticsearch means the cluster state has been updated and all nodes have acknowledged the update; that said, there isn't any reason not to protect ourselves from a situation like this. To do so, I've updated the indices.delete call to use the ignore_unavailable=true query-string param, which ignores concrete indices that have already been deleted, preventing the 404 you showed and deleting only the indices that do exist.
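A minimal sketch of the semantics ignore_unavailable changes, using a fake in-memory cluster (hypothetical, for illustration only): without the flag, a comma-separated delete fails atomically when any listed index is missing, as in the 404 above; with it, missing indices are simply skipped.

```typescript
// Parameters of the hypothetical delete call in this sketch.
interface DeleteParams {
  index: string[];
  ignore_unavailable?: boolean;
}

// Fake in-memory cluster; `live` holds the currently existing index names.
function makeFakeCluster(live: Set<string>) {
  return {
    async deleteIndices({ index, ignore_unavailable }: DeleteParams): Promise<void> {
      const missing = index.filter((name) => !live.has(name));
      if (missing.length > 0 && !ignore_unavailable) {
        // Mirrors the index_not_found_exception / 404 shown earlier:
        // nothing is deleted when any listed index is missing.
        throw new Error(`no such index [${missing[0]}]`);
      }
      for (const name of index) live.delete(name);
    },
  };
}
```

With ignore_unavailable set, a concrete index that disappears between the getAlias resolution and the delete call no longer fails the whole request.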

Contributor Author

WRT exponential backoff/timeout, I'm not totally opposed, but it's a good deal more complicated to implement and I don't really see the value personally. If you pass a wildcard to this API and new indexes that match show up while it's running, I think it makes sense to delete them too. Additionally, if new indexes keep showing up, leading to a maximum number of iterations, that's an issue this API isn't intended to solve. Finally, picking a timeout seems very challenging, since operations against ES can take quite some time in certain scenarios; I think we should rely on the configured request timeout rather than defining a new timeout for this operation specifically.

Contributor

ignore_unavailable=true sounds good, thank you! 👍

@spalger spalger force-pushed the implement/es-disable-destructive-wildcard branch from 2f7114a to 52ed849 on February 2, 2021 01:48
@spalger spalger force-pushed the implement/es-disable-destructive-wildcard branch from 52ed849 to 6e79008 on February 2, 2021 06:23
@spalger spalger marked this pull request as ready for review February 2, 2021 17:03
@spalger spalger requested review from a team as code owners February 2, 2021 17:03
@spalger spalger added the release_note:skip (Skip the PR/issue when compiling release notes), Team:Operations (Kibana-Operations Team), and v8.0.0 labels Feb 2, 2021
@elasticmachine
Contributor

Pinging @elastic/kibana-operations (Team:Operations)

Contributor

@YulNaumenko YulNaumenko left a comment

Alerting related changes LGTM.

Member

@pheyos pheyos left a comment

ML / Transform changes LGTM

Contributor

@jportner jportner left a comment

Code review only - Security integration test changes LGTM.

Contributor

@FrankHassanabad FrankHassanabad left a comment

These changes are good with me too.

@spalger spalger merged commit d07ae11 into elastic:master Feb 3, 2021
@spalger spalger deleted the implement/es-disable-destructive-wildcard branch February 3, 2021 16:29
@spalger spalger added the backport:skip (This PR does not require backporting) label Feb 3, 2021
@spalger spalger added the v7.12.0 and release_note:enhancement labels and removed the release_note:skip (Skip the PR/issue when compiling release notes) label Feb 9, 2021
@kibanamachine
Contributor

kibanamachine commented Feb 9, 2021

💔 Build Failed

Failed CI Steps


Test Failures

Kibana Pipeline / general / Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/feature_importance·ts.machine learning data frame analytics total feature importance panel and decision path popover binary classification job should display the feature importance decision path in the data grid

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: machine learning
[00:00:00]           └-> "before all" hook
[00:00:00]           └-: 
[00:00:00]             └-> "before all" hook
[00:00:00]             └-> "before all" hook
[00:00:00]               │ debg creating role ft_ml_source
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_ml_source]
[00:00:00]               │ debg creating role ft_ml_source_readonly
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_ml_source_readonly]
[00:00:00]               │ debg creating role ft_ml_dest
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_ml_dest]
[00:00:00]               │ debg creating role ft_ml_dest_readonly
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_ml_dest_readonly]
[00:00:00]               │ debg creating role ft_ml_ui_extras
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_ml_ui_extras]
[00:00:00]               │ debg creating role ft_default_space_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_default_space_ml_all]
[00:00:00]               │ debg creating role ft_default_space1_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_default_space1_ml_all]
[00:00:00]               │ debg creating role ft_all_spaces_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_all_spaces_ml_all]
[00:00:00]               │ debg creating role ft_default_space_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_default_space_ml_read]
[00:00:00]               │ debg creating role ft_default_space1_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_default_space1_ml_read]
[00:00:00]               │ debg creating role ft_all_spaces_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_all_spaces_ml_read]
[00:00:00]               │ debg creating role ft_default_space_ml_none
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [ft_default_space_ml_none]
[00:00:00]               │ debg creating user ft_ml_poweruser
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_poweruser]
[00:00:00]               │ debg created user ft_ml_poweruser
[00:00:00]               │ debg creating user ft_ml_poweruser_spaces
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_poweruser_spaces]
[00:00:00]               │ debg created user ft_ml_poweruser_spaces
[00:00:00]               │ debg creating user ft_ml_poweruser_space1
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_poweruser_space1]
[00:00:00]               │ debg created user ft_ml_poweruser_space1
[00:00:00]               │ debg creating user ft_ml_poweruser_all_spaces
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_poweruser_all_spaces]
[00:00:00]               │ debg created user ft_ml_poweruser_all_spaces
[00:00:00]               │ debg creating user ft_ml_viewer
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_viewer]
[00:00:01]               │ debg created user ft_ml_viewer
[00:00:01]               │ debg creating user ft_ml_viewer_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_viewer_spaces]
[00:00:01]               │ debg created user ft_ml_viewer_spaces
[00:00:01]               │ debg creating user ft_ml_viewer_space1
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_viewer_space1]
[00:00:01]               │ debg created user ft_ml_viewer_space1
[00:00:01]               │ debg creating user ft_ml_viewer_all_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_viewer_all_spaces]
[00:00:01]               │ debg created user ft_ml_viewer_all_spaces
[00:00:01]               │ debg creating user ft_ml_unauthorized
[00:00:01]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ds-ilm-history-5-2021.02.09-000001] creating index, cause [initialize_data_stream], templates [ilm-history], shards [1]/[0]
[00:00:01]               │ info [o.e.c.m.MetadataCreateDataStreamService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] adding data stream [ilm-history-5] with write index [.ds-ilm-history-5-2021.02.09-000001] and backing indices []
[00:00:01]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ds-ilm-history-5-2021.02.09-000001] from [null] to [{"phase":"new","action":"complete","name":"complete"}] in policy [ilm-history-ilm-policy]
[00:00:01]               │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[.ds-ilm-history-5-2021.02.09-000001][0]]])." previous.health="YELLOW" reason="shards started [[.ds-ilm-history-5-2021.02.09-000001][0]]"
[00:00:01]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ds-ilm-history-5-2021.02.09-000001] from [{"phase":"new","action":"complete","name":"complete"}] to [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] in policy [ilm-history-ilm-policy]
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_unauthorized]
[00:00:01]               │ debg created user ft_ml_unauthorized
[00:00:01]               │ debg creating user ft_ml_unauthorized_spaces
[00:00:01]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ds-ilm-history-5-2021.02.09-000001] from [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] to [{"phase":"hot","action":"rollover","name":"check-rollover-ready"}] in policy [ilm-history-ilm-policy]
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [ft_ml_unauthorized_spaces]
[00:00:01]               │ debg created user ft_ml_unauthorized_spaces
[00:43:05]             └-: data frame analytics
[00:43:05]               └-> "before all" hook
[00:46:21]               └-: total feature importance panel and decision path popover
[00:46:21]                 └-> "before all" hook
[00:46:21]                 └-> "before all" hook
[00:46:21]                   │ debg applying update to kibana config: {"dateFormat:tz":"UTC"}
[00:46:22]                   │ debg SecurityPage.forceLogout
[00:46:22]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=100
[00:46:22]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:46:23]                   │ debg Redirecting to /logout to force the logout
[00:46:23]                   │ debg Waiting on the login form to appear
[00:46:23]                   │ debg Waiting for Login Page to appear.
[00:46:23]                   │ debg Waiting up to 100000ms for login page...
[00:46:23]                   │ debg browser[INFO] http://localhost:61111/logout?_t=1612895341260 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:46:23]                   │
[00:46:23]                   │ debg browser[INFO] http://localhost:61111/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:46:23]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:46:25]                   │ERROR browser[SEVERE] http://localhost:61111/internal/security/me - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:46:25]                   │ debg browser[INFO] http://localhost:61111/login?msg=LOGGED_OUT 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:46:25]                   │
[00:46:25]                   │ debg browser[INFO] http://localhost:61111/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:46:25]                   │ERROR browser[SEVERE] http://localhost:61111/internal/spaces/_active_space - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:46:25]                   │ERROR browser[SEVERE] http://localhost:61111/internal/security/me - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:46:25]                   │ debg browser[INFO] http://localhost:61111/40119/bundles/core/core.entry.js 12:159380 "Detected an unhandled Promise rejection.
[00:46:25]                   │      Error: Unauthorized"
[00:46:25]                   │ERROR browser[SEVERE] http://localhost:61111/40119/bundles/core/core.entry.js 5:3002 
[00:46:25]                   │ERROR browser[SEVERE] http://localhost:61111/api/licensing/info - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:46:25]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:46:26]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:46:26]                   │ debg TestSubjects.exists(loginForm)
[00:46:26]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="loginForm"]') with timeout=2500
[00:46:26]                   │ debg Waiting for Login Form to appear.
[00:46:26]                   │ debg Waiting up to 100000ms for login form...
[00:46:26]                   │ debg TestSubjects.exists(loginForm)
[00:46:26]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="loginForm"]') with timeout=2500
[00:46:26]                   │ debg TestSubjects.setValue(loginUsername, ft_ml_poweruser)
[00:46:26]                   │ debg TestSubjects.click(loginUsername)
[00:46:26]                   │ debg Find.clickByCssSelector('[data-test-subj="loginUsername"]') with timeout=10000
[00:46:26]                   │ debg Find.findByCssSelector('[data-test-subj="loginUsername"]') with timeout=10000
[00:46:27]                   │ debg TestSubjects.setValue(loginPassword, mlp001)
[00:46:27]                   │ debg TestSubjects.click(loginPassword)
[00:46:27]                   │ debg Find.clickByCssSelector('[data-test-subj="loginPassword"]') with timeout=10000
[00:46:27]                   │ debg Find.findByCssSelector('[data-test-subj="loginPassword"]') with timeout=10000
[00:46:27]                   │ debg TestSubjects.click(loginSubmit)
[00:46:27]                   │ debg Find.clickByCssSelector('[data-test-subj="loginSubmit"]') with timeout=10000
[00:46:27]                   │ debg Find.findByCssSelector('[data-test-subj="loginSubmit"]') with timeout=10000
[00:46:27]                   │ debg Waiting for login result, expected: chrome.
[00:46:27]                   │ debg Find.findByCssSelector('[data-test-subj="kibanaChrome"] .app-wrapper:not(.hidden-chrome)') with timeout=20000
[00:46:27]                   │ proc [kibana]   log   [18:29:05.478] [info][plugins][routes][security] Logging in with provider "basic" (basic)
[00:46:29]                   │ debg browser[INFO] http://localhost:61111/app/home 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:46:29]                   │
[00:46:29]                   │ debg browser[INFO] http://localhost:61111/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:46:29]                   │ debg Finished login process currentUrl = http://localhost:61111/app/home#/
[00:46:29]                   │ debg Waiting up to 20000ms for logout button visible...
[00:46:29]                   │ debg TestSubjects.exists(userMenuButton)
[00:46:29]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenuButton"]') with timeout=2500
[00:46:29]                   │ debg TestSubjects.exists(userMenu)
[00:46:29]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenu"]') with timeout=2500
[00:46:32]                   │ debg --- retry.tryForTime error: [data-test-subj="userMenu"] is not displayed
[00:46:32]                   │ debg TestSubjects.click(userMenuButton)
[00:46:32]                   │ debg Find.clickByCssSelector('[data-test-subj="userMenuButton"]') with timeout=10000
[00:46:32]                   │ debg Find.findByCssSelector('[data-test-subj="userMenuButton"]') with timeout=10000
[00:46:32]                   │ debg TestSubjects.exists(userMenu)
[00:46:32]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenu"]') with timeout=120000
[00:46:32]                   │ debg TestSubjects.exists(userMenu > logoutLink)
[00:46:32]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenu"] [data-test-subj="logoutLink"]') with timeout=2500
[00:46:32]                   │ info [ml/ihp_outlier] Loading "mappings.json"
[00:46:32]                   │ info [ml/ihp_outlier] Loading "data.json.gz"
[00:46:32]                   │ info [ml/ihp_outlier] Skipped restore for existing index "ft_ihp_outlier"
[00:46:32]                   │ debg Searching for 'index-pattern' with title 'ft_ihp_outlier'...
[00:46:32]                   │ debg  > Found '5e6fba30-6afe-11eb-9026-1db6602de315'
[00:46:32]                   │ debg Index pattern with title 'ft_ihp_outlier' already exists. Nothing to create.
[00:46:32]                   │ debg Creating data frame analytic job with id 'ihp_fi_binary_1612892557468' ...
[00:46:32]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ml-config] creating index, cause [auto(bulk api)], templates [], shards [1]/[1]
[00:46:32]                   │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updating number_of_replicas to [0] for indices [.ml-config]
[00:46:32]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ml-annotations-6] creating index, cause [api], templates [], shards [1]/[1]
[00:46:32]                   │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updating number_of_replicas to [0] for indices [.ml-annotations-6]
[00:46:32]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ml-notifications-000001] creating index, cause [auto(bulk api)], templates [.ml-notifications-000001], shards [1]/[1]
[00:46:32]                   │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updating number_of_replicas to [0] for indices [.ml-notifications-000001]
[00:46:33]                   │ debg Waiting up to 5000ms for 'ihp_fi_binary_1612892557468' to exist...
[00:46:33]                   │ debg Fetching data frame analytics job 'ihp_fi_binary_1612892557468'...
[00:46:33]                   │ debg > DFA job fetched.
[00:46:33]                   │ debg > DFA job created.
[00:46:33]                   │ debg Starting data frame analytics job 'ihp_fi_binary_1612892557468'...
[00:46:33]                   │ info [o.e.x.m.a.TransportStartDataFrameAnalyticsAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Starting data frame analytics from state [stopped]
[00:46:33]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ml-inference-000003] creating index, cause [api], templates [], shards [1]/[1]
[00:46:33]                   │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updating number_of_replicas to [0] for indices [.ml-inference-000003]
[00:46:33]                   │ info [o.e.x.c.m.u.MlIndexAndAlias] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] About to create first concrete index [.ml-state-000001] with alias [.ml-state-write]
[00:46:33]                   │ debg > DFA job started.
[00:46:33]                   │ debg Waiting up to 60000ms for 'ihp_fi_binary_1612892557468' to have training_docs_count > 0...
[00:46:33]                   │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1612892557468...
[00:46:33]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ml-state-000001] creating index, cause [api], templates [.ml-state], shards [1]/[1]
[00:46:33]                   │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updating number_of_replicas to [0] for indices [.ml-state-000001]
[00:46:33]                   │ debg > DFA job stats fetched.
[00:46:33]                   │ debg --- retry.waitForWithTimeout error: expected data frame analytics job 'ihp_fi_binary_1612892557468' to have training_docs_count > 0 (got 0)
[00:46:33]                   │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ml-state-000001] from [null] to [{"phase":"new","action":"complete","name":"complete"}] in policy [ml-size-based-ilm-policy]
[00:46:33]                   │ info [o.e.x.c.m.u.MlIndexAndAlias] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] About to create first concrete index [.ml-stats-000001] with alias [.ml-stats-write]
[00:46:33]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [.ml-stats-000001] creating index, cause [api], templates [.ml-stats], shards [1]/[1]
[00:46:33]                   │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updating number_of_replicas to [0] for indices [.ml-stats-000001]
[00:46:33]                   │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ml-state-000001] from [{"phase":"new","action":"complete","name":"complete"}] to [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] in policy [ml-size-based-ilm-policy]
[00:46:34]                   │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ml-stats-000001] from [null] to [{"phase":"new","action":"complete","name":"complete"}] in policy [ml-size-based-ilm-policy]
[00:46:34]                   │ info [o.e.x.m.d.s.ReindexingStep] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Creating destination index [user-ihp_fi_binary_1612892557468]
[00:46:34]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [user-ihp_fi_binary_1612892557468] creating index, cause [api], templates [], shards [1]/[1]
[00:46:34]                   │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ml-state-000001] from [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] to [{"phase":"hot","action":"rollover","name":"check-rollover-ready"}] in policy [ml-size-based-ilm-policy]
[00:46:34]                   │ info [o.e.x.m.d.s.ReindexingStep] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Started reindexing
[00:46:34]                   │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ml-stats-000001] from [{"phase":"new","action":"complete","name":"complete"}] to [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] in policy [ml-size-based-ilm-policy]
[00:46:34]                   │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] moving index [.ml-stats-000001] from [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] to [{"phase":"hot","action":"rollover","name":"check-rollover-ready"}] in policy [ml-size-based-ilm-policy]
[00:46:34]                   │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1612892557468...
[00:46:34]                   │ debg > DFA job stats fetched.
[00:46:34]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:34]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Started loading data
[00:46:34]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Started analyzing
[00:46:34]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Waiting for result processor to complete
[00:46:34]                   │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1612892557468...
[00:46:34]                   │ debg > DFA job stats fetched.
[00:46:34]                   │ debg Waiting up to 120000ms for analytics state to be stopped...
[00:46:34]                   │ debg Fetching analytics state for job ihp_fi_binary_1612892557468
[00:46:34]                   │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1612892557468...
[00:46:34]                   │ debg > DFA job stats fetched.
[00:46:34]                   │ debg --- retry.waitForWithTimeout error: expected analytics state to be stopped but got started
[00:46:35] … [00:46:37]       │ debg (polling cycle repeated 5 more times: fetching analytics state and DFA job stats for ihp_fi_binary_1612892557468, each ending "--- retry.waitForWithTimeout failed again with the same message...")
[00:46:37]                   │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] [data_frame_analyzer/280342] [CBoostedTreeImpl.cc@245] Hyperparameter selection failed: exiting loop early
[00:46:37]                   │ info [o.e.x.m.d.p.ChunkedTrainedModelPersister] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] finished storing trained model with id [ihp_fi_binary_1612892557468-1612895355955]
[00:46:37]                   │ info [o.e.x.m.d.p.AnalyticsResultProcessor] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Started writing results
[00:46:37]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Result processor has completed
[00:46:37]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Closing process
[00:46:37]                   │ info [o.e.x.m.p.AbstractNativeProcess] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] State output finished
[00:46:37]                   │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] [data_frame_analyzer/280342] [Main.cc@241] [{"name":"E_DFTPMEstimatedPeakMemoryUsage","description":"The upfront estimate of the peak memory training the predictive model would use","value":6040636}
[00:46:37]                   │      ,{"name":"E_DFTPMPeakMemoryUsage","description":"The peak memory training the predictive model used","value":290451}
[00:46:37]                   │      ,{"name":"E_DFTPMTimeToTrain","description":"The time it took to train the predictive model","value":2970}
[00:46:37]                   │      ,{"name":"E_DFTPMTrainedForestNumberTrees","description":"The total number of trees in the trained forest","value":6}
[00:46:37]                   │      ]
[00:46:37]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Closed process
[00:46:37]                   │ info [o.e.x.m.d.i.InferenceRunner] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Started inference on test data against model [ihp_fi_binary_1612892557468-1612895355955]
[00:46:37]                   │ debg Fetching analytics state for job ihp_fi_binary_1612892557468
[00:46:37]                   │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1612892557468...
[00:46:38]                   │ debg > DFA job stats fetched.
[00:46:38]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:38]                   │ info [o.e.x.m.d.DataFrameAnalyticsManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_binary_1612892557468] Marking task completed
[00:46:38]                   │ debg Fetching analytics state for job ihp_fi_binary_1612892557468
[00:46:38]                   │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1612892557468...
[00:46:38]                   │ debg > DFA job stats fetched.
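The `retry.waitForWithTimeout` cycles above follow a simple poll-until-true pattern: fetch job stats, compare against the expected state, log the same failure message on each miss, and give up after the stated timeout. A minimal sketch of such a helper, in TypeScript, is below — the name and log wording mirror what appears in this log, but the signature and internals are illustrative assumptions, not the actual Kibana functional-test-runner implementation.

```typescript
// Hypothetical sketch of a waitForWithTimeout-style polling helper.
// `description` feeds the error message; `predicate` is polled every
// `intervalMs` until it resolves true or `timeoutMs` elapses.
async function waitForWithTimeout(
  description: string,
  timeoutMs: number,
  predicate: () => Promise<boolean>,
  intervalMs = 500
): Promise<void> {
  const start = Date.now();
  let lastMessage: string | undefined;
  while (Date.now() - start < timeoutMs) {
    if (await predicate()) {
      return; // condition met, stop polling
    }
    const message = `expected ${description}`;
    if (message !== lastMessage) {
      // First miss: log the full expectation, as in the log above
      console.log(`--- retry.waitForWithTimeout error: ${message}`);
      lastMessage = message;
    } else {
      // Subsequent identical misses collapse into a shorter line
      console.log('--- retry.waitForWithTimeout failed again with the same message...');
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`waitForWithTimeout: timed out after ${timeoutMs}ms waiting for ${description}`);
}
```

Used this way, a test can wait for an analytics job to reach a terminal state while keeping the log readable: identical failures repeat a short line instead of the full expectation.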
[00:46:38]                   │ info [ml/ihp_outlier] Loading "mappings.json"
[00:46:38]                   │ info [ml/ihp_outlier] Loading "data.json.gz"
[00:46:38]                   │ info [ml/ihp_outlier] Skipped restore for existing index "ft_ihp_outlier"
[00:46:38]                   │ debg Searching for 'index-pattern' with title 'ft_ihp_outlier'...
[00:46:38]                   │ debg  > Found '5e6fba30-6afe-11eb-9026-1db6602de315'
[00:46:38]                   │ debg Index pattern with title 'ft_ihp_outlier' already exists. Nothing to create.
[00:46:38]                   │ debg Creating data frame analytic job with id 'ihp_fi_multi_1612892557468' ...
[00:46:39]                   │ debg Waiting up to 5000ms for 'ihp_fi_multi_1612892557468' to exist...
[00:46:39]                   │ debg Fetching data frame analytics job 'ihp_fi_multi_1612892557468'...
[00:46:39]                   │ debg > DFA job fetched.
[00:46:39]                   │ debg > DFA job created.
[00:46:39]                   │ debg Starting data frame analytics job 'ihp_fi_multi_1612892557468'...
[00:46:39]                   │ info [o.e.x.m.a.TransportStartDataFrameAnalyticsAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Starting data frame analytics from state [stopped]
[00:46:39]                   │ debg > DFA job started.
[00:46:39]                   │ debg Waiting up to 60000ms for 'ihp_fi_multi_1612892557468' to have training_docs_count > 0...
[00:46:39]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:39]                   │ info [o.e.x.m.d.s.ReindexingStep] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Creating destination index [user-ihp_fi_multi_1612892557468]
[00:46:39]                   │ debg > DFA job stats fetched.
[00:46:39]                   │ debg --- retry.waitForWithTimeout error: expected data frame analytics job 'ihp_fi_multi_1612892557468' to have training_docs_count > 0 (got 0)
[00:46:39]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [user-ihp_fi_multi_1612892557468] creating index, cause [api], templates [], shards [1]/[1]
[00:46:39]                   │ info [o.e.x.m.d.s.ReindexingStep] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Started reindexing
[00:46:40]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:40]                   │ debg > DFA job stats fetched.
[00:46:40]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:40]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Started loading data
[00:46:40]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Started analyzing
[00:46:40]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Waiting for result processor to complete
[00:46:40]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:40]                   │ debg > DFA job stats fetched.
[00:46:40]                   │ debg Waiting up to 120000ms for analytics state to be stopped...
[00:46:40]                   │ debg Fetching analytics state for job ihp_fi_multi_1612892557468
[00:46:40]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:40]                   │ debg > DFA job stats fetched.
[00:46:40]                   │ debg --- retry.waitForWithTimeout error: expected analytics state to be stopped but got started
[00:46:41] … [00:46:44]       │ debg (polling cycle repeated 7 more times: fetching analytics state and DFA job stats for ihp_fi_multi_1612892557468, each ending "--- retry.waitForWithTimeout failed again with the same message...")
[00:46:44]                   │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] [data_frame_analyzer/280539] [CBoostedTreeImpl.cc@245] Hyperparameter selection failed: exiting loop early
[00:46:44]                   │ debg Fetching analytics state for job ihp_fi_multi_1612892557468
[00:46:44]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:44]                   │ debg > DFA job stats fetched.
[00:46:44]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:45]                   │ debg Fetching analytics state for job ihp_fi_multi_1612892557468
[00:46:45]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:45]                   │ debg > DFA job stats fetched.
[00:46:45]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:45]                   │ info [o.e.x.m.d.p.ChunkedTrainedModelPersister] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] finished storing trained model with id [ihp_fi_multi_1612892557468-1612895363826]
[00:46:45]                   │ info [o.e.x.m.d.p.AnalyticsResultProcessor] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Started writing results
[00:46:45]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Result processor has completed
[00:46:45]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Closing process
[00:46:45]                   │ info [o.e.x.m.p.AbstractNativeProcess] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] State output finished
[00:46:45]                   │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] [data_frame_analyzer/280539] [Main.cc@241] [{"name":"E_DFTPMEstimatedPeakMemoryUsage","description":"The upfront estimate of the peak memory training the predictive model would use","value":9613500}
[00:46:45]                   │      ,{"name":"E_DFTPMPeakMemoryUsage","description":"The peak memory training the predictive model used","value":1209256}
[00:46:45]                   │      ,{"name":"E_DFTPMTimeToTrain","description":"The time it took to train the predictive model","value":4291}
[00:46:45]                   │      ,{"name":"E_DFTPMTrainedForestNumberTrees","description":"The total number of trees in the trained forest","value":26}
[00:46:45]                   │      ]
[00:46:45]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Closed process
[00:46:45]                   │ info [o.e.x.m.d.i.InferenceRunner] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Started inference on test data against model [ihp_fi_multi_1612892557468-1612895363826]
[00:46:45]                   │ debg Fetching analytics state for job ihp_fi_multi_1612892557468
[00:46:45]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:45]                   │ debg > DFA job stats fetched.
[00:46:45]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:46]                   │ info [o.e.x.m.d.DataFrameAnalyticsManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [ihp_fi_multi_1612892557468] Marking task completed
[00:46:46]                   │ debg Fetching analytics state for job ihp_fi_multi_1612892557468
[00:46:46]                   │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1612892557468...
[00:46:46]                   │ debg > DFA job stats fetched.
[00:46:46]                   │ info [ml/egs_regression] Loading "mappings.json"
[00:46:46]                   │ info [ml/egs_regression] Loading "data.json.gz"
[00:46:46]                   │ info [ml/egs_regression] Skipped restore for existing index "ft_egs_regression"
[00:46:46]                   │ debg Searching for 'index-pattern' with title 'ft_egs_regression'...
[00:46:46]                   │ debg  > Found '5539e1b0-6b04-11eb-9026-1db6602de315'
[00:46:46]                   │ debg Index pattern with title 'ft_egs_regression' already exists. Nothing to create.
[00:46:46]                   │ debg Creating data frame analytic job with id 'egs_fi_reg_1612892557468' ...
[00:46:47]                   │ debg Waiting up to 5000ms for 'egs_fi_reg_1612892557468' to exist...
[00:46:47]                   │ debg Fetching data frame analytics job 'egs_fi_reg_1612892557468'...
[00:46:47]                   │ debg > DFA job fetched.
[00:46:47]                   │ debg > DFA job created.
[00:46:47]                   │ debg Starting data frame analytics job 'egs_fi_reg_1612892557468'...
[00:46:47]                   │ info [o.e.x.m.a.TransportStartDataFrameAnalyticsAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Starting data frame analytics from state [stopped]
[00:46:47]                   │ debg > DFA job started.
[00:46:47]                   │ debg Waiting up to 60000ms for 'egs_fi_reg_1612892557468' to have training_docs_count > 0...
[00:46:47]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:46:47]                   │ info [o.e.x.m.d.s.ReindexingStep] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Creating destination index [user-egs_fi_reg_1612892557468]
[00:46:47]                   │ debg > DFA job stats fetched.
[00:46:47]                   │ debg --- retry.waitForWithTimeout error: expected data frame analytics job 'egs_fi_reg_1612892557468' to have training_docs_count > 0 (got 0)
[00:46:47]                   │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [user-egs_fi_reg_1612892557468] creating index, cause [api], templates [], shards [1]/[1]
[00:46:47]                   │ info [o.e.x.m.d.s.ReindexingStep] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Started reindexing
[00:46:48]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:46:48]                   │ debg > DFA job stats fetched.
[00:46:48]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:46:48]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Started loading data
[00:46:48]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Started analyzing
[00:46:48]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Waiting for result processor to complete
[00:46:48]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:46:48]                   │ debg > DFA job stats fetched.
[00:46:48]                   │ debg Waiting up to 120000ms for analytics state to be stopped...
[00:46:48]                   │ debg Fetching analytics state for job egs_fi_reg_1612892557468
[00:46:48]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:46:48]                   │ debg > DFA job stats fetched.
[00:46:48]                   │ debg --- retry.waitForWithTimeout error: expected analytics state to be stopped but got started
[00:46:49] … [00:47:05]       │ debg (polling cycle repeated 32 more times: fetching analytics state and DFA job stats for egs_fi_reg_1612892557468, each ending "--- retry.waitForWithTimeout failed again with the same message...")
[00:47:05]                   │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] [data_frame_analyzer/281060] [CBoostedTreeImpl.cc@245] Hyperparameter selection failed: exiting loop early
[00:47:05]                   │ debg Fetching analytics state for job egs_fi_reg_1612892557468
[00:47:05]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:47:05]                   │ debg > DFA job stats fetched.
[00:47:05]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:47:05]                   │ info [o.e.x.m.d.p.ChunkedTrainedModelPersister] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] finished storing trained model with id [egs_fi_reg_1612892557468-1612895384147]
[00:47:05]                   │ info [o.e.x.m.d.p.AnalyticsResultProcessor] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Started writing results
[00:47:06]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Result processor has completed
[00:47:06]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Closing process
[00:47:06]                   │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] [data_frame_analyzer/281060] [Main.cc@241] [{"name":"E_DFTPMEstimatedPeakMemoryUsage","description":"The upfront estimate of the peak memory training the predictive model would use","value":7582544}
[00:47:06]                   │      ,{"name":"E_DFTPMPeakMemoryUsage","description":"The peak memory training the predictive model used","value":7314584}
[00:47:06]                   │      ,{"name":"E_DFTPMTimeToTrain","description":"The time it took to train the predictive model","value":17096}
[00:47:06]                   │      ,{"name":"E_DFTPMTrainedForestNumberTrees","description":"The total number of trees in the trained forest","value":190}
[00:47:06]                   │      ]
[00:47:06]                   │ info [o.e.x.m.p.AbstractNativeProcess] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] State output finished
[00:47:06]                   │ info [o.e.x.m.d.p.AnalyticsProcessManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Closed process
[00:47:06]                   │ debg Fetching analytics state for job egs_fi_reg_1612892557468
[00:47:06]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:47:06]                   │ info [o.e.x.m.d.i.InferenceRunner] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Started inference on test data against model [egs_fi_reg_1612892557468-1612895384147]
[00:47:06]                   │ debg > DFA job stats fetched.
[00:47:06]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:47:06]                   │ debg Fetching analytics state for job egs_fi_reg_1612892557468
[00:47:06]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:47:06]                   │ debg > DFA job stats fetched.
[00:47:06]                   │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:47:07]                   │ info [o.e.x.m.d.DataFrameAnalyticsManager] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] [egs_fi_reg_1612892557468] Marking task completed
[00:47:07]                   │ debg Fetching analytics state for job egs_fi_reg_1612892557468
[00:47:07]                   │ debg Fetching data frame analytics job stats for job egs_fi_reg_1612892557468...
[00:47:07]                   │ debg > DFA job stats fetched.
[00:47:07]                 └-: binary classification job
[00:47:07]                   └-> "before all" hook
[00:47:07]                   └-> "before all" hook
[00:47:07]                     │ debg navigating to ml url: http://localhost:61111/app/ml
[00:47:07]                     │ debg navigate to: http://localhost:61111/app/ml
[00:47:07]                     │ debg browser[INFO] http://localhost:61111/app/ml?_t=1612895385429 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:47:07]                     │
[00:47:07]                     │ debg browser[INFO] http://localhost:61111/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:47:07]                     │ debg ... sleep(700) start
[00:47:08]                     │ debg ... sleep(700) end
[00:47:08]                     │ debg returned from get, calling refresh
[00:47:08]                     │ERROR browser[SEVERE] http://localhost:61111/40119/bundles/core/core.entry.js 12:158432 TypeError: Failed to fetch
[00:47:08]                     │          at fetch_Fetch.fetchResponse (http://localhost:61111/40119/bundles/core/core.entry.js:6:32451)
[00:47:08]                     │          at async interceptResponse (http://localhost:61111/40119/bundles/core/core.entry.js:6:28637)
[00:47:08]                     │          at async http://localhost:61111/40119/bundles/core/core.entry.js:6:31117
[00:47:08]                     │ debg browser[INFO] http://localhost:61111/app/ml?_t=1612895385429 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:47:08]                     │
[00:47:08]                     │ debg browser[INFO] http://localhost:61111/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:47:09]                     │ debg currentUrl = http://localhost:61111/app/ml
[00:47:09]                     │          appUrl = http://localhost:61111/app/ml
[00:47:09]                     │ debg TestSubjects.find(kibanaChrome)
[00:47:09]                     │ debg Find.findByCssSelector('[data-test-subj="kibanaChrome"]') with timeout=60000
[00:47:09]                     │ debg ... sleep(501) start
[00:47:09]                     │ debg ... sleep(501) end
[00:47:09]                     │ debg in navigateTo url = http://localhost:61111/app/ml/overview
[00:47:09]                     │ debg --- retry.try error: URL changed, waiting for it to settle
[00:47:10]                     │ debg ... sleep(501) start
[00:47:10]                     │ debg ... sleep(501) end
[00:47:10]                     │ debg in navigateTo url = http://localhost:61111/app/ml/overview
[00:47:10]                     │ debg TestSubjects.exists(statusPageContainer)
[00:47:10]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="statusPageContainer"]') with timeout=2500
[00:47:13]                     │ debg --- retry.tryForTime error: [data-test-subj="statusPageContainer"] is not displayed
[00:47:13]                     │ debg TestSubjects.exists(mlApp)
[00:47:13]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlApp"]') with timeout=2000
[00:47:13]                     │ debg TestSubjects.click(~mlMainTab & ~dataFrameAnalytics)
[00:47:13]                     │ debg Find.clickByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"]') with timeout=10000
[00:47:13]                     │ debg Find.findByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"]') with timeout=10000
[00:47:14]                     │ debg TestSubjects.exists(~mlMainTab & ~dataFrameAnalytics & ~selected)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"][data-test-subj~="selected"]') with timeout=120000
[00:47:14]                     │ debg TestSubjects.exists(mlPageDataFrameAnalytics)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlPageDataFrameAnalytics"]') with timeout=120000
[00:47:14]                     │ debg TestSubjects.exists(~mlAnalyticsTable)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlAnalyticsTable"]') with timeout=60000
[00:47:14]                     │ debg TestSubjects.exists(mlAnalyticsTable loaded)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsTable loaded"]') with timeout=30000
[00:47:14]                     │ debg TestSubjects.exists(~mlAnalyticsTable > ~row-ihp_fi_binary_1612892557468 > mlAnalyticsJobViewButton)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-ihp_fi_binary_1612892557468"] [data-test-subj="mlAnalyticsJobViewButton"]') with timeout=120000
[00:47:14]                     │ debg TestSubjects.click(~mlAnalyticsTable > ~row-ihp_fi_binary_1612892557468 > mlAnalyticsJobViewButton)
[00:47:14]                     │ debg Find.clickByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-ihp_fi_binary_1612892557468"] [data-test-subj="mlAnalyticsJobViewButton"]') with timeout=10000
[00:47:14]                     │ debg Find.findByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-ihp_fi_binary_1612892557468"] [data-test-subj="mlAnalyticsJobViewButton"]') with timeout=10000
[00:47:14]                     │ debg TestSubjects.exists(mlPageDataFrameAnalyticsExploration)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlPageDataFrameAnalyticsExploration"]') with timeout=20000
[00:47:14]                   └-> should display the total feature importance in the results view
[00:47:14]                     └-> "before each" hook: global before each
[00:47:14]                     │ debg TestSubjects.exists(mlDFExpandableSection-FeatureImportanceSummary)
[00:47:14]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlDFExpandableSection-FeatureImportanceSummary"]') with timeout=120000
[00:47:16]                     │ debg TestSubjects.exists(mlTotalFeatureImportanceChart)
[00:47:16]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlTotalFeatureImportanceChart"]') with timeout=5000
[00:47:16]                     └- ✓ pass  (1.3s) "machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the total feature importance in the results view"
[00:47:16]                   └-> should display the feature importance decision path in the data grid
[00:47:16]                     └-> "before each" hook: global before each
[00:47:16]                     │ debg TestSubjects.exists(mlExplorationDataGrid loaded)
[00:47:16]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlExplorationDataGrid loaded"]') with timeout=5000
[00:47:16]                     │ debg TestSubjects.findAll(mlExplorationDataGrid loaded > dataGridRow)
[00:47:16]                     │ debg Find.allByCssSelector('[data-test-subj="mlExplorationDataGrid loaded"] [data-test-subj="dataGridRow"]') with timeout=10000
[00:47:17]                     │ debg TestSubjects.findAll(mlExplorationDataGrid loaded > dataGridRow)
[00:47:17]                     │ debg Find.allByCssSelector('[data-test-subj="mlExplorationDataGrid loaded"] [data-test-subj="dataGridRow"]') with timeout=10000
[00:47:17]                     │ debg TestSubjects.find(mlExplorationDataGrid loaded > dataGridRow)
[00:47:17]                     │ debg Find.findByCssSelector('[data-test-subj="mlExplorationDataGrid loaded"] [data-test-subj="dataGridRow"]') with timeout=10000
[00:47:18]                     │ debg TestSubjects.exists(mlDFADecisionPathPopover)
[00:47:18]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlDFADecisionPathPopover"]') with timeout=120000
[00:47:21]                     │ debg --- retry.tryForTime error: [data-test-subj="mlDFADecisionPathPopover"] is not displayed
[00:47:24]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:27]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:30]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:33]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:36]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:39]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:42]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:45]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:48]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:51]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:54]                     │ debg --- retry.tryForTime failed again with the same message...
[00:47:57]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:00]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:03]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:06]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:09]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:12]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:15]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:18]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:21]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:24]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:27]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:30]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:33]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:36]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:39]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:43]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:46]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:49]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:52]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:55]                     │ debg --- retry.tryForTime failed again with the same message...
[00:48:58]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:01]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:04]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:07]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:10]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:13]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:16]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:19]                     │ debg --- retry.tryForTime failed again with the same message...
[00:49:19]                     │ info Taking screenshot "/dev/shm/workspace/parallel/11/kibana/x-pack/test/functional/screenshots/failure/machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the feature importance decision path in the data grid.png"
[00:49:20]                     │ info Current URL is: http://localhost:61111/app/ml/data_frame_analytics/exploration?_g=(ml%3A(analysisType%3Aclassification%2CjobId%3Aihp_fi_binary_1612892557468))
[00:49:20]                     │ info Saving page source to: /dev/shm/workspace/parallel/11/kibana/x-pack/test/functional/failure_debug/html/machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the feature importance decision path in the data grid.html
[00:49:20]                     └- ✖ fail: machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the feature importance decision path in the data grid
[00:49:20]                     │      Error: expected testSubject(mlDFADecisionPathPopover) to exist
[00:49:20]                     │       at TestSubjects.existOrFail (/dev/shm/workspace/parallel/11/kibana/test/functional/services/common/test_subjects.ts:51:15)
[00:49:20]                     │       at Object.openFeatureImportanceDecisionPathPopover (test/functional/services/ml/data_frame_analytics_results.ts:98:7)
[00:49:20]                     │       at Context.<anonymous> (test/functional/apps/ml/data_frame_analytics/feature_importance.ts:204:11)
[00:49:20]                     │       at Object.apply (/dev/shm/workspace/parallel/11/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)
[00:49:20]                     │ 
[00:49:20]                     │ 

Stack Trace

Error: expected testSubject(mlDFADecisionPathPopover) to exist
    at TestSubjects.existOrFail (/dev/shm/workspace/parallel/11/kibana/test/functional/services/common/test_subjects.ts:51:15)
    at Object.openFeatureImportanceDecisionPathPopover (test/functional/services/ml/data_frame_analytics_results.ts:98:7)
    at Context.<anonymous> (test/functional/apps/ml/data_frame_analytics/feature_importance.ts:204:11)
    at Object.apply (/dev/shm/workspace/parallel/11/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)

Kibana Pipeline / general / X-Pack Detection Engine API Integration Tests.x-pack/test/detection_engine_api_integration/security_and_spaces/tests/create_index·ts.detection engine api security and spaces enabled create_index t1_analyst should NOT be able to create a signal index when it has not been created yet. Should return a 403 and error that the user is unauthorized

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has failed 1 times on tracked branches: https://dryrun

[00:00:00]       │
[00:00:00]         └-: detection engine api security and spaces enabled
[00:00:00]           └-> "before all" hook
[00:00:00]           └-: 
[00:00:00]             └-> "before all" hook
[00:02:37]             └-: create_index
[00:02:37]               └-> "before all" hook
[00:02:38]               └-: t1_analyst
[00:02:38]                 └-> "before all" hook
[00:02:38]                 └-> should return a 404 when the signal index has never been created
[00:02:38]                   └-> "before each" hook: global before each
[00:02:38]                   └-> "before each" hook
[00:02:38]                     │ debg creating role t1_analyst
[00:02:38]                     │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [t1_analyst]
[00:02:38]                     │ debg creating user t1_analyst
[00:02:38]                     │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [t1_analyst]
[00:02:38]                     │ debg created user t1_analyst
[00:02:39]                   └- ✓ pass  (376ms) "detection engine api security and spaces enabled  create_index t1_analyst should return a 404 when the signal index has never been created"
[00:02:39]                 └-> "after each" hook
[00:02:39]                 └-> should NOT be able to create a signal index when it has not been created yet. Should return a 403 and error that the user is unauthorized
[00:02:39]                   └-> "before each" hook: global before each
[00:02:39]                   └-> "before each" hook
[00:02:39]                     │ debg creating role t1_analyst
[00:02:39]                     │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updated role [t1_analyst]
[00:02:39]                     │ debg creating user t1_analyst
[00:02:39]                     │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updated user [t1_analyst]
[00:02:39]                     │ debg created user t1_analyst
[00:02:39]                   └- ✖ fail: detection engine api security and spaces enabled  create_index t1_analyst should NOT be able to create a signal index when it has not been created yet. Should return a 403 and error that the user is unauthorized
[00:02:39]                   │       Error: expected { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]',
[00:02:39]                   │   status_code: 403 } to sort of equal { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]',
[00:02:39]                   │   status_code: 403 }
[00:02:39]                   │       + expected - actual
[00:02:39]                   │ 
[00:02:39]                   │        {
[00:02:39]                   │       -  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]"
[00:02:39]                   │       +  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]"
[00:02:39]                   │          "status_code": 403
[00:02:39]                   │        }
[00:02:39]                   │       
[00:02:39]                   │       at Assertion.assert (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:100:11)
[00:02:39]                   │       at Assertion.eql (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:244:8)
[00:02:39]                   │       at Context.<anonymous> (test/detection_engine_api_integration/security_and_spaces/tests/create_index.ts:87:25)
[00:02:39]                   │       at Object.apply (/dev/shm/workspace/parallel/9/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)
[00:02:39]                   │ 
[00:02:39]                   │ 

Stack Trace

Error: expected { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]',
  status_code: 403 } to sort of equal { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]',
  status_code: 403 }
    at Assertion.assert (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:100:11)
    at Assertion.eql (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:244:8)
    at Context.<anonymous> (test/detection_engine_api_integration/security_and_spaces/tests/create_index.ts:87:25)
    at Object.apply (/dev/shm/workspace/parallel/9/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16) {
  actual: '{\n' +
    '  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]"\n' +
    '  "status_code": 403\n' +
    '}',
  expected: '{\n' +
    '  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]"\n' +
    '  "status_code": 403\n' +
    '}',
  showDiff: true
}

Kibana Pipeline / general / X-Pack Detection Engine API Integration Tests.x-pack/test/detection_engine_api_integration/security_and_spaces/tests/create_index·ts.detection engine api security and spaces enabled create_index t1_analyst should NOT be able to create a signal index when it has not been created yet. Should return a 403 and error that the user is unauthorized

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: detection engine api security and spaces enabled
[00:00:00]           └-> "before all" hook
[00:00:00]           └-: 
[00:00:00]             └-> "before all" hook
[00:02:35]             └-: create_index
[00:02:35]               └-> "before all" hook
[00:02:36]               └-: t1_analyst
[00:02:36]                 └-> "before all" hook
[00:02:36]                 └-> should return a 404 when the signal index has never been created
[00:02:36]                   └-> "before each" hook: global before each
[00:02:36]                   └-> "before each" hook
[00:02:36]                     │ debg creating role t1_analyst
[00:02:36]                     │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added role [t1_analyst]
[00:02:36]                     │ debg creating user t1_analyst
[00:02:36]                     │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] added user [t1_analyst]
[00:02:36]                     │ debg created user t1_analyst
[00:02:36]                   └- ✓ pass  (421ms) "detection engine api security and spaces enabled  create_index t1_analyst should return a 404 when the signal index has never been created"
[00:02:36]                 └-> "after each" hook
[00:02:36]                 └-> should NOT be able to create a signal index when it has not been created yet. Should return a 403 and error that the user is unauthorized
[00:02:36]                   └-> "before each" hook: global before each
[00:02:36]                   └-> "before each" hook
[00:02:36]                     │ debg creating role t1_analyst
[00:02:36]                     │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updated role [t1_analyst]
[00:02:36]                     │ debg creating user t1_analyst
[00:02:36]                     │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-centos-tests-xxl-1612890497656166486] updated user [t1_analyst]
[00:02:36]                     │ debg created user t1_analyst
[00:02:36]                   └- ✖ fail: detection engine api security and spaces enabled  create_index t1_analyst should NOT be able to create a signal index when it has not been created yet. Should return a 403 and error that the user is unauthorized
[00:02:36]                   │       Error: expected { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]',
[00:02:36]                   │   status_code: 403 } to sort of equal { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]',
[00:02:36]                   │   status_code: 403 }
[00:02:36]                   │       + expected - actual
[00:02:36]                   │ 
[00:02:36]                   │        {
[00:02:36]                   │       -  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]"
[00:02:36]                   │       +  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]"
[00:02:36]                   │          "status_code": 403
[00:02:36]                   │        }
[00:02:36]                   │       
[00:02:36]                   │       at Assertion.assert (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:100:11)
[00:02:36]                   │       at Assertion.eql (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:244:8)
[00:02:36]                   │       at Context.<anonymous> (test/detection_engine_api_integration/security_and_spaces/tests/create_index.ts:87:25)
[00:02:36]                   │       at Object.apply (/dev/shm/workspace/parallel/9/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)
[00:02:36]                   │ 
[00:02:36]                   │ 

Stack Trace

Error: expected { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]',
  status_code: 403 } to sort of equal { message: 'security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]',
  status_code: 403 }
    at Assertion.assert (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:100:11)
    at Assertion.eql (/dev/shm/workspace/parallel/9/kibana/packages/kbn-expect/expect.js:244:8)
    at Context.<anonymous> (test/detection_engine_api_integration/security_and_spaces/tests/create_index.ts:87:25)
    at Object.apply (/dev/shm/workspace/parallel/9/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16) {
  actual: '{\n' +
    '  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the cluster privileges [read_ilm,manage_ilm,manage,all]"\n' +
    '  "status_code": 403\n' +
    '}',
  expected: '{\n' +
    '  "message": "security_exception: action [cluster:admin/ilm/get] is unauthorized for user [t1_analyst], this action is granted by the privileges [read_ilm,manage_ilm,manage,all]"\n' +
    '  "status_code": 403\n' +
    '}',
  showDiff: true
}

and 1 more failure, only showing the first 3.

Metrics [docs]

✅ unchanged


spalger added a commit that referenced this pull request Feb 9, 2021
Co-authored-by: spalger <spalger@users.noreply.github.com>
  this._log.indent(4);

- const esArgs = options.esArgs || [];
+ const esArgs = ['action.destructive_requires_name=true', ...(options.esArgs || [])];
Contributor

@spalger any thoughts about making this overridable? Being able to delete indices via wildcard has been helpful for exploring empty case behavior. Specifically I found myself doing it while working on #110432

Contributor Author

I'd prefer to add an action to node scripts/es_archiver or something that deletes all indices matching a wildcard, like https://github.com/elastic/kibana/blob/master/test/common/services/es_delete_all_indices.ts
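For illustration, the looping strategy that service uses can be sketched roughly like this. This is a minimal sketch, not the real implementation: the `IndicesClient` interface and the `deleteAllIndices` helper are hypothetical stand-ins, not the actual @elastic/elasticsearch client API.

```typescript
// Hypothetical stand-in for the parts of the ES client this sketch needs;
// the real client exposes different method signatures.
interface IndicesClient {
  resolve(patterns: string[]): Promise<string[]>; // concrete index names matching the patterns
  delete(names: string[]): Promise<void>; // delete by concrete name only (no wildcards)
}

// Resolve wildcards to concrete index names and delete those by name,
// looping because new matching indices can appear between resolve and delete.
async function deleteAllIndices(
  client: IndicesClient,
  patterns: string[],
  maxAttempts = 10
): Promise<string[]> {
  const deleted: string[] = [];
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const names = await client.resolve(patterns);
    // done once the wildcards no longer resolve to any concrete indices
    if (names.length === 0) return deleted;
    await client.delete(names);
    deleted.push(...names);
  }
  throw new Error(`indices still match ${patterns.join(', ')} after ${maxAttempts} attempts`);
}
```

Because every delete call names concrete indices, this works even on clusters where action.destructive_requires_name=true.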

Contributor Author

Developing against an ES cluster with this set to false feels a little risky, but I hear you, and we could definitely make that an option.

Contributor Author

@spalger spalger Sep 16, 2021

Example of adding a command to the CLI: https://github.com/elastic/kibana/blob/master/packages/kbn-es-archiver/src/cli.ts#L260-L267. If you wanted to move the majority of the logic from that FTR service into the esArchiver and import it, we could share the implementation.

Contributor

Oh interesting. This is the first I'd heard of es_archiver. I'll check that out. Thanks!

Contributor

@matschaffer matschaffer Sep 16, 2021

I spent some time playing with this today and came up with #112376 but running it is a little cumbersome for things like local docker testing envs.

Would you have any interest in also making es_archiver able to read kibana.dev.yml?

Needing to replicate my connection configuration into a format that es_archiver can work with means it might just be easier to override destructive_requires_name (via _cluster/settings) when I need this sort of thing.
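For reference, since the setting is dynamic, a runtime override via PUT _cluster/settings would look something like the body below (a sketch for local dev clusters; not something to leave enabled):

```json
{
  "persistent": {
    "action.destructive_requires_name": false
  }
}
```

Setting it back to true (or nulling the override) restores the guard against wildcard deletes.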

Contributor Author

I'm definitely interested in es_archiver being able to read kibana.dev.yml. All of that is handled in packages/kbn-es-archiver/src/cli.ts, and shouldn't really impact the EsArchiver class. Reading the kibana.yml and kibana.dev.yml files can be a little tricky as we technically support compound keys:

nested.setting: foo
# and
nested:
  setting: foo

There is an implementation in https://github.com/elastic/kibana/blob/72dd0578eadf395897903075a2ae4992fb52c577/packages/kbn-apm-config-loader/src/utils/read_config.ts#L41-L54 but maybe @pgayvallet has a better example in mind.
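As a rough illustration of the compound-key wrinkle, a lookup helper would need to accept both spellings shown above. This is a hypothetical sketch (getConfigValue is not an existing Kibana helper), assuming the YAML has already been parsed into a plain object:

```typescript
type Config = Record<string, unknown>;

// Hypothetical lookup that tolerates both compound-key spellings:
//   { "nested.setting": "foo" }  and  { nested: { setting: "foo" } }
function getConfigValue(config: Config, key: string): unknown {
  // an exact dotted key wins if present
  if (key in config) return config[key];
  // otherwise walk the nested form one path segment at a time
  let node: unknown = config;
  for (const part of key.split('.')) {
    if (typeof node !== 'object' || node === null || !(part in (node as Config))) {
      return undefined;
    }
    node = (node as Config)[part];
  }
  return node;
}
```

A sketch like this ignores mixed forms (e.g. a dotted prefix containing a nested map), which a real config loader would also have to merge.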

Contributor

Nice, thanks! If we can do that I can probably get something like node scripts/es_archiver.js delete-indices '.monitoring-*' which would be better than manually deleting specific indices via dev tools.


Labels

backport:skip This PR does not require backporting release_note:enhancement Team:Operations Kibana-Operations Team v7.12.0 v8.0.0

Projects

None yet

Development

Successfully merging this pull request may close these issues.

10 participants