
[Metrics] Metrics are not exported when using different jobs #13372

@architkulkarni

Description


What is the problem?

In an IPython session, run

import ray
from ray.util import metrics
ray.init(_metrics_export_port=9999)

and leave it running. Then, in a second terminal, open another IPython session and run

import ray
from ray.util import metrics

ray.init(address="auto")
error_counter = metrics.Count(
    "fake_error_counter",
    description=("The number of exceptions that have "
                 "occurred in the backend."),
    tag_keys=("backend",))
error_counter.set_default_tags({"backend": "FAKE BACKEND"})

error_counter.record(12345)

If you check localhost:9999, this metric never appears. However, if you go back to the first IPython session and register a different custom metric (say fake_error_counter_2), that metric is exported successfully.
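To check the export endpoint programmatically rather than by eye, one can fetch the scrape output and scan it for the metric name. A minimal sketch, assuming the port configured above serves Prometheus text-format output (the helper names here are hypothetical, not part of Ray's API):

```python
from urllib.request import urlopen


def metric_names(metrics_text):
    """Extract metric names from Prometheus text-format output."""
    names = set()
    for line in metrics_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip HELP/TYPE comments and blank lines
        # A sample line looks like: name{tags} value -- keep the name part.
        names.add(line.split("{")[0].split(" ")[0])
    return names


def metric_exported(name, url="http://localhost:9999"):
    """Return True if `name` appears in the scrape output at `url`."""
    body = urlopen(url).read().decode()
    return name in metric_names(body)
```

Note that Ray may prefix custom metric names in the exported output (e.g. with `ray_`), so if the bare name is not found it is worth inspecting the raw scrape text directly.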

Ray version and other system information (Python version, TensorFlow version, OS):
Ray nightly, Python 3.6, Mac OS Big Sur

Reproduction (REQUIRED)

Please provide a short code snippet (less than 50 lines if possible) that can be copy-pasted to reproduce the issue. The snippet should have no external library dependencies (i.e., use fake or mock data / environments):

If the code snippet cannot be run by itself, the issue will be closed with "needs-repro-script".

  • I have verified my script runs in a clean environment and reproduces the issue.
  • I have verified the issue also occurs with the latest wheels.

Metadata

Labels

P1 (Issue that should be fixed within a few weeks), bug (Something that is supposed to be working; but isn't)
