
Capture telemetry for user perceived time to start jupyter server and start executing user code #2255

@DonJayamanne

Description


Today we have the following:

  • Telemetry that captures the time taken to start Jupyter.
    • We draw the conclusion that this is how long it takes for a user to run a cell.
    • This is no longer correct, because we now start the Jupyter server in the background. Assume starting Jupyter takes 10 seconds, and 30 seconds later the user opens a notebook and runs a cell. The cell would run in under 1 second, yet the telemetry reports 10 seconds.
    • We identified this because we're now using the ExecuteCellPerceivedCold/Warm telemetry for notebooks.
  • Using ExecuteCellPerceivedCold/Warm to conclude how long it takes to run the first cell is also inaccurate:
    • Assume the majority of users run a large, complex cell that takes 20 seconds.
    • This telemetry would report more than 20 seconds, when in fact Jupyter could have started in 3 seconds!

Suggestion (more accurate)

1. Create new telemetry to capture PerceivedJupyterStartup

  • In code, measure the time taken in InteractiveBase.submitCode from the start of the method until just after this._notebook.executeObservable is called. Do this only for the first time code is executed (see the sketch after this list).
  • I believe this accurately represents the time taken for the notebook to be ready to execute user cells, i.e. how long it takes for Jupyter to start before it is ready to run user code (in a notebook).
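A minimal, self-contained sketch of what suggestion 1 could look like. InteractiveBase.submitCode and this._notebook.executeObservable are the names mentioned above; the stand-in INotebook interface, the telemetry sender, the PerceivedJupyterStartup event name, and the firstExecutionReported flag are illustrative assumptions, not the extension's actual API:

```typescript
// Stand-in types; the real extension has its own INotebook and telemetry utilities.
type TelemetrySender = (eventName: string, durationMs: number) => void;

interface INotebook {
    // Simplified stand-in for the real executeObservable signature.
    executeObservable(code: string): { subscribe(next: (output: unknown) => void): void };
}

class InteractiveBase {
    private firstExecutionReported = false;

    constructor(
        private readonly _notebook: INotebook,
        private readonly sendTelemetry: TelemetrySender // stand-in for the extension's telemetry helper
    ) {}

    public async submitCode(code: string): Promise<void> {
        const start = Date.now();

        // ... existing logic that waits for the Jupyter server / notebook to be ready ...

        const observable = this._notebook.executeObservable(code);

        // Suggestion 1: report the time from entering submitCode until executeObservable
        // has been called, and only for the first cell the user executes.
        if (!this.firstExecutionReported) {
            this.firstExecutionReported = true;
            this.sendTelemetry('PerceivedJupyterStartup', Date.now() - start);
        }

        observable.subscribe(() => {
            // ... handle cell output as before ...
        });
    }
}
```

The key point is that the stopwatch stops as soon as executeObservable has been invoked, so a long-running first cell does not inflate the startup measurement.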

2. Create new telemetry to capture StartExecuteCellPerceivedCold

  • This would let us determine more accurately how long it takes Jupyter to start running a user cell.
  • It is the previous measurement plus the time until Jupyter reports it is busy (when it's busy we know it is executing user code); see the sketch below.
  • I think this is what we're actually after today, i.e. how long it takes our code to start running user code from the time the user hits run.
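A sketch for suggestion 2, building on the same timer: keep it running until the kernel first reports a 'busy' status, since that is the point at which we know user code has actually started executing. The onKernelStatus hook and the event name below are assumed stand-ins for however the extension surfaces kernel status messages:

```typescript
// Stand-in for however the extension surfaces Jupyter kernel status messages.
interface IKernelStatusListener {
    onKernelStatus(handler: (status: 'idle' | 'busy' | 'starting') => void): void;
}

function reportStartExecuteCellPerceivedCold(
    kernel: IKernelStatusListener,
    submitStart: number, // same Date.now() captured at the top of submitCode
    sendTelemetry: (eventName: string, durationMs: number) => void
): void {
    let reported = false;
    kernel.onKernelStatus(status => {
        // Once the kernel reports 'busy' we know it has started executing user code.
        if (status === 'busy' && !reported) {
            reported = true;
            sendTelemetry('StartExecuteCellPerceivedCold', Date.now() - submitStart);
        }
    });
}
```

This way the reported duration covers our own startup and queueing overhead, but still excludes the time the user's cell itself takes to run.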

@jmew @rchiodo @IanMatthewHuff @DavidKutu /cc
