Permanent elements are not always preserved #101

@stgm

Description

Related to #80

Actual problem

Permanent elements are not always preserved because, in certain circumstances, the "previous page" is cached in a state where permanent elements have already been replaced by placeholder meta tags. When navigating back, the incorrectly cached page is reinstated, and those cached meta tags are never replaced by the current contents of their associated permanent elements.
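To make the failure mode concrete, here is a toy model of why the moment of snapshotting matters. This is plain arrays, not Turbo's actual data structures; all names are invented for illustration:

```javascript
// Toy model of snapshot timing — NOT Turbo's implementation.
// A "page" is just a list of node descriptions. During body replacement,
// permanent elements are temporarily swapped for placeholder meta tags.
function snapshot(page) {
  return page.slice(); // stands in for cloning the DOM
}

const settledPage = ["header", "permanent:player", "footer"];
const midReplacementPage = ["header", "meta:placeholder", "footer"];

const goodCache = snapshot(settledPage);       // taken at the end of navigation
const badCache = snapshot(midReplacementPage); // taken too early — the bug

console.log(goodCache.includes("permanent:player")); // true
console.log(badCache.includes("permanent:player"));  // false — lost on "back"
```

If the snapshot is taken while the placeholders are still in place, the cache permanently loses the permanent elements, which matches the observed symptom.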

Circumstances

For me this only happens in Safari, when I navigate to a page containing a YouTube embed. It doesn't happen when I navigate to a very similar page without such an embed. It is also completely reproducible: it happens every time.

What does the log say

View.cacheSnapshot() is responsible for cloning the DOM of the old page. Under normal circumstances, the line making the clone executes at the very end of the page navigation sequence, right after Visit.loadResponse() has completely finished.

With the YouTube embed present, the cloning code from View.cacheSnapshot() is executed much earlier, right when the new body is assigned: assignNewBody() -> replaceElementWithElement() -> parentElement.replaceChild() (and before replaceElementWithElement() returns).

What does the debugger say

Unfortunately, not much. When stepping into the parentElement.replaceChild() call, execution immediately jumps to the cloning part of View.cacheSnapshot().

Also, no YouTube JS is encountered in the debugger until after Turbo finishes the page transition.

Interestingly, if the new page itself contains a script tag, it is executed right after the parentElement.replaceChild() call. This does not affect the moment at which the cloning code runs in any way: that still depends only on the presence of the YouTube embed.

await vs setTimeout

The old page's DOM is supposed to be cached after rendering the new page, because this speeds up presenting the new page. This is achieved by deferring the cloning code to a separate task using await. Previously, this was done using setTimeout().

As it happens, putting back setTimeout() does indeed ensure that the cloning always happens at the end of the navigation sequence.
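The scheduling difference between the two approaches can be seen in isolation. In this sketch (generic JavaScript, not Turbo's code), the awaited continuation runs on the microtask queue and therefore fires before a setTimeout(…, 0) callback scheduled at the same time:

```javascript
// Sketch: await defers to the microtask queue, setTimeout to the task queue.
const order = [];

async function cloneViaAwait() {
  await Promise.resolve(); // continuation is queued as a microtask
  order.push("await-clone");
}

function cloneViaTimeout() {
  setTimeout(() => order.push("timeout-clone"), 0); // queued as a (macro)task
}

cloneViaAwait();
cloneViaTimeout();
order.push("sync-work"); // stands in for the rest of the rendering sequence

setTimeout(() => console.log(order.join(" → ")), 0);
// → "sync-work → await-clone → timeout-clone"
```

Note that an await continuation can run as soon as the current JavaScript stack empties, whereas a setTimeout callback has to wait for the next task. That difference is consistent with setTimeout() pushing the clone reliably to the end of the navigation sequence.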

Conclusion

For some reason, calling replaceChild() on the body, when the new body contains a YouTube embed, appears to yield control to queued code.

Is this the microtask queue vs. the normal (macro)task queue? But why would it be triggered by the YouTube embed, and not in general?
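One way to narrow this down (a hypothetical debugging aid; recordQueueOrder and the marker names are invented for this sketch) is to queue a marker on each queue just before the suspect operation and observe where the suspect callback lands relative to them:

```javascript
// Queue one marker on each queue, then run the suspect scheduler.
// If "suspect" fires between the two markers, it was a microtask;
// if it fires after "task-marker", it was a normal task.
function recordQueueOrder(scheduleSuspect) {
  const seen = [];
  queueMicrotask(() => seen.push("microtask-marker"));
  setTimeout(() => seen.push("task-marker"), 0);
  scheduleSuspect(() => seen.push("suspect"));
  return seen;
}

// Demo: a suspect that is itself a microtask.
const seen = recordQueueOrder((cb) => queueMicrotask(cb));
setTimeout(() => console.log(seen.join(" → ")), 10);
// → "microtask-marker → suspect → task-marker"
```

In the Turbo case, the markers would be queued right before parentElement.replaceChild(); seeing the clone run between the two markers would confirm the microtask-queue hypothesis.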

I would appreciate it if anyone has pointers on how to debug this further!
