This repository was archived by the owner on Apr 1, 2023. It is now read-only.

Synchronize audio streams based on their frequencies #8

@marioortizmanero

Description

Issue to keep track of the progress on audio synchronization, inspired by this project, and thanks to pinkman, who linked it in the AUR comments.

Spotify has an audio analysis feature that could give us more information about the song: it can provide rhythm, pitch and timbre information.

But if we rely on the Spotify API, then we won't be able to implement this for Linux (DBus API) users.

And the YouTube video is played with VLC directly, so the videos aren't fully downloaded, which may make the audio analysis harder to do. We would need a more dynamic way to do it on the go, or change how the videos are played.

I think the bigger issue right now is the latter. How can we analyze a video that isn't even downloaded yet?
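For reference, the usual way to estimate the offset between two audio streams is cross-correlation: slide one signal over the other and find the shift with the highest similarity. A minimal NumPy sketch with toy signals (this is an illustration of the general technique, not the actual vidify-audiosync implementation):

```python
import numpy as np

def estimate_lag(reference: np.ndarray, delayed: np.ndarray) -> int:
    """Return how many samples `delayed` lags behind `reference`."""
    # Full cross-correlation; the position of the peak encodes the offset.
    corr = np.correlate(delayed, reference, mode="full")
    # Shift the peak index so that 0 means the signals are aligned.
    return int(np.argmax(corr)) - (len(reference) - 1)

# Toy signals: `b` is `a` shifted right by 100 samples.
rng = np.random.default_rng(0)
a = rng.standard_normal(4096)
b = np.concatenate([np.zeros(100), a[:-100]])
print(estimate_lag(a, b))  # → 100
```

In practice this would run on short, incrementally captured chunks of both streams (which is what makes the "not fully downloaded" problem tractable), and a real implementation would use an FFT-based correlation for speed.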

EDIT: This is being worked on at https://github.com/marioortizmanero/vidify-audiosync

Metadata
Labels

enhancement: Improving an already existing feature
help wanted: Extra attention is needed
