
Allow the shed_ operations to consume git URLs instead of paths. #174

Merged

jmchilton merged 6 commits into galaxyproject:master from jmchilton:git_urls on May 1, 2015

Conversation

@jmchilton (Member)

create, upload, download, diff, and lint projects directly from Github - closes #169.

Inputs starting with git+ (e.g. git+/path/to/repo.git) will have the git+ stripped and the remainder passed directly to the git command line. Alternatively, URLs starting with git: will be passed as-is to the git command line. This roughly corresponds to pip's behavior.
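The prefix rules described above can be sketched as a small resolver. This is a hypothetical illustration of the convention, not planemo's actual internals; the function name is invented.

```python
def resolve_input(path):
    """Classify a shed_ operation input per the git+ / git: convention.

    Returns (is_git, target): is_git says whether the target should be
    handed to the git command line, target is what to hand over.
    """
    if path.startswith("git+"):
        # git+/path/to/repo.git -> strip the prefix, pass the rest to git
        return True, path[len("git+"):]
    if path.startswith("git:"):
        # git://... URLs are passed to git unmodified
        return True, path
    # anything else is treated as a plain filesystem path
    return False, path

print(resolve_input("git+/path/to/repo.git"))  # (True, '/path/to/repo.git')
```

Note that under this scheme git: URLs need no marker prefix because the scheme itself already identifies them, which is why only the git+ form is stripped.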

jmchilton added 6 commits May 1, 2015 14:51
create, upload, download, diff, and lint projects directly from Github - closes galaxyproject#169.

Inputs starting with ``git+`` (e.g. git+/path/to/repo.git) will have the ``git+`` stripped and the remainder passed directly to the ``git`` command line. Alternatively, URLs starting with ``git:`` will be passed as-is to the ``git`` command line. This roughly corresponds to ``pip``'s behavior.
jmchilton added a commit that referenced this pull request May 1, 2015
Allow the shed_ operations to consume git URLs instead of paths.
@jmchilton jmchilton merged commit 834da4e into galaxyproject:master May 1, 2015
Member

Assuming we are only targeting github/bitbucket it would be faster to pull the latest tarball. With all the test data over time a git pull can become a bottle-neck.

Member Author

So this line is pulling Galaxy down for test/serve when used with --install_galaxy - that used to fetch a tarball by default but now does a clone so that it can do incremental updates on each call very quickly.

It sounds like you mean the actual shed operations though - those could be optimized somewhat. I don't think we should just download the tarballs, because I would like to use the git metadata during upload (#170) and because this will need to support submodules at some point. I do think we could probably clone with --depth=1 to speed things up, paired with a directory that caches repositories between calls (mirroring how Galaxy is quickly cloned).
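The optimization suggested above - a shallow clone on first use, then an incremental update from a cache directory on later calls - could be sketched roughly like this. The function name and cache layout are hypothetical, not anything planemo implements.

```python
import os
import subprocess


def cached_shallow_clone(url, cache_dir):
    """Hypothetical sketch: keep one clone per repository under cache_dir.

    First call shallow-clones with --depth=1; subsequent calls just
    fast-forward the existing clone, which is much cheaper than a fresh
    clone or tarball download.
    """
    name = url.rstrip("/").rsplit("/", 1)[-1]
    if name.endswith(".git"):
        name = name[:-4]
    target = os.path.join(cache_dir, name)
    if not os.path.isdir(target):
        subprocess.check_call(["git", "clone", "--depth=1", url, target])
    else:
        subprocess.check_call(["git", "pull", "--ff-only"], cwd=target)
    return target
```

One caveat: --depth=1 discards history, so any feature that needs full git metadata (as #170 might) would have to deepen the clone or skip the shallow optimization for that operation.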

Member

Fair enough, thanks for explaining.



Development

Successfully merging this pull request may close these issues.

Shed Operations on Github URLs
