
How can I update multiple git repositories from their shared parent's directory without cd'ing into each repo's root directory? I have the following which are all separate git repositories (not submodules):

/plugins/cms
/plugins/admin
/plugins/chart

I want to update them all at once or at least simplify my current workflow:

cd ~/plugins/admin
git pull origin master
cd ../chart
git pull

etc.

6 Comments
  • What's wrong with find -name .git -execdir git pull \;? Commented Jul 10, 2016 at 18:16
  • what about git do pull Commented Jun 1, 2017 at 22:54
  • The same question answered for hg mercurial. Commented Feb 14, 2018 at 15:13
  • find . -name .git -print -execdir git pull \; is OK. -print will echo the current dir. Commented Dec 29, 2018 at 9:18
  • See also, with Git 2.30 (Q4 2020), the new git for-each-repo command (stackoverflow.com/a/65766304/6309) Commented Jan 17, 2021 at 21:50

16 Answers

447

Run the following from the parent directory, plugins in this case:

find . -type d -depth 1 -exec git --git-dir={}/.git --work-tree=$PWD/{} pull origin master \;

To clarify:

  • find . searches the current directory
  • -type d to find directories, not files
  • -depth 1 for a maximum depth of one sub-directory
  • -exec {} \; runs a custom command for every find
  • git --git-dir={}/.git --work-tree=$PWD/{} pull runs git pull inside each directory that was found

To play around with find, I recommend using echo after -exec to preview, e.g.:

find . -type d -depth 1 -exec echo git --git-dir={}/.git --work-tree=$PWD/{} status \;

Note: if the -depth 1 option is not available, try -mindepth 1 -maxdepth 1.
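If you do need the -mindepth/-maxdepth spelling, the same echo preview trick works there too; a sketch (assuming each immediate subdirectory is a git repo):

```shell
# Print the git commands that would run, without executing them;
# drop the "echo" once the output looks right.
find . -mindepth 1 -maxdepth 1 -type d -exec echo git --git-dir={}/.git --work-tree=$PWD/{} pull origin master \;
```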


14 Comments

find: warning: you have specified the -depth option after a non-option argument -type, but options are not positional (-depth affects tests specified before it as well as those specified after it). Please specify options before other arguments.
I used find . -maxdepth 1 -type d -print -execdir git --git-dir={}/.git --work-tree=$PWD/{} pull origin master \; to output the name of the folder before doing the pull, to get rid of the warning and to only run the pull on subfolders.
replacing 'pull origin master' with fetch origin master:master tells git to explicitly update your 'master' branch with origin's master branch. This will not do a merge, any commits to master will be lost if you do this.
since git 1.8.5 it is possible to replace --git-dir and --work-tree by the -C option, see this question. -- I'm using find . -mindepth 1 -maxdepth 1 -type d -print -exec git -C {} pull \;
@ZsoltSzilagy as mentioned by @Rystraum you can use -maxdepth 1 instead of -depth 1
322
ls | xargs -I{} git -C {} pull

To do it in parallel:

ls | xargs -P10 -I{} git -C {} pull

12 Comments

Nice! I've put it as an alias in my .gitconfig: all = "!f() { ls | xargs -I{} git -C {} $1; }; f" Now I can do git all pull, git all "checkout master" etc.
Cleaned up a bit, will search all directories recursively for only git repos, and will strip out colors in case you have ls aliased: ls -R --directory --color=never */.git | sed 's/\/.git//' | xargs -P10 -I{} git -C {} pull
I smashed some of the answers together to create this for git on macOS that filters to folders containing a .git folder, and lets you run arbitrary commands like git all fetch --prune: git config --global alias.all '!f() { ls -R -d */.git | sed 's,\/.git,,' | xargs -P10 -I{} git -C {} $1; }; f'
@AWrightIV actually, ls -R -d */.git is returning a filtered list of the directories within the current folder that contain a .git directory. That way, when I run something like git all fetch, it only executes against subfolders that have .git folders. It's an answer to the original question, but it tries to be a bit more efficient by not assuming all the subdirectories are git repos.
A slight improvement over borisdiakur's command, to avoid running on . and to print which directory it's running in at each step: git config --global alias.all '!f() { ls -R -d */.git | xargs -I{} bash -c "echo {} && git -C {}/../ $1"; }; f'
137

A bit more low-tech than leo's solution:

for i in */.git; do ( echo $i; cd $i/..; git pull; ); done

This will update all Git repositories in your working directory. No need to explicitly list their names ("cms", "admin", "chart"). The "cd" command only affects a subshell (spawned using the parenthesis).
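If you also want the loop to flag repositories where the pull failed rather than scrolling past them, a small extension of the same pattern (a sketch, not part of the original answer):

```shell
# Same subshell loop, but report failures instead of silently moving on.
for i in */.git; do
  ( cd "$i/.." && echo "== $PWD" && git pull ) || echo "pull failed in ${i%/.git}"
done
```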

10 Comments

Exactly what I was looking for
This one has the advantage of displaying which repository it is dealing with, useful when something goes wrong (missing branch, unavailable remote...)
I like this solution because it only pulls on sub-directories that are a git repo, thx!
Alternate using git -C: for i in */.git; do git -C $i pull; done
This is the most straightforward solution IMO. Just add a '&' to the end of git pull to make it async so you're not waiting for each pull to complete before looping to the next item: for i in */.git; do ( echo $i; cd $i/..; git pull &) ; done
65

Actually, if you don't know whether the subfolders contain git repos or not, it's best to let find locate the repos for you:

find . -type d -name .git -exec git --git-dir={} --work-tree=$PWD/{}/.. pull origin master \;

The PowerShell equivalent would be:

Get-ChildItem -Recurse -Directory -Hidden -Filter .git | ForEach-Object { & git --git-dir="$($_.FullName)" --work-tree="$(Split-Path $_.FullName -Parent)" pull origin master }

6 Comments

This works with my macOS.
Ensure all your git repos are on master before executing this as written. Otherwise you may be unintentionally merging master into your current branch.
Thank you for the powershell version, still works with PS 7.10
For PS 2.0 this works: gci | where {$_.Attributes -match'Directory'} | foreach { write-host $_.fullname; push-location $_; & git pull; & cd ..}
Note - I could not find a hidden parameter, but Force seemed to work, as in Get-ChildItem -Path "C:\Repos" -Recurse -Directory -Force -Filter ".git" -Depth 1
34

I use this one:

find . -name ".git" -type d | sed 's/\/.git//' |  xargs -P10 -I{} git -C {} pull

Universal: updates all git repositories below the current directory.

1 Comment

It's dangerous to use --git-dir without --work-tree which is why the -C shortcut was created. I just messed up my home directory because of this. I would recommend just doing find . -maxdepth 8 -type d -name .git | xargs -P8 -I{} git -C {}/../ fetch --all
16

None of the top 5 answers worked for me, and the question talked about directories.

This worked:

for d in *; do pushd $d && git pull && popd; done

4 Comments

For Windows, see my answer here: stackoverflow.com/a/51016478/207661. It's very similar to above.
The accepted answer used to work for me, but it quit sometime in the last year. I finally decided to look for another solution and found your answer. Thanks for sharing. It works perfectly.
no need to pushd and popd, just use git's -C option.
Another option is to use a subshell: for d in *; do (cd $d && git pull --ff-only); done This has an advantage over -C as this approach is universally applicable to programs which don't have such option. Note that --ff-only is good to have in case of automatic updates; I use it to update vim plugins if native vim package management is used.
14

This should happen automatically, so long as cms, admin and chart are all part of the same repository.

A likely issue is that each of these plugins is a git submodule.

Run git help submodule for more information.
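For reference, if these directories really were submodules, git ships a command that iterates over all of them; a sketch (assumes the submodules are already initialized):

```shell
# Run "git pull" inside every initialized submodule of the current repo.
git submodule foreach 'git pull'
```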

EDIT

For doing this in bash:

cd plugins
for f in cms admin chart
do 
  cd $f && git pull origin master && cd ..
done

6 Comments

No, sorry you misunderstood. Each of those directories are a separate git repository. /plugins is not a repository
Ahhh. My mistake. Will give you the bash solution in a minute.
There you go. If you want to return to the parent directory, just run another cd .. afterwards.
Or use pushd and popd or put the group of commands in a subshell (when the subshell exits, you'll be left in the original directory). (cd dir; for ... done)
Out of curiosity - why aren't you using ssh keys instead?
13

The mr utility (a.k.a., myrepos) provides an outstanding solution to this very problem. Install it using your favorite package manager, or just grab the mr script directly from github and put it in $HOME/bin or somewhere else on your PATH. Then, cd to the parent plugins folder shared by these repos and create a basic .mrconfig file with contents similar to the following (adjusting the URLs as needed):

# File: .mrconfig
[cms]
checkout = git clone 'https://<username>@github.com/<username>/cms' 'cms'

[admin]
checkout = git clone 'https://<username>@github.com/<username>/admin' 'admin'

[chart]
checkout = git clone 'https://<username>@github.com/<username>/chart' 'chart'

After that, you can run mr up from the top level plugins folder to pull updates from each repository. (Note that this will also do the initial clone if the target working copy doesn't yet exist.) Other commands you can execute include mr st, mr push, mr log, mr diff, etc.; run mr help to see what's possible. There's an mr run command that acts as a pass-through, allowing you to access VCS commands not directly supported by mr itself (e.g., mr run git tag STAGING_081220015). And you can even create your own custom commands that execute arbitrary bits of shell script targeting all repos!

mr is an extremely useful tool for dealing with multiple repos. Since the plugins folder is in your home directory, you might also be interested in vcsh. Together with mr, it provides a powerful mechanism for managing all of your configuration files. See this blog post by Thomas Ferris Nicolaisen for an overview.
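As an illustration of such a custom command, an entry in the [DEFAULT] section of .mrconfig defines a new verb that runs in every repo; the sync name and its body below are made up for this sketch, based on mr's documented support for custom commands:

```ini
[DEFAULT]
# Hypothetical custom command: "mr sync" pulls fast-forward only in every repo.
sync = git pull --ff-only
```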

Comments

12

Most compact method, assuming all sub-dirs are git repos:

ls | parallel git -C {} pull

2 Comments

can't find command parallel. reference?
@LeonTepe This tool is usually included with moreutils package.
11

My humble construction that

  • shows the current path (using python, convenient and just works, see How to get full path of a file?)
  • looks directly for a .git subfolder: little chance of emitting a git command in a non-git subfolder
  • gets rid of some warnings of find

as follows:

find . \
    -maxdepth 2 -type d \
    -name ".git" \
    -execdir python -c 'import os; print(os.path.abspath("."))' \; \
    -execdir git pull \;

Of course, you may add other git commands with additional -execdir options to find, displaying the branch for instance:

find . \
    -maxdepth 2 -type d \
    -name ".git" \
    -execdir python -c 'import os; print(os.path.abspath("."))' \; \
    -execdir git branch \; \
    -execdir git pull \;

2 Comments

Not sure why there are net downvotes on this answer. This was the most helpful, imo, since I could go to a greater max depth and not keep hitting non-git repos. I used this to run git gc on all the repositories in my "developer" repo.
I also like this answer as it's easy to read and very easy to add multiple commands.
5

You can try this

find . -type d -name .git -exec sh -c "cd \"{}\"/../ && pwd && git pull" \;

Also, you can add your own customized output by chaining one more && argument, like:

find . -type d -name .git -exec sh -c "cd \"{}\"/../ && pwd && git pull && git status" \;
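One caveat: embedding {} inside the quoted sh -c script can misbehave on directory names containing quotes, and some find builds warn about it. A safer sketch passes the path to sh as a positional argument instead:

```shell
# Pass the found path to sh as $1 rather than splicing {} into the script.
find . -type d -name .git -exec sh -c 'cd "$1/.." && pwd && git pull' sh {} \;
```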

Comments

5

gitfox is a tool to execute a command on all subrepos:

npm install gitfox -g
g pull

4 Comments

what's g? not everyone has the same alias as yours
@LưuVĩnhPhúc gitfox installs itself under the alias "g" for some reason (though the help message says "gitfox"). Personally I do not think it's a command important enough to claim such a shortcut but ah well. It does the job, though.
@LưuVĩnhPhúc Check the source repo for the usage github.com/eqfox/gitfox
how to make it go down another level? It seems to stop at the first level under the current directory
5

I combined points from several comments and answers:

find . -maxdepth 2 -type d -name .git -execdir git pull \;

Comments

2

I use this

for dir in $(find . -name ".git")
do cd "${dir%/*}"
    echo "$PWD"
    git pull
    echo ""
    cd - > /dev/null
done

Github

Comments

0

Original answer 2010:

If all of those directories are separate git repo, you should reference them as submodules.

That means your "origin" would be that remote repo 'plugins' which only contains references to subrepos 'cms', 'admin', 'chart'.

A git pull followed by a git submodule update would achieve what you are looking for.


Update January 2016:

With Git 2.8 (Q1 2016), you will be able to fetch submodules in parallel (!) with git fetch --recurse-submodules -j2.
See "How to speed up / parallelize downloads of git submodules using git clone --recursive?"

2 Comments

See also stackoverflow.com/questions/1979167/git-submodule-update and stackoverflow.com/questions/1030169/…: git submodule foreach git pull can also be of interest.
Note: to clarify, 'plugins', which is not a git repo at the moment, should be made one, as a parent git repo for the submodules.
-1

If you have a lot of subdirs with git repositories, you can use parallel

ls | parallel -I{} -j100 '
  if [ -d {}/.git ]; then
    echo Pulling {}
    git -C {} pull > /dev/null && echo "pulled" || echo "error :("
  else
     echo {} is not a git repository
  fi
'

1 Comment

Great one. I modified a bit to suit my style. pastebin
