Instantaneous (10k small files):
git clone https://github.com/cirosantilli/test-github-many-issue-templates
Takes forever; it seems to fetch one file at a time:
git clone --depth 1 --filter=blob:none --no-checkout https://github.com/cirosantilli/test-github-many-issue-templates
cd test-github-many-issue-templates
git checkout master -- .github/ISSUE_TEMPLATE
You can tell because the output shows blobs arriving one at a time, roughly every 0.5 s:
remote: Enumerating objects: 1, done.
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 1 (delta 0), pack-reused 0
Receiving objects: 100% (1/1), 67 bytes | 67.00 KiB/s, done.
remote: Enumerating objects: 1, done.
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 1 (delta 0), pack-reused 0
Receiving objects: 100% (1/1), 67 bytes | 67.00 KiB/s, done.
remote: Enumerating objects: 1, done.
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 1 (delta 0), pack-reused 0
Receiving objects: 100% (1/1), 68 bytes | 68.00 KiB/s, done.
remote: Enumerating objects: 1, done.
This makes partial clones unusable for most cases; even 100 files would be almost unbearable.
Tested with git version 2.27.0.
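The one-blob-per-fetch behavior can also be reproduced locally without GitHub. The sketch below (repository layout and file names are made up for illustration) builds a repository with 100 small files, enables `uploadpack.allowFilter` on it, and then runs the same blob-less clone plus partial checkout over the `file://` transport, so the server side is the local promisor remote:

```shell
#!/bin/sh
# Hypothetical local reproduction of the slow partial-clone checkout.
# Each blob missing from the blob-less clone is fetched from the promisor
# remote during checkout, which over a real network means one round trip
# per file.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Server-side repository with 100 one-line files in a subdirectory.
git -c init.defaultBranch=master init -q server
cd server
mkdir -p .github/ISSUE_TEMPLATE
for i in $(seq 100); do
  echo "file $i" > ".github/ISSUE_TEMPLATE/$i.md"
done
git add .
git -c user.name=t -c user.email=t@example.com commit -qm 'add templates'
# Partial clone only works if the server side allows the filter.
git config uploadpack.allowfilter true
cd ..

# Blob-less clone (file:// forces the normal transport, unlike a plain
# local path, which would hardlink objects and bypass the filter), then
# populate only the templates directory.
git clone -q --filter=blob:none --no-checkout "file://$tmp/server" client
cd client
git checkout master -- .github/ISSUE_TEMPLATE
```

Over `file://` the per-blob fetches are fast, so this only demonstrates the mechanism; the 0.5 s per file above comes from doing each of those fetches against GitHub.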
Context: https://stackoverflow.com/questions/600079/how-do-i-clone-a-subdirectory-only-of-a-git-repository/52269934#52269934
cc @derrickstolee due to a recent related blog post: https://github.blog/author/dstolee/ I have only skimmed the blog, so sorry if this is already addressed.