You will end up downloading the entire history, so I don't see much benefit in it, but you can check out specific parts using a "sparse" checkout. The steps to do a sparse clone are as follows:

```
mkdir <repo>
cd <repo>
git init
git remote add -f origin <url>
```

Since I'm quoting another post, I don't want to edit the quoted parts, but do not use `-f` with `git remote add`. It will do a fetch, which will pull in the entire history. Just add the remote without a fetch, `git remote add origin <url>`, and then do a shallow fetch like described later.

This creates an empty repository with your remote, and fetches all objects but doesn't check them out. Then do:

```
git config core.sparseCheckout true
```

Now you need to define which files/folders you want to actually check out in `.git/info/sparse-checkout`, e.g.:

```
echo "some/dir/" >> .git/info/sparse-checkout
echo "another/sub/tree" >> .git/info/sparse-checkout
```

You might want to have a look at the extended tutorial, and you should probably read the official documentation for sparse checkout.

You might be better off using a shallow clone. Instead of a normal `git pull`, try:

```
git pull --depth=1 origin master
```

I had an occasion to test this again recently, trying to get only the Ubuntu Mono Powerline fonts. The steps above ended up downloading some 11 MB, where the Ubuntu Fonts themselves are ~900 KB:

```
% git pull --depth=1 origin master
remote: Total 310 (delta 75), reused 260 (delta 71), pack-reused 0
```

Using the `--filter` checkout method in Ciro Santilli's answer really cuts down the size, but as mentioned there, it downloads each blob one by one, which is slow:

```
% git fetch --depth=1 --filter=blob:none
remote: Compressing objects: 100% (49/49), done.
remote: Total 52 (delta 1), reused 35 (delta 1), pack-reused 0
% git checkout origin/master -- UbuntuMono
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 0 (delta 0), pack-reused 0
remote: Total 1 (delta 0), reused 1 (delta 0), pack-reused 0
```

TL;DR: Use all of `--filter`, sparse checkout and shallow clone to reduce the total download, or only use sparse checkout + shallow clone if you don't care about the total download and just want that one directory however it may be obtained.

**git clone --filter + git sparse-checkout downloads only the required files**

E.g., to clone only files in subdirectory `small/` of this test repository:

```
git clone -n --depth=1 --filter=tree:0 \
  https://github.com/cirosantilli/test-git-partial-clone-big-small
cd test-git-partial-clone-big-small
git sparse-checkout set --no-cone small
git checkout
```

You could also select multiple directories for download with:

```
git sparse-checkout set --no-cone small small2
```

This method doesn't work for individual files however, but there is another method that does.

The test repository contains:

- a `big/` subdirectory with 10x 10MB files
- more big files on the toplevel (this is because certain previous attempts would download toplevel files)
- `small/` and `small2/` subdirectories with 1000 files of size one byte each

All contents are pseudo-random and therefore incompressible, so we can easily notice if any of the big files were downloaded. So if you download anything you didn't want, you would get 100 MB extra, and it would be very noticeable.

In this test, clone is basically instantaneous, and we can confirm that the cloned repository is very small as desired:

```
du --apparent-size -hs *
```

On the above, `git clone` downloads a single object, presumably the commit:

```
Cloning into 'test-git-partial-clone-big-small'...
```

And then the final checkout downloads the files we requested:

```
remote: Enumerating objects: 3, done.
remote: Counting objects: 100% (3/3), done.
remote: Compressing objects: 100% (3/3), done.
remote: Total 3 (delta 0), reused 3 (delta 0), pack-reused 0
remote: Total 253 (delta 0), reused 253 (delta 0), pack-reused 0
Your branch is up to date with 'origin/master'.
```

TODO also prevent download of unneeded tree objects: the above method downloads all Git tree objects (i.e. directory listings, but not actual file contents). We can confirm that by running:

```
git ls-files
```

and seeing that it contains the directories of large files, such as `big/0`.
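The sparse-checkout recipe described above can be exercised end-to-end against a throwaway local repository, which avoids any network access. All names here (`sparse-src`, `sparse-dst`, the file names) are illustrative stand-ins; in real use, `git remote add origin` would point at your actual remote URL:

```shell
set -e
rm -rf sparse-src sparse-dst

# Build a stand-in "remote" containing a wanted and an unwanted directory.
git init -q sparse-src
mkdir -p sparse-src/small sparse-src/big
echo tiny > sparse-src/small/file.txt
echo huge > sparse-src/big/file.txt
git -C sparse-src add .
git -C sparse-src -c user.name=demo -c user.email=demo@example.com \
  commit -qm 'initial commit'

# The default branch name depends on local git configuration.
branch=$(git -C sparse-src symbolic-ref --short HEAD)

# The recipe: empty repo, remote added WITHOUT -f, sparse checkout
# enabled, wanted paths listed, then a shallow fetch and checkout.
mkdir sparse-dst
cd sparse-dst
git init -q
git remote add origin ../sparse-src
git config core.sparseCheckout true
echo "small/" >> .git/info/sparse-checkout
git fetch -q --depth=1 origin "$branch"
git checkout -q FETCH_HEAD
ls    # only small/ is checked out
cd ..
```

Note that the fetch pulls the whole commit's objects; the sparse checkout only limits what is materialized in the working tree.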
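The `--filter` + `git sparse-checkout` method can also be tried against a local repository, with two caveats: a `file://` URL is needed (plain paths bypass the transport that implements filters), and the source side must have `uploadpack.allowFilter` enabled, which real hosts such as GitHub already do. The repository and directory names below are illustrative, not from the original answer:

```shell
set -e
rm -rf filter-src filter-dst

# Stand-in "remote" with a wanted and an unwanted directory.
git init -q filter-src
mkdir -p filter-src/small filter-src/big
echo tiny > filter-src/small/file.txt
echo huge > filter-src/big/file.txt
git -C filter-src add .
git -C filter-src -c user.name=demo -c user.email=demo@example.com \
  commit -qm 'initial commit'
git -C filter-src config uploadpack.allowFilter true

# Clone with no checkout, no history, and no trees/blobs, then
# materialize only small/ via sparse checkout.
git clone -q -n --depth=1 --filter=tree:0 "file://$PWD/filter-src" filter-dst
cd filter-dst
git sparse-checkout set --no-cone small
git checkout -q
ls    # only small/ is present in the working tree
cd ..
```

The missing trees and blobs are fetched on demand from the promisor remote during the final `git checkout`, which is why only the requested files are downloaded.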
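A repository shaped like the test repository described above (a `big/` directory of incompressible files, plus `small/` and `small2/` with many one-byte files) can be generated with a script along these lines. This is a sketch, not the original author's script, and the "big" files are scaled down from 10 MB to 1 KB to keep it quick:

```shell
set -e
rm -rf test-repo
mkdir -p test-repo/big test-repo/small test-repo/small2

# Pseudo-random, hence incompressible, "big" files: any accidental
# download of them visibly inflates the clone size.
for i in 0 1 2 3 4 5 6 7 8 9; do
  head -c 1024 /dev/urandom > "test-repo/big/$i"
done

# 1000 one-byte files in each of the small directories.
for i in $(seq 1 1000); do
  printf x > "test-repo/small/$i"
  printf x > "test-repo/small2/$i"
done

git -C test-repo init -q
git -C test-repo add .
git -C test-repo -c user.name=demo -c user.email=demo@example.com \
  commit -qm 'test data'
du --apparent-size -hs test-repo/*
```

`du --apparent-size` is used, as in the article, so that the reported sizes reflect file contents rather than filesystem block allocation.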